Government failure and the California drought

Yesterday the New York Times had a story about California’s four-year drought, complete with apocalyptic imagery and despair over whether conservation would succeed. Alex Tabarrok used that article as a springboard for a very informative and link-filled post at Marginal Revolution digging into the ongoing California drought, including some useful data and comment participation from David Zetland:

California has plenty of water…just not enough to satisfy every possible use of water that people can imagine when the price is close to zero. As David Zetland points out in an excellent interview with Russ Roberts, people in San Diego county use around 150 gallons of water a day. Meanwhile in Sydney Australia, with a roughly comparable climate and standard of living, people use about half that amount. Trust me, no one in Sydney is going thirsty.

California’s drought is a failure to implement institutional change consistent with environmental and economic sustainability. One failure that Alex discusses (and that every economist who pays attention to water agrees on) is the artificially low retail price of water, to both residential and agricultural consumers. And Alex combines David’s insights with some analysis from Matthew Kahn to conclude that the income-effect arguments against higher water prices have no analytical or moral foundation: San Diego residents pay approximately 0.5 cents per gallon (yes, half a penny per gallon) for their tap water, so even increasing that price by 50% would reduce incomes by only about 1%.
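To get a rough sense of the magnitudes, here is a back-of-the-envelope sketch. The 150-gallons-per-day usage and half-penny-per-gallon price come from the discussion above; the household size and income are my own illustrative assumptions, not numbers from Alex’s or David’s posts.

    # Back-of-the-envelope check on the income effect of a 50% water price increase.
    # Usage and price figures are from the post above; household size and income
    # are illustrative assumptions only.
    gallons_per_person_per_day = 150
    price_per_gallon = 0.005          # half a penny per gallon
    household_size = 3                # assumption
    household_income = 60_000         # assumption, dollars per year

    annual_bill = gallons_per_person_per_day * 365 * price_per_gallon * household_size
    extra_cost = 0.5 * annual_bill    # added cost of a 50% price increase
    share_of_income = extra_cost / household_income

    print(f"Current annual volumetric water cost: ${annual_bill:,.2f}")    # ~$821
    print(f"Added cost of a 50% increase:         ${extra_cost:,.2f}")     # ~$411
    print(f"Share of household income:            {share_of_income:.2%}")  # ~0.68%

Even under these assumptions the added cost is less than one percent of household income, consistent with the claim that the income effect of higher water prices is small.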

There’s another institutional failure in California: the near-absence of water markets and the legal restrictions on transferring water across different uses. Farmers have generally not been able to sell their agricultural allocation to other users without regulatory approval, even when the value of the water in those other uses is higher. According to the California Water Code as summarized by the State Water Resources Control Board,

In recent years, temporary transfers of water from one water user to another have been used increasingly as a way of meeting statewide water demands, particularly in drought years. Temporary transfers of post 1914 water rights are initiated by petition to the State Board. If the Board finds the proposed transfer will not injure any other legal user of water and will not unreasonably affect fish, wildlife or other instream users, then the transfer is approved. If the Board cannot make the required findings within 60 days, a hearing is held prior to Board action on the proposed transfer. Temporary transfers are defined to be for a period of one year or less. A similar review and approval process applies to long-term transfers in excess of one year.

Thus in a semi-arid region like California there’s a large rice industry, represented in Sacramento by an active trade association. Think of this rule through the lens of permissionless innovation: these farmers have to ask permission before they can make temporary transfers, Board approval is not guaranteed, and longer-term transfers of their use rights face the same approval process. One justification for this rule is the economic viability of small farming communities, which the water bureaucrats believe would suffer if farmers sold their water rights and exited the industry. This narrow view of economic viability assumes away the dynamism by which residents of those communities could create more valuable lives for themselves and others by using their resources and talents differently; it is a depressing but unsurprising piece of bureaucratic hubris.

Not surprisingly, in year four of a drought, these temporary water transfers are increasing in value. Just yesterday, the Metropolitan Water District of Southern California made an offer to the Western Canal Water District in Northern California at the highest prices yet.

The offer from the Metropolitan Water District of Southern California and others to buy water from the Sacramento Valley for $700 per acre-foot reflects how dire the situation is as the state suffers through its fourth year of drought. In 2010 — also a drought year — it bought water but only paid between $244 and $300 for the same amount. The district stretches from Los Angeles to San Diego County. …

The offer is a hard one to turn down for farmers like Tennis, who also sits on the Western Canal Water District Board. Farmers can make around $900 an acre, after costs, growing rice, Tennis said. But because each acre of rice takes a little more than 3 acre-feet of water, they could make around $2,100 by selling the water that would be used. …

If the deal is made, Tennis said farmers like himself will treat it as a windfall rather than a long-term enterprise.

“We’re not water sellers, we’re farmers,” he said.

And that’s the problem.

Forthcoming paper: Implications of Smart Grid Innovation for Organizational Models in Electricity Distribution

Back in 2001 I participated in a year-long forum on the future of the electricity distribution model. Convened by the Center for the Advancement of Energy Markets, the DISCO of the Future Forum brought together many stakeholders to develop several scenarios and analyze their implications (and several of those folks remain friends, playmates in the intellectual sandbox, and commenters here at KP [waves at Ed]!). As noted in this 2002 Electric Light and Power article,

Among the 100 recommendations that CAEM discusses in the report, the forum gave suggestions ranging from small issues – that regulators should consider requiring a standard form (or a “consumer label”) on pricing and terms and conditions of service for small customers to be provided to customers at the time of the initial offer (as well as upon request) – to larger ones, including the suggestions that regulators should establish a standard distribution utility reporting format for all significant distribution upgrades and extensions, and that regulated DISCOs should be permitted to recover their reasonable costs for development of grid interface designs and grid interconnect application review.

“The technology exists to support a competitive retail market responsive to price signals and demand constraints,” the report concludes. “The extent to which the market is opened to competition and the extent to which these technologies are applied by suppliers, DISCOS and customers will, in large part, be determined by state legislatures and regulators.”

Now in 2015, technological dynamism has brought to a head many of the same questions about regulatory models and business models that we “penciled out” 14 years ago.

In a new paper, forthcoming in the Wiley Handbook of Smart Grid Development, I grapple with that question: what are the implications of this technological dynamism for the organizational form of the distribution company? What transactions in the vertically-integrated supply chain should be unbundled, what assets should the distribution company own, and what practical policy issues are jurisdictions around the world tackling as they confront these questions? I analyze these questions using a theoretical framework from the economics of organization and new institutional economics, and I start with a historical overview of the industry’s technology, regulation, and organizational model.

Implications of Smart Grid Innovation for Organizational Models in Electricity Distribution

Abstract: Digital technologies from outside the electricity industry are prompting changes in both regulatory institutions and electric utility business models, leading to the disaggregation or unbundling of historically vertically integrated electricity firms in some jurisdictions and not others, and simultaneously opening the door for competition with the traditional electric utility business. This chapter uses the technological and organizational history of the industry, combined with the transactions cost theory of the firm and of vertical integration, to explore the implications of smart grid technologies for future distribution company business models. Smart grid technologies reduce transactions costs, changing economical firm boundaries and reducing the traditional drivers of vertical integration. Possible business models for the distribution company include an integrated utility, a network manager, or a coordinating platform provider.

The New York REV and the distribution company of the future

We live in interesting times in the electricity industry. Vibrant technological dynamism, the very dynamism that has transformed how we work, play, and live, puts increasing pressure on the early-20th-century physical network, regulatory model, and resulting business model of the vertically-integrated distribution utility.

While the utility “death spiral” rhetoric is overblown, these pressures are real. They reflect the extent to which regulatory and organizational institutions, as well as the architecture of the network, are incompatible with a general social objective of not obstructing such innovation. Reinforcing my innovation-focused claim is the incorporation of relatively new environmental objectives into the policy mix. Innovation, particularly innovation at the distribution edge, is an expression of human creativity that advances both the older economic policy objectives of protecting consumers from concentrations of market power and the newer environmental policy objectives of a cleaner, more prosperous energy future.

But institutions change slowly, especially bureaucratic institutions where decision-makers have a stake in the direction and magnitude of institutional change. Institutional change requires imagination to see a different world as possible, practical vision to see how to get from today’s reality toward that different world, and courage to exercise the leadership and navigate the tough tradeoffs that inevitably arise.

That’s the sense in which the New York Reforming the Energy Vision (REV) proceeding of the New York State Public Service Commission is compelling and encouraging. Launched in spring 2014 with a staff paper, REV is looking squarely at institutional change to align the regulatory framework and the business model of the distribution utility more closely with these policy objectives and with fostering innovation. As Katherine Tweed summarized the goals in a Greentech Media article,

The report calls for an overhaul of the regulation of the state’s distribution utilities to achieve five policy objectives:

  • Increasing customer knowledge and providing tools that support effective management of their total energy bill
  • Market animation and leverage of ratepayer contributions
  • System-wide efficiency
  • Fuel and resource diversity
  • System reliability and resiliency

The PSC acknowledges that the current ratemaking procedure simply doesn’t work and that the distribution system is not equipped for the changes coming to the energy market. New York is already a deregulated market in which distribution is separated from generation and there is retail choice for electricity. Although that’s a step beyond many states, it is hardly enough for what’s coming in the market.

Last week the NY PSC issued its first order in the REV proceeding, ruling that the incumbent distribution utilities will serve as distributed system platform providers (DSPPs) and should start planning accordingly. As noted by RTO Insider,

The framework envisions utilities serving a central role in the transition as distributed system platform (DSP) providers, responsible for integrated system planning and grid and market operations.

In most cases, however, utilities will be barred from owning distributed energy resources (DER): demand response, distributed generation, distributed storage and end-use energy efficiency.

The planning function will be reflected in the utilities’ distributed system implementation plan (DSIP), a multi-year forecast proposing capital and operating expenditures to serve the DSP functions and provide third parties the system information they need to plan for market participation.

A platform business model is not a cut-and-dried thing, though, especially in a regulated industry where the regulatory institutions reinforced and perpetuated a vertically integrated model for over a century (a model modified only when generation technology change in the 1980s led to generation unbundling). Institutional design and market design, the symbiosis of technology and institutions, will have to be front and center if the vertically-integrated, uni-directional delivery model of the 20th century is to evolve into a distribution facilitator of the 21st century.

In fact, the institutional design issues at stake here have been the focus of my research during my sabbatical, so I hope to have more to add to the discussion based on some of my forthcoming work on the subject.

Geoff Manne in Wired on FCC Title II

Friend of Knowledge Problem Geoff Manne had a thorough opinion piece in Wired yesterday on the FCC’s Title II Internet designation. Well worth reading. From the “be careful what you wish for” department:

Title II (which, recall, is the basis for the catch-all) applies to all “telecommunications services”—not just ISPs. Now, every time an internet service might be deemed to transmit a communication (think WhatsApp, Snapchat, Twitter…), it either has to take its chances or ask the FCC in advance to advise it on its likely regulatory treatment.

That’s right—this new regime, which credits itself with preserving “permissionless innovation,” just put a bullet in its head. It puts innovators on notice, and ensures that the FCC has the authority (if it holds up in court) to enforce its vague rule against whatever it finds objectionable.

And that’s even at the much-vaunted edge of the network that such regulation purports to protect.

Asking in advance. Nope, that’s not gonna slow innovation, not one bit …

FCC Title II and raising rivals’ costs

As the consequences of the FCC vote to classify the Internet as a Title II service start to sink in, here are a couple of good commentaries you may not have seen. Jeffrey Tucker’s political economy analysis of the Title II vote as a power grab is one of the best overall analyses of the situation.

The incumbent rulers of the world’s most exciting technology have decided to lock down the prevailing market conditions to protect themselves against rising upstarts in a fast-changing market. To impose a new rule against throttling content or using the market price system to allocate bandwidth resources protects against innovations that would disrupt the status quo.

What’s being sold as economic fairness and a wonderful favor to consumers is actually a sop to industrial giants who are seeking untrammeled access to your wallet and an end to competitive threats to market power.

What’s being sold as keeping the Internet neutral for innovation at the edge of the network substantively does so by encasing the existing Internet structure and institutions in amber, which yields rents for its large incumbents. Some of those incumbents, like Comcast and Time-Warner, achieved their current market power (often resented by their customers) not through rivalrous market competition, but through receiving municipal monopoly cable franchises. Yes, these restrictions raise their costs too, but as large incumbents they are better positioned to absorb those costs than smaller ISPs or other entrants would be. It’s naive to believe that regulations of this form will do much other than soften rivalry in the Internet itself.

But is there really that much scope for innovation and dynamism within the Internet? Yes. Not only with technologies, but also with institutions, such as interconnection agreements and peering agreements, which can affect packet delivery speeds. And, as Julian Sanchez noted, the Title II designation takes these kinds of innovations off the table.

But there’s another kind of permissionless innovation that the FCC’s decision is designed to preclude: innovation in business models and routing policies. As Neutralites love to point out, the neutral or “end-to-end” model has served the Internet pretty well over the past two decades. But is the model that worked for moving static, text-heavy webpages over phone lines also the optimal model for streaming video wirelessly to mobile devices? Are we sure it’s the best possible model, not just now but for all time? Are there different ways of routing traffic, or of dividing up the cost of moving packets from content providers, that might lower costs or improve quality of service? Again, I’m not certain—but I am certain we’re unlikely to find out if providers don’t get to run the experiment.

You should probably raise prices a bit during emergencies

At the Master Resource blog today: “In Defense of Price ‘Gouging’ (lines and shortages are uneconomic, discriminatory).”

In the essay I emphasize the unintended bias that results when consumer demand surges and supplies are tight, as for example when winter storm forecasts lead consumers to rush to the grocery store for bread and milk. Because retailers rarely raise prices in such situations, shortages are the predictable result. The burden of those shortages isn’t spread randomly, however, but rather tends to fall more heavily on certain segments of the population.

When emergencies happen (or are first forecasted) some consumers are readily able to rush to the store while other consumers are not so lucky. The early bird gets the bread and milk and eggs, the late arrival finds little or nothing available….

Households with … “high opportunity costs of shopping,” for example those households with infants or with both parents working full time, were more likely to miss out on the opportunity to buy foods before they ran out. It is easy to see that elderly and mobility-impaired consumers, too, would be more likely to be shut out by any sudden rush of consumers to the store after a disaster.

Higher prices would discourage over-buying and help ensure that useful consumer goods get distributed to more households, not just the households best able to rush to the store.

We can debate how significant the effect is, and I do not argue that raising prices solves all interesting problems, but a modest increase in consumer prices would likely be an improvement.

How about grocery stores imposing a 5 percent storm surcharge that goes to charity, with an ability to opt out? Maybe a sign next to the bread shelves saying “Help those most hurt by the storm: We recommend you donate 25 cents to [name of charity] for every loaf of bread you buy. Our checkout staff will be happy to help you with your donation!”

While I have targeted many complaints at anti-price-gouging laws here at Knowledge Problem, in the Master Resource post I broaden my focus a bit to encompass the consumer sentiment against price increases during emergencies. We need a social change to enable markets to work as well as they can in such situations. More effective markets help by reducing the scope of the problems that charitable work must address (as Dwight Lee argues in his essay in Regulation magazine – see the related discussion and link in the Master Resource post).

The passage quoted above draws in part on research on consumer purchases in Japan after the Great East Japan Earthquake of March 2011: Naohito Abe, Chiaki Moriguchi, and Noriko Inakura, “The Effects of Natural Disasters on Prices and Purchasing Behaviors: The Case of Great East Japan Earthquake,” DP14-1, RCESR Discussion Paper Series, September 2014.

How can the market price of oil fall so far so fast?

If the oil market is reasonably efficient, then the price of a barrel of oil should reflect something like the cost of production of the highest-cost barrel of oil needed to just satisfy demand. In other words, the market price of oil should reflect the marginal cost of production.

The price of oil on the world market was about $110 per barrel in June 2014 and now sits just under $50 per barrel. Can it be possible that the marginal cost of producing oil was $110 per barrel in June 2014 and is only $50 per barrel in January 2015?

Yes.

Here is how: in the first half of June 2014 oil consumption was very high relative to the then-existing world oil production capability. In addition, existing oil production capability is always declining as producing fields deplete. The marginal cost of a barrel of oil under such tight market conditions has to cover the capital cost of developing new resources as well as the operating costs.

Toward the end of 2014, additions to world oil production capability exceeded growth in consumption, so further additions to capability were no longer necessary and the marginal cost of producing the last barrel of oil no longer needed to cover that capital cost. Sure, some oil company somewhere had to make the capital investment necessary to develop the resource, but most of those costs are sunk, and competition in the market means producers cannot force consumers to cover them. The market price under today’s looser market conditions only needs to cover the operating costs of production.
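To make that reasoning concrete, here is a stylized numerical sketch of a supply stack. All of the cost and capacity numbers below are invented purely for illustration; only the qualitative structure tracks the argument above.

    # Stylized illustration of how the efficient market price can fall from ~$110
    # to ~$50 per barrel. All numbers are invented, not actual industry data.

    # Existing capacity: (operating cost per barrel, capacity in million barrels/day)
    existing_capacity = [(10, 30), (25, 40), (40, 15), (50, 10)]  # 95 mb/d in total

    # A new project, needed only if existing capacity cannot meet demand.
    # Its price must cover operating cost plus annualized capital cost.
    new_project_operating_cost = 45
    new_project_capital_cost = 65

    def clearing_price(demand_mbd):
        """Cost of the marginal barrel needed to just satisfy demand."""
        supplied = 0
        for operating_cost, capacity in sorted(existing_capacity):
            supplied += capacity
            if supplied >= demand_mbd:
                # Demand is met from existing (sunk) capacity: price only needs
                # to cover the operating cost of the marginal existing barrel.
                return operating_cost
        # Demand exceeds existing capacity: price must also cover the capital
        # cost of developing new resources.
        return new_project_operating_cost + new_project_capital_cost

    print(clearing_price(96))  # tight market (June 2014-style): 45 + 65 = 110
    print(clearing_price(90))  # loose market (January 2015-style): 50

Once enough capacity exists to meet demand, the capital cost drops out of the marginal calculation, so the market-clearing price can fall by more than half even though no producer’s underlying costs have changed.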

Given the large sunk-cost component of investment in developing oil production capability, it is quite possible that the oil market was efficient at $110 per barrel and is still operating efficiently today with prices under $50 per barrel.

NOTE: Related data on world oil production and consumption is available in the U.S. Department of Energy’s Short Term Energy Outlook. Commentary prompting this explainer comes from the UC-Berkeley Energy Institute at Haas blog.