The New York REV and the distribution company of the future

We live in interesting times in the electricity industry. Vibrant technological dynamism, the very dynamism that has transformed how we work, play, and live, puts increasing pressure on the early-20th-century physical network, regulatory model, and resulting business model of the vertically-integrated distribution utility.

While the utility “death spiral” rhetoric is overblown, these pressures are real. They reflect the extent to which regulatory and organizational institutions, as well as the architecture of the network, are incompatible with a general social objective of not obstructing such innovation. Reinforcing my innovation-focused claim is the addition of relatively new environmental objectives to the policy mix. Innovation, particularly innovation at the distribution edge, is an expression of human creativity that furthers both the older economic policy objective of protecting consumers from concentrations of market power and the newer environmental policy objective of a cleaner and more prosperous energy future.

But institutions change slowly, especially bureaucratic institutions where decision-makers have a stake in the direction and magnitude of institutional change. Institutional change requires imagination to see a different world as possible, practical vision to see how to get from today’s reality toward that different world, and courage to exercise the leadership and navigate the tough tradeoffs that inevitably arise.

That’s the sense in which the New York Reforming the Energy Vision (REV) proceeding of the New York State Public Service Commission (NY PSC) is compelling and encouraging. Launched in spring 2014 with a staff paper, REV looks squarely at institutional change, aiming to align the regulatory framework and the business model of the distribution utility more closely with these policy objectives and with fostering innovation. As Katherine Tweed summarized the goals in the Greentech Media article linked above,

The report calls for an overhaul of the regulation of the state’s distribution utilities to achieve five policy objectives:

  • Increasing customer knowledge and providing tools that support effective management of their total energy bill
  • Market animation and leverage of ratepayer contributions
  • System-wide efficiency
  • Fuel and resource diversity
  • System reliability and resiliency

The PSC acknowledges that the current ratemaking procedure simply doesn’t work and that the distribution system is not equipped for the changes coming to the energy market. New York is already a deregulated market in which distribution is separated from generation and there is retail choice for electricity. Although that’s a step beyond many states, it is hardly enough for what’s coming in the market.

Last week the NY PSC issued its first order in the REV proceeding, determining that the incumbent distribution utilities will serve as distributed system platform providers (DSPPs) and should start planning accordingly. As RTO Insider noted,

The framework envisions utilities serving a central role in the transition as distributed system platform (DSP) providers, responsible for integrated system planning and grid and market operations.

In most cases, however, utilities will be barred from owning distributed energy resources (DER): demand response, distributed generation, distributed storage and end-use energy efficiency.

The planning function will be reflected in the utilities’ distributed system implementation plan (DSIP), a multi-year forecast proposing capital and operating expenditures to serve the DSP functions and provide third parties the system information they need to plan for market participation.

A platform business model is not a cut-and-dried thing, though, especially in a regulated industry where regulatory institutions reinforced and perpetuated a vertically integrated model for over a century (a model modified only after generator technology changes in the 1980s led to generation unbundling). Institutional design and market design, the symbiosis of technology and institutions, will have to be front and center if the vertically-integrated, uni-directional delivery model of the 20th century is to evolve into a distribution facilitator of the 21st century.

In fact, the institutional design issues at stake here have been the focus of my research during my sabbatical, so I hope to have more to add to the discussion based on some of my forthcoming work on the subject.

Geoff Manne in Wired on FCC Title II

Friend of Knowledge Problem Geoff Manne had a thorough opinion piece in Wired yesterday on the FCC’s Title II Internet designation. Well worth reading. From the “be careful what you wish for” department:

Title II (which, recall, is the basis for the catch-all) applies to all “telecommunications services”—not just ISPs. Now, every time an internet service might be deemed to transmit a communication (think WhatsApp, Snapchat, Twitter…), it either has to take its chances or ask the FCC in advance to advise it on its likely regulatory treatment.

That’s right—this new regime, which credits itself with preserving “permissionless innovation,” just put a bullet in its head. It puts innovators on notice, and ensures that the FCC has the authority (if it holds up in court) to enforce its vague rule against whatever it finds objectionable.

And that’s even at the much-vaunted edge of the network that such regulation purports to protect.

Asking in advance. Nope, that’s not gonna slow innovation, not one bit …

You should probably raise prices a bit during emergencies

At the Master Resource blog today: “In Defense of Price ‘Gouging’ (lines and shortages are uneconomic, discriminatory).”

In the essay I emphasize the unintended bias that results when consumer demand surges and supplies are tight, as for example when winter storm forecasts lead consumers to rush to the grocery store for bread and milk. Because retailers rarely raise prices in such situations, shortages are the predictable result. The burden of those shortages isn’t spread randomly, however, but rather tends to fall more heavily on certain segments of the population.

When emergencies happen (or are first forecasted) some consumers are readily able to rush to the store while other consumers are not so lucky. The early bird gets the bread and milk and eggs, the late arrival finds little or nothing available….

Households with … “high opportunity costs of shopping,” for example those households with infants or with both parents working full time, were more likely to miss out on the opportunity to buy foods before they ran out. It is easy to see that elderly and mobility-impaired consumers, too, would be more likely to be shut out by any sudden rush of consumers to the store after a disaster.

Higher prices would discourage over-buying and help ensure that useful consumer goods get distributed to more households, not just the households best able to rush to the store.

We can debate how significant the effect is, and I do not argue that raising prices solves all interesting problems, but a modest increase in consumer prices would likely be an improvement.

How about grocery stores imposing a 5 percent storm surcharge that goes to charity, with an ability to opt out? Maybe a sign next to the bread shelves saying “Help those most hurt by the storm: We recommend you donate 25 cents to [name of charity] for every loaf of bread you buy. Our checkout staff will be happy to help you with your donation!”

While I have targeted many complaints at anti-price-gouging laws here at Knowledge Problem, in the Master Resource post I broaden my focus a bit to encompass the consumer sentiment against price increases during emergencies. We need a social change to enable markets to work as well as they can in such situations. More effective markets help by reducing the scope of the problems left to be addressed by charitable work (as Dwight Lee argues in his essay in Regulation magazine; see the related discussion and link in the Master Resource post).

The quoted part above relies in part on research on consumer purchases in Japan after the Great East Japan Earthquake of March 2011: Naohito Abe, Chiaki Moriguchi, and Noriko Inakura, “The Effects of Natural Disasters on Prices and Purchasing Behaviors: The Case of Great East Japan Earthquake,” DP14-1, RCESR Discussion Paper Series, September 2014.

How can the market price of oil fall so far so fast?

If the oil market is reasonably efficient, then the price of a barrel of oil should reflect something like the cost of production of the highest-cost barrel of oil needed to just satisfy demand. In other words, the market price of oil should reflect the marginal cost of production.

The price of oil on the world market was about $110 per barrel in June 2014 and now sits just under $50 per barrel. Can it be possible that the marginal cost of producing oil was $110 per barrel in June 2014 and is only $50 per barrel in January 2015?

Yes.

Here is how: in the first half of June 2014 oil consumption was very high relative to the then-existing world oil production capability. In addition, existing oil production capability is always declining as producing fields deplete. The marginal cost of a barrel of oil under such tight market conditions has to cover the capital cost of developing new resources as well as the operating costs.

Toward the end of 2014 additions to world oil production capability exceeded growth in consumption, meaning additions to production capability were no longer necessary, and so the marginal cost of producing the last barrel of oil no longer needed to cover that capital cost. Sure, some oil company somewhere had to make the capital investment necessary to develop the resource, but most of those costs are sunk, and competition in the market means producers cannot force consumers to cover them. The market price under today’s looser market conditions needs to cover only the operating costs of production.

Given the large sunk cost component of investment in developing oil production capability, it is quite possible that the oil market was efficient at $110 per barrel and remains operating efficiently today with prices under $50 per barrel.
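The merit-order logic above can be sketched numerically. The cost and volume figures below are purely illustrative assumptions (not actual oil-market data), chosen to show how the clearing price jumps to the cost of newly developed capacity only when existing capacity cannot cover demand:

```python
# Hypothetical merit-order sketch of the argument above.
# All volumes and costs are illustrative, not real oil-market data.

def clearing_price(capacity_blocks, demand):
    """Sort supply blocks by marginal cost and return the cost of the
    last block needed to satisfy demand (the market-clearing price)."""
    blocks = sorted(capacity_blocks, key=lambda b: b["marginal_cost"])
    served = 0.0
    for b in blocks:
        served += b["volume"]
        if served >= demand:
            return b["marginal_cost"]
    raise ValueError("demand exceeds total capacity")

# Existing fields: capital costs are sunk, so marginal cost is operating cost only.
existing = [{"volume": 80, "marginal_cost": 20},
            {"volume": 10, "marginal_cost": 45}]
# New resources: marginal cost must also cover the capital cost of development.
new_resource = [{"volume": 5, "marginal_cost": 110}]

supply = existing + new_resource

# Tight market: demand exceeds existing capacity, so new barrels set the price.
print(clearing_price(supply, demand=92))  # 110
# Loose market: existing capacity suffices, so operating costs set the price.
print(clearing_price(supply, demand=88))  # 45
```

The same supply curve clears at $110 or $45 depending only on where demand falls relative to existing capacity, which is the sense in which both prices can be efficient.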

NOTE: Related data on world oil production and consumption is available in the U.S. Department of Energy’s Short Term Energy Outlook. Commentary prompting this explainer comes from the UC-Berkeley Energy Institute at Haas blog.

When does state utility regulation distort costs?

I suspect the simplest answer to the title question is “always.” Maybe the answer depends on your definition of “distort,” but both the intended and the generally expected consequences of state utility rate regulation have always been to push costs to be something other than what would emerge naturally in the absence of rate regulation.

More substantive, though, is the analysis provided in Steve Cicala’s article in the January 2015 American Economic Review, “When Does Regulation Distort Costs? Lessons from Fuel Procurement in US Electricity Generation.” (Here is an earlier ungated version of the paper.)

Here is a summary from the University of Chicago press release:

A study in the latest issue of the American Economic Review used recent state regulatory changes in electricity markets as a laboratory to evaluate which factors can contribute to a regulation causing a bigger mess than the problem it was meant to fix….

Cicala used data on almost $1 trillion worth of fuel deliveries to power plants to look at what happens when a power plant becomes deregulated. He found that the deregulated plants combined save about $1 billion a year compared to those that remained regulated. This is because a lack of transparency, political influence and poorly designed reimbursement rates led the regulated plants to pursue inefficient strategies when purchasing coal.

The $1 billion that deregulated plants save stems from paying about 12 percent less for their coal because they shop around for the best prices. Regulated plants have no incentive to shop around because their profits do not depend on how much they pay for fuel. They also are looked upon more favorably by regulators if they purchase from mines within their state, even if those mines don’t sell the cheapest coal. To make matters worse, regulators have a difficult time figuring out if they are being overcharged because coal is typically purchased through confidential contracts.

Although power plants that burned natural gas were subject to the exact same regulations as the coal-fired plants, there was no drop in the price paid for gas after deregulation. Cicala attributed the difference to the fact that natural gas is sold on a transparent, open market. This prevents political influences from sneaking through and allows regulators to know when plants are paying too much.

What’s different about the buying strategy of deregulated coal plant operators? Cicala dove deep into two decades of detailed, restricted-access procurement data to answer this question. First, he found that deregulated plants switch to cheaper, low-sulfur coal. This not only saves them money, but also allows them to comply with environmental regulations. On the other hand, regulated plants often comply with regulations by installing expensive “scrubber” technology, which allows them to make money from the capital improvements.

“It’s ironic to hear supporters of Eastern coal complain about ‘regulation’: they’re losing business from the deregulated plants,” said Cicala, a scholar at the Harris School of Public Policy.

Deregulated plants also increase purchases from out-of-state mines by about 25 percent. As mentioned, regulated plants are looked upon more favorably if they buy from in-state mines. Finally, deregulated plants purchase their coal from more productive mines (coal seams are thicker and closer to the surface) that require about 25 percent less labor to extract from the ground and that pay 5 percent higher wages.

“Recognizing that there are failures in financial markets, health care markets, energy markets, etc., it’s critical to know what makes for ‘bad’ regulations when designing new ones to avoid making the problem worse,” Cicala said. [Emphasis added.]
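The reported figures of $1 billion in annual savings and 12 percent lower coal prices imply a rough scale for the deregulated fleet’s fuel spending. A back-of-envelope check (the implied spend below is my arithmetic, not a figure from the paper):

```python
# Back-of-envelope check on the press release's numbers.
# The implied annual spend is derived here, not reported in the study.

savings_rate = 0.12      # deregulated plants pay ~12% less for coal
annual_savings = 1e9     # ~$1 billion per year in total savings

# If paying 12% less saves $1B/year, the counterfactual (regulated-price)
# fuel bill for the deregulated fleet would be roughly:
implied_spend = annual_savings / savings_rate

print(f"implied counterfactual spend: ${implied_spend / 1e9:.1f} billion/year")
```

That order of magnitude (several billion dollars of coal purchases per year) is consistent with the nearly $1 trillion of fuel deliveries over two decades that the study examines.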

Moody’s concludes: mass grid defection not yet on the horizon

Yes, solar power systems are getting cheaper and battery storage is improving. The combination has many folks worried (or elated) about the future prospects of grid-based electric utilities when consumers can get the power they want at home. (See Lynne’s post from last summer for background.)

An analysis by Moody’s concludes that battery storage costs remain an order of magnitude too high, so grid defection is not yet a demonstrable threat. Analysis of consumer power-use data leads them to project a need for a larger home system than other analysts have assumed. Moody’s further suggests that consumers will be reluctant to make the required lifestyle changes (frequent monitoring of battery levels, forced conservation during extended low-solar periods), so grid defection may be even slower than the simple engineering economics computation would suggest.
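The “simple engineering economics computation” can be sketched as follows. Every input below (daily load, autonomy days, depth of discharge, cost per kWh) is an illustrative assumption of mine, not a figure from the Moody’s analysis:

```python
# Illustrative off-grid battery sizing sketch. All inputs are hypothetical
# round numbers, not Moody's assumptions.

daily_load_kwh = 30.0         # assumed household daily consumption
autonomy_days = 3             # days of backup for a low-solar stretch
depth_of_discharge = 0.8      # usable fraction of battery capacity
battery_cost_per_kwh = 400.0  # assumed installed cost, $/kWh

required_kwh = daily_load_kwh * autonomy_days / depth_of_discharge
battery_cost = required_kwh * battery_cost_per_kwh

print(f"battery size: {required_kwh} kWh, cost: ${battery_cost:,.0f}")
```

The point of the sketch is the sensitivity: requiring more days of autonomy, or sizing for a larger household load as Moody’s does, scales the battery cost linearly, which is why defection economics look worse once realistic reliability expectations are built in.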

COMMENT: I’ll project that in a world of widespread consumer power defections, we will see two developments to help consumers avoid forced conservation. Nobody will have to miss watching Super Bowl LXXX because it was cloudy the week before in Boston. First, plug-in hybrid vehicle hook-ups, so that home batteries can be recharged by the consumer’s gasoline or diesel engine. Second, home battery service companies providing similar mobile recharge services (or hot-swapping home battery systems, etc.). Who knows, in a world of widespread defection, maybe the local electric company will offer spot recharge services at a market-based rate?

[HT to Clean Beta]

Tragedy of the commons, Yugoslavian apartment building laundry room edition

The tragedy of the commons story is well known and examples abound, but I still enjoy finding new examples in unexpected places. Here is one such example, first published in 1992 but new to me.

The building referred to is an apartment building in Yugoslavia; the time described isn’t exactly identified in the article, but is perhaps the 1960s or early 1970s:

In the cellar of our building there was a washing room with a huge concrete washing basin and three new washing machines. At the beginning, everyone washed their clothes downstairs. There was a schedule hung on the door and each family took its turn once a week. The machines didn’t work for long. To put it mildly, people didn’t take very good care of them. After all, these machines didn’t belong to anyone in person, so no one felt responsible for repairing, or even cleaning them. The first machine broke after about a year, then the second one, then the third. In the washing room, people started to store broken chairs, children’s bicycles, beach umbrellas, charcoal for barbecues, skis, mattresses. … The basins were filled with supplies for winter: bags of potatoes, green and red peppers, and wooden barrels of sauerkraut.

We’d lost our common laundry room precisely because it was common. But by that time the standard of living in the country was high enough so, instead of forty people using three common machines, everyone could buy an imported washing machine for themselves, however unnecessary and irrational this really was. Even our own country started to produce them, except that they all were very expensive. This, strangely enough, became a reason to buy one, to prove that you were earning enough, that your social status was high enough, so you could afford household appliances. Social differentiation, starting with cars and TV sets, continued in bathrooms and kitchens. A washing machine became an item of prestige, and it was good for women, even if it really wasn’t meant to ease their lives in the first place.

From “On doing laundry,”
How we survived communism and even laughed, Slavenka Drakulić.

Drakulić’s description of her grandmother doing laundry (in 1950s Yugoslavia) reminded me of Hans Rosling’s TED Talk, “The magic washing machine.”

I have only read a few of the essays in Drakulić’s book, but so far it impresses me as a good collection of sharply-observed and reported essays on life in communist Eastern Europe.