Forthcoming paper: Implications of Smart Grid Innovation for Organizational Models in Electricity Distribution

Back in 2001 I participated in a year-long forum on the future of the electricity distribution model. Convened by the Center for the Advancement of Energy Markets, the DISCO of the Future Forum brought together many stakeholders to develop several scenarios and analyze their implications (and several of those folks remain friends, playmates in the intellectual sandbox, and commenters here at KP [waves at Ed]!). As noted in this 2002 Electric Light and Power article,

Among the 100 recommendations that CAEM discusses in the report, the forum gave suggestions ranging from small issues – that regulators should consider requiring a standard form (or a “consumer label”) on pricing and terms and conditions of service for small customers, to be provided to customers at the time of the initial offer (as well as upon request) – to larger ones, including the suggestions that regulators should establish a standard distribution utility reporting format for all significant distribution upgrades and extensions, and that regulated DISCOs should be permitted to recover their reasonable costs for development of grid interface designs and grid interconnect application review.

“The technology exists to support a competitive retail market responsive to price signals and demand constraints,” the report concludes. “The extent to which the market is opened to competition and the extent to which these technologies are applied by suppliers, DISCOS and customers will, in large part, be determined by state legislatures and regulators.”

Now in 2015, technological dynamism has brought to a head many of the same questions, regulatory models, and business models that we “penciled out” 14 years ago.

In a new paper, forthcoming in the Wiley Handbook of Smart Grid Development, I grapple with these questions: what are the implications of this technological dynamism for the organizational form of the distribution company? What transactions in the vertically-integrated supply chain should be unbundled, what assets should the distribution company own, and what practical policy issues are being tackled in various places around the world as regulators and utilities confront these questions? I analyze these questions using a theoretical framework from the economics of organization and new institutional economics. And I start off with a historical overview of the industry’s technology, regulation, and organizational model.

Implications of Smart Grid Innovation for Organizational Models in Electricity Distribution

Abstract: Digital technologies from outside the electricity industry are prompting changes in both regulatory institutions and electric utility business models, leading to the disaggregation or unbundling of historically vertically integrated electricity firms in some jurisdictions and not others, and simultaneously opening the door for competition with the traditional electric utility business. This chapter uses the technological and organizational history of the industry, combined with the transactions cost theory of the firm and of vertical integration, to explore the implications of smart grid technologies for future distribution company business models. Smart grid technologies reduce transactions costs, changing economical firm boundaries and reducing the traditional drivers of vertical integration. Possible business models for the distribution company include an integrated utility, a network manager, or a coordinating platform provider.

The New York REV and the distribution company of the future

We live in interesting times in the electricity industry. Vibrant technological dynamism, the very dynamism that has transformed how we work, play, and live, puts increasing pressure on the early-20th-century physical network, regulatory model, and resulting business model of the vertically-integrated distribution utility.

While the utility “death spiral” rhetoric is overblown, these pressures are real. They reflect the extent to which regulatory and organizational institutions, as well as the architecture of the network, are incompatible with a general social objective of not obstructing such innovation. Reinforcing my innovation-focused claim is the addition of relatively new environmental objectives to the policy mix. Innovation, particularly innovation at the distribution edge, is an expression of human creativity that furthers both the older economic policy objective of protecting consumers from concentrations of market power and the newer environmental policy objective of a cleaner and more prosperous energy future.

But institutions change slowly, especially bureaucratic institutions where decision-makers have a stake in the direction and magnitude of institutional change. Institutional change requires imagination to see a different world as possible, practical vision to see how to get from today’s reality toward that different world, and courage to exercise the leadership and navigate the tough tradeoffs that inevitably arise.

That’s the sense in which the New York Reforming the Energy Vision (REV) proceeding of the New York State Public Service Commission is compelling and encouraging. Launched in spring 2014 with a staff paper, REV is looking squarely at institutional change to align the regulatory framework and the business model of the distribution utility more closely with these policy objectives and with fostering innovation. As Katherine Tweed summarized the goals in a Greentech Media article,

The report calls for an overhaul of the regulation of the state’s distribution utilities to achieve five policy objectives:

  • Increasing customer knowledge and providing tools that support effective management of their total energy bill
  • Market animation and leverage of ratepayer contributions
  • System-wide efficiency
  • Fuel and resource diversity
  • System reliability and resiliency

The PSC acknowledges that the current ratemaking procedure simply doesn’t work and that the distribution system is not equipped for the changes coming to the energy market. New York is already a deregulated market in which distribution is separated from generation and there is retail choice for electricity. Although that’s a step beyond many states, it is hardly enough for what’s coming in the market.

Last week the NY PSC issued its first order in the REV proceeding, directing that the incumbent distribution utilities will serve as distributed system platform providers (DSPPs) and should start planning accordingly. As noted by RTO Insider,

The framework envisions utilities serving a central role in the transition as distributed system platform (DSP) providers, responsible for integrated system planning and grid and market operations.

In most cases, however, utilities will be barred from owning distributed energy resources (DER): demand response, distributed generation, distributed storage and end-use energy efficiency.

The planning function will be reflected in the utilities’ distributed system implementation plan (DSIP), a multi-year forecast proposing capital and operating expenditures to serve the DSP functions and provide third parties the system information they need to plan for market participation.

A platform business model is not a cut-and-dried thing, though, especially in a regulated industry where regulatory institutions reinforced and perpetuated a vertically integrated model for over a century (a model modified in any meaningful way only when generator technological change in the 1980s led to generation unbundling). Institutional design and market design, the symbiosis of technology and institutions, will have to be front and center if the vertically-integrated, uni-directional delivery model of the 20th century is to evolve into a distribution facilitator for the 21st century.

In fact, the institutional design issues at stake here have been the focus of my research during my sabbatical, so I hope to have more to add to the discussion based on some of my forthcoming work on the subject.

Geoff Manne in Wired on FCC Title II

Friend of Knowledge Problem Geoff Manne had a thorough opinion piece in Wired yesterday on the FCC’s Title II Internet designation. Well worth reading. From the “be careful what you wish for” department:

Title II (which, recall, is the basis for the catch-all) applies to all “telecommunications services”—not just ISPs. Now, every time an internet service might be deemed to transmit a communication (think WhatsApp, Snapchat, Twitter…), it either has to take its chances or ask the FCC in advance to advise it on its likely regulatory treatment.

That’s right—this new regime, which credits itself with preserving “permissionless innovation,” just put a bullet in its head. It puts innovators on notice, and ensures that the FCC has the authority (if it holds up in court) to enforce its vague rule against whatever it finds objectionable.

And that’s even at the much-vaunted edge of the network that such regulation purports to protect.

Asking in advance. Nope, that’s not gonna slow innovation, not one bit …

FCC Title II and raising rivals’ costs

As the consequences of the FCC vote to classify the Internet as a Title II service start to sink in, here are a couple of good commentaries you may not have seen. Jeffrey Tucker’s political economy analysis of the Title II vote as a power grab is one of the best overall analyses of the situation.

The incumbent rulers of the world’s most exciting technology have decided to lock down the prevailing market conditions to protect themselves against rising upstarts in a fast-changing market. To impose a new rule against throttling content or using the market price system to allocate bandwidth resources protects against innovations that would disrupt the status quo.

What’s being sold as economic fairness and a wonderful favor to consumers is actually a sop to industrial giants who are seeking untrammeled access to your wallet and an end to competitive threats to market power.

What’s being sold as keeping the Internet neutral for innovation at the edge of the network substantively does so by encasing the existing Internet structure and institutions in amber, which yields rents for its large incumbents. Some of those incumbents, like Comcast and Time-Warner, achieved their current market power (often resented by their customers) not through rivalrous market competition, but through receiving municipal monopoly cable franchises. Yes, these restrictions raise their costs too, but as large incumbents they are better positioned to absorb those costs than smaller ISPs or other entrants would be. It’s naive to believe that regulations of this form will do much other than soften rivalry in the Internet itself.

But is there really that much scope for innovation and dynamism within the Internet? Yes. Not only with technologies, but also with institutions, such as interconnection agreements and peering agreements, which can affect packet delivery speeds. And, as Julian Sanchez noted, the Title II designation takes these kinds of innovations off the table.

But there’s another kind of permissionless innovation that the FCC’s decision is designed to preclude: innovation in business models and routing policies. As Neutralites love to point out, the neutral or “end-to-end” model has served the Internet pretty well over the past two decades. But is the model that worked for moving static, text-heavy webpages over phone lines also the optimal model for streaming video wirelessly to mobile devices? Are we sure it’s the best possible model, not just now but for all time? Are there different ways of routing traffic, or of dividing up the cost of moving packets from content providers, that might lower costs or improve quality of service? Again, I’m not certain—but I am certain we’re unlikely to find out if providers don’t get to run the experiment.

You should probably raise prices a bit during emergencies

At the Master Resource blog today: “In Defense of Price ‘Gouging’ (lines and shortages are uneconomic, discriminatory).”

In the essay I emphasize the unintended bias that results when consumer demand surges and supplies are tight, as for example when winter storm forecasts lead consumers to rush to the grocery store for bread and milk. Because retailers rarely raise prices in such situations, shortages are the predictable result. The burden of those shortages isn’t spread randomly, however, but rather tends to fall more heavily on certain segments of the population.

When emergencies happen (or are first forecasted) some consumers are readily able to rush to the store while other consumers are not so lucky. The early bird gets the bread and milk and eggs, the late arrival finds little or nothing available….

Households with … “high opportunity costs of shopping,” for example those households with infants or with both parents working full time, were more likely to miss out on the opportunity to buy foods before they ran out. It is easy to see that elderly and mobility-impaired consumers, too, would be more likely to be shut out by any sudden rush of consumers to the store after a disaster.

Higher prices would discourage over-buying and help ensure that useful consumer goods get distributed to more households, not just the households best able to rush to the store.

We can debate how significant the effect is, and I do not argue that raising prices solves all interesting problems, but a modest increase in consumer prices would likely be an improvement.
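To make the rationing logic concrete, here is a minimal sketch with purely hypothetical numbers; the prices, the demand schedule, and the stock on hand are my own illustrative assumptions, not figures from the essay or the underlying research.

```python
def loaves_demanded_per_household(price):
    """Stylized storm-panic demand: households stock up more at lower prices."""
    if price <= 2.00:      # everyday price
        return 3           # grab extra "just in case"
    elif price <= 2.50:    # modest storm markup
        return 2
    return 1

stock = 200        # loaves on the shelf when the storm is forecast
households = 150   # shoppers arriving in order (early birds first)

for price in (2.00, 2.50):
    remaining = stock
    served = 0
    for _ in range(households):
        want = loaves_demanded_per_household(price)
        if remaining >= want:
            remaining -= want
            served += 1
        else:
            break  # later arrivals find an empty shelf
    print(f"price ${price:.2f}: {served} of {households} households get bread")

# price $2.00: 66 of 150 households get bread
# price $2.50: 100 of 150 households get bread
```

The particular numbers do not matter; the direction does. Trimming each household's panic buying even slightly lets the same fixed stock reach more households before the shelf empties.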

How about grocery stores imposing a 5 percent storm surcharge that goes to charity, with an ability to opt out? Maybe a sign next to the bread shelves saying “Help those most hurt by the storm: We recommend you donate 25 cents to [name of charity] for every loaf of bread you buy. Our checkout staff will be happy to help you with your donation!”

While I have targeted many complaints at anti-price-gouging laws here at Knowledge Problem, in the Master Resource post I broaden my focus a bit to encompass the consumer sentiment against price increases during emergencies. We need a change in social attitudes to let markets work as well as they can in such situations. More effective markets help by reducing the scope of the problems left to be addressed by charitable work (as Dwight Lee argues in his essay in Regulation magazine – see the related discussion and link in the Master Resource post).

The quoted passage above relies in part on research on consumer purchases in Japan after the Great East Japan Earthquake of March 2011: Naohito Abe, Chiaki Moriguchi, and Noriko Inakura, “The Effects of Natural Disasters on Prices and Purchasing Behaviors: The Case of Great East Japan Earthquake,” DP14-1, RCESR Discussion Paper Series, September 2014.

How can the market price of oil fall so far so fast?

If the oil market is reasonably efficient, then the price of a barrel of oil should reflect something like the cost of production of the highest-cost barrel of oil needed to just satisfy demand. In other words, the market price of oil should reflect the marginal cost of production.

The price of oil on the world market was about $110 per barrel in June 2014 and now sits just under $50 per barrel. Can it be possible that the marginal cost of producing oil was $110 per barrel in June 2014 and is only $50 per barrel in January 2015?

Yes.

Here is how: in the first half of June 2014 oil consumption was very high relative to the then-existing world oil production capability. In addition, existing oil production capability is always declining as producing fields deplete. The marginal cost of a barrel of oil under such tight market conditions has to cover the capital cost of developing new resources as well as the operating costs.

Toward the end of 2014, additions to world oil production capability exceeded growth in consumption, so further additions to capability were no longer necessary and the marginal cost of producing the last barrel no longer needed to cover that capital cost. Sure, some oil company somewhere had to make the capital investment necessary to develop the resource, but most of those costs are now sunk, and competition in the market means producers cannot force consumers to cover them. The market price under today’s looser market conditions needs to cover only the operating costs of production.

Given the large sunk-cost component of investment in developing oil production capability, it is quite possible that the oil market was operating efficiently at $110 per barrel and remains efficient today with prices under $50 per barrel.
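To see how the arithmetic can work, here is a minimal sketch with entirely made-up cost and capacity figures; the supply stack, the capital-recovery charge, and the demand levels are illustrative assumptions, not actual industry data.

```python
# Each source: (name, operating cost $/bbl, capital recovery $/bbl still to be earned).
# Existing wells have already sunk their capital, so their capital-recovery charge is zero.
supply_stack = [
    ("existing conventional", 20, 0),
    ("existing deepwater",    35, 0),
    ("existing shale",        45, 0),
    ("new shale project",     50, 60),  # must still recover drilling capital
]

def clearing_price(barrels_demanded, capacity_per_source):
    """Price = full cost of the most expensive source needed to just meet demand."""
    supplied = 0
    for name, opex, capital_recovery in supply_stack:
        supplied += capacity_per_source
        if supplied >= barrels_demanded:
            return name, opex + capital_recovery
    raise ValueError("demand exceeds total capacity")

# Tight market (mid-2014): demand reaches into new, not-yet-built projects.
print(clearing_price(barrels_demanded=95, capacity_per_source=25))
# ('new shale project', 110)

# Loose market (early 2015): existing capacity alone covers demand.
print(clearing_price(barrels_demanded=70, capacity_per_source=25))
# ('existing shale', 45)
```

When demand reaches into not-yet-built projects, the clearing price carries the capital-recovery charge; once existing capacity alone covers demand, that charge drops out and operating cost alone sets the price.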

NOTE: Related data on world oil production and consumption is available in the U.S. Department of Energy’s Short Term Energy Outlook. Commentary prompting this explainer comes from the UC-Berkeley Energy Institute at Haas blog.

When does state utility regulation distort costs?

I suspect the simplest answer to the title question is “always.” Maybe the answer depends on your definition of “distort,” but both the intended and the generally expected consequences of state utility rate regulation have always been to push costs to be something other than what would naturally emerge in the absence of rate regulation.

More substantive, though, is the analysis provided in Steve Cicala’s article in the January 2015 American Economic Review, “When Does Regulation Distort Costs? Lessons from Fuel Procurement in US Electricity Generation” (here is an earlier ungated version of the paper).

Here is a summary from the University of Chicago press release:

A study in the latest issue of the American Economic Review used recent state regulatory changes in electricity markets as a laboratory to evaluate which factors can contribute to a regulation causing a bigger mess than the problem it was meant to fix….

Cicala used data on almost $1 trillion worth of fuel deliveries to power plants to look at what happens when a power plant becomes deregulated. He found that the deregulated plants combined save about $1 billion a year compared to those that remained regulated. This is because a lack of transparency, political influence and poorly designed reimbursement rates led the regulated plants to pursue inefficient strategies when purchasing coal.

The $1 billion that deregulated plants save stems from paying about 12 percent less for their coal because they shop around for the best prices. Regulated plants have no incentive to shop around because their profits do not depend on how much they pay for fuel. They also are looked upon more favorably by regulators if they purchase from mines within their state, even if those mines don’t sell the cheapest coal. To make matters worse, regulators have a difficult time figuring out if they are being overcharged because coal is typically purchased through confidential contracts.

Although power plants that burned natural gas were subject to the exact same regulations as the coal-fired plants, there was no drop in the price paid for gas after deregulation. Cicala attributed the difference to the fact that natural gas is sold on a transparent, open market. This prevents political influences from sneaking through and allows regulators to know when plants are paying too much.

What’s different about the buying strategy of deregulated coal plant operators? Cicala dove deep into two decades of detailed, restricted-access procurement data to answer this question. First, he found that deregulated plants switch to cheaper, low-sulfur coal. This not only saves them money, but also allows them to comply with environmental regulations. On the other hand, regulated plants often comply with regulations by installing expensive “scrubber” technology, which allows them to make money from the capital improvements.

“It’s ironic to hear supporters of Eastern coal complain about ‘regulation’: they’re losing business from the deregulated plants,” said Cicala, a scholar at the Harris School of Public Policy.

Deregulated plants also increase purchases from out-of-state mines by about 25 percent. As mentioned, regulated plants are looked upon more favorably if they buy from in-state mines. Finally, deregulated plants purchase their coal from more productive mines (coal seams are thicker and closer to the surface) that require about 25 percent less labor to extract from the ground and that pay 5 percent higher wages.

“Recognizing that there are failures in financial markets, health care markets, energy markets, etc., it’s critical to know what makes for ‘bad’ regulations when designing new ones to avoid making the problem worse,” Cicala said. [Emphasis added.]