Permissionless innovation in electricity: the benefits of experimentation

Last Monday I was scheduled to participate in the Utility Industry of the Future Symposium at the NYU Law School. Risk aversion about getting back in time for Tuesday classes in the face of a forecast 7″ snowfall in New York kept me from attending (and the snow never materialized, which makes the cost even more bitter!), so I missed out on the great talks and panels. But I’ve edited my remarks into the essay below, with helpful comments and critical readings from Mark Silberg and Jim Speta. Happy thinking!

If you look through the lens of an economist, especially an economic historian, the modern world looks marvelous – innovation enables us to live very different lives than even 20 years ago, lives that are richer in experience and value in many ways. We are surrounded by dynamism, by the change arising from creativity, experimentation, and new ideas. The benefits of such dynamism are cumulative and compound upon each other. Economic history teaches us that well-being emerges from the compounding of incremental changes over time, until two decades later you look at your old, say, computer and you wonder that you ever accomplished anything that way at all.

The digital technology that allows us to flourish in unanticipated ways, large and small, is an expression of human creativity in an environment in which experimentation is rife and entry barriers are low. That combination of experimentation and low entry barriers is what has made the Internet such a rich, interesting, useful platform for us to use to make ourselves better off, in the different ways and meanings we each have.

And yet very little (if any) of this dynamism has originated in the electricity industry, and little of it has affected how most people transact in and engage with electricity. Digital technologies now exist that consumers could use to observe and manage their electricity consumption in a more timely way than after the fact, at the end of the month, and to transact for services they value – different pricing, different fuel sources, and automated consumption responses to changes in them. From the service convergence in telecom (“triple play”) we have experimented with and learned the value of bundling. Bundles of retail electricity service with home entertainment, home security, and the like are services that companies like ADT and Verizon are exploring, but they have been extremely slow to develop and have not been commercialized yet, due to the combination of regulatory entry barriers that restrict producers and the customer inertia those barriers reinforce. All of these examples – of technologies, of pricing, of bundling – are examples of stalled, foregone innovation in this space.

Although we do not observe it directly, the cost of foregone innovation is high. Today residential consumers still generally have low-cost, plain-vanilla commodity electricity service, with untapped potential to create new value beyond basic service. Producers earn guaranteed, regulation-constrained profits by providing these services, and the persistence of regulated “default service contracts” in nominally competitive states is an entry barrier facing producers that might otherwise experiment with new services, pricing, and bundles. If producers don’t experiment, consumers can’t experiment, and thus both parties suffer the cost of foregone innovation – consumers lose the opportunity to choose services they may value more, and producers lose the opportunity to profit by providing them. By (imperfect) analogy, think about what your life would be like if Apple had not been allowed to set up retail stores that enable consumers to engage in learning while shopping. It would be poorer (and that’s true even if you don’t own any Apple devices, because the experimentation, learning, and low entry barriers benefit you too, by encouraging new products and entry).

This process of producer and consumer experimentation and learning is the essence of how we create value through exchange and market processes. What Internet pioneer Vint Cerf calls permissionless innovation, what writer Matt Ridley calls ideas having sex — these are the processes by which we humans create, strive, learn, adapt, and thrive.

But regulation is a permission-based system, and regulation slows or stifles innovation in electricity by cutting off this permissionless innovation. Legal entry barriers, the bureaucratic procedures for cost recovery, the risk aversion of both regulator and regulated, all undermine precisely the processes that enable innovation to yield consumer benefits and producer profits. In this way regulation that dictates business models and entry barriers discourages activities that benefit society, that are in the public interest.

The question of public interest is of course central to any analysis of electricity regulation’s effects. Our current model of utility regulation has been built on the late 19th century idea that cost-based regulation and restricting entry would make reliable electric service ubiquitous and as cheap as is feasible. Up through the 1960s, while exploiting the economies of scale and scope in the conventional mechanical technologies, that concept of the public interest was generally beneficial. But by so doing, utility regulation entrenched “iron in the ground” technologies in the bureaucratic process. It also entrenched an attitude and a culture of prudential preference for those conventional technologies on the part of both regulator and regulated.

This entrenchment becomes a problem because the substance of what constitutes the public interest is not static. It has changed since the late 19th century, as has so much in our lives, and it has changed to incorporate the dimension of environmental quality as we have learned of the environmental effects of fossil fuel consumption. But the concept of the public interest – central generation and low prices – that is fossilized in regulatory rules does not reflect that change. I argue that the “Rube Goldberg” machine accretion of renewable portfolio standards (RPS), tax credits, and energy efficiency mandates onto regulated utilities reflects just how poorly situated the traditional regulated environment is to adapt to the largely unforeseeable changes arising from the combination of dynamic economic and environmental considerations. Traditional regulation is not flexible enough to be adaptive.

The other entrenchment that we observe with regulation is the entrenchment of interests. Even if regulation was initiated as a mechanism for protecting consumer interests, in the administrative and legal process it creates entrenched interests in maintaining the legal and technological status quo. What we learn from public choice theory, and what we observe in regulated industries including electricity, is that regulation becomes industry-protecting regulation. Industry-protecting regulation cultivates constituency interests, and those constituency interests generally prefer to thwart innovation and retain entry barriers to restrict interconnection and third-party and consumer experimentation. This political economy dynamic contributes to the stifling of innovation.

As I’ve been thinking through this aloud with you, you’ve probably been thinking “but what about reliability and permissionless innovation – doesn’t the physical nature of our interconnected network necessitate permission to innovate?” In the centralized electro-mechanical T&D network that is more true, and in such an environment regulation provides stability of investments and returns. But again we see the cost of foregone innovation staring us in the face. Digital switches, open interconnection and interoperability standards (that haven’t been compromised by the NSA), and more economical small-scale generation are innovations that make high reliability in a resilient distributed system more possible (for example, a “system of systems” of microgrids and rooftop solar and EVs). Those are the types of conditions that hold in the Internet – digital switches, traffic rules, TCP-IP and other open data protocols — and as long as innovators abide by those physical rules, they can enter, enabling experimentation, trial and error, and learning.

Thus I conclude that for electricity policy to focus on facilitating what is socially beneficial, it should focus on clear, transparent, and just physical rules for the operation of the grid, on reducing entry barriers that prevent producer and consumer experimentation and learning, and on enabling a legal and technological environment in which consumers can use competition and technology to protect themselves.

Interpreting Google’s purchase of Nest

Were you surprised to hear of Google’s acquisition of Nest? Probably not; nor was I. Google has long been interested in energy monitoring technologies and the effect that access to energy information can have on individual consumption decisions. In 2009 it introduced PowerMeter, an energy monitoring and visualization tool; I wrote about it a few times, including it on my list of devices for creating intelligence at the edge of the electric power network. Google discontinued it in 2011 (and I think Martin LaMonica is right that its demise showed the difficulty of competition and innovation in residential retail electricity), but it pointed the way toward transactive energy and what we have come to know as the Internet of things.

In his usual trenchant manner, Alexis Madrigal at the Atlantic gets at what I think is the real value opportunity that Google sees in Nest: automation and machine-to-machine communication to carry out our desires. He couches it in terms of robotics:

Nest always thought of itself as a robotics company; the robot is just hidden inside this sleek Appleish case.

Look at who the company brought in as its VP of technology: Yoky Matsuoka, a roboticist and artificial intelligence expert from the University of Washington.

In an interview I did with her in 2012, Matsuoka explained why that made sense. She saw Nest positioned right in a place where it could help machine and human intelligence work together: “The intersection of neuroscience and robotics is about how the human brain learns to do things and how machine learning comes in to augment that.”

I agree that it is an acquisition to expand their capabilities to do distributed sensing and automation. Thus far Nest’s concept of sensing has been behavioral — when do you use your space and how do you use it — and not transactive. Perhaps that can be a next step.

The Economist also writes this week about the acquisition, comparing Google’s acquisitions and evolution to GE’s in the 20th century. The article touches on the three most important aspects of this acquisition: the robotics that Alexis analyzed; the data generated and made accessible to Google for advertising purposes; and the design talent at Nest, which can contribute to the growing interest in the Internet-of-things technologies that make the connected home increasingly feasible and attractive to consumers (and that some of us have been waiting, and waiting, and waiting to see develop):

Packed with sensors and software that can, say, detect that the house is empty and turn down the heating, Nest’s connected thermostats generate plenty of data, which the firm captures. Tony Fadell, Nest’s boss, has often talked about how Nest is well-positioned to profit from “the internet of things”—a world in which all kinds of devices use a combination of software, sensors and wireless connectivity to talk to their owners and one another.

Other big technology firms are also joining the battle to dominate the connected home. This month Samsung announced a new smart-home computing platform that will let people control washing machines, televisions and other devices it makes from a single app. Microsoft, Apple and Amazon were also tipped to take a lead there, but Google was until now seen as something of a laggard. “I don’t think Google realised how fast the internet of things would develop,” says Tim Bajarin of Creative Strategies, a consultancy.

Buying Nest will allow it to leapfrog much of the opposition. It also brings Google some stellar talent. Mr Fadell, who led the team that created the iPod while at Apple, has a knack for breathing new life into stale products. His skills and those of fellow Apple alumni at Nest could be helpful in other Google hardware businesses, such as Motorola Mobility.

Are we finally about to enter a period of energy consumption automation and transactive energy? This acquisition is a step in that direction.

Economist debate on solar power

The Economist often runs debates on their website, and their current one will be of interest to the KP community: Can solar energy save the world?

The debate is structured in a traditional manner, with a moderator and a proposer and a responder. Guest posts accompany the debate, and readers are invited to comment on each stage of the debate. The two debaters are Richard Swanson, founder of SunPower, and Benny Peiser of the Global Warming Policy Foundation. Geoff Carr, the Economist’s science editor, is moderating the debate.

One common theme among the debaters, the moderator, and the commenters is the distortion introduced by decades of politicized energy policy, which means (among other things) that the complicated web of subsidies across all fuel sources is hard to disentangle. Given the thorough and valuable discussion Mike has provided of his recent analysis of wind power cost estimates, this solar debate is a good complement to that discussion.

“If your toilet’s so smart, how come I can hack it?”

Thus reads the headline on David Meyer’s Gigaom post about news that the Satis toilet, manufactured by the Japanese firm Lixil, comes with a smartphone app that can be used to control any Satis toilet (see also this BBC news article). You may wonder why a toilet needs an app, which is a valid question; this one allows recording of one’s activity (if you so choose …), remote flushing, remote air freshener spray, and remote bidet operation. Subjective utility being what it is, I’ll consider Lixil’s entrepreneurs to be responding to what they perceive as an undersatisfied preference in the market, which the extent of their subsequent profits will (or will not) indicate …

Although the story is scatologically humorous, Meyer’s closing observation hits upon exactly the same point I made recently in my post about the hackability of home management systems:

Of course, it’s not like someone will be exploiting this vulnerability to prank someone a continent away — Bluetooth is a pretty short-range wireless technology. However, it’s the kind of thing that should be borne in mind by manufacturers who are starting to jazz up previously low-tech appliances with new-fangled connectivity.

Because when it comes to security, as Trustwave SpiderLabs and others have warned, the home is the last place you want to be caught with your pants down.

Disruptive innovation and the regulated utility

Over the weekend the New York Times ran a good story about how rooftop solar and regulatory rules allowing net metering are putting pressure on the regulated distribution utility business model:

The struggle over the California incentives is only the most recent and visible dust-up as many utilities cling to their established business, and its centralized distribution of energy, until they can figure out a new way to make money. …

“Net metering right now is the only way for customers to get value for their rooftop solar systems,” said Adam Browning, executive director of the advocacy group Vote Solar.

Mr. Browning and other proponents say that solar customers deserve fair payment not only for the electricity they transmit but for the value that smaller, more dispersed power generators give to utilities. Making more power closer to where it is used, advocates say, can reduce stress on the grid and make it more reliable, as well as save utilities from having to build and maintain more infrastructure and large, centralized generators.

But utility executives say that when solar customers no longer pay for electricity, they also stop paying for the grid, shifting those costs to other customers. Utilities generally make their profits by making investments in infrastructure and designing customer rates to earn that money back with a guaranteed return, set on average at about 10 percent.

In a nutshell, what’s happening is that environmental and global warming policy initiatives are resulting in government subsidies and tax credits for consumer investments in rooftop solar, especially in states like California. As more consumers install rooftop solar, they make less use of the electricity distribution network to receive electricity, and they can put the excess power generated from their solar panels onto the distribution grid (called net metering). Under net metering they receive a per-kilowatt-hour payment that ranges between the averaged, regulated retail rate and the wholesale price of electricity at that time, depending on the net metering rules in operation in that state. From the regulated utility’s perspective, this creates a double whammy. Net metering reduces the amount of electricity sold and distributed over the wires network, which reduces both revenue and the utility’s ability to charge the customer for use of the wires. And because most of the network’s costs are fixed costs and the utility is guaranteed a particular rate of return on those assets, the lost revenue means increasing rates for the other customers who have not installed solar.

Offsetting some of that revenue decrease/fixed cost dilemma is the fact that net metering means that the utility is purchasing power from rooftop solar owners at a price lower than the spot price they would have to pay to purchase power in the wholesale market in that hour (i.e., wholesale price as avoided cost) … except what happens when they have already entered long-term contracts for power and have to pay anyway? And in California, the net metering payment to the customer is the fully-loaded retail rate, not just the energy portion of the rate, so even though the customer is essentially still using the network (to sell excess power to other users via the regulated utility instead of buying it), the utility is not receiving the wires charge portion of the per-kilowatt-hour regulated rate.
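To make the arithmetic of the cost shift concrete, here is a small sketch with invented numbers. It shows fixed wires costs being recovered through a per-kilowatt-hour retail rate (so fewer kWh sold means a higher rate for everyone else), and the difference between crediting exported solar at the fully loaded retail rate versus only the energy (avoided-cost) portion. All figures are hypothetical, chosen only to illustrate the mechanism, not taken from any actual tariff.

```python
# Illustrative net-metering arithmetic; every number below is hypothetical.
# Fixed wires costs are recovered through a per-kWh retail rate, so when
# solar customers buy fewer kWh, the rate charged to everyone else must rise.

FIXED_WIRES_COST = 1_000_000.0   # $/yr the utility must recover regardless of volume
ENERGY_COST = 0.06               # $/kWh wholesale energy cost (assumed)

def retail_rate(kwh_sold):
    """Per-kWh rate that recovers fixed wires costs plus the energy cost."""
    return ENERGY_COST + FIXED_WIRES_COST / kwh_sold

rate_before = retail_rate(20_000_000)   # all customers buy from the grid
rate_after = retail_rate(16_000_000)    # solar adopters buy 20% fewer kWh

print(f"rate before solar adoption: ${rate_before:.4f}/kWh")
print(f"rate after solar adoption:  ${rate_after:.4f}/kWh")

# Crediting exported solar at the fully loaded retail rate (the California
# approach described above) versus only the energy portion (avoided cost):
exported_kwh = 1_000
credit_full_retail = exported_kwh * rate_after
credit_energy_only = exported_kwh * ENERGY_COST
print(f"credit at full retail rate: ${credit_full_retail:.2f}")
print(f"credit at energy-only rate: ${credit_energy_only:.2f}")
```

The gap between the two credit lines is exactly the wires charge the utility forgoes while the exported power still travels over its network.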

Sounds like a mess, right? It sure is. And, as Katie Fehrenbacher pointed out yesterday on Gigaom, the disruption of the regulated electric utility in the same way that Kodak, Blockbuster, and Borders have been disrupted out of existence is not a new idea. In fact, I made the same argument here at KP back in 2003, building on a paper I co-authored for the International Association of Energy Economics meetings in 2002 (and here are other KP posts that both Mike and I have made on net metering). I summarized that paper in this Reason Foundation column, in which I argued

Many technological and market innovations have reduced the natural monopoly rationale for traditional electric industry regulation. For example, consider distributed generation. Distributed generation (DG) is the use of an energy source (gas turbines, gas engines, fuel cells, for example) to generate electricity close to where it will be used. Technological change in the past decade and deregulation in the natural gas industry have made DG an economically viable alternative to buying electricity from a monopoly utility and receiving it over the utility’s transmission and distribution grid. The potential for this competition to discipline a transmission owner’s prices for transmission services is immense, but it still faces some obstacles. …

Technological change and market dynamics have made the natural monopoly model of electricity regulation obsolete. While technological changes and market innovations that shape the electricity industry’s evolution have received some attention, their roles in making natural monopoly regulation of transmission and distribution obsolete have not received systematic treatment. For that reason, the policy debate has focused on creating regional transmission organizations to rationalize grid construction, but has not dug more deeply into the possible benefits of dramatically rethinking the foundations of natural monopoly regulation.

I may have been a bit ahead of my time in making this argument, but the improvements in energy efficiency and production costs for solar technology and the shale gas revolution have made this point even more important.

Think a bit about how the regulated utilities and regulators have come to this point. They have done so by trying to retain much of the physical and legal structure of traditional regulation, and by trying to fold innovation into that structure. Consider the top-down, system-level requirement that the regulated utility purchase excess solar-generated electricity and pay a specific, fixed price for it; the attempts of regulated utilities to block such efforts and to charge high “standby charges” to customers who install distributed generation but want to retain their grid interconnection as an insurance policy; the fact that regulation ensures cost recovery for the wires company, which implies that a reduction in the number of customers means a price increase for those staying on the wires network; and, on top of all that, the subsidies and tax credits to induce residential customers to purchase and install rooftop solar. I don’t think we could design a worse process and set of institutions if we tried.

You may respond that there’s no real alternative, and I’d say you’re wrong. You can see the hint in my remarks above from 2003 — if these states had robust retail competition, then retailers could offer a variety of different contracts, products, and services associated with distributed generation. Wires companies could essentially charge standard per-unit transportation rates (assuming they would still be regulated). In that market design, much of the pressure on the business model of the wires company from distributed generation gets diluted. The wires company would still have to be forward-looking and think (with the regulators) about what increased penetration of distributed generation would mean for the required distribution capacity of the wires network and how to invest in it and recover the costs. But the wires company would be just that, a wires company, and not the party with the retail relationship with the residential customer, so all of these distortions arising from net metering would diminish. If I were a wires company I would certainly use digital meters and monitors to measure the amount of current flow and the direction of current flow, and I would charge a per-kilowatt-hour wires transportation charge regardless of direction of flow, whether the residential customer is consuming or producing. Digital technology makes that granular observation possible, which makes that revenue model possible.
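The direction-agnostic transportation charge described above is simple enough to sketch. This is a toy model with an invented rate, not any actual tariff: the meter records signed hourly flows, and the wires company bills on the absolute value, so a customer pays for network use whether consuming or exporting.

```python
# Sketch of the revenue model described above: a wires company that charges a
# per-kWh transportation rate on metered flow in either direction.
# The rate and the meter readings are hypothetical.

WIRES_RATE = 0.03  # $/kWh transportation charge (assumed)

def wires_charge(hourly_flows_kwh):
    """Signed hourly meter readings: positive = consumption from the grid,
    negative = export to the grid. The wires charge applies to |flow|."""
    return sum(abs(f) for f in hourly_flows_kwh) * WIRES_RATE

# A solar household that consumes in the morning and evening and exports at
# midday still pays for every kWh that crosses the meter:
flows = [1.2, 1.0, -0.8, -1.5, -0.9, 1.4]
print(f"wires charge: ${wires_charge(flows):.2f}")
```

Because revenue now tracks total flow across the meter rather than net retail sales, rising rooftop solar penetration no longer erodes the wires company’s cost recovery.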

That’s why states like California have created such an entangled mess for themselves by retaining the traditional regulated utility structure for integrated distribution and retail and trying to both absorb and incentivize disruptive distributed generation innovation in that traditional structure. Not surprisingly, Texas with its more deregulated and dis-integrated structure has escaped this mess — the only regulated entity is the wires (transmission and distribution) company, and retailers are free to offer residential customers compensation for any excess generation from distributed renewable generation sources, at a price mutually agreed upon between the retailer and the customer in their contract. In fact, Green Mountain Energy offers such a contract to residential customers in Texas. See how much easier that is than what is happening in California?

Honey, someone hacked our smart home

Ever since the first “vision” meeting I attended at the Department of Energy in 2003 about the technologically advanced electric power grid of the future, digital network security in a smart grid has been a paramount concern. Much of the concern emphasizes hardening the electrical and communication networks against nefarious attempts to access control rooms or substations. Less attention goes to the security of the home automation system itself.

Here’s why privacy and security issues matter so much in customer-facing smart grid products and services: how likely is it that someone can hack into your home energy management system? The resourceful technology and privacy journalist Kashmir Hill gained access to eight homes, merely by doing an Internet search to see if any homes had their devices set to be discoverable by a search engine:

Googling a very simple phrase led me to a list of “smart homes” that had done something rather stupid. The homes all have an automation system from Insteon that allows remote control of their lights, hot tubs, fans, televisions, water pumps, garage doors, cameras, and other devices, so that their owners can turn these things on and off with a smartphone app or via the Web. The dumb thing? Their systems had been made crawl-able by search engines – meaning they show up in search results — and due to Insteon not requiring user names and passwords by default in a now-discontinued product, I was able to click on the links, giving me the ability to turn these people’s homes into haunted houses, energy-consumption nightmares, or even robbery targets. Opening a garage door could make a house ripe for actual physical intrusion.

In this instance, early adopters of a now-discontinued home automation system had not changed their default settings to implement security protocols. They had not followed the simple practices we have become habituated to with our home wireless networks, which most of us now routinely secure with at least a password. This security hurdle doesn’t seem very high, and it shouldn’t be; securing a home automation system separately with a username/password login is not difficult, and can be made even easier for the technologically challenged through helpful customer service.
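A minimal sketch of what secure-by-default might look like in such a system: the controller simply refuses to enable remote access until the owner replaces the factory credentials. The class, field names, and rules here are invented for illustration; this is not Insteon’s actual design.

```python
# Secure-by-default sketch (hypothetical design, not any vendor's actual API):
# remote access is blocked until the factory credentials are replaced.

DEFAULT_USER = "admin"
DEFAULT_PASSWORD = ""

class HomeAutomationController:
    def __init__(self):
        self.username = DEFAULT_USER
        self.password = DEFAULT_PASSWORD
        self.remote_access = False

    def set_credentials(self, username, password):
        # Arbitrary illustrative policy: non-empty username, 8+ char password.
        if not username or len(password) < 8:
            raise ValueError("username required; password must be 8+ characters")
        self.username = username
        self.password = password

    def enable_remote_access(self):
        # The key design choice: factory credentials block remote access
        # entirely, so forgetting to read the manual fails safe.
        if (self.username, self.password) == (DEFAULT_USER, DEFAULT_PASSWORD):
            raise PermissionError("set a username and password before going online")
        self.remote_access = True
```

Calling `enable_remote_access()` before `set_credentials()` raises an error, which is the point: the safe state is the default, and the user must act to leave it rather than act to reach it.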

She goes on in the story to relate her interactions with some of the people whose houses she was able to access, as well as her discussion with people at Insteon:

Insteon chief information officer Mike Nunes says the systems that I’m seeing online are from a product discontinued in the last year. He blamed user error for the appearance in search results, saying the older product was not originally intended for remote access, and to set this up required some savvy on the users’ part. The devices had come with an instruction manual telling users how to put the devices online which strongly advised them to add a username and password to the system. (But, really, who reads instruction manuals closely?)

“This would require the user to have chosen to publish a link (IP address) to the Internet AND for them to have not set a username and password,” says Nunes. I told Nunes that requiring a username/password by default is good security-by-design to protect people from making a mistake like this. “It did not require it by default, but it supported it and encouraged it,” he replied.

One of the interesting aspects of her story (and you get a much deeper sense of it reading the whole article) is the extent to which these early adopters/automation hobbyists identified some but not all of the potential security holes in the home automation system. These are eager, knowledgeable consumers, and even they did not realize that some ports on the router were left open and thus made the system discoverable externally.

I think she’s right that for such technologies in such sensitive applications as home automation, default username/password authentication is good design. This is an application in which I think the behavioral economics arguments about setting defaults to overcome inertia bias are valid.

Insteon has since changed their default settings to require username/password authentication on the automation system separate from the home wireless network authentication, and the rest of the article describes some other companies that are working to close security holes in their home automation systems.

As we extend the smart grid into our homes and the “Internet of things” becomes more deeply embedded in our lives, being aware of the value of securing our privacy, and of reducing the risk of unauthorized access to our homes and the devices and appliances in them, becomes more important. The digital rules we apply to our financial transactions should guide our privacy and security awareness and decisions on our home networks too. That way we can enjoy the benefits of home automation and transactive energy that Hill lays out in her article while minimizing the risk of unauthorized access to our homes and our information.

Nest and technology-service bundling

Lynne Kiesling


Nest’s recent business developments are refreshing and promising. Building on the popularity of its elegant and easy-to-use learning thermostat in its first couple of years, Nest is introducing new Nest-enabled services to automate changes in settings and energy use in the home. Called Rush Hour Rewards and Seasonal Savings, Nest claims:

Rush Hour Rewards could help you earn anywhere from $20-$60 this summer—it takes advantage of energy company incentives that pay you to use less energy when everyone else is using more. Seasonal Savings takes everything Nest has learned about you and automatically fine-tunes Nest’s schedule to save energy, without sacrificing comfort. Field trials have been impressive: Nest owners have used 5-10% less heating and cooling with Seasonal Savings and 80% said they’d keep their tuned-up schedules after Seasonal Savings ended.

The ever-incisive Katie Fehrenbacher calls their move a bundling of its “smart thermostat with data-driven services”, which sounds about right to me.

Behind these new services are the cloud-based big-data algorithms that are the secret sauce of Nest, which Nest has now named Auto-Tune. Now that Nest has gotten hundreds of thousands of thermostats into the market, and has done two years of field trials, it has been able to collect a large amount of data about how customers use and react to temperature and cooling changes. Nest uses this data about behavioral changes to inform its services and how its algorithms work.

She also remarks on something I noticed — in its marketing of its new services Nest assiduously avoids the phrase “demand response”, instead saying “New features save energy & make money. Automatically.” Once you get beyond the elegant interface, the thoughtful network and device connectivity, and the “secret sauce” algorithms, Rush Hour Rewards is little more than standard, administered, regulator-approved direct load control. But Nest’s elegance, marketing, and social-media-savvy outreach may make it more widespread and appealing than any number of regulator-approved bill inserts about AC cycling have over the decades.
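Stripped of the elegant packaging, the direct load control underneath a program like this is a very simple rule: during a utility-called event, nudge the cooling setpoint up a few degrees to shed air-conditioning load. The sketch below is a toy with invented numbers, not Nest’s actual algorithm.

```python
# Toy direct-load-control rule (invented offsets, not Nest's actual logic):
# during a utility event, raise the cooling setpoint a few degrees, capped so
# the home never gets uncomfortably warm.

def setpoint_during_event(normal_setpoint_f, event_active, offset_f=3.0, max_f=78.0):
    """Return the cooling setpoint in Fahrenheit; raise it (capped) during an event."""
    if not event_active:
        return normal_setpoint_f
    return min(normal_setpoint_f + offset_f, max_f)

print(setpoint_during_event(72.0, event_active=False))  # normal operation: 72.0
print(setpoint_during_event(72.0, event_active=True))   # event: 75.0
print(setpoint_during_event(77.0, event_active=True))   # event, capped: 78.0
```

What the “secret sauce” adds is not this rule but the learned schedule around it: knowing when and how far each particular household’s setpoint can drift before anyone notices.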

In a very good Wired story on Nest Energy Services, Steven Levy analogizes between the technology-digital service bundle in energy and in music; quoting Nest CEO Tony Fadell, Levy notes that:

This pivot is in the best tradition of companies like Apple and even Amazon, whose hardware devices have evolved to become front ends for services like iTunes or Amazon Prime Instant Movies. Explaining how this model works in the thermostat world, Fadell compares power utilities to record labels. Just as Apple provided services to help customers link with the labels to get music, Nest is building digital services to help customers save money. Unlike the case with record labels, however, Nest isn’t eroding the utility business model, but fulfilling a long-term need–getting customers to change their behavior during periods of energy scarcity.

“Until now, if utilities wanted customers to change their behavior to use less electricity at those times, they instituted what was called unilateral demand response—they wouldn’t automate the process, they’d turn off the air-conditioning whenever they wanted. It was like DRM during the iPod days—where companies like Sony said, ‘I am the guardian, and I’m going to tell you what to do’.”

Fadell (and Levy and Fehrenbacher) articulates the value potential of technology-service bundles to automate energy consumption decisions in ways that save energy and money without reducing comfort. While the guts of their services are still direct load control and are not dynamic in any way that would make meaningful use of such a potentially transactive technology, I do think it’s a promising evolution beyond the monolithic, administrative, regulatory demand response approach.
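To make that distinction concrete, here is a minimal sketch of the difference between administered direct load control and a price-responsive alternative. It is entirely hypothetical: the function names, setpoints, and price threshold are my own illustration, not Nest’s actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class HourState:
    price_cents_per_kwh: float  # real-time retail price this hour
    utility_event: bool         # has the utility called a load-control event?

def direct_load_control(state: HourState, normal_setpoint: float) -> float:
    # Administered demand response: the utility decides. During a called
    # event the thermostat is overridden, regardless of price or preference.
    return normal_setpoint + 4.0 if state.utility_event else normal_setpoint

def price_responsive(state: HourState, normal_setpoint: float,
                     threshold_cents: float = 25.0) -> float:
    # Transactive alternative: the device responds to a price signal the
    # customer chose to act on, whether or not an event has been called.
    if state.price_cents_per_kwh > threshold_cents:
        return normal_setpoint + 4.0
    return normal_setpoint
```

In the first function the decision rule belongs to the utility; in the second it belongs to the customer’s stated price preference, which is the transactive step these services have not yet taken.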

Ford’s MyEnergi Lifestyle

Lynne Kiesling

You may know that the annual Consumer Electronics Show has been going on this week in Las Vegas (CES2013). CES is the venue for displaying the latest, greatest, wonderful electronic gadgets that will enrich your life, improve your productivity, reduce your stress, and make your breath minty fresh.

And, increasingly, ways to save energy and reduce energy waste. The most ambitious proposition to come out of CES2013 is Ford’s MyEnergi Lifestyle, as described in a Wired magazine article from the show:

Here at CES 2013, the automaker announced MyEnergi Lifestyle, a sweeping collaboration with appliance giant Whirlpool, smart-meter supplier Infineon, Internet-connected thermostat company Nest Labs and, for a green-energy slant, solar-tech provider SunPower. The goal is to help people understand how the “time-flexible” EV charging model can more cheaply power home appliances, and how combining an EV, connected appliances and the data they generate can help them better manage their energy consumption and avoid paying for power at high rates. …

Appliances are getting smarter, too. Some of the most power-hungry appliances, such as a water heater and the ice maker in your freezer, can now schedule their most energy-intensive activities at night. Nest’s Internet-connected thermostat can help homeowners save energy while their [sic] away. While some of the appliances and devices within MyEnergi Lifestyle launch early this year, others are available now, Tinskey said.

One reason why I think this initiative is promising is its involvement of Whirlpool and Nest, two very different companies that are both focused on ways to combine digital technology and elegant design to make energy efficiency in the home appealing, attractive, and easy to implement.

The value proposition is largely a cloud-based data one: gather data on the electricity use in the home in real time, program in some consumer-focused triggers, such as price thresholds, and manage the electricity use in the home with the objective of minimizing cost and emissions. Gee, I think I’ve heard that one here before.
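As a toy illustration of that “consumer-focused trigger” idea, here is one way a home energy manager might schedule a time-flexible load (EV charging, the water heater, the ice maker) into the cheapest hours of the day, assuming an hourly price forecast. The prices below are made up for the example.

```python
def cheapest_hours(prices: list[float], hours_needed: int) -> list[int]:
    # Rank the hours of the day by price, take the cheapest ones,
    # and return them in chronological order for scheduling.
    ranked = sorted(range(len(prices)), key=lambda h: prices[h])
    return sorted(ranked[:hours_needed])

# Hypothetical hourly prices in cents/kWh: cheap overnight,
# expensive in the late afternoon.
prices = [5, 4, 4, 5, 6, 8, 10, 12, 14, 15, 16, 18,
          20, 24, 28, 30, 28, 22, 18, 14, 10, 8, 6, 5]
print(cheapest_hours(prices, hours_needed=4))  # → [0, 1, 2, 3]
```

Nothing here is sophisticated; the point is that once prices and device schedules are both digital, this kind of cost-minimizing automation is a few lines of logic.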

Regulation’s effects on innovation in energy technologies: the experimentation connection

Lynne Kiesling

Remember the first time you bought a mobile phone (which in my case was 1995). You may have been happy with your land line phone, but this new mobile phone thing looked like it would be really handy in an emergency, so you-in-1995 said sure, I’ll get a cell phone, but not really use it that much. Then the technology improved, and more of your friends and family got phones, so you used it more. Then you saw others with cool flip phones, in colors, and you did some searching to see if other phones had features you might like. Then came text messaging, and you experimented with learning a new shorthand language (or, if you’re like me, you stayed a pedant about spelling even in text messages that you had to tap out on number-pad keys). You adopted text messaging, or not. Then came the touch screen, largely via the disruptive iPhone, and the cluster of smartphone innovation was upon us. Maybe you have a smartphone, maybe you don’t; maybe your smartphone is an iPhone, maybe it isn’t. But since 1995, your choice of communication technology, and the set from which you can choose, has changed dramatically.

This change didn’t happen overnight, and for most people was not a discrete move from old choice to new choice, A to B, without any other choices along the way. Similarly for technological change and the production of goods and services. For both consumers and producers, our choices in markets are the consequence of a process of experimentation, trial and error, and learning. Indeed, whether your perspective on dynamic competition is based on Schumpeter or Hayek or Kirzner (or all of the above), the fundamental essence of competition in market processes is that it’s a process of experimentation, trial and error, and learning, on the part of both producers and consumers. That’s how we get new products and services, that’s how we signal to producers whether their innovations are valuable to us as consumers, that’s how innovation creates economic growth and vibrancy, through the application of our creativity and our taste for creating and experiencing novelty.

This kind of dynamism is common in our world, and is increasingly an aspect of our lives that creates value for us; mobile telephony is the most obvious example, but even in products as mundane as milk, the fundamental aspect of the market process is this experimentation, trial and error, and learning. How else would Organic Valley have started coming out with a line of milk that is entirely from pasture-raised cows? (I am happily consuming this milk; pasture-raised cows make milk with more essential fatty acids and conjugated linoleic acid, which are very important for health.)

But this kind of dynamism, while common, is not pervasive. Institutions matter, and in particular, various forms of government regulation can influence the extent to which such technological dynamism occurs in a market. The example I have in mind as a counterpoint, the example I want to explain and understand, is consumer-facing electricity technologies, like thermostats and home energy management systems. For the past several years there has been considerable innovation in this space, due to the application and extension of digital communication technology innovations. But despite the frequent claims over the past few years that this year will be the year of the consumer energy technology, it keeps not happening.

Tomorrow in New Orleans, at the Southern Economic Association meetings, I’ll be presenting a paper that grapples with this question. My argument is that traditional economic regulation of the electricity industry slows or stifles innovation because regulation undercuts the experimentation, trial and error, and learning of both producers and consumers. As I state in the abstract:

Persistent regulation in potentially competitive markets can undermine consumer benefits when technological change both makes those markets competitive and creates new opportunities for market experimentation. This paper applies the Bell Doctrine precedent of “quarantine the monopoly” to the electricity industry, and extends the Bell Doctrine by analyzing the role of market experimentation in generating the benefits of competition. The general failure to quarantine the monopoly wires segment and its regulated monopolist from the potentially competitive downstream retail market contributes to the slow pace and lackluster performance of retail electricity markets for residential customers. The form of this failure to quarantine the monopoly is the persistence of an incumbent default service contract that was intended to be a transition mechanism to full retail competition, coupled with the regulatory definition of product characteristics and market boundaries that is necessary to define the default product and evaluate the regulated monopolist’s performance in providing it. The consequence of the incumbent’s incomplete exit from the retail market suggests that as regulated monopolists and regulators evaluate customer-facing smart grid investments, regulators and other policymakers should consider the potential anti-competitive effects of the failure to quarantine the monopoly with respect to the default service contract and in-home energy management technology.

In August 2011 I wrote about the Bell Doctrine, Baxter’s precedent from the U.S. v. AT&T divestiture case, and how we have failed to quarantine the monopoly in electricity. This paper is an extension of that argument, and I welcome comments!

If you’ll be at the SEA meetings, I hope to see you there; I am headed to NOLA tonight, and look forward to a fun weekend full of good economic brain candy.

Just how “wasteful” are data centers?

Lynne Kiesling

You may have seen the article in Sunday’s New York Times on how “wasteful” data centers are — they use large amounts of electricity to enable the level of redundancy required to achieve the degree of reliability and uptime that consumers expect from their Internet activities. I put the word “waste” in quotes because I think Don Boudreaux has a point, in his letter to the editor in response to the article: where’s the line between “waste” and “use”? The NYT article presents data center power use as wasteful, implying that the author thinks that they should either figure out ways to deliver the same reliability with less electricity, or that we consumers should change our preferences so we don’t place as much value on reliability. I’ll argue later that data center operators have high-powered incentives to do the former, and as for the latter, I invite the NYT author to imagine how he thinks NYT readers would respond to a slowdown or lack of server availability that made it hard for them to access NYT articles.

Of course the undercurrent here is the argument that the price of our Internet activity does not include the environmental cost associated with power use, and consequently we should use public policy to impose a price on data centers, or on Internet use, to reflect that cost. The article isn’t explicit about carbon policy, but that’s the implication.

My initial reaction to the article was that it was biased and somewhat inaccurate, and that it overlooked a wide array of innovations that chip manufacturers, data center operators, and architects have created over the past few years to reduce power use per calculation as well as overall power use. Fortunately, Katie Fehrenbacher (who is more knowledgeable than I in these matters) had a similar reaction, and wrote up her assessment:

As my headline suggests, it sounds like the author, who spent over a year reporting out the series, jumped into a time machine and did his reporting a couple of years ago. One of the reasons is that both articles so far start with anecdotes from 2006 about Microsoft and Facebook. The data centers that Facebook recently built in Forest City, North Carolina, and Prineville, Oregon, are industry pioneers in terms of energy efficiency and openness. Microsoft, too, has more recently pledged to get rid of its diesel generators for its facilities, and has been using less air conditioning in its new data centers.

The data center operators at the largest Internet companies like Google, Facebook, Apple, Microsoft, Yahoo and eBay are so focused on energy efficiency of their newest data centers that new designs are starting to be widely adopted with technologies like open air cooling (getting rid of the huge air conditioning units in these things). New types of low power chips and servers are being developed by startups like SeaMicro, which was recently bought by AMD. The articles so far don’t mention these innovations.

She does, though, think that there’s value in the NYT series because it will shine some light on data center operators who aren’t thinking about energy efficiency and power use. She wrote a 4-part series on data center power use, which I recommend and to which she links in her article.

From a policy perspective, is there an “externality” here to be addressed? Data centers are expensive and take up a lot of space, and if you are incurring the cost of a data center, electricity is your top expense item. Thus firms have strong incentives to minimize those costs while still delivering the services and degree of reliability that they have promised to their customers. That’s a high-powered incentive to pursue energy efficiency innovations even without a policy intervention, and that incentive has been inducing those innovations over the past 5 years, as Fehrenbacher notes in her article and her data center series. Companies like Google, Amazon, Apple, Microsoft, and Facebook have been driving those innovations, are in aggregate the largest data center operators, and thus are driving the majority of data center server traffic in a more energy-efficient direction. As is typical with innovation and new technology adoption, others will follow as the innovations are refined and made easier to implement.

Another important innovation that has implications for energy efficiency, but has the Bastiat-esque problem of being unseen, is the dramatic move toward server virtualization in data centers. With server virtualization, data center operators can essentially run several virtual servers off of one physical server. Obviously this increases the computing and storage capacity of the data center without increasing the physical assets, and on balance this means an increase in computing and storage capacity without an appreciable change in power use — more computing per watt of power consumed. In the absence of virtualization, to achieve that same capacity would have required a dramatic increase in physical server capacity, and in electricity use to power those servers. Neither the NYT article nor Fehrenbacher’s series address the role that virtualization has played in enabling capacity optimization and high reliability at lower power use levels. Here’s a concise Green Grid white paper on the subject.
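A back-of-the-envelope sketch shows why consolidation matters for computing delivered per watt. The numbers below are invented purely for illustration, not drawn from any actual data center.

```python
def compute_per_watt(workloads: int, servers: int, watts_per_server: float,
                     ops_per_workload: float) -> float:
    # Throughput delivered per watt of server power drawn.
    return (workloads * ops_per_workload) / (servers * watts_per_server)

# Hypothetical numbers: 100 workloads that would each need a dedicated
# 400 W physical server, versus the same workloads consolidated 10-to-1
# onto virtualized hosts drawing 500 W each (busier hosts draw more power).
before = compute_per_watt(100, 100, 400.0, 1.0)
after = compute_per_watt(100, 10, 500.0, 1.0)
print(after / before)  # → 8.0, i.e. ~8x more computing per watt
```

The unseen part, in Bastiat’s sense, is the counterfactual: the ninety physical servers, and the watts to run them, that were never bought.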

Yes, there is some energy wasted in data center operations, just as there is in every single way that we use energy — we won’t be repealing the laws of thermodynamics any time soon. But data center operators have economic incentives to pursue energy efficiency, and a wide array of inventors, architects, and other entrepreneurs see opportunities in those incentives. We are seeing this process play out before our eyes.