Power Up: The framework for a new era of UK energy distribution

The Adam Smith Institute has published a research report I wrote for them, Power Up: The framework for a new era of UK energy distribution. From the press release:

The report … argues that new technologies such as smart grids and distributed energy production can revolutionise old models of energy distribution and pricing, in the same way that apps like Uber are disrupting traditional models of transport.

In a world of expensive energy prices, the report suggests regulators should encourage experimentation with new technologies, rather than cutting them off at inception. Regulating the market too heavily – often justified by claims that consumers are being ‘ripped off’ or overwhelmed by the number of tariffs available – closes down consumer experimentation and prevents technological and economic progress, which keeps energy prices high.

The paper envisions a world of choices in the energy market, where smart meters that relay real-time price changes to encourage better energy use are just the beginning. The author, Dr Lynne Kiesling, imagines consumers being able to see where their energy is coming from, and to choose what kind of green-grey energy mix they want.

Most important, Dr Kiesling argues, is for OFGEM to adopt a structure of ‘permissionless innovation’ – which allows companies to experiment freely without being granted permission from regulators. In the early days of the internet, no-one envisioned a world of Amazon, iPhones and Uber; but these inventions were able to thrive, as they were not limited by regulatory barriers. OFGEM, Kiesling argues, needs to adopt a more relaxed regulatory structure that dismantles the barriers that have been created.

Increasingly throughout the economy we see how decentralized technologies empower us to make decisions and to automate our decisions, leaving us free to pursue other projects with our time. We also see how producer and consumer experimentation with new products and new combinations of products and services are the essence of a vibrant, value-creating economy. In electricity the electrons won’t change, but the applications and services we layer on top of the electrons and bundle with them are changing dramatically, and in ways that can lead to a cleaner and prosperous future if barriers to entry and to innovation are lowered.

ASI’s Charlotte Bowyer wrote a summary of the report in City A.M., and she made some trenchant points:

Digital sensors and two-way devices allow automatic, preset heating and lighting, while dynamic pricing and smart meters will allow households to adjust the type of energy they use with real-time changes in electricity costs; for example, switching to renewable power when it reaches a certain price.

These innovations can reduce waste, improve efficiency, save money, and allow households to move towards cleaner energy usage, tailoring a “green-grey” mix to suit their budget.

However, such technologies will only proliferate within the right regulatory environment – one which the UK currently lacks. According to a new Adam Smith Institute report – Power Up: The framework for a new era of UK energy distribution – the electricity market’s current model of regulation severely limits the level and rate at which large-scale innovations can occur.

The current regulatory model was borne out of the privatisation of a vertically-integrated industry, where everything from electricity generation and transmission to distribution was performed by a single utility. The future of electricity generation and distribution will be nothing like this.

And in supposing a natural monopoly and applying a static model of innovation, even well-intentioned interventions have an adverse effect on competition, creating an institutional framework in which supplier innovation is severely hampered.
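The price-responsive switching Bowyer describes is, at bottom, a simple preset rule. Here is a minimal sketch of that kind of rule; the threshold, prices, and function names are hypothetical illustrations, not any actual smart-meter or supplier API.

```python
# A toy price-threshold rule of the kind Bowyer describes. The threshold,
# prices, and names here are hypothetical, not any real smart-meter API.
GREEN_PRICE_THRESHOLD = 14.0  # pence per kWh at which the household prefers renewables

def choose_source(renewable_price, grid_price, threshold=GREEN_PRICE_THRESHOLD):
    """Return which supply the preset rule would draw on at current prices."""
    if renewable_price <= threshold or renewable_price <= grid_price:
        return "renewable"
    return "grid"

# As the real-time renewable price falls through the threshold,
# the rule switches the household over automatically.
for renewable_price in (18.0, 15.0, 13.5):
    print(renewable_price, "->", choose_source(renewable_price, grid_price=14.5))
```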

I am pleased to be extending the application of the idea of permissionless innovation to the electricity industry, and I am grateful to the Adam Smith Institute for inviting me to engage in this research.

2015 Nobel laureate Angus Deaton

Angus Deaton is the worthy and deserving winner of this year’s economics Nobel. The arc of his work, from theory to data to empirical application, has been consumption, measuring consumption, and consumption as an indicator of well-being, poverty, and inequality. His analyses also incorporate political economy as a factor influencing those relationships and incentives.

If you haven’t read his book The Great Escape, do so. It’s an accessible and optimistic account of the relationship between poverty and economic growth. Here’s an interview with Deaton from Caleb Brown at the Cato Daily Podcast in 2013 on the themes of the book. For a longer discussion based on the book, Russ Roberts’ EconTalk podcast with Deaton from November 2013 is well worth the time.

As always, Tyler’s post and Alex’s post at Marginal Revolution today have good synopses of Deaton’s work and its importance, and many good links to further reading.

In my mind, Deaton’s work sits with the demographic analyses and data visualization of Hans Rosling (if you haven’t seen his 200 countries in 200 years video, put down everything right now and watch it; his TED talk about the impact of the washing machine is guaranteed to bring tears to the eyes of at least this unsentimental economist). It also complements Joel Mokyr’s economic history of innovation and technological change as factors enabling economic growth.

Deaton’s unstinting attention to detail in improving the measurement of consumption has always stood out to me as a defining feature of his work. I recommend it to your attention, and congratulate him for this well-deserved award.

Cass Sunstein on regulatory analysis and the knowledge problem

Cass Sunstein begins:

With respect to the past and future of regulation, there are two truly indispensable ideas. Unfortunately, they are in serious tension with one another. Potential solutions lie in three reforms, all connected with democracy itself – but perhaps not quite in the way that most people think.

The first indispensable idea is that it is immensely important to measure, both in advance and on a continuing basis, the effects of regulation on social welfare. As an empirical matter, what are the human consequences of regulatory requirements? That is the right question to ask, but inside and outside of government, it is tempting to focus on other things. […]

At the present time, the right way to answer the right question is to try to identify the costs and benefits of regulations, in order to catalogue and to compare the various consequences, and to help make sensible tradeoffs. To be sure, cost-benefit analysis can create serious challenges, and at the present time, it is hardly a perfect tool. Nonetheless, it is the best one we now have. Some people do not love cost-benefit analysis, but they should. If we want to know about the real-world effects of regulation, that form of analysis deserves a lot of love.

The second idea, attributable above all to Friedrich Hayek, is that knowledge is widely dispersed in society. As Hayek and his followers emphasize, government planners cannot possibly know what individuals know, simply because they lack that dispersed knowledge. The multiple failures of plans, and the omnipresence of unintended consequences, can be attributed, in large part, to the absence of relevant information. Hayek was particularly concerned about socialist-style planning. He contended that even if socialist planners are well-motivated and if the public interest is their true concern, they will fail, because they will not know enough to succeed. Hayek celebrated the price system as a “marvel,” not for any mystical reason, but because it can aggregate dispersed information, and do so in a way that permits rapid adjustment to changing circumstances, values, and tastes.

In sum, while it is immensely important to measure the effects of regulation, we may lack the necessary information.
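To make the kind of cataloguing and comparison Sunstein has in mind concrete, here is a deliberately tiny benefit-cost sketch; the dollar figures, time horizon, and discount rate are made up for illustration and are not drawn from any actual regulatory analysis.

```python
# A tiny, hypothetical benefit-cost comparison: annual benefits and compliance
# costs are invented figures, discounted to present value and then netted.
DISCOUNT_RATE = 0.03  # an assumed social discount rate, for illustration only

def present_value(annual_flow, years, r=DISCOUNT_RATE):
    """Present value of a constant annual flow over a fixed horizon."""
    return sum(annual_flow / (1 + r) ** t for t in range(1, years + 1))

benefits = present_value(annual_flow=120.0, years=10)  # $120M per year, hypothetical
costs = present_value(annual_flow=90.0, years=10)      # $90M per year, hypothetical
print("PV benefits: %.1f  PV costs: %.1f  net benefits: %.1f"
      % (benefits, costs, benefits - costs))
# A positive net present value favors the rule -- but only if the inputs are
# measured well, which is exactly where the knowledge problem bites.
```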

Sunstein notes, pragmatically, that regulators have access to substantial amounts of information and that methods of cost-benefit analysis are well established and improving. He wrote, “In the day-to-day life of cost-benefit analysis, regulators are hardly making a stab in the dark.” He continues:

Nonetheless, modern followers of Hayek are correct to emphasize what they call “the knowledge problem,” understood as the government’s potential ignorance, which can be a problem for contemporary regulators of all kinds […]

The tension, in short, is that regulators have to focus on costs and benefits (the first indispensable idea), but they will sometimes lack the information that would enable them to make accurate assessments (the second indispensable idea). … In light of the knowledge problem, can they produce reliable cost-benefit analyses, or any other kind of projection of the human consequences of what they seek to do, and of potential alternatives?

Sunstein identified three reforms he said respond to the first indispensable idea while helping overcome or mitigate the limits imposed by the knowledge problem: first, modern “notice-and-comment” rulemaking processes; second, retrospective analysis of regulations; and third, careful experiments and especially randomized controlled trials. As pragmatic responses to knowledge problems, each of the three has something to contribute.

Is it enough?

What would Hayek say? Sunstein responded to this question in a footnote:

I am not suggesting that Hayek himself would be satisfied. Consider this remarkable passage:

This is, perhaps, also the point where I should briefly mention the fact that the sort of knowledge with which I have been concerned is knowledge of the kind which by its nature cannot enter into statistics and therefore cannot be conveyed to any central authority in statistical form. The statistics which such a central authority would have to use would have to be arrived at precisely by abstracting from minor differences between the things, by lumping together, as resources of one kind, items which differ as regards location, quality, and other particulars, in a way which may be very significant for the specific decision. It follows from this that central planning based on statistical information by its nature cannot take direct account of these circumstances of time and place and that the central planner will have to find some way or other in which the decisions depending on them can be left to the “man on the spot.”

Hayek [“The Use of Knowledge in Society,” AER, 1945]. In my view, the claim here is a mystification, at least as applied to the regulatory context. Statistical information “by its nature” can indeed “take direct account of these circumstances of time and place.” Of course it is true that for some purposes and activities, statistical knowledge is inadequate.

It is an odd note.

Sunstein quoted Hayek, said Hayek’s point is a mystification, then admitted, “Of course it is true….” I guess Sunstein’s point is that it is a true mystery?

In any case, we should note a few things about Sunstein’s reforms. First, his focus is not so much on the doing of regulation itself, but rather on regulatory oversight. His three reforms are not to be applied in day-to-day regulation, but rather serve as external correctives. A related and perhaps more fundamental issue concerns the manner in which regulatory operations can facilitate the production and coordination of knowledge in ways that promote better outcomes.

Second, extending this last point, as we compare government and market processes, we note the relative power of feedback processes in the private sector and the weakness of feedback processes in the public sector. If you are dissatisfied with your service in a restaurant, you can choose to eat elsewhere next time. If you are dissatisfied with your service at the Department of Motor Vehicles (or the Veterans Administration or the local Zoning Board, etc.), you have many fewer options. Feedback processes are essential to the production and coordination of knowledge. How can regulatory agencies learn in an environment in which feedback processes are so significantly attenuated?

Third, Sunstein is operating with a rather flat view of knowledge. That is to say, in his explanation knowledge and information and data are various names for more or less identical things. But if we take seriously Sunstein’s remark “of course it is true that for some purposes and activities, statistical knowledge is inadequate,” then further questions arise. Sunstein does not explore the point, but it is exactly here, for Hayek, that the force of the knowledge problem emerges.

There is a research agenda here.

Obviously Sunstein endorses further research and development of benefit-cost analysis, expansion of notice-and-comment rulemaking processes, retrospective regulatory analysis, and experimental tests of regulation. Benefit-cost analysis has a dedicated academic society, a journal, and book-length treatments. Sunstein discusses efforts to broaden understanding of regulatory proposals and encourage public engagement (for example, the government’s www.regulations.gov and the university-based outreach project regulationroom.org). Retrospective regulatory analysis and experimental tests get less attention, but a number of academic programs do at least some retrospective work (for example: the Mercatus Center Regulatory Studies Program at GMU, the Penn Program on Regulation, the George Washington University Regulatory Studies Center). As Sunstein notes, a number of federal agencies have committed to using experiments to help understand the impact of regulation.

How far can these efforts get us in the face of the knowledge problem? For which regulatory “purposes and activities” is it the case that “statistical knowledge is inadequate”? Are there patterns in this inadequacy that bias or undermine regulatory action? Assuming Sunstein’s reforms are fully implemented, what residual knowledge problems would continue to trouble regulation?

A good place to start is Lynne Kiesling’s article “Knowledge Problem” (obviously!) which appeared in the 2014 volume Oxford Handbook of Austrian Economics. Mark Pennington’s book Robust Political Economy examines how knowledge problems and incentive problems can frustrate political activity. Obviously, too, Hayek’s own “The Use of Knowledge in Society” and “Economics and Knowledge” are relevant, as is the later “Competition as a Discovery Process.” Don Lavoie’s “The Market as a Procedure for Discovery and Conveyance of Inarticulate Knowledge” condenses Hayek’s statements on the knowledge problem and further explains why the problem cannot be overcome merely by further developments in computer technology.

Going deeper, one might explore James Scott’s book Seeing Like a State, which emphasizes how the top-down processes of measurement and aggregation can strip meaning from knowledge and result in destruction of value. Then perhaps a work like Hans Christian Garmann Johnsen’s new book The New Natural Resource: Knowledge Development, Society and Economics might have something to say. (I’ve only just begun looking at Johnsen’s ambitious book, so it is too soon to judge.) Complementary to work on institutional frameworks and knowledge would be close studies of government agencies, like the Pressman and Wildavsky classic Implementation: How Great Expectations in Washington are Dashed in Oakland… and broader surveys of policy history such as Michael Graetz’s The End of Energy or Peter Grossman’s U.S. Energy Policy and the Pursuit of Failure.

And more.

Here I have focused just on those parts of the Sunstein article where he bumps up against the knowledge problem most explicitly. He explores each of the three reforms in more detail, providing much more to think about.

Recommended reading for regulatory analysts.


Sunstein’s paper is “Cost-Benefit Analysis and the Knowledge Problem.” Regulatory Policy Program Working Paper RPP-2015-03. Cambridge, MA: Mossavar-Rahmani Center for Business and Government, Harvard Kennedy School, Harvard University.

ABACCUS report highlights benefits of retail electric markets

On Tuesday the Distributed Energy Financial Group released its 2015 report, Annual Baseline Assessment of Choice in Canada and the United States (ABACCUS). The report provides an excellent overview of the current state of retail electricity markets in the 18 jurisdictions in the U.S. and Canada that permit at least some degree of retail competition. The overall result will not surprise anyone who follows the electricity industry or is a KP reader: for the eighth year in a row, Texas tops the rankings by a wide margin, with Alberta second and Pennsylvania third. And the general trend is promising, both in terms of market experimentation and regulatory institutional change to reduce barriers:

In nearly every jurisdiction in North America, REPs continue to expand their presence, increase the number of offerings, and increase the variety of offerings in the residential marketplace. These positive developments are primarily in response to market opportunities, but the activities of regulators to facilitate retail choices should not be glossed over. Regulators have reduced barriers to entry, facilitated the speed and ease of market transactions, and raised public awareness about the opportunities for retail choice. Numerous states have invested in advanced metering infrastructure, providing lower-cost access to detailed consumption data. These data are essential to offer time-differentiated prices, to track the costs to serve an individual consumer (rather than relying on an estimated load profile) and to offer new products and services, including prepaid service, high-usage alerts, or targeted price-risk management offers. Combining advanced mobile communications with advanced metering data has also facilitated new products and services.

The report is also extremely clear and well-written, so if you are interested in learning more about retail electricity markets and regulatory policy, and what the current trends are in the distribution and retail segments of the industry, read this report. Its appendices also provide state-by-state (province-by-province) summaries with extensive detail.

The report’s policy recommendations are in keeping with the idea that market processes provide opportunities for producers and consumers to benefit through experimentation and trial-and-error learning, and that product differentiation through innovation is the most potent form of dynamic competition for creating meaningful consumer benefits. Note in particular that their recommendations focus on default service, suggesting to

Reform default service in the near term … Allow competitive suppliers to provide default service instead of the incumbent utilities … Limit residential default service pricing to basic (“plain vanilla”) service and let the market offer other choices … Adopt a plan to phase out default service. The plan must reflect the realities of each jurisdiction. No two plans would be the same as each jurisdiction must be mindful of past decisions.

I am thrilled to see these recommendations, because incumbent default service can be a costly obstacle and entry barrier for small, new potential entrants. In fact, my Independent Review article from last fall lays out the precise economic argument for how incumbent default service can be an entry barrier and why regulatory policy should “quarantine the monopoly”:

Incumbent vertical market power in deregulating markets can be anticompetitive, as seen in the current process of retail electricity restructuring. This paper uses the AT&T antitrust case’s Bell Doctrine precedent of “quarantine the monopoly” as a case study in incumbent vertical market power in a regulated industry. It then extends the Bell Doctrine by presenting an experimentation-based theory of competition, and applies this extended framework to analyzing the changing retail electricity industry. The general failure to quarantine the monopoly wires segment and its regulated monopolist from the potentially competitive downstream retail market contributes to the slow pace and lackluster performance of retail electricity markets for residential customers.

In the case of Texas, default service was indeed transitional, as intended, and was not provided by the incumbent.

The issue of incumbent default service as an entry barrier may be part of the upcoming “utility of the future” discussion that will take place in Illinois, according to this Retail Energy X story:

If the Illinois Commerce Commission opens a “utility of the future” proceeding, the structure of default service, including its potential elimination, would likely be discussed in such proceeding, ICC Commissioner Ann McCabe said during a media call discussing the Annual Baseline Assessment of Choice in Canada and the United States.

Asked about the ABACCUS recommendation to end default service, McCabe said, “That’s subject to discussion. If we pursue some kind of ‘utility of the future’ initiative, that will be one of the questions likely to be addressed.”

DOT’s airline price gouging investigation and a political economy-based prediction

On Friday, the U.S. Department of Transportation announced it had launched an investigation into possible “unfair practices (e.g., price gouging) affecting air travel during the period of time that Amtrak service along the Northeast Corridor was delayed or suspended as a result of the May 12th derailment.” Five airlines received letters from the agency seeking information on prices for travel between airports most affected by the derailment. CBS News in New York had the story, as did many other media outlets.

In the statement released by the DOT, Transportation Secretary Anthony Foxx said, “The idea that any business would seek to take advantage of stranded rail passengers in the wake of such a tragic event is unacceptable. This Department takes all allegations of airline price-gouging seriously, and we will pursue a thorough investigation of these consumer complaints.” The DOT was responding to consumer complaints and a letter from U.S. Senator Chris Murphy (D-CT).

Pure political theater.

The law DOT cites, 49 US Code § 41712, allows the department to investigate whether an airline “has been or is engaged in an unfair or deceptive practice or an unfair method of competition in air transportation or the sale of air transportation.” In the event the department finds price gouging, the sole remedy present in the law is to order the airlines to stop. Given that rail travel was restored after five days, prices have already returned to normal. No meaningful remedy is possible…

…unless DOT wants to go big: rather than finding the prices constituted unfair practices, the DOT could conclude that the airlines’ computerized pricing algorithms constitute unfair practices and order airlines to cease employing them. The airlines’ dynamic pricing systems are not popular with consumers, so they might make an appealing political target. Such a response would be meaningful, in that it would impose significant costs on airlines to reform their systems, but is such a conclusion likely?

The word “unfair” is not defined in the law; the DOT said it relies upon the U.S. Federal Trade Commission’s Policy Statement on Unfairness for a working definition. The policy statement provided a three-factor approach, considering: (1) consumer injury, (2) violation of public policy, and (3) unethical or unscrupulous conduct. In practice the FTC relies only on the first two factors.

Under the policy, apparent consumer injury is judged against the commercial benefits associated with the trade practice. While dynamic pricing is unpopular with consumers, it is profitable for airlines. In addition, it likely produces prices and service quality that are, on average, better for consumers than otherwise. A balancing of apparent harms and apparent benefits should tilt in favor of dynamic pricing.

Here is my political economy-based prediction:

After a month or two the DOT will report finding that airline prices did jump suddenly after the derailment as demand for air travel jumped up. They will observe that initial price spikes resulted from airlines’ computerized pricing mechanisms and did not reflect an intent to “take advantage of stranded passengers in the wake of such a tragic event.” They will note that airlines responded by adding flights and pressing larger aircraft into service. The report will conclude temporary price spikes reflected the ordinary workings of supply and demand under short-lived extraordinary circumstances. No finding of unfair practices will result, and no trade practices will be condemned.

While the announcement of the investigation produced a lot of press, the release of the report will produce little press. A finding of “ordinary workings of supply and demand” is not newsworthy.

What is more, the DOT already knows this answer. It already believes there is nothing to find in the data it is requesting. Still, a Senator wrote a letter — by the way Senator Murphy sits on the Senate subcommittee that oversees the DOT budget — and the DOT responded.

The Senator himself, too, either already knows this answer or simply has not thought too hard about it. But why should he think about a future no-news report? The announcement of the investigation and the press that the announcement garnered, that was the goal. The rest is noise.

[Thanks to Tom Konrad for bringing the story to my attention.]

Dallas Morning News on competitive retail power market fees and rate designs in Texas

At the Dallas Morning News James Osborne reports on the controversy over minimum use fees in the competitive retail power market that includes most Texas households. As discussed here at Knowledge Problem last week, retail suppliers sometimes design contract offers to be especially cheap for consumers using 1000 kWh per month. The state’s powertochoose.org website defaults to presenting offers from low to high by the average cost of power at exactly 1000 kWh per month. Minimum use fees can be used to help boost an offer to the top of the ranking.
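One way a minimum use fee can interact with a ranking computed at exactly 1000 kWh is sketched below; the rates, fee, and threshold are purely hypothetical numbers chosen only to illustrate the mechanism.

```python
def bill(usage_kwh, rate_cents, min_use_fee=0.0, fee_threshold=1000):
    """Monthly bill in dollars for a flat-rate plan with a minimum-use fee.

    All numbers below are hypothetical, chosen only to show how a minimum-use
    fee can support a lower energy rate, and so a lower average price at
    exactly 1000 kWh, where the fee never applies.
    """
    total = usage_kwh * rate_cents / 100.0
    if usage_kwh < fee_threshold:          # fee applies only to low usage
        total += min_use_fee
    return total

for kwh in (600, 1000):
    no_fee = bill(kwh, rate_cents=10.0)
    with_fee = bill(kwh, rate_cents=9.0, min_use_fee=9.95)
    print(kwh, "kWh  no-fee plan: $%.2f   min-use-fee plan: $%.2f" % (no_fee, with_fee))
# At 1000 kWh the fee plan looks cheaper ($90.00 vs $100.00) and so ranks
# higher; at 600 kWh it is the more expensive plan ($63.95 vs $60.00).
```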

Minimum use fees are sometimes controversial — consumers feel penalized for conserving resources — and the Texas state legislature looked into the issue earlier this year. Osborne wrote:

During this year’s legislative session, Rep. Sylvester Turner, D-Houston, introduced a bill banning retailers from charging customers for using too little power. Power companies quickly lined up against the bill, arguing the fees were essential to the financial health of the retail industry, which must navigate wide swings in wholesale power prices. The legislation never made it onto the House floor.

But Osborne noticed, “For some companies, the controversy presents an opportunity.” He explains that retailers offer an increasing variety of plans – some with free nights and weekends, for example, others designed to accommodate solar panels, and still others that reward conservation over consumption.

A Houston Chronicle analysis in January 2015, link below, concluded that over 70 percent of the contract offers for the Houston area included minimum use fees, which means nearly 30 percent of the offers did not. Consumers simply need to understand a bit about their own power consumption and shop accordingly.


Gaming the rankings on the Texas Power to Choose website

To help provide consumer information on competitive retail offers in the Texas electric power market, the Public Utility Commission of Texas maintains a website at www.powertochoose.org. Enter your zip code, click a button, and it will display the top ten (out of nearly 300) offers.

Because the table shows the lowest priced offers first, with the average calculated assuming 1000 kWh of energy consumption, companies can compete for the front page by minimizing their average price at exactly 1000 kWh. But as it turns out, the low cost offer at exactly 1000 kWh of consumption may not be the low cost offer for consumers using a little bit less or a bit more than 1000 kWh.

For example, when I search for downtown Houston (zip code of 77002) the website tells me there are 290 plans available. Click “View Results” and at the top of the table is Texans Energy’s “12 Month Texans Choice” plan at an amazing 4.3¢ per kWh (for comparison, the U.S. average residential power price is around 11¢ or 12¢ per kWh).

But check the offer’s “Electricity Facts Label”: Texans Energy obtains the top spot via a rate design of 10.8¢ per kWh plus a $65 credit if usage falls between 999 kWh and 1200 kWh during the billing cycle. The usage credit creates a low-price sweet spot at 1000 kWh. (The second result is Texans Energy’s similarly designed “6 Month Texans Choice” plan.)
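The arithmetic behind that headline number is simple. The sketch below reproduces it, assuming only the flat energy rate and usage-window credit just described, and ignoring any fixed or delivery charges the actual Electricity Facts Label may add.

```python
def avg_price_cents(usage_kwh, rate_cents=10.8, credit_dollars=65.0,
                    credit_min=999, credit_max=1200):
    """Average price per kWh for a flat rate with a usage-window bill credit.

    Illustrative only: the numbers follow the "12 Month Texans Choice" design
    described above; real Electricity Facts Labels may add fixed or delivery
    charges that are ignored here.
    """
    bill = usage_kwh * rate_cents / 100.0      # energy charge in dollars
    if credit_min <= usage_kwh <= credit_max:  # credit only inside the window
        bill -= credit_dollars
    return 100.0 * bill / usage_kwh            # cents per kWh

for kwh in (900, 998, 1000, 1200, 1500):
    print(kwh, round(avg_price_cents(kwh), 1))
# 900 -> 10.8, 998 -> 10.8, 1000 -> 4.3, 1200 -> 5.4, 1500 -> 10.8
```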

The rest of the first 10 results work similarly, blending in a bill credit that kicks in around 1000 kWh to create a low-price sweet spot at that consumption level:

  • Power Express’s “#Super6” rate at an amazing 4.4¢ per kWh, which they produce via a 12.4¢ per kWh rate with an $80 bill credit for consumers using more than 999 kWh.
  • Pennywise Power’s “Wise Buy Conserve 6 Plus” also showing 4.4¢ per kWh. Their rate includes a usage credit of $25.00 per cycle for consumption between 999 and 2001 kWh.
  • Discount Power’s “Prime 24” showing at 4.5¢ per kWh, obtained with an 11.04¢ per kWh rate and a bill credit of $65 for consumers using more than 999 kWh in a billing cycle.
  • Gexa Energy’s “Gexa Choice 6” showing at 4.5¢ per kWh. The rate includes a “monthly service fee” of $14.95 for consumption of less than 1000 kWh and a “residential usage credit” of $25 for consumption of more than 999 kWh.

Also showing on the first page of the Power To Choose results: The Frontier Utilities “Frontier Credit Back 3” plan similarly offers a $60 credit for usage of more than 999 kWh per month. Our Energy “Our Optimal Residential Plan” also offers an $80 credit for usage greater than 999 kWh per month.

Clearly rates are being designed for prominent position on the Power To Choose website. No doubt these offers are good deals for consumers who regularly consume at or a little above the 1000 kWh level and watch their usage. But because the consumer bill will jump significantly at consumption levels just below 1000 (and in some cases above either 1200 or 2000 kWh), they could be surprisingly expensive contracts for consumers who are not so careful.
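Using just the two first-page designs above whose terms are fully spelled out (Power Express’s “#Super6” and Discount Power’s “Prime 24”), a rough calculation shows how sharply the average price moves around the 1000 kWh mark; delivery charges and any other fees on the real Electricity Facts Labels are ignored in this sketch.

```python
# Rate and credit figures are taken from the offer descriptions above; any
# other fees on the real Electricity Facts Labels are ignored here.
OFFERS = {
    "#Super6 (Power Express)":   dict(rate=12.4, credit=80.0, threshold=1000),
    "Prime 24 (Discount Power)": dict(rate=11.04, credit=65.0, threshold=1000),
}

def avg_cents(usage, rate, credit, threshold):
    """Average price (cents/kWh) for a flat rate with a high-usage bill credit."""
    bill = usage * rate / 100.0      # dollars
    if usage >= threshold:           # credit for using more than 999 kWh
        bill -= credit
    return 100.0 * bill / usage

for name, d in OFFERS.items():
    row = {kwh: round(avg_cents(kwh, **d), 1) for kwh in (900, 999, 1000, 1500, 2000)}
    print(name, row)
# '#Super6':  {900: 12.4, 999: 12.4, 1000: 4.4, 1500: 7.1, 2000: 8.4}
# 'Prime 24': {900: 11.0, 999: 11.0, 1000: 4.5, 1500: 6.7, 2000: 7.8}
```

Just below the 1000 kWh threshold, a household on either plan pays roughly two and a half times the headline average that appears on the website.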

The power to choose

But the consumer has the “power to choose.” On the results page the viewer can select estimated use levels of “500 – 1,000 kWh,” “1,000 – 2,000 kWh,” and “more than 2,000 kWh.” The selection re-sorts the results based on average price estimates at the 500, 1,000, and 2,000 kWh usage levels. (The middle level is the default.) Consumers who use less than 1000 kWh monthly can easily find a contract good for them, perhaps the Infinite Energy “Conserve and Save 3-month” offer that includes a $9.95 credit for consuming less than 1,000 kWh and averages just 4.2¢ per kWh at 500 kWh!

A slightly different design is 4CHANGE ENERGY’s “Value Saver 6,” which has a per kWh rate that drops between 500 and 1000 kWh usage, but jumps by nearly 5¢ per kWh at 1001 kWh. The effect is that of a rolled-in billing credit that grows in size with usage from 500 to 1000 kWh. The average rate is 8.6¢ per kWh at 500 kWh, drops to 4.9¢ per kWh at 1000 kWh, but rises to 7.6¢ per kWh at 2000 kWh. Unlike many of the above rate designs, however, the “Value Saver 6” does not produce sharp jumps up or down in the total bill.

Don’t like complications? Maybe Discount Power’s “Saver – No Gimmicks No Minimum Usage Fee – 24” plan is for you. Just as described: no minimum usage fee, just a fixed flat rate that averages about 8¢ per kWh.
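A quick sketch of the re-sorting itself, using the average prices quoted above for the “Value Saver 6” and treating the “no gimmicks” flat-rate plan as roughly 8¢ at every level (an approximation), shows why the usage-level selection matters.

```python
# Averages quoted above (cents/kWh); the flat plan is assumed to average
# about 8 cents at every level, an approximation of the "no gimmicks" design.
PLANS = {
    "Value Saver 6 (4CHANGE)":              {500: 8.6, 1000: 4.9, 2000: 7.6},
    "Saver - No Gimmicks (Discount Power)": {500: 8.0, 1000: 8.0, 2000: 8.0},
}

def ranked(level):
    """Sort plans by the estimated average price at the chosen usage level."""
    return sorted(PLANS, key=lambda name: PLANS[name][level])

for level in (500, 1000, 2000):
    print(level, "kWh:", ranked(level))
# Selecting 500 kWh puts the flat plan first; selecting 1,000 or 2,000 kWh
# puts "Value Saver 6" first.
```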

Gaming is good

Overall, the competition is good for consumers. These supplier games may create traps for unwary customers, but consumers in Downtown Houston currently have 290 contract offers from over 50 different suppliers to choose among. Most residential consumers in the competitive retail part of Texas have a similar number of opportunities. It is easy to avoid the traps!

These suppliers offer contracts that differ across many margins: fixed price vs. variable, flat price vs. time-of-use, different levels of renewable content, pre-paid or not, terms ranging from 1 month to several years. On the other hand, Texas residential consumers outside of the competitive region (either because they are outside of ERCOT or because they are served by a municipal or coop utility in ERCOT that opted to stay out) are lucky to see more than two or three different options from their local monopoly utility. These offers might include a fuel cost adjustment tracking wholesale costs to a degree, and might get adjusted annually. But dynamic? No. Competitive? No.

The competitive retail electric power market for residential consumers in Texas is probably the most dynamic one around. With a little care consumers can avoid the gimmicks and find a pretty good deal.

HT: Big thanks to the economically-savvy anonymous tipster who brought the gamesmanship to my attention.