Coal-Powered Jets

Lynne Kiesling

From Technology Review today, news of new research from Penn State on liquid fuel from coal:

Schobert and his colleagues make the fuel using refined coal oil, which is a byproduct of coke manufacture; the byproduct is mixed at an oil refinery with a product of crude oil called light cycle oil. This mix is then hydrogenated using equipment that already exists at refineries, and then it’s distilled into various products — mostly diesel fuel and jet fuel (about 40 percent of each), as well as some gasoline and heating oil.

Other potential benefits of the coal-based fuel: it can replace the three or four different jet fuels used by the military for aircraft and missiles, and the same fuel can be used in diesel engines if those engines are modified slightly. The fuel could also be used without modification in high-temperature stationary fuel cells for generating electricity, Schobert says.

But significant hurdles remain before the fuel can see widespread use. So far, only 500 gallons of it have been produced, far too little to assess production costs, Schobert says. Nevertheless, he suspects that the coal-based fuel could compete with other fuels.

One interesting thing about this development is its lack of asset specificity on the production and consumption end; it can be produced using existing refining technologies, and can be used in diesel engines with only a small amount of adjustment. That lack of asset specificity means that this fuel could contribute to a more flexible and adaptable fuel portfolio, because it doesn’t require customized plants or engines. It can even be used in fuel cells.

That Felt Good!

Lynne Kiesling

Phew, it felt good to write a post like that last one. You’ve probably noticed some self-censoring going on here at KP over the past couple of months. The reason is that I was doing some expert testimony analysis on a retail competition issue and felt the need to hold back in advance of being cross-examined on that testimony.

But now I’m getting back in the groove.

Evaluating Electricity “Deregulation” in a Period of Rising Fuel Costs

Lynne Kiesling

Periods of rising costs make it hard to be a market process supporter. Nowhere is this more true than in electric power, where a century of regulator-regulated co-dependency has created a culture of price control.

Right now Maryland is the center of this debate, triggered by a mix of economic and political factors, including rising natural gas prices and the impending merger between Constellation and FPL. This Baltimore Sun article from Thursday gives a thorough overview of the situation:

Proponents of deregulation say the consumer and political angst is an overreaction to rising global energy prices that have clouded the debate over whether free-market reforms have benefited consumers.

But those voices have been increasingly shouted down by lawmakers, academic scholars and some regulators, who are raising questions about whether complex rules governing wholesale power markets are structured in a way that does more to line the pockets of power generators than save money for ratepayers.

Though contested by many, the debate directly challenges the contention of electric industry officials and others who say that rate increases sweeping the nation are solely the result of rising fuel prices and are not exacerbated by the rules creating free markets.

Such claims have bolstered consumer watchdogs, who say Maryland and other states would be better off if they hadn’t deregulated.

OK. Let’s be clear. What states like Maryland have done in the past decade should not be called “deregulation”. In almost every case, legislation that allowed wholesale market transactions and perhaps (perhaps) some measure of retail choice of commodity provider has been accompanied by retail rate caps. This compromise has been part of the political dynamic in most states. So as soon as people start complaining about “deregulation”, bear in mind that they are playing fast and loose with the terminology to suit the objectives of their own arguments.

Note also in the above excerpt that this debate conflates wholesale market liberalization and retail market competition. Because of the aforementioned retail rate caps, retail market competition has been slow to evolve in most states, and particularly slow for residential and small commercial customers. Even in Maryland, where the restructuring and market design analyses were performed more carefully and thoroughly than in most states, retail competition has been more vibrant for large industrial and commercial customers.

The political challenge is that fuel costs have been rising for the past three years, and regulators don’t do their constituents any favors by imposing rate caps that ignore those fundamentals and disconnect the prices that customers face from the true, underlying costs of electricity. But when policymakers allow politics to trump economics, as so often happens with decisions this politicized, that disconnect is exactly what results, and customers are worse off in the long run because their decisions do not take into account the real cost of electricity.

Market opponents often counter that high prices don’t only reflect rising fuel costs, but also reflect supplier market power in wholesale power markets. They then go on to recommend rate caps, such as the dynamic in place right now in Maryland (and Illinois, for that matter).

That’s the wrong answer. If so-called consumer advocates really want to empower their constituents and enable them to discipline the exercise of market power in wholesale markets, then they should advocate retail choice. Retail choice integrates customer preferences with wholesale market fundamentals and provides an invaluable route for communicating that information into wholesale power markets. If you want to control the exercise of market power in wholesale power markets, then active, empowered demand and retail choice are the most-bang-for-your-buck policy option.

Sadly, not enough people are making that argument forcefully enough; it doesn’t even come up in this otherwise very good Sun article.

The article also points out the benefits of what limited restructuring we’ve had:

Some energy consultants and researchers argue that politicians are being shortsighted by declaring deregulation a failure just because energy prices have increased. Ratepayers in deregulated states nationwide saved a combined $34 billion since the late 1990s, according to Cambridge Energy Research Associates, an independent research firm that gets its funding from a mixture of utilities, state regulators and other clients.

Most of that savings came as a result of rate caps put in place as part of the move to free markets. But the firm says deregulation has benefited consumers in other ways, such as by shifting the financial risks associated with building power plants to utility shareholders and away from ratepayers. More benefits – including new energy products – will come in time, they said, just as was true when phone companies, railroads, airlines and other industries deregulated.

“Often, it takes decades, not years,” said Dalton Perras, associate director of Cambridge Energy Research Associates.

This is precisely the healthy direction in which change happens with real deregulation. It’s about shifting risk, and empowering consumers to choose how much price risk (and how much outage risk, if differentiated reliability contracts are on offer) they are willing to bear. Retail energy providers reflect that information into the wholesale market through their purchases from generators, and if consumers aren’t willing to pay those prices, they will change their behavior. That dynamic disciplines prices to the maximum extent possible in a period of rising fuel costs.
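To make the price-discipline point concrete, here is a minimal numerical sketch in Python, with entirely hypothetical numbers: it compares the wholesale clearing price when retail demand is fully shielded by rate caps (and therefore inelastic) with the clearing price when even modestly price-responsive demand faces the same rising generator offer stack.

    # A minimal sketch of how price-responsive demand disciplines wholesale prices.
    # All numbers are hypothetical and chosen purely for illustration.

    def supply_price(q_mw):
        """Stylized generator offer stack: marginal cost rises with quantity ($/MWh)."""
        return 40.0 + 0.5 * q_mw

    def clearing_price(willingness_to_pay, q_max=500.0, steps=100000):
        """Walk up the offer stack until demand's willingness to pay falls below it."""
        q_cleared = 0.0
        for i in range(steps + 1):
            q = q_max * i / steps
            if willingness_to_pay(q) >= supply_price(q):
                q_cleared = q
            else:
                break
        return supply_price(q_cleared)

    def inelastic(q):
        """Capped retail rates: customers see no price signal, so demand is fixed at 200 MW."""
        return float("inf") if q <= 200.0 else 0.0

    def responsive(q):
        """Retail choice: some customers cut back as prices rise."""
        return 220.0 - 0.6 * q

    print("Clearing price with inelastic demand:  $%.2f/MWh" % clearing_price(inelastic))
    print("Clearing price with responsive demand: $%.2f/MWh" % clearing_price(responsive))

Even this crude sketch shows the mechanism: when some demand backs off as prices rise, the market clears lower down the supply stack than it would if every megawatt-hour were purchased regardless of price.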

Instead of re-regulating by extending price caps, policymakers would create long-lived, resilient, meaningful benefits for their constituents by removing the artificial barriers to choice that exist.

Posner, Kling Cynical About Government Reorganizations

Michael Giberson

At EconLog, Arnold Kling challenges James Pinkerton’s push for reorganization of the federal government into five super-departments. Kling cites a former colleague of his as saying, “When they don’t know what to do, they re-org.”

Kling writes:

A re-organization like the [proposed] plan would create all sorts of uncertainty about where people fit in relative to the hierarchy. Middle managers would spend years jockeying for position, causing effectiveness to suffer. I am convinced that is what happened to the departments that were consolidated into Homeland Security.

According to the Washington Post, Richard Posner severely criticized reform of U.S. intelligence services on somewhat similar grounds and in much greater detail in a speech at an off-site conference of the CIA’s office of general counsel.

In Posner’s analysis, the director of national intelligence (DNI), created by Congress to be the president’s top intelligence adviser, was given too much to do. DNI John D. Negroponte oversees the CIA and 15 other intelligence agencies, including those at the Pentagon. Negroponte’s staff, which has grown to about 1,000, “has become a new bureaucracy layered on top of the intelligence community,” Posner said.

In his speech, a revised version of which is available from the Washington Post website, Posner cites an article in the New York Times that echoes the point Kling made:

“[A] year after the sweeping government reorganization [of intelligence] began, the [intelligence] agencies…remain troubled by high-level turnover, overlapping responsibilities and bureaucratic rivalry,” and that the reorganization has “bloated the bureaucracy, adding boxes to the government organization chart without producing clearly defined roles.”

A little bureaucratic rivalry can be a good thing, if the result is that the more effective bureau wins. Reorganization can also help shake an agency out of established patterns of thought and action and inject a little dynamism. Reorganization is a way to execute an end run around the status quo. But to succeed, reorganization has got to be more than just political cover for past mistakes. The creation of Homeland Security and the reorganization of the intelligence services seem born of the politician’s impulse to “look busy,” so as not to be blamed.

George Mason University’s Other Stars

Michael Giberson

The Washington Post’s business section, no doubt eager to jump into the Mason Mania that has permeated the rest of the newspaper, ran a story today about the other star team on campus: the economics department. It’s a fairly lightweight story — after all it ain’t basketball — but they spelled all the names right, talked to Robin Hanson about prediction markets, and gave props to former Mason professor Bob Tollison, a “sport-o-metrics” pioneer.

UPDATE: See GMU Law Prof Todd Zywicki’s more detailed and insightful post on this topic at The Volokh Conspiracy. Of course I should have mentioned the GMU law faculty as the other other star team on campus.

The Wooly Concept of Sustainability: Gore and Blood in the WSJ

Lynne Kiesling

Today’s Wall Street Journal has a commentary from Al Gore and David Blood that asks the question: when will we start accounting for environmental costs?

Gore and Blood begin by invoking the concept of sustainability, and the relationship between capitalism and sustainability. Sadly, they do not bother to define what they mean by either term, and as “sustainability” is a very wooly concept, they can proceed to use it as they see fit to suit their argument. The Wikipedia entry on sustainability, which is extremely thorough, offers some insight into the breadth, and subjectivity, of the various definitions of sustainability.

I have always been quite skeptical about the definition and uses of the concept of sustainability, particularly when they start from a position of antagonism between long-run resource supplies and economic growth. This antagonism is implicit in Gore and Blood’s introduction to their commentary today, and reflects the belief, widely held among the sustainability crowd, that resource preservation and economic growth are incompatible.

I reject that argument. Resource preservation and economic growth can be compatible. Under what conditions are they compatible? Under conditions in which information about relative resource scarcity can be communicated across time and place in a low-cost manner. In other words, clear price signals and complete capital markets are the foundation of aligning economic and environmental sustainability. Capital markets provide agents in the economy with ways to shift resource use through time depending on the intertemporal opportunity costs they see. As a resource becomes more scarce, its price rises. Instruments like futures contracts and financial derivatives enable agents to communicate information about expected future scarcity and opportunity costs into today’s resource use decisions.
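As a toy illustration of that last point, here is a minimal Python sketch of the storage-arbitrage story. The numbers are made up, but the mechanism is the one described above: a futures price that signals greater expected future scarcity induces resource owners to withhold supply today, pulling today’s spot price up until the expected return from holding the resource just matches the interest rate.

    # Toy sketch: a futures price pulls expected future scarcity into today's spot price.
    # All numbers are hypothetical; this is a stylized storage-arbitrage story, not a market model.

    interest_rate = 0.05      # opportunity cost of holding the resource for one more year
    futures_price = 84.0      # market's expected price next year ($/unit), signalling scarcity
    spot_price = 70.0         # today's price before owners react
    spot_supply = 1000.0      # units offered for sale today
    price_impact = 0.05       # stylized: withholding one unit raises today's price by $0.05

    # While storing a unit and selling it next year beats selling it now and earning interest,
    # owners withhold supply today, and the tighter market pushes the spot price up.
    while futures_price > spot_price * (1 + interest_rate) and spot_supply > 0:
        spot_supply -= 1.0
        spot_price += price_impact

    print("Spot price after arbitrage:            $%.2f" % spot_price)
    print("No-arbitrage benchmark, futures/(1+r): $%.2f" % (futures_price / (1 + interest_rate)))

The point of the exercise is only that today’s price ends up carrying information about tomorrow’s expected scarcity, which is exactly the channel that futures contracts and other derivatives provide.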

But what do we need to have clear price signals and complete capital markets? In reality, completely clear price signals and full capital markets are impossible, but the way to get as close to that benchmark as possible is to focus on reducing transaction costs and increasing the clarity of property rights definitions. When transaction costs are low and property rights are as well defined as is economical, then scarcity information flows across time and space.

Thus I think Gore and Blood come at the question of accounting for environmental costs from the wrong direction. Not surprisingly, they think top down, and want to see environmental costs reflected in national income accounts. They put too much trust in those accounts, when they should instead be focusing on the fundamental economic and policy reasons why agents do not take into account the interdependence of their actions.

In other words, if Gore and Blood want, as they say in their commentary:

Not until we more broadly “price in” the external costs of investment decisions across all sectors will we have a sustainable economy and society.

then they should advocate policies that reduce transaction costs and create better opportunities for agents to trade off their decisions across time and space. That means better-defined property rights and greater use of capital markets. This idea may currently be anathema to those who think that economic growth and resource preservation are incompatible, but if those people are really intellectually committed to resource preservation, then they should advocate this approach instead of the business-as-usual political, regulatory, top-down approach.

Messrs. Gore and Blood should think more bottom-up than top-down if they want to see that alignment happen.

Bacon is Now a Health Food

Lynne Kiesling

Researchers have used a gene from a nematode to create cloned pigs whose meat is high in omega-3 fatty acids (see articles from Wired and the International Herald Tribune, for example). Yay, bacon as a health food!

The Wired article notes how consumer tastes are driving this research:

Hoping to create healthier, cheaper and tastier products that consumers crave, Monsanto of St. Louis and its biotech farming competitors like DuPont are developing omega-3-producing crops that yield healthier cooking oils. Kang said 30 academic laboratories are now working with his omega-3 gene, presumably pursuing similar projects.

Notwithstanding the prissy whingeing of the anti-GM crowd, this is a great example of the power of human creativity and technological change. It doesn’t nullify the saturated fat content of pork, or the effects of consuming that saturated fat. But it does open up the opportunity for people who don’t like oily fish, or can’t easily get access to oily fish, or are worried about mercury content in tuna, to consume healthy fats without having to do flaxseed oil shots (yuck).

How cool is that?

Markets vs. The Wisdom of Crowds in picking NCAA Basketball winners

Michael Giberson

Once again this year, like millions of folks around the country, I filled out an NCAA basketball tournament bracket. Once again, my bracket is far from perfect. Once again, I have been searching for an edge. (Preferably, the kind of edge that will enable me to out-pick my sports-minded, math-teacher wife.)

By the way, I take no comfort from Carl Bialik’s column in the WSJ which suggests that the odds of picking a perfect bracket are 1 in 150 million, or 1 in 124 billion, or maybe 1 in 36 million billion, depending on how you set up your model. (Hat tip to Skip Sauer.)
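The spread in those estimates comes almost entirely from how accurate each model assumes a picker is on any single game. A back-of-the-envelope sketch (my own illustrative calculation, not a reconstruction of Bialik’s models) shows how sensitive the 63-game odds are to that assumption:

    # Back-of-the-envelope: odds against a perfect 63-game bracket, as a function of the
    # assumed probability of calling any single game correctly. Purely illustrative.

    GAMES = 63

    for p_correct in (0.5, 0.6, 0.7, 0.75, 0.8):
        odds_against = 1.0 / (p_correct ** GAMES)
        print("Per-game accuracy %.0f%%: about 1 in %.3g" % (p_correct * 100, odds_against))

Nudging the per-game accuracy from a coin flip up to 80 percent moves the odds from roughly one in nine quintillion to a bit over one in a million, which is why reasonable models can disagree by several orders of magnitude.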

Having followed the Ticket Reserve market for NCAA basketball tournament “fan forwards” (see here and here), I wondered whether fan forward prices were a reliable indicator of likely winners in the tournament. A fan forward for the regional games pays off in the ability to purchase tickets to the game at face value if the selected team wins in the first two rounds. If the value of a ticket to a regional game is independent of the teams that reach the regional games, then the fan forward price reflects the probability of the team’s success.
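Under that independence assumption, the implied probability can be backed out directly: the forward is worth the difference between the ticket’s market value and its face value if the team advances, and nothing otherwise, so the forward price is roughly that payoff times the probability of advancing. A quick sketch with made-up dollar figures:

    # Backing out an implied probability of advancing from a fan forward price.
    # All dollar figures below are hypothetical, for illustration only.

    ticket_market_value = 400.0   # what a regional-round ticket would resell for
    ticket_face_value = 150.0     # price the forward lets you pay if your team advances
    forward_price = 60.0          # observed price of the fan forward

    payoff_if_team_advances = ticket_market_value - ticket_face_value
    implied_probability = forward_price / payoff_if_team_advances

    print("Implied probability of reaching the regional: %.0f%%" % (implied_probability * 100))
    # prints 24%, so a higher-priced forward implies the market sees that team as more likely to advance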

After entering my own bracket into the ESPN.com Tournament Challenge, at 9 AM last Thursday I checked Ticket Reserve prices and filled out a “Market Signal” bracket, in each case selecting as winner the team with the higher-priced fan forward. For comparison, I entered a third bracket based upon the favorites shown in ESPN’s national bracket, which tabulates all submissions to the site. I called this third bracket “Wisdom of Crowds” (with apologies to James Surowiecki). I thought that picks based on Ticket Reserve prices in a real-money market for ticket options would be superior to picks based on the aggregation of entries into the no-cost-to-enter ESPN Tournament Challenge. By the way, the brackets were very similar, differing in only 8 of 63 picks.

With two rounds played, Market is beating Wisdom of Crowds. Using ESPN’s scoring, the results are Market 460, Wisdom 420. Those scores put Market ahead of 95 percent of all entries submitted to ESPN, while Wisdom is down at 67 percent.

Market has won out so far by picking Kentucky over UAB, Wichita State over Seton Hall, and Arizona over Wisconsin in the first round, and by picking West Virginia to win in the second round. On the other hand, Market had Arkansas while Wisdom picked first-round winner Bucknell. Notice that in each of these cases Market took the favorite and Wisdom picked the slight upset; Wisdom was right once.

For the regional games beginning tonight, Market has UCLA beating Gonzaga and losing to Memphis in Oakland, while Wisdom has Gonzaga prevailing over both teams and reaching the final four. The remaining picks are identical until the championship, where Market says Duke and Wisdom says UConn.

My own bracket scores a 410 — 57 percent — thank you for asking. Next year I think I’ll play the “Market.”

UPDATE: After the third round, Market hit the 97th percentile on the ESPN Tournament Challenge, while Wisdom sank lower. But Saturday’s games were no help to either bracket, with Market listing Duke and Memphis as winners while Wisdom had Duke and Gonzaga. (Market is at 660 points, 88th percentile; Wisdom at 580 points, 61st percentile.)

On the other hand, my own bracket had UCLA beating Memphis and now stands at 690 points in the ESPN scoring system, at the 94th percentile.

Innovation and Its Discontents: Jaffe and Lerner in WSJ

Lynne Kiesling

We’ve seen lots of discussion of how dysfunctional the US patent system is at the moment: from business-process patents to laws-of-nature patents to the recent Research in Motion (BlackBerry) lawsuit, we see the pernicious entry barriers that too broad a patent policy can erect.

In today’s Wall Street Journal, Adam Jaffe and Josh Lerner have a lengthy commentary on patent policy. Among other things, they provide an overview of the current situation:

The problems of the U.S. patent system are under discussion today with an urgency not seen in decades. The Supreme Court will soon hear oral arguments in eBay v. MercExchange LLC, which promises to be its most far-reaching examination of patent law in many years. Today the court will also consider LabCorp v. Metabolite Laboratories — the contested matter is whether a patent can be issued for the correlation between a disease and a naturally occurring substance in the human body. That is: Can you actually patent the laws of nature? And shockingly, Research in Motion has been forced to pay $612 million to prevent all of our BlackBerry handhelds from going dark, even though the U.S. Patent and Trademark Office (USPTO) has indicated that it is likely to find all of the patents behind this ransom demand invalid. Congressional subcommittees, with good reason, have recently held hearings asking fundamental questions about developments like these in the patent system.

The importance of this long-overdue focus on patents cannot be overemphasized. The past decade has seen periodic uproars over particular patents, such as Amazon’s “one click” patent for online shopping. The troubling patents have been well publicized, but the wrong lessons have typically been drawn. Commentators have tended to focus on the incompetence of the USPTO in allowing “bad patents.” Others have concluded that the patent system is not working with respect to a particular area of technology. Concerns about software awards led, for instance, Jeff Bezos of Amazon to propose a new patent type for software; others have suggested that biotechnology be excluded in various ways from the patent regime.

Jaffe and Lerner then go on to make an argument with which I agree, and which becomes increasingly incontrovertible as we see events like the RIM lawsuit: Congress changed patent law, and institutions matter. Thus the problems with patent law are systemic and would not, contra Bezos, be remedied by increasingly layered and complex patent definitions. Legal changes have made patents much easier to get.

Congress set us on this road in 1982 when it created a centralized appellate court for patent cases, the Court of Appeals for the Federal Circuit. Its decisions — which advocates argued would simply ensure judicial consistency — are largely responsible for the significant strengthening of the legal potency of patents. Then, a decade later, Congress turned the USPTO into a “profit center.” The office has been pushed to return “excess” revenue to the U.S. Treasury. This shift led to pressures to grant more patents, difficulties in attracting and retaining skilled examiners, and a torrent of low-quality patent grants. These include such absurdities as patents on wristwatches (paw-watches?) for dogs, a method of swinging on a swing (“invented” by a five-year-old), and peanut butter and jelly sandwiches. But they also include the patents on broad ideas related to mobile email — virtually devoid of any details of implementation — that have imposed a $612 million tax on the maker and users of BlackBerries.

People respond to incentives, and that increased ease has led to increased patent filings. The USPTO is overwhelmed, and thus is less likely to give filings the thorough due diligence and prior-art check that would nullify some applications. But even more problematic is the set of incentives unleashed by these institutional changes. The incentive to use the USPTO as a profit center distorts the number of patents granted (more than is efficient), which creates deadweight loss through the suppression of valuable innovations that would otherwise come to market.

It might be tempting to view patent law as just another area where litigation has spun out of control. But there is more at work here than a general increase in litigation; and its effects are particularly worrisome, because a faulty patent system can have profound impact on the overall process of innovation. The hugely successful BlackBerry may be able to bear an enormous “innovation tax” and still succeed, but other valuable but not-quite-blockbuster innovations may be driven from the market entirely.

Jaffe and Lerner then go on to recommend changes to the patent system: the review process has to include more information about prior art, and the incentive for frivolous patent-holders to bring antagonistic lawsuits (such as the recent RIM one) needs to be removed.

Our proposed reforms start with the recognition that much of the information needed to decide if a given application should be approved is in the hands of competitors of the applicant, rather than the USPTO. A review process with multiple levels efficiently balances the need to bring in outside information with the reality that most patents are unimportant. Multilevel review — with barriers to invoking review increasing at higher levels, along with the review’s thoroughness — would naturally focus attention on the most potentially important applications. Most patents would never receive anything other than the most basic examinations. But for those applications that really mattered, parties would have an incentive and opportunities to bring information in their possession before the USPTO, and the USPTO would have more resources to help it make the right decision. (Changes in this direction are at the heart of the patent reform bill currently under consideration in the House Subcommittee on Courts, the Internet and Intellectual Property.)

With such review, incentives to file frivolous applications fall, which cascades through to reduced frivolous patent litigation.

For a more extensive analysis, see Jaffe and Lerner’s book Innovation and Its Discontents.

Ben Stein on High Oil and Gas Prices

Michael Giberson

So, the mob goes after someone to lynch, even if that person is innocent. After all, as the immortal Bob Dylan sang long ago, “A lot of people have knives and forks, and they don’t have nothing on their plates, and they have to cut something.”

This comes to mind because of the recent actions in the U.S. Senate that attempt to “crack down” on energy companies because the price of oil is so much higher than it used to be and because one large oil concern, Exxon (XOM), is reporting very large profits (after many years of modest earnings).

That’s Ben Stein on Who’s to Blame for High Oil and Gas Prices?