by Steve Forbes
When all the indirect costs of a national cap-and-trade program—such as higher taxes and slower economic growth—are added up, the Heritage Foundation estimates that the cost of living for a family of four will increase by some $4,300, starting in 2012. And it would only go higher with the additional CO2 restrictions that are expected.
In Europe, the Emissions Trading Scheme (ETS) was put into effect in 2005 based on carbon emissions levels (caps) established by the Kyoto Protocol of the late 1990s. It is emerging as a major failure. European manufacturers have bitterly complained that the extra costs are making them uncompetitive, thus forcing them to consider moving facilities elsewhere.
Soundly based market initiatives can actually work. The problem with cap-and-trade is that it is a poorly conceived idea masquerading as a market solution.
The market for carbon emissions credits did not develop spontaneously, like the market for the pencil. It was dreamed up and imposed on people by bureaucrats. In other words, without government it would not exist.
In a sense, a marketplace is commerce’s equivalent of an ecosystem. In economics, as in biological science, it is extremely difficult to successfully reproduce what spontaneously evolves in nature. In the early 1990s, oil billionaire Edward Bass poured $150 million into a miniature version of the world’s ecosystem in the desert near Tucson, Arizona—Biosphere 2. It was essentially a giant, airtight terrarium that was supposed to contain the earth’s ecosystem in miniature—an ocean stocked with fish and a dense forest in an oxygenated atmosphere that was supposed to be self-sustaining.
The idea was to create a self-sufficient environment where people, plants, and animals could survive without help from the outside world. The problem was that the scientists in charge of the project, some of whom had national reputations, couldn’t possibly know all the conditions and components of a fully self-sufficient ecosystem. Starved for the right amount of oxygen, the fish in the ocean died. CO2 levels in the air became too high. The big terrarium was overrun with an infestation of desert cockroaches. Several years after its much-publicized launch, Biosphere 2 was widely acknowledged to be a momentous failure.
Government attempts to create a cap-and-trade market for CO2 permits are a little like the efforts of the scientists who tried to create Biosphere 2. There are inevitably distortions and unintended consequences because a handful of bureaucrats simply can’t know all the workings of a marketplace “ecosystem.”
Cap-and-trade failed in Europe because the market and its values were dreamed up by bureaucrats and were essentially arbitrary. Member states of the EU allocated permits free of charge to companies based on how many the government believed they needed. This arbitrary system resulted in an oversupply of permits. Politics polluted the allocation process. Large companies lobbied for more permits than they needed, only to sell them at a profit. Smaller organizations less effective at lobbying got too few permits and had to pay more than their fair share of fees. A December 2008 article in the New York Times reported:
The European Union started with a high-minded ecological goal: encouraging companies to cut their greenhouse gases by making them pay for each ton of carbon dioxide they emitted into the atmosphere. But that plan unleashed a lobbying free-for-all that led politicians to dole out favors to various industries, undermining the environmental goals. Four years later, it is becoming clear that system has so far produced little noticeable benefit to the climate—but generated a multibillion-dollar windfall for some of the Continent’s biggest polluters.24
Despite the immense expenditures, the data suggest that the European program may even have had a negative effect on the environment. A 2007 report by the London-based think tank Open Europe found that across the EU, emissions from installations covered by the ETS actually rose by 0.8 percent.
Thus experts such as the highly regarded economist Martin Feldstein believe that a U.S. cap-and-trade program is unlikely to work any better:
Since the U.S. share of global CO2 production is now less than 25 percent (and is projected to decline as China and other developing nations grow), a 15 percent fall in U.S. CO2 output would lower global CO2 output by less than 4 percent. Its impact on global warming would be virtually unnoticeable.25
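Feldstein's "less than 4 percent" figure is simple arithmetic. Assuming the U.S. share of global CO2 output is roughly 25 percent, a 15 percent domestic cut reduces global output by:

```latex
\underbrace{0.25}_{\text{U.S. share}} \times \underbrace{0.15}_{\text{U.S. cut}} = 0.0375 \approx 3.8\% \text{ of global CO}_2 \text{ output}
```

And since the U.S. share is projected to shrink as China and other developing nations grow, the global effect of the same domestic cut would shrink with it.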
Even if a nominal benefit is achieved, Feldstein says that cap-and-trade will devastate the economic lives of Americans. The higher taxes, higher prices, and slower growth, he predicts, will kill the nation’s chances of recovery from the 2009 recession. Everyone wants clean air. But cap-and-trade means diverting massive resources—billions of hard-earned taxpayer dollars and job-creating capital—into a government-created, politically driven artificial market that is, at best, an economic version of Biosphere 2.
REAL WORLD LESSON
Reflecting spontaneous decisions of thousands or even millions of people, markets are economic “ecosystems” whose behavior cannot be duplicated or controlled according to the preconceptions of a handful of bureaucrats.
Q WEREN’T THE ORIGINAL ANTITRUST LAWS NEEDED TO SET THE BASIC CONDITIONS OF COMPETITION IN FREE MARKETS?
A NO. ANTITRUST ACTIONS ARE MOST OFTEN EXAMPLES OF “RENT SEEKING,” ATTEMPTS BY LARGE COMPANIES TO STRIKE BACK AGAINST THEIR MOST SUCCESSFUL COMPETITORS BY USING THE LEGAL SYSTEM.
We’ve already discussed why, in a free-market economy, competition laws aren’t needed. Sooner or later, natural forces of creative destruction undermine even the biggest players in a market. Unfortunately, free-market opponents have for generations failed to appreciate this basic Real World principle. That failure of understanding has given us largely unnecessary and highly destructive antitrust laws.
Since the Sherman Antitrust Act of 1890, Congress has enacted laws ostensibly designed to ensure competition and protect consumers. They have resulted in antitrust actions or investigations—and often massive penalties—against companies ranging from the old Standard Oil and U.S. Steel to IBM, Microsoft, Staples, Toys “R” Us, and others.
Public antitrust debates generally center on whether the activities of a corporate giant—say, a Wal-Mart or a Google—are “anticompetitive” and deserve antitrust intervention. Take the widely debated Microsoft case, the most celebrated antitrust action in recent history. Some company supporters argued that government action wasn’t needed, because high tech, with its ever-emerging new technologies, was a new competitive game. Microsoft was therefore different from early antitrust cases, such as Standard Oil, that involved low-tech marketplaces for more limited resources—and thus did not deserve the antitrust prosecution inflicted upon those early industrial “monopolies.”
But this skirts a far bigger question: were the Sherman Act and other early competition laws truly needed in the first place? Historians and economic and legal experts are now saying that the nation’s earliest antitrust laws and court cases were then, as now, totally unnecessary—and economically destructive.
Ostensibly, antitrust laws are intended to preserve competition and to protect consumers from being harmed by overly powerful, monopolistic companies. But were the targets of those classic antitrust actions really hurting consumers? In other words, were they behaving as monopolies?
Antitrust historian Dominick Armentano examined fifty-five of the most famous antitrust cases in U.S. history. He found that the targets of classic antitrust actions rarely, if ever, could be considered monopolies. Armentano cites the classic case of Standard Oil of New Jersey, whose growth through mergers caused it to be broken up in 1911 by the government.
Standard never even monopolized petroleum refining, let alone the entire oil industry (production, transportation, refining, distribution). Even in domestic refining, Standard’s share of the market declined for decades prior to the antitrust case (64% in 1907) and there were at least 137 competitors (firms like Shell, Gulf, Texaco) in oil refining in 1911.26
Other supposed monopolies targeted in classic cases included IBM, which at the time had about 65 percent of the mainframe computer market. In the early 1960s, the government also ruled that Brown Shoe Company would become a monopoly if it acquired Kinney Shoes. But the two companies together would have had a 2 percent market share, a far cry from monopolistic dominance.
Were these goliaths hurting consumers? Far from it. Armentano says that in every single case, companies had been dropping prices, innovating, and expanding production. If there was any “harm” done, it was to less efficient competitors.
Standard Oil, for example, had been accused of controlling the market for kerosene. Yet kerosene prices during the period of supposed monopolization actually fell—from thirty cents a gallon in 1869 to about six cents a gallon at the time of the trial.
In the 1930s and 1940s, the Aluminum Company of America (ALCOA) was the subject of a thirteen-year-long antitrust case. What had the company done? Developed refining methods that had lowered aluminum ingot prices. Another target, the old American Can, was accused of “coercing” companies into signing long-term leases by offering generous price discounts for large orders of its cans.
Armentano and others say the classic antitrust cases weren’t needed to protect consumers any more back then than they are now. The early cases, like most today, were brought about by a group of large competitors hoping to use the government to protect or enhance their interests. Economists have a term for this: rent seeking.
In the case of the Sherman Act, rural cattlemen and butchers hoped to use the law as a way to reduce the pricing power of the big meat packers in Chicago, whose size gave them immense power to negotiate lower shipping prices. Meanwhile, the Standard Oil antitrust case produced what Dominick Armentano has called a “government sanctioned cartel in oil”—a marketplace where players and pricing had to meet the approval of Uncle Sam.
In the case of American Can, the judge actually forced the company to raise its prices in order to help less productive, higher-priced competitors. None of these outcomes could be said to help consumers.
What antitrust true believers fail to appreciate is that, as we explained in chapter 2, even the biggest market players fall prey to new competition and marketplace creative destruction—often from unexpected quarters.
Government tried—and failed—to break up the mammoth U.S. Steel in another classic case in 1911. The free market did the job instead: the company’s once-immense “market power” was gradually whittled away by competitors—not just by other steel makers but by alternatives like aluminum. U.S. Steel once made 67 percent of the steel produced in the United States. Today it produces only about 10 percent. Competition from foreign firms in countries like Japan and South Korea cut into the company’s market share. New domestic competition came from so-called minimills, such as Nucor, which made steel from scrap.
The irony of antitrust is that the economy’s only genuine monopolies are imposed or created by government. Think Fannie and Freddie, your local cable company, or, for those old enough to remember, “Ma Bell”—the old AT&T. Entities such as Medicare, the government’s health insurer, have a far greater ability to set prices and eliminate competition than any antitrust target in the private sector. When a company does these things, it’s accused of being “anticompetitive.” But when government does them, it’s ostensibly “a public good.”
REAL WORLD LESSON
Contrary to the view of antitrust believers, classic antitrust cases reflected market politics, not economics.
Q WHAT’S WRONG WITH MINIMUM-WAGE LAWS? DON’T THEY HELP PEOPLE?
A NO. THE MINIMUM WAGE IS A PROVEN JOB KILLER FOR UNSKILLED WORKERS.
What’s wrong with a law requiring that unskilled workers get a decent wage? This question has long evoked intense and heated debate. The late senator Ted Kennedy, for one, went so far as to argue that a government-mandated minimum wage was no less than
a defining issue about what our society is really about. Whether we reward work, whether we have respect for individuals that work hard and play by the rules, whether we are going to follow the great teachings of the Beatitudes, which inspire so many of us in terms of our responsibilities to our fellow human beings, and if we believe in those fundamental tenets of the Judeo-Christian ethic we cannot fail but to believe that the minimum wage must be a livable wage for all our fellow citizens.27
No doubt, emotional appeals like this can be persuasive. They’re one reason that 90 percent of nations have instituted some kind of minimum wage. The problem is that studies have shown that, in the Real World, a minimum wage does not help the poor. Instead it increases joblessness among unskilled people.
Why? The answer comes down to basic Real World economics: if you make unskilled workers more expensive to employ, employers will simply hire fewer people.
Economist Thomas Sowell is one of many experts who warn that Americans need to pay close heed to the European experience with the minimum wage.
Because minimum wage laws are more generous in Europe than in the United States, they lead to chronically higher rates of unemployment in general and longer periods of unemployment than in the United States—but especially among younger, less experienced and less skilled workers. Unemployment rates of 20 percent or more for young workers are common in a number of European countries. Among workers who are both younger and minority workers, such as young Muslims in France, unemployment rates are estimated at about 40 percent.28
Sowell says that by reducing the number of jobs available to the lowest-skilled workers, minimum-wage laws have similarly hurt minorities in this country, making it harder for them to get a foothold in the economy: “Blacks in general, and younger blacks in particular, are the biggest losers from such laws, just as younger and minority workers are in Europe.”29 Few people today realize that the unemployment rate of black teenagers was at one time about the same as that of white teens. But after steady increases in the minimum wage, it climbed to 40 percent by the late 1980s.
Sowell says that this was the intention of some of the supporters of the early minimum-wage laws, who had an openly racist agenda. “The last year in which the black unemployment rate was lower than the white unemployment rate in the United States was 1930. The next year, the first federal minimum wage law, the Davis-Bacon Act, was passed. One of its sponsors explicitly stated that the purpose was to keep blacks from taking jobs from whites.”30
Concurring with Sowell, economist Walter Williams has called the minimum wage “one of the most effective tools in the arsenal of racists around the world.”31
A Real World truth ignored by politicians is that minimum-wage jobs are basically starter jobs that allow workers to enter the workforce and learn skills. As Walter Williams points out, no one questions when “college students forego considerable amounts of money in the form of tuition and foregone income so that they may develop marketable skills. It is ironic, if not tragic, that low skilled youths from poor families are denied an opportunity to get a start in life.”32
A Real World fact overlooked by well-intentioned advocates of minimum-wage laws: people do not earn the minimum wage for long. They quickly move up. According to the Heritage Foundation’s James Sherk, an expert on the minimum wage,
[B]etween 1998 and 2003—a time when the federal minimum wage did not rise—the median minimum wage worker earned a 10 percent raise within a year of starting work. During this period, over two-thirds of workers starting out at the minimum wage earned more than the minimum a year later. Once workers have gained the skills and experience that make them more productive, they can command higher wages.33
Sherk and his colleague Rea S. Hederman Jr. also say that minimum-wage workers rarely rely exclusively on their minimum-wage paychecks. The majority of them are under twenty-five and “typically not their family’s sole breadwinner. Rather, they live in middle-class households that do not rely on their earnings.”34 As for older minimum-wage workers, “the vast majority … live above the poverty line.” Nor do they fit the stereotype of a worker “living on the edge of destitution.” They report, “More than half—56 percent—work part-time jobs … while 45 percent have incomes over twice the poverty line.”35
In their study “Raising the Minimum Wage: Another Empty Promise to the Working Poor,” professors Richard V. Burkhauser of Cornell University and Joseph J. Sabia of the University of Georgia note that by 2003, only 17 percent of low-wage employees were living in poor households and only 9 percent of minimum-wage employees were actually the heads of such households.36
Furthermore, studies also show that in addition to making it harder to find jobs, minimum-wage increases have other unintended consequences. Michigan State University economics professor David Neumark and Federal Reserve researcher William Wascher believe the minimum wage encourages teenagers from lower-income families to drop out of school. With fewer part-time minimum-wage jobs available, they’re forced to work full-time. The researchers found that a 10 percent increase in the minimum wage caused teenage school enrollment in certain states to drop by 2 percent.
Raising the minimum wage not only hurts the poorest of the poor but also burdens society with higher prices. Walter Williams believes it is at least partly responsible for the decline in the services we associate with a gentler era.
When I was a kid growing up, neighborhood theatres had ushers to take you to your seats. … Now you don’t see ushers in theatres, and that’s not because Americans of today like to stumble down the aisles in the dark to find their seats. When you pulled into gasoline stations, there were young people out there to wash your windshield, fill your tank with gas, check the air in your tires, and check the water in your radiator. Now we have self-service stations, not because Americans today like to smell gasoline fumes and get gasoline on their shoes while they fill up the car. The minimum wage destroyed those kinds of jobs.37