Imagine growing up in a world where only the Macintosh operating system existed. If every computer you ever saw ran OS X, you would never know there were any other possible systems. In fact, you wouldn’t even know there was such a thing as an operating system. OS X would just be what a computer looks like. The same is true for money. The type of currency we use is the only one we know, so we assume its problems are part of the nature of money itself.
Something about our economy isn’t working anymore. But it’s hard to call attention to the flaws in our system or suggest improvements without challenging a few seemingly sacred truths about the money we use and what it was invented for. Since the Cold War—which is when we put “In God We Trust” on our money1—doing anything that called the tenets of capitalism into question has been considered traitorous. We were steeped in an ideological war over a whole lot of things, but it seemed to be mostly about the freedom of corporate capitalism versus the tyranny of state Communism. To deconstruct anything about capital—where it came from, how it worked, what it favored—felt very much like deconstructing the American way. To inquire into our currency’s origins was to call attention to its very inventedness. And no good could come of that—not when the faith of investors, consumers, and borrowers was so inextricably tied to our prospects for growth.
Corporate grants and tenure appointments went to economists whose research confirmed the merits of capitalism; those whose work fell outside this purview were shunned or blacklisted.2 As a result, the economists who passed through our finest universities ended up seeing the architecture of capitalism as a precondition for any marketplace. That’s why the economic models to emerge over the last half century, however complex and intelligently conceived, almost invariably assume that the given circumstances of our particular market economy are fixed, preexisting conditions. Our economists have been trained in an intellectual universe with just one recognized economic rule set. That might be fine if economics were part of the natural world, whose principles are discovered through science, but it’s not. The economy is less like a forest or a weather system than it is like a technology or a medium. It was created not by God but by people.
Indeed, if the chartered monopoly can be thought of as a piece of software, the central currency system on which it runs might best be understood as an operating system. The one we use—the bank-issued central currency of capitalism—is the only one most of us know. Even “foreign” money is just someone else’s bank-issued central currency. Like the fictional computer users who know nothing but Macs, we think the stuff in our wallets or bank accounts is money, when it’s really just one way of accomplishing some of money’s functions.
Luckily, as members of a digital society, we are adopting more hands-on approaches to many of our most entrenched systems—or at least are expressing a willingness to understand them more programmatically. As we have seen, an individual corporation can be recoded, like a piece of software, to create and distribute more value to its various stakeholders. To do so, it must prioritize value creation and circulation over growth. The company may even stop growing or begin to shrink, which is perfectly okay as long as it spends down its frozen capital to satisfy its creditors and investors and then arrives at an appropriate scale for its market.
But while such companies may better serve the needs of people and even culture, they are incompatible with the underlying economic operating system. If corporations stop growing, then the economy stops growing, and unless something else comes in and changes the equation, the whole house of cards comes down. This is less a function of corporate greed or investor impatience than of the currency system we use and the fact that we use it to the exclusion of all others. Its universal acceptance has allowed currency to become a largely unrecognized player in the economy, as if it were an original feature of market activity, like supply, demand, labor, or commodities.
Currencies, tokens, and precious metals have indeed been used as means of exchange for thousands of years; but debt-based, interest-bearing, bank-issued central currency is a very particular tool with very particular biases—most significantly, a bias for growth. Capitalism itself is less the driver of this currency than it is the result. Capital is not an ideology so much as an artifact of a kind of money—a way of exploiting a particular operating system that runs on growth.
There used to be many kinds of money, all operating simultaneously. This may seem counterintuitive today, when the very point of money is to be able to count how much you have and compare it to how much everyone else has. But before the invention of central currency, money’s primary purpose was to help people exchange goods with one another more efficiently than simple bartering allowed. Anything that promoted the circulation of goods between people was considered a plus.
In fact, prior to the emergence of the bazaar, most people didn’t have any need for money, anyway. They were peasants and farmed the land of a noble in return for a bit of the crop for themselves. The only money was precious-metal coin, either left over from the Roman Empire or issued by one of the trading centers, such as Florence. A bit more currency was issued to pay for soldiers during the Crusades, and some of this returned home with the survivors, along with the crafts and technologies of foreign lands.3
As we saw, this gave rise to the bazaar, where locals traded crops and crafts with one another and purchased spices and other “imports” from the traveling merchants.4 But there wasn’t enough gold and silver coin in circulation for people to buy what they wanted. Precious metals were considered valuable in their own right. What little existed was hoarded, often by the already wealthy.
People bartered instead, but barter just wasn’t capable of handling complex transactions. What if the shoemaker wants a chicken but the chicken farmer already has shoes? Barter facilitators arose to negotiate more complicated, multistep deals, much in the style of multiteam sports trades. So the shoes go to the oat miller, who gives oats to the wheelwright, who makes a wheel for the chicken farmer, who gives a chicken to the shoemaker. But the relative values of all these items were different, making the brokered barter system incapable of executing these complicated transactions with any efficiency.
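The brokered barter chain above can be sketched as a toy program: each trader offers one good and wants another, and a deal closes only when the wants form a closed cycle. The traders and goods are the ones from the example; the function name and data layout are illustrative assumptions, not anything from the historical record.

```python
# Toy model of the brokered barter problem: each trader offers one good
# and wants another. The broker's job is to find a closed chain of trades
# so that every participant's want is satisfied.

traders = {
    "shoemaker":      {"offers": "shoes",   "wants": "chicken"},
    "oat_miller":     {"offers": "oats",    "wants": "shoes"},
    "wheelwright":    {"offers": "wheel",   "wants": "oats"},
    "chicken_farmer": {"offers": "chicken", "wants": "wheel"},
}

def find_trade_cycle(traders, start):
    """Follow each trader's want to whoever offers that good, until the
    chain loops back to the starting trader. Assumes a closed cycle exists."""
    offers_index = {t["offers"]: name for name, t in traders.items()}
    chain, current = [start], start
    while True:
        supplier = offers_index[traders[current]["wants"]]
        if supplier == start:
            return chain  # closed cycle: every want is satisfied
        chain.append(supplier)
        current = supplier

print(find_trade_cycle(traders, "shoemaker"))
```

The point the sketch makes is the one in the text: even four traders require a full circuit of deals before anyone gets what they want, which is why brokered barter scaled so poorly.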
That’s when clever merchants invented currencies based on something other than precious metal. Instead, vendors whose sales over the course of a market day were fairly regular could issue paper receipts for the chickens or loaves of bread they knew they would sell by the end of the day: “This receipt is redeemable for one chicken at Mary’s chicken stand.” Market money could be issued by any merchant whose sales were stable enough to engender trust.5
So at the beginning of the market, the chicken farmer could spend her chicken receipts and the shoemaker could spend his shoe receipts on the items they needed, jump-starting the whole marketplace. The receipts would then circulate through the market throughout the day—just like money—until they got into the hands of people who needed chickens or shoes, at which point they would be exchanged with the original merchant for goods. To make matters even easier, the receipts would have a declared value stamped right on them—the market price of the products they represented. At the end of the day, extra receipts would be brought back to the merchants who issued them in return for metal coin, or saved for the next market day. The purpose of the money was not to make the issuer rich but to promote transactions in the marketplace and make everyone prosperous by getting trade moving.6
Almost anything could be represented as currency. Another very popular, longer-lasting form of money was the grain receipt. A farmer would bring his crop to the grain store and receive a written receipt for the amount of oats or barley he brought in. The receipt might be for a hundred pounds of grain, which had an equivalent numerical value in coin. It would usually be printed on thin metal foil, with perforations on it so that the farmer could tear off a piece and spend a portion at a time.7
Since the grain was already banked and in a facility that wasn’t going anywhere, grain receipts tended to have a bit more long-term value. But they couldn’t be hoarded like precious metals. Instead of gaining value over time, grain receipts lost value. The people running the storage facility had to be paid, and a certain amount of grain was lost to spoilage and vermin. So the issuing grain store reduced the value of the receipts by a specified amount each month or year. A hundred-pound receipt in March might be worth only ninety pounds of grain by December.
Again, this wasn’t so much a problem as a feature of this money. People were incentivized to spend receipts as soon as they received them. Money moved through the economy quickly, encouraging transactions. Ideally, someone who needed grain ended up with the receipt just before its next date of reduction and redeemed it for the oats.
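A minimal sketch of how such a depreciating receipt works, using the illustrative figures from the text (a hundred-pound receipt shrinking to ninety pounds between March and December). The function name and the flat monthly fee are assumptions of this sketch.

```python
# Toy model of a depreciating grain receipt (what economists later called
# "demurrage"). The grain store deducts a fixed storage fee each month,
# so a receipt buys less grain the longer it is held -- an incentive to
# spend it rather than hoard it.

def receipt_value(face_pounds, months_held, monthly_fee_pounds):
    """Pounds of grain the receipt still buys after storage fees."""
    return max(face_pounds - monthly_fee_pounds * months_held, 0)

# A hundred-pound receipt issued in March, losing ten pounds of value
# over the nine months to December (just over a pound per month):
for months, label in [(0, "March"), (4, "July"), (9, "December")]:
    print(label, receipt_value(100, months, 10 / 9))
```

Holding the receipt is a guaranteed loss, so the rational move is exactly the one the text describes: pass it on quickly, keeping goods in circulation.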
These local moneys worked right alongside the long-distance precious-metals currencies. Gold coins and silver pennies were still required by traveling merchants, who had no real use for stored grain or a future pair of shoes. They also provided easy metrics through which to denominate all those local currencies. The declared value of a loaf of bread on a bread receipt could be some fraction of a gold coin, making it easier for consumers to negotiate transactions, as well as for issuers to reconcile unredeemed receipts with one another at the end of the day.
The lords and monarchs tolerated all this trade for a while but began to resent people putting real monetary values on their self-issued currencies. Besides, the more people traded laterally, that is, with one another, the less dependent they were on the aristocracy. The peasants were getting wealthy from the bottom up, in an economy whose strength was based on the robustness of its transactions. Growth, a happy side effect of their increased capacity to transact, had to be appropriated; in appropriating it, the aristocracy turned it into a financial weapon.
The nobles hired financial advisors—mostly Moors, who had more advanced arithmetic techniques than the financiers of Europe—to come up with monetary innovations through which the wealthy could retain their class advantages over the rising middle class. We already saw how the chartered monopoly would give royals the ability to assign entire industries to particular companies in return for stock in the enterprise.8 But not every industry was that scalable—at least not back then. Kings also needed a way to extract value from all those little transactions between people and, ideally, to slow down all that economic activity so that the middle class did not overtake them.
So one by one, the monarchs of the late Middle Ages and early Renaissance outlawed local currencies and replaced them with what amounted to coin of the realm.9 By law, people were forbidden to use any other currency—a rule officially justified, ironically, by the fact that the non-Christian icons of the Muslims appeared on some of the coins people had been using since the Crusades.10 The real reason, of course, is that with absolute control over coin, monarchs could exert absolute control over their economies. People protested and much blood was shed, but they lost the right to issue their own currencies. Instead, all money would be coined by the king’s treasury. As many economic historians have noted, this allowed the monarch to tax the people simply by debasing the currency and keeping the extra gold. What these same historians seem loath to point out, however, is that monarchs made money simply by issuing coin.11 The monetary system itself gave those who owned capital a way to grow it.
In a practice analogous to the way central banks issue currency to this day, monarchs created coin by lending it into existence. If a merchant wanted cash to purchase supplies or inventory, he needed to borrow it from the king’s treasury, then pay it back with interest. It was a bet on future growth. Unlike market money, which had no fees, or grain receipts, whose fees went toward a working grain store, central currency cost money. If people wanted to use money, they would have to pay for the privilege.
This was a brilliant, if exploitative, innovation: money whose core function was to make wealthy people more wealthy. Since the aristocrats already had wealth, they were the only ones who could participate in the new supply side of money lending. If people and businesses in the real economy wanted to purchase anything, they would have to get some cash from the central treasury. Then they’d have to use that money to make some deals and somehow end up with more money than they started with. Otherwise, there was no way to pay the lender back the principal and the additional interest.
So if a merchant borrowed a thousand coins from the treasury or its local agents, he might have to pay back twelve hundred by the end of the year. Where did the other two hundred come from? Either from someone else who went bankrupt (and was therefore facing debtors’ prison) or, in the best case, from some new borrower. As long as there was more new business, there was more money being borrowed to pay the interest on the money borrowed earlier. This was great for the wealthy, who could sit back and earn money simply by having money.
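The arithmetic above can be written down directly. In this sketch, which uses the text’s figures (a thousand coins borrowed, twelve hundred owed) with illustrative names and an assumed flat 20 percent rate, aggregate debt always exceeds the coins in circulation, so the gap can only be closed by someone else’s default or by fresh borrowing.

```python
# Toy model of interest-bearing central currency: every coin in
# circulation was lent into existence, so the total owed always exceeds
# the total money supply. The shortfall must come from a bankruptcy or
# from a new borrower taking out a new loan.

def total_owed(principal_coins, rate_percent=20):
    """Principal plus interest, in whole coins."""
    return principal_coins + principal_coins * rate_percent // 100

money_in_circulation = 1000          # one merchant borrows 1,000 coins
debt = total_owed(money_in_circulation)

print(debt)                          # coins owed at year's end
print(debt - money_in_circulation)   # shortfall that must come from elsewhere
```

The shortfall is structural, not a matter of any one merchant’s skill: as long as the currency bears interest, repayment in aggregate requires the pool of lending itself to keep growing.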
Participants in the bazaar didn’t fare so well. This new money was still scarce and expensive. Where market money was as plentiful as the demand for goods at the market, central currency was only as abundant as the participants’ credit. Merchants who used to cooperate now competed against one another for coin in order to have enough to pay back their loans. Frequent currency debasements also led people to hoard money out of fear that current coin had more gold in it than whatever was coming next. Moreover, everything in the market now cost more, because money itself was extracting value from people in the form of interest. They weren’t just paying for a chicken, but also for the chicken farmer’s debt overhead. If they were participating in growth businesses, they might have stood a chance of keeping up with the cost of capital. But these were largely subsistence enterprises.
In country after country where local moneys were abolished in favor of interest-bearing central currency, people fell into poverty, health declined, and society deteriorated by all measures.12 Even the plague can be traced to the collapse of the marketplace of the late Middle Ages and the shift toward extractive currencies and urban wage labor.
The new scheme instead favored bigger players, such as chartered monopolies, which had better access to capital than regular little businesses and more means of paying back the interest. When monarchs and their favored merchants founded the first corporations, the idea that they would be obligated to grow didn’t look like such a problem. They had their nations’ governments and armies on their side—usually as direct investors in their projects. For the Dutch East India Company to grow was as simple as sending a few warships to a new region of the world, taking the land, and enslaving its people.
If this sounds a bit like the borrowing advantages enjoyed today by companies like Walmart and Amazon, that’s because it’s essentially the same money system in operation, favoring the same sorts of players. Yet however powerful the favored corporations may appear, they are really just the engines through which the larger money system extracts value from everyone’s economic activity. Even megacorporations are like competing apps on a universally accepted, barely acknowledged smartphone operating system. Their own survival is utterly dependent on their ability to grow capital for their creditors and investors.
Central currency is the transactional tool that has overwhelmed business itself; money is the tail wagging the economy’s dog. Financial services, slowly but inevitably, become the biggest players in the economy. Between the 1950s and 2006, the percentage of the economy (as measured by GDP) represented by the financial sector more than doubled, from 3 percent to 7.5 percent.13 This is why, as Thomas Piketty demonstrated in Capital in the Twenty-First Century, the rate of return on capital exceeds the growth rate of the economy.14 Money makes money faster than people or companies can create value. The richest people and companies should, therefore, position themselves as far away from working or creating things, and as close to the money spigot, as possible.
Some companies, such as General Electric in the 1980s, understood this principle quite well and acted on it. They came to realize that their core enterprises were really just in service of the much more profitable banking industry. GE’s CEO Jack Welch determined that the company made less money making and selling washing machines than it did lending money to consumers to purchase those washing machines. So he began selling off GE’s factory businesses and turning the corporation into more of a financial services company. The washing-machine companies were sold to the Chinese. The new GE made loans, sold insurance, and provided capital leasing.
This worked quite well for the company and for those who followed in Jack Welch’s footsteps.15 His approach was canonized by Harvard and other top business schools, which began training their graduates to see productive industries as mere stepping-stones to becoming holding companies. The further up the money chain you can get—the more like a bank issuing money—the better.
The American economy became almost entirely dependent on its companies’ and citizens’ willingness to use credit. It didn’t matter what they bought with that credit. Most of it went to Chinese goods, but that didn’t matter. We were growing not the real economy of goods and services but the synthetic economy of money itself. The Western companies at the top of the food chain were selling credit, not consumables. The perfect productless product. Those unwilling to participate were researched by psychologists, then subjected to techniques of “behavioral finance” until they got with the program.16
To many of us, the whole system seemed to be working. Regular people began taking mortgages out on their homes every time real estate values went up, as a way of generating more capital for themselves. Even Alan Greenspan thought the triumph of capital and credit meant we had embarked on a new era of riskless investing. As the then chairman of the Federal Reserve remarked in 2000: “I believe that the general growth in large [financial] institutions has occurred in the context of an underlying structure of markets in which many of the larger risks are dramatically—I should say, fully—hedged.”17 That is, right up until the crash of 2007, when there turned out not to be enough real economic activity to support the overcrowded field of moneylending.
Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity Page 15