The People's Republic of Walmart


by Leigh Phillips


  The field of cybernetics had been ideologically taboo, officially condemned as an American mechanism of neutering worker control. Under Khrushchev, a reversal had occurred: the Academy of Sciences was now publishing the journal Cybernetics in the Service of Communism, and Moscow had ordered the building of computer factories. Victor Glushkov, the founder of Soviet cybernetics, even got the green light from the premier to develop a decentralized computer network—a Soviet internet—but it was never completed. It was too little, too late. By the time the Thaw drew to a close, with the putsch toppling Khrushchev and the return of the Stalinists in 1964, Soviet computing was far behind its Western counterparts. There was no common standard, and computers and peripherals were frequently incompatible. The country’s limited computing power was a primary reason for the failure of its manned lunar program, and in the early 1970s, the Soviet leaders decided to abandon development of a domestic computer industry, opting instead to pirate Western computers.

  Compounding this abandonment of computation, authoritarianism had not disappeared; revived under Brezhnev, it once again undermined the quality of information needed to engage in planning. And once the great economic cushion of oil was discovered in Siberia, the cyberneticians, computer scientists and economic reformers who were still committed to planning appeared no longer to be needed. The next generation of economic reformers, in the ’80s, would be of a more market-oriented variety, having all but given up on the idea of planning and socialism.

  After the fall of the Soviet Union, the debate naturally became something of an academic discussion, rather than a live controversy, and certainly a discourse that was lost to those engaged in day-to-day social justice struggle.

  But in the 1990s, two progressive computer scientists, Paul Cockshott at the University of Glasgow and his collaborator, economist Allin Cottrell at Wake Forest University, began to argue in a series of academic papers that improved algorithmic techniques had once again made the question worth exploring.

  In their 1993 book, Towards a New Socialism, a text that in places reads less like a left-wing polemic than a university programming textbook, Cockshott and Cottrell argue against the idea that planning is destined to fail, employing new knowledge from the world of computer science: “Modern developments in information technology open up the possibility of a planning system that could outperform the market in terms of efficiency (in meeting human needs) as well as equity.”

  Computers are better than markets—so went the argument. All the worries of Mises and Pareto—that while in theory, socialist economic calculation is no different from market calculation, it remains impractical—were being made moot by technological change. However, they contend, while the project is made easier by some level of technical sophistication, it is not so much the availability of superfast central computers that has been the major constraint. A distributed planning network of quite modest personal computers, linked by an economy-wide telecommunications system and employing a standardized system of product identification and computer databases, would be sufficient. It would, however, require universal access to computers and the free flow of information.

  Given a new lease on life by the advent of new technologies, the debate has continued into the 2000s. A 2002 rejoinder to the Cockshott-Cottrell perspective from Polish logician Witold Marciszewski of the University of Warsaw argued that socialist planning would require what are called super-Turing machines, or hypercomputers—theoretical computers that go beyond the computability of standard computers, which some claim are not only physically impossible to build, but logically impossible to devise. And in 2006, Robert Murphy, a young Austrian School economist with the Pacific Research Institute, a Californian free-market think tank, employed set theorist Georg Cantor’s diagonal argument to claim that the list of prices in any planning board’s matrix would need to contain not merely billions or trillions of prices, but—as with the set of all real numbers or the set of all subsets of the integers—an uncountably infinite number of them, therefore making economy-wide socialist calculation impossible in principle, not just in practice, because the full list of prices could never be enumerated. Think about it this way: however large the set of integers is, stretching off into infinity (0, 1, 2, 3 …), given an infinite amount of time, you could count them just by listing one after the other. But the infinity of real numbers that fits just between 0 and 1 is even larger: its members can never be matched up one-to-one with the integers, and so it could never, even with an infinite amount of time, be counted. It is this second sort—an uncountable infinity—that Murphy says describes the full set of prices needed to engage in planning.
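
  For readers who want to see the formal move behind the "uncountable" claim, here is the standard textbook statement of Cantor's diagonal argument, sketched in notation rather than prose (the usual construction, not Murphy's own presentation):

```latex
\begin{align*}
&\text{Suppose every real in } (0,1) \text{ appeared on a list: } r_i = 0.d_{i1}d_{i2}d_{i3}\ldots\\
&\text{Define } x = 0.x_1 x_2 x_3\ldots \text{ where } x_i =
  \begin{cases} 5 & \text{if } d_{ii}\neq 5\\ 6 & \text{if } d_{ii}= 5 \end{cases}\\
&\text{Then } x \text{ differs from } r_i \text{ in the } i\text{th digit for every } i,\\
&\text{so } x \text{ appears nowhere on the list: no enumeration can exhaust } (0,1).
\end{align*}
```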

  Essentially, Cockshott, Cottrell, Marciszewski, Murphy and a handful of others had revived the long-dormant calculation debate but recast it as a problem for the field of computational complexity theory, a branch of theoretical computer science that seeks to classify the inherent difficulty of different sorts of problems, and the resources needed to solve them. In the same way that neuroscientists have in recent decades stolen debates over the theory of mind away from philosophers, complexity theorists and computer scientists are stealing this debate away from economists and political scientists.

  However, the discussion still largely remains hidden within the realm of scientific journals—and even there, for many, it has become something of a mathematical parlor game. There is no active audience outside a tiny sprinkling of academics. Again, it’s capitalist realism: “Of course a nonmarket economy is absurd, Jim, but just as an exercise for my students …”

  Published barely two years after the 2008 financial crisis, Francis Spufford’s novel about economic planning, Red Plenty, prompted a burst of responses, particularly online. Perhaps the most interesting among them was a lengthy essay from self-described “vaguely lefty” Carnegie Mellon statistician Cosma Shalizi, who “learned linear programming at my father’s knee as a boy.” In it, he argues against Spufford’s hope that as processing power improves, the idea of planning can return. He shows how computation of a list of optimal prices by planners turns out to be as complex as computation of the optimal plan itself, due to the interdependency of all the possible variables within an economy. Roughly speaking, he is making a similar argument to those of Murphy and Marciszewski, although he does at least concede that rather than being outright impossible, the problem could become technically tractable after a century of Moore’s law (which posits that computing power doubles approximately every two years) holding true.

  But this places optimal planning in the realm of science fiction, rather than that of serious options that can be considered today. We fall back on the depressing position that prices in the market are just a better mechanism for the processing of all the information needed to efficiently allocate resources. Why expend such vast energy constructing what is otherwise immanent in market exchange?

  “We need … some systematic way for the citizens to provide feedback on the plan, as it is realized,” Shalizi writes. “There are many, many things to be said against the market system, but it is a mechanism for providing feedback from users to producers, and for propagating that feedback through the whole economy, without anyone having to explicitly track that information.” Now, unlike Murphy and Marciszewski, Shalizi is no arch–free marketeer. He acknowledges, and is horrified by, what markets produce: “At the extreme, the market literally starves people to death, because feeding them is a less ‘efficient’ use of food than helping rich people eat more.”

  He recognizes that in many domains (at least in some countries)—such as education, healthcare, policing, the fire department, search and rescue, and disaster response—planning, rather than the market, is used to allocate resources and does a far better job. So, like Nove, he advocates a mixed economy where some goods and services are removed from market allocation.

  But this is a fudge. If the market allocation argument is correct, it should be correct for these realms as well. Why should healthcare, education and the fire department work so well if the theory shows that they should entail monstrous inefficiencies? (Indeed, libertarians make exactly this argument: that there should be a market not merely in health and education, but also in policing, fire services and the armed forces.) In another inversion of the old rightist canard that communism works in theory but not in practice, communism again appears to work in practice but not in theory.

  But between gross inefficiencies in allocation of resources and absolutely perfect, immaculate optimization, there is reality—where people actually live. There is a series of confusions here that relate to the complexity of coming to an exact algebraic solution to a problem, as opposed to getting an acceptable economic answer to a problem. According to Cockshott, if you take a large economy and use standard input-output techniques—the method developed by Russian American economist Wassily Leontief to represent interlocking economic relationships, today commonly used to calculate GDP—you can represent it as a huge matrix, with a column for every industry and rows for how much of each other industry’s output it consumes. So in, say, the steel industry’s column, the bottom will say how much steel is produced, while the rows will indicate how much coal, how much iron ore, or how much limestone it uses.

  Now, in principle, the number of steps in this matrix calculation to reach a certain mixture of final output will grow as the cube of the size of your matrix; so if you have a matrix with, say, 10 million entries in it, it will appear that to come up with an answer, the number of steps required will be 10 million to the power of three. But this holds only if you insist on writing everything out as a full matrix—because if you did, you’d find almost all the entries in the matrix would be zero, since you don’t use, say, limestone in the making of a book. Most things aren’t used as inputs in most other processes. Therefore, most products require only a small number of inputs.
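
  To make the bookkeeping concrete, here is a minimal sketch in Python of the kind of input-output table being described. The industries, coefficients and demand figures are invented for illustration; a real planning model would store the (overwhelmingly zero) table as a sparse matrix and use an iterative solver rather than the small dense one shown here.

```python
import numpy as np

# Toy Leontief table: A[i, j] = units of industry i's output needed to make
# one unit of industry j's output. Industries and numbers are made up.
industries = ["coal", "iron ore", "limestone", "steel", "books"]
A = np.zeros((5, 5))            # almost every entry stays zero (sparsity)
A[0, 3] = 0.4                   # steel uses coal
A[1, 3] = 0.6                   # steel uses iron ore
A[2, 3] = 0.1                   # steel uses limestone
A[3, 4] = 0.05                  # books use a little steel; no limestone at all

d = np.array([10.0, 0.0, 0.0, 20.0, 100.0])   # final demand for each good

# Gross output x must cover intermediate use plus final demand:
#   x = A @ x + d,  i.e.  (I - A) x = d
x = np.linalg.solve(np.eye(5) - A, d)

for name, output in zip(industries, x):
    print(f"{name:10s} {output:8.1f}")
```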

  “The conception that everything affects everything,” says Cockshott, “is not true. You can disaggregate many aspects of the economy.” Through experimentation, Cockshott and his colleagues suggest that this disaggregation allows the number of steps to grow logarithmically rather than exponentially, enormously reducing the complexity of the problem. In essence this means that at first there is a rapid increase in the number of steps, followed by a period where the growth slows. But the growth nonetheless keeps going, as opposed to a case where the number of steps begins slowly and then increases very rapidly as you go on.

  Cockshott explains: “You say: ‘I only want to get an answer to three significant figures, because how many businesses really can plan their output to more than this?’ Because you don’t want an exact solution, but an approximation to a certain number of significant figures.” This rougher requirement also limits the number of iterations of the algorithm you have to run. “So when you actually look at it in terms of a practical problem in terms of how the data is really structured, what the real world demands, you find you’re dealing with something very much simpler than the abstract algebra would suggest.” This is something that is now relatively well known in computer science: there are many algorithms for problems that are in principle intractable, but that in practice solve a great many real-world instances, because the intractability bites only for certain ranges of inputs.
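
  The point about precision can be put in a few lines of code. Below is a rough sketch, in Python, of a simple fixed-point iteration on a tiny made-up two-good economy, stopping once successive estimates agree to roughly three significant figures; the stopping rule and the numbers are this sketch's own assumptions, not Cockshott's algorithm or data.

```python
import numpy as np

def plan_iteratively(A, d, sig_figs=3, max_iter=1000):
    """Estimate gross output x satisfying x = A @ x + d by repeated substitution,
    stopping once successive estimates agree to the requested number of
    significant figures. An illustrative sketch, not Cockshott's method."""
    tol = 10.0 ** (-sig_figs)
    x = d.copy()                              # start from final demand alone
    for step in range(1, max_iter + 1):
        x_next = A @ x + d                    # add another round of intermediate demand
        rel_change = np.max(np.abs(x_next - x) / np.maximum(np.abs(x_next), 1e-12))
        x = x_next
        if rel_change < tol:                  # "good to three significant figures"
            return x, step
    return x, max_iter

# A made-up two-good economy: each good uses a little of the other as an input.
A = np.array([[0.0, 0.2],
              [0.1, 0.0]])
d = np.array([100.0, 50.0])
x, steps = plan_iteratively(A, d)
print(f"converged in {steps} iterations: gross outputs = {x}")
```

  The looser the tolerance, the sooner the loop stops; asking for more significant figures simply costs more iterations over the same data.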

  Cockshott has pushed the debate from the realm of theory to experimentation. It’s very difficult to do practical research in planning for obvious reasons, but after testing his ideas with a modestly advanced departmental computer costing around £5,000, he claims to have solved such optimizing equations for an economy roughly the size of Sweden in about two minutes. He projects that if he had used the sort of computers used by his university’s physics department or any weather-forecasting center, then it would be a very simple matter for larger economies, with the cycle time for computation on the order of hours, rather than months or years or millions of years.

  “It’s relatively easy to show that these algorithms are tractable. They’re polynomial or subpolynomial. They’re in the best tractability class. They’re easily amenable to industrial-scale economies with a fraction of the processing power that Google has.”

  The question, then, turns to the collection of the right information. But this too is becoming easier, as products are increasingly tracked using barcodes, and purchasers and suppliers share vast databases containing information monitoring every aspect of production, the ordering of components, calculating of costs, and so on.

  Now, all of this is an extraordinary claim. Cockshott’s methodology and results need to be interrogated and replicated by other researchers. But some of this replication has already happened right under our noses. The colossal multinational corporations and financial institutions already engage in planning internally, but on a worldwide scale, coordinating economic activities continents apart. Cockshott points to air transport as the first industry to be subject to comprehensive computerized planning, under the Boadicea airline booking system that launched in the 1960s. Shipping clerks are also long since a thing of the past.

  To be clear: a non-market economy is not a question of unaccountable central planners, or equally unaccountable programmers or their algorithms making the decisions for the rest of us. Without democratic input from consumers and producers, the daily experience of the millions of living participants in the economy, planning cannot work. Democracy is not some abstract ideal tacked on to all this, but essential to the process.

  And most importantly, computer-assisted, decentralized, democratic economic decision making will not arise as a set of technocratic reforms of the system that can simply be imposed. First there must be a fundamental transformation of the relations and structures of society, including the confection of new networks of interdependence—frameworks that the masses of people will have to fight for, build and ultimately sustain. While such a system can and must be built from the ground up, to reach the scale of what is realistically required both to construct a just economy and to deal with the ecological crisis, this system will have to be global and thoroughgoing in its demands for both human liberation and technological advance.

  9

  ALLENDE’S SOCIALIST INTERNET

  The story of Salvador Allende—president of the first-ever democratically elected Marxist administration, who died when General Augusto Pinochet overthrew his barely three-year-old government in a US-backed coup on September 11, 1973—is well known and lamented among progressives. For much of the Left, the crushing of the Allende administration represents a revolutionary road not taken, a socialism unlike that of the Soviet Union or China, committed to constitutional democracy, the rule of law and civil liberties, even in the face of fascist paramilitary terror. The litany of human rights horrors committed under Pinochet and tales of los desaparecidos, or “the disappeared”—a euphemism for the more than 2,000 of Pinochet’s secretly abducted victims whose fate the state refused to acknowledge—have until recently eclipsed a bold and pioneering experiment in cybernetic economic planning that was initiated under Allende.

  The project, called Cybersyn in English and Proyecto Synco in Spanish, was an ambitious (perhaps overambitious) effort to network the economy, and indeed, society. It has been described in the Guardian, not without reason, as a “socialist internet”—an endeavor decades ahead of its time.

  Largely unknown for decades, it has finally received its due. Around the time of the fortieth anniversary of Pinochet’s coup, a suite of articles appeared in the mainstream media, from the New Yorker to the popular podcast 99% Invisible, many drawing on the extensive research and interviews with the architects of Cybersyn performed by electrical engineer and technology historian Eden Medina for her 2011 volume on the triumphs and travails of the Cybersyn team, Cybernetic Revolutionaries. The flurry of interest in Cybersyn today, and the recovery of its story, is due in part to its remarkable parallel to the US military’s Advanced Research Projects Agency Network (ARPANET)—the precursor of the internet—and the revelation, like something out of an alternate universe, that an internet-like structure may first have been developed in the global South. The attraction to the tale of Chile’s socialist internet is likely also due to the raft of lessons for today offered by this artifact from Allende’s democratic revolution—“flavored with red wine and empanadas,” as he put it—on privacy and big data, the dangers and benefits of the Internet of Things, and the emergence of algorithmic regulation.

  Our interest here, though, is primarily to consider Cybersyn in terms of its success or otherwise as an instrument of non-centralized economic planning. Freed from the Cold War’s constraints, we can today consider Cybersyn more objectively and ask whether it might serve as a model for leaping over both the free market and central(ized) planning.

  Cybernetics as Herding of Cats

  In 1970, the newly elected Popular Unity coalition government of Salvador Allende found itself the coordinator of a messy jumble of factories, mines and other workplaces that in some places had long been state-run, in others were being freshly nationalized, while some were under worker occupation, and others still remained under the control of their managers or owners. The previous centrist administration of Christian Democrat Eduardo Frei had already partially nationalized the copper mines, the source of the country’s largest export. Frei’s government had also developed a massive public house-building program and significantly expanded public education, all with substantial assistance from the United States. Washington was fretful that if it did not pay for social reforms, it would witness social revolution within the hemisphere that it viewed as its own. Thus, substantial sections of Chile’s relatively small economy were already in the public sector when the socialists took over, stretching the bureaucracy’s management capability to its limit. A more efficient strategy of coordination was required.

 
