Alongside these two pivotal, general institutions, local schools sprang up, the vocational écoles d’arts et métiers; and specialized industrial schools, often founded by employers, for training in particular branches: chemicals in Lyons, watchmaking in Besançon, textiles in Mulhouse. Some of this aimed to make up for the disappearance of the older apprenticeship system. Finally, such older technical institutions as the Conservatoire des Arts et Métiers—a museum to start—took to giving courses, often directed to adults who had passed beyond the normal sequence but wanted to bring themselves up to date.
The French initiatives were a beacon to countries farther east. The Polytechnique in particular sparked emulation in Prague, Vienna, Zurich, places as far off as Moscow. In addition, each country had its own combination of associated schools. The Germans, for example, developed a network of trade schools (Gewerbeschulen) that fed middle technological management; and a growing array of technical higher schools (technische Hochschulen)—the first in Karlsruhe in 1825—that taught at university level and formed generations of chemists and engineers. Finally, the Germans pushed scientific instruction and research in the universities. This was the cutting edge of experiment and inquiry, and the invention of the teaching laboratory (Justus Liebig, 1830s) capped an educational system that became by the end of the century the world’s envy and model.
The reliance on formal education for the diffusion of technical and scientific knowledge had momentous consequences. First, it almost always entailed instruction in abstract and theoretical matters that lent themselves to a variety of applications, old and new. I would emphasize the new. Secondly, it opened the way to new branches of knowledge of great economic potential.
Compare such schooling with the British strategy of learning by doing—the strategy that had driven the Industrial Revolution. This had worked well enough so long as technology remained an accretion of improvements and invention a recombination of known techniques. (Even so, one can only marvel at Britain’s continuing ability to generate appropriate genius and talent, much of it autodidact.) But from the late eighteenth century on, as the frontiers of technological possibility and inquiry moved outward, exploration went beyond the lessons of sensory experience.
These new directions found their biggest return in two areas, chemicals and electricity—in both, thanks to advances in scientific knowledge. The older chemical branches remained a kind of industrial cookery: mix, heat, stir, keep the good and dump the waste. They did not stand still. They gained especially from mechanization—bigger and faster kilns, mixers, grinders, and the like—as producers went after economies of scale. Another source of progress was the invention of uses for waste products (thus coal gas for illumination), sometimes in response to laws penalizing pollution. (Better to use the waste than be sued or fined.) But the revolutionary advances came in the new field of organic chemistry and derived directly from studies of carbon-based molecules. These opened the door to a multitude of applications, first in the field of dyestuffs (crucial to textile manufacture), then in pharmaceuticals and photography, and finally, toward the end of the century, in artificial matter—what we loosely call plastics.
Electricity was known, but not understood, by the ancients, and curious savants played with it, almost as with a toy, from the eighteenth century on. Such experiments could have practical consequences; hence Benjamin Franklin’s invention of the lightning rod. But the systematic use of electricity as a form of energy and its application to industrial processes had to wait for the nineteenth century, after research by such people as Volta, Ampère, and Faraday, whose names have been immortalized in scientific terminology. The first industrial applications were small though impressive: batteries (Voltaic piles), which could drive telegraphs and clocks; and electrolytic techniques, used especially to plate metals and cutlery. Both of these were pre-1850. But electricity’s flowering came with the invention of generators and dynamos to produce current in quantity and the building of a system of distribution. The biggest stimuli were Thomas Edison’s incandescent lighting (1879) and electric motors, which justified the outlay for overhead capital.
In both chemicals and electricals, learning and competence depended on formal instruction. These phenomena are not apprehensible by sensory perception; it takes diagrams and schemas to explain them, and the underlying principles are best learned in the classroom and laboratory. Here Continental reliance on schooling paid off, generating and imparting new technologies. Catching up turned into a leap ahead, while Britain, caught in the net of habit, fell behind.
In British electricity, moreover, local autonomies exacerbated the difficulty. In some places, municipal gas networks successfully opposed electrification; elsewhere Britain built a multiplicity of power networks, each with its own voltage arrangements and hardware. Later improvements only added to the menu. To this day, British buyers of electrical appliances must deal with a diversity of plugs and outlets, and customers pay shopkeepers to ready equipment for use. The British economy grew in these new branches as it had in the old—like Topsy.
This marriage of science and technique opened an era that Simon Kuznets called “modern economic growth.”14 It was not only the extraordinary cluster of innovations that made the Second Industrial Revolution so important—the use of liquid and gaseous fuels in internal combustion engines, the distribution of energy and power via electric current, the systematic transformation of matter, improved communications (telephone and radio), the invention of machines driven by the new sources of power (motor vehicles and domestic appliances). It was also and above all the role of formally transmitted knowledge.
The marriage of science and technique had been preceded by a number of couplings. One can take the courtship back to the Middle Ages, to the use of astronomical knowledge to transform navigation (the calculation of latitudes), the use of mathematics in ballistics, the application of the pendulum to the construction of a far more accurate timekeeper. And back to the steam engine, that classic triumph of scientific empiricism. But not until the late nineteenth century does science get ahead and precede technique. Now would-be inventors and problem solvers found it profitable to survey the literature before undertaking their projects; or for that matter, before conceiving their objective—what to do and how to do it.
So it was that the leader/innovator was caught and overtaken. And so it was that all the old advantages—resources, wealth, power—were devalued, and the mind established over matter. Henceforth the future lay open to all those with the character, the hands, and the brains.
The Secrets of Industrial Cuisine
Steel, we have seen, was always the metal of choice in the making of “white arms” (swords and daggers), knives and razors, edge tools, and files (crucial to the manufacture of precision parts). In the beginning, steel was an accidental by-product of smelting in furnaces that were not hot enough to produce a homogeneous mass and yielded some steel along with soft and hard iron. Later on, with the invention of the tall blast furnace operating at higher temperatures, one had to go through multiple processes to get from pig iron to steel. One way was to reheat the metal and burn off enough of the carbon to get down to 1.2-1.5 percent. The results were not even (it was not easy to stop at the right moment) and gave a variety of steels that were then used for different purposes. The best went for guns and fine cutlery; the poorer, for plowshares and sickles.
Another way was to remove the carbon to get wrought iron, and then add carbon to get steel. The adding was typically done by packing bars of wrought iron in carbon, heating and soaking, and then hammering. The aim was to beat the composite metal in such a way as to distribute the carbon evenly and homogenize the result—something like kneading dough. And just as kneading produces a more homogeneous dough by folding, pressing, and folding and pressing again, so the best of this cementation steel was folded upon itself, rehammered, and then again and again. The result was a layered bar of steel; the more the layers (that is, the more the folding and kneading), the nervier
and stronger the metal. The finest examples of this kind of work are the famous Japanese samurai swords, which still hold their edge and gleam after five centuries. Layered steel, invented in Europe in Nuremberg at the beginning of the seventeenth century (Nuremberg was an old center of tool- and instrument making), was immediately picked up by the English. The French did not learn the technique until about 1770.
But even samurai swords cannot compare for homogeneity with crucible steel, that is, steel heated to liquid so that the carbon additive mixes completely. The inventor of crucible steel, in 1740, was an English clockmaker, Benjamin Huntsman, who had an obvious professional interest in getting better metal for springs and files. The technique would remain a British monopoly for about three quarters of a century—not for want of would-be imitators.
The French in particular spent mightily to learn the secret. France was comparatively weak in steel and understandably saw this as a serious political disability. Early in the eighteenth century that scientific jack of all trades René Antoine de Réaumur (1683-1757), best known for his thermometer, claimed to have found the secret of what he himself compared to the “philosophers’ stone,” established a “royal manufactory” for the purpose of turning iron into steel, and got a generous government pension for his efforts. He failed, because he thought the answer lay in adding sulfur and the right salts. The role of carbon never occurred to him. He also thought that French iron was good enough for the purpose, unlike the British, who imported the finer Swedish iron for making steel.15 He should have looked around.
“This error of analysis and this ‘patriotic choice’ would long be accepted in France and would aggravate the backwardness of the national industry.”16 Others came forward subsequently and bragged that they had made steel comparable to the English and German product. No go. The biggest push came after Gabriel Jars’s English trip of 1765. Jars himself set out to produce cementation steel but got mediocre results, largely because he worked with French iron à la Réaumur; death in 1769 interrupted his efforts. Another technician named Duhamel, traveling companion of Jars and protégé of the minister Turgot, was hired by the comte de Broglie, owner of a forge and recipient of a government subsidy of some 15,000 livres, to undertake similar experiments. Fifteen years later the government was obliged to recognize that Duhamel was getting nowhere. Lesser metallurgists tried on their own initiative. No question about it: France needed steel and wanted to know how to make it.
Enter the Englishman, Michael Alcock of Birmingham, whom we came to know above. He told the French that there was nothing to it: making steel was easy; the hard part was making good steel. So with the help of Director of Commerce Trudaine de Montigny (son of the man who had sent Jars and Duhamel to visit England), he set up a plant of his own and produced samples of cementation and crucible steel. He never got beyond the stage of samples.
Meanwhile two of Alcock’s partners went off on their own and bought a small filemaking forge at Amboise on the Loire (better known for its royal château). The forge caught the interest of the duc de Choiseul and got the French government (again with the support of Trudaine de Montigny) to sponsor a “royal manufactory” of fine steels and subsidize it in the amount of 20,000 livres a year. The subsidy, however, came with a curse: the obligation to use French-made wrought iron. The enterprise invested heavily in equipment—six large furnaces, forty power hammers, eighty steel forges—and undertook experiment after experiment. To no avail. It never made crucible steel, and its cementation steel did not inspire confidence.
Other enterprises, more or less well connected but equally determined, also entered the race, aiming particularly at the manufacture of good files, which were becoming ever more important as mechanization advanced and metals replaced wood. One of them, in the Dauphiné, had the support of the intendant and of the financial group of the duc d’Orléans. To begin with, it set its sights low: the manufacture of blades for scythes and sundry hardware. But then it ran into trouble because of embezzlement. It was easier to take money than to make it.
Denis Woronoff, historian of the French iron industry, sums up: sixty years after Réaumur, the French steel industry was still “marking time.” Announcement after announcement of success had proved false. Not that the government inspectors were gullible or complaisant, but they put more emphasis on the theoretical purity of the metal than on its performance (hardness, edge, etc.). They were also “sold” on the importance of size (gigantisme) in circumstances of diseconomies of scale. The result was waste, dead ends, and commercial failure.17
After that came the revolution and Napoleon. More marking time. Only in the 1820s did the French learn how to make crucible steel, thanks to a British expatriate named James Jackson. The Germans did it about ten years earlier, essentially without outside help. The Swiss Johann Conrad Fischer, a keenly observant and indefatigably peripatetic visitor of foreign enterprises—his nose and eyes were everywhere—learned to do it from about 1805.18
It takes more than recipes, blueprints, and even personal testimony to learn industrial cuisine.
Genius Is Not Enough
In the mid-nineteenth century, the alkaloid quinine was of vital importance to British rule in India, where malaria enfeebled and killed civilian and military personnel. Quinine did not cure the disease, but it relieved the symptoms. At that time, quinine was obtained from the bark of the cinchona tree, which was native to Peru. The British government, working through the world-famous botanical gardens at Kew, was making strenuous efforts to obtain cinchona seeds in Peru, nurse them into seedlings, and then plant them in India, but the results proved disappointing. India remained dependent on high-cost imports from Java, where the Dutch had managed to obtain a better transplant. The British would have preferred their own supply.
William Henry Perkin, born in London in 1838, was the son of a builder and had no connection with India. His father wished him to be an architect (social promotion), but early on he wanted to do chemistry. In 1853, only fifteen, he entered the newly founded Royal College of Chemistry, then under the direction of a German scientist, August Wilhelm Hofmann, who liked the boy and took him on as an assistant. Hofmann put Perkin onto the importance of finding a way to synthesize quinine, and Perkin took the problem home with him to a little laboratory he had fitted up in his family house. He did not find a way to make quinine, but by the by he did obtain a precipitate from naphtha (an ingredient of coal tar), aniline black, from which he then derived the color aniline blue, or mauve. (Chemistry has always been a science of serendipity.)
Perkin was alert enough to recognize the value of his find. His blue coloring matter made an excellent dye, and after patenting it, Perkin, then only nineteen years of age, set up a plant for its manufacture with funds provided by his father and brother. That was the end of his training at the Royal College. From this first lucky strike to purposeful others, Perkin soon became a millionaire. And then, another turn: he went back to his first love, experimental and theoretical chemistry. Besides, the German chemical industry was leaving the British far behind.
This first artificial dye was the stuff of dreams—the beginning of the enormously important coal-tar color industry. Once Perkin had given the cue, chemists in England, France, Germany, and Switzerland turned to the task and a rainbow of artificial colors came forth—fuchsia (fuchsine), magenta (after the blood shed in the battle of that name), a range of purples, the whole alizarin family of reds, pinks, oranges, and yellows, and a green that caused a sensation because it did not turn blue in gaslight.* These colors in turn stimulated demand for fashionable fabrics and weaned the women of the rich countries of Europe from their traditional economical and lugubrious black. (Today, richer still, many of them have gone back to black, even at weddings.) More important in the long run, however, was the ramification of the new techniques to wider chemical developments: new illuminants, pharmaceuticals (aspirin, salvarsan, sundry barbiturates, novocain, and dozens more), photographic materials, artificial fertilizers, and, down the line, plastics—all of these with the usual share of unexpected and accidental finds.
Thanks to Perkin, Britain led the new industry. Britain had everything going for it. To begin with, it had a huge, traditionally based heavy-chemical industry producing alkalis, acids, and salt. Then, it had all the ingredients for a carbon-based manufacture: no country produced more of the raw material coal tar; nowhere was it cheaper. Finally, it offered the world’s biggest market for textile dyes. Yet within a generation the industry left British shores to settle in Germany and to a lesser extent in France and Switzerland. By 1881, Germany was making about half of the world’s artificial dyestuffs; by 1900, between 80 and 90 percent. The major German producers were able to pay dividends of over 20 percent, year in and year out through good times and bad, while investing large sums in plant, research, and equipment. This was one of the biggest, most rapid industrial shifts in history.19
Why? Why this apparent violation of the “laws” of comparative advantage and path dependency? Because apart from Perkin and a few other sports, Britain did not have the trained and gifted chemists needed to generate invention. Certainly not so many and so well trained as could be found on the Continent. So when A. W. Hofmann, Heinrich Caro, and their German colleagues in Britain were drawn back home by attractive offers, the British organic chemical industry shriveled. In Germany, by contrast, big corporations arose and flourished: Hoechst, BASF (Badische Anilin- und Soda-Fabrik), Bayer, Agfa, built around top-flight chemists and chemical engineers, equipped with well-fitted house laboratories, and closely tied to the universities.