Midnight Ride, Industrial Dawn


by Robert Martello


  Iron from Antiquity to America

  Revere’s shift from silver to ironworking mirrors the technological progression followed by the metalworkers of antiquity. Many early societies first worked with precious metals such as silver and gold because these metals were attractive to the eye and could occasionally be found in a pure state in large surface veins or riverbeds. Gold and silver have a high degree of malleability and ductility, and workers could easily shape them with simple physical processes such as hammering, requiring only a straightforward set of tools. Precious metals usually served ornamental purposes and had a high degree of symbolic value due to their rarity and beauty. Iron, one of the most abundant metals on Earth, contrasted with these precious metals in nearly all aspects, lending its name to the vibrant “iron age” period of prehistory.

  Many metals such as iron occur in either a “native” state, where the metal is relatively uncontaminated by other elements, or as an ore combining the metal and other elements within a quantity of rock. Metalworkers in the earliest human societies of the Near East such as Sumeria lacked the ability to convert iron ore into usable metal, because iron melts at the extremely high temperature of approximately 1535 degrees C, well beyond the reach of early furnaces. Early craftsmen did create a small number of iron implements from native iron sources, which usually came from iron-rich meteors. In fact, the Sumerian word for iron meant “heaven-metal,” while Egyptians used the term “black copper from heaven.” Archaeologists and scientists can identify meteoric iron with some degree of certainty because it is the only form of iron to contain trace quantities of nickel. This nickel also helped it to resist rust, explaining why some of these ancient iron objects still exist today.2

  Widespread human use of iron had to wait until technology provided the means of separating iron from rock pieces and unusable elements chemically bonded to the iron, a process called smelting. Current consensus indicates that the first iron smelting took place in the Near East, most likely in northern Anatolia or near the shores of the Black Sea, around the end of the Bronze Age between the fifteenth and twelfth centuries BC. As with many major technological advances, iron smelting probably began as an accident. Copper melts at a lower temperature than iron, and coppersmiths unintentionally smelted small quantities of iron in their ovens around the early second millennium BC.3 Enterprising metalworkers fashioned these byproducts into valuable iron jewelry, and in the nineteenth century BC one ounce of iron was priced the same as forty ounces of silver. The high melting temperature of iron prohibited the larger-scale routine manufacture of iron, but resourceful ironworkers eventually overcame this limitation and even found ways to harden at least the outer skin of worked iron to make it stronger and harder than bronze. The Hittites are often credited with these ironworking breakthroughs, and the dissemination of iron technology accelerated between 1200 and 1000 BC, at which time iron objects became widely used in the Near East and Eastern Mediterranean regions. And thus began the age of iron.4

  Over the next two thousand years, a series of ironworkers and governments seeking military or economic advancement transferred the technology of ironworking across a continent and an ocean, and finally into the eager hands of Paul Revere.5 Greece introduced ironmaking to Europe, and iron weapons and implements played a central role throughout Greek history, appearing throughout Greek literature and mythology as well. The Greeks passed ironmaking technology on to the Romans, who spread it throughout Europe and even into England. English iron production began a period of rapid expansion in the fifteenth century with the importation of blast furnaces and foreign ironworkers, soon giving England a position of worldwide technological dominance. Deforestation caused the price of charcoal to rise faster than the price of iron in the sixteenth and seventeenth centuries, leading to iron shortages. By the seventeenth century, new ironworks in heavily forested regions such as Scandinavia and Russia, along with larger and more efficient furnaces, stabilized the price of iron. America, the next forested region to receive the transfer of ironmaking technology, nourished Britain’s hopes for continued resource sustainability.6

  The ironworking technology landing on America’s shores in the seventeenth century fell into two categories that employed different processes for converting iron ore into usable metal. In the direct, or “blooming,” process, workers smelted iron ore on a small scale with simple equipment in facilities called bloomeries. A single skilled worker placed a small quantity of iron ore over a stone hearth and heated it with a charcoal fire, fanned vigorously with bellows to reach higher temperatures. The charcoal fire separated the metallic iron from its waste products in two ways: some impurities chemically combined with carbon monoxide from the fire and escaped as gases, while others physically solidified into slag. Most of the slag melted and drained away from the usable metal, and the remaining slag became hard, brittle lumps throughout the “bloom,” a spongy mass of iron. The ironworker hammered the bloom to shatter and separate the brittle slag, leaving only the iron. This bloomery process directly produced bar iron (also known as wrought iron), a fairly pure substance consisting primarily of metallic iron with a small percentage of slag particles. Bar iron was tough and malleable, easily worked into different shapes and by far the most versatile type of iron in use during the colonial period. Iron workers could fuse two pieces of bar iron by heating them to a white-hot temperature and hammering them together, and craftsmen further refined bar iron goods by chiseling and filing them after they cooled.7 Early ironworkers understood the advantages of bar iron over most other metals, but attempted to discover an alternate production method that would allow them to produce it faster, cheaper, and more efficiently. The solution to this dilemma involved the use of a technological and managerial innovation known as the blast furnace.
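The two-way separation described above can be summarized with the standard (and much simplified) chemistry of charcoal smelting; these equations are a sketch of well-known reactions, not formulas given in the text, and the real process passes through intermediate iron oxides:

```latex
% Charcoal burns incompletely in the oxygen-starved hearth,
% producing carbon monoxide:
2\,\mathrm{C} + \mathrm{O_2} \;\rightarrow\; 2\,\mathrm{CO}

% The carbon monoxide then strips oxygen from the iron ore,
% escaping as gas and leaving metallic iron behind:
\mathrm{Fe_2O_3} + 3\,\mathrm{CO} \;\rightarrow\; 2\,\mathrm{Fe} + 3\,\mathrm{CO_2}
```

Because the bloomery fire never approached iron’s melting point, the reduced metal stayed solid, which is why the product was the spongy, slag-riddled “bloom” described above rather than a pool of liquid iron.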

  The blast furnace method of iron production was also known as the “indirect” process because it required two steps to produce bar iron. In the first step the blast furnace converted iron ore into an intermediate substance called pig or cast iron. Workers filled a gigantic stone chimney twenty or more feet high with alternating layers of iron ore, charcoal fuel, and a calcium-rich “flux” material such as limestone that chemically combined with the slag to help separate it from the iron. Once ignited, this mixture burned for days or weeks at a time, fueled by new inputs of ore, charcoal, flux, and a stream of air pumped via water-driven bellows. Dense molten iron pooled in a crucible at the bottom of the furnace and workers periodically poured it into sand molds, where it hardened into pig iron. Pig iron contained small amounts of carbon and silicon impurities that entered the iron from the charcoal and other substances in the stack. The presence of these and even less-desirable impurities determined the quality and final properties of the end product. Pig iron’s heat resistance and hardness made it useful for large or heavy objects such as cauldrons or fireplace backs, but its inevitable brittleness prevented its use in tools or items subject to impacts. Pig iron lacked malleability even when hot, and once it left its molten state it could only be reshaped by being re-melted, which took place in facilities called foundries under the supervision of specialized “founders” such as Paul Revere. Skilled founders could design intricate molds and produce detailed objects that most smiths had trouble matching.8
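The limestone flux’s role in the blast furnace can likewise be sketched with simplified textbook reactions (again an illustration drawn from general furnace chemistry, not from the text itself):

```latex
% Furnace heat decomposes the limestone flux into lime:
\mathrm{CaCO_3} \;\rightarrow\; \mathrm{CaO} + \mathrm{CO_2}

% The lime combines with silica impurities from the ore to form
% a molten calcium-silicate slag, which floats above the denser
% liquid iron and can be drawn off separately:
\mathrm{CaO} + \mathrm{SiO_2} \;\rightarrow\; \mathrm{CaSiO_3}
```

This is why the flux had to be calcium-rich: the lime scavenges the rocky impurities into a separate liquid layer instead of leaving them embedded in the iron, as happened in the bloomery.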

  Figure 4.1. Sketch of a blast furnace. From Edwin Tunis, Colonial Craftsmen and the Beginnings of American Industry (New York: Thomas Y. Crowell Company, 1965), p. 151. This cutaway sketch shows the components of a typical eighteenth-century American blast furnace. The waterwheel on the right drives air-pumping equipment (often including a bellows not shown here) that fans the fire inside the “bosh.” Workers load large quantities of charcoal, iron ore, and flux materials in alternating layers from the top of the furnace, keeping the furnace going for days or weeks at a time. Liquid iron sinks to the bottom and workers periodically release it through the forehearth into a sand pit, where it cools to form iron slabs or “pigs.” Paul Revere depended on the output of New England blast furnaces, remelting and casting the iron slabs into utilitarian forms such as fireplace backs or window weights.

  Blast furnaces represented only the first half of the indirect ironmaking process, because their pig iron was less versatile and functional than the bar iron made in bloomeries. In order to produce bar iron, blast furnaces often formed symbiotic links with other ironworking institutions called fineries. Skilled finery ironworkers (or “finers”) converted pig iron into bar iron by heating the cast iron in a fanned charcoal fire. The heat and draft melted the pig iron and oxidized the carbon and silicon impurities, which escaped as gases. The finer then hammered the cooling lump of metal to separate the purer bar iron from the impurities in the liquid slag.9

  Blast furnace operation depended upon managerial as well as technical skills. By the standards of the day all ironworks required large quantities of coordinated labor, but blast furnaces, the largest type of ironworks, required twelve or more workers to feed and tap the furnace all day and all night, every day, for most of the April through November working season. Larger furnaces produced sixteen to twenty tons of iron a week, or around eight hundred tons per year. Blast furnaces required intensive organization and discipline, since one missed step in the process could cause costly setbacks and injuries. As one of the earliest forms of large-scale business in America, blast furnaces fostered the separation of a managerial role from the work of the skilled labor pool. Skilled ironworkers needed intelligence, perceptiveness, and stamina in order to interpret subtle clues and gauge invisible chemical transformations for long shifts under trying conditions. Competent managers ideally possessed some of the same technical skills as well as the organizational and leadership abilities needed to coordinate the workers and deal with problems as they arose.10 The blast furnace’s amalgamation of technological and managerial challenges helped inaugurate a new world of manufacturing endeavors: the large technological system.

  In addition to physically constructing and running the blast furnace, an ironworks operator also had to establish mechanisms for continual imports of raw materials, a financial infrastructure ensuring regular payments to creditors and employees, legal or political support, and many other societal factors. Blast furnaces and other complex industrial operations, known as “large technological systems,” do not exist in a vacuum, and their technological components depend upon numerous connections with the larger social, political, and economic environment. Large technological systems collect physical artifacts, organizations, intellectual property, legislative support, and natural resources in order to solve problems with increased efficiency. Entrepreneurs and inventors take a leadership role in the early years of a system when it needs to grow rapidly, and managers take over when systems reach a mature state. These managers often attempt to reduce the potential for worker errors through deskilling, routinization, and bureaucracy, and growing systems incorporate external resources such as raw material sources in order to minimize problems such as price fluctuations or shortages.11

  A blast furnace owner could not carry on operations without considering local customs, market conditions, laws, tariffs, price of competing products, skilled laborer expectations, and many other factors. Similarly, blast furnaces impacted their social and cultural settings by affecting the local labor market, altering the price of goods, forming relationships with military purchasers, sponsoring different political figures, and so on. As blast furnaces grew more prominent many manufacturers received an early taste of the importance of integrating technological challenges with social, financial, and other concerns. In this manner the age of industry gained strength.

  Revere chose to enter the ironworking field as a founder, a sensible decision given his circumstances. Bloomeries only made sense in frontier areas close to iron ore sources and far from competing sources of iron products, certainly not the case in Boston. Blast furnaces operated on a massive scale well beyond Revere’s means and required considerable capital investments; a large, trained labor force; reliable inputs of iron ore, charcoal, and waterpower; and knowledge of complex chemical processes. A foundry depended upon a process Revere had already learned, the art of heating and casting metal into molds, and as a smaller endeavor it lent itself to artisan ownership and management. Revere might not have been in a position to start his own technological system in the late 1780s, but by joining the already established ironworking network he increased his income while teaching himself how to think beyond the scope of an artisan’s workshop.

  By the time Revere entered the foundry business, America had close to two centuries of ironworking experience under its belt. The story of large-scale American iron production began in Virginia in 1619, when the Southampton Adventurers assembled 4,000 pounds sterling and 80 workmen in Falling Creek, Virginia. This attempt to establish the first colonial ironworks came to a sudden, less than triumphant ending when Indians killed the workmen and destroyed the facilities prior to the start of production. John Winthrop Jr. led the colonists’ second ironworking attempt in 1646, when he constructed and oversaw the Saugus ironworks (also called Hammersmith) in Lynn, Massachusetts. Although it closed in 1668 due to disappointing profits and legal difficulties, Hammersmith collected enough expertise and labor over the twenty-two years of its operation to lay the groundwork for spin-off ironworks in other colonies. By the 1680s, this technology had spread to the mid-Atlantic colonies, and the first forges in Maryland and Pennsylvania appeared in the early eighteenth century. By the Revolution, ironworks could be found in every colony except Georgia. While all American ironworks benefited from abundant supplies of wood and waterpower, iron ore was far more abundant in the mid-Atlantic region, with Pennsylvania rapidly asserting its dominance over the field. New iron manufactories often started with a bloomery because of its low startup and operating costs, and later added a blast furnace to produce cast hollowware and pigs. After the blast furnace attained steady production, owners could easily convert the bloomery to a finery and possibly even add a slitting mill, a water-powered cutting device that sliced iron into long, thin strips that could be made into nails.12

  British investors initially funded and operated most colonial ironworks, following the mercantilist mindset. In other words, the mother country intended to process America’s abundant natural resources in English workshops in order to solve its employment and manufacturing shortfalls, and then ship finished products back to the colonies for profitable sales. The earliest colonial ironworks, sawmills, and other technical endeavors attempted to fulfill this plan by focusing upon the processing of raw materials. The earliest ironworks also produced negative social impacts in their local communities: one might imagine the consequences of large groups of male laborers congregating in environments characterized by hard work, liquor, and profanity. The resulting social disorder and societal diversity stood in contrast to early New England’s cohesive religious atmosphere.13

  Americans expanded their ironworking in the 1720s after Britain’s disintegrating relationship with Sweden led to restricted Swedish iron imports. Colonial iron helped make up this deficit, and skilled laborers from earlier ironworks provided vital expertise. Although many of these new ironworks sold their output to British merchants, others intended to produce for local markets whose rapidly increasing populations and desire for a higher standard of living fueled a voracious hunger for iron products. At this point the mercantile theory of colonial manufactures had worn thin, since colonial ironworks started producing larger quantities of finished products that competed with goods from the mother country. When British representatives eventually caught on to America’s growing iron sector, Parliament passed the Iron Act of 1750 in an attempt to force a return to mercantilist ideals. This legislation removed all import duties from American pig and bar iron in order to encourage the export of raw materials to English manufacturers, but forbade the construction of American slitting and rolling mills, plating and triphammer forges, and other advanced processing works. This law did not have much effect because colonists offered iron regulations the same response they gave to other unpopular British policies: contempt and skillful evasion. The blatant construction of new finishing facilities continued and even increased until the Revolutionary War, fed by an ever-growing population demanding iron goods. American ironworks in the 1770s employed approximately 8,000 men and produced 30,000 tons of pig and bar iron a year, approximately 15 percent of the world’s output, making the American colonies the world’s third-largest iron producer.14

  The Iron Act did have one impact: it succeeded in infuriating American ironworkers, many of whom became extremely active in the Revolutionary War. Prescient almanac publisher Nathaniel Ames affirmed America’s attachment to iron in 1758, writing, “This Metal, more useful than Gold or Silver, will employ millions of hands, not only to form the martial Sword and peaceful Share alternately; but an Infinity of Utensils improved in the exercise of Art, and Handicraft amongst Men.”15 The British government drastically underestimated the size of the colonial ironmaking industry and its potential value to the colonists in the Revolution. War between the colonies and Britain both helped and hindered the iron industry: wartime demands engendered large ordnance contracts from federal and state governments, but invading armies destroyed furnace equipment and created crippling manpower shortages. In 1785 visiting Swedish engineer Samuel Gustav Hermelin observed that the war destroyed some furnaces and left others idle, while also dispersing or killing a number of the nation’s already scarce supply of skilled laborers. Lord Sheffield, a spokesman for British commercial interests, concurred with Hermelin, reporting that Britain would face no American competition whatsoever on a list of articles including copper sheets and utensils, bar steel, and “iron and steel manufacture of every kind.”16

  American ironworkers developed a new business model called the “iron plantation” in the late eighteenth century, most common in the mid-Atlantic states. Even larger variants of the blast furnace operation, iron plantations occupied thousands of acres of land and integrated operations such as agricultural food production for workers, forest management to ensure sustainable fuel supplies, and diversified ironworks. Plantations heralded modern industrial organizations in many ways, such as the large-scale use of wage labor, collection of all aspects of production (e.g., blast furnace, forge, finery, slitting mill) in one facility, advanced transportation networks, and complicated market analysis and accounting systems. However, the plantations also retained old-fashioned elements such as communal on-site housing, a mix of agricultural and industrial work for the laborers, and paternal family-based ownership.17 In the same way that Revere and other artisans blended traditional practices with some of the hallmarks of the upcoming age of industrial capitalism, iron plantations also played a transitional proto-industrial role that placed them in, or between, two worlds.
