
Consider the Fork


by Bee Wilson


  By the eighteenth century, polite Westerners sat at the dinner table delicately holding their pretty little knives, trying to avoid at all costs any gesture reminiscent of violence or menace. As a cutting technology, the table knife was now more or less redundant. By the late eighteenth century, the celebrated Sheffield table knife from the north of England, though still made of top-notch steel, had become less about cutting and more about display. In London society, these were beautiful objects, laid out on the table as marks of a host’s good taste and wealth. It would be easy to write off table knives as technologically obsolete in the modern era. The uselessness of table knives is shown by the appearance of sharp, serrated steak knives (pioneered in the southern French town of Laguiole), whose presence acts as a kind of rebuke to normal knives: what steak knives say is that when you actually need to cut something at the table, a table knife won’t do.

  The table knife was now an entirely separate object from the knife as weapon. There was no need to carry a knife with you; indeed, to do so could be considered poor form, in Britain at any rate. In 1769, an Italian man of letters, Joseph Baretti, was indicted for stabbing a man in self-defense in London, using a small folding fruit knife. Baretti’s defense was that it was common practice on the Continent to carry a sharp knife for cutting apples, pears, and sweetmeats. The fact that he had to explain this in such detail to a British court was a sign of how the nature of knives had changed in Britain by 1769. Sharpness was no longer seen as necessary or even desirable in a table knife. In this, Britain was leading the way.

  There is more to table knives than sharpness, however. There is also the question of how they make food more pleasurable to eat—or not. From this perspective, for most people, table knives really only came into their own in the twentieth century, with the advent of stainless steel.

  I said earlier that the carbon steel favored by Sheffield cutlers was a far better metal for forging blades than previous alternatives. What I didn’t mention was this: the downside of carbon steel is that, like iron, it can make certain foods taste disgusting. Anything acidic has a potentially disastrous effect on non-stainless steel. “Upon the slightest touch of vinegar,” wrote the great American etiquette expert Emily Post, steel-bladed knives turned “black as ink.” Vinaigrette and steel knives were a particularly bad combination, hence the French prejudice, which persists to this day, against cutting salad leaves with a knife.

  Another problem was fish. For centuries, people have found lemon to be the ideal accompaniment to fish. But until the 1920s, and the invention of stainless steel, the taste of lemony fish was liable to be ruined by the tang of blade metal from the knife. The acid in the lemon reacted with the steel, leaving a foul metallic aftertaste that entirely overpowered the delicate flesh of the fish. This explains the production of fish knives made of silver in the nineteenth century. Nowadays, these seem a pointless affectation. In fact, fish knives were a mainly practical invention, albeit one that only the rich could afford. Unlike normal steel knives, silver knives were noncorrosive and did not react with the lemon juice on the plate. The signature scalloped shape was primarily a way to distinguish them in the cutlery drawer (as well as signaling the fact that fish, unlike meat, was not tough and did not need to be sawed at). If you had no silver fish knives, the only other option was to eat fish with two forks, or a single fork and a piece of bread, or suffer the taste of corroded steel.

  So, the launch of stainless steel in the twentieth century ranks as one of the greatest additions to happiness at the table. Once it entered cheap mass production after World War II, it placed stylish, shiny cutlery within the reach of most budgets and removed at a stroke all those fears about knives making food taste funny. Never again would you have to worry when you squirted a lemon over a piece of cod or feel that you mustn’t use a knife to cut dressed salad.

  Stainless steel (otherwise known as inox steel or nonrusting steel) is a metal alloy with a high chromium content. The chromium in the metal forms an invisible layer of chromium oxide when exposed to the air, which is what enables stainless steel to remain resistant to corrosion and also splendidly lustrous. It was only in the early years of the twentieth century that a successful stainless steel—strong and tensile enough as well as corrosion resistant—was made. In 1908, Friedrich Krupp built a 366-ton yacht—Germania—with a chrome-steel hull. Meanwhile, in Sheffield, Harry Brearley of Thomas Firth and Sons had discovered a stainless steel alloy while trying to find a corrosion-resistant metal for making gun barrels. Noncorrosive cutlery was a happy by-product of the search for military advantage between Britain and Germany on the road to total war. At first, the new metal was hard to work in all but the simplest cutlery patterns; it took the industrial innovations of World War II for stainless steel knives to become something that could be worked efficiently and cheaply in the shapes people desired. Stainless steel was another step in domesticating the knife, in rendering it cheaper, more accessible, and less threatening than the knives our ancestors carried around on their person.

  The Western table knife now seems an altogether harmless object (though table knives were still thought menacing enough to be banned from planes in the wake of 9/11). Our preference for these blunt implements over the past two hundred years has had powerful unseen consequences, however.

  Knives do not just leave their mark on food. They leave it on the human body. Every chef has scars to show, and often does so proudly, giving you the story behind each wound. Hack marks on a thumb from paring vegetables; the missing chunk of finger from an unfortunate encounter with a turbot. My finger still bulges tenderly where the mandoline sliced it. Then there are the blisters and calluses that chefs acquire, which appear without any accidents or mistakes, just through the action of good knife work. Blisters and gashes are the most obvious legacy of the kitchen knife, but the marks the knife has left on our bodies go further still. The basic technology of cutting food at the table has shaped our very physiology, and above all, our teeth.

  Much of the science of modern orthodontics is devoted to creating—through rubber bands, wires, and braces—the perfect “overbite.” An overbite refers to the way our top layer of incisors hangs over the bottom layer, like a lid on a box. This is the ideal human occlusion. The opposite of an overbite is the “edge-to-edge” bite seen in primates such as chimpanzees, where the top incisors clash against the bottom ones, like a guillotine blade.

  What the orthodontists don’t tell you is that the overbite is a very recent aspect of human anatomy and probably results from the way we use our table knives. Based on surviving skeletons, this has only been the “normal” alignment of the human jaw for 200 to 250 years in the Western world. Before that, most human beings had an edge-to-edge bite, comparable to apes. The overbite is not a product of evolution—the time frame is far too short. Rather, it seems likely to be a response to the way we cut our food during our formative years. The person who worked this out is Professor Charles Loring Brace (born 1930), a remarkable American anthropologist whose main intellectual passion was Neanderthal man. Over decades, Brace built up the world’s largest database on the evolution of hominid teeth. He possibly held more ancient human jaws in his hand than anyone else in the twentieth century.

  As early as the 1960s, Brace had been aware that the overbite needed explaining. Initially, he assumed that it went back to the “adoption of agriculture six or seven thousand years ago.” Intuitively, it would make sense if the overbite corresponded to the adoption of grain, because cereal potentially requires a lot less chewing than the gristly meat and fibrous tubers and roots of earlier times. But as his tooth database grew, Brace found that the edge-to-edge bite persisted much longer than anyone had previously assumed. In Western Europe, Brace found, the change to the overbite occurred only in the late eighteenth century, starting with “high status individuals.”

  Why? There was no drastic alteration in the nutritional components of a high-status diet at this time. The rich continued to eat large amounts of protein-rich meat and
fish, copious pastries, modest quantities of vegetables, and about the same amount of bread as the poor. Admittedly, the rich in 1800 would expect their meat to come with different seasonings and sauces than in 1500: fewer currants, spices, and sugar, but more butter, herbs, and lemon. Cooking styles certainly evolved. But most of these changes in cuisine long predated the emergence of the overbite. The fresher, lighter nouvelle cuisine that appeared on tables across Europe during the Renaissance goes back at least as far as 1651, with the French cookbook by La Varenne called Le Cuisinier françois; arguably, it goes back still further, to the Italian chef Maestro Martino in the 1460s, whose recipes included herb frittata, venison pie, parmesan custard, and fried sole with orange juice and parsley, all things that would not have looked out of place at wealthy dinners three hundred years later. At the time that aristocratic teeth started to change, the substance of a high-class diet had not radically altered in several hundred years.

  What changed most substantially by the late eighteenth century was not what was eaten but how it was eaten. This marked the time when it became normal in upper- and middle-class circles to eat with a table knife and fork, cutting food into little pieces before it was eaten. This might seem a question of custom rather than of technological change, and to some extent it was. After all, the mechanics of the knife itself were hardly new. Over millennia, people have devised countless artificial cutting implements to make our food easier for our teeth to manage. We have hacked, sawed, carved, minced, tenderized, diced, julienned. The Stone Age mastery of cutting tools seems to have been one of the factors leading to the smaller jaws and teeth of modern man, as compared with our hominid ancestors. But it was only 200 to 250 years ago, with the adoption of the knife and fork at the dining table, that the overbite emerged.

  Brace surmises that in premodern times the main method of eating would have been something he has christened “stuff-and-cut.” As the name suggests, it is not the most elegant way to dine. It goes something like this. First, grasp the food in one of your hands. Then clamp the end of it forcefully between your teeth. Finally, separate the main hunk of food from the piece in your mouth, either with a decisive tug of your hand or by using a cutting implement if you have one at hand, in which case you must be careful not to slice your own lips. This was how our ancestors, armed only with a sharpened flint, or, later, a knife, dealt with chewy food, especially meat. The “stuff-and-cut” school of etiquette continued long after ancient times. Knives changed—from iron to steel, from wood-handled to porcelain-handled—but the method remained.

  The growing adoption of knife-and-fork eating in the late eighteenth century marked the demise of “stuff-and-cut” in the West. We will return to the fork (and the chopstick and the spoon) in Chapter 6. For the moment, all we need to consider is this. From medieval to modern times, the fork went from being a weird thing, a pretentious object of ridicule, to being an indispensable part of civilized dining. Instead of stuffing and cutting, people now ate food by pinning it down with the fork and sawing off little pieces with the table knife, popping pieces into the mouth so small that they hardly required chewing. As knives became blunter, so the morsels generally needed to be softer, reducing the need to chew still further.

  Brace’s data suggest that this revolution in table manners had an immediate impact on teeth. He has argued that the incisors—from the Latin incidere, “to cut”—are misnamed. Their real purpose is not to cut but to clamp food in the mouth—as in the “stuff-and-cut” method of eating. “It is my suspicion,” he wrote, “that if the incisors are used in such a manner several times a day from the time that they first begin to erupt, they will become positioned so that they normally occlude edge to edge.” Once people start cutting their food up very small using a knife and fork, and popping the morsels into their mouths, the clamping function of the incisors ceases, and the incisors continue to erupt until the top layer no longer meets the bottom layer: creating an overbite.

  We generally think that our bodies are fundamental and unchanging, whereas such things as table manners are superficial: we might change our manners from time to time, but we can’t be changed by them. Brace turned this on its head. Our supposedly normal and natural overbite—this seemingly basic aspect of modern human anatomy—is actually a product of how we behave at the table.

  How can we be sure, as Brace is, that it was cutlery that brought about this change in our teeth? The short answer is that we can’t. Brace’s discovery raises as many questions as it answers. Modes of eating were far more varied than his theory makes room for. Stuff-and-cut was not the only way people ate in preindustrial Europe, and not all food required the incisor’s clamp; people also supped soups and potages, nibbled on crumbly pies, spooned up porridge and polenta. Why did these soft foods not change our bite much sooner? Brace’s love of Neanderthals may have blinded him to the extent to which table manners, even before the knife and fork, frowned upon gluttonous stuffing. Posidonius, a Greek historian (born c. 135 BC), complained that the Celts were so rude, they “clutch whole joints and bite,” suggesting that polite Greeks did not. Moreover, just because the overbite occurs at the same time as the knife and fork does not mean that one was caused by the other. Correlation is not cause.

  Yet Brace’s hypothesis does seem the best fit with the available data. When he wrote his original 1977 article on the overbite, Brace himself was forced to admit that the evidence he had so far marshaled was “unsystematic and anecdotal.” He would spend the next three decades hunting out more samples to improve the evidence base.

  For years, Brace was tantalized by the thought that if his thesis was correct, Americans should have retained the edge-to-edge bite for longer than Europeans, because it took several decades longer for knife-and-fork eating to become accepted in America. After years of fruitless searching for dental samples, Brace managed to excavate an unmarked nineteenth-century cemetery in Rochester, New York, housing bodies from the insane asylum, workhouse, and prison. To Brace’s great satisfaction, he found that out of fifteen bodies whose teeth and jaws were intact, ten—two-thirds of the sample—had an edge-to-edge bite.

  What about China, though? “Stuff-and-cut” is entirely alien to the Chinese way of eating: cutting with a tou and eating with chopsticks. The highly chopped style of Chinese food and the corresponding use of chopsticks had become commonplace by the time of the Song dynasty (960-1279 AD), around nine hundred years before the knife and fork were in normal use in Europe, starting with the aristocracy and gradually spreading to the rest of the population. If Brace was correct, then the combination of tou and chopsticks should have left its mark on Chinese teeth much earlier than the European table knife.

  The supporting evidence took a while to show up. On his eternal quest for more samples of teeth, Brace found himself in the Shanghai Natural History Museum. There, he saw the pickled remains of a graduate student from the Song dynasty era, exactly the time when chopsticks became the normal method of transporting food from plate to mouth.

  This fellow was an aristocratic young man, an official, who died, as the label explained, around the time he would have sat for the imperial examinations. Well, there he was, in a vat floating in a pickling fluid with his mouth wide open and looking positively revolting. But there it was: the deep overbite of the modern Chinese!

  Over subsequent years, Brace has analyzed many Chinese teeth and found that—with the exception of peasants, who often retained an edge-to-edge bite well into the twentieth century—the overbite does indeed emerge 800 to 1,000 years sooner in China than in Europe. The differing attitude to knives in East and West had a graphic impact on the alignment of our jaws.

  The knife as a technology goes beyond sharpness. The way a knife is used matters just as much as how well it slices. The tou that cut this Chinese aristocrat’s food a thousand years ago would not have been significantly sharper or stronger than the carving knives that were cutting the meat of his European counterparts at the time. The greatest difference was what was
done with it: cutting raw food into tiny fragments instead of carving cooked food into large pieces. The cause of this difference was cultural, founded on a convention about what implements to use at the table. Its consequences, however, were starkly physical. The tou had left its mark on the Chinese student’s teeth, and it was there for Brace to see.

  Mezzaluna

  With its stubby wooden handles and arching blade, the mezzaluna looks like an implement that should have fallen out of use several centuries ago. Some version of this curved mincing knife has been in kitchens at least since the Renaissance in Italy. Before the mezzaluna, Italian cooks employed many single-handled curved knives. There were also double-handled knives, but they were for scraping the table clean, not chopping. Finally, some enterprising palace smith must have thought to combine the sharp curved blade with the double handle, to create the perfect utensil for mincing. And still the mezzaluna endures, chopping herbs and lending its pretty name—“half-moon” in Italian—to numerous upscale restaurants.

  The mezzaluna’s staying power is a warning not to underestimate the power of romance in the kitchen. This is a thrilling object to use. It’s like taking your hands for a swing-boat ride in some ancient Italian city. Up-down, up-down. You look down and inhale the giddy aroma of parsley, lemon peel, and garlic—the gremolata you’ve made to sprinkle on an osso buco.

 
