Consider the Fork
Having exhausted the measuring possibilities of the human hand, cooks turned to other familiar objects. Among these, the walnut stands out for its ubiquity. The “size of a walnut” has been used by cooks as far afield as Russia and Afghanistan; England, Italy, and France; and America. The comparison has been used at least since medieval times. It has been applied to carrots, to sugar, to parmesan fritters, to cookie dough, to fried nut paste, and above all, to butter. What made the walnut so prized as a unit of measurement?
Imagine you are holding a whole, unshelled walnut in the palm of your hand, and its value becomes clearer. Like a finger, the walnut was a familiar object; almost everyone would have known what one looked like. “The size of a walnut” was much more helpful than the other common shorthand, “nut-sized” (which always raises the question: what kind of nut?). Even now, most of us could estimate the size of a walnut pretty accurately, even if we only see them once a year at Christmas. Unlike apples or pears, which come in many shapes and sizes, walnuts are relatively uniform. It is true that there are freakishly small varieties of walnut, notably the French noix noisette, no bigger than a hazelnut. Usually, however, when we speak of walnuts we mean Juglans regia, which was cultivated in ancient Greece, having been imported there from Persia. By 400 AD, it had reached China. It was an important crop in medieval France, though it did not reach Britain until the fifteenth century. The great thing about the Persian walnut, apart from its rich oily taste and delicate brain-like kernels, is its constancy of size. Its fruits vary only between about 2.5 and 3.5 cm in diameter, a handy quantity. Picture a walnut on a spoon. Now imagine the walnut has transformed into butter. It’s a decent amount, isn’t it? A walnut is somewhat more than a smidgen and less than a dollop. Much the same as a knob.
In numerous recipes where butter is required, a walnut is indeed just right. In 1823, Mary Eaton used a piece of butter “the size of a walnut” to stew spinach. Mrs. Beeton in 1861 advised walnut-sized butter for broiling rump steaks. Fannie Farmer might protest: how do I know if my butter is really walnut-sized? But the more confident you feel in the kitchen, the less you worry about these things. “Butter the size of a walnut” reflects the agreeable fact that in most forms of cooking—baking partially excepted—a tad more or less of any ingredient is not critical.
A walnut was not always the size needed, however, and cooks developed an entire vocabulary of measurements based on other familiar objects. The analogies chosen depended on time and place. Peas were common, as was the nutmeg, which signified a quantity somewhere in the region of our teaspoon. Seventeenth-century cooks wrote of bullets and tennis balls. Coins were another useful reference point, from shillings and crowns in England to silver-dollar pancakes in America.
These measurements through analogy feel like a window onto the domestic life of the past. They reveal a shared world of imagination, in which nutmegs, bullets, coins, and tennis balls are all forms of currency. These quantities may not be “scientific”; but they reflect great consideration on the part of recipe writers, attempting to translate their dishes into terms others would understand. Elena Molokhovets was a skilled Russian cook of the late nineteenth century. Her recipes are thick with these comparative measurements. When she rolled pastry, it might be to the thickness of a finger or of two silver ruble coins. Molokhovets cut ginger the size of a thimble, and dough the size of a wild apple, and butter—what else?—the size of a walnut.
We still rely on a shared imagery of measurement. When we “dice” vegetables, we are harking back to older cooks like Robert May who cut beef marrow into “great dice” and dates into “small dice.” When celebrity chef Jamie Oliver tells us how big to form the chopped meat for a homemade hamburger, he may not evoke walnuts or wild apples. But he does tell us to make it as big as a smallish cricket ball.
Quantity is just the beginning. The two hardest things to quantify in the kitchen are timing and heat. “Hold out your left hand,” says the Canadian chef John Cadieux, in a voice that tells me he is used to being obeyed. We are sitting at a dimly lit table at Goodman’s City, a steak restaurant in London near the Bank of England. Cadieux is the executive chef. We are talking steak. “Now take your right forefinger.” He shows me how to use this finger to touch the palm of my left hand, the fleshy part at the base of the thumb. “That’s what rare steak feels like,” says Cadieux. My finger sinks into my own squishy flesh: it feels just like raw meat, which doesn’t bounce back. “Next, bring your left forefinger and thumb together and continue to feel the base of the thumb with your right finger—that’s medium rare. Add your middle finger—medium. Ring finger—medium-well done. Finally, the little finger—that’s well done.” I am genuinely amazed at how the flesh behind my thumb tightens with each additional finger, like steak cooking in a pan. Cadieux, a shaven-headed thirty-something who has been working in high-end steakhouses for more than seven years now, sits back and grins. “It’s an old chef’s trick,” he says.
The restaurant has state-of-the-art charcoal ovens (two of them, $20,000 apiece), numerous digital timers lined up to cope with the endless stream of different steak orders, and the best meat thermometers money can buy. Cadieux insists on training his chefs for a minimum of two weeks (on top of whatever training they already have) before he lets them cook a steak. They must memorize the exact temperatures required for every cut and every order. Cadieux applies different standards to himself, however. “I don’t like thermometers,” he says. “I’m a romantic.” He has cooked so many thousands of steaks, he can tell instantly by the look and touch of a steak whether it is done to order.
Which is all very well until Cadieux needs to translate his own superior knowledge for his apprentices. At that point, he gets over his dislike of thermometers. Even if he himself has no need of measuring devices, he gets his sous-chefs to use them as crutches until they develop the instinctive knowledge of a master. For the medieval master chef, this question of passing on culinary skills was much harder. He would have had the same practical knowledge of cooking as Cadieux but none of the digital probes and timers to fall back on. How did you know when a dish was done? You just knew. But this wouldn’t help you to explain the principles to someone who didn’t “just know.” For this, you needed a range of ciphers to act as translations. Luckily, the medieval master had far longer than Cadieux’s two weeks to convey the finer points of measuring to his apprentices, most of whom started work as children, watching and absorbing the secrets of timing over many years.
Cooks have always needed to measure time, one way or another. The kitchen clock, quietly ticking on the wall, is one of the least recognized but most vital pieces of technology. No one seems to know when it first got there, though it had certainly arrived by the eighteenth century. We can tell that kitchen timepieces were not the norm in medieval and early modern times from the number of recipes giving timings not in minutes but in prayers. A French medieval recipe for preserved walnuts calls for boiling them for the time it takes to say a Miserere (“Oh wash me, more and more, from my guilt . . . ”), about two minutes. The shortest measurement of time was the “Ave Maria,” twenty seconds, give or take. You might say that such recipes reflect the fact that medieval France was a society in which religion permeated everything. Yet this timing-by-prayer had a very practical underpinning in an age when clocks were rare and expensive. Like the walnut-sized butter, these timings depended on communal knowledge. Because prayers were said out loud in church, everyone knew the common pace at which they were chanted. If you asked someone to “boil and stir the sauce for the time it takes to say three Pater Nosters” or “simmer the broth for three Lord’s Prayers,” they knew what this meant. And so far from being otherworldly, it was more sensible advice than some of the more secular examples in written recipes, such as “Let the solid particles precipitate from the mixture for the time a person would take to walk two leagues.” The use of prayers as timing devices belonged to the long centuries when cooks had to use deep ingenuity and vigilance to ensure that a meal came out right: cooked, but not burned.
If time was measured through prayer, heat was measured through pain. To test the heat of an oven, you reached inside. This is how bakers still proceed in many parts of rural Europe. You would put a hand in the oven and gauge from the level of pain whether the oven was ready for baking loaves—which require the fiercest heat.
One step up from this was the paper test. This was much used by confectioners in the nineteenth century. The point here was not to gauge the fiercest heat as you stoked a fire up, but the subtle gradations of gentler warmth as the oven cooled down, suitable for baking cakes and pastries, whose high butter and sugar content made them much more liable to burn than bread. Each temperature was defined by the color a sample of white kitchen paper turned when put on the oven floor. First, you had to put a piece of kitchen paper inside the oven and shut the door. If it caught fire, the oven was too hot. After ten minutes, you introduced another piece of paper. If it became charred without burning, it was still too hot. After ten minutes more, a third piece went in. If it turned dark brown, without catching fire, it was right for glazing small pastries at a high heat: this was called “dark brown paper heat.” Jules Gouffé, chef at the Paris Jockey-Club, explained the other types of heat and their uses. A few degrees below dark brown paper heat was “light brown paper heat, suitable for baking vol-au-vents, hot pie crusts, timbale crusts etc.” Next came “dark yellow heat,” a moderate temperature, good for larger pastries. Finally, there was the gentle heat of “light yellow paper heat,” which, said Gouffé, was “proper for baking manqués, génoises, meringues.” A variation was the flour test, which was the same but with a handful of flour thrown on the oven floor: you were meant to count off forty seconds; if the flour slowly browned, the oven was right for bread.
All of this to-ing and fro-ing was done away with at a stroke once an integrated oven thermostat became commonplace in the twentieth century. The thermostat is one of those examples of a technology that feels as though it should have entered the kitchen much sooner than it did. Various thermometers were developed by scientists—including Galileo—as early as the sixteenth century, mostly for measuring air temperature. In 1724, Fahrenheit produced his temperature scale; and in 1742, Celsius produced his competing scale (running from the freezing point to the boiling point of water). The kitchen is a place where plenty of water boils and ice melts, yet for hundreds of years no one thought to bring a thermometer to bear on the question of what temperature to use to bake a cake. By the 1870s, people routinely spoke of the weather in relation to thermometers—in 1876, English cricketers played on a “blazing July day” when “the thermometer in the sun stands at about 110 degrees,” wrote the New York Times—yet when they stepped into the kitchen, they were still happy with “dark yellow heat” and “light yellow heat.”
Finally, around the turn of the twentieth century, cooks started to see that thermometers might be rather useful after all. An American oven called the “new White House” advertised itself with an oven thermometer, included “in order to keep . . . strictly up to the minute.” The first fully integrated gas oven with a thermostat was marketed in 1915, and by the 1920s, electric stoves fitted with electromechanical thermostats were being produced. But the easiest thing, for those who already had a stove, was to buy a stand-alone oven thermometer and have it fitted to the existing oven.
One of the first cookbooks written after these newfangled oven thermometers came on the market was Mrs. Rorer’s New Cook Book in 1902 by Sarah Tyson Rorer. Mrs. Rorer was the principal of the Philadelphia Cooking School, a woman with twenty years’ experience of teaching cooking. She greeted this new device with nothing less than ecstasy. Thermometers, she wrote, cost only $2.50 and “relieve one of all anxiety and guesswork.” A note of early-adopter pity creeps into her voice when she writes of those who are “without a thermometer” and must “guess at the heat of the oven (a most unsatisfactory way).” Rorer gave all her recipes in Fahrenheit (though she also gave a system for Celsius conversion). She clearly loved her new toy and the precision it appeared to guarantee. She thrust her thermometer into freshly baked bread and boiled meat (“plunge a thermometer into the center of the meat, and to your surprise it will not register over 170° Fahr.”). Rorer liked deep-fried oysters, an authentic old Philadelphia dish. Now she could abandon the device of putting a cube of bread to sizzle in the hot deep-frying fat and waiting to see how quickly it browned. A thermometer told her at once if the fat was hot enough. Above all, Rorer used it for measuring oven heat. With the new ability to have a thermometer installed in any kind of “modern range,” whether gas-fired, coal, or wood, the cook was freed from the need to stand and watch and make “unsatisfactory attempts to ascertain the true heat of the oven.” All the old worry was gone, because the responsibility for guessing what was meant by “a moderate, moderately cool or quick oven” was lifted from the harried cook’s shoulders.
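As a purely illustrative aside (this is not Rorer’s own conversion system, which is not reproduced here), the arithmetic behind any such table is the standard relation between the two scales: Celsius equals (Fahrenheit minus 32) times 5/9. A minimal sketch in Python, using the 170° Fahr. figure she quotes for boiled meat:

    def fahrenheit_to_celsius(f):
        # Standard conversion: subtract 32, then scale by 5/9
        return (f - 32) * 5 / 9

    def celsius_to_fahrenheit(c):
        # The inverse: scale by 9/5, then add 32
        return c * 9 / 5 + 32

    # Rorer's 170° Fahr. for boiled meat works out to roughly 77°C
    print(round(fahrenheit_to_celsius(170)))  # 77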
With the new thermometer, all that fussing was a thing of the past.
A potato will bake in three-quarters of an hour at a temperature of 300° Fahr.; it will harden on the outside and almost burn at a temperature of 400° in twenty minutes, and if the oven is only 220° it will take an hour and a quarter to an hour and a half.
The worry was gone because, as with Fannie Farmer’s cup measures, any need for individual judgment was also gone. There was no squinting at a bit of paper and wondering if it was closer to yellow or brown. You only had to follow the system and you would be fine. At least by some people’s standards.
When Nathan Myhrvold investigated ovens, he found that the thermostat in “nearly all traditional ovens is just plain wrong.” The margin of error on the average thermostat is so high that it gives a false sense of security: the temperature dial in which we mistakenly place our faith is not a true reflection of what is going on inside the oven. Myhrvold calls the oven thermostat an “underwhelming” technology.
For one thing, thermostats only measure dry heat and take no account of humidity. We know that moisture content in the oven hugely affects the way something cooks: whether it roasts or steams or bakes, and how fast. Yet cooks—until Myhrvold—have hardly ever thought to measure humidity; the technical term is wet-bulb temperature. An oven thermostat cannot measure how a leg of lamb’s cooking time is affected by a glass of wine sloshed in the pan, or how a bread’s crust is softened by a splash of water thrown on a scorching oven floor.
That’s the first problem. A greater drawback is that most domestic thermostats don’t even measure dry heat very accurately. A thermostat makes its reading based on a sensor probe filled with fluid—similar to the old mercury-filled glass thermometer used by doctors in days gone by. The location of the probe can skew our impression of how hot an oven is. The probes that Myhrvold most dislikes are located “behind the oven walls,” where the temperature may be much lower than in the main body of the oven; it is better to have the sensor protruding into the oven cavity, though even this is not perfect, because the further the food is from the sensor, the less accurate the reading is likely to be. Myhrvold found that thermostats mismeasure the dry-bulb temperature in a domestic oven by “as much as 14°C/25°F,” which could make the difference between success and failure in a recipe. Every oven has its own hot spots. The answer is to calibrate your own oven: place an oven thermometer in different spots in the oven as it heats, then write down the true measurements and proceed accordingly.
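To make that bookkeeping concrete, here is a minimal sketch of the calibration in Python. The spot names and thermometer readings are entirely made up for illustration, not Myhrvold’s data:

    # Hypothetical example: compare the dial setting with readings from a
    # stand-alone oven thermometer placed at different spots in the cavity.
    dial_setting_c = 180  # what the oven dial claims, in Celsius

    readings_c = {
        "center rack": 172,   # made-up reading
        "back left": 191,     # made-up reading
        "front right": 168,   # made-up reading
    }

    for spot, measured in readings_c.items():
        offset = measured - dial_setting_c
        print(f"{spot}: {measured}°C ({offset:+d}°C vs. the dial)")

    # A rough average correction to apply when setting this oven in future
    average_offset = sum(readings_c.values()) / len(readings_c) - dial_setting_c
    print(f"average offset: {average_offset:+.1f}°C")

Knowing the offset for each spot is what lets you “turn the dial up or down” with some confidence, as described below.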
In making the average home-cooked meal, temperamental ovens are a fact of life. Once you have learned that your oven gets too hot or too cold, you can turn the dial up or down, like tuning a musical instrument. This kind of rough-and-ready adjustment would not be accepted, however, in the modernist restaurant cooking of the early twenty-first century, which sets great store by fantastically precise and accurate measuring devices. Chefs who cook in the style of Ferran Adria of the now closed El Bulli in Spain need to be able to measure both very large amounts (up to 4 kilos) and very small amounts (to within 0.01 g) with the same degree of accuracy, over and over again. Most kitchen scales, even digital ones, fall far short of these exacting standards. The solution is to have not one but two sets of scales, both of laboratory standard, one for large quantities and one for small.
Weight and temperature are by no means the only things measured in the modernist kitchen. These high-tech cooks are like explorers mapping new culinary lands. They wish to quantify everything, from the heat of chilies (measurable on the Scoville scale) to the chill in the ultra-low-temperature freezers they favor. If they wish to test how tart a fruit puree tastes, they do not use their tongue, but whip out an electronic pH meter that can give an instant and exact reading of how acidic or alkaline any fluid is. To gauge the sugar content in a sorbet mix, they use a refractometer, a tool that responds to the way light bends as it passes through a given material. The light bends more or less depending on the density of the liquid, which in turn tells you how sweet a syrup is (sweeter is denser). This is a technological step up from the old saccharometers used by brewers and ice-cream makers from the eighteenth century onward, consisting of a calibrated glass bulb that measured sugar content through the principle of buoyancy (the higher the bulb floated, the sweeter the substance). Before that, mead makers would drop an unshelled egg into the honey-sweet liquid; if it floated, it was sweet enough.