
THAT WAS THE MILLENNIUM THAT WAS


by John Scalzi


  There are hundreds of types of cheese, from Abbaye de la Joie Notre Dame to Zamorano; the varied nature of cheese initially had less to do with anything humans were doing than with the fact that every place on the planet has its own sorts of bacteria, so milk goes bad in different ways in different places. Eventually people gained some sort of control over the cheese-making process and started intentionally making different kinds of cheese, although the high-volume commercial aspect of cheese making had to wait until 1851, when the first cheese factory was constructed in upstate New York. Wisconsin, cheese capital of the world, saw its first cheese factory open seventeen years later. It was a limburger cheese factory. There's no punchline there, it's the truth.

  Processed cheese, the cheese of the millennium, reared its bland orange head in 1911 in Switzerland. However, the cheese gods had already favored that land with its own sort of cheese, the one with all the holes in it, so it was left to the Americans to take the process and popularize it. And they did: James Kraft developed his cheese processing process in 1912, perfected it five years later, and unleashed the cheese food product on the world shortly thereafter.

  The process of processed cheese is the secret to its blandness -- the natural cheese ripening process is interrupted by heat (read: they fry the bacteria before it gets out of hand and gives the stuff actual taste), and what you get is a block of proto-cheese that has an indefinite shelf life. It's bland, but it lives forever: The Dick Clark of cheese.

  Within the realm of processed cheese, there are gradations, relative to the amount of actual cheese in the cheese; the higher the number of qualifiers, the less cheese it has. To begin there's processed cheese, which is 100% cheese, just not a very dignified kind (usually some humiliated form of interrupted cheddar, labeled "American" so the other cheddars won't beat it up and steal its lunch money). Then there's processed cheese food, which features cheese by-products as filler. This is followed by cheese food product, which includes some entirely non-dairy ingredients such as vegetable oils. Finally, of course, there's cheez, which may or may not feature plastics. The less said about that stuff, the better.

  I certainly wouldn't argue that processed cheese is the best cheese of the millennium in terms of taste, texture, quality or snob appeal (I may be glib, but I ain't stupid), but I will suggest that the utter ubiquity of processed cheese, American cheese, allows it to walk away with the title. Indeed, American cheese is to cheese as American culture is to culture: It's not necessarily better, it's just designed to travel, to be convenient to use, to be standard and unvaried and largely non-biodegradable no matter where you find it.

  We can even go so far as to say that American culture and American cheese will go hand in hand, right to the last. Thousands of years from now, after the inevitable apocalypse of some sort wipes out our civilization, future archeologists will scour the land to make some sense of our times, and I think the process will go something like this.

  Archeologist 1: Look, it's another temple of the ancestors' dominant faith. Note the golden arches.

  Archeologist 2: And look what I've found in the storage crypt!

  (pulls out a box of cheese slices)

  Archeologist 1: Ah, the communion squares. For their ritual obeisance to Ro-Nald, the demon destroyer of worlds. You can see his terrible visage, bedecking the illuminated windows from behind the tithing altar.

  Archeologist 2 (sniffing the cheese): These smell terrible. It must have been some sort of penance to ingest these.

  Archeologist 1 (glancing over): You know, these samples have maintained their unholy orange taint. They may still be potent.

  Archeologist 2: What are you saying?

  Archeologist 1: I'll give you 10 glars if you eat one.

  Archeologist 2: You're out of your freakin' mind.

  Archeologist 1: All right, 20.

  Archeologist 2: Okay.

  Best Curmudgeon of the Millennium.

  Ambrose Bierce, 19th-century newspaper columnist, wit, short story writer, misanthrope. Some people wake up on the wrong side of the bed. He woke up on the wrong side of the universe.

  Let us stipulate that it's not that difficult to be a curmudgeon. The word simply means a cranky, nasty person, usually a man, usually old. Further reflection leads to the conclusion that the definition is somewhat self-reflexive. Old men tend to be cranky and nasty because after a certain age, your prostate swells and you have to pee three times a night and you look like an ambulatory prune and you're going to die. Eventually, there is just no goddamn upside to putting on a happy face.

  In a way, I imagine it's liberating not to have to worry about being nice anymore. But I would bet most men would happily trade the freedom of curmudgeonliness for the freedom to have sex with an attractive 27-year-old (obviously, Hugh Hefner is excluded from this, though frankly, the thought of that man snorkeling around with three girls whose combined ages don't equal his own makes me want to infiltrate the Playboy mansion, steal his Viagra, and then laugh cruelly as he undergoes blue-tinted withdrawal symptoms).

  Be that as it may, one's curmudgeonosity doesn't count if its brunt is only felt by one's children, one's noisy neighbors, and one's burly, no-nonsense male nurse ("Oh, you better believe it's time for your enema. We can do this easy, or we can do this hard"). One must spread one's crankiness around, let it flow as if from an overburdened spigot. As it happens, the written word is a perfect medium for this sort of thing; as fun as it is to shout bitter invective randomly at people, it's even more fun to daintily handcraft a stream of pure venom, print it up and put it out on the street. If nothing else, people will save the clippings.

  Ambrose Bierce was an undisputed master of venom. He wrote with an animosity of such rare degree that in this day and age, any one of his newspaper columns would have been slapped with six different sorts of libel suits, presuming his editor somehow agreed to publish it in the first place (which any sane editor would not). Read, for example, Bierce's lashing of Oscar Wilde in an 1882 column:

  That sovereign of insufferables, Oscar Wilde has ensued with his opulence of twaddle and his penury of sense. He has mounted his hind legs and blown crass vapidities through the bowel of his neck, to the capital edification of circumjacent fools and foolesses, fooling with their foolers. He has tossed off the top of his head and uttered himself in copious overflows of ghastly bosh. The ineffable dunce has nothing to say and says it— says it with a liberal embellishment of bad delivery, embroidering it with reasonless vulgarities of attitude, gesture and attire. There never was an impostor so hateful, a blockhead so stupid, a crank so variously and offensively daft. Therefore is the she fool enamored of the feel of his tongue in her ear to tickle her understanding.

  And that's just the first paragraph!

  What was Bierce's problem? Oh, where to begin. First off, he had, of course, a terrible childhood. Then he fought in the Civil War and took part in several of the bloodiest battles of the entire war, including Shiloh, in which Union soldiers barely fought back a surprise Confederate attack (which probably would have succeeded, had not the Confederacy's commanding general been shot and mortally wounded early in the fighting. D'oh).

  During the war, Bierce got a severe head wound that in his words "cracked my head like a walnut." The combination of his own wounds with the general horrors of that fraternal war didn't do anything to brighten Bierce's outlook, and his post-war job as a treasury official in the Reconstruction South (the Treasury Department being heinously corrupt at the time) didn't help matters, either.

  Bierce had a few good years, when he married, raised some kids, and wrote for papers in San Francisco and in England. But it didn't last. Bierce accused his wife of infidelity, based on suspicious letters, and left her; eventually she filed for divorce on grounds of abandonment, but died before the divorce was finalized. Bierce's two sons both died young, one at 16, in a fight over a girl, and the other at age 23, from pneumonia related to alcoholism.

  Even Bierce's writing career had its taints: Bierce railed against the undue influence that California's railroad magnates had over politics and newspapers, but then he found the magazine he worked for, the Wasp, was in the back pocket of another industrial concern (a water company). He was compelled to resign. So you see, he had plenty to be cranky about.

  Bierce's nasty streak didn't win him any friends (and alienated what friends he did have, as he showed no reluctance to add them to his list of targets). But it sure sold newspapers and magazines, enough so that William Hearst hired Bierce to write a weekly column for the San Francisco Examiner for $100 a week, which was not a bad rate back then (and which would have some writers salivating even now, alas for all us poor writers). Bierce hated Hearst and was something of a prima donna, refusing to do interviews or any sort of legitimate reportage, sniffily explaining that his job did not include "detective work." But a hundred bucks is a hundred bucks, I suppose. Even a curmudgeon's gotta eat.

  It was largely from the Examiner columns that Bierce's most famous work is extracted: The Devil's Dictionary, a collection of witheringly sarcastic definitions that peppered Bierce's columns for two decades. Reading these bitter definitions, one is struck by how contemporary they are; they could have been written by those snide toads over at Suck.com last week as easily as by a Civil War vet 120 years ago (my favorite is actually a two-parter: "ACADEME, n. An ancient school where morality and philosophy were taught. ACADEMY, n. [from ACADEME] A modern school where football is taught.").

  This isn't necessarily a compliment to the current age, mind you. Today's bitter ironists are ironic for no good reason -- it's just kinda kooky and fun to have a nasty 'tude, or the appearance thereof. The trenchant wit of the spoiled middle class is not something one actually fears, you know (trust me). One imagines that the head-crushed, Shiloh-surviving, cuckolded, two-child-burying Bierce would have taken a look at today's smartasses sniggering into their lattes, and, having somehow successfully kept himself from strangling them with his own hands, would have decided to head out for points unknown rather than to deal with them any further.

  As it happens, something very much like that is exactly what happened to Bierce. He got tired of America and headed to Mexico in 1913, arriving just in time for Pancho Villa's revolution. Once he crossed the border, he was never heard from again. Responsible folks suggest he was killed in the siege of Ojinaga in January 1914; less responsible folks suggest he joined the revolutionaries as a goad. Indeed, there is a legend of a curmudgeonly old gringo in Villa's camp, who apparently gave the revolutionary leader no end of grief. It may not have been Bierce. But it would have been perfect if it was.

  Best Unit of Time of the Millennium.

  The second. Because the second is a lot newer than you might expect. How new? Try 1967. Yes, the definition of a second and Sgt. Pepper's, both in the same year. Kinda makes you think, doesn't it.

  Well, now, wait a second, you say (no pun intended). I'm pretty sure they had seconds before 1967. And it's true: They did. It's not as if, before 1967, everything was rounded up to the nearest minute. However, that second, and the second we have today, are entirely different things. And if you think a second is defined as 1/60th of a minute, boy, are you ever living in the past.

  Let's work this through. Most of us think of the second as 1/60th of a minute, which is itself 1/60th of an hour, which is 1/24th of an entire day (for those of you lost in the math, a second would thus be 1/86,400th of a day). The day, in this respect, is the ultimate unit of time, from which the others are derived.
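
  (A quick sketch of my own, in Python, for anyone who wants to check that division; the numbers are just the ones from the paragraph above.)

    # 60 seconds per minute, 60 minutes per hour, 24 hours per day
    seconds_per_day = 60 * 60 * 24
    print(seconds_per_day)       # 86400 -- so a second is 1/86,400th of a day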

  But -- get this -- there's more than one kind of day. There's the sidereal day -- the time it takes one point on the planet to fully rotate relative to the background of stars -- and then there's the solar day, which is the amount of time from one local noon to the next. The sidereal day is shorter than the solar day -- it takes 23 hours, 56 minutes and 4.09 seconds, in mean solar time. This is because the sun moves east, relative to the stars (this is why the constellations "move" in the sky as the year progresses).
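
  (Where does that 23 hours, 56 minutes and 4.09 seconds come from? Here's a back-of-the-envelope sketch of my own, assuming a year of roughly 365.24 solar days, during which the earth spins about 366.24 times against the stars -- the extra turn comes from going around the sun.)

    # Rough estimate of the sidereal day from the mean solar day.
    solar_day_s = 86400.0
    solar_days_per_year = 365.2422
    sidereal_day_s = solar_day_s * solar_days_per_year / (solar_days_per_year + 1)

    hours, rem = divmod(sidereal_day_s, 3600)
    minutes, seconds = divmod(rem, 60)
    print(int(hours), int(minutes), round(seconds, 2))   # 23 56 4.09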

  Sure, okay, but there's the solar day (the mean solar day, that is, as the actual solar day speeds up and slows down eentsy bits relative to the position of the earth in its orbit around the sun), and that one is 24 hours -- by which we mean that the hour is the amount of time it takes the earth to rotate 15 degrees. That seems like a pretty fixed amount of time. The earth doesn't suddenly speed up or slow down, right?

  Well, no. Not suddenly. But the earth's rotation is in fact slowing down. You've heard of the moon, no doubt. The moon's gravitational pull on the earth causes tides (so does the sun's, to a lesser extent). Those tides, combined with the fact that the earth's land masses keep the tidal bulge from remaining perfectly positioned under the moon, cause tidal friction. Tidal friction, in turn, is causing two things: it's pushing the moon farther away from the earth, and it's slowing the earth's rotation. Quite a bit over the course of time -- 2 billion years ago, Earth's day was only about 58% as long as today's. The moon was a hell of a lot closer, too.
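
  (Taking that 58% figure at face value, a little arithmetic of my own gives a sense of the pace; the averaging here is purely illustrative.)

    # Day length 2 billion years ago, if it really was 58% of today's 24 hours:
    hours_then = 0.58 * 24
    print(round(hours_then, 1))                    # ~13.9 hours per day

    # Average lengthening implied by that, in milliseconds per century:
    ms_gained = (24 - hours_then) * 3600 * 1000    # extra milliseconds per day, total
    centuries = 2_000_000_000 / 100
    print(round(ms_gained / centuries, 2))         # ~1.81 ms per century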

  (Science trivia: The first mathematical treatment of tidal friction was presented by astronomer George Darwin, son of, you guessed it, Charles. First they tell us we're related to monkeys, then they tell us the day is getting longer! Those Darwin boys are nothing but bad news.)

  At some point billions of years from now, earth's rotation will stop completely; the period of the day will match the period of the earth's revolution around the sun. At which point any idiot will be able to run a "four minute mile."

  Point is -- if you peg your hours, minutes, and seconds to the rotation of the earth, those units of time are inevitably lengthening. Not enough that you or I would notice over the astronomically puny lengths of our own little lives. But still.

  (This doesn't even take into consideration additional relativistic fluctuations of time based on the earth's own gravity well, which conspire to make a "second" at 36,000 feet infinitesimally different from a "second" at sea level. But, you know, don't even think about that. You'll just get a headache.)

  How about if we define the second in terms of an hour? Go right ahead -- but be aware that for most of history, including much of this millennium, the length of an hour wasn't 1/24th of a day, it was 1/12th of a period of sunlight or darkness. The length of an hour varied not only from day to day (as you received more or less sunlight through the seasons of the year) but also from daytime to nighttime -- in the middle of winter, your daytime hours would be short, but your nighttime hours would be very long, indeed. Some attempt was made before mechanical clocks to regularize the hour as a passage of time, through candles, water-clocks and hourglasses, but aside from a few amazing engineering marvels (mostly in China) no one would try to pretend that these methods were anything but imprecise.
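
  (To make that concrete with some arithmetic of my own: take a hypothetical winter day with nine hours of daylight.)

    # Unequal "temporal" hours: 1/12th of the daylight, 1/12th of the darkness.
    daylight_hours = 9                  # a hypothetical short winter day
    night_hours = 24 - daylight_hours
    print(daylight_hours * 60 / 12)     # 45.0 -- a daytime "hour" of 45 minutes
    print(night_hours * 60 / 12)        # 75.0 -- a nighttime "hour" of 75 minutes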

  Minutes? Please. The first clocks didn't even bother with a minute hand.

  In any event, while the essentially variable nature of these units of time doesn't matter to most of us, it presented a problem for scientists, who tend to be anal about things having exact definitions. You can't describe an event as occurring over a specified duration of time if that definition of time is not in fact definite. Once scientists realized this was the case, they decided something had to be done. They decided to change the definition of the second.

  Why the second? Because it's the most useful unit of time in terms of science. Also, one suspects that as removed from the real world as scientists are, they knew enough to know that the average Joe might raise a stink if them eggheads tried to screw with his whole day. A second -- eh. Who cares if they screw with that? A second's over before you know it.

  The first attempt to redefine the second came in 1956 when the (conspiracy theory lovers, start your engines now) International Committee on Weights and Measures defined the second as 1/31,556,925.9747th of the length of the tropical (seasonal) year of 1900. Everybody congratulated themselves on a job well done until someone pointed out that since 1900 was actually 56 years previous, it was going to be sort of difficult to get an actual bead on how long the second was back then. Entropy had occurred in the interim, after all. Back to the drawing board.
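
  (To get a feel for that ungainly number, divide it by the 86,400 seconds in a day and the familiar length of the year falls out. A quick sketch of my own:)

    # 1956 definition: one second = 1/31,556,925.9747 of the tropical year 1900
    seconds_per_tropical_year_1900 = 31_556_925.9747
    print(round(seconds_per_tropical_year_1900 / 86400, 4))   # 365.2422 days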

  The second attempt, in 1964, was somewhat more intelligent. For the first time, the second was defined entirely independent of either the rotation or revolution of our planet. It had been discovered that under particular conditions, the vibration of the cesium-133 atom occurred at phenomenally regular (though of course phenomenally small) intervals. So scientists pegged the second to that. Provisionally in 1964, and exclusively since 1967, the second has been defined as (you'll love this): "9,192,631,770 cycles of radiation associated with the transition between the two hyperfine levels of the ground state of the cesium-133 atom." Why 9,192,631,770 cycles? Well, because that was pretty much the length of time that everybody thought of as a "second." You have to hand it to those scientists.
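
  (In practice the definition runs the other way around: count cesium cycles, divide by 9,192,631,770, and you have your elapsed seconds. A toy sketch of that bookkeeping -- the function here is mine, purely for illustration.)

    # SI definition (since 1967): 9,192,631,770 cesium-133 cycles = 1 second
    CESIUM_CYCLES_PER_SECOND = 9_192_631_770

    def cycles_to_seconds(cycle_count):
        """Convert a count of cesium-133 hyperfine cycles to SI seconds."""
        return cycle_count / CESIUM_CYCLES_PER_SECOND

    # One minute's worth of cycles, counted back into seconds:
    print(cycles_to_seconds(60 * CESIUM_CYCLES_PER_SECOND))   # 60.0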

  However they came about it, the length of a second is now carved in cesium, an element whose cycle is not susceptible to the ravages of tidal friction, rotational energy or any other messy details of planetary rotational periods. It is still susceptible to relativistic issues, of course. But, hey. I told you not to think about that. Just enjoy it, for a second.

  Best End of the World of the Millennium.

  Thomas Muentzer's Armageddon, in 1525. It wasn't actually the end of the world, but really. When is it ever?

  The history of the human species is the history of a people waiting for the other shoe to drop. The very first human who had the ability to think beyond the next five minutes probably got up one morning, looked around the cave and the savannah outside, smiled briefly and then thought, you know, this just can't last. Humans are innately eschatological -- looking for the signs and portents that signify that the end of the world is nigh. It beats Yahtzee.

 
