The Most Human Human


by Brian Christian


  Information, defined intuitively and informally, might be something like “uncertainty’s antidote.” This turns out also to be the formal definition—the amount of information comes from the amount by which something reduces uncertainty. (Ergo, compressed files look random: nothing about bits 0 through n gives you any sense what bit n + 1 will be—that is, there is no pattern or trend or bias noticeable in the digits—otherwise there would be room for further compression.5) This value, the informational equivalent of mass, comes originally from Shannon’s 1948 paper and goes by the name of “information entropy” or “Shannon entropy” or just “entropy.”6 The higher the entropy, the more information there is. It turns out to be a value capable of measuring a startling array of things—from the flip of a coin to a telephone call, to a Joyce novel, to a first date, to last words, to a Turing test.
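Shannon's measure can be sketched in a few lines of code. The following is only a minimal illustration, with the function name and sample strings my own: it scores a string by its letter frequencies alone, computing the entropy −Σ p·log₂(p), whereas a real model of English would bring in far more context.

```python
# A minimal sketch of Shannon entropy: H = -sum(p * log2(p)) over the
# observed symbol probabilities. Names and sample strings are illustrative.
from collections import Counter
from math import log2

def entropy_per_char(text):
    """Average bits of information per character, from letter frequencies alone."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A perfectly predictable string carries no information per character;
# eight equally likely symbols carry three bits each.
print(entropy_per_char("aaaaaaaa"))  # 0.0 bits: no uncertainty at all
print(entropy_per_char("abcdefgh"))  # 3.0 bits: log2 of eight equal options
```

The higher the number, the more each new character reduces your uncertainty, which is exactly the sense in which it measures information.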

  The Shannon Game

  One of the most useful tools for quantitatively analyzing English goes by the name of the Shannon Game. It’s kind of like playing hangman, one letter at a time: the basic idea is that you try to guess the letters of a text, one by one, and the (logarithm of the) total number of guesses required tells you the entropy of that passage. The idea is to estimate how much knowledge native speakers bring to a text. Here’s the result of a round of the Shannon Game, played by yours truly:7

  We can see immediately that the information entropy here is wildly nonuniform: I was able to predict “the_living_room_is_a_” completely correctly, but almost exhausted the entire alphabet before getting the h of “handful”—and note how the “and” in “handful” comes easily but the entropy spikes up again at the f, then goes back down to the minimum at l. And “remo” was all I needed to fill in “te_control.”
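The game itself is easy to simulate. Here is a toy version, with the frequency ordering and all names my own invention: the "player" knows only the overall frequency of lowercase English letters, so unlike a human it brings no context to bear, but the per-letter guess counts it produces trace the same kind of nonuniform entropy profile.

```python
# A toy Shannon Game: the "reader" guesses each letter in plain frequency
# order (the classic "etaoin shrdlu" ranking), with '_' standing in for
# space. Assumes the text uses only lowercase letters and '_'.
FREQ_ORDER = "etaoinshrdlcumwfgypbvkjxqz_"

def guess_counts(text):
    """Number of guesses needed for each letter, with no use of context."""
    return [FREQ_ORDER.index(ch) + 1 for ch in text]

# Common letters fall quickly; rare letters and the space marker spike.
print(guess_counts("the_cat"))  # [2, 8, 1, 27, 12, 3, 2]
```

A human player's counts would be far lower on average, precisely because knowledge of words, grammar, and subject matter collapses most of that uncertainty.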

  Search and the Shannon Game

  We computer users of the twenty-first century are perhaps more aware of information entropy—if not by name—than any generation before. When I use Google, I intuitively type in the most unusual or infrequent words or phrases, ignoring more common or expected words, as they won’t much narrow down my results. When I want to locate a passage in the huge MS Word document that contains this manuscript, I intuitively start to type the most unusual part of the passage I have in mind: either a proper noun or an unusual diction choice or a unique turn of phrase.8 Part of the effectiveness of the strange editing mark “tk,” for “to come,” is that k rarely follows t in English, much more rarely than c follows t, and so a writer can easily use a computer to sift through a document and tell him whether there are any “tk’s” he missed. (Searching this manuscript for “tc” pulls up over 150 red herrings, like “watch” and “match”; but with only one exception, all occurrences of “tk”—out of the roughly half-million characters that make up a book—appear in this paragraph.) When I want to pull up certain songs or a certain band in my iTunes library, say Outkast, I know that “out” is such a prevalent string of letters (which pulls up all Outkast songs, plus 438 others I don’t want) that I’m better off just typing “kast” into the search box. Or even just that same trusty rare bigram “tk,” which pulls up all the songs I want and only three I don’t.
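The "tk" trick can be verified mechanically. A small sketch, with the sample sentence invented for illustration, counts two-letter pairs in a stretch of text; rare bigrams make precise search targets because they collide with almost nothing.

```python
# Count letter bigrams in a piece of English. "tc" turns up everywhere
# ("watch," "match," "catch"); "tk" essentially never does.
from collections import Counter

def bigram_counts(text):
    """Frequencies of adjacent letter pairs, ignoring case and punctuation."""
    letters = "".join(ch for ch in text.lower() if ch.isalpha())
    return Counter(letters[i:i + 2] for i in range(len(letters) - 1))

sample = "watch the match and catch the pitch"  # invented example
counts = bigram_counts(sample)
print(counts["tc"], counts["tk"])  # 4 0
```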

  Art and the Shannon Game

  “Not-knowing is crucial to art,” says Donald Barthelme, “is what permits art to be made.” He’s referring here to the what happens if I try this? and what do I do next? of the creative process, but I think it’s just as true a statement about what it’s like to be a reader. “Every book, for me, is the balance of YES and NO,” writes one of Jonathan Safran Foer’s narrators in Extremely Loud and Incredibly Close. The Shannon Game represents one approach, one very fine-grained approach, to thinking about the reading experience as a kind of extremely rapid sequence of guesses, and much of the satisfaction, it would seem, is in the balance between yes and no, affirmation and surprise. Entropy gives us a quantifiable measure of where exactly the not-knowing lies, how exactly the YES and NO congregate on the page. Going back to the original spirit of Barthelme’s statement, does entropy give us a road into the creative imagination as well? Are the unguessable moments also the most creative ones? My intuition says yes, there’s a link. Anecdotally, here’s the next round of the Shannon Game I tried:

  The highest-entropy letters (when I attempted this passage) were the Y in the first “you,” the C in “cat,” and the M in “move.” They’re also, interestingly, key grammatical moments: respectively, the subject of the first dependent clause and the subject and verb of the second dependent clause. And are these not also the moments where the author’s intention and creativity peak too? And are these not the words—especially “cat”—that, if removed, would be the most difficult for the reader to guess back in?

  This last measure actually has its own name; it’s called the “cloze test.” The name comes from something in Gestalt psychology called the “law of closure”—which refers to the observation that when people look at a shape with some missing pieces or erasures, they still in some sense “experience” the parts that are missing.9 Cloze tests make up some of the familiar SAT-type questions where you have to fill the most context-sensible word into a blank in a _______ (a) pretzel, (b) Scrabble tile, (c) machine-gun, (d) sentence. Remove the context clues and ask the question anyway, and you have one of my favorite sources of amusement as a child: Mad Libs.

  Crowded Room as Cloze Test

  However, even if one has to sit down to the SAT or open a Mad Libs book to run into the cloze test on paper, the oral version of the cloze test is so common it’s essentially unavoidable. The world is noisy—we are always trying to talk over the sound of the wind, or construction across the street, or static on the line, or the conversations of others (who are themselves trying to talk over us). The audible world is a cloze test.

  Though it seems academic, the effects of this are visible everywhere. My conversation with the Brighton hotel clerk was a cloze test with the blanks almost the size of the sentences themselves; still I could guess the right answers back in. But—and for this very reason—I wouldn’t call it a particularly human interaction. When I think about, say, the most thrilling intellectual discussions I’ve had with friends, or the best first dates I’ve had, I can’t begin to imagine them holding up with that many blanks to fill in. I wouldn’t have kept up.

  Also, think about how you talk to someone when there’s loud music playing—so often we start to iron out the idiosyncrasies in our diction and phrasing. Whether we’ve heard of the cloze test, the Shannon Game, and information entropy or not makes no difference: we know intuitively when—and how—to play them, and when and how to enable them to be played easily by others. “Let’s blow this popsicle stand,” I might say if I can be heard clearly. But noise does not tolerate such embellishment. (“Let’s go get pot stickers, man?” I imagine my interlocutor shouting back to me, confused.) No, in a loud room it’s just, “Let’s go.”

  It strikes me as incredibly odd, then, that so many of the sites of courtship and acquaintance making in our society are so loud.10 There are any number of otherwise cool bars and clubs in Seattle that my friends and I avoid: you walk in and see small clusters of people, huddled over drinks, shouting at each other, going hoarse over the music. I look in and think, as a confederate, the Turing test would be harder to defend in here. The noise has a kind of humanity-dampening effect. I don’t like it.

  Lossiness

  There are two types of compression: “lossless” and “lossy.” Lossless compression means that nothing is compromised; that is, upon decompression we can reconstruct the original in its entirety without any risk of getting it wrong or missing anything or losing any detail. (ZIP archives are one such example, your photos and documents unharmed by the process of making the archive.) The other type of compression is what’s called lossy; that is, we may lose some data or some level of detail as a cost of the compression. Most images that you see on the web, for instance, are lossy compressions of larger digital photos, and the MP3 files on your computer and iPod are lossy compressions
of much higher-resolution recordings at the labels. The cost is a certain amount of “fidelity.” tkng ll f th vwls t f ths sntnc, fr xmpl, nd mkng t ll lwrcs, wld cnsttt lssy cmprssn: for the most part the words can be reconstructed, but ambiguities arise here and there—e.g., “swim,” “swam,” and “swum”; “I took a hike today” and “Take a hike, toady”; “make a pizza” and “Mike Piazza.” In many instances, though—as is frequently the case with images, audio, and video—getting an exact replication of the original isn’t so important: we can get close enough and, by allowing that wiggle room, save a great deal of time, space, money, and/or energy.
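The disemvoweling example can be made concrete. Here is a sketch, with the function name my own, that lowercases the text, drops the vowels, and collapses the leftover spacing; the point is that distinct originals can collapse to the same compressed form, and those collisions are precisely the loss.

```python
# Lossy text compression by dropping vowels and case, as in the chapter's
# example. Distinct inputs can map to one output, so decompression is
# ambiguous: that ambiguity is what makes the scheme "lossy."
def disemvowel(text):
    kept = "".join(ch for ch in text.lower() if ch not in "aeiou")
    return " ".join(kept.split())  # collapse runs of leftover whitespace

print(disemvowel("make a pizza"))  # "mk pzz"
print(disemvowel("Mike Piazza"))   # "mk pzz" -- the very collision above
print(disemvowel("swim"), disemvowel("swam"), disemvowel("swum"))  # all "swm"
```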

  Entropy in Your Very Own Body

  Lest you think that the Shannon Game’s scoring of texts is an abstraction pertinent only to computer scientists and computational linguists, you might be interested to know that Shannon entropy correlates not only to the metrical stresses in a sentence but also to the pattern by which speakers enunciate certain words and swallow others.11 So even if you’ve never heard of it before, something in your head intuits Shannon entropy every time you open your mouth. Namely, it tells you how far to open it.12

  It turns out, too, that if we map the movements of readers’ eyes, their “saccades” and “fixations,” the way they dance over the text, their time spent snagging on (or returning to) certain parts of a passage corresponds quite nicely to its Shannon Game values. “Transitional probabilities between words have a measurable influence on fixation durations,” write the University of Edinburgh’s Scott McDonald and Richard Shillcock. “Readers skipped over predictable words more than unpredictable words and spent less time on predictable words when they did fixate on them,” writes a team of psychologists from the University of Massachusetts and Mount Holyoke College.

  As researchers Laurent Itti and Pierre Baldi put it, “Surprise explains best where humans look … [It] represents an easily computable shortcut towards events which deserve attention.” In other words, entropy guides the eye. It gives every passage a secret shape.

  Artifacts

  Lossy compression brings along with it what are known as compression “artifacts”—the places where the lossiness of the compression process leaves its scars on the data. The interesting thing about compression artifacts is that they are not random—as a matter of fact, they have a kind of signature. Two of the primary image formats on the web—GIF and JPEG—each leave a characteristic mark: The JPEG’s will be regions of what looks like turbulence or heat distortion in areas that would otherwise be either uniform color or sharply divided between colors and textures. The GIF’s will be speckles of one color against a background of a similar color (dithering) or smooth gradients of color divided into strips of uniform color (color banding).

  A “computer forensics” researcher named Neal Krawetz has used the compression artifacts in al Qaeda videos—employing a technique called “error level analysis”—to demonstrate that elements of the background are frequently green-screened in. He’s used the same techniques to document the startling ways that graphic designers in the fashion industry modify the photographs of their models.

  One of the strange artifacts that we’re quickly growing subconsciously used to is lag. When you play a DVD on a slow computer, notice that the moments when the scene changes quickly or when the camera is moving swiftly through an environment are the moments when the computer is most likely to start lagging. (For this reason, music videos, action movies, and, ironically, commercials, all with higher than normal rates of camera cuts and/or camera motion, suffer the most from compression. Romantic movies and sitcoms, with their slower cuts and frequent shots of people standing still and talking, hold up better, for instance, when you stream them over the Internet.) The implication is that these moments must contain more information per second of footage than other moments, such as a character talking against a static background. When you play a graphics-intensive computer game, look for the moments when the frame rate—how many fresh updates to the screen the computer can provide per second—suddenly drops. Some MP3 files use “variable bit rate encoding”—their bit rate changes depending on how “complex” the song is at each moment. Sound files are in general so much smaller than video files that you’re not likely to hear lag at the moments when the bit rate spikes up, but the principle is the same.
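The link between fast cuts and lag shows up even in a toy delta coder, a sketch of my own in which the cost of each frame is simply how many "pixels" changed since the previous one (the frames here are invented strings, not real video):

```python
# A toy delta coder: send the first frame whole, then only the pixels
# that differ from the previous frame. Scenes that barely change are
# nearly free; a hard cut costs as much as a full frame.
def delta_costs(frames):
    costs = [len(frames[0])]  # first frame must be sent in full
    for prev, cur in zip(frames, frames[1:]):
        costs.append(sum(a != b for a, b in zip(prev, cur)))
    return costs

talking_head = ["XXXXXXXX", "XXXXXXXX", "XXXXXXXO"]  # mostly static scene
action_cut   = ["XXXXXXXX", "OOOOOOOO", "XXXXXXXX"]  # every pixel changes

print(delta_costs(talking_head))  # [8, 0, 1]
print(delta_costs(action_cut))    # [8, 8, 8]
```

The second sequence demands a full frame's worth of data at every step, which is exactly the moment a slow connection or slow machine falls behind.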

  Compare this to watching a projected film in the theater: the entire image is swapped out every twenty-fourth of a second. In many shots, much of what’s on-screen remains the same or changes very little in that twenty-fourth of a second. So there’s a needless waste of energy. But the waste has a freeing effect: the projector and the filmstrip don’t need to know or care how dramatically the image is changing, whereas your computer, streaming data over the Internet and trying to milk the most from each bit, is sensitive to those things.

  And, because so much of what we see, hear, and do is compressed, we become sensitive to these things too. A live recording gets assigned a variable bit rate; a computer simulation of an event produces a variable frame rate. Could it be that the entropy of life itself is uneven? What might a study of compression have to teach us about the business of living well?

  Lossiness and Stakes

  One of the strange things about lossless compression is that certain things turn out to have a counterintuitively high information entropy. One example is static. Because static, both audio and visual, is random, by definition there aren’t patterns that a compressor could exploit; thus it has essentially the highest information entropy. What seems strange about this is that the stakes of that information are low—how can we have a lot of information, yet none of it worthwhile?
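This is easy to check with any lossless compressor. A sketch using Python's zlib, where the exact output sizes depend on zlib's defaults and should be read as ballpark figures:

```python
# Static has maximal entropy: a lossless compressor finds no pattern to
# exploit in random bytes, while a perfectly uniform signal collapses to
# almost nothing.
import os
import zlib

noise = os.urandom(100_000)  # simulated "static"
flat = bytes(100_000)        # one hundred thousand identical bytes

print(len(zlib.compress(noise)))  # roughly 100,000: incompressible
print(len(zlib.compress(flat)))   # a few hundred bytes at most
```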

  The picture changes, though, when we consider lossy compression. Ultimately, the quality of lossy compressions has to be judged by human eyes and ears—it isn’t as exact a science as lossless compression. One thing that does become clear is that certain things can be pushed fairly far before they show subjective quality loss. Here’s where “stakes” enter. To “accurately” capture television snow, for instance, one simply needs a general sense of its range of colors and general texture. Because it’s so nearly random, it is extremely hard to losslessly compress, but will lossily compress down to almost nothing.

  Lossy compression is fuzzier: it’s more subjective and inexact as a field. But it’s the far more pervasive kind. Every time you answer a phone call from your mother and she asks you how your day was, the answer she receives will be lossily compressed thrice: by the phone company, shaving away certain measures of audio fidelity to fit more calls on its network bandwidth; by you, turning 60 sec/min × 60 min/hr × 8 hrs = 28,800 seconds of lived experience into a couple hundred syllables; and, last, by her own memory, which shucks those syllables to get at a paraphrasable “gist,” forgetting most of the syllables themselves within seconds.

  It follows that not only excerpt and quotation but description is a form of lossy compression. In fact, lossy compression is the very essence of what language is. It accounts for both its immense shortcomings and its immense value—and is another example of the “not-knowing” that allows for art.

  Compression and Literary Technique

  Synecdoche, the linguistic device by which we name a part but mean the whole—“a new set of wheels,” meaning car; “mouths to feed,” meaning people; “nice threads,” meaning clothing item—gives us one option of transmission in which we attempt to preserve and transmit the most salient part of an experience, with the understanding that the reader will fill in the rest. The storyteller who uses synecdoche is like a botanist who returns from the field with a stem cutting that grows an entire tree—or the blue Linckia sea star, whose severed arm will regenerate an entire new body. The piece gives you back the whole.

  T. S. Eliot says in his famous 1915 poem “The Love Song of J. Alfred Prufrock,” “I should have been a pair of ragged claws / Scuttling across the floors of silent seas.” Given the claws, we imagine the rest of the crustacean’s body quite clearly, whereas if he’d said “I
should have been a ragged crab,” we would know the claws were there, of course, but they’d be less vivid, fuzzed out, lower resolution.

  Similar to synecdoche is the use of “enthymemes,” a technique in argumentation where you explain a piece of reasoning but leave out a premise (because it’s assumed to be understood) or the conclusion (because you want your audience to derive it on their own). An example of the former would be to say “Socrates is a man, so Socrates must eventually die,” where the obvious second premise, “All men must eventually die,” is left unstated. Leaving out a premise, when you’re confident that your interlocutor can fill it back in, speeds things up and avoids stating the obvious.13 And leaving off the conclusion can produce drama, leading the audience all the way up to a point but letting them come to it themselves: “Well, Socrates is a man, and all men must eventually die, so …” There’s some evidence that in courtroom closing statements and classroom lectures, making the audience (jurors or students) assemble the conclusion or “punch line” themselves is more engaging and therefore makes a greater impact. (This assumes, however, that they arrive at the conclusion you intend. Other conclusions—e.g., corpses are ticklish!—may be possible; this is the lossiness of enthymemes.)

  Similarly, when we use the technique of “aposiopesis,” a technique popular among screenwriters and playwrights where a thought or line of dialogue is suddenly broken off—

 
