The Design of Everyday Things
There is another problem: the general conservatism of large companies. Most radical ideas fail: large companies are not tolerant of failure. Small companies can jump in with new, exciting ideas because if they fail, well, the cost is relatively low. In the world of high technology, many people get new ideas, gather together a few friends and early risk-seeking employees, and start a new company to exploit their visions. Most of these companies fail. Only a few will be successful, either by growing into a larger company or by being purchased by a large company.
You may be surprised by the large percentage of failures, but that is only because they are not publicized: we only hear about the tiny few that become successful. Most startup companies fail, but failure in the high-tech world of California is not considered bad. In fact, it is considered a badge of honor, for it means that the company saw a future potential, took the risk, and tried. Even though the company failed, the employees learned lessons that make their next attempt more likely to succeed. Failure can occur for many reasons: perhaps the marketplace is not ready; perhaps the technology is not ready for commercialization; perhaps the company runs out of money before it can gain traction.
When one early startup company, Fingerworks, was struggling to develop an affordable, reliable touch surface that distinguished among multiple fingers, it almost quit because it was about to run out of money. Apple, however, anxious to get into this market, bought Fingerworks. When it became part of Apple, its financial needs were met and Fingerworks technology became the driving force behind Apple’s new products. Today, devices controlled by gestures are everywhere, so this type of interaction seems natural and obvious, but at the time, it was neither natural nor obvious. It took almost three decades from the invention of multitouch before companies were able to manufacture the technology with the robustness, versatility, and very low cost necessary for the idea to be deployed in the home consumer market. Ideas take a long time to traverse the distance from conception to successful product.
VIDEOPHONE: CONCEIVED IN 1879—STILL NOT HERE
The Wikipedia article on videophones, from which Figure 7.3 was taken, said: “George du Maurier’s cartoon of ‘an electric camera-obscura’ is often cited as an early prediction of television and also anticipated the videophone, in wide screen formats and flat screens.” Although the title of the drawing gives credit to Thomas Edison, he had nothing to do with this. This is sometimes called Stigler’s law: the names of famous people often get attached to ideas even though they had nothing to do with them.
The world of product design offers many examples of Stigler’s law. Products are thought to be the invention of the company that most successfully capitalized upon the idea, not the company that originated it. In the world of products, original ideas are the easy part. Actually producing the idea as a successful product is what is hard. Consider the idea of a video conversation. Thinking of the idea was so easy that, as we see in Figure 7.3, Punch magazine illustrator du Maurier could draw a picture of what it might look like only two years after the telephone was invented. The fact that he could do this probably meant that the idea was already circulating. By the late 1890s, Alexander Graham Bell had thought through a number of the design issues. But the wonderful scenario illustrated by du Maurier has still not become reality, one and one-half centuries later. Today, the videophone is barely getting established as a means of everyday communication.
FIGURE 7.3. Predicting the Future: The Videophone in 1879. The caption reads: “Edison’s Telephonoscope (transmits light as well as sound). (Every evening, before going to bed, Pater- and Materfamilias set up an electric camera-obscura over their bedroom mantel-piece, and gladden their eyes with the sight of their children at the Antipodes, and converse gaily with them through the wire.)” (Published in the December 9, 1878, issue of Punch magazine. From “Telephonoscope,” Wikipedia.)
It is extremely difficult to develop all the details required to ensure that a new idea works, to say nothing of finding components that can be manufactured in sufficient quantity, with the required reliability and affordability. With a brand-new concept, it can take decades before the public will endorse it. Inventors often believe their new ideas will revolutionize the world in months, but reality is harsher. Most new inventions fail, and even the few that succeed take decades to do so. Yes, even the ones we consider “fast.” Most of the time, the technology is unnoticed by the public as it circulates around the research laboratories of the world or is tried by a few unsuccessful startup companies or adventurous early adopters.
Ideas that are too early often fail, even when others later introduce them successfully. I’ve seen this happen several times. When I first joined Apple, I watched as it released one of the very first commercial digital cameras: the Apple QuickTake. It failed. Probably you are unaware that Apple ever made cameras. It failed because the technology was limited, the price high, and the world simply wasn’t ready to dismiss film and chemical processing of photographs. I was an adviser to a startup company that produced the world’s first digital picture frame. It failed. Once again, the technology didn’t quite support it and the product was relatively expensive. Obviously, today digital cameras and digital photo frames are extremely successful products, but neither Apple nor the startup I worked with is part of the story.
Even as digital cameras started to gain a foothold in photography, it took several decades before they displaced film for still photographs. It is taking even longer to replace film-based movies with those produced on digital cameras. As I write this, only a small number of films are made digitally, and only a small number of theaters project digitally. How long has the effort been going on? It is difficult to determine when the effort started, but it has been a very long time. It took decades for high-definition television to replace the standard, very poor resolution of the previous generation (NTSC in the United States, PAL and SECAM elsewhere). Why so long to get to a far better picture, along with far better sound? People are very conservative. Broadcasting stations would have to replace all their equipment. Homeowners would need new sets. Overall, the only people who push for changes of this sort are the technology enthusiasts and the equipment manufacturers. A bitter fight between the television broadcasters and the computer industry, each of which wanted different standards, also delayed adoption (described in Chapter 6).
In the case of the videophone shown in Figure 7.3, the illustration is wonderful but the details are strangely lacking. Where would the video camera have to be located to display that wonderful panorama of the children playing? Notice that “Pater- and Materfamilias” are sitting in the dark (because the video image is projected by a “camera obscura,” which has a very weak output). Where is the video camera that films the parents, and if they sit in the dark, how can they be visible? It is also interesting that although the video quality looks even better than we could achieve today, sound is still being picked up by trumpet-shaped telephones whose users need to hold the speaking tube to their face and talk (probably loudly). Thinking of the concept of a video connection was relatively easy. Thinking through the details has been very difficult, and then being able to build it and put it into practice—well, it is now considerably over a century since that picture was drawn and we are just barely able to fulfill that dream. Barely.
It took forty years for the first working videophones to be created (in the 1920s), then another ten years before the first product (in the mid-1930s, in Germany), which failed. The United States didn’t try commercial videophone service until the 1960s, thirty years after Germany; that service also failed. All sorts of ideas have been tried, including dedicated videophone instruments, devices using the home television set, video conferencing with home personal computers, special video-conferencing rooms in universities and companies, and small video telephones, some of which might be worn on the wrist. It took until the start of the twenty-first century for usage to pick up.
Video conferencing finally started to become common in the early 2010s. Extremely expensive videoconferencing suites have been set up in businesses and universities. The best commercial systems make it seem as if you are in the same room with the distant participants, using high-quality transmission of images and multiple, large monitors to display life-size images of people sitting across the table (one company, Cisco, even sells the table). This is 140 years from the first published conception, 90 years since the first practical demonstration, and 80 years since the first commercial release. Moreover, the cost, both for the equipment at each location and for the data-transmission charges, is much higher than the average person or business can afford: right now these systems are mostly used in corporate offices. Many people today do engage in videoconferencing from their smart display devices, but the experience is not nearly as good as that provided by the best commercial facilities. Nobody would confuse these experiences with being in the same room as the participants, something that the highest-quality commercial facilities aspire to (with remarkable success).
Every modern innovation, especially the ones that significantly change lives, takes multiple decades to move from concept to company success. A rule of thumb is twenty years from first demonstrations in research laboratories to commercial product, and then a decade or two from first commercial release to widespread adoption. Except that actually, most innovations fail completely and never reach the public. Even ideas that are excellent and will eventually succeed frequently fail when first introduced. I’ve been associated with a number of products that failed upon introduction, only to be very successful later when reintroduced (by other companies), the real difference being the timing. Products that failed at first commercial introduction include the first American automobile (Duryea), the first typewriters, the first digital cameras, and the first home computers (for example, the Altair 8800 computer of 1975).
THE LONG PROCESS OF DEVELOPMENT OF THE TYPEWRITER KEYBOARD
The typewriter is an ancient mechanical device, now found mostly in museums, although still in use in newly developing nations. In addition to having a fascinating history, it illustrates the difficulties of introducing new products into society, the influence of marketing upon design, and the long, difficult path leading to new product acceptance. The history affects all of us because the typewriter provided the world with the arrangement of keys on today’s keyboards, despite the evidence that it is not the most efficient arrangement. Tradition and custom, coupled with the large number of people already used to an existing scheme, make change difficult or even impossible. This is the legacy problem once again: the heavy momentum of legacy inhibits change.
Developing the first successful typewriter required a lot more than simply figuring out a reliable mechanism for imprinting the letters upon the paper, although that was a difficult task by itself. One question was the user interface: how should the letters be presented to the typist? In other words, the design of the keyboard.
Consider the typewriter keyboard, with its arbitrary, diagonally sloping arrangement of keys and its even more arbitrary arrangement of their letters. Christopher Latham Sholes designed the current standard keyboard in the 1870s. His typewriter design, with its weirdly organized keyboard, eventually became the Remington typewriter, the first successful typewriter: its keyboard layout was soon adopted by everyone.
The design of the keyboard has a long and peculiar history. Early typewriters experimented with a wide variety of layouts, using three basic themes. One was circular, with the letters laid out alphabetically; the operator would find the proper spot and depress a lever, lift a rod, or do whatever other mechanical operation the device required. Another popular layout was similar to a piano keyboard, with the letters laid out in a long row; some of the early keyboards, including an early version by Sholes, even had black and white keys. Both the circular layout and the piano keyboard proved awkward. In the end, the typewriter keyboards all ended up using multiple rows of keys in a rectangular configuration, with different companies using different arrangements of the letters. The levers manipulated by the keys were large and ungainly, and the size, spacing, and arrangement of the keys were dictated by these mechanical considerations, not by the characteristics of the human hand. Hence the keyboard sloped and the keys were laid out in a diagonal pattern to provide room for the mechanical linkages. Even though we no longer use mechanical linkages, the keyboard design is unchanged, even for the most modern electronic devices.
Alphabetical ordering of keys seems logical and sensible: Why did it change? The reason is rooted in the early technology of keyboards. Early typewriters had long levers attached to the keys. The levers moved individual typebars to contact the typing paper, usually from behind (the letters being typed could not be seen from the front of the typewriter). These long type arms would often collide and lock together, requiring the typist to separate them manually. To avoid the jamming, Sholes arranged the keys and the typebars so that letters that were frequently typed in sequence did not come from adjacent typebars. After a few iterations and experiments, a standard emerged, one that today governs keyboards used throughout the world, although with regional variations. The top row of the American keyboard has the keys Q W E R T Y U I O P, which gives rise to the name of this layout: QWERTY. The world has adopted the basic layout, although in Europe, for example, one can find QZERTY, AZERTY, and QWERTZ. Different languages use different alphabets, so obviously a number of keyboards had to move keys around to make room for additional characters.
FIGURE 7.4. The 1872 Sholes Typewriter. Remington, the manufacturer of the first successful typewriter, also made sewing machines. Figure A shows the influence of the sewing machine upon the design with the use of a foot pedal for what eventually became the “return” key. A heavy weight hung from the frame advanced the carriage after each letter was struck, or when the large, rectangular plate under the typist’s left hand was depressed (this is the “space bar”). Pressing the foot pedal raised the weight. Figure B shows a blowup of the keyboard. Note that the second row shows a period (.) instead of R. From Scientific American’s “The Type Writer” (Anonymous, 1872).
Note that popular legend has it that the keys were placed so as to slow down typing. This is wrong: the goal was to have the mechanical typebars approach one another at large angles, thus minimizing the chance of collision. In fact, we now know that the QWERTY arrangement yields reasonably fast typing: placing the letters of frequent pairs relatively far apart tends to make those pairs be typed with different hands, and alternating hands is faster than striking successive keys with the same one.
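For readers who want to check the hand-alternation claim themselves, here is a minimal Python sketch (not from the book). It assumes the conventional touch-typing split of QWERTY letters into left- and right-hand zones; the short bigram list is an illustrative sample of frequent English pairs, not real corpus data.

```python
# Minimal sketch: how often do common English bigrams alternate hands
# on QWERTY? The hand split below is the conventional touch-typing
# assignment; the bigram list is illustrative, not real frequency data.

LEFT_HAND = set("qwertasdfgzxcvb")   # keys struck by the left hand
RIGHT_HAND = set("yuiophjklnm")      # keys struck by the right hand

def alternates(bigram: str) -> bool:
    """True if the two letters of the bigram fall on different hands."""
    first, second = bigram[0], bigram[1]
    return (first in LEFT_HAND) != (second in LEFT_HAND)

# A small sample of frequent English bigrams (approximate ranking).
common_bigrams = ["th", "he", "in", "er", "an", "re", "nd", "on", "en", "at"]

alternating = [bg for bg in common_bigrams if alternates(bg)]
print(f"{len(alternating)} of {len(common_bigrams)} sample bigrams "
      f"alternate hands: {', '.join(alternating)}")
```

A serious analysis would weight each bigram by its frequency in a large corpus; this toy sample merely illustrates the method.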
There is an unconfirmed story that a salesperson rearranged the keyboard to make it possible to type the word typewriter on the second row, a change that violated the design principle of separating letters that were typed sequentially. Figure 7.4B shows that the early Sholes keyboard was not QWERTY: the second row of keys had a period (.) where today we have R, and the P and R keys were on the bottom row, among other differences. Moving the R and P from the fourth row to the second makes it possible to type the word typewriter using only keys on the second row.
There is no way to confirm the validity of the story. Moreover, I have only heard it told of the interchange of the period and R keys, with no discussion of the P key. For the moment, suppose the story were true: I can imagine the engineering minds being outraged. This sounds like the traditional clash between the hard-headed, logical engineers and the noncomprehending sales and marketing force. Was the salesperson wrong? (Note that today we would call this a marketing decision, but the profession of marketing didn’t exist yet.) Well, before taking sides, realize that until then, every typewriter company had failed. Remington was going to come out with a typewriter with a weird arrangement of the keys. The sales staff were right to be worried. They were right to try anything that might enhance the sales efforts. And indeed, they succeeded: Remington became the leader in typewriters. Actually, its first model did not succeed. It took quite a while for the public to accept the typewriter.
Was the keyboard really changed to allow the word typewriter to be typed on one row? I cannot find any solid evidence. But it is clear that the positions of R and P were moved to the second row: compare Figure 7.4B with today’s keyboard.
The keyboard was designed through an evolutionary process, but the main driving forces were mechanical and marketing. Even though jamming isn’t a possibility with electronic keyboards and computers, and the style of typing has changed, we are committed to this keyboard, stuck with it forever. But don’t despair: it really is a good arrangement. One legitimate area of concern is the high incidence of a kind of injury that befalls typists: carpal tunnel syndrome. This ailment is a result of frequent and prolonged repetitive motions of the hand and wrist, so it is common among typists, musicians, and people who do a lot of handwriting, sewing, some sports, and assembly-line work. Gestural keyboards, such as the one shown in Figure 7.2D, might reduce the incidence. The US National Institutes of Health advises, “Ergonomic aids, such as split keyboards, keyboard trays, typing pads, and wrist braces, may be used to improve wrist posture during typing. Take frequent breaks when typing and always stop if there is tingling or pain.”
August Dvorak, an educational psychologist, painstakingly developed a better keyboard in the 1930s. The Dvorak keyboard layout is indeed superior to that of QWERTY, but not to the extent claimed. Studies in my laboratory showed that the typing speed on a QWERTY was only slightly slower than on a Dvorak, not different enough to make upsetting the legacy worthwhile. Millions of people would have to learn a new style of typing. Millions of typewriters would have to be changed. Once a standard is in place, the vested interests of existing practices impede change, even where the change would be an improvement. Moreover, in the case of QWERTY versus Dvorak, the gain is simply not worth the pain. “Good enough” triumphs again.