>>> guidelines reconfirmed
Our work is generating many interesting new findings on questions such as: What makes a website credible? What inspires user loyalty? We’re running more studies to dig into these issues, which are among the most important for improving website profitability over the next decade. Once we’ve analyzed the mountains of data we’re collecting, we’ll announce the new findings at our upcoming usability conference.
For now, one thing is clear: we’re confirming more and more of the old usability guidelines. Even though we have new issues to consider, the old issues aren’t going away. A few examples:
• E-mail newsletters remain the best way to drive users back to websites. It’s incredible how often our study participants say that a newsletter is their main reason for revisiting a site. Most business professionals are not very interested in podcasts or newsfeeds (RSS).
• Opening new browser windows is highly confusing for most users. Although many users can cope with extra windows that they’ve opened themselves, few understand why the Back button suddenly stops working in a new window that the computer initiated. Opening new windows was #2 on my list of top ten Web design mistakes of 1999; that this design approach continues to hurt users exemplifies both the longevity of usability guidelines and the limited improvement in user skills.
• Links that don’t change color when clicked still create confusion, leaving users unsure about what they’ve already seen on a site. (The standard fix for this and for the new-window problem above is sketched in code after this list.)
• Splash screens and intros are still incredibly annoying: users look for a “skip intro” button and, if they can’t find one, often leave. One user wanted to buy custom-tailored shirts and first visited Turnbull & Asser because of its reputation. Clicking the appropriate link led to a page where a video started playing without warning and with no way to skip it and proceed directly to actual information about the service. The user watched a few seconds, grew more and more agitated at the lack of any option to bypass the intro, and finally closed the site and went to a competitor. Customer lost.
• A fairly large minority of users still don’t know that they can get to a site’s home page by clicking its logo, so I still have to recommend an explicit “home” link on all interior pages (not on the home page, of course, because no-op links that point to the current page are confusing; yet another guideline we saw confirmed several times last week). It particularly irks me to have to retain the “explicit home link” guideline, because I had hoped to get rid of this stupid extra link. But many users really do change very slowly, so we’ll probably have to keep this guideline in force until 2020, maybe longer. At least breadcrumbs are a simple way to satisfy this need.
• People are still very wary, sometimes more so than in the past, about giving out personal information. In particular, the B2B sites in this new study failed in exactly the same way as most B2B sites in our major B2B research: by hitting users with a registration screen before they were sufficiently committed to the site.
• Nonstandard scrollbars are often overlooked and make people miss most of a site’s offerings. Consider two examples from last week’s testing; a sketch of the standard alternative follows them.
On the Carl’s Jr. hamburger chain website, we asked users to look up nutritional information for various meals. Many participants thought the quick item view menu covered only breakfast items, because those were the only choices visible without scrolling. Users overlooked the nonstandard scrollbar and instead often suffered through the PDF files available via the nutrition guide link. (These PDF files caused many other problems, confirming more age-old usability guidelines. That said, some users are now skillful enough to adjust PDF views so that they’re slightly more readable. Still, it’s a painful process.)
On the Sundance Resort’s site, one user was thrilled to see photos of celebrations hosted at the resort. She eagerly clicked through all five visible thumbnails, but never noticed the small triangles at the top and bottom that let users scroll to see more photos.
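Both failures have the same small, standard fix: constrain the container’s height and let the browser draw its native scrollbar instead of hiding the overflow behind custom arrows. The TypeScript below is a minimal sketch of that idea, not code from either site; the element id is an invented stand-in.

```typescript
// Rely on the browser's native scrollbar so users can see at a glance
// that more items exist below the visible ones.
// The id "quick-item-view" is a hypothetical example.
const menu = document.getElementById("quick-item-view");
if (menu) {
  menu.style.maxHeight = "20rem";  // constrain the height...
  menu.style.overflowY = "auto";   // ...and let the standard scrollbar appear
}
```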
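The new-window and link-color guidelines mentioned earlier in the list are just as cheap to satisfy. Again, this is a hedged sketch rather than anything from the tested sites, with invented selectors and the browser-default link colors assumed:

```typescript
// 1. Open links in the current window so the Back button keeps working.
document.querySelectorAll<HTMLAnchorElement>('a[target="_blank"]')
  .forEach((link) => link.removeAttribute("target"));

// 2. Keep distinct unvisited/visited link colors so users can tell
//    which pages they have already seen.
const style = document.createElement("style");
style.textContent = `
  a:link    { color: #0000ee; }  /* unvisited: the familiar blue */
  a:visited { color: #551a8b; }  /* visited: the familiar purple */
`;
document.head.appendChild(style);
```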
Web usability guidelines are not the only guidelines our new studies confirm. On VW’s site, we asked participants to use the configurator to customize a car according to their preferences. Unfortunately, this mini-application violated some of the basic application usability guidelines, causing people many problems.
Users can select their car’s wheel style from two options. This simple operation was difficult and error prone, however, because the option for the wheel that’s currently mounted on the car was grayed out—a GUI convention that’s supposed to mean that something is unavailable, not that it’s the current selection. It would have been much better to show both available wheels at all times, placing a selection rectangle—or some other graphical highlighting convention—around the current selection. (Poor feedback is #4 on my list of top ten mistakes of application design.)
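The convention Nielsen recommends is straightforward to express in code. The sketch below is a guess at the general pattern, not VW’s actual markup: both options stay enabled at all times, and the current choice is marked with a highlight class rather than the disabled state.

```typescript
// Both wheel options remain clickable; the current one is marked with
// a highlight class. Never use "disabled" for the current selection:
// it conventionally means "unavailable." Class names are hypothetical.
const options = document.querySelectorAll<HTMLButtonElement>(".wheel-option");
options.forEach((option) => {
  option.addEventListener("click", () => {
    options.forEach((o) => o.classList.remove("selected")); // clear old highlight
    option.classList.add("selected"); // CSS draws the selection rectangle
  });
});
```

A rule such as `.selected { outline: 3px solid #0063b1; }` then supplies the selection rectangle (or some other graphical highlighting convention) around the chosen wheel.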
Interface conventions exist for a reason: they allow users to focus on your content (in this case, the car and its options). When all interface elements work as expected, users know how to operate the UI to get the desired effect. Conversely, when you deviate from user expectations, you erect a great barrier between users and their ability to get things done. Some designers think this makes the site more exciting. In reality, nonstandard design makes the site more frustrating and drastically reduces the user’s chance of success. Users are thus more likely to quickly leave the site.
In VW’s case, the designers probably suffered from a case of metaphor overload: the design mimics the experience of actually assembling a physical car in a real workshop. If you had two wheels on the workshop floor and mounted one on the car, then the chosen wheel would no longer be on the floor.
In reality, though, users are not grease monkeys. They’re clicking on interface elements, and they expect the picture of a wheel to behave like a GUI element.
We’re confirming hundreds more of the existing usability guidelines every week as our testing continues. Even though we have upscale users and it’s a new study testing new sites, most of the findings are the same as we’ve seen year after year after year. Usability guidelines remain remarkably constant over time, because basic human characteristics stay the same.
< Nicholas Carr >
is google making us stupid?
Originally published in The Atlantic (July/August 2008).
NICHOLAS CARR is the author of The Shallows: What the Internet Is Doing to Our Brains (2010) and The Big Switch: Rewiring the World, from Edison to Google (2008). He has been a columnist for The Guardian and executive editor of Harvard Business Review, and has written for The Atlantic, The New York Times, The Wall Street Journal, Wired, The Times (London), and The New Republic. His blog is roughtype.com.
“DAVE, STOP. Stop, will you? Stop, Dave. Will you stop, Dave?” So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick’s 2001: A Space Odyssey. Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial brain. “Dave, my mind is going,” HAL says, forlornly. “I can feel it. I can feel it.”
I can feel it, too. Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets: reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)
For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the Web not so much because the way I read has changed, i.e., I’m just seeking convenience, but because the way I THINK has changed?”
Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the Web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.”
Anecdotes alone don’t prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited “a form of skimming activity,” hopping from one source to another and rarely returning to any source they’d already visited. They typically read no more than one or two pages of an article or book before they would “bounce” out to another site. Sometimes they’d save a long article, but there’s no evidence that they ever went back and actually read it. The authors of the study report:

It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.
Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. “We are not only what we read,” says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. “We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.
Reading, explains Wolf, is not an instinctive skill for human beings. It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.
Sometime in 1882, Friedrich Nietzsche bought a typewriter—a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.
But the machine had a subtler effect on his work. One of Nietzsche’s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. “Perhaps you will through this instrument even take to a new idiom,” the friend wrote in a letter, noting that, in his own work, his “‘thoughts’ in music and language often depend on the quality of pen and paper.”
“You are right,” Nietzsche replied, “our writing equipment takes part in the forming of our thoughts.” Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche’s prose “changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.”
The human brain is almost infinitely malleable. People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind “is very plastic.” Nerve cells routinely break old connections and form new ones. “The brain,” according to Olds, “has the ability to reprogram itself on the fly, altering the way it functions.”
As we use what the sociologist Daniel Bell has called our “intellectual technologies”—the tools that extend our mental rather than our physical capacities—we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the fourteenth century, provides a compelling example. In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock “disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.” The “abstract framework of divided time” became “the point of reference for both action and thought.”
The clock’s methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments “remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality.” In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.
The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.