(Some examples of Pentagon projects include the Internet, which was originally designed to connect scientists and officials during and after a nuclear war, and the GPS system, which was originally designed to guide ICBMs. But both the Internet and GPS were declassified and given to the public after the end of the Cold War.)
In 2004, the contest had an embarrassing beginning, when not a single driverless car was able to travel the 150 miles of rugged terrain and cross the finish line. The robotic cars either broke down or got lost. But the next year, five cars completed an even more demanding course. They had to drive on roads that included 100 sharp turns, three narrow tunnels, and paths with sheer drop-offs on either side.
Some critics said that robotic cars might be able to travel in the desert but never in midtown traffic. So in 2007, DARPA sponsored an even more ambitious project, the Urban Challenge, in which robotic cars had to complete a grueling 60-mile course through mock-urban territory in less than six hours. The cars had to obey all traffic laws, avoid other robot cars along the course, and negotiate four-way intersections. Six teams successfully completed the Urban Challenge, with the top three claiming the $2 million, $1 million, and $500,000 prizes.
The Pentagon’s goal is to make fully one-third of the U.S. ground forces autonomous by 2015. This could prove to be a lifesaving technology, since recently most U.S. casualties have been from roadside bombs. In the future, many U.S. military vehicles will have no drivers at all. But for the consumer, it might mean cars that drive themselves at the touch of a button, allowing the driver to work, relax, admire the scenery, watch a movie, or scan the Internet.
I had a chance to drive one of these cars myself for a TV special for the Discovery Channel. It was a sleek sports car, modified by the engineers at North Carolina State University so that it became fully autonomous. Its computers had the power of eight PCs. Entering the car was a bit of a problem for me, since the interior was crammed. Everywhere inside, I could see sophisticated electronic components piled on the seats and dashboard. When I grabbed the steering wheel, I noticed that it had a special rubber cable connected to a small motor. A computer, by controlling the motor, could then turn the steering wheel.
After I turned the key, stepped on the accelerator, and steered the car onto the highway, I flicked a switch that allowed the computer to take control. I took my hands off the wheel, and the car drove itself. I had full confidence in the car, whose computer was constantly making tiny adjustments via the rubber cable on the steering wheel. At first, it was a bit eerie noticing that the steering wheel and accelerator pedal were moving by themselves. It felt like there was an invisible, ghostlike driver who had taken control, but after a while I got used to it. In fact, later it became a joy to be able to relax in a car that drove itself with superhuman accuracy and skill. I could sit back and enjoy the ride.
The heart of the driverless car was the GPS system, which allowed the computer to locate its position to within a few feet. (Sometimes, the engineers told me, the GPS system could determine the car’s position to within inches.) The GPS system itself is a marvel of modern technology. Each of the thirty-two GPS satellites orbiting the earth emits a specific radio wave, which is then picked up by the GPS receiver in my car. The signal from each satellite is slightly distorted because each satellite travels in a slightly different orbit. This distortion is called the Doppler shift. (Radio waves, for example, are compressed if the satellite is moving toward you, and are stretched if it moves away from you.) By analyzing the slight distortion of frequencies from three or four satellites, the car’s computer could determine my position accurately.
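To make the positioning idea concrete, here is a minimal sketch of how a receiver's location can be recovered from several satellites whose positions are known. It is my own simplified illustration, using a range-based least-squares fit rather than the Doppler analysis described above, and it ignores the receiver clock error that a real GPS solution must also estimate; the satellite coordinates and distances are made-up numbers.

```python
import numpy as np

def estimate_position(sat_positions, measured_ranges, guess, iterations=10):
    """Fit the receiver position that best matches the measured satellite ranges."""
    x = np.asarray(guess, dtype=float)
    sats = np.asarray(sat_positions, dtype=float)
    ranges = np.asarray(measured_ranges, dtype=float)
    for _ in range(iterations):
        diffs = x - sats                        # vector from each satellite to the guess
        dists = np.linalg.norm(diffs, axis=1)   # predicted range at the current guess
        residuals = ranges - dists              # mismatch between measured and predicted
        jacobian = diffs / dists[:, None]       # sensitivity of each range to the position
        # Least-squares (Gauss-Newton) step that shrinks the mismatch
        step, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
        x = x + step
    return x

# Hypothetical satellite positions (km) and the distances measured to each of them
sats = [(15600, 7540, 20140), (18760, 2750, 18610),
        (17610, 14630, 13480), (19170, 610, 18390)]
true_pos = np.array([-40.0, -10.0, 6370.0])          # receiver near the earth's surface
ranges = [float(np.linalg.norm(true_pos - np.array(s))) for s in sats]

print(estimate_position(sats, ranges, guess=(0.0, 0.0, 6000.0)))   # ≈ (-40, -10, 6370)
```

With four or more satellites in view, the same kind of fit can also absorb the receiver's clock offset, which is why a handset with no atomic clock of its own can still locate itself so precisely.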
The car also had radar in its fenders so that it could sense obstacles. This will be crucial in the future, as each car will automatically take emergency measures as soon as it detects an impending accident. Today, almost 40,000 people in the United States die in car accidents every year. In the future, the words car accident may gradually disappear from the English language.
Traffic jams may also be a thing of the past. A central computer will be able to track the motion of every car on the road by communicating with each driverless car. It will then easily spot traffic jams and bottlenecks on the highways. In one experiment, conducted north of San Diego on Interstate 15, chips were placed in the road so that a central computer could take control of the cars traveling over them. In case of a traffic jam, the computer could override the drivers and allow traffic to flow freely.
The car of the future will also be able to sense other dangers. Thousands of people have been killed or injured in car accidents when the driver fell asleep, especially at night or on long, monotonous trips. Computers today can focus on your eyes and recognize the telltale signs of your becoming drowsy. The computer is then programmed to make a sound and wake you up. If this fails, the computer will take over the car. Computers can also recognize the presence of excessive amounts of alcohol in the car, which may reduce the thousands of alcohol-related fatalities that happen every year.
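As one hedged illustration of how such monitoring might work (my own sketch, not a description of any particular car's system), a common research measure of drowsiness is PERCLOS: the fraction of recent time that the eyes have been mostly closed. The sketch below assumes a camera pipeline already supplies a per-frame eye-openness score; the thresholds and window length are invented values for the example.

```python
from collections import deque

class DrowsinessMonitor:
    """Flags drowsiness when the eyes have been mostly closed too often recently."""

    def __init__(self, window_frames=900, closed_threshold=0.2, alarm_perclos=0.15):
        # window_frames ~ 30 seconds of video at 30 frames per second (made-up values)
        self.window = deque(maxlen=window_frames)
        self.closed_threshold = closed_threshold   # openness below this counts as "closed"
        self.alarm_perclos = alarm_perclos         # alarm if closed >15% of the window

    def update(self, eye_openness):
        """Feed one frame's eye-openness score (0.0 = fully closed, 1.0 = wide open)."""
        self.window.append(eye_openness < self.closed_threshold)
        if len(self.window) < self.window.maxlen:
            return False                           # not enough history to judge yet
        perclos = sum(self.window) / len(self.window)
        return perclos > self.alarm_perclos        # True = sound the alarm, then take over


# Usage: feed one score per video frame from the (hypothetical) eye tracker
monitor = DrowsinessMonitor()
drowsy = monitor.update(0.85)   # eyes wide open on this frame; no alarm yet
```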
The transition to intelligent cars will not happen immediately. First, the military will deploy these vehicles and in the process work out any kinks. Then robotic cars will enter the marketplace, appearing first on long, boring stretches of interstate highways. Next, they will appear in the suburbs and large cities, but the driver will always have the ability to override the computer in case of an emergency. Eventually, we will wonder how we could have lived without them.
FOUR WALL SCREENS
Not only will computers relieve the strain of commuting and reduce car accidents, they will also help to connect us to friends and acquaintances. In the past, some people have complained that the computer revolution has dehumanized and isolated us. Actually, it has allowed us to exponentially expand our circle of friends and acquaintances. When you are lonely or in need of company, you will simply ask your wall screen to set up a bridge game with other lonely individuals anywhere in the world. When you want some assistance planning a vacation, organizing a trip, or finding a date, you will do it via your wall screen.
In the future, a friendly face might first emerge on your wall screen (a face you can change to suit your tastes). You will ask it to plan a vacation for you. It already knows your preferences and will scan the Internet and give you a list of the best possible options at the best prices.
Family gatherings may also take place via the wall screen. All four walls of your living room will have wall screens, so you will be surrounded by images of your relatives from far away. In the future, a relative may not be able to visit in person for an important occasion. Instead, the family may gather around the wall screen and celebrate a reunion that is part real and part virtual. Or, via your contact lens, you can see the images of all your loved ones as if they were really there, even though they are thousands of miles away. (Some commentators have remarked that the Internet was originally conceived as a “male” device by the Pentagon, that is, it was concerned with dominating an enemy in wartime. But now the Internet is mainly “female,” in that it’s about reaching out and touching someone.)
Teleconferencing will be replaced by telepresence—the complete 3-D images and sounds of a person will appear in your glasses or contact lens. At a meeting, for example, everyone will sit around a table, except some of the participants will appear only in your lens. Without your lens, you would see that some of the chairs around the table are empty. With your lens, you will see the image of everyone sitting in their chairs as if they were there. (This means that each participant will be filmed by a special camera while seated at a similar table, and their images will then be sent over the Internet.)
In the movie Star Wars, audiences were amazed to see 3-D images of people appearing in the air. But using computer technology, we will be able to see these 3-D images in our contact lens, glasses, or wall screens in the future.
At first, it might seem strange talking to an empty room. But remember, when the telephone first came out, some criticized it, saying that people would be speaking to disembodied voices. They wailed that it would gradually replace direct person-to-person contact. The critics were right, but today we don’t mind speaking to disembodied voices, because it has vastly increased our circle of contacts and enriched our lives.
This may also change your love life. If you are lonely, your wall screen will know your past preferences and the physical and social characteristics you want in a date, and then scan the Internet for a possible match. And since people sometimes lie in their profiles, as a security measure, your screen will automatically scan each person’s history to detect falsehoods in their biography.
FLEXIBLE ELECTRONIC PAPER
The price of flat-screen TVs, once more than $10,000, has dropped by a factor of about fifty just within a decade. In the future, flat screens that cover an entire wall will also fall dramatically in price. These wall screens will be flexible and superthin, using OLEDs (organic light-emitting diodes). They are similar to ordinary light-emitting diodes, except they are based on organic compounds that can be arranged in a polymer, making them flexible. Each pixel on the flexible screen is connected to a transistor that controls the color and intensity of the light.
Already, the scientists at Arizona State University’s Flexible Display Center are working with Hewlett-Packard and the U.S. Army to perfect this technology. Market forces will then drive down the cost of this technology and bring it to the public. As prices go down, the cost of these wall screens may eventually approach the price of ordinary wallpaper. So in the future, when putting up wallpaper, one might also be putting up wall screens at the same time. When we wish to change the pattern on our wallpaper, we will simply push a button. Redecorating will be so simple.
This flexible screen technology may also revolutionize how we interact with our portable computers. We will not need to lug heavy laptop computers with us. The laptop may be a simple sheet of OLEDs we then fold up and put in our wallets. A cell phone may contain a flexible screen that can be pulled out, like a scroll. Then, instead of straining to type on the tiny keyboard of your cell phone, you may be able to pull out a flexible screen as large as you want.
This technology also makes possible PC screens that are totally transparent. In the near future, we may be staring out a window, wave our hands, and suddenly the window becomes a PC screen, or displays any image we desire, even the view from a window thousands of miles away.
Today, we have scrap paper that we scribble on and then throw away. In the future, we might have “scrap computers” that have no special identity of their own. We scribble on them and discard them. Today, we arrange our desk and furniture around the computer, which dominates our office. In the future, the desktop computer might disappear and the files will move with us as we go from place to place, from room to room, or from office to home. This will give us seamless information, anytime, anywhere. Today at airports you see hundreds of travelers carrying laptop computers. Once at the hotel, they have to connect to the Internet; and once they return home, they have to download files into their desktop machines. In the future, you will never need to lug a computer around, since everywhere you turn, the walls, pictures, and furniture can connect you to the Internet, even if you are in a train or car. (“Cloud computing,” where you are billed not for computers but for computer time, treating computation like a utility that is metered like water or electricity, is an early example of this.)
VIRTUAL WORLDS
The goal of ubiquitous computing is to bring the computer into our world: to put chips everywhere. The purpose of virtual reality is the opposite: to put us into the world of the computer. Virtual reality was first introduced by the military in the 1960s as a way of training pilots and soldiers using simulations. Pilots could practice landing on the deck of an aircraft carrier by watching a computer screen and moving a joystick. In case of a nuclear war, generals and political leaders from distant locations could meet secretly in cyberspace.
Today, with computer power expanding exponentially, one can live in a simulated world, where you can control an avatar (an animated image that represents you). You can meet other avatars, explore imaginary worlds, and even fall in love and get married. You can also buy virtual items with virtual money that can then be converted to real money. One of the most popular sites, Second Life, registered 16 million accounts by 2009. Several people have earned more than $1 million a year using Second Life. (The profit you make, however, is taxable by the U.S. government, which considers it real income.)
Virtual reality is already a staple of video games. In the future, as computer power continues to expand, you will also be able to visit unreal worlds via your glasses or wall screen. For example, if you wish to go shopping or visit an exotic place, you might first do it via virtual reality, navigating the computer screen as if you were really there. In this way, you will be able to walk on the moon, vacation on Mars, shop in distant countries, visit any museum, and decide for yourself where you want to go.
You will also, to a degree, have the ability to feel and touch objects in this cyberworld. This is called “haptic technology” and allows you to feel the presence of objects that are computer generated. It was first developed by scientists who had to handle highly radioactive materials with remote-controlled robotic arms, and by the military, which wanted its pilots to feel the resistance of a joystick in a flight simulator.
To duplicate the sense of touch, scientists have created a device attached to springs and gears, so that as you push your fingers forward on the device, it pushes back, simulating the sensation of pressure. As you move your fingers across a table, for example, this device can simulate the sensation of feeling its hard wooden surface. In this way, you can feel the presence of objects that are seen in virtual reality goggles, completing the illusion that you are somewhere else.
To create the sensation of texture, another device allows your fingers to pass across a surface containing thousands of tiny pins. As your fingers move, the height of each pin is controlled by a computer, so that it can simulate the texture of hard surfaces, velvety cloth, or rough sandpaper. In the future, by putting on special gloves, it may be possible to give a realistic sensation of touch over a variety of objects and surfaces.
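As a rough sketch of the pin-array idea (my own simplified example, with invented surface functions, grid size, and units), a controller could look up the height of the virtual surface under each pin as the fingertip moves, and drive the pins to those target heights on every frame.

```python
import math

def surface_height(x, y, texture="sandpaper"):
    """Hypothetical virtual surfaces: height (mm) at point (x, y), coordinates in metres."""
    if texture == "wood":        # hard, nearly flat surface with a faint grain
        return 0.02 * math.sin(x * 40.0)
    if texture == "velvet":      # soft, fine-grained, low-amplitude roughness
        return 0.05 * math.sin(x * 200.0) * math.sin(y * 200.0)
    return 0.3 * math.sin(x * 300.0) * math.cos(y * 300.0)     # coarse "sandpaper"

def pin_heights(finger_x, finger_y, grid=8, pitch=0.001, texture="sandpaper"):
    """Target heights for a grid x grid array of pins, spaced `pitch` metres apart,
    centred under the tracked fingertip position."""
    half = (grid - 1) / 2.0
    return [[surface_height(finger_x + (i - half) * pitch,
                            finger_y + (j - half) * pitch,
                            texture)
             for j in range(grid)]
            for i in range(grid)]

# Each frame, the controller reads the fingertip position and drives the actuators
# toward these target heights, so the skin feels the virtual texture pass underneath.
targets = pin_heights(0.10, 0.05, texture="velvet")
```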
This will be essential for training surgeons in the future, since the surgeon has to be able to sense pressure when performing delicate surgery, and the patient might be a 3-D holographic image. It also takes us a bit closer to the holodeck of the Star Trek series, where you wander in a virtual world and can touch virtual objects. As you roam around an empty room, you can see fantastic objects in your goggles or contact lens. As you reach out and grab them, a haptic device rises from the floor and simulates the object you are touching.
I had a chance to witness these technologies firsthand when I visited the CAVE (cave automatic virtual environment) at Rowan University in New Jersey for the Science Channel. I entered an empty room, where I was surrounded by four walls, each wall lit up by a projector. 3-D images could be flashed onto the walls, giving the illusion of being transported to another world. In one demonstration, I was surrounded by giant, ferocious dinosaurs. By moving a joystick, I could take a ride on the back of a Tyrannosaurus rex, or even go right into its mouth. Then I visited the Aberdeen Proving Ground in Maryland, where the U.S. military has devised the most advanced version of a holodeck. Sensors were placed on my helmet and backpack, so the computer knew the exact position of my body. I then walked on an Omnidirectional Treadmill, a sophisticated treadmill that allows you to walk in any direction while remaining in the same place. Suddenly I was on a battlefield, dodging bullets from enemy snipers. I could run in any direction, hide in any alleyway, sprint down any street, and the 3-D images on the screen changed instantly. I could even lie flat on the floor, and the screens changed accordingly. I could imagine that, in the future, you will be able to experience total immersion: engaging in dogfights with alien spaceships, fleeing from rampaging monsters, or frolicking on a deserted island, all from the comfort of your living room.
MEDICAL CARE IN THE NEAR FUTURE
A visit to the doctor’s office will be completely changed. For a routine checkup, when you talk to the “doctor,” it will probably be a robotic software program that appears on your wall screen and that can correctly diagnose up to 95 percent of all common ailments. Your “doctor” may look like a person, but it will actually be an animated image programmed to ask certain simple questions. Your “doctor” will also have a complete record of your genes, and will recommend a course of medical treatments that takes into account all your genetic risk factors.
To diagnose a problem, the “doctor” will ask you to pass a simple probe over your body. In the original Star Trek TV series, the public was amazed to see a device called the tricorder that could instantly diagnose any illness and peer inside your body. But you do not have to wait until the twenty-third century for this futuristic device. Already, MRI machines, which weigh several tons and can fill up an entire room, have been miniaturized to about a foot, and will eventually be as small as a cell phone. By passing one over your body, you will be able to see inside your organs. Computers will process these 3-D images and then give you a diagnosis. This probe will also be able to determine, within minutes, the presence of a wide variety of diseases, including cancer, years before a tumor forms. This probe will contain DNA chips, silicon chips that have millions of tiny sensors that can detect the presence of the telltale DNA of many diseases.
Of course, many people hate going to the doctor. But in the future, your health will be silently and effortlessly monitored several times a day without your being aware of it. Your toilet, bathroom mirror, and clothes will have DNA chips that silently check whether cancer colonies of only a few hundred cells are growing in your body. You will have more sensors hidden in your bathroom and clothes than are found in a modern hospital or university today. For example, simply from your breath on a mirror, the DNA of a mutated protein called p53, which is implicated in 50 percent of all common cancers, can be detected. This means that the word tumor will gradually disappear from the English language.