Metaskills are no different, just pitched at a higher level of knowing. For example, if you’ve mastered the metaskill of playing sports, your broad range of experience may reveal some patterns that you can then transfer to golf, making that particular game easier to learn.
But there’s more. Metaskills have a superpower—they can be reflexive. Reflexivity allows you to apply a metaskill to itself, not just to other skills, thereby multiplying the effect. While the ability to learn is powerful, the ability to learn how to learn is unstoppable.
The problem today is that schools don’t teach metaskills. They barely teach ordinary skills, since, by nature, skills are harder to measure than academic knowledge. It’s easier to check the correctness of math answers, for example, than the quality of thought that went into them. It’s as if we believe metaskills enter the body by osmosis. “You may never use calculus,” the argument goes, “but the experience will teach you how to solve logical problems.” If problem solving is important, why not teach it as a metasubject, and use calculus, statistics, philosophy, physics, debate, and other subjects as expressions of it?
Many of our policymakers believe if we just double down on testing standards and push harder on STEM subjects—science, technology, engineering, mathematics—we’ll revive the economy and compete better with countries that are taking our jobs. This is not a strategic direction for any developed country. The world doesn’t want human robots. It wants creative people with exceptional imagination and vision—and standardized testing won’t get us there.
The Institute for the Future, on behalf of the Apollo Research Institute, took a long look at the workplace of tomorrow. It issued a report called “Future Work Skills 2020,” which identified these six drivers of change:
1. Extreme longevity. Medical advances are steadily increasing the human lifespan, which will change the nature of work and learning. People will work longer and change jobs more often, requiring lifelong learning, unlearning, and relearning.
2. The rise of smart machines and systems. Automation is nudging human workers out of jobs that are based on rote, repetitive tasks.
3. Computational world. Massive increases in the number and variety of sensors and processors will turn the world into a programmable system. As the amount of data increases exponentially, many new roles will need computational thinking skills.
4. New media ecology. New communication tools are requiring media literacies beyond writing. Knowledge workers will be asked to design presentations, make models, and tell stories using video and interactivity.
5. Superstructed organizations. Social technologies are driving new forms of production and value creation. Superstructing means working at the extreme opposites of scale, either at very large scales or at very small scales.
6. Globally connected world. Increased interconnectivity is putting diversity and adaptability at the center of organizational operations. People who can work in different cultures, and work virtually, will deliver extra value to companies.
These future scenarios will demand metaskills such as sense-making, determining the deeper meaning of what is being expressed; social intelligence, the ability to connect with others; adaptive thinking, the ability to imagine solutions beyond the rote; a design mindset, the ability to prototype innovative outcomes; and cognitive load management, the ability to filter out nonessential information and focus on the essential problem at hand.
In Florence during the Renaissance, the archetype of l’uomo universale, the universal man, was born. The “Renaissance Man” was a person well-versed in all branches of knowledge, and capable of innovation in most of them. The total body of knowledge in the 16th century was modest enough that one person could hope to get his arms around it.
There’s a growing recognition that the great advances of the future will come not from a single man or woman, but from the concentrated effort of a group. The operating principle today is, “None of us are as smart as all of us.” Yet to activate the creativity of a group—whether it’s a team, a company, a community, or a nation—we’ll need to bring our best selves to the party. We’ll need to come with our skills, our metaskills, and our full humanity. In the postindustrial era, success will no longer hinge on promotion or job titles or advanced degrees. It will hinge on mastery.
As long as business innovation is in the hands of the few, wealth will be distributed unfairly. What metaskills do is democratize creativity, spreading the responsibility for change more evenly and building a stronger middle class—the best-known engine for economic growth.
Congratulations, you’re a designer
Given the seriousness of the problems we face today, we need all hands on deck. A few creative specialists stashed here and there in the back rooms of our organizations won’t be enough to crack complex problems like environmental responsibility, sustainable energy, or food production for seven billion people. These are called “wicked problems.” A wicked problem is any puzzle so persistent, pervasive, or slippery that it can seem insoluble. You can never really “solve” a wicked problem. You can only work through it.
“Working through problems” is a phrase you commonly hear in design circles, since most designers know that there is never a complete or final answer to anything. There are only provisional answers that lie somewhere on a scale from bad to good. Designers are therefore comfortable—or at least not too uncomfortable—with the task of bringing order to complexity and ambiguity. They’re accustomed to cutting cubes out of fog.
Design as a distinct profession emerged only in the 20th century. It came out of the divide-and-conquer approach to production, in which one broke a complex process into its constituent parts so that each part could be studied and streamlined. Before that, designing was part of a general activity that included problem solving, form giving, and execution. When it finally became a discipline in its own right, with its own professional organizations and special history, design became more detached from the industrial world that spawned it.
In my last book, The Designful Company, I explained how organizations can transform themselves in order to harness innovation as a competitive advantage. The secret is simple: If you want to innovate, you have to design. Design and design thinking—as opposed to business thinking—is the core process that must be mastered to build a culture of nonstop innovation.
The problem with traditional business thinking is that it has only two steps—knowing and doing. You “know” something, either from past experience or business theory, then you do something. You put your knowledge directly into practice. Yet if you limit yourself to what you already know, your maneuver will necessarily be timid or imitative. Traditional business thinking has no way of de-risking bold ideas, so it simply avoids them. This is not a recipe for innovation but for sameness.
Design thinking fixes this deficiency. It inserts a middle step between knowing and doing called making. Making is the process of imagining and prototyping solutions that weren’t on the table before. While this concept is easy to grasp, it’s difficult to practice. Why? Because new solutions, by definition, cannot be drawn directly from an organization’s repertoire of past responses. Neither can they be found in case studies or business books. They’re new. And since true innovation is not a best practice, it sets off alarm bells in the boardroom: “If no one has done this before,” the executive asks, “why should we take a chance? Why not just wait until someone else tries it, then jump on board if it works?” Of course you can, if your goal is to follow. But if your goal is to lead, you have to embrace design. That’s why innovation is so hard, and why it confers such a powerful advantage on those who master it.
Designing is not the exclusive territory of designers. If it were, the amount of innovation in the world would be a mere fraction of what it is today. A designer is simply someone who doesn’t take yes for an answer—a person who searches for better and better solutions to what could be, when others are satisfied with what is.
According to Nobel Prize winner Herbert Simon, a pioneer in artificial intelligence, “A designer is anyone who works to change an existing situation into a preferred one.” Using this definition, anyone can be a designer. Even you. And while you may not have the aesthetic sensitivity of a trained professional, you’re nevertheless following the same thought process that guides the work of automotive designer Chris Bangle or international architect Rem Koolhaas.
Design is not limited to the styling of cars or the planning of buildings. It can determine the success of any man-made object, process, or experience. It can be used to improve decision-making, corporate strategy, or government policy. It can shape the letterforms on this page so they say more than the words by themselves. Or it can show us how to recombine DNA to make living systems.
“Designers are in the miracle business,” says Dr. Carl Hodges, founder of the Seawater Foundation. He’s not intimidated by what is. He’s an innovative scientist who’s using the rise in sea level caused by global warming to turn coastal deserts into agricultural Edens.
The greening of deserts happens to be a good metaphor for the experience of furniture-maker Steelcase. Said President James Hackett, “Design can bring back value where it has been sucked completely dry by commoditization.”
And the former CEO of Procter & Gamble, A.G. Lafley, underwent a religious conversion to design as he injected new life into his portfolio of brands. “I’m not doing this because I’m a frustrated liberal arts major,” he said. “Good design is serious business.”
The future in your hands
As Watson’s performance on Jeopardy! suggests, the competition between people and machines is heating up. The Robot Curve will continue to take jobs away, and we’ll continue to search for higher ground where our contributions will have uniqueness and value. Our machines are forcing us to confront who we are.
Homo sapiens is Latin for knowing man. Are we the species that succeeds by knowing? If our machines end up knowing more than we do—then what? Will we be the slaves of “our new computer overlords”?
The fact is, our technology is so interesting that we often forget to credit the special gift that made it possible. It wasn’t just knowing that brought us to this stage of our evolution—it was making. Our ability to make and use tools, starting with simple hammers and axes, and moving to spears, brushes, needles, grinding stones, and horticultural tools, came from a two-way conversation between our brains and our hands. Our hands—with their powerful grip, articulate fingers, and opposable thumbs—gave us the evolutionary advantage that created our superior intellect. In other words, our hands made our brains as much as our brains controlled our hands.
A turning point in human evolution may have come with the invention of language some 50,000 years ago. Language unleashed a torrent of creativity, including the invention of new tools, music, art, and mythmaking, plus enough survival and navigational skills to migrate thousands of miles from Africa to Europe and Australia. Without language, it seems, our culture would have been constrained to very slow progress indeed.
When a baby reaches toward her mother and utters her first word, we usually take this as evidence of language. It’s not. The baby has no idea what “mama” means. Her actual “first word” is her outstretched hand. “Mama” is simply the attention getter for the real message, which is something like “Mama, come here,” or “Mama, pick me up,” or “Mama, give me that.” Over time, she learns that different words stand for different things, and her language skills take off. But notice that the gesture—the outstretched hand and extended fingers—precedes the words for it.
All languages use similar structural elements, even those that developed in cultural isolation. How can this be? Are language skills hereditary, or even instinctual? No. It’s more likely that they’re simply patterned on the universal human experience of manipulating physical objects—in other words, moving things around with our hands. Neurologist Frank R. Wilson has written that “evolution has created in the human brain an organ powerfully disposed to generate rules that treat nouns as if they were stones and verbs as if they were levers or pulleys.” While not all languages use nouns and verbs in exactly the same way, they all have rules that treat words as building materials—to be selected, shaped, and placed into meaningful structures.
When we talk about thinking, we often use words that are metaphors for the hand. We hold onto a thought or handle a problem. We cling to beliefs or attempt to manipulate people. We reach for a word or grasp a situation. We experience things firsthand and touch upon subjects. We feel our way forward and point to a solution. Our numbering system began with our fingers, so now we digitize information. There’s a reason we talk with our hands—we think with our hands. The evolution of our hands pushed our brains forward, and our brains pushed back.
Over the two millennia since Plato, and especially during the last 500 years after the Renaissance, academic education in the West has been successful in separating the hand from the brain. We’ve decided that making things is less valuable than knowing things, and therefore making has a less exalted place in the classroom. This is not only wrong, but it denies the very evolutionary advantage that makes us human. And now, with the advent of ubiquitous information, our knowing muscles seem overdeveloped while our making muscles seem atrophied.
Biology would suggest reversing course. According to Wilson, “The most effective techniques for cultivating intelligence aim at uniting—not divorcing—mind and body.” If our goal is to reshape the world, we’ll need to cultivate a new set of talents in which making is rejoined with knowing.
These are the five talents—the metaskills—that I believe will serve us best in an age of nonstop innovation:
Feeling, including intuition, empathy, and social intelligence.
Seeing, or the ability to think whole thoughts, also known as systems thinking.
Dreaming, the metaskill of applied imagination.
Making, or mastering the design process, including skills for devising prototypes.
Learning, the autodidactic ability to learn new skills at will. Learning is the opposable thumb of the five talents, since it can be used in combination with the other four.
The bright thread that weaves through all five metaskills is aesthetics, a set of sensory-based principles that can stitch together the new and the beautiful. After all, would you really want to live in a world of robots, extended lifespans, body implants, space travel, and virtual reality if it weren’t also filled with delight? The primary art form of our time is technology. To stay human—and become even more human—we need to imbue our inventions with the soul-stirring attributes of aesthetics. It’s both our legacy and our destiny.
Across the walls and ceiling of Pech Merle, in the flicker of candlelight, the fluidly stylized drawings of horses, mammoths, and reindeer seem to thunder as they fly overhead. Go ahead. Place your hand over the stenciled hand of the ancient cave painter. Even after 25,000 years of continuous human evolution, it will still fit.
FEELING
Brain surgery, self-taught
One advantage of computers is that they never get emotional. They’re not misled by their dreams or desires. They’re not seduced by the lazy answer or the simplistic story. They’re not subject to mood swings, and they’re not swayed by irrelevant data. In short, they don’t suffer from the cognitive biases that make humans so irrational. Computers simply follow the instructions they’ve been given—quickly and accurately. That’s their charm.
Yet here’s a question no one ever asks: If the ability to make fast, accurate calculations is so valuable, why hasn’t evolution equipped us to think like computers? Isn’t four million years enough time to endow our brains with at least the computing power of, say, a cheap calculator? Or is computerlike processing a biological impossibility?
Apparently it’s a distinct possibility, given the amazing feats of mathematical savants like Daniel Tammet. Tammet can do cube roots quicker than a computer and recite pi out to 22,514 decimal places. He can multiply any number by any number, and learn languages as easily as others learn capital cities. He can read two books simultaneously, one with each eye, and recall the details of all 7,600 books he’s read so far. As far as Tammet is concerned, there is no calculating needed. He simply “sees” all this information in a flash, as others would see a photograph. Numbers, for example, look like shapes, colors, sounds, and gestures. The number two is a motion and number five is a clap of thunder.
Daniel is a high-functioning autistic, and, along with ten percent of autistic people, is a savant. Other savants have mastered a wide range of challenges, from memorizing every line of Grove’s Dictionary of Music—all nine volumes of it—to accurately measuring long distances without any instruments. How many people could draw a precise map of the London skyline after viewing it briefly from a helicopter? Or play Tchaikovsky’s Piano Concerto Number One the first time they heard it—without a single piano lesson? These are the kinds of feats we expect from computers, but we’re amazed to see them performed by humans.
Allan Snyder, professor at the Centre for the Mind at The University of Sydney, believes we could all do these kinds of things with a little more insight into brain mechanics. “Savants usually have had some kind of brain damage,” he says. “However, I think it’s possible for a perfectly normal person to have access to these abilities.” Yet do we really want them? If the human brain is fully capable of machinelike computation, and if nature has supplied us with a steady stream of savants, why hasn’t natural selection figured out how to work the “savant gene” into the general population?
We have to consider the possibility that computerlike thinking is not central to our success on Earth; that there may be another set of abilities more important to our continued survival than calculation and memorization; and that what makes us truly human isn’t so much our rational brain as our emotional brain. This hypothesis would seem to contradict 2,000 years of Western orthodoxy, but the more we use modern neuroscience as a lens, the more likely it appears. René Descartes famously wrote: Cogito ergo sum, or “I think, therefore I am.” Yet Aristotle may have been closer to the mark nearly 2,000 years earlier when he wrote: Sentio ergo sum, or “I feel, therefore I am.” Our emotions are what tell us that we’re more than machines.
Metaskills: Five Talents for the Robotic Age