All this combined to lay the foundations for the developing phenomenon of what’s currently termed Big Data, which has essentially revolutionized business analysis. Companies that find a way to use this bounty of harvested data wisely gain a priceless support instrument, not only at an operational level but also in terms of strategic planning. Of course, support in making operational decisions is important, but using Big Data only to this end, despite its current domination, means failing to fully utilize its potential to improve strategic decision-making (this is excellently illustrated by the words often attributed to Mark Twain: “Most people use statistics like a drunk man uses a lamppost—more for support than illumination”). Making full use of Big Data, however, enables us to spot new market trends, new consumer behaviors, and the market potential for new products, services, and breakthrough inventions.
Big Data is a great ally for the inquiry approach. It’s now possible to access highly detailed information that might give us the answers to the questions that are troubling us.
Rule #13
Don’t be a “decision drunk”—use data for illumination, not only for support. Data can be a great ally when properly analyzed.
Regardless of the industry, technology is the hottest topic in business today. IBM’s Global C-suite Study, conducted every two years with company presidents and CEOs, has been showing a clear trend in recent years. While in 2004 “technological factors” were rated sixth by the respondents in a list of factors identified as influencing the participating companies’ futures, by 2012 they were in first place, and have remained there ever since.
Technological factors include more than just processing data. The Internet of Things (IoT) is important here, as it enables devices connected to the Internet to communicate with one another in real time to optimize their operation (in other words, to make instantaneous decisions based on data without human involvement). All devices labeled “smart” use some application of the IoT: from smart cities through smart grids to smart homes. The number of appliances operating within the IoT today is in the tens of billions, 2 and the renowned consulting firm McKinsey estimates that by 2025, the potential economic impact of the IoT will be worth $11.1 trillion. 3 With the development of deep learning and deep neural networks (whose roots reach back to Frank Rosenblatt’s perceptron of 1957), creating truly “intelligent” AI is just a question of time—AI in driverless cars is already becoming more reliable than a human decision-maker behind the steering wheel.
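To make the idea of devices deciding without us more concrete, here is a minimal sketch in Python of the kind of rule a hypothetical smart thermostat might run. The sensor reading, the command to the heating system, and the thresholds are all invented for illustration and are not taken from any real product.

```python
# A minimal, hypothetical sketch of an IoT-style decision loop: a "smart"
# thermostat reads a sensor and adjusts the heating with no human involved.
# Sensor, actuator, and thresholds are invented for illustration only.
import random
import time

TARGET_C = 21.0     # desired room temperature
TOLERANCE_C = 0.5   # dead band, so the device doesn't switch constantly

def read_temperature() -> float:
    """Stand-in for a real temperature sensor."""
    return 20.0 + random.uniform(-2.0, 2.0)

def set_heating(on: bool) -> None:
    """Stand-in for a command sent to the heating system."""
    print(f"heating {'ON' if on else 'OFF'}")

def control_loop(iterations: int = 5) -> None:
    for _ in range(iterations):
        temperature = read_temperature()
        if temperature < TARGET_C - TOLERANCE_C:
            set_heating(True)    # too cold: switch the heating on
        elif temperature > TARGET_C + TOLERANCE_C:
            set_heating(False)   # too warm: switch it off
        time.sleep(0.1)          # a real device would wait much longer

if __name__ == "__main__":
    control_loop()
```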
There is also enormous potential in blockchain technology, which some consider the most important innovation in the financial world in many years. The uses of blockchain go far beyond banking—in September 2017, for example, the French insurance giant AXA introduced blockchain technology and smart contracts to automatically compensate passengers for flight delays.
Regardless of the accuracy of the predictions, we can be sure of one thing: in the years to come, rapidly changing technology will have ever-greater influence, as will rapidly changing methods of gathering, analyzing, and using data. This, in turn, will lead to increased automation of decision-making processes. If we don’t fully understand these changes, we should seek out people who can navigate seamlessly through this new reality because they were born into it.
The passing of time means the arrival in the world of new generations who, every now and then, radically alter the view of reality through the values and attitudes they express. An example of such a change was the appearance between 1945 and 1965 4 of the Baby Boomers, who saw themselves as a generation of change. Among other things, the sexual revolution of the 1960s and ’70s occurred during the period in which Baby Boomers were young adults. That period also heralded the arrival of the first wave of entrepreneurs availing themselves of new technologies and building businesses based on new values—drawing not on traditional “hard resources” and making things that lasted, but on innovation and human capital. Members of that generation include Bill Gates, Paul Allen, Larry Ellison, and Richard Branson. Steve Jobs was a Baby Boomer, too. With time, the Baby Boomers began to be succeeded by Generation X, people born between 1965 and 1983. 5 A characteristic feature of this generation was the further liberalization of attitudes and habits (affecting religion, race, and sexuality), accompanied by a less revolutionary approach to reality. Where the Baby Boomers genuinely aspired to overthrow earlier ideals and behaviors, replacing them with their own, the majority of Generation X was keener to fit into the existing scheme of things (hence, for example, the blossoming trend in recent years for classic corporate careers and unfettered consumerism).
In the middle of the 1980s, there appeared a new generation, named Generation Y, or, from the turn of the century, Millennials. We can easily see that Generation Y consists of people who have spent almost their entire conscious lives enjoying the benefits of virtual lifestyles and technologies that were innovations for earlier generations, who had to learn to use them. 6 Research carried out by Boston Consulting Group in 2012 showed that the basic difference between Generations X and Y lies (not surprisingly) in how they use the Internet. While both generations spend a similar amount of time online, Millennials spend that time predominantly on social media platforms, building and expanding their virtual world. Social media and mobile devices have thus become fundamental means of communication for Generation Y.
This natural ability that new generations have for assimilating and using technologies that constituted a sea change for older generations is brilliantly captured by Don Tapscott (himself a classic representative of the Baby Boomers) in his book Grown Up Digital, where he describes a scene that played out one day in 1997 in his own home, with his twelve-year-old son Alex in the unwitting lead role. That day, Alex discovered that Don, a noted figure in the world of new technology, was taking part in a TV program, during which he was to demonstrate to viewers a newfangled piece of technology called the Internet. When Alex’s mom told him about the program, he reacted like this:
That’s the dumbest TV show I’ve heard of. Why would anyone want to watch Dad use the Internet? [...] Mom, this is so embarrassing. All my friends are going to see it. You don’t need to show people how to use the Internet. 7
When Don heard about Alex’s reaction, he asked his son why he thought that way, and got the following reply:
Dad, no offense, but I think you adults are obsessed with technology. You call this a technology revolution and you are so fascinated by how technology works. Imagine some other technology, Dad. [...] The television—is that a technology to you, Dad? Imagine a TV show where people watch you surf television! Wow! Let’s see if my dad can find a football game on television! Now my dad is going to try and find a sitcom. 8
If that wasn’t enough, Don’s daughter, one year older than Alex, chipped in with her two cents:
Yeah Dad, how about a refrigerator? Remember, it’s a technology, too. Why don’t we have a TV show where we can all watch you surf the fridge? Check this out, my dad has found some meatloaf... 9
That just about sums it up. Something that seemed a breakthrough to Generation X and the Baby Boomers, and that they had to make a conscious effort to adapt to, is to Millennials, who were born into a digital world, as natural as breathing.
Rule #14
Never ignore the values and convictions of other generations, especially those only just entering the market. Even if their influence on decision-making today is minimal, the new normal means this may change sooner than you expect.
One of the most interesting mechanisms enabled by technology and social media is crowdsourcing, a term coined by Jeffrey Howe in Wired magazine in June 2006. Put simply, crowdsourcing involves using modern technologies to engage large groups of people—often millions—in a specific task. It is used both in business and in projects of a social and academic nature.
One of the oldest examples of crowdsourcing is the SETI program, which has been running since the 1960s. SETI—the Search for Extraterrestrial Intelligence—was initiated in 1960 by the American astronomer Frank Drake of Cornell University, and its aim was to analyze signals from outer space registered by radio telescopes, looking for specific patterns that might indicate that the signals had an intelligent origin. The first targets were the stars Tau Ceti and Epsilon Eridani (chosen because they’re at the relatively close distances of 12 and 10.5 light-years, respectively), which the Green Bank Observatory’s radio telescope listened in on at a frequency of 1,420 MHz. While the project didn’t provide the hoped-for results, the researchers weren’t deterred, and the next phase of observations extended to monitoring as many as 650 stars over four years. One of the listening instruments used in this stage was Ohio State University’s Big Ear radio telescope. With the help of this device, on August 15, 1977, an unusually strong signal was registered. It was named the Wow! signal, after a note made by Dr. Jerry Ehman, who observed it. The signal came from the direction of the constellation Sagittarius and was never repeated, but it gave scientists new hope of using their method to find intelligent life elsewhere in the universe. Other institutions (including the University of California, Berkeley) joined the SETI program, and even NASA showed an interest. More and more listening installations were used, including the largest single radio telescope in the world, near Arecibo in Puerto Rico.
Initially, only the scientific personnel of a dozen or so institutions were involved. It soon emerged, though, that there was a prodigious amount of data to be analyzed, far exceeding the capabilities of the computers the scientists had at their disposal, and so a bottleneck developed. In 1999, someone had a bright idea. Seeing the limits of their computing power and the impossibility of adding further supercomputers to their inventory, the scientists decided to scale out instead. They asked hundreds of thousands of Internet users from around the world to participate in the project, inviting them to download onto their home computers a special screensaver that analyzed packets of data downloaded from the project’s servers while the computer wasn’t otherwise in use. When the owner wanted to use their computer, the screensaver and the analysis were switched off.
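The mechanics can be sketched, very roughly, as a simple client loop: fetch a chunk of data, analyze it while the machine is idle, and report back. The Python below is a hypothetical illustration of that volunteer-computing pattern, not SETI@home’s actual software; the data, the idleness check, and the toy “analysis” are all invented.

```python
# A rough, hypothetical sketch of a volunteer-computing client in the
# SETI@home spirit: fetch a work unit, analyze it while the machine is idle,
# report the result. The data, idleness check, and "analysis" are toys.
import statistics
from typing import List

def fetch_work_unit() -> List[float]:
    """Stand-in for downloading a chunk of radio-telescope data."""
    return [0.1, 0.4, 0.2, 3.9, 0.3, 0.2]  # toy signal samples

def machine_is_idle() -> bool:
    """Stand-in for 'the screensaver has kicked in'."""
    return True

def analyze(samples: List[float]) -> bool:
    """Toy analysis: flag the chunk if its peak towers over the typical level."""
    baseline = statistics.median(samples)
    return max(samples) > 5 * baseline

def report(interesting: bool) -> None:
    """Stand-in for uploading the result to the project's servers."""
    print("candidate signal!" if interesting else "nothing unusual")

if __name__ == "__main__":
    if machine_is_idle():
        report(analyze(fetch_work_unit()))
```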
The SETI@home project was officially launched on May 17, 1999. The scientists hoped to acquire 100,000–150,000 users, but such hopes turned out to be modest, to say the least. Over 5 million people from 233 countries signed up, and their combined computing power provided, over the following years, a total calculating time of over 2 million years (yup!). By mid-2013, the total calculating power of SETI@home was 670 teraflops (that’s 670 trillion floating-point operations per second).
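A quick back-of-envelope calculation puts those numbers in perspective, using only the figures quoted above plus two assumptions of mine made purely for illustration: that contributions were spread evenly across sign-ups, and that the window runs from the 1999 launch to mid-2013.

```python
# Back-of-envelope arithmetic using the figures quoted in the text; the even
# split across all sign-ups and the ~14-year window are assumptions.
participants = 5_000_000        # people who signed up
cpu_years_donated = 2_000_000   # combined calculating time, in years
total_flops = 670e12            # ~670 teraflops by mid-2013
project_years = 14              # May 1999 to mid-2013

per_participant = total_flops / participants
print(f"average contribution: about {per_participant / 1e6:.0f} megaflops per sign-up")

equivalent_machines = cpu_years_donated / project_years
print(f"equivalent to roughly {equivalent_machines:,.0f} computers running nonstop")
```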
While the SETI@home project hasn’t resulted in any contact with extraterrestrial intelligence, it does offer an amazing example of exploiting the possibilities of new technologies in developing crowdsourcing campaigns. Another milestone in this area was Wikipedia, where the combined intellectual effort of millions of Internet users led to the creation of the largest compendium of knowledge in human history. These were just two of many successes that encouraged other firms to attempt something similar. Two interesting examples of business applications of crowdsourcing against a backdrop of open innovation were My Starbucks Idea, set up by the coffee chain, and IdeaStorm, set up by Dell. Another was InnoCentive, an open innovation platform that helped to connect the world of large corporations seeking a solution to a specific problem with that of millions of experts (be they individuals or small organizations) who might be able to help.
When it comes to decision-making, crowdsourcing can be an interesting ally on several levels. First, in the case of decisions affecting the customer, engaging a large number of them gives an organization a unique perspective, so we can very precisely assess which choice will win the greatest approval among those whose opinions are of the greatest importance to us. This approach is also used in some places by local communities and authorities trying to decide how to allocate what are often referred to as citizens’ budgets: a pool of public finances that local residents, not the authorities, can decide how to use by voting for—in their opinion—the most beneficial initiative. The point of this is to ensure that funds are allocated in a way that reflects locals’ priorities. Some businesses have behaved similarly, allowing customers and partners to decide to whom a company distributes its corporate social responsibility (CSR) funds.
Second, crowdsourcing can be used to provoke or deepen the inquiry approach. If we can engage people from different circles and with different points of view in the discussion, we have a better chance of examining a problem from every possible angle.
Third, we can use crowdsourcing in decision-making to multiply the potential for compiling a complete picture of a situation and questioning the status quo, which in the world of the new normal is enormously beneficial. Many companies that take this path use the dispersed leadership approach, in which every employee is expected to display leadership, speaking out about any threats and opportunities they perceive, and actively promoting new approaches and solutions to the company. There will be more on this topic in the Epilogue.
Crowdsourcing and its related tools are also revolutionizing areas that at first glance might not seem threatened by the growth of online social networking mechanisms. An obvious example is the Encyclopædia Britannica, whose activities and product seemed far removed from the dynamically developing new technologies, but which was eventually seriously impacted by new trends and forced to radically reorganize its business model. Traditional taxi and hospitality businesses, confronted by the likes of Uber and Airbnb, have been similarly affected, and have also been forced to admit that crowdsourcing and its related tools have turned their worlds upside down.
Internet communities and the solutions emerging from them, such as crowdsourcing, are the natural environments of and an obvious mode of operating for Generation Y. Due to their nature (virtually free to use, increasingly credible and valuable), they are attracting more and more people of a slightly older vintage, for whom they have also become a priceless tool in making the best possible decisions with the help of Big Data.
Rule #15
The world of data overload is also a world of new possibilities.
Actively seek out opportunities to engage a cost-free force that can radically improve the quality of your decision-making.
Far murkier possibilities lurk in the current and future development of nanotechnology and digital biology. As you probably know, one of the most common reasons for bad decisions is the fallibility of human memory. Scientists disagree wildly in their assessments of the capacity of the human brain: according to Paul Reber, a professor of psychology at Northwestern University, we have at our disposal a gigantic memory capable of storing around 2.5 petabytes (!) of data; Ralph Merkle, a nanotechnology and cryptography expert, is far more cautious, estimating its capacity at a “mere” several hundred megabytes. This enormous disparity isn’t actually very significant in our current context, because the problem lies elsewhere. As humans, our bottleneck isn’t the quantity of data we can store, but the permanence of its recording. Unlike a computer’s hard drive, where data disappears only through the deliberate action of the user, the human brain often retains data only fleetingly—think of the times you’ve forgotten the five things you went to the store for within minutes of making a mental list of them. We simply forget, whether it’s important data, stakeholders we need to take into account, or our previous experiences in similar situations.
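Just how wild that disagreement is can be made explicit with a tiny calculation, taking “several hundred megabytes” as 300 MB purely as an assumption for illustration.

```python
# The scale of the disagreement, using the two estimates quoted above;
# "several hundred megabytes" is taken here as 300 MB purely for illustration.
PETABYTE_MB = 1_000_000_000     # one petabyte expressed in megabytes

reber_estimate_mb = 2.5 * PETABYTE_MB   # ~2.5 petabytes
merkle_estimate_mb = 300                # "several hundred megabytes"

ratio = reber_estimate_mb / merkle_estimate_mb
print(f"the two estimates differ by a factor of about {ratio:,.0f}")
```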
Our susceptibility to illusion (see Chapter 8) and imperfections in our memory combine to form one of two fundamental barriers to our making full use of the capabilities Big Data provides; the second is the fallibility of intelligence... artificial intelligence, which, in comparison to our amazing brains, is still limited in pattern recognition. We still can’t get computers to emulate the workings of our brains.
It’s only a question of time, though. Ray Kurzweil, the American visionary and futurologist, believes we are hurtling toward an epoch in which these weaknesses will be eliminated, or—if you prefer—there will be a fusion of the strengths of both computers and the human brain. In practice, this phenomenon will take the form of ever better machines enabling faster and better decision-making in situations where a human wouldn’t normally have even a hypothetical chance of reacting in time. The phenomenon of a computer’s being able to fully emulate the functions of the human brain has been named by Kurzweil, borrowing a term from physics, the “singularity.” According to Kurzweil, the singularity will be achieved before 2030, ushering in a whole new reality for making decisions.
The second aspect of improving decision-making processes in the future, and so eliminating the weaknesses of the human brain, looks a little terrifying. According to Kurzweil, the solution to poor memory will no longer be dubiously effective infusions of Chinese herbs. Their role will be taken over by microscopic hard drives implanted into our brains. If you’re thinking right now that such a combination of nanotechnology and bioengineering—an artificial piece of equipment with living tissue—seems like an ethically unacceptable development, think back a little. Just such a fusion took place many years ago and is already commonplace. How else would you describe a pacemaker?
Furthermore, the cost of human genome sequencing dropped from $2.7 billion in 2003 to $1,000 in 2017, and one start-up has announced that it will bring it down to $100 very soon. With genetic engineering tools such as CRISPR-Cas9, we can not only read but also write DNA, which means we will at some point be able to modify the physical and intellectual capacities of a human being (which obviously raises hundreds of ethical issues).
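A short calculation, using only the two figures above, shows just how steep that decline has been; the framing in terms of halvings is my own.

```python
# How steep is the fall from $2.7 billion (2003) to $1,000 (2017)?
# The figures come from the text; the halving-rate framing is an illustration.
import math

cost_2003 = 2_700_000_000
cost_2017 = 1_000
years = 2017 - 2003

reduction = cost_2003 / cost_2017
halvings = math.log2(reduction)
months_per_halving = years * 12 / halvings

print(f"the cost fell by a factor of {reduction:,.0f}")
print(f"that is about {halvings:.0f} halvings, one roughly every {months_per_halving:.1f} months")
```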