While many individuals at companies have job titles related to data science, most are unskilled at machine learning and AI. Businesses still think of data scientists as analysts who perform business intelligence using dashboards, or at best, as statisticians who sample from data sets to draw static inferences. Most organizations are just starting their evolution toward AI and do not have a strong bench of AI practitioners.
Since 2000, the number of AI startups has increased by a factor of 14, while venture investment in AI startups has grown sixfold during the same period. And the share of jobs requiring AI skills has grown by a factor of nearly 4.5 since 2013.34 The burgeoning global demand for data scientists and managers skilled in analytics has captured the attention of politicians, governments, corporations, and universities worldwide.35
Of course, the data scientist pipeline starts with training at university. The rise in high-paying data science jobs has sparked a surge in enrollment in data science programs: The number of graduates with degrees in data science and analytics grew by 7.5 percent from 2010 to 2015, outpacing other degrees, which grew by only 2.4 percent collectively.36 Today, more than 120 master's programs and 100 business analytics programs are available in the U.S. alone. To train existing workers, boot camps, massive open online courses (MOOCs), and certificates have grown in popularity and availability.
In 2018, LinkedIn reported that data scientist roles in the U.S. had grown by 500 percent since 2014, while machine learning engineer roles had increased by 1,200 percent.37 Another 2017 study found that by 2020 the total number of data science and analytics jobs would increase to 2,720,000, with impact across a broad range of industries.38 On job sites like Glassdoor and LinkedIn, machine learning engineers, data scientists, and big data developers are among the most sought-after roles, with demand coming from numerous industries.
As a result, companies are paying high prices to acquire data scientists. In 2014, for example, Google acquired AI startup DeepMind Technologies, with just 75 employees, for an estimated $500 million—more than $6 million per employee.39 The acquisition produced at least two significant results. First, it led to the development of AlphaGo, the first AI program to defeat a top professional player at the ancient Chinese board game Go—which turned out to be a “Sputnik moment” for China, propelling its government to make AI a top strategic priority.40 Second, DeepMind’s AlphaFold algorithm won the 2018 Critical Assessment of Structure Prediction (CASP) competition—considered the “virtual protein-folding Olympics, where the aim is to predict the 3D structure of a protein based on its genetic sequence data.”41 Protein structure prediction is an important area of biomolecular research with significant potential to improve our understanding of diseases and the discovery of new drugs.
To address the overall demand for data science skills, governments have begun to act. The UK’s Open Data Institute and Alan Turing Institute, the European Commission’s 2014 big data strategy, and the U.S. federal government’s 2016 Big Data Research and Development Strategic Plan are all examples of coordinated efforts to address the need for trained data scientists. China, which has made AI a central pillar of its thirteenth Five-Year Plan and New Generation of Artificial Intelligence Development Plan, is investing massively in AI research, including university programs to train data scientists.42 But China also anticipates data scientist shortfalls: In 2016, its information technology ministry estimated the country will need 5 million more AI workers to satisfy its needs.
Globally, more traditional research programs are contributing to core research and are publishing papers at a rapid rate. Leading institutions include MIT, Carnegie Mellon, Stanford, and USC in the U.S.; Nanyang Technological University, National University of Singapore, Hong Kong Polytechnic University, Chinese University of Hong Kong, the Institute of Automation, Tsinghua University, and Chinese Academy of Sciences in Asia; the University of Granada and Technical University of Munich in Europe; as well as others in Canada, Switzerland, Italy, the Netherlands, Australia, and Belgium, to name a few.
Data science programs in the U.S. are expanding across many vectors. In 2014, the University of California at Berkeley launched an online data science master’s program, and it now also offers an executive education program in data science and analytics. More than 30 high schools in California have started offering data science classes for juniors and seniors.43 Over the longer term, a significant focus on mathematical and computer science education starting in the K-12 curriculum will be required to address the AI skills gap.
There are also a growing number of “boot camps” and training programs for aspiring data scientists. These programs take in professionals with strong technical backgrounds—e.g., in mathematics, physics, or engineering—and train and prepare them for AI careers. Some of these boot camp courses are available online. For example, Coursera offers an online curriculum for both machine learning and deep learning.44 Others are in-person, such as the Insight Data Science program in the San Francisco Bay Area.45
In addition to data scientists, companies will also increasingly need individuals whom McKinsey calls “translators.”46 Translators can bridge the divide between AI practitioners and the business. They understand enough about management to guide and harness AI talent effectively, and they understand enough about AI to ensure algorithms are properly integrated into business practices.
Without a doubt, we are in a transitional time as organizations retrain their workforces, recruit graduates with AI degrees, and adjust to the many changes driven by AI innovation and adoption. But there is a clear path forward for organizations that realize the inevitability of an AI-powered future and the need to start building AI capabilities now. Today organizations can rely on the expertise of AI advisors and proven technology partners while simultaneously developing their own internal AI competencies.
Succeeding as an AI-Driven Enterprise
We have seen that to succeed with AI, organizations require new technological and business capabilities to manage big data, as well as new skills in data science and machine learning.
A final challenge organizations face in succeeding with AI is implementing the changes in business processes that AI requires. Just as the emergence of the internet drove organizations to change business processes in the 1990s and early 2000s, AI drives a similar if not greater scale of change. Organizations may need to adapt their workforce to accept recommendations from AI systems and provide feedback to those systems. This can be difficult. For example, maintenance practitioners who have been doing their jobs in a specific way for decades often resist the new recommendations and practices that AI algorithms may identify. Therefore, capturing value from AI requires strong leadership and a flexible mindset on the part of both managers and front-line employees.47
For all these reasons, organizations that embark on digital transformation initiatives increasingly engage experienced technology partners to help overcome the challenges of building, deploying, and operating AI-based applications and driving the business process changes required to capture value. Organizations are investing in a new technology stack—which I describe in chapter 10—that provides the capabilities to address these requirements far more efficiently than traditional approaches.
This new generation of technology is increasingly important as AI applications get larger and more complex, particularly as enterprises and value chains are instrumented with sensors and actuation devices—the phenomenon known as the internet of things (IoT). This increases the amount of data available to organizations by orders of magnitude and also increases the fidelity and accuracy of data sets.
Organizations will be challenged to interpret the large amounts of data IoT generates and to leverage these data to take appropriate action in a timely manner. Interpreting and acting on such large data sets will require the application of AI, which will therefore play an important role in unlocking value from IoT. I will describe the IoT phenomenon and its implications for business more fully in the next chapter.
While the threat of missing the digital transformation opportunity is existential, the rewards for embarking on a strategic, organization-wide transformation will be truly game-changing. As studies by PwC, McKinsey, the World Economic Forum, and others show, digital transformation will drive trillions of dollars of new value creation globally over the next decade. Organizations that act now will position themselves to take an outsized share of that prize.
Chapter 7
The Internet of Things
The previous three chapters discussed how the technology trends of elastic cloud computing, big data, and artificial intelligence are driving forces of digital transformation. The fourth trend, the internet of things (IoT), refers to the ubiquitous sensoring of value chains so all devices in the chains become remotely machine addressable in real or near-real time.
I first came across the term “internet of things” in 2007 during a business trip in China. Initially, I assumed the internet of things was only about the sensoring of value chains. But I have since given this a lot of thought, and I’ve discovered what’s happening is more significant and transformational.
With ever-cheaper and lower-power processors and faster networks, computing is rapidly becoming ubiquitous and interconnected. Inexpensive AI supercomputers the size of credit cards are deployed in cars, drones, surveillance cameras, and many other devices. This goes well beyond just embedding machine-addressable sensors across value chains: IoT is a fundamental change in the form factor of computing, bringing unprecedented computational power—and the promise of real-time AI—to every manner of device.
Origin of the Internet of Things
IoT, along with AI, has created one of the most disruptive waves we’ve ever seen in IT and business. IoT allows us to connect low-cost, high-speed chips and embedded sensors through fast networks. At the root of IoT were the introduction of smart, connected products and the hypergrowth of the internet.
Three decades ago, the notion of smart objects was a new idea. Wearable computing devices were proposed by researchers such as Rank Xerox’s Mik Lamming and Mike Flynn, who in 1994 created the Forget-Me-Not, a wearable device that used wireless transmitters “designed to help with everyday memory problems: finding a lost document, remembering somebody’s name, recalling how to operate a piece of machinery.”1 In 1995, MIT’s Steve Mann created a wearable wireless webcam. That same year, Siemens developed the first wireless machine-to-machine (M2M) communication, used in point-of-sale systems and for remote telematics.
In 1999, MIT Auto-ID Center co-founder and Executive Director Kevin Ashton used the term “internet of things” for the first time. In the title of a presentation designed to get the attention of Procter & Gamble’s executive management, he linked the new idea of radio frequency identification (RFID) tags in the supply chain to the intense and growing interest in the internet.2 The use of RFID tags to track objects in logistics is a well-known early example of IoT, and the technology is commonly used today to track shipments, prevent loss, monitor inventory levels, control entry access, and much more.
In fact, industrial uses of IoT took hold ahead of consumer uses. In the late 1990s and early 2000s, a wave of industrial applications emerged following the introduction of M2M communication, with companies like Siemens, GM, Hughes Electronics, and others developing proprietary protocols to connect industrial equipment. Often managed by an onsite operator, these early M2M applications evolved in parallel as IP-based wireless networks gained traction with office workers using laptops and mobile phones. By 2010, moving these largely proprietary networks to IP-based Ethernet protocols was seen as an inevitable direction. Called “Industrial Ethernet,” these applications focused on remote servicing of equipment and factory floor monitoring, often from remote locations.
IoT was slower to take hold in the consumer products world. In the early 2000s, companies repeatedly made (largely unsuccessful) forays into connecting products like washing machines, lamps, and other household items. In 2000, for example, LG was the first to introduce an internet-connected “smart refrigerator” (with a $20,000 price tag), but few consumers at the time wanted a fridge that told them when to buy milk. In contrast, wearable computers like the Fitbit and Garmin (both introduced in 2008) started capturing consumer interest, leveraging motion-detecting accelerometers and global positioning system (GPS) capabilities for uses such as fitness and navigation.
Consumer IoT was further sparked in 2011–12, when several successful products like the Nest remote thermostat and the Philips Hue smart lightbulb were introduced. In 2014, IoT hit the mainstream when Google bought Nest for $3.2 billion, the Consumer Electronics Show showcased IoT, and Apple introduced its first smart watch. Consumer IoT is most visible in the fast-growing adoption of wearables (particularly smart watches) and “smart speaker” devices like the Amazon Echo, Google Home, and Apple HomePod—a category growing at nearly 48 percent a year in the U.S.3
Today, we see even more changes in the form factor of computing devices. I expect that in the next few years virtually everything will have become a computer—from eyeglasses to pill bottles, heart monitors, refrigerators, fuel pumps, and automobiles. The internet of things, along with AI, creates a powerful system that was barely imaginable at the beginning of the 21st century, enabling us to solve problems previously unsolvable.
The IoT Technology Solution
To take advantage of IoT, businesses and governments need a new technology stack, connecting the edge, an IoT platform, and the enterprise.
The edge consists of a very broad range of communication-enabled devices, including appliances, sensors, and gateways, that can connect to a network. At a minimum, edge devices contain monitoring capabilities, creating visibility into the product’s location, performance, and status. For example, a smart meter in an electrical power grid sends status and usage readings to the utility’s operations center throughout the day. As the form factor of computing devices continues to evolve, more edge devices are expected to have bidirectional control capabilities. Edge devices that can be monitored and controlled allow for a new set of business problems to be solved. By leveraging a device’s monitoring and control capabilities, its performance and operation can be optimized. For instance, algorithms can be used to predict equipment failures, allowing a maintenance crew to service or replace the device before it fails.
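To make this monitor-and-control pattern concrete, here is a minimal sketch of how telemetry from an edge device might be scored for failure risk and turned into a maintenance action before the device fails. The device fields, thresholds, and dispatch function are hypothetical; a production system would use a trained model rather than hand-set rules.

```python
# Minimal sketch (hypothetical names and thresholds) of the monitor/control loop:
# an edge device reports telemetry, a simple score estimates failure risk, and a
# maintenance action is dispatched before the device fails.

from dataclasses import dataclass


@dataclass
class MeterReading:
    device_id: str
    voltage: float         # volts
    temperature_c: float   # internal temperature, Celsius
    error_count: int       # communication errors since last report


def failure_risk(reading: MeterReading) -> float:
    """Toy risk score in [0, 1]; stands in for a trained predictive model."""
    risk = 0.0
    if reading.temperature_c > 70:            # overheating as a failure precursor
        risk += 0.5
    if reading.error_count > 10:              # repeated comm errors suggest degradation
        risk += 0.3
    if not (210 <= reading.voltage <= 250):   # out-of-band voltage
        risk += 0.2
    return min(risk, 1.0)


def dispatch_maintenance(device_id: str) -> None:
    # Placeholder for integration with a field-service or work-order system.
    print(f"Work order created for device {device_id}")


if __name__ == "__main__":
    reading = MeterReading("meter-0042", voltage=236.0, temperature_c=78.5, error_count=14)
    if failure_risk(reading) >= 0.7:
        dispatch_maintenance(reading.device_id)
```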
An IoT platform is the connection between the enterprise and the edge. IoT platforms must be able to aggregate, federate, and normalize large volumes of disparate, real-time operational data. The ability to analyze data at petabyte scale—aggregating all relevant historical and operational data from both modern and legacy information systems into a common cloud-based data image—is a critical requirement.
Today’s state-of-the-art IoT platforms function as application development platforms for enterprises. Rapid development of applications that monitor, control, and optimize products and business units greatly increases productivity.
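As a simple illustration of what it means to aggregate and normalize disparate operational data, the sketch below maps telemetry records from two hypothetical source systems (a legacy SCADA feed and a modern smart meter feed) into one common record format. All field names and units are assumptions made for illustration, not any particular platform's API.

```python
# Illustrative sketch of normalizing telemetry from two different source systems
# into a single common record format, a step an IoT platform performs before
# analytics can run across all devices. Field names and units are hypothetical.

from datetime import datetime, timezone
from typing import Any, Dict


def normalize_scada(record: Dict[str, Any]) -> Dict[str, Any]:
    """Legacy SCADA feed: epoch seconds, kilowatts."""
    return {
        "device_id": record["rtu_id"],
        "timestamp": datetime.fromtimestamp(record["ts"], tz=timezone.utc).isoformat(),
        "power_kw": float(record["kw"]),
        "source": "scada",
    }


def normalize_smart_meter(record: Dict[str, Any]) -> Dict[str, Any]:
    """Modern meter feed: ISO-8601 timestamps, watts."""
    return {
        "device_id": record["meter_id"],
        "timestamp": record["read_time"],
        "power_kw": float(record["watts"]) / 1000.0,
        "source": "smart_meter",
    }


if __name__ == "__main__":
    unified = [
        normalize_scada({"rtu_id": "rtu-17", "ts": 1580515200, "kw": 412.5}),
        normalize_smart_meter({"meter_id": "meter-0042",
                               "read_time": "2020-02-01T00:00:00+00:00",
                               "watts": 1850}),
    ]
    for row in unified:
        print(row)
```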
There are many real-world examples of industries that have already integrated IoT as a core element of business transformation. A notable example is the smart grid. The electric power grid, as it existed at the end of the 20th century, remained largely as Thomas Edison and George Westinghouse had originally designed it more than a hundred years earlier: power generation, power transmission over long distances at high voltage (115 kilovolts or greater), distribution over medium distances at stepped-down voltage (typically 2 to 35 kilovolts), and delivery to electric meters at low voltage (typically 440 volts for commercial or residential consumption).
Composed of billions of electric meters, transformers, capacitors, phasor measurement units, power lines, etc., the power grid is the largest and most complex machine ever developed and, as noted by the National Academy of Engineering, the most important engineering achievement of the 20th century.
The smart grid is essentially the power grid transformed by IoT. An estimated $2 trillion will be spent this decade to sensor this value chain by upgrading or replacing the multitude of devices in the grid infrastructure so that all the devices emit telemetry and are remotely machine addressable.4 A familiar example is the smart meter. Traditional electromechanical meters are manually read, usually at monthly intervals, by field personnel. Smart meters are remotely monitored and commonly read at 15-minute intervals.
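A rough, illustrative calculation shows how much this changes data volume: moving from one manual read per month to one automated read every 15 minutes takes each meter from a single data point per month to nearly three thousand.

```python
# Back-of-envelope comparison of meter reading volume (illustrative figures):
# one manual read per month versus automated reads every 15 minutes.

reads_per_day = 24 * 60 // 15          # 96 reads per day at 15-minute intervals
reads_per_month = reads_per_day * 30   # about 2,880 reads per month
print(reads_per_month)                 # versus 1 manual read: roughly 3,000x more data points per meter
```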
When a power grid is fully sensored, we can aggregate, evaluate, and correlate the interactions and relationships of all the data from all the devices, plus weather, load, and generation capacity, in near-real time. We can then apply machine learning algorithms to those data to optimize grid performance, reduce the cost of operation, increase resiliency and reliability, harden cybersecurity, enable bidirectional power flow, and reduce greenhouse gas emissions. Combining the power of IoT, cloud computing, big data, and AI results in a digital transformation of the utility industry.
The smart grid is an illustrative example of how value chains in other industries may become interconnected through IoT to create transformative change and value. For example, as self-driving technology develops and takes hold, autonomous vehicles will communicate with one another to optimize traffic flow throughout entire city street networks, resulting in fewer traffic jams, less transit time for commuters, and reduced environmental stress.
The Internet of Things: Potential and Impact
The internet of things is poised to significantly alter how organizations operate. Although this is no longer the controversial statement it was when I first heard the term in 2007, it raises three questions: why, how, and how much?
There are three primary reasons IoT will change the way business is done. First, the volume of data that IoT systems can generate is wholly unprecedented. The internet of things is projected to generate 600 zettabytes of data annually by 2020—that’s 600 million petabytes.5 This number may strike you as almost unbelievable, but recall our discussion of the smart grid: Power plants, transmission substations, transformers, power lines, and smart meters constantly generate data. When properly sensored, these assets often produce multiple reads per second. When you consider that the U.S. electrical grid alone has 5.7 million miles of transmission and distribution infrastructure, 600 zettabytes is not an implausible number.6