by Gabriel René
More Connected
Experts estimate that the IoT will consist of about 30 billion objects by 2020, growing to trillions of devices in the decades after. The evolutionary trend here is fundamentally more connected devices and more types of connected things. Effective application of this expanding capability can help us to use energy more efficiently, reduce carbon emissions, minimize waste, design better cities, predict diseases, track epidemics, and more.
The IoT, as the Physical Computing hardware of the Spatial Web, will capture physical data and distribute it to the Cognitive Tier, storing it at the Decentralized Data Tier via Blockchain, Edge, and Mesh networks for data storage and compute.
Logic Tier: Cognitive Computing
Artificial Intelligence, Smart Contracts, and Quantum Computing
Cognitive Computing is the digital application of the adaptive, contextual learning and logic systems modeled from our understanding of human cognition. These bring “smartness” into the physical world to analyze, optimize, and prescribe activities in the Spatial Web.
The Web 3.0 Logic Tier will be driven by the trend of Cognitive Computing in the form of several core technologies like Artificial Intelligence, Smart Contracts, and Quantum Computing. Populated by billions of self-executing smart contracts and programs, every building, room, object, and phenomenon will exhibit smart behaviors, and the environments around us will appear to have sentience. AI, Machine Learning, Smart Contracts, and related “cognitive” computing technologies shift us from the punch-card programs of early computers to adaptive, autonomous, self-initiating, and self-learning agents. These will soon exceed the intelligence capabilities of humans, operating at the speed, scale, and scope of exponential technology.
Wikipedia defines the term “artificial intelligence” or AI as a machine that mimics human cognition. Many of the things we associate with human minds, such as learning and problem solving, can be done by computers. This is why so many things in the future will be called “smart”: it is a way of suggesting that they include some programmable set of rules.
An ideal (perfect) AI is an autonomous dynamic agent that can perceive and act. It can see, hear, smell, touch, and even program its environment, modifying its behavior to maximize its chance of success at some defined goal. As a result, as AI becomes increasingly capable of mental, sensory, and physical abilities once thought to be exclusively human, our definition of “person” may need to be adapted.
Smart Contracts are “contracts as code”: programmable, automated, and self-executing software that removes legal contracts from the realm of documents requiring constant human involvement, instead self-executing and self-enforcing agreements between parties, provided the terms are met. If the program executing the contract is trustworthy, it’s unnecessary to trust that the other party will fulfill the terms.
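To make “contracts as code” concrete, here is a minimal sketch in plain Python (not a real smart contract language like Solidity) of an escrow-style agreement that releases payment only once its term is met. The party names, amount, and delivery signal are all hypothetical.

```python
# A minimal, hypothetical sketch of "contracts as code": an escrow that
# self-executes once its terms are met. Real smart contracts run on a
# distributed ledger; this Python version only illustrates the logic.

class EscrowContract:
    def __init__(self, buyer, seller, amount):
        self.buyer = buyer          # party funding the escrow
        self.seller = seller        # party to be paid
        self.amount = amount        # funds held by the contract
        self.delivered = False      # the agreed-upon term
        self.settled = False

    def confirm_delivery(self):
        # In practice this signal might come from an IoT sensor or oracle.
        self.delivered = True
        self.execute()

    def execute(self):
        # Self-enforcement: payment releases only if the term is met.
        if self.delivered and not self.settled:
            self.settled = True
            print(f"{self.amount} released to {self.seller}")

contract = EscrowContract("alice", "bob", 100)
contract.confirm_delivery()   # -> "100 released to bob"
```

The point of the sketch is the last method: no party decides whether to pay; the code checks the recorded condition and settles automatically.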
Due to the immutable nature of a Smart Contract’s existence on Distributed Ledgers like blockchains, Smart Contracts provide security that is superior to traditional contract law and can reduce other transaction costs associated with contracting. Artificial Intelligence or AI will be capable of doing many things with Distributed Ledger Technologies, but for our purposes here, we will emphasize the role of AI as “smarter” contract agents that can enable data-driven, hyper-customized smart contract creation, analysis, execution, and enforcement.
Together, AI and Smart Contracts can simplify the negotiation and execution processes, while simultaneously facilitating more complex and dynamic agreements that can ultimately lead to greater efficiencies. This integral partnership launches the fields of law and software into an entirely new dimension. In the context of digital assets, smart contracts and AI can provide for terms of use, payment, ownership transfer, and location-based terms or conditions, automating entire supply chains, including their transactions and the segmentation of analytics for the data marketplaces of tomorrow.
Furthermore, Cognitive Computing will be applied to the vast amounts of data being pumped through the trillions of sensors of Web 3.0’s IoT infrastructure. This will further augment and accelerate the human cognitive and creative processes across every domain and will increasingly allow AI to explore an unlimited number of possible futures.
Next, let’s take a look at the role Quantum Computing plays in the Logic Tier. Today’s computers, called “classical” computers, store information in binary, as either a “1” or a “0”; each “bit” is either on or off. Quantum computation uses quantum bits or qubits, which, in addition to being on or off, can be both on and off at once. Qubits can store a tremendous amount of information and utilize far less energy than a classical computer. By entering into this quantum area of computing, where the traditional laws of physics no longer apply, we can compute significantly faster (a million or more times) than the classical computers that we use today.
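A rough illustration of the “both on and off” idea: the state of a single qubit can be simulated classically as a pair of amplitudes. The sketch below (plain Python, no quantum hardware or library assumed) puts a qubit into an equal superposition with a Hadamard gate and shows the resulting measurement probabilities.

```python
import math

# Simulate one qubit as two amplitudes: state = a*|0> + b*|1>,
# with |a|^2 + |b|^2 = 1.
state = [1.0, 0.0]  # starts definitely "off" (the |0> state)

def hadamard(s):
    """Apply the Hadamard gate, which creates an equal superposition."""
    a, b = s
    r = 1 / math.sqrt(2)
    return [r * (a + b), r * (a - b)]

state = hadamard(state)

# The qubit is now "both on and off": a measurement yields 0 or 1
# with equal probability.
p0, p1 = abs(state[0]) ** 2, abs(state[1]) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # P(0) = 0.50, P(1) = 0.50
```

Simulating even a few dozen qubits this way quickly exhausts classical memory, which is one intuition for why quantum hardware can outpace classical machines on certain problems.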
This gives Quantum Computers the ability to decipher the chaos patterns of traffic and the pulse of global markets, the nuances of the reflectivity of light as well as the neuronal activity of infants, the velocity of rain droplets, and the brush strokes of painters like a fortune teller reads tea leaves. It will seem so magical and impossible and yet it will likely uncover quantifiable patterns where we were sure none existed. Quantum Computing can act as a microscope for reality, revealing untold secrets of the universe capable of providing AI with the information necessary to make an infinite number of micro-adjustments to improve how our city traffic flows, or our children learn. It will help make VR even more realistic, route our resources where they are most needed, and might even inspire a level of appreciation for art in the most disinterested of Luddites.
The Cognitive Computing trend at the Logic Tier of the Spatial Web will make use of Distributed Ledger-secured Smart Contract logic and autonomous and adaptive AI as well as Quantum Computing, just as computer programs, web and mobile applications, and cloud computing drove the Logic Tier of the earlier iterations of the web. Collectively, the power of Cognitive Computing technologies will intelligently automate every aspect of our personal and collective daily lives as well as operate our civil, social, political, and economic systems. In time, these algorithmic controls and micro-edits to our reality will appear to happen by themselves, occurring almost “automagically.”
Greater Intelligence
In the beginning, we programmed computers in their language. Now they are speaking to us in our languages. They are seeing the world with their own eyes, and will soon apply Cognitive Computing to every aspect of our lives. Web 3.0 brings autonomous intelligence or “smartness” to everything.
Data Tier: Distributed Computing
Distributed Ledgers and Edge Computing
Distributed Computing is the trend of pushing data storage and compute power across multiple devices: closer to the source for speed and performance, or farther away and more decentralized for greater trust.
At the bottom of the Web 3.0 Stack, at its Data Layer, are Distributed Ledger Technologies (DLT) like Blockchains and Directed Acyclic Graphs (DAGs)—decentralized and immutable ledgers—with the capacity to verify the provenance of information. DLTs like Blockchain offer a cryptographically secure, globally redundant method for storing records. These records are shared and updated across multiple computers (or nodes), distributed across the planet, and secured by cryptography. This creates a nearly unhackable, globally shared ledger of our records of events, activities, and transactions. Nodes can be financially incentivized to compete to validate each new record and penalized if their data doesn’t match that of the others across the network. With Blockchains, the most recent records and transactions are bundled into “blocks” of data and then added to the “chain” of previous blocks once their accuracy is validated by all the nodes in the network.
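As a toy sketch of how the “chain” part works: each block commits to a cryptographic hash of its predecessor, so altering any historical record breaks every link that follows it. The Python below is a simplified illustration, not a real consensus protocol; the transaction strings are hypothetical.

```python
import hashlib
import json

def make_block(records, prev_hash):
    """Bundle records into a block that commits to the previous block's hash."""
    body = {"records": records, "prev_hash": prev_hash}
    block = dict(body)
    block["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return block

# Build a tiny chain: each block is linked to the one before it.
genesis = make_block(["genesis"], "0" * 64)
block1 = make_block(["alice pays bob 5"], genesis["hash"])
block2 = make_block(["bob pays carol 2"], block1["hash"])

# Tampering with an old record invalidates every later link.
forged_hash = make_block(["forged history"], "0" * 64)["hash"]
print(forged_hash == block1["prev_hash"])  # False: the chain exposes the edit
```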
The Directed Acyclic Graph or DAG is another form of Distributed Ledger. A DAG is a network of individual transactions that are linked to multiple other transactions. DAGs trade the chain-of-blocks of transactions for a tree-like structure that uses branches to link one transaction to another, and to another, and so on. Some see DAGs as replacements for Blockchains, others as an enhancement. In either case, the combination of cryptography, social consensus, and innovative algorithms allows Distributed Ledger technologies to ensure “data provenance.” A new generation of Blockchain startups has arisen to offer solutions to address the age-old problem of trust between humans. Today we see DLT-based solutions emerging for everything from global financial transactions to medical record storage, supply chain authentication to digital asset sales, and even shared custody of both physical and digital collectibles.
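A toy sketch of the DAG structure described above, under the same caveats as the blockchain sketch: instead of bundling transactions into sequential blocks, each new transaction directly references (and thereby vouches for) one or more earlier ones. The parent selection and payloads here are hypothetical.

```python
import hashlib

# A toy DAG ledger: each transaction references the hashes of one or
# more earlier transactions instead of joining a single chain of blocks.
ledger = {}

def add_tx(payload, parents):
    """Append a transaction that links back to (and vouches for) its parents."""
    tx_id = hashlib.sha256((payload + "".join(parents)).encode()).hexdigest()
    ledger[tx_id] = {"payload": payload, "parents": parents}
    return tx_id

root = add_tx("genesis", [])
a = add_tx("alice pays bob", [root])
b = add_tx("carol pays dan", [root])       # branches can grow in parallel
c = add_tx("bob pays erin", [a, b])        # one tx can confirm several branches

print(ledger[c]["parents"] == [a, b])      # True
```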
Distributed Ledger Technologies enable a world where every identity, contract, transaction, and currency can be trusted and verified. Trust emerges from the inherent architecture of Distributed Ledgers and does not need to rely on a corporation, government, or similar body to act as a trusted central authority. It promises a new economy and information marketplaces that could be genuinely open and decentralized. Like many new technologies, it is not immune to issues of standardization, scalability, and performance, but if history has proven anything, it is that if the need is great enough, then these problems are eventually addressed and overcome. And the need is great.
The Data Tier in Web 3.0 must be secure and trustworthy for the Spatial Web to work in the long run, because the hyper-realistic, hyper-personalized, highly immersive and experiential “realities” that spatial technologies create (projecting our information and imaginations into the world itself and displaying them right before our eyes) will make it increasingly difficult to accept the old adage that “seeing is believing.”
Given recent advances in the computer-vision and rendering power of Artificial Intelligence and its ability to recognize and re-create everything from our faces, expressions, and voices to the objects and environments in the world around us, how we determine the real from the unreal, the true from the false, highlights the seriousness of Trust across the Spatial Web. Consider the fact that these technologies will not only have the ability to fake what we see or feel, the information and interactions of reality, but also will have the power to mine our information, influence us, advertise to us, and facilitate our transactions.
This poses a serious problem for our future, with individuals, societies, governments, and economies at risk. Critical data security foundations must be laid out in advance, followed by universal standards and policies that enable their adoption and support their enforcement. We must be able to reliably trust the who, what, and “where” of our reality.
But how do we establish trust in a world where we can’t trust our senses? Built on top of the current insecure web architecture, the potential for these technologies to be hijacked and abused by malevolent actors both human and algorithmic presents us with an unacceptable risk and a threat.
Since the dawn of civilization, humans have been trying to create reliable records about the things or assets they value. Our civilizations, economies, laws, and codes rely on records that we can trust. These records must provide reliable answers to the following critical questions about a valuable item:
What is it? Who owns it? What can be done with it? And...where is it?
Provenance is what makes a record “trustable.” It is the historical record of the description, ownership, custody, and location of something. Many of our technological and societal inventions like letters, numbers, bookkeeping, contracts, maps, laws, banks, and governments have emerged to address and manage the provenance of our records in the physical world.
However, empires fall, banks crash, and companies dissolve. Like them, our data records are all vulnerable to the passing of time, and like these institutions, many of our historical records have turned to dust. In the Information Age, we’ve progressed from paper records in cabinets to digital files stored in databases spread across the globe. In Web 2.0, more and more of the personal information collected online and via mobile applications has been stored on “cloud” databases by an ever-growing list of third-party companies that we have unwittingly given our trust to, and who are tracking and selling our data while leaving it vulnerable to hacks.
Now, with the arrival of Distributed Ledger Technologies as the Data Tier of Web 3.0, humans finally have a cryptographically secure, globally redundant method for storing and authenticating records. These records are shared and updated across multiple computers (or nodes), decentralized across the planet, and secured by cryptography.
This provides Data Provenance, which enables unprecedented Data Integrity.
Edge Computing is another distributed computing paradigm. With Edge Computing, computation is largely or completely performed on distributed device nodes known as smart devices or edge devices, as opposed to primarily taking place in a centralized cloud environment. The term “edge” refers to the geographic distribution of computing nodes in the network, typically Internet of Things devices at the “edge” of an enterprise, a city, or another location. The idea is to provide server resources, data analysis, and compute resources closer to data collection sources and IoT systems such as smart sensors and actuators. Edge Computing is seen as critical to the realization of physical computing, smart cities, ubiquitous computing, augmented reality, cloud gaming, and digital transactions.
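One way to picture the edge model, as a simplified, hypothetical sketch: each device reduces its own raw sensor readings to a compact summary locally, and only the summaries travel upstream, rather than the full raw stream going to a central cloud.

```python
# Hypothetical sketch: edge devices summarize raw sensor data locally,
# so only compact aggregates (not the raw stream) leave each device.

def edge_summarize(raw_readings):
    """Runs on the device: reduce raw samples to a small summary."""
    return {
        "count": len(raw_readings),
        "mean": sum(raw_readings) / len(raw_readings),
        "max": max(raw_readings),
    }

def cloud_aggregate(summaries):
    """Runs centrally: combine summaries without ever seeing raw data."""
    total = sum(s["count"] for s in summaries)
    mean = sum(s["mean"] * s["count"] for s in summaries) / total
    return {"devices": len(summaries), "overall_mean": mean}

# Three edge devices, each with its own local sensor stream.
device_streams = [[21.0, 22.5, 21.7], [19.8, 20.1], [23.4, 22.9, 23.1, 23.0]]
summaries = [edge_summarize(s) for s in device_streams]
print(cloud_aggregate(summaries))
```

The division of labor is the point: latency drops because analysis happens at the source, and privacy improves because raw readings never leave the device.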
More Trust and Access
From the siloed office database of the pre-web era to the globally accessible web servers of Web 1.0, to the mobile accessibility facilitated by the cloud infrastructure of Web 2.0, and on to the Distributed Ledger and Edge Computing that will secure AR information and power the IoT in Web 3.0, the evolutionary trend of the Data Tier is one of increased decentralization and democratization of data. At each stage, we have increased access and widened the “circle of trust” to include more and more participants at greater scales. This is the inherent value created by decentralized and distributed systems.
THE INTEGRATED WEB 3.0 STACK
Looking at the convergence of these various technologies through the lens of the Web 3.0 Stack makes it easier to see the benefits that can result from the integration of Spatial, Physical, Cognitive, and Distributed technologies.
For example, in the Interface Tier, as the IoT provides us with sensor-enabled networks, Physical Computing will allow us to capture, measure, and communicate data regarding the performance of all physical activities. Robotics will perform any movement or transportation necessary in the physical world, from growing and picking our food to manufacturing products and transporting people and goods around the globe.
Also in the Interface Tier, Spatial Computing as AR will provide the interface to a new world painted with a digital layer of information and contextual content that is constantly updated from the sensors of IoT, the intelligence of AI, and the new conditions set by smart contracts, secured by blockchains and incentivized by cryptocurrencies.
And Spatial Computing as VR will serve as a superior “pre-vis” experiential environment for the creation and exploration of our information, ideas, and imaginations. It will enable the most ideal virtual simulation or digital twin of any given object, environment, human, or system.
In the Logic Tier, Cognitive Computing as Artificial Intelligence will provide analysis, prediction, and decision-making, using Quantum Computing to run simulations on our virtual digital twins and help determine ideal adaptations.
Also in the Logic Tier, Cognitive Computing as Smart Contracts can contextually govern, enforce, and execute all interactions and transactions via blockchain and distributed ledger networks, informed by the insights captured by the IoT and optimized by AI.
In the Data Tier, Distributed Computing as Distributed Ledgers and decentralized cryptocurrency platforms will maintain the trusted records for various people, places, things, and activities, and manage the storage and transfer of value across and between all parties. Distributed Computing as Edge and Mesh Networks will enable fast and powerful compute on location, utilizing federated AI systems that can process information on device, sharing insights with the community while ensuring personal privacy.
The benefits of these various exponential technologies working together in an open and interoperable way are truly astounding. And it’s because of this extraordinary potential that we, the authors, propose that Web 3.0 should be defined and described as a connected stack of technologies all working together as a part of a unified network—a network leading us across the various trends to The Spatial Web.
However, a network is not merely a converging stack of technologies. A key technology is necessary to enable the network: something that connects its parts together and communicates between them. This technology is referred to as a “protocol,” like the Hypertext Transfer Protocol or HTTP that the World Wide Web uses to reference and communicate between the tiers of its stack.
The term protocol has many diverse definitions. In social etiquette, a protocol may refer to the acceptable behaviors in diplomacy or social affairs. A protocol in science is a predefined, written procedural method for conducting experiments, and in medicine it is the prescription that outlines how and when certain medicines or procedures should be administered. In technology, a protocol is a common method for various objects or entities to communicate with each other. A cryptographic protocol is a method for encrypting messages. A blockchain protocol is a method for programming the consensus of datasets. A communication protocol is a defined set of rules and regulations that determines how data is transmitted in telecommunications and computer networking.
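In every case, a protocol boils down to an agreed-upon message format plus rules for exchanging it. As a small illustration, the sketch below hand-writes a raw HTTP request (the web protocol named above) over a socket; the host example.com is used purely as a placeholder, and a network connection is assumed.

```python
import socket

# A protocol is a shared message grammar: here we write an HTTP/1.1
# request byte-for-byte and read the server's structured reply.
HOST = "example.com"  # placeholder host for illustration

request = (
    f"GET / HTTP/1.1\r\n"
    f"Host: {HOST}\r\n"
    f"Connection: close\r\n"
    f"\r\n"
).encode()

with socket.create_connection((HOST, 80)) as sock:
    sock.sendall(request)
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

# The first line of the reply follows the same agreed-upon grammar,
# e.g. "HTTP/1.1 200 OK".
print(response.split(b"\r\n", 1)[0].decode())
```

Because both sides honor the same grammar, any client can talk to any server; that interoperability is exactly what a Spatial Web protocol would need to provide across the tiers of the Web 3.0 Stack.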