10.37 – Multiple occupancy: your house is now a yielding asset
10.38 – You wait ages for one: your car could make you money too
Look out: regulators and the taxman on the case
Where businesses are naturally monopolistic there will always be a tendency over time to use barriers to entry to create monopoly profits and potentially indulge in the type of behaviour that inevitably attracts the attention of regulators. Just as U.S. Steel, the railroads and Standard Oil became the targets of antitrust action, so too will the platforms that evolve into natural monopolies. It is not just anti-competitive behaviour that will attract scrutiny. The sensitive nature of the personal and commercial information collated by Internet-based companies marks them as permanent targets for hackers ranging from hobbyists to criminals and foreign intelligence agencies. Cyber security requirements can therefore only grow over time as criminals find digital crime to be more efficient than traditional methods. Identity theft and misrepresentation are the keys to fraud, and the Internet lexicon is already replete with terms such as ‘spear phishing’, ‘clone phishing’ and ‘whaling’, which describe attempts to open the doors to illegally accessing funds. It is highly likely that regulators will impose more stringent duties of care on the confidentiality and protection of personal data.

The antitrust aspect of this relates to the role of the Internet in removing consumer and producer surplus, i.e. the difference between what consumers are willing to pay and what they actually pay, and between the price at which producers would be willing to sell and the price at which they actually sell. Price-comparison sites remove producer surplus to the extent that consumers are able to access this knowledge. Hitherto this has been the dominant influence, but as information about the customer accumulates it can be used to inform the seller of the price the customer might be willing to pay, and to price accordingly. Whether it be Uber pricing higher during peak periods, or websites that differentially price according to the characteristics of the user, there is the clear potential for discriminatory pricing.
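The surplus mechanics described above can be illustrated with a stylised numerical sketch. All willingness-to-pay figures and costs here are invented for illustration; the point is simply that perfect information about each buyer lets the seller capture the entire consumer surplus:

```python
# Stylised illustration of consumer surplus under uniform vs.
# personalised (discriminatory) pricing. All numbers are invented.

# Each buyer's maximum willingness to pay for one unit of a good.
WILLINGNESS_TO_PAY = [100, 80, 60, 40]
COST_PER_UNIT = 30  # seller's marginal cost

def uniform_outcome(price):
    """One posted price: buyers whose willingness to pay meets it buy."""
    buyers = [w for w in WILLINGNESS_TO_PAY if w >= price]
    consumer_surplus = sum(w - price for w in buyers)
    producer_profit = len(buyers) * (price - COST_PER_UNIT)
    return consumer_surplus, producer_profit

def discriminatory_outcome():
    """Perfect information: each buyer is charged exactly their
    willingness to pay, so the seller captures all the surplus."""
    buyers = [w for w in WILLINGNESS_TO_PAY if w >= COST_PER_UNIT]
    producer_profit = sum(w - COST_PER_UNIT for w in buyers)
    return 0, producer_profit

cs_u, pp_u = uniform_outcome(60)       # a single posted price of 60
cs_d, pp_d = discriminatory_outcome()
print(cs_u, pp_u)  # 60 90: buyers keep 60 of surplus, seller earns 90
print(cs_d, pp_d)  # 0 160: personalised pricing transfers it all to the seller
```

The sketch also shows why the data itself is the prize: moving from the uniform to the discriminatory outcome requires nothing except knowledge of each customer's willingness to pay.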
No regulator would wish to act where prices are being lowered, but where personal information is being used to create ‘artificially’ higher pricing then it is not hard to see regulatory action following, particularly if the supplier of this information had some form of monopoly access to the data.
Government involvement in commerce has not been restricted to regulation of markets. Military requirements have typically been the most significant drivers of scientific development. From ancient times the first responsibility of government has been to protect its citizens. This has underpinned expenditure on defence from ballistics to the Internet. Paying for defence by raising taxes has been the natural counterpart. The Internet was the child of the Cold War and in its early stages the focus was on its physical deployment, efficiency and bandwidth. While development of these areas has continued, the focus of commercial development has shifted to applications and software. A by-product of this has been the ability to utilise transfer pricing to reduce effective tax rates in a way that was rarely possible before.
Under current accounting rules, set against the backdrop of competing tax regimes, companies now effectively have freedom to ascribe profits to the geographic jurisdiction of their choice. Revenues may be generated in one country but costs can be allocated to another as payments for intellectual property, such that profits are transferred from a higher-tax regime to a lower one. Companies legitimately argue that they are only following the guidance of the relevant tax codes and have done nothing illegal. Individual governments counter with the view that this is all smoke and mirrors. In the ‘national interest’, companies should pay a ‘fair’ share of taxes relative to revenues generated within their borders. The logical conclusion is that if redress under the current tax rules proves unsuccessful, the rules will have to be changed. This process is already evident in the European Union, with cases pending against Apple, Amazon and Google. For some it is an example of economic nationalism. This is certainly how the companies involved will wish it to be viewed. The underlying cause is that a tax regime created for companies in an era of physical assets and products is proving less than optimal in a digital one. It is slightly ironic that the infrastructure of the Internet was created by one arm of government but has given rise to consequences that require the intervention of another.
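The transfer-pricing arithmetic described above is simple enough to set out in a short sketch. The revenues, costs, royalty and tax rates below are invented for illustration only:

```python
# Stylised sketch of profit-shifting via an intellectual-property royalty.
# All figures and tax rates are invented for illustration.

def total_tax(revenue, local_costs, ip_royalty, high_rate, low_rate):
    """Tax paid when an IP royalty moves profit from a high-tax
    jurisdiction to a low-tax one."""
    # Profit booked in the high-tax country after paying the royalty
    # to an affiliate in the low-tax country.
    high_tax_profit = revenue - local_costs - ip_royalty
    low_tax_profit = ip_royalty  # the royalty income lands offshore
    return high_tax_profit * high_rate + low_tax_profit * low_rate

# Without the royalty, all 400 of profit is taxed at 30%.
no_shifting = total_tax(1000, 600, 0, high_rate=0.30, low_rate=0.05)
# With a 300 royalty, most of the profit is taxed at 5% instead.
with_shifting = total_tax(1000, 600, 300, high_rate=0.30, low_rate=0.05)
print(no_shifting, with_shifting)  # 120.0 45.0
```

Nothing about the underlying business changes between the two cases; only the internal charge for intellectual property does, which is why governments describe the practice as smoke and mirrors while companies point out that it follows the tax codes as written.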
The divorce of ownership and control
In the early stages of the Internet, new software businesses tended to have relatively light capital requirements. The core element was the intellectual capital of the founders, often still operating in an academic or near-academic environment, and because of their cerebral nature frequently characterised as ‘geeks’. Future development involving commercial viability or maintaining market leadership required more substantive amounts of capital and was the cue for private investors and venture capitalists to enter. As in other technology cycles, companies that lacked continued access to capital or could not reach self-sustaining cash flow fast enough risked seeing their market positions swiftly competed away. The first-mover advantage that so exercised the early pioneers frequently proved to be a mirage. When funding dried up after the TMT crash, it was not first-entrant status that mattered, but who had the resources to survive and become the ‘last man standing’. The current era of freely available low-cost debt will not last indefinitely and it is not a wild guess to assume that when the cycle eventually reverses the Darwinian experience of the TMT crash will repeat itself.
For those who have been able to navigate the boom-and-bust cycle successfully, the rewards have been phenomenal. We have seen the emergence of a new generation of exceptionally wealthy and powerful individuals to rival the robber barons of the late 19th century. For Vanderbilt, Bell, Rockefeller and Ford, read Bezos, Brin/Page and Zuckerberg. Over time it is a safe bet that these new ‘titans’ will attract both the praise and opprobrium that their predecessors historically attracted.
What is remarkable is that in their desire to participate in the success of these new companies, institutional investors have been willing to jettison standard corporate governance principles, including accepting classes of shares that enable the founders to retain unequal voting rights. The original owners argue that their track record justifies such voting structures and (more convincingly) that investors were not forced to invest. Even where management has produced outstanding performance, the question for the future is whether these structures will stand the test of time as the stewards of the company change. History suggests that there are very few companies with a commercial track record so exceptional that the investor should be willing to indefinitely surrender voting control. Indeed there are clear examples of companies where warning signs are emerging on the need for vigilance on corporate governance matters and management accountability. The owners of a business have a duty to hold management to account for their stewardship and this duty should only be abrogated in very exceptional circumstances – and certainly not because of a desire to participate in shorter-term share-price movements.
New professions
The explosion in data created by the Internet and the low cost of storage has commercial value only if it can be properly analysed. The market has responded to this need by developing a variety of techniques, ranging from traditional statistical and econometric methods to tools such as artificial intelligence. Although this has been ongoing for some time, we are only at the rudimentary stage. Advertising and retailing have been the most affected so far. Next on the agenda will be financial services, health and the professions. Looking for patterns in data has a long-established history, primarily driven by the needs of military intelligence, which includes the remarkable achievements of the cryptographers under Alan Turing at Bletchley Park during World War II.
These techniques will be used in a much more ubiquitous manner to allow the identification of data patterns. The combination of algorithm development and ever more specific processors to improve the efficiency of searches will allow much more rapid analysis of data patterns which hitherto were difficult to discern. In healthcare, the history of medical knowledge can be assimilated and over time artificial intelligence techniques will create decision rules that both enhance diagnosis and lead to new preventive discoveries. To give another example, much of the legal profession is based on the understanding of historic precedent and existing case law. This is exactly the type of analysis at which AI excels. Once a computer can find and rank all relevant cases, the role of the lawyer who once did that job inevitably changes. As taxi drivers in London have discovered, the painstaking process of acquiring ‘the knowledge’ (passing a rigorous test about the roads of the city) no longer protects their income when an Uber driver can use a smartphone to fulfil exactly the same function. In none of these cases does the technology make professionals redundant, but it does potentially remove a large slice of the costs that underpin their current earnings. If any of this feels familiar it is because it is uncannily close to the vision set out by Vannevar Bush in 1945. Over 70 years on, we appear to have reached a point where the critical elements are in place. Processing power, cloud storage and the World Wide Web together provide the tools that allow analytical techniques such as AI to make some of Bush’s vision reality.
As with all new exciting areas, ‘big data’ has frequently been overhyped in terms of the immediacy of its scale and impact. In itself data is an amorphous mass. The data has to be cleaned, put in a form where it can be interrogated and the results then interpreted. The whole process is riven with value judgements and is some distance from the dry abstract process many imagine. Moreover, there are many different elements and skill sets required – from those skilled in writing superfast algorithms to database experts and neural network specialists. This is an area that looks set for exponential growth.
So, although it may be smaller in an economic sense than the potential disruption to the financial sector, disruption to the professions from these new factors could be equally meaningful. For those seeking to encroach on the tasks performed by the professions there will be the barrier presented by their professional bodies and their legally protected positions. The insurgents will view this as one of the last bastions of restrictive practices. There will undoubtedly be huge tensions between the goals of preserving quality of service and trust and the ability to reduce costs. Almost certainly the threat to traditional firms will come from within, as competitors try to usurp the historic dominance of the current market leaders.
Darwin targets the slothful and unwary
When timeshare computing was first envisaged, it was seen as a solution to the prohibitive cost of processing power. When those costs started their rapid decline, it produced a new business environment in which processing power and storage were no longer regarded as of primary importance. Since then the world has moved on again, and improvements in communication (both in bandwidth and access) have produced another new dynamic. Although the term ‘cloud computing’ has a warmer feel to it than timeshare computing, it is nevertheless driven by the same economics. The sellers of cloud-based services can point to many benefits: flexibility over scale, more robust disaster recovery, lower hardware costs, lower maintenance costs and greater security once data is held centrally. So long as there is access to communications, the workplace can be replicated in another location at relatively low cost.
If scale is no longer as important, then disintermediation will be a constant theme. The implications of this are potentially profound. The Darwinian process of evolutionary change may soon be challenging targets which hitherto have been perceived as untouchable. Existing businesses already have many redundancy and legacy issues to deal with in information technology. They will also find it difficult to embrace the benefits that the cloud and modern programming languages provide. There is no question that for many businesses the primary interface with customers will be a mobile device rather than a physical location. Companies unable to adapt to this world will be in serious trouble. For investors one of the key areas of investigation, where such change is endemic, will be whether established companies face institutional and structural barriers that threaten their ability to survive.
The one certainty is that, as with previous technology bubbles, the continued development of the Internet and the Darwinian forces that it has unleashed will take many industries and professions in new directions.
* * *
92 Quoted in www.cyber.law.harvard.edu
93 Ibid.
94 In Red Herring Magazine, 1998. web.archive.org/web/19980610100009/www.redherring.com/mag/issue55/economics.html
95 V. Bush, ‘As We May Think’, Atlantic, July 1945.
96 B. Winston, Media, Technology and Society: A History from the Telegraph to the Internet, London and New York: Routledge, 1998, p.333.
97 T. Berners-Lee, Weaving the Web, London: Texere, 2000.
98 C. H. Ferguson, High Stakes, No Prisoners, New York: Times Business, Random House Inc., 1999.
99 Ferguson (1999).
100 D. A. Kaplan, The Silicon Boys, New York: William Morrow and Co., 1999, p.306.
101 D. L. Yohn, ‘A Tale of Two Brands: Yahoo’s Mistakes vs. Google’s Mastery’, Wharton School, 2016.
102 M. Aron, ‘Why Google beat Yahoo in the war for the Internet’, Techcrunch, 22 May 2016.
103 ‘The PageRank Citation Ranking: Bringing Order to the Web’, Stanford University, 29 January 1998.
104 Google Inc., press release, 23 October 2000.
105 J. Angwin, Stealing MySpace: The battle to control the most popular website in America, New York: Random House, 2009.
106 T. Philippon, ‘Has the US Finance Industry Become Less Efficient? On the Theory and Measurement of Financial Intermediation’, American Economic Review, vol. 105, no. 4, April 2015, pp.25–26.
107 R. Greenwood and D. Scharfstein, ‘The Growth of Finance’, Journal of Economic Perspectives, vol. 27, Spring 2013, pp.4–5.
108 A. Turner, ‘What do banks do? Why do credit booms and busts occur and what can public policy do about it?’, presentation at The Future of Finance Conference, London School of Economics, 14 July 2010.
109 Many executives in listed Internet stocks were ‘locked in’ for a number of years as a condition of their listing and by the time the restrictions expired had seen their shares become almost worthless.
110 Fortune, January 2016.
Chapter 11
The Anatomy of Technology Investing
“In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual.”
Galileo Galilei
“We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.”
Roy Amara
The persistence of change
Technological change has been a more or less constant feature of modern industrial life since the latter years of the 18th century. It has been the driving force behind productivity growth, the creator of new products and the facilitator that has opened up new markets for existing products. All of these effects are key elements in investment decision-making and portfolio management. In general, identifying the winning technology comes much earlier than identifying the winning companies. This means that the ‘losers’ become clearer much earlier so long as one first understands the direction of technology and the areas that will be impacted by its deployment.
There are periods when technological change is largely incremental, simply extending what has gone before. These periods are not ones that tend to create wholescale shifts in stock market valuations. Other periods of change, however, can be fairly described as revolutionary. These tend to be the times when a technology is first introduced. These periods do create wholescale shifts in valuations and can be of fundamental importance. If they develop into ‘bubble periods’, as has happened on several occasions, the initial reaction is either exaggerated or incorrect and is eventually replaced by a more sober assessment when economic reality reasserts itself. Whatever the outcome, the investor has to try to understand both what is happening and where it is likely to develop in the future.
In hindsight the course of progress towards ever faster, more efficient and more useful machines seems effortless and inexorable. Yet one of the first things that strikes the historian, and struck me at once in my research, was how haphazard the timing and pace of technological development often is. It also soon became clear that the process of capital-raising and the interaction between entrepreneurs, innovators and financial markets has often followed a bumpy and convoluted path. The emergence of the Internet as an agent of change in the modern world is no different in this respect from any other so-called great technological breakthrough of the recent past, although there are also some unique features of the Internet bubble that mark it out from other similar episodes. This concluding chapter summarises some of the lessons of recent experience and the implications for investors and the economy, based on a model of how technology, change and markets interact.
Clear in retrospect, but rarely in advance
Engines That Move Markets (2nd Ed) Page 60