
Quantum Computing


by Amit Katwala


  To do that, there are four main areas where quantum computing needs to develop over the next few decades: algorithms, hardware, software and skills. A report[2] by the Boston Consulting Group (BCG) divides quantum algorithms, which will be the means by which we find the answers to those big problems, into two main categories: workhorses and purebreds. The latter are things like Shor’s algorithm – powerful tools that will offer an exponential speed-up for currently impossible problems, but which require extremely sensitive, specialised hardware with thousands or millions of physical qubits. They’re like Formula 1 cars – hugely powerful, but temperamental.

  The workhorses, on the other hand, include quantum-approximate optimisation algorithms, and the variational quantum eigensolver that IBM has been using in its battery research with Daimler. These algorithms are the ones that will dominate the NISQ era, as they’re flexible enough to run on the error-prone machines we currently have at our disposal. ‘Viability is a little bit inversely proportional to value,’ is how Boixo puts it.

  The challenge for the developers of quantum algorithms isn’t only about coming up with something that works, but about proving that it does something useful that you couldn’t do on a classical computer. Progress hasn’t halted on the classical front either – Google has a parallel team working on ways to improve classical computers so they can compete with its quantum chip, and supercomputers get faster and more powerful every year. ‘The dilemma is that very little can be proven about their speed-up performance with respect to classical algorithms until they are put to experimental testing,’ write the authors of the BCG report. Grover’s algorithm, which doesn’t provide an exponential speed-up, but still requires a fault-tolerant quantum computer, is in an unenviable class of its own, being both impractical to run, and not that much more useful than a supercomputer.
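For readers who code, Grover’s quadratic (rather than exponential) speed-up can be seen in a toy classical simulation. This is purely illustrative, not how a quantum device actually runs: the list size, the marked index and the arithmetic standing in for the ‘oracle’ and ‘diffusion’ steps are all choices made for this sketch.

```python
import math

# Toy classical simulation of Grover's search over N = 8 items.
# (Illustrative only: list size and marked index are arbitrary.)
N = 8
marked = 5  # the item we are searching for

# Start in an equal superposition: every item equally likely.
amps = [1 / math.sqrt(N)] * N

# Grover needs roughly (pi/4) * sqrt(N) iterations: a quadratic,
# not exponential, speed-up over checking items one by one.
for _ in range(int(math.pi / 4 * math.sqrt(N))):
    amps[marked] = -amps[marked]          # 'oracle' flips the target's sign
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]   # 'diffusion' inverts about the mean

probs = [a * a for a in amps]
print(max(range(N), key=probs.__getitem__))  # -> 5 (the marked item)
print(round(probs[marked], 3))               # -> 0.945
```

After roughly (π/4)√N iterations (just two for eight items) the marked item’s probability climbs from 1/8 to about 95 per cent. That is the quadratic speed-up: a classical search would expect to check half the items one by one.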

  That will change as hardware rapidly improves, according to proponents of what’s been dubbed the Dowling–Neven Law, named after the quantum physicists Jonathan Dowling, who first wrote about the idea of a Moore’s Law for quantum computing in his 2013 book, and Hartmut Neven, who led the Google research team which reached supremacy. In June 2019, Neven told Quanta magazine[3] that the computational power available in quantum computers was growing at a double exponential rate. ‘It looks like nothing is happening, nothing is happening, and then whoops, suddenly you’re in a different world,’ he said. ‘That’s what we’re experiencing here.’
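To see why double exponential growth feels like ‘nothing is happening’ and then ‘a different world’, compare ordinary exponential growth, 2^n, with double exponential growth, 2^(2^n). A toy illustration, with the range of exponents chosen arbitrarily:

```python
# Exponential growth doubles at each step; double exponential growth
# squares. It crawls at first, then explodes.
exponential = [2 ** n for n in range(1, 7)]         # 2^n
double_exp = [2 ** (2 ** n) for n in range(1, 7)]   # 2^(2^n)
print(exponential)  # -> [2, 4, 8, 16, 32, 64]
print(double_exp)   # -> [4, 16, 256, 65536, 4294967296, 18446744073709551616]
```

Six steps of ordinary exponential growth reach 64; six steps of double exponential growth pass ten billion billion.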

  But to keep that rate of development going, there will need to be big improvements on the hardware side, in terms of both the number of qubits and the ability to manufacture them accurately and at scale. Ideally, companies would be able to crank out qubits with the speed at which they can produce transistors, with Josephson junctions and ion traps rolling off the production line. In reality, the ultra-precise engineering required makes this difficult, and means the failure rate for qubits is unacceptably high.

  There are a number of start-ups operating in this area, tapping into the growing funding streams from enterprise and from government. The Finnish start-up IQM, for example, is focused on improving the gate times of quantum hardware so that more operations can be conducted in the short time before a qubit decoheres. ‘This is the same as what everyone else is working on,’ says IQM’s CEO Jan Goetz. ‘They’re trying to build a system that works with a high success rate and that is scalable at the same time.’

  At Microsoft, they’re working on ways to miniaturise and harden the control hardware, which currently sits in big racks alongside the cryostat. If they can minimise the number of lines running into the system from outside, they may be able to isolate their qubits better and reduce the amount of noise. Others think the requirement for superconducting qubits to be kept at extremely low temperatures will be an Achilles heel that makes them impossible to scale to the level required. It takes a huge amount of energy and equipment to cool even a small amount of space down to close to absolute zero, and the energy required rises as that space grows – as does the number of lines needed to control all the additional qubits, which means more leakage and more cooling. It might be a problem that’s impossible to surmount – which could mean that superconducting qubits are a dead end, and ion traps are the way forward. Or it could be that neither of these technologies will work, and we’ll need to find an alternative – or some combination. In December 2020, a group of scientists at the University of Science and Technology of China, in Hefei, claimed to have attained quantum supremacy on a photon-based quantum computer that they said was 10,000 times faster than Google’s Sycamore chip.

  An analysis by Nature[4] found that in 2018 there were $173 million of investments in quantum companies, but that deals for hardware firms were vastly outnumbered by deals for quantum software companies, who, it has to be remembered, are often writing algorithms for computers that don’t exist yet. That’s one of the reasons why some in the industry, including the influential John Preskill, fear a ‘quantum winter’, where progress will stall because of a lack of developments on the hardware side.

  The same thing happened with artificial intelligence, which was first theorised in the 1950s, but had to wait through the 1980s and most of the 1990s for the hardware to catch up before any real progress could be made. That would also track with the famous Gartner hype cycle, which traces the path of new technological developments from invention through the peak of inflated expectations, followed by the trough of disillusionment, to the final plateau of productivity. ‘There’s so much hype – so people say, “Let’s get involved, let’s do it,”’ said Matthew Brisse, research vice president of quantum computing at Gartner, speaking to Business Insider in 2019.[5] ‘But then they see that the machines aren’t ready yet, and we have to wait five to ten years. There’s a real risk of that, so we’re monitoring that as well.’ Whurley agrees. ‘There’s this push to get quantum out – it is a revolutionary technology, and all these things that you’re hearing from all these major companies are correct,’ he says. ‘But we’re leading people to believe that things may be a lot further mature than they actually are.’

  We will probably have to wait a while longer for a fault-tolerant quantum computer capable of running some of the most exciting ‘purebred’ algorithms. In the meantime, the most likely applications of quantum-based hardware might have very little to do with computing, according to Rhys Lewis of the UK’s National Physical Laboratory, which is one of dozens of organisations benefiting from £153 million of government investment into quantum. These could include atomic clocks, for incredibly accurate timekeeping and improved forms of GPS location mapping (which relies on precise timekeeping). The ability to control ions can also be used to make more accurate sensors, which could be useful for ‘seeing things which are invisible at the moment, like gravitational fields’, says Lewis. Quantum sensors could be used to see what’s underground without digging up the road, for testing materials for tiny defects, or to measure the delicate magnetic fields generated by the heart or the brain.

  Hello, world

  The first computers were programmed by hand, first by rewiring switches and lights, then with holes punched in cards. Today, programmers rarely have to think about the tiny transistors that are actually executing their commands. While physicists and engineers grapple with the hardware problems of quantum, computer scientists are racing ahead, developing the software programs and infrastructure that will sit around quantum devices in the NISQ era and beyond. ‘If you have an internal combustion engine, it’s not a car,’ says Microsoft’s Chetan Nayak. ‘A car has got to have wheels and a steering wheel and dashboard, and it has to have a GPS system obviously. There’s a lot to it, and having fifty per cent or seventy-five per cent of a car is still not a car.’

  So now, Google, Microsoft, IBM and others (including the Berkeley-based Rigetti) are all working on the layers that will sit above quantum computers in the same way that compilers and operating systems shield you from the 1s and 0s powering your laptop. ‘Right now the programs we write are almost machine code, very close to the hardware,’ says Google’s Marissa Giustina. ‘We don’t have any of the high-level tools where you can abstract away the hardware.’

  At Microsoft, Krysta Svore, who has a background in computer science, has helped develop Q# (pronounced ‘q-sharp’), one of the first programming languages designed specifically for dealing with the quirks of quantum computers of all types. It’s designed to be able to easily switch from being run on an ion-trap-based quantum computer to a virtual one running on classical hardware, to one that uses Microsoft’s elusive topological qubits, if or when it finally figures out how to make them. ‘We know quantum computers are going to develop,’ says Svore. ‘But that same code is going to endure.’

  Google’s Cirq and IBM’s Qiskit are both open-source frameworks that will help researchers develop algorithms in the NISQ era. As we saw in the last chapter, companies are also powering ahead with commercial applications: IBM is already working with more than 100 companies, including ExxonMobil, Barclays and Samsung, on practical applications; Microsoft has Azure Quantum, which allows its clients to plug into IonQ’s trapped-ion quantum computer and superconducting qubits being developed by Connecticut-based QCI. These developments will, says IonQ’s Peter Chapman, enable people to start writing the ‘Hello, World!’ programs for quantum, referring to the simple on-screen message which is one of the first things people learn how to produce when they’re first being taught how to code. Eventually, they’ll help people who don’t have a degree in quantum physics or computer science (or both) access the intricacies of quantum devices. Skills are the final piece of the puzzle. ‘There are no university programmes,’ says Giustina. ‘There’s not a field yet. To reach this vision we’ll need a lot more expertise in that programming.’

  There are lots of parallels between the early days of quantum computing and classical computing – some of the early devices even look similar, with their tangles of wires reaching from floor to ceiling. But where classical computers were confined to academic labs and military facilities for decades and only really reached the masses with the rise of personal computing in the 1990s, quantum computers will be readily accessible to all – not in person, but via the cloud. That could mean a vast difference in the speed at which new applications can be developed, as programmers and even interested amateurs can access qubits to simply mess around with and try out ideas.

  Eventually, though, the end user of a quantum computer will probably be unaware that they’re actually using one. You’ll never have a quantum chip in your own device – instead, you’ll access its power via the cloud. Quantum processors of various types – superconducting, trapped-ion, simulated – will form part of an arsenal of technologies that are automatically selected. ‘Our vision is you and me having a problem and we just use the normal software we use, and then the cloud behind, which has access to all these kinds of computers and decides which one to run the problem on,’ says IBM’s Riel.

  Of course, it’s unlikely that the average person will ever need to directly interact with a quantum computer, in the same way that you don’t access the world’s fastest supercomputers to check email or do word-processing today. There’ll never be an iPhone Q with a quantum chip inside. But quantum computers could help find the battery materials that help that phone run for longer, and the optimum circuit design for maximum efficiency, and the best search algorithm for its web browser, and the quickest route for the drone to take when it delivers it to your door.

  Quantum advantage could be five years away, or five decades. There’s a danger of overhyping the achievements so far – it’s still possible that there is some fundamental barrier that will prevent the number of qubits being scaled up, or that the noise will simply become insurmountable above a certain level. Artur Ekert, whose talk in 1994 kickstarted the race to quantum supremacy and helped get the field to this point, thinks we’ll still need some major technological breakthrough akin to the development of the transistor, which transformed conventional computers from the 1960s onwards. With quantum, we are not in the days of rival chipmakers battling to produce the best possible hardware; we’re in the days of vacuum tubes and mechanical gates, and researchers wondering whether the thing they’re trying to do is even possible. In one sense, Ekert confides that he actually hopes it isn’t possible. ‘It would be an even more wonderful scenario if we cannot build a quantum computer for truly fundamental reasons – if actually it’s impossible, because of some new, truly fundamental laws of physics,’ he says.

  A practical, error-corrected quantum computer could change the world. It could revolutionise medicine, accelerate artificial intelligence and upend cryptography, by using the uncertainty of quantum physics to its advantage. But the battle to build one could reveal fundamental truths about the universe itself. ‘This is not a competition between companies,’ reflects the Google quantum researcher Yu Chen. ‘It’s our technology against nature.’

  Glossary

  Classical computer

  Almost every computer ever made – from wartime codebreakers to the phone in your pocket – essentially works in the same way, with millions of tiny switches called bits.

  Decoherence

  When a qubit falls out of the delicate state of superposition because of interference or noise from the environment, it is said to have decohered.

  Entanglement

  The way two particles can become linked, or ‘entangled’, so that anything you do to one happens to the other, no matter the distance separating them.

  Moore’s Law

  In 1965 Gordon Moore, the co-founder of Intel, predicted a doubling of the number of switches (known as transistors) able to fit on a chip every two years.

  NISQ

  An acronym for ‘noisy intermediate-scale quantum’ – a term coined by the physicist John Preskill – referring to an era where quantum computers exist, but they’re not yet robust enough to fulfil their full promise.

  Quantum advantage

  The term preferred by IBM and Microsoft, which refers to the point at which quantum computers can do useful things that wouldn’t be possible any other way.

  Quantum supremacy

  The term coined in 2012 by the physicist John Preskill to describe the point at which quantum computers can do things classical computers can’t (regardless of whether those things are useful).

  Qubit

  Instead of representing just 1 or 0, like a bit in an ordinary computer chip, a qubit – short for ‘quantum bit’ – can represent both at the same time.
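For the programmatically minded, this bookkeeping can be sketched as a pair of numbers called amplitudes, whose squares give the odds of measuring 0 or 1. A toy Python illustration, not how real hardware works (the equal-superposition values are just one example):

```python
import math
import random

# Toy bookkeeping for a single qubit: two amplitudes, one for 0 and one
# for 1. Squaring them gives the odds of each measurement outcome.
amp0, amp1 = 1 / math.sqrt(2), 1 / math.sqrt(2)  # an equal superposition

p0, p1 = abs(amp0) ** 2, abs(amp1) ** 2
assert abs(p0 + p1 - 1) < 1e-9  # the probabilities always sum to 1

# Measurement collapses the superposition to a definite 0 or 1.
outcome = 0 if random.random() < p0 else 1
print(round(p0, 3), round(p1, 3))  # -> 0.5 0.5
```

Measuring destroys the superposition: afterwards the qubit is definitely 0 or definitely 1, and the amplitudes are gone.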

  Superconducting qubit

  Google and IBM are building ‘superconducting qubits’, which rely on extreme cold and a ring of metal with a nanometre gap called a Josephson junction, which changes the way electrons behave. Microwave pulses are used to flip the qubit between states. IBM varies the frequency of these pulses to adjust for manufacturing variations, while Google uses a magnetic field to ‘tune’ the qubits.

  Superposition

  The state of being both 1 and 0 at the same time is called superposition – think of it as a flipped coin that hasn’t landed yet.

  Topological qubit

  Superconducting qubits only last for fractions of a second. At Microsoft, they’re working on ‘topological qubits’, which store the information in several places at once. This should make them last longer, and mean they’re more powerful – but they might be impossible to make.

  Variational quantum algorithm

  Variational quantum algorithms use a hybrid of quantum and classical computers to speed up calculations. Rather than trying to do a whole calculation using a quantum computer with limited qubits, variational quantum algorithms make a best guess at the solution with the resources available, and then hand over the result to a classical computer. Splitting the quantum processing over smaller, independent steps means you can run calculations with fewer, noisier qubits than would otherwise be required.
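The back-and-forth can be sketched in a few lines of Python. Everything here is a stand-in: the energy function plays the role of whatever a real quantum processor would estimate by repeated measurement, and the optimiser is the simplest imaginable.

```python
import math

# Toy variational loop. The energy() function stands in for whatever a
# real quantum processor would estimate by repeated measurement; here a
# one-qubit 'circuit' with one rotation angle theta gives cos(theta).
def energy(theta):
    return math.cos(theta)

# The classical half: the simplest possible optimiser nudges theta in
# whichever direction lowers the measured energy, refining the step
# size as it closes in on the minimum.
theta, step = 0.3, 0.1
for _ in range(200):
    if energy(theta + step) < energy(theta):
        theta += step
    elif energy(theta - step) < energy(theta):
        theta -= step
    else:
        step /= 2

print(round(energy(theta), 4))  # -> -1.0 (the minimum of cos)
print(round(theta, 2))          # -> 3.14 (pi)
```

A real variational algorithm has the same outline: the quantum side evaluates the energy for a given set of parameters, and a classical optimiser decides which parameters to try next.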

  Acknowledgements


  This book is intended to be a primer to quantum computing for people who don’t have a background in maths or physics. I’ve tried to avoid using mathematical formulae and complex diagrams as far as possible, and aimed to be clear and simple without oversimplifying. This was tricky – one of the first things an interviewee ever told me about quantum computers was that they ‘defy analogy’.

  So, although any errors and omissions are mine alone, I’ve relied throughout on the patience of experts. I’m particularly grateful to the people who gave up their time to be interviewed for the book, to set up visits to research labs, or to answer my questions over email over the course of several months. Thank you to all my interviewees, but particularly to Rob Young and Andrea Rocchetto for poring over the first draft, and to my editor, Nigel Wilcockson.

  Bibliography

  Brown, J., The Quest for the Quantum Computer (Touchstone, 2001)

  Dowling, J., Schrödinger’s Killer App: Race to Build the World’s First Quantum Computer (CRC Press, 2013)

  Gribbin, J., Computing with Quantum Cats: From Colossus to Qubits (Transworld Digital, 2013)

  Johnson, G., A Shortcut Through Time: The Path to a Quantum Computer (Vintage Digital, 2011)

  Notes

  Notes to Introduction pages 1–7

  1 https://www.discovermagazine.com/technology/the-best-computer-in-all-possible-worlds

 
