Quantum Computing


by Amit Katwala


  That is impossible, but by collecting together trillions of these molecules in a liquid, the researchers were able to control them en masse, and create largely the same effect. They used a nuclear magnetic resonance machine (akin to the MRI scanner in a hospital) to monitor the liquid, while hitting it with controlled electromagnetic pulses that flipped certain molecules in the sample into the desired state. Even if only a small proportion actually entered this state, it didn’t matter, because their combined signal was still strong enough to stand out against the random background noise of the rest of the sample. ‘In effect, the “readout” from the computer averaged over all the molecules,’ explains John Gribbin in Computing with Quantum Cats,3 ‘with the huge number of right answers effectively swamping the much smaller number of errors introduced by decoherence and other difficulties.’

  But this approach has largely fallen by the wayside, along with others ranging from quantum dots to computers made of diamond. Ion traps, too, are largely out of favour, despite their early promise (although some companies, including IonQ, are still pursuing them). Today, therefore, the major players in the field have largely coalesced around one group of technologies – superconducting qubits.

  The cold path to supremacy

  Most PhD students are happy simply to finish their thesis. Brian Josephson’s final project helped him win a share of the Nobel Prize.

  In 1962, the 22-year-old Cambridge student was studying superconductivity – a phenomenon where the electrical resistance of some materials drops to zero when they’re cooled below a certain temperature. (Resistance measures how easily a current can flow through a material – copper has low resistance, rubber high; the higher the resistance, the more voltage needed to keep a current flowing.) In the course of his research, Josephson discovered that, in accordance with the laws of quantum physics, if you joined two superconductors together through a weak link of another material, you could create a current that flowed through them forever without any further voltage being applied. These structures, which are now known as Josephson junctions, have a wide range of applications in electronics and computing.
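  In equations – the two standard Josephson relations, which hold for any such junction – the current and voltage across the weak link are

\[ I = I_c \sin\varphi, \qquad V = \frac{\hbar}{2e}\,\frac{d\varphi}{dt}, \]

where $I_c$ is the junction’s maximum ‘critical’ current and $\varphi$ is the quantum phase difference between the two superconductors. The first relation says a current can flow with no voltage at all; the second says a voltage only appears while the phase is changing.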

  Although they are quantum devices, you don’t need a microscope to see them – some Josephson junctions are as big as a wedding ring, but they’re now made small enough to fit on a silicon chip. They also have another useful property called non-linearity, which enables them to be restricted to just two energy states – representing 1 and 0 – regardless of how much energy is put into them. They can, writes Gribbin, ‘be used as fast, ultra-sensitive switches which can be turned on and off using light’.4
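  Why non-linearity matters deserves a line of explanation (a standard circuit-physics argument, not anything specific to Google or IBM’s designs). A purely linear superconducting circuit behaves as a harmonic oscillator, whose energy levels are all equally spaced, so a pulse tuned to the 0→1 transition would also drive the 1→2 transition and beyond. The Josephson junction makes the circuit anharmonic:

\[ E_1 - E_0 \neq E_2 - E_1, \]

so a drive at the lowest transition frequency addresses only the bottom two levels – an effective two-state system, which is exactly what a qubit needs.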

  Superconducting qubits, composed of these Josephson junctions, offer a technology that promises to be easier to scale and miniaturise than ion traps, as it meshes more neatly with the silicon-based architecture inside almost every classical computer on the planet. What started out as a hedge position at Google, which had initially focused its efforts on other approaches, is now the frontrunner for both the search giant and IBM, its major quantum rival.

  ‘Every approach has positive and negative aspects,’ says Sergio Boixo, who works on quantum theory at Google, and who designed the task that proved quantum supremacy. ‘This approach – superconducting qubits – has always been looked at as being the closest analogue to the classical integrated circuit that powers our lives. Once we get past certain shortcomings that come along with this package, we can scale up just like classical computing. We’re going to get all of those benefits and we just have to overcome the negatives.’ ‘Superconducting qubits are big enough that you can control them,’ adds Megrant. ‘Other systems have lower intrinsic errors, but if they’re too small you can’t get the circuitry to control them properly.’

  Both Google and IBM use microwave pulses to control their qubits, and alter their relative probability of being in the 0 or 1 state. But their approaches do differ slightly. ‘Tiny fabrication defects mean that no two qubits respond to pulses of exactly the same frequency,’ explains Gideon Lichfield in an article published in MIT Technology Review.5 ‘There are two solutions to this: vary the frequency of the pulses to find each qubit’s sweet spot, like jiggling a badly cut key in a lock until it opens; or use magnetic fields to “tune” each qubit to the right frequency.’

  Google uses the second approach – and by passing a current through the system researchers can modify the thresholds for each state and the strengths of the connection between qubits, enabling them to become entangled. Its qubits are faster and more precise, but perhaps not as reliable in the long term as IBM’s simpler, more stable approach of varying the pulse frequency.
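  As a loose illustration of what such a control pulse does – a toy model of resonant ‘Rabi’ driving, in Python, with an invented 50 MHz drive strength rather than either company’s real calibration values – consider:

import numpy as np

# Toy model: a resonant microwave drive rotates a qubit between |0> and |1>.
# The Rabi frequency below is an assumed, illustrative number.
omega = 2 * np.pi * 50e6   # assumed Rabi frequency: 50 MHz

def p_one(t):
    """Probability of measuring 1 after driving a qubit (initially 0) for time t."""
    return np.sin(omega * t / 2) ** 2

t_pi = np.pi / omega          # a 'pi pulse' flips the qubit completely
print(p_one(t_pi))            # ~1.0: qubit now in state 1
print(p_one(t_pi / 2))        # 0.5: an equal superposition of 0 and 1

Driving for half a pi pulse leaves the qubit in an equal superposition; this is the sense in which pulse length and frequency alter the relative probability of reading 0 or 1.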

  Solving this technical challenge has proved to be only half the problem. The phenomena that superconducting qubits rely on only emerge at incredibly low temperatures, which is just one of the reasons they are so difficult to get right. ‘We’ve front-loaded most of our problems in this field,’ Megrant says. ‘All of the problems show up on day one when you try to make a single qubit. You have to work very hard to have it perform well.’

  Google’s 53-qubit Sycamore chip sits at the bottom of one of the huge cryostats in the lab, where its temperature is cryogenically maintained. Like its predecessor Bristlecone, Sycamore was manufactured at the University of California, Santa Barbara, sandwiched together like an Oreo to create the fragile Josephson junction. Under the microscope, each qubit looks like a tiny, silvery plus sign. Thin lines lead out to the edge of the chip: eventually, they connect up to the tangle of blue wires that carry and amplify the faint signal from the qubit to one of the racks of machines surrounding each cryostat.

  It takes up to two weeks to wire up one of the machines. To increase the number of qubits, Google will need to find a new wiring solution that takes up less space, or find a way of controlling the qubit from inside the cryostat. ‘A lot of things will just break if you try to cool down to 10 millikelvin,’ says Megrant. Both Microsoft and Google are now working on building classical chips that can operate at lower temperatures in order to control the qubits without adding interference.

  Harry Potter and the missing Majorana

  Over the last ten years, there has been an escalating race in the number of qubits being claimed by different companies. In 2016, Google simulated a hydrogen molecule with a 9-qubit quantum computer. In 2017, Intel reached 17 qubits, and IBM built a 50-qubit chip that could maintain its quantum state for 50 microseconds. In 2018, Google unveiled Bristlecone, its 72-qubit processor, and, in 2019, IBM launched its first commercial quantum computer – the 20-qubit IBM Q System One, to great media fanfare.

  D-Wave, a Canada-based company founded in the late 1990s, has always been an outlier. It has been selling commercial quantum computers since 2011, and claims to have several thousand ‘annealing qubits’ in its devices, but these are based on a different technology that’s only useful for certain types of problems. IonQ’s Peter Chapman likens it to the difference between a graphics calculator and a computer.

  Rob Young, director of Lancaster University’s Quantum Technology Centre, accuses some companies of a lack of credibility in the way they’ve announced these developments. ‘There’s a real question of how you translate your findings without being over-sensational,’ he says.

  It’s becoming clear that the number of qubits isn’t nearly as important as what Heike Riel, head of the science and technology department at IBM Research Europe, calls ‘quantum volume’ – how much useful computation you can do before your qubits decohere. Quantum volume is a combination of the number of qubits, the way those qubits are wired up, and how accurate and reliable the qubits are. ‘The number of qubits is of course important, but it’s not everything,’ Riel says. ‘Quantum volume tells you how much useful computation you can do with a device before the error will mask your result.’
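  IBM has published a precise version of this measure; simplified slightly, it reads as follows. For a device with $n$ qubits, let $d(m)$ be the depth of the largest random ‘square’ circuit that the best-performing $m$ qubits can run reliably. Then the quantum volume $V_Q$ is defined by

\[ \log_2 V_Q = \max_{m \le n} \, \min\bigl(m, \, d(m)\bigr), \]

so a machine scores well only if it can run circuits that are both wide and deep: lots of qubits with shallow reliable depth, or deep circuits on only a handful of qubits, both yield a low volume.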

  Even with all the technology Google employs to shield its qubits from interference, the error rate is still astonishingly high. Qubits routinely flip into the wrong state, or decohere before they’re supposed to. It’s possible to correct for those errors, but to do it you need more qubits – and more qubits to correct for those qubits.
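  The simplest way to see how redundancy buys reliability is a classical caricature of the idea – a three-bit repetition code with majority voting, sketched in Python below (the 5 per cent error rate is invented for illustration; real quantum codes such as the surface code are far more involved, since they must also handle phase errors):

import random

def encode(bit):
    # spread one logical bit across three physical bits
    return [bit, bit, bit]

def noisy(bits, p=0.05):
    # each physical bit flips independently with probability p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # majority vote recovers the logical bit if at most one flip occurred
    return int(sum(bits) >= 2)

trials = 100_000
errors = sum(decode(noisy(encode(0))) != 0 for _ in range(trials))
print(errors / trials)   # roughly 0.7% logical errors from 5% physical errors

Three physical bits turn a 5 per cent error rate into roughly 0.7 per cent – but protecting against more errors, and against the subtler failure modes of real qubits, demands ever more redundancy. Hence the qubits-to-correct-the-qubits spiral.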

  There’s also an important distinction to be drawn between the number of physical qubits on a quantum chip, and the number of logical qubits that those physical qubits enable you to operate with. Google’s Sycamore chip may have 53 physical qubits, for instance, but because of the need to use a number of those qubits for error correction, it’s not technically equivalent to a 53-qubit quantum computer. ‘If you want hundreds of logical qubits you need tens of thousands of physical qubits,’ says Peter Shor, a hugely influential quantum theorist, ‘and we’re very far away from that.’
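  To put rough numbers on Shor’s estimate (a back-of-the-envelope figure using standard surface-code accounting, not a statement about any specific machine): a logical qubit protected by a surface code of distance $d$ needs on the order of $2d^2$ physical qubits. At a commonly quoted target of $d \approx 17$, that is roughly $2 \times 17^2 \approx 580$ physical qubits per logical qubit – so a few hundred logical qubits would already demand tens of thousands of physical ones. The exact distance required depends on how error-prone the physical qubits are.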

  With current error rates, you would need thousands or millions of qubits to run algorithms that might be useful in the real world. That’s why John Preskill, the physicist who coined the term ‘quantum supremacy’, has dubbed this era ‘noisy intermediate-scale quantum’ (NISQ), in recognition of the fact that we’re a long way off practical devices.

  It’s also why Microsoft is convinced that superconducting qubits are a dead end. ‘We do not see a line of sight there to commercial-scale quantum computers that could solve today’s unsolvable problems,’ says Chetan Nayak. Instead, at Microsoft’s sprawling headquarters in Redmond, Washington (so big that the quickest way between meetings is by Uber), researchers are testing a cryostat that looks very similar to Google’s, but which will – if things go to plan – host a very different type of quantum processor. If Google’s ascent up the quantum mountain is steep, Microsoft’s is potentially impossible. Instead of superconducting qubits, it’s trying to harness a different type of qubit known as a ‘topological qubit’. The only problem is that topological qubits may not actually exist. ‘Maybe we’re on a marathon instead of a sprint,’ says Krysta Svore, Microsoft’s general manager for quantum software.

  Topological qubits, if they can be created, offer a more robust alternative to superconducting qubits: they would be harder to knock out of superposition, so you’d need roughly a tenth as many of them. They’re based on a theoretical particle called a Majorana particle, which allows the state of the qubit to be encoded in several places at once. Nayak explains it using a Harry Potter analogy. ‘The main villain of the story, Voldemort, splits his soul into seven pieces called Horcruxes, and spreads out those Horcruxes so he can’t be killed,’ he says. ‘What we’re doing with our topological qubit is spreading our qubit out over six Majoranas. Those are our Horcruxes. By doing something to just one or another of them locally, you actually can’t kill off Voldemort. Our qubit is still going to be there.’

  The only problem is that scientists still aren’t entirely sure that Majorana particles actually exist. They’ve been theorised about since the 1930s, but the experimental evidence isn’t watertight. Still, speaking in January 2020, Nayak and Svore were confident. ‘We’re not hunting in the dark for this and hoping to find this,’ said Nayak. ‘We’re being guided by simulations.’

  Although superconducting qubits seem to have the upper hand at present, it’s still unclear which technology will support the quantum computers of the future. ‘We’re not in the days of AMD vs Intel. We’re in the days of vacuum tubes versus little mechanical gates,’ says Whurley. Different technologies will have breakthroughs, while others stagnate, and it could be decades before a clear path emerges. ‘They’ll go back and forth, and maybe eventually at some point down the line we’ll have our dream of a millions-of-qubits machine and a general-purpose quantum computer.’

  As a case in point, recent developments in trapped-ion computing could give it the edge for developing quantum devices with thousands or millions of qubits. In June 2020, Universal Quantum – a spin-out from the University of Sussex – announced that it had picked up £3.6 million of funding for a new form of trapped-ion computing. Its approach seems to combine the best bits of the superconducting approach used by Google and IBM with the ion trap method used by IonQ.

  Winfried Hensinger, co-founder and chief scientist at Universal Quantum, points out that the big limiting factor for Google’s device isn’t the quantum chip itself, but the cooling system, which would need to grow bigger and bigger to chill down a larger area as the number of qubits increased. Because Universal Quantum uses ion trap technology, such extreme cooling isn’t necessary – and it has found a clever way of getting around the problem of scalability. Instead of using laser beams – to support millions of qubits in a full-scale quantum device would require millions of them – it’s using electric fields to control the qubit. Microwaves are used to move the qubit between energy states – but, rather than trying to accurately hit individual qubits with microwave pulses, UQ’s technology instead uses electric fields to nudge the qubits into a state where they’re receptive to global microwave pulses. It’s not dissimilar to the way Google uses microwave pulses to tune its superconducting qubits. The company is working on creating modules that can be connected together to scale up quickly: to send messages from module to module, an ion is coerced into jumping across a tiny gap between modules, like neurotransmitters jumping between synapses in the brain.

  Whatever the way forward, the last 40 years of theorising and 25 years of cutting-edge hardware development have brought the field to a crucially important point. Quantum supremacy means that the algorithms being developed can be tested and improved – and that quantum computing can begin to have a small real-world impact on everything from medicine to traffic control while we wait for the hardware to catch up. ‘Quantum supremacy is a signal for us to say we’ve entered the era of NISQ,’ says Tony Megrant. ‘Now you have this large-enough system, and instead of using your laptop you can just go and play.’

  3

  Exponential power

  In May 2000, the Clay Mathematics Institute in Peterborough, New Hampshire, established a set of seven seemingly intractable mathematical problems, and offered a $1 million prize for anyone who could solve them. To date, only one of these Millennium Prize Problems – the Poincaré conjecture – has been successfully cracked, by the elusive Russian mathematician Grigori Perelman, who declined the prize money and then retired from the field for good.

  Arguably the most important of all of the challenges set by the institute is the P versus NP problem. P refers to all the mathematical problems which can be easily solved by computers in ‘polynomial time’ – a slightly tricky concept for non-mathematicians, but which basically means anything that a supercomputer like Summit could handle in a reasonable amount of time. NP problems, on the other hand, are ones where nobody knows how to find a solution in a reasonable amount of time, but where it’s easy to check whether you’ve got the right solution if you’re presented with a potential answer.

  A classic example of an NP problem is factoring – breaking down a large number into its prime factors. So, for instance, the prime factors of 35 are 7 and 5 – two numbers that you can multiply together to make 35, but which aren’t divisible by anything except 1 and themselves. Computationally, figuring out prime factors is quite difficult, because no real patterns seem to emerge, so you have to rule out all the multiples of 2, then all the multiples of 3, and so on until you’re left with only the primes. For small numbers, that’s fine, but for larger ones the challenge grows exponentially with the number of digits. It’s always easy to check that you’ve got the right answer, though – simply multiply the primes together and see if the result matches your original number.
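  That brute-force procedure is easy to write down – the sketch below, in Python, is plain trial division (real factoring efforts use far cleverer methods, but the exponential flavour is the same):

def prime_factors(n):
    """Factor n by trial division: strip out 2s, then 3s, and so on.
    Fine for small n; the work blows up as n gains digits."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # whatever remains is itself prime
    return factors

print(prime_factors(35))    # [5, 7]
assert 5 * 7 == 35          # and checking the answer is trivial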

  The debate mathematicians have been having for years is whether P equals NP, or not. The $1 million prize is for someone who can definitively prove that every problem that’s easy to verify is easy to solve, or the opposite. If P does equal NP, then it means that a raft of seemingly impossible problems are solvable by computers, but we just haven’t found the right algorithms yet. If P doesn’t equal NP, then it means that there are certain computational challenges that may always remain beyond our reach. Part of the allure of quantum computing is that it drags some NP problems closer to being solvable in a reasonable amount of time, provided that we can find the right kind of algorithm.

  One of the key misunderstandings about quantum computers is that people think of them as simply a faster type of supercomputer – a new technology that we can apply to all the problems we have today and get answers thousands or millions of times faster. Sadly, that isn’t the case. ‘The types of problem where quantum computers can have an advantage are NP problems that have more “structure” than a general NP problem,’ says Andrea Rocchetto, a quantum computing researcher at the University of Texas at Austin: ‘something that you can exploit to reduce the number of computational steps you need to solve the problem.’

  Factoring, Rocchetto says, sits on the boundary between problems that have so much structure that it’s easy to find an efficient solution (P problems) and problems where there’s hardly any structure for even a quantum computer to get its teeth into. ‘Quantum computers need structure,’ he says. ‘Without structure there is no magic – it’s not some kind of black box that, whatever problem you put into it, it will come out with an answer.’ The onus is on mathematicians and computer scientists to find clever algorithms that can detect and exploit the structure of NP problems – to bring them within the reach of quantum computers. In 1994, a researcher at Bell Labs did just that – and his discovery could have huge implications for quantum computing, cryptography and much, much more.

  The Q factor

  Peter Shor is a mathematician, an amateur poet and the creator of one of the most influential algorithms in history. In 1994, Shor was working at Bell Labs – the famed research and development wing of the telecoms company – where he attended a couple of talks on quantum encryption. He had studied quantum physics at university, and was intrigued by the potential applications of quantum computing, which – as described earlier – arise from the curious property of quantum interference, where the different potential paths that a photon could take interfere with one another to create the end result. Although quantum computers were still years away from becoming a physical reality, Shor and others started thinking about the algorithms that would run on them, and what they could do.

 
