When early reports surfaced that Google had achieved quantum supremacy, they were met with skepticism. Even to those who were well versed in quantum computing, the news came as a surprise. To others, like me, the significance of the feat went over our heads, but let me assure you that this accomplishment is very important for the days to come.

##### What it means

Google demonstrated that its experimental quantum computer performed a specific task that could not realistically have been solved by classical computers. The experiment was essentially designed to prove that quantum computers can do some tasks (whether or not they are useful) that even the fastest supercomputers can't do, or would take centuries doing. This is what it means to establish *quantum supremacy*.

To do this, you need two things: capable quantum computing hardware and a problem of sufficiently large computational complexity. Many companies have been racing to this milestone.

With IBM's Q, Intel's Tangle Lake, and Google's Sycamore, there were increasingly competent quantum chips available to perform such a task. What is important to note is that these chips were improved incrementally; only after decades of research and development did these companies reach the stage where their processors had multiple qubits (quantum bits). The first processor to the task was Google's Sycamore, which had 54 qubits, of which 53 were functional. As of today, it is one of the most advanced quantum chips we have, capable of representing 2^53 computational states. (Google had also confirmed earlier reports of a 72-qubit processor when it named that chip Bristlecone.)
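To put 2^53 in perspective, here is a quick back-of-the-envelope calculation (the 16-bytes-per-amplitude figure assumes double-precision complex numbers, a common convention) showing why 53 qubits push classical simulation to its limits:

```python
# Each additional qubit doubles the number of basis states a quantum
# computer can hold in superposition.
n_qubits = 53
n_states = 2 ** n_qubits
print(f"{n_states:,} states")  # 9,007,199,254,740,992 (~9 quadrillion)

# Storing the full state vector classically needs one complex
# amplitude per state, 16 bytes each in double precision.
bytes_needed = n_states * 16
print(f"{bytes_needed / 1e15:.0f} petabytes")  # ~144 petabytes
```

That is more memory than any single machine has, which is why classical simulation of such circuits resorts to approximations or enormous distributed storage.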

Once it had the hardware, Google took the route of sampling the probability distributions of increasingly complex random quantum circuits to establish quantum supremacy. While there are other ways of establishing quantum supremacy, this is considered the best kind of experiment because it captures the exponential scaling at the heart of quantum computing's power. The process was to create a random quantum circuit, run it on the Sycamore processor, check whether a classical computer could keep up, and then add more complexity to the circuit until the classical computer couldn't.
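The sampling task described above can be sketched in miniature. The toy below is a hypothetical illustration, not Google's actual code: it uses NumPy and only 3 qubits instead of 53, builds a random circuit from Haar-random single-qubit gates and CZ entangling gates, then samples bitstrings from the resulting output distribution, which is the kind of task Sycamore performed:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3                      # toy qubit count (Sycamore used 53)
dim = 2 ** n
state = np.zeros(dim, dtype=complex)
state[0] = 1.0             # start in |000>

def random_unitary(rng):
    """Haar-random 2x2 unitary via QR decomposition."""
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(m)
    d = np.diagonal(r)
    return q * (d / np.abs(d))

def apply_1q(state, gate, q):
    """Apply a 2x2 gate to qubit q of the state vector."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [q]))
    return np.moveaxis(psi, 0, q).reshape(dim)

def apply_cz(state, q1, q2):
    """Flip the sign of amplitudes where both qubits are 1."""
    idx = np.arange(dim)
    both = ((idx >> (n - 1 - q1)) & 1) & ((idx >> (n - 1 - q2)) & 1)
    out = state.copy()
    out[both == 1] *= -1
    return out

# A few alternating layers of random rotations and entanglers.
for _ in range(4):
    for q in range(n):
        state = apply_1q(state, random_unitary(rng), q)
    for q in range(n - 1):
        state = apply_cz(state, q, q + 1)

# Sample bitstrings from the circuit's output distribution.
probs = np.abs(state) ** 2
probs /= probs.sum()       # guard against floating-point drift
samples = rng.choice(dim, size=5, p=probs)
print([format(s, f"0{n}b") for s in samples])
```

At 3 qubits the state vector has only 8 amplitudes and a laptop simulates it instantly; the whole point of the experiment is that this brute-force approach becomes infeasible as the qubit count and circuit depth grow.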

Their classical computer of choice was the Summit supercomputer at Oak Ridge National Laboratory. It eventually failed at a quantum circuit that took Google's chip just 200 seconds (3 min 20 sec), while it was estimated to take Summit about 10,000 years. The problem itself wasn't especially useful; it was an arbitrary one, chosen to emphasize the limitations of classical computing and the power of quantum computing.
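Taking the two reported figures at face value (and assuming a 365.25-day year), the claimed speedup works out to roughly a billion-fold:

```python
sycamore_seconds = 200                          # 3 min 20 sec on Sycamore
summit_seconds = 10_000 * 365.25 * 24 * 3600    # Google's 10,000-year estimate
speedup = summit_seconds / sycamore_seconds
print(f"{speedup:.2e}")  # ~1.58e+09, about 1.6 billion times faster
```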

Google confirmed the milestone on October 23, 2019, in a blog post and a research paper published in the journal Nature. Days later, IBM hit back with a blog post of its own, claiming that Google's declarations were misleading and that IBM's estimate for Summit (using a different method) to complete the same computation was actually just 2.5 days, not 10,000 years, which would reduce Google's claim of quantum supremacy to mere *quantum advantage*. While IBM's study hasn't been peer-reviewed, the work of both camps will be dissected in the years ahead. Nonetheless, what Google achieved is a milestone.

##### Why it’s important

This experiment is major for two reasons. First, even though it will take years (possibly decades) before we get quantum laptops and phones, classical computing will improve in the meantime: quantum computing research is likely to produce algorithmic breakthroughs that yield drastic improvements to classical computing.

More importantly, this was a sign that quantum computing isn't just some passing fad. The achievement shows just how far we have come and has given investors and VCs the confidence they need to back quantum computing. This experiment pierced through what was thought to be a tech bubble, one that many didn't think could be overcome in their lifetime. With this, Google has established quantum computing as a technology that has moved past books and discussions and into the real world.