IBM Announces an Operational 50-Qubit Quantum Processor
It’s gotta be a great day at IBM Q when you can make an announcement so newsworthy that you can get away with telling the MIT Technology Review you’re “really proud of this” and that “it’s a big frickin’ deal.” That’s exactly what Dario Gil, IBM’s Director for AI and quantum computing, got to say last Friday. Why the bravado? IBM has built and demonstrated an operational 50-qubit quantum computer. “Big deal” is an understatement.
A quantum computer that big brings “Big Blue” within striking distance of today’s supercomputers. It’s a major milestone in quantum computing, one that has IBM knocking on the door of quantum supremacy. Previous prototypes had too few qubits, and too high an error rate, to best the performance of a classical supercomputer. The target is a moving one, too: IBM itself raised the bar this year by simulating a 56-qubit quantum computer on a traditional machine.
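To see why ~50 qubits is the rough crossover point, consider the brute-force cost of classical simulation: a full statevector for n qubits holds 2^n complex amplitudes, so memory doubles with every added qubit. A minimal back-of-the-envelope sketch (assuming naive full-statevector simulation with 16-byte complex numbers; IBM's 56-qubit simulation used cleverer techniques that avoid storing the whole vector):

```python
# Rough memory cost of naively simulating n qubits classically:
# a full statevector needs 2**n complex amplitudes, each 16 bytes
# (complex128). This is an illustrative estimate, not IBM's method.
def statevector_bytes(n_qubits: int) -> int:
    """Bytes needed to store all 2**n amplitudes of an n-qubit state."""
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    pib = statevector_bytes(n) / 1024 ** 5  # convert bytes to PiB
    print(f"{n} qubits: {pib:.5f} PiB")

# 50 qubits works out to 16 PiB of RAM just to hold the state --
# far beyond any single machine, which is why each extra qubit matters.
```

Note the doubling: going from 50 to 56 qubits multiplies that figure by 64, which is why IBM's 56-qubit classical simulation relied on smarter bookkeeping than storing the full state.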
Do You Want Fries with That? Or Maybe Cloud Access to One of Our Quantum Computers?
All super awesome. But what really caught my attention was another little nugget. IBM also announced they will allow paying customers to access a new 20-qubit quantum computer by the end of 2017. This service will be offered via IBM’s existing cloud computing platform. Take a breath and put that into perspective. Can IBM truly give customers access to a functional 20-qubit quantum computer in the next 48 days? That’s either a pie-in-the-sky timeline, or we have grossly underestimated how far ahead IBM is with its quantum computing projects.
We won’t have to wait long to find out. I know I’ll be watching. This definitely makes IBM the most exciting player in the game for the next month and a half. For perspective: on November 10, 1917, police arrested 41 suffragists picketing in front of the White House for women’s right to vote, while on the other side of the world, Vladimir Lenin’s fledgling government was suspending freedom of the press in the wake of the October Revolution. Who knows? Maybe 100 years from now we’ll look back on this announcement the way we do some of the early events in the development of the modern computer. How cool is that?