Quantum Progress - One Quanta at a Time
Quantum computing is universally touted as an immense shift in our approach to solving big problems, and yet its power derives from the accumulated effects of the behavior of infinitesimally small particles. We need to keep this in mind when we look at how to wield this capability, and the key is understanding how to bring big and small together.
We are in the same place as NASA was with its IBM 7090 in the 1960s—big machines that could do a lot, but just couldn’t do enough to really solve the problems worth solving without a little help. We are in the same place as my little Hewlett-Packard Pack Mate 386 computer—it needed that x87 math coprocessor for me to be able to play Falcon 3.0. We are in the same place even now, where my cluster of AWS servers needs GPUs in order to train that pathology image classifier or NLP Transformer model at scale. We need to make sure that we think about quantum computers in the same way.
It is not the quantum computer that will solve the problem for us—it is how we use the quantum computer as part of larger systems to build solutions that solve problems. To that end, the road from here to that day must be paved one step at a time, building frameworks and foundations. The power of our current computing paradigm comes not from the amazing power of a single processor. It comes from the swarm of interconnected scaffolds of operating systems, networks, libraries, languages, algorithms, data stores—all manner of necessities layered upon each other in a digital web of life. Like caretaker bees, we buzz around collecting data like nectar and turning it into waxy cells of storage, and the sweet honey of knowledge, in a great beehive of endless ones and zeroes.
But how do we know where to fit quantum computing into the blueprint for our computational hives? How do we recognize the wiggling pattern of the worker bees that signals where to find that new source—that problem calling out for quantum capability like an untapped flower? How do we measure the cost and risk of setting out for that promised flower to justify the decision to make the journey?
Lucas Siow of biotech company ProteinQure lays out a very useful model in his Medium article “Quantum Value: A new hope,” differentiating the concept of quantum value from quantum advantage and quantum supremacy. Quantum value can be amassed by finding smaller problems and solving them repeatedly and incrementally, in the context of larger systems, at grand scale. Look for problems where the probabilistic nature of quantum computing can help focus your parallel computation efforts. Look for constraint satisfaction problems whose solutions can be improved simply by shifting the field of error—examining where quantum error overlaps with the errors of classical methods, and where it doesn’t. Simply being able to provide a faster or better optimizer or probability model may be enough to yield that value.
So how do we identify the path forward? We look for opportunities to build the hives, learn from the worker bees, and make it easier to turn quantum value into business value. We seek out the types of problems and patterns where quantum computing allows us to turn quantum value into business advantage. And, if you’re a worker bee like me, keep plugging along, looking for that chance to connect your hive to that shiny new coprocessor.