Quantum Computing - Industry and Tech Brief

The IBM Q System One

Introduction


Quantum computing (hereafter QC) has emerged over the past few decades as a challenge to conventional computing frameworks. Given its promising use cases and novel approach to computation, there is little doubt about its future utility across many industries. However, the technology is still in its infancy in terms of development, adoption, and viability, and many discoveries remain to be made in this space. Let's dive into a little of what we know.

Industry Scope and Brief History


The term quantum computing sounds like something out of a science-fiction novel. It could very well be, but the scientific basis is fascinating and has great practical potential. This subfield of computer science is grounded in quantum theory: according to the Oxford Dictionary, QC makes use of the "quantum states of subatomic particles to store and process information." The technology applies quantum mechanics to solve highly complex, multi-dimensional problems far faster, and through an entirely different means, than standard computing. This is where the real value proposition lies compared with traditional computational methods, which can take a significant amount of time to process certain forms of information.

The history is quite recent. The first mentions of a quantum mechanical model of computation date back to the early 1980s, when Paul Benioff presented his formative research on the subject at the first Conference on the Physics of Computation at MIT, where Richard Feynman also argued for simulating physics with quantum systems.

Within a few years, many followed in their footsteps to explore other aspects of this new school of physics and technology, which brings us to 2011, the year D-Wave Systems sold the first commercial 128-qubit quantum computer for a whopping 10 million USD. Later, in 2019, Google laid claim to having achieved quantum supremacy, when its quantum processor solved a sampling problem in a tiny fraction of the time a traditional supercomputer would need (200 seconds vs. an estimated 10,000 years!).

Since then, traditional computing has made resounding feats of its own, particularly in the AI space, which has shifted some momentum back toward the classical approach. In tandem, though, the theory and frameworks of QC have made many strides as well. But there is still a long way to go.

This is because fidelity of performance is still quite low and only a handful of options and services are available for purchase. Major corporations and startups are investing significant capital in development, even though overall startup investment has declined over the last two years from a peak of 2.35 billion USD in 2022, a fall likely due to the prominence of generative AI and the fact that the technology is still in its exploration phase. Lawmakers also seem to think the technology holds significant promise to society, judging by public investments from administrations in China, Germany, and the U.K.

The Technology


Theory:

I really liked the explainer provided by the Caltech Science Exchange on the different aspects of the quantum computer. Key terms to keep in mind to help with understanding QC include the qubit, superposition, and entanglement.

To explain succinctly: typical computers use bits to encode information in binary (0s and 1s). A bit cannot be 0 and 1 at once; it is one or the other, so its output is deterministic. QCs differ in that their bits (qubits, or quantum bits) can exist in a superposition, effectively 0 and 1 simultaneously, until measured. Without getting too deep into the details, the result is a probabilistic output: on measurement, the qubit collapses from its superposition to either 0 or 1. Qubits can also be entangled with one another, so that measurements of entangled qubits remain correlated even across vast distances. This allows scientists to explore many outcomes simultaneously rather than one by one, which is key to solving problems that demand high computational power and time.
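To make the measurement idea concrete, here is a minimal, purely classical simulation sketch (my own illustration, not from any vendor's toolkit): it represents a qubit as a pair of amplitudes, applies a Hadamard gate to create an equal superposition, and samples measurement outcomes with probability equal to the squared amplitude. The variable names are all hypothetical.

```python
import math
import random

random.seed(0)

# A qubit's state is a pair of amplitudes (a0, a1) for |0> and |1>;
# measuring yields 0 with probability |a0|^2 and 1 with probability |a1|^2.
ket0 = (1.0, 0.0)

def hadamard(state):
    """Hadamard gate: sends |0> to the equal superposition (|0> + |1>)/sqrt(2)."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

def measure(amplitudes, n_shots):
    """'Collapse' the state: sample basis states with probability |amplitude|^2."""
    probs = [abs(a) ** 2 for a in amplitudes]
    return random.choices(range(len(amplitudes)), weights=probs, k=n_shots)

superposed = hadamard(ket0)
shots = measure(superposed, 10_000)
print("P(0) ~", shots.count(0) / len(shots))  # roughly 0.5
print("P(1) ~", shots.count(1) / len(shots))  # roughly 0.5

# Entanglement: the Bell state (|00> + |11>)/sqrt(2) over two qubits.
s = 1 / math.sqrt(2)
bell = (s, 0.0, 0.0, s)  # amplitudes for the basis states 00, 01, 10, 11
pairs = measure(bell, 10_000)
# The qubits are perfectly correlated: only 00 and 11 ever appear.
print("outcomes seen:", sorted(set(pairs)))  # [0, 3], i.e. 00 and 11
```

Of course, a classical program can only simulate these statistics; the point of real quantum hardware is that the state space grows exponentially with qubit count, which classical simulation cannot keep up with.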

Framework of Technology:

There are some different technology types being used to develop the QC framework. I have highlighted a few here:

Superconducting: circuits made of superconducting materials, cooled to very low temperatures to keep them from getting too "excited." The technology is borrowed from a developed, mature industry, but it's hard to maintain such ultra-low temperatures and keep the qubits stable.

Trapped ions: as the name suggests, ions are caged in an electromagnetic field and manipulated using laser beams. Coherence/stability times are long and the quantum gates are accurate, but gate operations are slower and the setups require complex experimentation.

Photonic: information is encoded in photons and manipulated with optical devices. No cryogenic cooling is needed, and existing optical communication technology integrates easily, but photons are prone to loss and cannot always be detected when measured.

Some others of particular interest are topological qubits, neutral atoms, and quantum dots. Each faces its own challenges but offers its own advantages to the development of the technology.

Going Forward:

When speaking of qubits, discoveries and advancements in previous years have mainly favored quantity over quality. Newer, long-term objectives point the other way, notably in the realm of error tolerance. QCs are very sensitive to disruption and noise, which can render error rates substantially higher than those of their classical counterparts. A subset of development in this space is dedicated to error correction: detecting an error in the transmission and/or storage of data and correcting it faster than errors accumulate. IBM highlights some of the work being done in the quantum error correction space.
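The core intuition behind error correction can be shown with its classical ancestor, the three-bit repetition code: encode one logical bit redundantly, let noise flip physical bits, and recover by majority vote. This is only an analogy I'm sketching here; real quantum codes must also correct phase errors and cannot simply copy qubits (the no-cloning theorem), but the redundancy-plus-voting idea carries over. All names below are my own.

```python
import random

random.seed(1)

def encode(bit):
    """Repetition code: store the logical bit in three physical bits."""
    return [bit, bit, bit]

def noisy_channel(bits, p_flip):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    """Majority vote: correct as long as at most one bit flipped."""
    return int(sum(bits) >= 2)

p = 0.05           # physical error rate per bit
trials = 100_000
logical_errors = sum(
    decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials)
)
# Without the code, the error rate would be p = 5%. With it, the logical
# bit is lost only when two or more bits flip: about 3p^2 = 0.75%.
print(f"logical error rate: {logical_errors / trials:.4f}")
```

The payoff is that the logical error rate falls quadratically in the physical rate, which is exactly why error-correction research aims to push physical qubit fidelity past the threshold where adding more qubits helps rather than hurts.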

Other goals include defining practical applications, which is a sure key to adoption, as well as finding the right talent to aid deployment. As companies look to tackle much more difficult issues in manufacturing, production, and optimization, training and coaching from universities and within organizations will likely follow of their own accord.

Market Overview


Sizing: valuators are hesitant to give any current market-size estimate at this stage of progression. The market is barely a market, as much of the value QC currently generates is, in its most philosophical form, intrinsic. As for projections, McKinsey Digital's research unit estimates a potential market size of 173 billion USD by 2040, composed of three submarkets: quantum computing, communication, and sensing. Quantum computing takes the largest share of the three.

Industry impact: use cases have been explored in several industries, including pharmaceuticals, chemicals, automotive, and financial services, as well as logistics. But given the very limited number of successful products in market and the relative novelty of the technology, widespread adoption remains many years away. Another common challenge is that traditional everyday applications would be limited, which will likely leave the potential user base much smaller than in the traditional computing domain. However, the complex scope of problems QC can solve will likely make the technology attractive for certain high-profit or high-profile projects.

For anyone considering opening a position in this sector, it should be for the long term. This can actually be a very attractive investment for those willing to wait it out. As an added benefit, some options are not horribly expensive (compared to other tech stocks) for securing a position.

Major Players and Adopters:

The leaders in this space have been tackling some problems by working with special-purpose groups to advance their research. I have highlighted a few, but there are many more in the mix:

IBM: a heavy investor in the space, having launched its Quantum System Two computer in 2023 along with Heron, a 133-qubit processor, and Condor, a 1,121-qubit processor. IBM is also the developer of Qiskit, a widely used open-source software stack for quantum computing.

Google Quantum AI: built Sycamore, the quantum processor used to claim "quantum supremacy," and develops Cirq, its open-source framework for writing quantum algorithms.

IonQ: worth mentioning as one of the few pure-play QC companies, and a relatively young one. It works primarily on trapped-ion technology.

FormFactor: offers products and services for superconducting computers, allowing systems to function at the ultra-low temperatures necessary for proper operation.

Quantinuum: a merger between Cambridge Quantum and Honeywell (54% ownership). In April 2024, they announced that their trapped-ion computer had reached the highest published quantum volume, as well as a two-qubit gate fidelity of 99.914%.

Many other startups and companies are in the mix; these are just the ones I would watch given their degree of investment. Let me know if you think others are worth mentioning.

Certain companies have also started to dabble in product development using QC. This article highlights some of the major players and their projects. I find that materials manufacturing might be one of the more prominent development chains to watch, as R&D cycles for industrial- and commercial-scale products are very long and very expensive, and could very well be trimmed down with more powerful simulations. Note that they give the example of BMW's metal fabrication as one industry exercise.

Conclusions


There are probably many forms of good to come from further development in this space. Considering the immense value that computationally heavy solutions (AI and ML) have delivered across many industries over the years, there is clearly something to be said for being able to process information much faster and more efficiently. QC could be a game changer for just that.

This comes with certain challenges in the technology, such as:

  • Questions of fidelity, in particular due to the fragility of qubits and how susceptible they are to error due to noise and disruption;
  • The necessary resources are quite sophisticated as well as expensive; providing the correct working environment (potentially ultra-cold or ultra-precise) is an important aspect; and
  • Workforce training to develop programs and hardware at this level of complexity.

But just imagine the types of opportunities that could arise in the form of:

  • Expansive scientific discovery in fractions of the time, where solutions and simulations that would typically take years can be produced within seconds to minutes;
  • Manufacturing optimization and production, of particular importance when we think of costly development cycles associated with certain industry sectors; and
  • Other forms of fundamental research, where quantum computing can actually help us understand quantum mechanics itself, and therefore some of the principles of physics that govern the world around us.

Thanks for the read! Open to discussion through LinkedIn or avedisenergy@gmail.com.