Highlights

The emergence of fault-tolerant quantum computing, able to detect and correct quantum errors in real time, is only a few years away. Yet significant gaps in industry knowledge and system design stand between today's data center blueprints and quantum computing integration.

For the foreseeable future, key differences inherent to quantum modality types will necessitate uniquely customized quantum data center environments.

Quantum system deployments remain primarily lab-based, although a shift is taking place as hyperscalers, telcos and governments begin to acquire and prioritize quantum computing infrastructure, with quantum hubs emerging in high-value, high-expertise locales. 

Quantum is no longer confined to the realm of science fiction. There has been a surge in quantum computing power and reliability in the last few years, with intermediate-scale systems already in use and fault-tolerant, universal machines anticipated as early as 2028. While the scientific progress behind these advancements is laudable, the next steps of the quantum journey will require a focus on engineering and product expertise. Quantum computers remain bulky, finicky and non-standard — a challenge as they look to evolve out of laboratory contexts and into the existing data center market. Quantum architectures that play nicely with classical compute infrastructure will have an advantage in quantum computing data center deployment — and may even dictate the future trajectory of quantum technology. 

Quantum computing moves out of the lab and into the boardroom

Quantum technology is based on the principles of quantum mechanics: the unique laws of physics that govern subatomic particles and describe the often nonintuitive behaviors that emerge at ultra-small scales. The intricacies and abnormalities of quantum mechanics have fascinated scientists and writers for decades, while the potential for harnessing quantum properties for real-world use cases has tantalized business leaders for nearly as long.

Quantum computing — the application of quantum mechanical principles to process information — was initially conceptualized in the 1980s. After various stages of evolution, intermediate-scale quantum systems are now available for purchase and in use globally, with a projected 2025 market revenue of $2.5 billion. So far in 2025, global investment in quantum technology has surpassed $55 billion, and many quantum vendors plan to release fault-tolerant quantum systems (high-powered, commercially minded computers designed to run at scale) between 2028 and 2030.

While the technology is still evolving, many enterprises are already anticipating how quantum computing might lead to change. In a recent S&P survey of enterprise leaders, respondents overwhelmingly expected quantum computing to impact their businesses: 47% anticipated a major impact within the next three years, while 32% anticipated a moderate impact. Only one-fifth (21%) of survey respondents felt quantum computing wouldn't impact their business within the next three years.

While quantum computing capability is accelerating rapidly and directly influencing business decisions today, the next few years will present a new set of challenges for the burgeoning industry. Even the best technology will falter if access to it is constrained, and careful packaging and deployment will be critical to quantum computers' widespread uptake in commercial applications. Realizing the potential of quantum computers will require deployment at scale in quantum data centers around the world. 

The early days of quantum deployment

Given the industry's science-heavy roots, it's not surprising that the first quantum computers were developed and operated out of university labs. These systems, often proof-of-concept models theorized in doctoral and postdoctoral research, have evolved into the varied quantum startups driving the market today. The academic roots of quantum computing can still be seen in the deployment pattern of today's systems, which often originate in universities, high-performance computing (HPC) centers or national laboratories globally. More mature quantum computing modalities have already migrated onto the cloud, with remote access to quantum systems available via hyperscale cloud providers and the quantum vendors themselves. Many of today's intermediate-scale systems are also deployed in on-premises settings, with quantum simulators, annealers (built for specific types of optimization-related calculations rather than general-purpose computing) and superconducting quantum systems commonly deployed. Trapped-ion, photonic and neutral atom systems are also nudging into the space.

The next frontier for quantum compute deployment involves broad adoption into leased data centers where systems can be accessed by firms that may not want quantum-as-a-service from hyperscale cloud providers. While hybrid compute and quantum simulation environments are already housed in existing data centers, given that they run primarily on classical hardware, bringing full-scale quantum computers into traditional data centers presents a unique engineering challenge. Only a handful of quantum computers are currently deployed in non-hyperscale data centers, with noteworthy installations including Oxford Quantum Circuits systems hosted by Centersquare (formerly Cyxtera) and Equinix in the UK and Japan, respectively, and a Quandela photonic quantum processing unit (QPU) hosted in an OVHcloud data center in France.

Quantum deployment considerations

The technical requirements of quantum systems, which vary by quantum modality, hamper the transition of quantum compute into data centers. Quantum systems can differ substantially in size, weight, form factor, energy use, cooling requirements, environmental conditions, connection and port locations, and network connectivity requirements. There is no set standard for quantum system construction, making every quantum computing deployment an exercise in bespoke, customized construction.

Some of the most mature quantum system architectures include superconducting qubits (built using cryogenically cooled superconducting circuits); photonic systems (which use photons manipulated via optical components); neutral atom qubits (built with neutral atoms held in place with optical tweezers and manipulated using lasers); and trapped ion qubits (which use ions held in place by electromagnetic fields and manipulated by lasers). Across these four leading modalities, installation considerations vary greatly, even across different providers' systems built in the same modality. While all systems must deal with questions of scalability, control electronics, interconnections, power demand, and more, additional deployment considerations are unique to systems of different classes.

Superconducting 

Superconducting quantum systems require dilution refrigerators to cool qubits to millikelvin temperatures — which in turn requires a substantial amount of power to run the cryogenics. These qubits are highly sensitive to mechanical vibrations, requiring damping systems to protect them from even slight movement. Superconducting qubits are also vulnerable to errors when exposed to stray radio frequencies and magnetic fields, requiring electromagnetic shielding for successful operation in quantum computing data centers.

Photonic 

Photonic systems require stable laser sources and associated equipment, necessitating vibration control to maintain the alignment of precisely calibrated optical components. These systems also require thermal stability, as even small temperature fluctuations can impact the coherence of the system. High-quality laser systems can be energy-intensive, although the power required is generally less than for systems that need cryogenic cooling. Because they utilize optical equipment, photonic systems can leverage existing fiber-optic infrastructure, making integration with existing data center facilities more viable.

Neutral atom 

Neutral atoms require precise laser systems, necessitating vibration control for proper use of the system. While they don't need cryogenic cooling, neutral atom systems operate within ultra-high vacuum chambers to keep the atoms isolated from environmental interactions. These chambers can add to power requirements for system operation. Neutral atom systems require magnetic field control to shield and stabilize the systems and avoid decoherence from stray magnetic fields.

Trapped ion 

Like their neutral atom counterparts, ion trap systems need stable, long-lasting vacuum chambers to keep ions isolated. Because they also use lasers, trapped ion systems require vibration isolation to keep the control infrastructure steady, along with magnetic field control to protect qubit coherence. Some ion trap systems operate at cryogenic temperatures to help reduce noise, although not all vendors use cryogenic cooling in their designs.

Other 

Plenty of additional quantum architectures are in early stages of development, including cat qubits, topological qubits, and silicon and diamond spin systems. While further from widespread data center deployment than their more mature counterparts, these experimental modalities will undoubtedly require their own list of unique installation considerations as they mature and nudge into the quantum data center space.

Customization and compromise as quantum meets classical 

Given the fractured nature of quantum system design and the variable requirements needed to install different systems, moving QPUs into existing data center environments is a slow, bespoke construction process. To speed up deployment, compromise and innovation will need to come from both quantum vendors and data center designers.

Quantum vendors are already brainstorming a set of industry standards to help streamline data center adoption of quantum computing infrastructure, with consortia such as the Open Compute Project in the process of drafting a checklist of requirements for key quantum systems. Some vendors, such as Orca Computing, have already productized rack-mounted quantum computers designed specifically for easier deployment in data centers. 

While quantum computers' custom nature may slow quantum computing integration, data center builders are increasingly comfortable with design customization. Classical compute infrastructure can vary substantially in installation requirements, and the surge in AI demand has led to a rethinking of traditional data center environments, kick-starting changes in everything from cooling infrastructure to rack spacing. The timing for quantum computing couldn't be better — with the data center industry already in the midst of a design renaissance, it becomes much easier to build new data centers that accommodate quantum computing infrastructure.

The emergence of quantum hubs 

While quantum computers are available around the world through various deployment methodologies, there has already been a consolidation of talent and accessibility into quantum hubs at strategic locations.

The United States offers a unique view into some of the forces driving geographic concentration in quantum compute, with hubs forming in locations that combine deep quantum expertise, a strong worker pipeline, local support and investment, and an established supply chain. Cities such as Chicago, Illinois; Boulder, Colorado; Boston, Massachusetts; Santa Barbara, California; Chattanooga, Tennessee; and Poughkeepsie, New York are emerging as leaders in quantum availability and development.

While there is some overlap with traditional data center hubs, in many cases quantum compute is gaining traction in unique areas rather than following the path of data center development. We expect quantum computing data centers to remain near research hubs in the short term, while in the longer term quantum computing infrastructure may need to deploy closer to data generation sites to facilitate a wider range of use cases and hybrid quantum/classical computation.

Geopolitics of quantum deployments

Quantum technology — just like other high-impact technologies including nuclear capability, space travel, and AI — has broad implications for national security. Nations are jockeying to lead in the quantum space, investing in both technology and talent in a race to construct the world's first fault-tolerant quantum computer, and to protect classical systems against the possibility that someone else crosses that finish line first.

At a global scale, the location of quantum hubs is being driven by this quantum race, with geopolitical alliances and priorities being mirrored in quantum collaborations and partnerships. Key nations pursuing a strong quantum agenda include the United States, China, the United Kingdom, Germany, Canada, Japan, France, the Netherlands, Switzerland and Australia.

In the United States, the National Quantum Initiative Reauthorization Act, released Dec. 3, 2024, prioritized federal support for quantum research, workforce development, and public-private partnerships. In an introduction to the bill, Senator Maria Cantwell (D-WA) called quantum research and development "critical to our economic and national security."

Meanwhile, international partnerships such as the EU's QuantERA collaboration and the Australia-UK-US (AUKUS) Pillar II Quantum Arrangement (AQuA) hope to jointly advance research and development capabilities in both quantum computing infrastructure and the broader quantum technology space. 
Looking forward

Quantum computing is hovering on the brink of its breakout moment, with paradigm-shifting implications for every industry that relies on compute. As quantum data centers continue to emerge, evolve and grow into their potential, today's idiosyncratic spread of architectures, requirements and restrictions is likely to standardize. Quantum systems that make the historical leap into deployable, usable products will become the face of the quantum technology future.

This article was authored by a cross-section of representatives from S&P Global and in certain circumstances external guest authors. The views expressed are those of the authors and do not necessarily reflect the views or positions of any entities they represent and are not necessarily reflected in the products and services those entities offer. This research is a publication of S&P Global and does not comment on current or future credit ratings or credit rating methodologies.