Overcoming system and scaling issues in quantum computing

Quantum computing has the potential to revolutionize a variety of industries, from expediting the development of life-saving drugs to enhancing the security of financial services. However, the physical obstacles become apparent immediately for firms that pursue it aggressively.

For instance, in its 2022 predictions for technology, media, and telecommunications, consulting firm Deloitte compared the state of quantum computing today to airplane technology in the early 1900s. The Wright brothers had shown that they could get an aircraft off the ground, but they could not yet cover any significant distance.

According to Deloitte's analysts, quantum computing's usefulness is not held back by a lack of use cases, funding, resources, or even progress. The problem is that today's quantum computers lack the computing capacity to solve problems that are beyond the reach of conventional computers, even supercomputers.

In this blog post, we look at the current state of quantum computing and how scientists and engineers are approaching its challenges.

Understanding the technical prerequisites for quantum computing

It’s challenging to go beyond what is achievable with conventional computing systems, especially in terms of hardware.

For instance, a Pistoia Alliance study found that 11% of respondents from the life sciences industry cited a lack of access to quantum infrastructure as a barrier to adoption. This is a significant hurdle, particularly for businesses that rely on standalone tools requiring intricate integration.

A recent outlook on the business use of quantum computers from consulting firm McKinsey & Company made much the same point: in addition to more effective data input and output, vendors must provide quantum-level RAM and better communication between qubits.

The firm identified fully error-corrected, fault-tolerant quantum computing as the most significant milestone: without it, a quantum computer cannot deliver precise, mathematically exact outputs.

Notably, to cope with noise, a quantum computer needs resilience through redundancy, which multiplies the number of physical qubits required for each logical qubit an application uses.
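To make that overhead concrete, here is a back-of-the-envelope sketch in Python. It assumes a surface-code-style encoding, in which a distance-d code uses roughly 2d² − 1 physical qubits per logical qubit and the logical error rate falls off as (p/p_th)^((d+1)/2); the error rate and threshold values are illustrative assumptions, not measured figures.

```python
# Back-of-the-envelope estimate of the physical-qubit overhead of
# surface-code-style error correction. All constants are
# illustrative assumptions, not vendor specifications.

def physical_qubits_per_logical(distance: int) -> int:
    """A distance-d surface code uses d*d data qubits plus
    d*d - 1 measurement (ancilla) qubits per logical qubit."""
    return 2 * distance * distance - 1

def logical_error_rate(p_phys: float, distance: int,
                       p_threshold: float = 1e-2) -> float:
    """Rough scaling law: p_L ~ (p_phys / p_threshold)^((d + 1) / 2)."""
    return (p_phys / p_threshold) ** ((distance + 1) / 2)

if __name__ == "__main__":
    p = 1e-3  # assumed physical error rate per operation
    for d in (3, 11, 25):
        print(f"d={d:2d}: {physical_qubits_per_logical(d):4d} physical qubits "
              f"per logical qubit, p_L ~ {logical_error_rate(p, d):.0e}")
```

At an assumed 10⁻³ physical error rate, pushing logical errors down to the 10⁻¹³ range already costs on the order of 1,250 physical qubits for every logical qubit, which is why physical qubit counts balloon so quickly.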

Making quantum systems scalable

Another requirement quantum computing shares with conventional computing is the need to scale infrastructure to support rising workload and performance demands.

For example, quantum researchers today work with anywhere from a few qubits to roughly 100. Realizing end-user applications in finance or materials science will require tens of thousands, hundreds of thousands, or even more.

Synchronizing the numerous control signals needed to operate large quantum processors becomes extremely difficult. Fortunately, this overlaps with expertise Keysight has applied for many years in wireless communications and radar test and simulation. To further support the quantum community, Keysight provides customers with a customizable FPGA sandbox built from a full Quantum IP library for qubit control and readout.
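To give a sense of what those control signals look like, the sketch below generates the I/Q samples for a single Gaussian-envelope drive pulse, the basic waveform used to rotate a superconducting qubit. The sample rate, pulse length, and intermediate frequency are assumed values for illustration, and this is generic Python rather than Keysight's FPGA toolchain.

```python
import numpy as np

# Illustrative sketch of one control signal an FPGA-based system
# must generate per qubit: a Gaussian-envelope microwave drive
# pulse, expressed as I/Q samples for an IQ mixer.

SAMPLE_RATE = 1e9          # 1 GS/s waveform generator (assumed)
PULSE_LENGTH = 40e-9       # 40 ns pulse (assumed)
SIGMA = PULSE_LENGTH / 4   # Gaussian width
IF_FREQ = 100e6            # intermediate frequency (assumed)

t = np.arange(0, PULSE_LENGTH, 1 / SAMPLE_RATE)
envelope = np.exp(-0.5 * ((t - PULSE_LENGTH / 2) / SIGMA) ** 2)

# Mix the envelope to the intermediate frequency; the I and Q
# streams feed the two channels of the mixer.
i_samples = envelope * np.cos(2 * np.pi * IF_FREQ * t)
q_samples = envelope * np.sin(2 * np.pi * IF_FREQ * t)

print(f"{len(t)} samples per pulse per channel")
```

Every qubit needs its own stream of such pulses, so scaling to thousands of qubits means keeping thousands of channels phase-aligned and triggered against a shared clock, which is where decades of RF test experience pay off.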

To function as a reliable quantum solution, the complete control system must be well integrated and simple to use. To meet this objective, Keysight provides Labber software, which combines hardware and FPGA firmware into a quantum-specific solution that lowers the learning curve and enables teams to scale. Labber also gives scientists and developers a powerful tool for investigating next-generation quantum sensors, qubits, and materials.

The search for a quantum-specific technology stack

Companies may be intimidated by the effort required to fully leverage quantum’s promise in their industry, which is understandable considering that doing so may require combining several different solutions.

Although vendors use the word “platform” in various ways, one relevant definition is a collection of preconfigured elements that can be standardized, enabling businesses to explore new business prospects and grow them as needed.

Software tools for error correction, scaling, modeling, and optimization will all be part of a genuine quantum platform. The quantity and quality of a machine's qubits determine its ability to perform useful computations, which requires overcoming the flaws inherent in such sensitive designs. To that end, Keysight acquired Quantum Benchmark, whose technologies help improve runtime performance in this error-prone setting.
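Error characterization of this kind typically rests on techniques such as randomized benchmarking. As a rough illustration of the idea (synthetic data, not Quantum Benchmark's actual implementation), survival probabilities measured over random gate sequences are fit to the standard decay model F(m) = A·pᵐ + B, and the fitted p yields an average error per gate:

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative randomized-benchmarking analysis on synthetic data:
# fit survival probability vs. sequence length m to F(m) = A*p**m + B.

def rb_decay(m, a, p, b):
    return a * p ** m + b

seq_lengths = np.array([1, 5, 10, 20, 50, 100, 200])
rng = np.random.default_rng(0)
true_p = 0.995  # assumed per-gate depolarizing parameter
survival = (0.5 * true_p ** seq_lengths + 0.5
            + rng.normal(0, 0.005, seq_lengths.size))

(a, p, b), _ = curve_fit(rb_decay, seq_lengths, survival,
                         p0=(0.5, 0.99, 0.5))

# For a single qubit, the average error per gate is (1 - p) / 2.
print(f"fitted p = {p:.4f}, avg. error per gate ~ {(1 - p) / 2:.1e}")
```

A fitted p close to 1 means the gates are nearly ideal; the synthetic data here are generated with p = 0.995, which corresponds to an average error of about 2.5 × 10⁻³ per gate.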

With capabilities for error correction, instrument control, optimization, and automation, software like Quantum Benchmark and Labber ultimately increases user confidence in both the process and its results.

Quantum computers may take several more years to realize the ever-expanding spectrum of use cases that could benefit industry, and numerous issues remain unresolved. However, the overall trends resemble what happened in classical computing: a technology stack is evolving that makes quantum research and qubit development more approachable. Resources like Keysight University are also enabling more people to deepen their understanding of quantum infrastructure and how to scale systems as required.