Quantum Meets HPC: Quantum computing is the future, but HPC is here to stay.

Stefano Mensa is an HPC Applications Specialist at the Hartree Centre and a quantum computing enthusiast. We caught up with him after the International Supercomputing Conference (ISC) 2022 to talk about his perspective on the role of quantum computing and what this means for the world of high performance computing (HPC).

ISC 2022 focused on critical developments in HPC, machine learning and high-performance data analytics. This year the conference saw a noticeable shift towards the exploration of quantum computing, with an emphasis on successful applications in science, commerce and engineering.

Hello Stefano, welcome back from ISC! This year there were 15 sessions dedicated to the opportunities and challenges associated with quantum computing. What do you think this means for the future of HPC?

First, the good news: HPC is alive and well, and it’s here to stay. So none of us is going to be out of the game anytime soon. A lot is going on in the field, with respect to new hardware, software and applications.

In general, there is a commitment to reducing the environmental impact of HPC, and new solutions are being developed to cut the power consumption of chips, cool HPC systems efficiently and schedule workloads in an energy-efficient way. Also gaining momentum in HPC is digital twinning, which virtually represents a physical asset along with the real-time data related to it. This is nice to see, as the Hartree Centre has been working on this for a while now, so we are well-equipped to take on challenges in this area.

We are home to experts in this field, with a visualisation team and state-of-the-art visualisation facilities to model digital twin assets and related data. As an example of digital twinning, our team has developed a virtual wind tunnel that companies can use to explore the fluid dynamics and aerodynamics of digital assets such as cars. For more about how digital twinning works, you can contact our visualisation team.

For those who are still getting to grips with quantum computing, can you define it in an accessible way for us? And can you explain how it differs from HPC?

Quantum computing is a rapidly emerging technology that exploits the laws of quantum mechanics to solve problems that are too complex for classical computing. HPC facilities are the state of the art in classical computing and are used to solve very large real-world scientific problems. They use thousands of computers connected together to work towards the solution of a single problem. In some cases, the complexity of the problem is so great that sheer classical computational power is not sufficient to provide a solution, as some problems would still take too long to solve. Quantum computing, however, leverages the laws of quantum mechanics and allows computational scientists to explore these complex problems from a different perspective.
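One back-of-the-envelope way to see why sheer classical power runs out (our illustration, not part of the interview): simulating an n-qubit quantum state on a classical machine means storing 2^n complex amplitudes, which outgrows any supercomputer's memory very quickly.

```python
# Memory needed to hold the full state vector of an n-qubit system on a
# classical computer: 2**n complex amplitudes, 16 bytes each (complex128).
# The exponential growth is one intuition for why classical power alone
# is not sufficient for large quantum problems.

def statevector_bytes(n_qubits: int) -> int:
    """Bytes needed to store 2**n complex128 amplitudes."""
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n} qubits: {statevector_bytes(n):,} bytes")
# 30 qubits already need 16 GiB; 50 qubits would need 16 PiB,
# beyond the total memory of today's largest supercomputers.
```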

[Table: some differences between quantum computing and supercomputing.]

The consensus seems to be that quantum is the future of computing, but you are saying that HPC is “alive and well and here to stay”. How do you see these two areas working in conjunction moving forwards?

Quantum needs HPC, and vice versa. Basically, with the current state of play in the field of quantum computing, it is impossible to solve a task entirely on a quantum computer. This will probably be true for a long time.

There is widespread acknowledgement in the community that quantum processors must be considered “accelerators”, with the bulk of a hardcore simulation still performed on classical HPC.

This is great; however, it opens a whole new can of worms that you need to consider, like how to couple the quantum processing unit to an HPC platform, or the requirements of the data transfer process and runtime.
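The accelerator pattern described here can be sketched in a few lines: a classical optimisation loop (the part that would run on the HPC side) repeatedly offloads a small cost evaluation to the quantum device. In this toy sketch, `run_on_qpu` is a hypothetical stand-in that we emulate classically; a real workflow would go through a vendor SDK, with the data-transfer and runtime questions mentioned above sitting exactly at that boundary.

```python
# Sketch of the hybrid quantum-classical "accelerator" pattern.
# run_on_qpu is a hypothetical placeholder: in reality it would submit a
# parameterised circuit to quantum hardware and return an estimated
# expectation value. Here we emulate it with a toy energy landscape.
import math

def run_on_qpu(theta: float) -> float:
    # Toy stand-in for a quantum evaluation; minimum at theta = pi.
    return math.cos(theta)

def classical_outer_loop(theta=0.1, lr=0.2, steps=200):
    # Finite-difference gradient descent, performed entirely classically;
    # each step calls out to the "accelerator" for cost evaluations.
    eps = 1e-4
    for _ in range(steps):
        grad = (run_on_qpu(theta + eps) - run_on_qpu(theta - eps)) / (2 * eps)
        theta -= lr * grad
    return theta, run_on_qpu(theta)

theta, energy = classical_outer_loop()
print(f"theta = {theta:.3f}, energy = {energy:.3f}")  # energy converges near -1
```

The point of the sketch is the control flow, not the physics: the classical side owns the loop, and every iteration pays the cost of crossing the classical-quantum boundary.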

Currently, HPC centres across the world are securing real quantum hardware and quantum simulators. In Germany, the Leibniz Supercomputing Centre (LRZ) has secured a 20-qubit system from IQM. Meanwhile in Japan, the University of Tokyo has secured a quantum machine from IBM and scaled it up to 53 qubits. Both machines use superconducting chips. Furthermore, according to an Atos study, 76% of HPC centres plan to get into quantum in the near future, and 71% will invest in on-premise infrastructure.

You were part of the workshop on Quantum and Hybrid Quantum-Classical Computing Approaches that the Hartree Centre co-organised at ISC. From the workshop and the talks across the conference what would you say are some of the challenges facing quantum moving forwards?

The first big hurdle for widespread adoption of quantum computing in industry is funding. Quantum hardware costs tens of millions of pounds. It is already difficult for government-funded or academic supercomputing centres to secure funding for HPC procurement, so quantum hardware presents an even bigger challenge, more so still for industry. To justify the expenditure, you need to demonstrate the impact and benefits it will bring. You can do that by working with organisations like the Hartree Centre to develop proof-of-concept applications that solve real-world challenges and test them out on real quantum hardware.

Another hurdle to widespread adoption of quantum is access to skilled staff, especially quantum software engineers. At the Hartree Centre we have staff exploring quantum technology and industry applications to help organisations access it, navigate its possibilities and discover the next step for their businesses.

Finally, and most important in my opinion (as it is my field of work!): how do you integrate a quantum processing unit into an HPC facility? Since ISC, it feels clear to me that this is where the big effort from HPC centres is going to be placed. This is an ambitious technical end-point for the scientific computing community. Currently, scientific communities mainly access quantum hardware via cloud interfaces, and only a handful of facilities in the world have access to actual quantum computers on premise. The aim is seamless integration of quantum hardware inside classical resources, such as an HPC compute node, to increase computing power and efficiency.

From what you are describing there are still some steps until we reach the widespread use of quantum computing. What would you say are some of the priorities for organisations to address right now?

Obviously, we are in the infancy of an emerging technology. There are no standards yet for best practice in quantum computing, and each vendor is developing its own application programming interface and software development kit (SDK), so there are no fixed rules. However, it looks like some SDKs are going to be long-lived. Given that useful large quantum computing architectures are still far in the future, reliable quantum simulators such as the Atos Quantum Learning Machine are more important than ever. Ultimately, the challenge is to develop emulators that accurately simulate physical systems: a sort of digital twin of a quantum computer.
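At its core, an emulator of this kind tracks the machine's full state vector and applies gates as linear algebra. The following is a toy sketch of that idea (our illustration, in no way comparable to a production simulator like the Quantum Learning Machine): two qubits, Hadamard gates, and the resulting measurement probabilities.

```python
# Minimal state-vector emulation: represent an n-qubit register as a
# vector of 2**n complex amplitudes and apply gates as matrix products.
import numpy as np

def apply_gate(state, gate, target, n_qubits):
    """Apply a single-qubit gate to the `target` qubit of an n-qubit state."""
    # Build the full 2**n x 2**n operator as a Kronecker product of
    # identities, with `gate` placed at the target qubit's position.
    op = np.array([[1.0]])
    for q in range(n_qubits):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

n = 2
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                       # start in |00>
state = apply_gate(state, H, 0, n)   # put qubit 0 in superposition
state = apply_gate(state, H, 1, n)   # ... and qubit 1

probs = np.abs(state) ** 2           # Born rule: measurement probabilities
print(probs)                         # uniform: each basis state has probability 0.25
```

The dense Kronecker-product construction is deliberately naive: it makes the exponential cost visible, which is exactly the scaling problem that serious emulators spend their engineering effort fighting.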

These points and challenges were discussed across talks at ISC22 and our workshop. Adopting quantum is not going to be an easy road, but if you are as excited about quantum as I am, then I am confident we can tackle these challenges with enthusiasm and progress this exciting emerging technology.

If you would like to learn more about quantum computing and its applications, please visit our website. If you are interested in collaborating with the Hartree Centre on a project, please contact us.
