Tackling the Qubit Mapping Problem for NISQ-Era Quantum Devices

A paper by Prof. Yufei Ding. This paper was published at the Twenty-Fourth International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2019).

Quantum computing (QC) has become the new "race to the moon," pursued with national pride and tremendous investment, spurring interest and motivation across academia and industry. Big companies such as Google, Microsoft, and IBM have initiated or deepened their efforts in QC. Los Alamos National Laboratory, NASA's Ames Research Center, and many other national labs are planning to install the latest quantum computers and are studying the potential advantages of QC in areas such as encryption and artificial intelligence.

It is widely accepted that we will soon enter the Noisy Intermediate-Scale Quantum (NISQ) era, in which quantum supremacy, an advantage over classical computing, can be demonstrated. Nevertheless, a problem in the current NISQ state is that it is hard to map large quantum applications onto quantum hardware with high fidelity, owing to the hardware's limited computational resources (e.g., the number of qubits, qubit lifetime, and inter-qubit connections). Prior work has tried to bridge the gap between quantum algorithms and quantum devices via qubit mapping but suffers from severe scalability issues.

The research group directed by Prof. Yufei Ding at UCSB explores the synergy between quantum computing and programming-system optimization. The preliminary results were published at ASPLOS'19. Through novel compiler optimization, the work achieves exceptional scalability advantages over existing qubit-mapping methods and a remarkable program-fidelity enhancement on NISQ-era quantum devices. This preliminary study has prompted the field to rethink the potential synergy between traditional computer-systems technologies and emerging quantum-computing optimization.
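To make the qubit mapping problem concrete, the sketch below shows the routing step that such compilers perform: when a two-qubit gate targets logical qubits whose physical locations are not adjacent on the device's coupling graph, SWAP gates are inserted to bring them together. This is a minimal, illustrative greedy strategy (move one qubit along a shortest path), not the algorithm from the paper; all function and variable names here are assumptions for illustration.

```python
from collections import deque

def shortest_path(coupling, src, dst):
    """BFS shortest path between two physical qubits in the coupling graph."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        q = queue.popleft()
        if q == dst:
            path = []
            while q is not None:
                path.append(q)
                q = prev[q]
            return path[::-1]
        for nbr in coupling[q]:
            if nbr not in prev:
                prev[nbr] = q
                queue.append(nbr)
    raise ValueError("qubits are not connected on this device")

def route_cnot(coupling, mapping, a, b):
    """Insert SWAPs so logical qubits a and b become physically adjacent,
    then emit the CNOT. Returns (gate list, updated logical->physical map)."""
    phys = dict(mapping)                       # logical -> physical
    log_at = {p: l for l, p in phys.items()}   # physical -> logical
    gates = []
    path = shortest_path(coupling, phys[a], phys[b])
    # Greedily move qubit a along the path until it neighbors qubit b.
    for nxt in path[1:-1]:
        cur = phys[a]
        gates.append(("SWAP", cur, nxt))
        other = log_at[nxt]
        phys[a], phys[other] = nxt, cur
        log_at[cur], log_at[nxt] = other, a
    gates.append(("CNOT", phys[a], phys[b]))
    return gates, phys

# A 4-qubit linear device: 0-1-2-3, identity initial mapping.
coupling = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
gates, final_map = route_cnot(coupling, {0: 0, 1: 1, 2: 2, 3: 3}, 0, 3)
# -> [('SWAP', 0, 1), ('SWAP', 1, 2), ('CNOT', 2, 3)]
```

Each inserted SWAP costs extra gates and error on real NISQ hardware, which is why scalable mapping algorithms that minimize such overhead matter so much for program fidelity.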