Potential projects for summer 2025 are:
Quantum error correction is an important aspect of quantum computing, crucial for ensuring computational accuracy. In a quantum system, physical phenomena such as noise and interference can introduce quantum errors. These errors are detected as syndromes, which are then used to generate the correction signals that restore reliable computation. Ternary content addressable memory (TCAM) is a widely used component for storing these syndromes, employing a look-up-table-based method for both error detection and the generation of correction signals. Traditionally, TCAM has been implemented using CMOS devices designed for room temperature, but this conventional approach presents challenges when integrated into quantum systems, which often operate at cryogenic temperatures. In our research, we plan to design a TCAM memory array based on devices that function at cryogenic temperatures. Cryogenic devices align more effectively with the low-temperature requirements of quantum computing, providing tighter integration with the quantum hardware. A cryogenic TCAM can potentially reduce noise, improve energy efficiency, and speed up the generation of error correction signals, resulting in more accurate and robust quantum computations. By exploring this novel approach, we aim to bridge the gap between classical memory technologies and the needs of emerging quantum systems, ultimately contributing to their improved stability and performance.
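To make the look-up-table idea concrete, here is a minimal software model of TCAM-style syndrome matching. The three-bit syndrome width, the ternary patterns, and the correction labels are illustrative placeholders, not a real code's decoding table; a hardware TCAM would perform all row comparisons in parallel.

```python
# Minimal software model of TCAM-style syndrome decoding.
# A TCAM row stores a ternary pattern ('0', '1', or 'X' = don't care);
# the first matching row returns its associated correction signal.
# Patterns and corrections below are illustrative only.

def tcam_match(pattern: str, syndrome: str) -> bool:
    """Return True if the syndrome matches the ternary pattern."""
    return all(p in ('X', s) for p, s in zip(pattern, syndrome))

def lookup_correction(table, syndrome):
    """Return the correction for the first matching TCAM row, else None."""
    for pattern, correction in table:
        if tcam_match(pattern, syndrome):
            return correction
    return None

# Toy table for a 3-bit syndrome (entries are made up for illustration).
TABLE = [
    ("00X", "I"),    # no correction needed
    ("1X0", "X1"),   # flip qubit 1
    ("X11", "X2"),   # flip qubit 2
]
```

In hardware, the 'X' entries are exactly what distinguishes TCAM from ordinary CAM: one stored row can cover many syndromes at once, keeping the table small.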
Recent hybrid quantum-classical algorithms can approximately solve combinatorial optimization (CO) problems; however, applying these techniques faces several challenges, such as barren plateaus and mixer design. Recent work has shown that quantum phase estimation (QPE) is a promising approach to solving CO problems on fault-tolerant quantum computers. In this approach, potential solutions to the CO problem are encoded in phases, and constructive and destructive interference are used to find an optimal solution to the problem. In this project, students will encode particular CO problems in a QPE algorithm. Once encodings for some problems have been identified, students will work on generalizing their encoding technique to solve an arbitrary CO problem. This will involve linear algebra, graph theory, and programming in Python.
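As a toy version of the phase-encoding idea, the NumPy sketch below runs textbook phase estimation on an eigenstate of a diagonal unitary whose eigenphases store the cost of each candidate solution. The popcount cost function and the register size are assumptions for illustration only, chosen so that every cost is an exact 4-bit fraction and QPE reads it back deterministically.

```python
import numpy as np

# Toy QPE readout: costs of candidate solutions are encoded as eigenphases
# of a diagonal unitary; QPE then reads off the cost of a given solution.
# The cost function below is illustrative, not a specific CO problem.

def qpe_readout(phase_frac: float, t: int) -> int:
    """Run textbook QPE with t ancilla qubits on an eigenstate whose
    eigenphase is phase_frac (in [0, 1)); return the most likely outcome."""
    N = 2 ** t
    # Ancilla register after the controlled-U^k stage:
    # (1/sqrt(N)) * sum_k exp(2*pi*i*phase_frac*k) |k>
    k = np.arange(N)
    state = np.exp(2j * np.pi * phase_frac * k) / np.sqrt(N)
    # Inverse quantum Fourier transform, as an explicit matrix.
    j = k.reshape(-1, 1)
    iqft = np.exp(-2j * np.pi * j * k / N) / np.sqrt(N)
    probs = np.abs(iqft @ state) ** 2
    return int(np.argmax(probs))

def cost(x: int) -> int:
    """Illustrative cost of a candidate bitstring: its popcount."""
    return bin(x).count("1")

t = 4  # ancilla bits; costs must stay below 2^t for exact readout
for x in [0b000, 0b101, 0b111]:
    assert qpe_readout(cost(x) / 2 ** t, t) == cost(x)
```

The interference step is the inverse QFT: amplitudes from all k add constructively only at the outcome equal to the encoded cost.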
Unmanned aerial vehicles (UAVs) have the potential to change how we study remote locations, deliver packages, inspect infrastructure, and more. Several complex optimization problems must be solved to make these applications a reality. In this project, we will explore how variational quantum algorithms can be used to solve UAV design and/or operation optimization problems, such as planning UAV routes. These problems can be written as mixed-integer linear programs, which can be extremely challenging to solve. Key steps of this project include finding ways to map the optimization problems to quadratic unconstrained binary optimization (QUBO) problems, encoding these QUBOs into quantum Hamiltonians, and testing the algorithm on example problems. Students involved in this project will use linear algebra, modeling techniques, and Python programming.
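The QUBO-to-Hamiltonian step can be sketched in a few lines: substituting x_i = (1 - z_i)/2 turns the binary objective x^T Q x into an Ising form whose Pauli-Z version is the quantum Hamiltonian. The two-variable matrix Q below is a toy objective, not a real UAV routing model.

```python
import numpy as np
from itertools import product

# Sketch: convert a QUBO objective x^T Q x (x_i in {0,1}) into an Ising
# form sum_{i<j} J_ij z_i z_j + sum_i h_i z_i + c with spins
# z_i = 1 - 2*x_i in {-1,+1}.

def qubo_to_ising(Q):
    S = (np.asarray(Q, float) + np.asarray(Q, float).T) / 2  # symmetrize
    J = np.triu(S, k=1) / 2           # pair couplings (i < j)
    h = -S.sum(axis=1) / 2            # local fields
    c = (S.sum() + np.trace(S)) / 4   # constant energy offset
    return J, h, c

def ising_energy(J, h, c, z):
    z = np.asarray(z, float)
    return float(z @ J @ z + h @ z + c)

Q = np.array([[1.0, 2.0],
              [0.0, 3.0]])            # toy objective: x0 + 2*x0*x1 + 3*x1
J, h, c = qubo_to_ising(Q)

# The two formulations agree on every binary assignment.
for x in product([0, 1], repeat=2):
    x = np.array(x, float)
    z = 1 - 2 * x
    assert np.isclose(x @ Q @ x, ising_energy(J, h, c, z))
```

Replacing each z_i with the Pauli operator Z_i then gives the diagonal Hamiltonian a variational algorithm minimizes.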
The power system industry has an economic impact in the hundreds of billions of dollars and contains interesting optimization problems that must be solved quickly. Current classical techniques can solve these sensitive optimization problems only inexactly; increasing solution accuracy would result in savings in the billions of dollars. This project will explore how variational quantum algorithms can be used to solve power systems problems. The REU student will convert existing classical quadratic unconstrained binary optimization (QUBO) formulations of power systems problems into the Hamiltonians used in variational quantum algorithms. They will test small sample problems on quantum hardware, analyze the scaling of the algorithms, and compare against classical algorithm performance. Students with interest or background in integer programming, discrete optimization, the Python programming language, or quantum computing are especially encouraged to select this project.
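As a small worked example of what the converted Hamiltonian looks like, the sketch below assembles the diagonal matrix H = Σ_{i<j} J_ij Z_i Z_j + Σ_i h_i Z_i from Kronecker products and reads the optimal assignment off its lowest diagonal entry. The three-variable couplings are made up for illustration and do not come from a real power-systems model.

```python
import numpy as np
from functools import reduce

# Build the diagonal Ising Hamiltonian used by variational algorithms for a
# toy 3-variable problem; its ground state encodes the optimal assignment.
# Qubit 0 corresponds to the leftmost bit of the basis index; bit value 1
# means spin z = -1.

Z = np.diag([1.0, -1.0])
I = np.eye(2)

def pauli_z_on(qubits, n):
    """Kronecker product with Z on the given qubits, identity elsewhere."""
    ops = [Z if q in qubits else I for q in range(n)]
    return reduce(np.kron, ops)

n = 3
J = {(0, 1): 1.0, (1, 2): -0.5}   # illustrative couplings
h = [0.2, -0.3, 0.1]              # illustrative local fields

H = sum(Jij * pauli_z_on(pair, n) for pair, Jij in J.items())
H = H + sum(h[i] * pauli_z_on([i], n) for i in range(n))

energies = np.diag(H)             # H is diagonal in the Z basis
best = int(np.argmin(energies))
bits = format(best, f"0{n}b")     # optimal assignment as a bitstring
```

For these coefficients the minimum energy is -1.9 at the assignment "100"; on real problems the exponential matrix size is exactly why a variational quantum algorithm, rather than explicit construction, is used.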
Preprocessing can be a valuable tool for reducing the size and complexity of combinatorial optimization problems. We will explore how quantum computing can augment some of the existing methods. In particular, we will try to take advantage of a quantum computer’s ability to produce a random sample of optimal and near-optimal solutions. By examining these solutions, we hope to identify a subset of variables that can be fixed, reducing the size of the overall problem. This process can be repeated until we arrive at a sufficiently small optimization problem that can be solved either by conventional algorithms or by more advanced quantum algorithms.
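A classical stand-in for the sampling step makes the variable-fixing idea concrete: below, brute-force enumeration of near-optimal solutions of a toy QUBO replaces the quantum sampler, and any variable that is constant across the sample is fixed. The matrix, the optimality gap of 2.0, and the problem size are all illustrative choices.

```python
import numpy as np
from itertools import product

# Variable-fixing preprocessing on a toy 3-variable QUBO.  Brute-force
# enumeration stands in for sampling near-optimal solutions on a quantum
# device; variables constant across the sample are fixed.

def qubo_energy(Q, x):
    x = np.asarray(x)
    return float(x @ Q @ x)

Q = np.array([[-3.0, 2.0, 0.0],
              [ 0.0, -1.0, 2.0],
              [ 0.0, 0.0, -2.0]])    # illustrative objective

solutions = [np.array(x) for x in product([0, 1], repeat=3)]
energies = [qubo_energy(Q, x) for x in solutions]
best = min(energies)

# Keep all solutions within an (illustrative) gap of the optimum.
near_opt = [x for x, e in zip(solutions, energies) if e <= best + 2.0]

# Fix every variable that takes the same value in all near-optimal samples.
fixed = {i: int(near_opt[0][i])
         for i in range(3)
         if all(x[i] == near_opt[0][i] for x in near_opt)}
```

Here the near-optimal set pins down two of the three variables, so the residual problem has only one free variable; repeating the loop shrinks larger problems the same way.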
Modern healthcare systems are investing heavily in advanced sensing to increase information visibility in the clinical environment, generating massive amounts of medical data. Medical examinations such as electrocardiograms (ECGs) or electroencephalograms (EEGs) produce high-resolution waveform data, capturing the temporal dynamics of physiological activities. Such data contains critical information about the patient’s health conditions, but it is complexly structured, posing significant challenges for effective feature extraction. Rapid advancements in quantum computing provide an unprecedented opportunity to significantly accelerate research on data-driven predictive modeling in healthcare and bring inexpensive precision medicine closer to clinical practice. The objective of this project is to develop effective quantum convolutional neural network models that analyze waveform data in clinical settings for automated disease prediction.
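One possible building block for such models is a "quantum convolution" that slides over the waveform and maps each window to an expectation value. The single-qubit angle encoding below is a simplified assumption for illustration, not a specific QCNN architecture from the literature.

```python
import numpy as np

# Minimal sketch of a "quantum convolution" feature map: each length-2
# window of a waveform is encoded as rotation angles on one simulated
# qubit, and the Z expectation value of the resulting state becomes one
# output feature.  This design is an illustrative assumption.

def ry(theta):
    """Single-qubit Y-rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def quantum_conv(signal, stride=1):
    feats = []
    for i in range(0, len(signal) - 1, stride):
        # Encode the window (signal[i], signal[i+1]) as two rotations on |0>.
        state = ry(signal[i + 1]) @ ry(signal[i]) @ np.array([1.0, 0.0])
        z_exp = abs(state[0]) ** 2 - abs(state[1]) ** 2   # <Z>
        feats.append(z_exp)
    return np.array(feats)

ecg_like = np.sin(np.linspace(0, 2 * np.pi, 16))  # toy waveform
features = quantum_conv(ecg_like)
```

Because consecutive Y-rotations compose, each feature here equals cos of the window sum; a trainable QCNN would add parameterized and entangling gates so the filter is learned rather than fixed.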
Quantum materials are a class of materials whose electronic properties are governed by the principles of quantum mechanics, stemming from the collective behavior of electrons at the atomic level. Quantum materials exhibit exotic and distinctive electronic, magnetic, and structural characteristics, forming the foundations for cutting-edge quantum technologies such as quantum computing, quantum sensing, quantum communications, and quantum information storage. The initial step in exploring these intriguing applications involves the synthesis of quantum materials. As part of the Research Experience for Undergraduates (REU) program, a student will actively participate in the synthesis of high-quality quantum materials in the forms of thin films, nanowires, and nanostructures using molecular beam epitaxy. Additionally, the student will be involved in characterizing the synthesized quantum materials to confirm their key characteristics such as strong electron correlations, topological properties, quantum entanglement, and emergent phenomena.
This project combines two exotic materials, superfluid helium and nanoporous two-dimensional polymers, to develop a sensor device that takes advantage of the strange macroscopic quantum behavior of liquid helium at temperatures near absolute zero. A nanoporous membrane will provide a weak link between two reservoirs of superfluid helium and enable a quantum coupling known as a Josephson junction. The resulting transport of superfluid between reservoirs is predicted to have unusual properties described by quantum mechanics. Quantum sensors based on this architecture will allow measurement of phenomena that are undetectable to conventional sensors, from minute fluctuations in the Earth’s rotation to the fingerprints of dark matter. The technical applications of this quantum sensor are widespread. They are found in disciplines spanning geodesy (GPS and geological exploration), gravitation and general relativity (dark matter and cosmology), metrology (quantum standards for physical quantities), and quantum information (quantum computing). Exploring Josephson junctions for superfluid helium can enable new capabilities that are useful in several scientific disciplines.
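In a standard treatment, the weak-link transport is governed by the two Josephson relations; written for a superfluid, with Δφ the phase difference between the reservoirs and Δμ the chemical-potential difference across the membrane, they read:

```latex
I = I_c \sin(\Delta\phi), \qquad \hbar\,\frac{d(\Delta\phi)}{dt} = -\Delta\mu
```

A constant Δμ (set, for example, by a pressure head) therefore drives oscillations of the superfluid current at frequency Δμ/h, which is what makes the junction useful as a precision sensor.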
Quantum imaginary time evolution (QITE) is a quantum-inspired classical algorithm that can be used to solve combinatorial optimization problems including MaxCut. Preliminary data suggests that QITE gives better MaxCut solutions if particular edges are removed from the problem definition and then slowly reintroduced. It is unclear how these edges can be determined. Students working on this project will determine why removing edges from graphs and then adding them back into the problem definition later gives better solutions to the MaxCut problem when solved with QITE. For this project, students will solve MaxCut using QITE on several graphs when random edges are removed and then added back into the problem formulation. They will then use machine learning techniques to determine how to select edges whose removal and readdition into the problem Hamiltonian optimizes the MaxCut solution.
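For orientation, a direct classical simulation of imaginary-time evolution on a small MaxCut instance takes only a few lines. The 4-vertex graph and the choice τ = 5 are illustrative, and the explicit exponential of the diagonal Hamiltonian stands in for what QITE approximates on larger problems.

```python
import numpy as np

# Classical simulation of imaginary-time evolution for MaxCut on a small
# graph.  The Hamiltonian is diagonal with energy -cut(x), so evolving
# exp(-tau*H) from the uniform state concentrates amplitude on
# maximum-cut bitstrings.

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # illustrative graph
n = 4

def cut_value(x, edges):
    """Number of edges cut by the bit assignment encoded in integer x."""
    bits = [(x >> i) & 1 for i in range(n)]
    return sum(bits[u] != bits[v] for u, v in edges)

energies = np.array([-cut_value(x, edges) for x in range(2 ** n)])
psi = np.ones(2 ** n) / np.sqrt(2 ** n)            # uniform initial state

tau = 5.0
psi = np.exp(-tau * energies) * psi                # imaginary-time evolution
psi /= np.linalg.norm(psi)

best = int(np.argmax(np.abs(psi) ** 2))
assert cut_value(best, edges) == max(cut_value(x, edges) for x in range(2 ** n))
```

Experiments on edge removal would modify the `edges` list between evolution stages and compare the resulting cut values, which is the behavior the project aims to explain.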
Systems in two spatial dimensions exhibiting topological order are promising candidates for fault-tolerant quantum computers and quantum memory. This exotic state of matter supports quasi-particle excitations which obey anyonic quantum statistics. Anyons come in two types: abelian and non-abelian. Abelian anyons cannot be used for quantum computation, since their braiding produces only a phase. Nevertheless, they can be used to store information in a topologically protected way, and many quantum error correcting schemes involve abelian anyons. Braiding non-abelian anyons, on the other hand, changes their state by a unitary matrix. For both types of anyons, error protection against local perturbations is guaranteed by the robust ground-state degeneracy and the energy gap between the ground state and excited states. We will consider Rydberg atoms on a lattice, including punctures with mixed boundaries to generate non-abelian anyons that can be used for fault-tolerant quantum optimization and other quantum algorithms.
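The distinction between the two anyon types can be stated in one line: braiding abelian anyons multiplies the state by a phase, while braiding non-abelian anyons acts on a degenerate state space by a unitary matrix, and successive such unitaries generally do not commute:

```latex
\text{abelian:}\quad |\psi\rangle \;\mapsto\; e^{i\theta}\,|\psi\rangle,
\qquad
\text{non-abelian:}\quad |\psi\rangle \;\mapsto\; U\,|\psi\rangle,
\quad U \in \mathrm{U}(d),\ d > 1.
```

It is the non-commuting unitaries that allow sequences of braids to implement nontrivial quantum gates.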
Developments in artificial intelligence are opening up new avenues for human-machine teaming. For example, brain-computer interface technology extracts and interprets information generated by brain activity without depending on any external device or muscle intervention. Improving human-machine interactions requires the analysis and interpretation of physiological signals to effectively assess individual states. These signals are typically nonstationary, noisy, and nonlinear, and current signal processing methods may fail. Embedding a signal into a point cloud, we will consider the detection of shape patterns of the signals’ point clouds. These shape patterns are characterized by their pertinent topological properties, which are summarized in a persistence diagram. A persistence diagram consists of two-dimensional points whose positioning highlights signals’ features and deconvolves them from any underlying noise. However, point clouds consist of many discrete points, making the computation of these diagrams a formidable task. We will adopt a quantum topological framework which considers all points in a point cloud and relies on principles of quantum machine learning algorithms. Moreover, when it comes to the actual analysis of signals and their associated diagrams, one may need to compute a distance between them so that they are differentiated, or quantify their uncertainty and estimate a probability density function on the space of persistence diagrams. Computing a distance between two persistence diagrams requires the solution of an optimal matching problem. We will study distances that are formulated and computed in a quantum way. Propagating a distribution of a persistence diagram to quantify uncertainty requires computation of a distribution of a random point process. This is a non-trivial, highly combinatorial problem, which can be bypassed by considering a quantum computing approach.
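To fix ideas about the matching problem, the sketch below computes the bottleneck distance between two tiny persistence diagrams by brute force, allowing points to be matched to their projections onto the diagonal. A quantum formulation would replace the factorial permutation search; the diagrams here are illustrative.

```python
from itertools import permutations

# Brute-force bottleneck distance between two small persistence diagrams.
# Points are (birth, death) pairs; unmatched points may be matched to
# their projection onto the diagonal birth = death.

def diag_proj(p):
    """Closest point on the diagonal to p."""
    m = (p[0] + p[1]) / 2
    return (m, m)

def linf(p, q):
    """L-infinity distance between two diagram points."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

def bottleneck(D1, D2):
    # Augment each diagram with the diagonal projections of the other's
    # points, then minimize the largest matched L-infinity distance.
    A = list(D1) + [diag_proj(q) for q in D2]
    B = list(D2) + [diag_proj(p) for p in D1]
    n1, n2 = len(D1), len(D2)
    best = float("inf")
    for perm in permutations(range(len(B))):
        cost = 0.0
        for i, j in enumerate(perm):
            if i >= n1 and j >= n2:
                continue          # diagonal matched to diagonal: free
            cost = max(cost, linf(A[i], B[j]))
        best = min(best, cost)
    return best
```

Even for diagrams with a handful of points the search space explodes, which is precisely why reformulating the optimal matching for a quantum computer is attractive.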
We will further generate quantum supervised machine learning schemes for signals.
There has been a renaissance in machine learning techniques based on neural networks, forming the new field of deep learning. This breakthrough is being fueled by a number of technical factors, such as new software libraries and powerful special-purpose computational hardware. The fundamental computational units in deep learning are continuous vectors and tensors, which are transformed in high-dimensional spaces and approximated using conventional digital computers. However, new specialized computational hardware that is fundamentally analog in nature is currently being engineered. Quantum computers have several advantages over classical ones, making them an intriguing platform for exploring new types of neural networks, in particular hybrid classical-quantum schemes. We will study applications of quantum neural networks and their implementation on photonic quantum hardware.
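A minimal version of such a hybrid scheme fits in a dozen lines: a simulated one-qubit "quantum layer" evaluated inside a classical gradient-descent loop, with gradients obtained from the parameter-shift rule. The rotation ansatz, learning rate, and target value are toy assumptions; photonic hardware would use continuous-variable gates instead of the qubit rotation shown here.

```python
import numpy as np

# Toy hybrid classical-quantum training loop (illustrative ansatz):
# a one-qubit "quantum layer" f(theta, x) = <Z> after encoding input x
# and weight theta as Y-rotations, trained by classical gradient descent
# using the parameter-shift rule.

def ry(t):
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -s], [s, c]])

def f(theta, x):
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])
    return state[0] ** 2 - state[1] ** 2       # <Z> = cos(x + theta)

def grad(theta, x):
    """Exact gradient of f via the parameter-shift rule."""
    return (f(theta + np.pi / 2, x) - f(theta - np.pi / 2, x)) / 2

# Fit f(theta, x) to a target output for one input by gradient descent.
x, target = 0.3, 0.0
theta = 0.1
for _ in range(200):
    loss_grad = 2 * (f(theta, x) - target) * grad(theta, x)
    theta -= 0.1 * loss_grad
```

The split of labor is the point: the quantum device only evaluates f at shifted parameters, while the classical optimizer does the updates.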
We will develop hybrid classical-quantum algorithms for the simulation of applications of lattice gauge theories (LGTs) to subatomic physics. Nuclear matter consists of quarks and gluons whose interactions are described by the gauge quantum field theories comprising the Standard Model, including quantum chromodynamics (QCD), quantum electrodynamics (QED), and the weak nuclear interaction. LGTs are an important tool for understanding nuclear processes and for tackling problems such as the nuclear many-body problem, stellar nucleosynthesis, and neutron stars. Calculations using LGTs are computationally intensive and span all energy scales. They are used to understand hadron physics, such as the masses of bound states, effective coupling constants of nuclear reactions, and the quark-gluon structure of light nuclei. Despite significant progress, key problems in subatomic physics, such as nuclear matter at finite density and quark-gluon hadronization, have not been addressed. We will explore the power of quantum computing to advance the computational reach of LGTs by simulating nuclear matter with real-time dynamics. We will research and develop hybrid quantum/classical embedded algorithms for simulating processes in LGTs, starting with abelian gauge theories and building up to QCD for photonic architectures. We will research and build quantum and hybrid algorithms involving optimization for calculating energy levels in subatomic systems described by LGTs to elucidate hadron physics, including the quark-gluon structure of light nuclei, nuclear matter at finite density, nuclear reactions, and hadronization. We will investigate the implementation of these algorithms on quantum hardware based on photonics. We will use quantum simulators, aided by the high-performance computing power of ORNL and the University of Tennessee, to study the scalability of the algorithms and their potential implementation on photonic quantum hardware.
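The simplest abelian starting point can already be explored with exact diagonalization. The sketch below builds a single Z2 plaquette of four links with Hamiltonian H = -Z1 Z2 Z3 Z4 - λ Σ_l X_l (a toy convention chosen for illustration, far from QCD) and tracks the ground-state energy as the electric field strength λ is turned on.

```python
import numpy as np
from functools import reduce

# Toy abelian (Z2) lattice gauge example: one plaquette of four links.
# The magnetic term is the product of Z's around the plaquette; the
# electric term is a sum of X's on the links.  Conventions are illustrative.

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
I = np.eye(2)

def op_on(op, site, n):
    """Kronecker product with op on one site, identity elsewhere."""
    return reduce(np.kron, [op if i == site else I for i in range(n)])

n = 4
plaquette = reduce(np.kron, [Z] * n)              # Z1 Z2 Z3 Z4

def hamiltonian(lam):
    return -plaquette - lam * sum(op_on(X, i, n) for i in range(n))

def e0(lam):
    """Ground-state energy from exact diagonalization."""
    return np.linalg.eigvalsh(hamiltonian(lam)).min()
```

At λ = 0 the ground energy is -1 (the plaquette term alone); as λ grows the electric term pushes the energy below the variational value -4 of the fully X-polarized state, showing the magnetic/electric competition that small exact studies make visible before moving to quantum hardware.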