Machine learning tackles quantum error correction


Lisa Zyga


Credit: Torlai et al. ©2017 American Physical Society

(Phys.org)—Physicists have applied the ability of machine learning algorithms to learn from experience to one of the biggest challenges currently facing quantum computing: quantum error correction, which is used to design noise-tolerant quantum computing protocols. In a new study, they have demonstrated that a type of neural network called a Boltzmann machine can be trained to model the errors in a quantum computing protocol and then devise and implement the best method for correcting them.

"The idea behind neural decoding is to circumvent the process of constructing a decoding algorithm for a specific code realization (given some approximations on the noise), and let a neural network learn how to perform the recovery directly from raw data, obtained by simple measurements on the code," Giacomo Torlai told Phys.org. "With the recent advances in quantum technologies and a wave of quantum devices becoming available in the near term, neural decoders will be able to accommodate the different architectures, as well as different noise sources."

The researchers tested the neural decoder on quantum topological codes that are commonly used in quantum computing, and showed that the algorithm is relatively simple to implement.
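To make the idea of learning a recovery map directly from measurement data more concrete, here is a minimal, hypothetical sketch in Python. It is not the authors' implementation: the study trains a Boltzmann machine on the toric code, whereas this illustrative stand-in trains a tiny softmax classifier (NumPy only) to map measured syndromes of a 5-qubit bit-flip repetition code to recovery operations. The code distance, flip probability, and sample sizes are all assumed values chosen for the sketch.

```python
import numpy as np

# Hypothetical sketch of neural decoding: learn the recovery directly from
# sampled (syndrome, error) data instead of hand-building a decoder.
# This uses a repetition code and a softmax classifier, not the Boltzmann
# machine / toric code of the study; all parameters are illustrative.

rng = np.random.default_rng(0)
N = 5            # number of data qubits (distance-5 repetition code)
P_FLIP = 0.08    # assumed independent bit-flip probability per qubit

# Parity-check matrix: syndrome bit i compares neighbouring qubits i and i+1.
H = np.zeros((N - 1, N), dtype=int)
for i in range(N - 1):
    H[i, i] = H[i, i + 1] = 1

def sample(n):
    """Sample random bit-flip error patterns and their measured syndromes."""
    errors = (rng.random((n, N)) < P_FLIP).astype(int)
    return errors, errors @ H.T % 2

# "Raw data obtained by simple measurements on the code": (error, syndrome) pairs.
train_err, train_syn = sample(50_000)
labels = train_err @ (1 << np.arange(N))                        # error pattern -> class id
X = np.eye(2 ** (N - 1))[train_syn @ (1 << np.arange(N - 1))]   # one-hot syndrome features

# Train a softmax classifier over all 2^N error patterns by gradient descent.
W = np.zeros((2 ** (N - 1), 2 ** N))
for _ in range(300):
    logits = X @ W
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    probs[np.arange(len(labels)), labels] -= 1.0                # d(cross-entropy)/d(logits)
    W -= 0.5 * X.T @ probs / len(labels)

def decode(syndrome):
    """Return the recovery (error pattern) the model considers most likely."""
    k = np.argmax(W[syndrome @ (1 << np.arange(N - 1))])
    return (k >> np.arange(N)) & 1

# For this code a recovery succeeds exactly when it reproduces the actual error,
# so the decoder's failure rate can be estimated on freshly sampled data.
test_err, test_syn = sample(10_000)
pred = np.array([decode(s) for s in test_syn])
fail = np.mean(np.any(pred != test_err, axis=1))
print(f"physical error rate {P_FLIP:.2f}, decoder failure rate {fail:.3f}")
```

With the one-hot syndrome encoding the classifier effectively learns, from data alone, the most frequent error pattern for each measured syndrome, which is the essence of training a decoder rather than constructing one; the Boltzmann-machine decoder in the study extends this idea by modelling the probability distribution of errors and syndromes for topological codes.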

