Deep recurrent networks predicting the gap evolution in adiabatic quantum computing

Date: 2023-06-12

Naeimeh Mohseni1,2, Carlos Navarrete-Benlloch3,4,1, Tim Byrnes5,6,7,8,9, and Florian Marquardt1,2

1Max-Planck-Institut für die Physik des Lichts, Staudtstrasse 2, 91058 Erlangen, Germany
2Physics Department, University of Erlangen-Nuremberg, Staudtstr. 5, 91058 Erlangen, Germany
3Wilczek Quantum Center, School of Physics and Astronomy, Shanghai Jiao Tong University, Shanghai 200240, China
4Shanghai Research Center for Quantum Sciences, Shanghai 201315, China
5New York University Shanghai, 1555 Century Ave, Pudong, Shanghai 200122, China
6State Key Laboratory of Precision Spectroscopy, School of Physical and Material Sciences, East China Normal University, Shanghai 200062, China
7NYU-ECNU Institute of Physics at NYU Shanghai, 3663 Zhongshan Road North, Shanghai 200062, China
8Center for Quantum and Topological Systems (CQTS), NYUAD Research Institute, New York University Abu Dhabi, UAE
9Department of Physics, New York University, New York, NY 10003, USA

Abstract

In adiabatic quantum computing, finding how the gap of the Hamiltonian depends on the parameter varied during the adiabatic sweep is crucial for optimizing the speed of the computation. Inspired by this challenge, in this work we explore the potential of deep learning for discovering a mapping from the parameters that fully identify a problem Hamiltonian to this parametric dependence of the gap, applying different network architectures. Through this example, we conjecture that a limiting factor for the learnability of such problems is the size of the input, that is, how the number of parameters needed to identify the Hamiltonian scales with the system size. We show that a long short-term memory network succeeds in predicting the gap when the parameter space scales linearly with the system size. Remarkably, we show that once this architecture is combined with a convolutional neural network to deal with the spatial structure of the model, the gap evolution can even be predicted for system sizes larger than those seen by the neural network during training. This provides a significant speedup over existing exact and approximate algorithms for calculating the gap.
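To make the learning task concrete, the following is a minimal sketch, not the authors' code, of how such a gap evolution can be generated by exact diagonalization with QuTiP (reference [27] below). The interpolation H(s) = (1 - s) H_driver + s H_problem, the nearest-neighbor Ising form of the problem Hamiltonian, and the random fields and couplings are illustrative assumptions rather than the setup used in the paper.

```python
# Illustrative sketch (assumed setup, not the paper's code): compute the gap
# E_1(s) - E_0(s) of H(s) = (1 - s) * H_driver + s * H_problem along a sweep,
# for a transverse-field driver and a random nearest-neighbor Ising problem.
import numpy as np
from qutip import sigmax, sigmaz, qeye, tensor

def op_at(single_site_op, site, n):
    """Embed a single-site operator at position `site` of an n-qubit chain."""
    ops = [qeye(2)] * n
    ops[site] = single_site_op
    return tensor(ops)

def gap_evolution(h, J, n_steps=50):
    """Gap of the interpolating Hamiltonian on a grid of sweep parameters s."""
    n = len(h)
    # Driver Hamiltonian: transverse field on every qubit.
    H_driver = -sum(op_at(sigmax(), i, n) for i in range(n))
    # Problem Hamiltonian: longitudinal fields h_i and nearest-neighbor couplings J_i.
    H_problem = -sum(h[i] * op_at(sigmaz(), i, n) for i in range(n))
    H_problem -= sum(J[i] * op_at(sigmaz(), i, n) * op_at(sigmaz(), i + 1, n)
                     for i in range(n - 1))
    gaps = []
    for s in np.linspace(0.0, 1.0, n_steps):
        H = (1 - s) * H_driver + s * H_problem
        e0, e1 = H.eigenenergies(eigvals=2)  # two lowest eigenvalues
        gaps.append(e1 - e0)
    return np.array(gaps)

# One random 6-qubit instance; the parameters (h, J) identify the problem
# Hamiltonian, and the returned gap curve is the quantity to be predicted.
rng = np.random.default_rng(0)
gap = gap_evolution(h=rng.uniform(-1, 1, 6), J=rng.uniform(-1, 1, 5))
```

Curves generated this way, labelled by the Hamiltonian parameters that produced them, are the kind of input-output pairs on which a supervised network can be trained.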

In adiabatic quantum computing, a key ingredient for achieving optimal computation speed is understanding how the gap of the Hamiltonian depends on the parameter varied during the adiabatic sweep. Motivated by this challenge, our paper investigates the potential of deep learning to discover a mapping between the parameters that identify a problem Hamiltonian and the parametric dependence of the gap. Employing diverse network architectures, we probe the limits of learnability for such problems. Our investigation suggests that how the number of parameters needed to identify the Hamiltonian scales with the system size plays a critical role in learnability: for instance, a nearest-neighbor Ising chain of N spins is specified by O(N) fields and couplings, whereas a fully connected model requires O(N²) couplings.

Remarkably, we show that a trained neural network succeeds in predicting the full gap evolution during an adiabatic sweep for large system sizes after observing the gap only for small system sizes, provided the parameter space scales linearly with the system size. Our study adds to the promise of so-called convolutional recurrent networks for predicting the adiabatic dynamics of inhomogeneous many-body systems and for extrapolating that dynamics beyond the regime the neural network was trained on.
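A minimal, purely illustrative sketch of such a convolutional recurrent model is shown below, written with Keras (reference [14] below). The encoding of the input as per-site Hamiltonian parameters supplied at every point of the sweep, as well as the layer types, sizes, and hyperparameters, are assumptions made for the sake of the example and do not reproduce the architecture used in the paper. The point of the sketch is the structural one made above: because the only spatially resolved operations are convolutions along the chain, the trained weights can be evaluated on chains longer than those seen during training.

```python
# Illustrative sketch (assumed architecture, not the paper's): a convolutional
# recurrent network that maps per-site Hamiltonian parameters, supplied at each
# step of the adiabatic sweep, to the predicted gap at that step.
import tensorflow as tf
from tensorflow.keras import layers

N_STEPS = 50     # points along the sweep parameter s
N_CHANNELS = 3   # assumed per-site encoding, e.g. (h_i, J_{i,i+1}, s)

def build_model():
    # The spatial (chain) dimension is left as `None`, so the same weights can
    # later be applied to chains longer than those used for training.
    inputs = layers.Input(shape=(N_STEPS, None, N_CHANNELS))
    x = layers.ConvLSTM1D(32, kernel_size=3, padding="same",
                          return_sequences=True)(inputs)
    x = layers.ConvLSTM1D(32, kernel_size=3, padding="same",
                          return_sequences=True)(x)
    # Reduce the per-site features to a single number per sweep step.
    x = layers.TimeDistributed(layers.Conv1D(16, kernel_size=1, activation="relu"))(x)
    x = layers.TimeDistributed(layers.GlobalAveragePooling1D())(x)
    outputs = layers.Dense(1)(x)  # predicted gap at each of the N_STEPS points
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

model = build_model()
# Training data would have shapes
#   X: (n_samples, N_STEPS, chain_length, N_CHANNELS),
#   y: (n_samples, N_STEPS, 1),
# e.g. model.fit(X_small_chains, y_small_chains, epochs=100, batch_size=32),
# with evaluation on instances whose chain_length exceeds the training sizes.
```

In this scheme the training targets could be produced for small chains by exact diagonalization, as in the first sketch, after which the same network is queried on larger instances.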

References

[1] Dorit Aharonov and Amnon Ta-Shma. Adiabatic quantum state generation and statistical zero knowledge. In Proceedings of the thirty-fifth annual ACM symposium on Theory of computing, pages 20–29, 2003. 10.1145/​780542.780546.
https:/​/​doi.org/​10.1145/​780542.780546

[2] Mahabubul Alam, Abdullah Ash-Saki, and Swaroop Ghosh. Accelerating quantum approximate optimization algorithm using machine learning. In 2020 Design, Automation & Test in Europe Conference & Exhibition (DATE), pages 686–689. IEEE, 2020. 10.5555/​3408352.3408509.
https:/​/​doi.org/​10.5555/​3408352.3408509

[3] Tameem Albash and Daniel A Lidar. Demonstration of a scaling advantage for a quantum annealer over simulated annealing. Physical Review X, 8 (3): 031016, 2018. 10.1103/​PhysRevX.8.031016.
https:/​/​doi.org/​10.1103/​PhysRevX.8.031016

[4] Boris Altshuler, Hari Krovi, and Jérémie Roland. Anderson localization makes adiabatic quantum optimization fail. Proceedings of the National Academy of Sciences, 107 (28): 12446–12450, 2010. 10.1073/​pnas.1002116107.
https:/​/​doi.org/​10.1073/​pnas.1002116107

[5] MHS Amin and V Choi. First-order quantum phase transition in adiabatic quantum computation. Physical Review A, 80 (6): 062326, 2009. 10.1103/​PhysRevA.80.062326.
https:/​/​doi.org/​10.1103/​PhysRevA.80.062326

[6] Matthew JS Beach, Anna Golubeva, and Roger G Melko. Machine learning vortices at the Kosterlitz-Thouless transition. Physical Review B, 97 (4): 045207, 2018. 10.1103/PhysRevB.97.045207.
https:/​/​doi.org/​10.1103/​PhysRevB.97.045207

[7] Giulio Biroli, Simona Cocco, and Rémi Monasson. Phase transitions and complexity in computer science: an overview of the statistical physics approach to the random satisfiability problem. Physica A: Statistical Mechanics and its Applications, 306: 381–394, 2002. 10.1016/​S0378-4371(02)00516-2.
https:/​/​doi.org/​10.1016/​S0378-4371(02)00516-2

[8] Alex Blania, Sandro Herbig, Fabian Dechent, Evert van Nieuwenburg, and Florian Marquardt. Deep learning of spatial densities in inhomogeneous correlated quantum systems. arXiv preprint arXiv:2211.09050, 2022. 10.48550/​arXiv.2211.09050.
https:/​/​doi.org/​10.48550/​arXiv.2211.09050
arXiv:2211.09050

[9] Troels Arnfred Bojesen. Policy-guided Monte Carlo: Reinforcement-learning Markov chain dynamics. Physical Review E, 98 (6): 063303, 2018. 10.1103/PhysRevE.98.063303.
https:/​/​doi.org/​10.1103/​PhysRevE.98.063303

[10] Marin Bukov, Alexandre G. R. Day, Dries Sels, Phillip Weinberg, Anatoli Polkovnikov, and Pankaj Mehta. Reinforcement learning in different phases of quantum control. Phys. Rev. X, 8: 031086, Sep 2018. 10.1103/​PhysRevX.8.031086.
https:/​/​doi.org/​10.1103/​PhysRevX.8.031086

[11] Giuseppe Carleo and Matthias Troyer. Solving the quantum many-body problem with artificial neural networks. Science, 355 (6325): 602–606, 2017. 10.1126/​science.aag2302.
https:/​/​doi.org/​10.1126/​science.aag2302

[12] Juan Carrasquilla and Roger G Melko. Machine learning phases of matter. Nature Physics, 13 (5): 431–434, 2017. 10.1038/​nphys4035.
https:/​/​doi.org/​10.1038/​nphys4035

[13] Juan Carrasquilla and Giacomo Torlai. Neural networks in quantum many-body physics: a hands-on tutorial. arXiv preprint arXiv:2101.11099, 2021. 10.48550/​arXiv.2101.11099.
https:/​/​doi.org/​10.48550/​arXiv.2101.11099
arXiv:2101.11099

[14] François Chollet et al. Keras. https:/​/​keras.io, 2015.
https:/​/​keras.io

[15] Edward Farhi, Jeffrey Goldstone, Sam Gutmann, and Michael Sipser. Quantum computation by adiabatic evolution. arXiv preprint quant-ph/​0001106, 2000. 10.48550/​arXiv.quant-ph/​0001106.
https:/​/​doi.org/​10.48550/​arXiv.quant-ph/​0001106
arXiv:quant-ph/0001106

[16] Keisuke Fujii, Kaoru Mizuta, Hiroshi Ueda, Kosuke Mitarai, Wataru Mizukami, and Yuya O. Nakagawa. Deep variational quantum eigensolver: A divide-and-conquer method for solving a larger problem with smaller size quantum computers. PRX Quantum, 3: 010346, Mar 2022. 10.1103/​PRXQuantum.3.010346.
https:/​/​doi.org/​10.1103/​PRXQuantum.3.010346

[17] Ivan Glasser, Nicola Pancotti, Moritz August, Ivan D. Rodriguez, and J. Ignacio Cirac. Neural-network quantum states, string-bond states, and chiral topological states. Phys. Rev. X, 8: 011006, Jan 2018. 10.1103/​PhysRevX.8.011006.
https:/​/​doi.org/​10.1103/​PhysRevX.8.011006

[18] Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016. URL http:/​/​www.deeplearningbook.org.
http:/​/​www.deeplearningbook.org

[19] Alex Graves, Abdel-rahman Mohamed, and Geoffrey Hinton. Speech recognition with deep recurrent neural networks. In 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, pages 6645–6649. IEEE, 2013. 10.1109/ICASSP.2013.6638947.
https:/​/​doi.org/​10.1109/​ICASSP.2013.6638947

[20] Gian Giacomo Guerreschi. Solving quadratic unconstrained binary optimization with divide-and-conquer and quantum algorithms. arXiv preprint arXiv:2101.07813, 2021. 10.48550/​arXiv.2101.07813.
https:/​/​doi.org/​10.48550/​arXiv.2101.07813
arXiv:2101.07813

[21] Pratibha Raghupati Hegde, Gianluca Passarelli, Giovanni Cantele, and Procolo Lucignano. Deep learning optimal quantum annealing schedules for random Ising models. arXiv preprint arXiv:2211.15209, 2022. 10.48550/arXiv.2211.15209.
https:/​/​doi.org/​10.48550/​arXiv.2211.15209
arXiv:2211.15209

[22] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural Computation, 9 (8): 1735–1780, 1997. 10.1162/neco.1997.9.8.1735.
https:/​/​doi.org/​10.1162/​neco.1997.9.8.1735

[23] Hsin-Yuan Huang, Richard Kueng, and John Preskill. Information-theoretic bounds on quantum advantage in machine learning. Phys. Rev. Lett., 126: 190505, May 2021a. 10.1103/​PhysRevLett.126.190505.
https:/​/​doi.org/​10.1103/​PhysRevLett.126.190505

[24] Hsin-Yuan Huang, Richard Kueng, Giacomo Torlai, Victor V Albert, and John Preskill. Provably efficient machine learning for quantum many-body problems. Science, 2021b. 10.1126/science.abk3333.
https:/​/​doi.org/​10.1126/​science.abk3333

[25] Li Huang and Lei Wang. Accelerated Monte Carlo simulations with restricted Boltzmann machines. Phys. Rev. B, 95: 035105, Jan 2017. 10.1103/PhysRevB.95.035105.
https:/​/​doi.org/​10.1103/​PhysRevB.95.035105

[26] Marko Žnidarič and Martin Horvat. Exponential complexity of an adiabatic algorithm for an NP-complete problem. Phys. Rev. A, 73: 022329, Feb 2006. 10.1103/PhysRevA.73.022329.
https:/​/​doi.org/​10.1103/​PhysRevA.73.022329

[27] J Robert Johansson, Paul D Nation, and Franco Nori. QuTiP: An open-source Python framework for the dynamics of open quantum systems. Computer Physics Communications, 183 (8): 1760–1772, 2012. 10.1016/j.cpc.2012.02.021.
https://doi.org/10.1016/j.cpc.2012.02.021

[28] Wolfgang Lechner, Philipp Hauke, and Peter Zoller. A quantum annealing architecture with all-to-all connectivity from local interactions. Science advances, 1 (9): e1500838, 2015. 10.1126/​sciadv.1500838.
https:/​/​doi.org/​10.1126/​sciadv.1500838

[29] Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521 (7553): 436–444, 2015. 10.1038/nature14539.
https:/​/​doi.org/​10.1038/​nature14539

[30] Daniel A Lidar, Ali T Rezakhani, and Alioscia Hamma. Adiabatic approximation with exponential accuracy for many-body systems and quantum computation. Journal of Mathematical Physics, 50 (10): 102106, 2009. 10.1063/​1.3236685.
https:/​/​doi.org/​10.1063/​1.3236685

[31] Yuichiro Matsuzaki, Hideaki Hakoshima, Kenji Sugisaki, Yuya Seki, and Shiro Kawabata. Direct estimation of the energy gap between the ground state and excited state with quantum annealing. Japanese Journal of Applied Physics, 60 (SB): SBBI02, 2021. 10.1088/​0305-4470/​15/​10/​028.
https:/​/​doi.org/​10.1088/​0305-4470/​15/​10/​028

[32] Matija Medvidović and Giuseppe Carleo. Classical variational simulation of the quantum approximate optimization algorithm. npj Quantum Information, 7 (1): 1–7, 2021. 10.1038/​s41534-021-00440-z.
https:/​/​doi.org/​10.1038/​s41534-021-00440-z

[33] Tomáš Mikolov, Martin Karafiát, Lukáš Burget, Jan Černocký, and Sanjeev Khudanpur. Recurrent neural network based language model. In Eleventh Annual Conference of the International Speech Communication Association, 2010. 10.21437/Interspeech.2010-343.
https:/​/​doi.org/​10.21437/​Interspeech.2010-343

[34] Naeimeh Mohseni, Marek Narozniak, Alexey N Pyrkov, Valentin Ivannikov, Jonathan P Dowling, and Tim Byrnes. Error suppression in adiabatic quantum computing with qubit ensembles. npj Quantum Information, 7 (1): 1–10, 2021. 10.1038/s41534-021-00405-2.
https:/​/​doi.org/​10.1038/​s41534-021-00405-2

[35] Naeimeh Mohseni, Thomas Fösel, Lingzhen Guo, Carlos Navarrete-Benlloch, and Florian Marquardt. Deep learning of quantum many-body dynamics via random driving. Quantum, 6: 714, 2022a. 10.22331/​q-2022-05-17-714.
https:/​/​doi.org/​10.22331/​q-2022-05-17-714

[36] Naeimeh Mohseni, Peter L McMahon, and Tim Byrnes. Ising machines as hardware solvers of combinatorial optimization problems. Nature Reviews Physics, 4 (6): 363–379, 2022b. 10.1038/​s42254-022-00440-8.
https:/​/​doi.org/​10.1038/​s42254-022-00440-8

[37] Naeimeh Mohseni, Junheng Shi, Tim Byrnes, and Michael Hartmann. Deep learning of many-body observables and quantum information scrambling. arXiv preprint arXiv:2302.04621, 2023. 10.48550/​arXiv.2302.04621.
https:/​/​doi.org/​10.48550/​arXiv.2302.04621
arXiv:2302.04621

[38] Michael A Nielsen. Neural Networks and Deep Learning. Determination Press, San Francisco, CA, 2015.

[39] Murphy Yuezhen Niu, Andrew M Dai, Li Li, Augustus Odena, Zhengli Zhao, Vadim Smelyanskiy, Hartmut Neven, and Sergio Boixo. Learnability and complexity of quantum samples. arXiv preprint arXiv:2010.11983, 2020. 10.48550/arXiv.2010.11983.
https:/​/​doi.org/​10.48550/​arXiv.2010.11983
arXiv:2010.11983

[40] Asier Ozaeta, Wim van Dam, and Peter L McMahon. Expectation values from the single-layer quantum approximate optimization algorithm on Ising problems. Quantum Science and Technology, 7 (4): 045036, 2022. 10.1088/2058-9565/ac9013.
https:/​/​doi.org/​10.1088/​2058-9565/​ac9013

[41] Boris Pittel, Joel Spencer, and Nicholas Wormald. Sudden emergence of a giant k-core in a random graph. Journal of Combinatorial Theory, Series B, 67 (1): 111–151, 1996. 10.1006/jctb.1996.0036.
https:/​/​doi.org/​10.1006/​jctb.1996.0036

[42] Jérémie Roland and Nicolas J Cerf. Quantum search by local adiabatic evolution. Physical Review A, 65 (4): 042308, 2002. 10.1103/​PhysRevA.65.042308.
https:/​/​doi.org/​10.1103/​PhysRevA.65.042308

[43] A. E. Russo, K. M. Rudinger, B. C. A. Morrison, and A. D. Baczewski. Evaluating energy differences on a quantum computer with robust phase estimation. Phys. Rev. Lett., 126: 210501, May 2021. 10.1103/​PhysRevLett.126.210501.
https:/​/​doi.org/​10.1103/​PhysRevLett.126.210501

[44] Zain H Saleem, Teague Tomesh, Michael A Perlin, Pranav Gokhale, and Martin Suchara. Quantum divide and conquer for combinatorial optimization and distributed computing. arXiv preprint arXiv:2107.07532, 2021. 10.48550/​arXiv.2107.07532.
https:/​/​doi.org/​10.48550/​arXiv.2107.07532
arXiv:2107.07532

[45] N. Saraceni, S. Cantori, and S. Pilati. Scalable neural networks for the efficient learning of disordered quantum systems. Phys. Rev. E, 102: 033301, Sep 2020. 10.1103/​PhysRevE.102.033301.
https:/​/​doi.org/​10.1103/​PhysRevE.102.033301

[46] Gernot Schaller. Adiabatic preparation without quantum phase transitions. Phys. Rev. A, 78: 032328, Sep 2008. 10.1103/​PhysRevA.78.032328.
https:/​/​doi.org/​10.1103/​PhysRevA.78.032328

[47] Markus Schmitt and Markus Heyl. Quantum many-body dynamics in two dimensions with artificial neural networks. Phys. Rev. Lett., 125: 100503, Sep 2020. 10.1103/​PhysRevLett.125.100503.
https:/​/​doi.org/​10.1103/​PhysRevLett.125.100503

[48] Ralf Schützhold. Dynamical quantum phase transitions. Journal of Low Temperature Physics, 153 (5-6): 228–243, 2008. 10.1007/​s10909-008-9831-5.
https:/​/​doi.org/​10.1007/​s10909-008-9831-5

[49] Xingjian Shi, Zhourong Chen, Hao Wang, Dit-Yan Yeung, Wai-Kin Wong, and Wang-chun Woo. Convolutional LSTM network: A machine learning approach for precipitation nowcasting. In C. Cortes, N. Lawrence, D. Lee, M. Sugiyama, and R. Garnett, editors, Advances in Neural Information Processing Systems, volume 28. Curran Associates, Inc., 2015. URL https://proceedings.neurips.cc/paper_files/paper/2015/file/07563a3fe3bbe7e3ba84431ad9d055af-Paper.pdf.
https:/​/​proceedings.neurips.cc/​paper_files/​paper/​2015/​file/​07563a3fe3bbe7e3ba84431ad9d055af-Paper.pdf

[50] Giacomo Torlai, Guglielmo Mazzola, Juan Carrasquilla, Matthias Troyer, Roger Melko, and Giuseppe Carleo. Neural-network quantum state tomography. Nature Physics, 14 (5): 447–450, 2018. 10.1038/​s41567-018-0048-5.
https:/​/​doi.org/​10.1038/​s41567-018-0048-5

[51] Evert PL Van Nieuwenburg, Ye-Hua Liu, and Sebastian D Huber. Learning phase transitions by confusion. Nature Physics, 13 (5): 435–439, 2017. 10.1038/​nphys4037.
https:/​/​doi.org/​10.1038/​nphys4037

[52] Filippo Vicentini. Machine learning toolbox for quantum many body physics. Nature Reviews Physics, 3 (3): 156–156, 2021. 10.1038/​s42254-021-00285-7.
https:/​/​doi.org/​10.1038/​s42254-021-00285-7

[53] Lei Wang. Discovering phase transitions with unsupervised learning. Phys. Rev. B, 94: 195105, Nov 2016. 10.1103/​PhysRevB.94.195105.
https:/​/​doi.org/​10.1103/​PhysRevB.94.195105

[54] Sebastian J Wetzel. Unsupervised learning of phase transitions: From principal component analysis to variational autoencoders. Physical Review E, 96 (2): 022140, 2017. 10.1103/​PhysRevE.96.022140.
https:/​/​doi.org/​10.1103/​PhysRevE.96.022140

[55] Xingjian Shi, Zhourong Chen, Hao Wang, Dit-Yan Yeung, Wai-Kin Wong, and Wang-chun Woo. Convolutional LSTM network: A machine learning approach for precipitation nowcasting. In Advances in Neural Information Processing Systems, pages 802–810, 2015. URL https://proceedings.neurips.cc/paper_files/paper/2015/file/07563a3fe3bbe7e3ba84431ad9d055af-Paper.pdf.
https:/​/​proceedings.neurips.cc/​paper_files/​paper/​2015/​file/​07563a3fe3bbe7e3ba84431ad9d055af-Paper.pdf

[56] A. P. Young, S. Knysh, and V. N. Smelyanskiy. Size dependence of the minimum excitation gap in the quantum adiabatic algorithm. Phys. Rev. Lett., 101: 170503, Oct 2008. 10.1103/​PhysRevLett.101.170503.
https:/​/​doi.org/​10.1103/​PhysRevLett.101.170503

Cited by

[1] Naeimeh Mohseni, Peter L. McMahon, and Tim Byrnes, “Ising machines as hardware solvers of combinatorial optimization problems”, Nature Reviews Physics 4 6, 363 (2022).

[2] Pratibha Raghupati Hegde, Gianluca Passarelli, Giovanni Cantele, and Procolo Lucignano, “Deep learning optimal quantum annealing schedules for random Ising models”, arXiv:2211.15209, (2022).

[3] Naeimeh Mohseni, Thomas Fösel, Lingzhen Guo, Carlos Navarrete-Benlloch, and Florian Marquardt, “Deep Learning of Quantum Many-Body Dynamics via Random Driving”, Quantum 6, 714 (2022).

[4] Alexander Gresch, Lennart Bittel, and Martin Kliesch, “Scalable approach to many-body localization via quantum data”, arXiv:2202.08853, (2022).

[5] Naeimeh Mohseni, Junheng Shi, Tim Byrnes, and Michael Hartmann, “Deep learning of many-body observables and quantum information scrambling”, arXiv:2302.04621, (2023).

The above citations are from SAO/NASA ADS (last updated successfully 2023-06-12 11:24:38). The list may be incomplete as not all publishers provide suitable and complete citation data.
