
Quantum Vision Transformers


El Amine Cherrat1, Iordanis Kerenidis1,2, Natansh Mathur1,2, Jonas Landman3,2, Martin Strahm4, and Yun Yvonna Li4

1IRIF, CNRS – Université Paris Cité, France
2QC Ware, Palo Alto, USA and Paris, France
3School of Informatics, University of Edinburgh, Scotland, UK
4F. Hoffmann-La Roche AG


Abstract

In this work, quantum transformers are designed and analysed in detail by extending the state-of-the-art classical transformer neural network architectures known to be very performant in natural language processing and image analysis. Building upon previous work, which uses parametrised quantum circuits for data loading and orthogonal neural layers, we introduce three types of quantum transformers for training and inference, including a quantum transformer based on compound matrices, which guarantees a theoretical advantage of the quantum attention mechanism over its classical counterpart both in terms of asymptotic run time and the number of model parameters. These quantum architectures can be built using shallow quantum circuits and produce qualitatively different classification models. The three proposed quantum attention layers vary on the spectrum between closely following the classical transformers and exhibiting more quantum characteristics. As building blocks of the quantum transformer, we propose a novel method for loading a matrix as quantum states as well as two new trainable quantum orthogonal layers adaptable to different levels of connectivity and quality of quantum computers. We performed extensive simulations of the quantum transformers on standard medical image datasets that showed competitive, and at times better, performance compared to the classical benchmarks, including the best-in-class classical vision transformers. The quantum transformers we trained on these small-scale datasets require fewer parameters compared to standard classical benchmarks. Finally, we implemented our quantum transformers on superconducting quantum computers and obtained encouraging results for experiments with up to six qubits.
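To make the setting concrete for readers coming from classical deep learning, the short Python sketch below spells out the standard single-head attention computation that the quantum attention layers follow, approximate, or modify. This is the textbook classical formulation, not the paper's quantum circuits; the sizes and names (n_patches, d, W_q, ...) are illustrative assumptions.

# Minimal sketch of classical single-head attention, the baseline that the
# quantum attention layers build on. Sizes and names are illustrative only.
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(X, W_q, W_k, W_v):
    """Single-head attention; rows of X are patch (token) embeddings."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # pairwise patch similarities
    return softmax(scores, axis=-1) @ V       # attention-weighted mixture of values

rng = np.random.default_rng(0)
n_patches, d = 16, 8                          # e.g. 16 image patches with 8 features each
X = rng.normal(size=(n_patches, d))
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
print(attention(X, W_q, W_k, W_v).shape)      # (16, 8)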

In this study, we explore the potential of quantum computing to enhance neural network architectures, focusing on transformers, known for their effectiveness in tasks like language processing and image analysis. We introduce three types of quantum transformers, leveraging parametrised quantum circuits and orthogonal neural layers. These quantum transformers, under some assumptions (e.g., hardware connectivity), could theoretically provide advantages over classical counterparts in terms of both runtime and model parameters. To create these quantum circuits, we present a novel method for loading matrices as quantum states and introduce two trainable quantum orthogonal layers adaptable to different quantum computer capabilities. These layers require only shallow quantum circuits and could help create classification models with unique characteristics. Extensive simulations on medical image datasets demonstrate competitive performance compared to classical benchmarks, even with fewer parameters. Additionally, experiments on superconducting quantum computers yield promising results.
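As a rough, purely classical illustration of the building blocks mentioned above, the following Python sketch simulates an orthogonal layer acting on a unary-encoded (norm-one) vector as a product of two-dimensional Givens rotations, which is how RBS-type gates act on the unary subspace in this line of work. The single-pass ordering of the rotations and all names here are assumptions chosen for brevity, not the exact circuit layouts used in the paper.

# Classical simulation sketch of a trainable orthogonal layer built from
# Givens (RBS-like) rotations acting on a unary-encoded vector. Illustrative only.
import numpy as np

def givens(n, i, j, theta):
    """n x n rotation in the (i, j) plane: the action of one RBS-type gate
    restricted to the n-dimensional unary subspace."""
    G = np.eye(n)
    G[i, i] = G[j, j] = np.cos(theta)
    G[i, j] = np.sin(theta)
    G[j, i] = -np.sin(theta)
    return G

def orthogonal_layer(thetas, n):
    """Compose nearest-neighbour rotations into one orthogonal matrix.
    A single pass over pairs (i, i+1) is assumed here; the paper's circuits
    may use other layouts (e.g. pyramid or butterfly arrangements)."""
    W = np.eye(n)
    for k, i in enumerate(range(n - 1)):
        W = givens(n, i, i + 1, thetas[k]) @ W
    return W

n = 6                                             # e.g. 6 qubits -> 6-dim unary subspace
rng = np.random.default_rng(1)
thetas = rng.uniform(-np.pi, np.pi, size=n - 1)   # trainable rotation angles
x = rng.normal(size=n)
x = x / np.linalg.norm(x)                         # unary loading stores x as amplitudes
W = orthogonal_layer(thetas, n)
y = W @ x
print(np.allclose(W.T @ W, np.eye(n)), np.linalg.norm(y))  # True, 1.0 (norm preserved)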


Cited by

[1] David Peral García, Juan Cruz-Benito, and Francisco José García-Peñalvo, “Systematic Literature Review: Quantum Machine Learning and its applications”, arXiv:2201.04093, (2022).

[2] El Amine Cherrat, Snehal Raj, Iordanis Kerenidis, Abhishek Shekhar, Ben Wood, Jon Dee, Shouvanik Chakrabarti, Richard Chen, Dylan Herman, Shaohan Hu, Pierre Minssen, Ruslan Shaydulin, Yue Sun, Romina Yalovetzky, and Marco Pistoia, “Quantum Deep Hedging”, Quantum 7, 1191 (2023).

[3] Léo Monbroussou, Jonas Landman, Alex B. Grilo, Romain Kukla, and Elham Kashefi, “Trainability and Expressivity of Hamming-Weight Preserving Quantum Circuits for Machine Learning”, arXiv:2309.15547, (2023).

[4] Sohum Thakkar, Skander Kazdaghli, Natansh Mathur, Iordanis Kerenidis, André J. Ferreira-Martins, and Samurai Brito, “Improved Financial Forecasting via Quantum Machine Learning”, arXiv:2306.12965, (2023).

[5] Jason Iaconis and Sonika Johri, “Tensor Network Based Efficient Quantum Data Loading of Images”, arXiv:2310.05897, (2023).

[6] Nishant Jain, Jonas Landman, Natansh Mathur, and Iordanis Kerenidis, “Quantum Fourier Networks for Solving Parametric PDEs”, arXiv:2306.15415, (2023).

[7] Daniel Mastropietro, Georgios Korpas, Vyacheslav Kungurtsev, and Jakub Marecek, “Fleming-Viot helps speed up variational quantum algorithms in the presence of barren plateaus”, arXiv:2311.18090, (2023).

[8] Aliza U. Siddiqui, Kaitlin Gili, and Chris Ballance, “Stressing Out Modern Quantum Hardware: Performance Evaluation and Execution Insights”, arXiv:2401.13793, (2024).

The above citations are from SAO/NASA ADS (last updated successfully 2024-02-22 13:37:43). The list may be incomplete as not all publishers provide suitable and complete citation data.

