Impact of barren plateaus countermeasures on the Quantum Neural Network capacity to learn

Jacob L. Cybulski, Thanh Nguyen

Research output: Contribution to journal › Article › Research › peer-review

3 Citations (Scopus)

Abstract

Training of Quantum Neural Networks can be affected by barren plateaus: flat regions in the landscape of the cost function that impede model optimisation. While methods exist for dealing with barren plateaus, they may reduce the model's effective dimension, a measure of its capacity to learn. This paper therefore reports an investigation of four barren plateaus countermeasures: restricting the model's circuit depth and relying on a local cost function; layer-by-layer circuit pre-training; relying on the circuit's block structure to support its initialisation; and model creation without any constraints. Several experiments were conducted to analyse the impact of each countermeasure on the model's training, its subsequent ability to generalise, and its effective dimension. The results reveal which of the approaches enhances or impedes the quantum model's capacity to learn, which gives more predictable learning outcomes, and which is more sensitive to training data. Finally, the paper provides recommendations on how to use effective dimension measurements to assist quantum model development.

Original language: English
Article number: 442
Number of pages: 25
Journal: Quantum Information Processing
Volume: 22
Issue number: 12
DOIs
Publication status: Published - 14 Dec 2023
Externally published: Yes

Keywords

  • Barren plateaus
  • Effective dimension
  • Quantum computing
  • Quantum information
  • Quantum neural networks
  • Variational quantum algorithms
