TY - JOUR
T1 - Impact of barren plateaus countermeasures on the Quantum Neural Network capacity to learn
AU - Cybulski, Jacob L.
AU - Nguyen, Thanh
N1 - Funding Information:
Author Jacob L. Cybulski holds an Honorary Associate Professorship at the School of IT, Deakin University and has no relevant financial or non-financial interests to disclose. Author Thanh Nguyen’s contribution to this project was partially funded by 2022 Summer Project Prize from the School of IT, Deakin University.
Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2023/12/14
Y1 - 2023/12/14
N2 - Training of Quantum Neural Networks can be affected by barren plateaus—flat areas in the landscape of the cost function, which impede the model optimisation. While there exist methods of dealing with barren plateaus, they could reduce the model’s effective dimension—the measure of its capacity to learn. This paper therefore reports an investigation of four barren plateaus countermeasures: restricting the model’s circuit depth and relying on the local cost function; layer-by-layer circuit pre-training; relying on the circuit block structure to support its initialisation; and model creation without any constraints. Several experiments were conducted to analyse the impact of each countermeasure on the model training, its subsequent ability to generalise, and its effective dimension. The results reveal which of the approaches enhances or impedes the quantum model’s capacity to learn, which gives more predictable learning outcomes, and which is more sensitive to training data. Finally, the paper provides some recommendations on how to utilise effective dimension measurements to assist quantum model development.
AB - Training of Quantum Neural Networks can be affected by barren plateaus—flat areas in the landscape of the cost function, which impede the model optimisation. While there exist methods of dealing with barren plateaus, they could reduce the model’s effective dimension—the measure of its capacity to learn. This paper therefore reports an investigation of four barren plateaus countermeasures: restricting the model’s circuit depth and relying on the local cost function; layer-by-layer circuit pre-training; relying on the circuit block structure to support its initialisation; and model creation without any constraints. Several experiments were conducted to analyse the impact of each countermeasure on the model training, its subsequent ability to generalise, and its effective dimension. The results reveal which of the approaches enhances or impedes the quantum model’s capacity to learn, which gives more predictable learning outcomes, and which is more sensitive to training data. Finally, the paper provides some recommendations on how to utilise effective dimension measurements to assist quantum model development.
KW - Barren plateaus
KW - Effective dimension
KW - Quantum computing
KW - Quantum information
KW - Quantum neural networks
KW - Variational quantum algorithms
UR - http://www.scopus.com/inward/record.url?scp=85179695914&partnerID=8YFLogxK
U2 - 10.1007/s11128-023-04187-8
DO - 10.1007/s11128-023-04187-8
M3 - Article
AN - SCOPUS:85179695914
SN - 1570-0755
VL - 22
JO - Quantum Information Processing
JF - Quantum Information Processing
IS - 12
M1 - 442
ER -