Learn to bid: deep reinforcement learning with transformer for energy storage bidding in energy and contingency reserve markets

Jinhao Li, Chang Wang, Yanru Zhang, Hao Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review


As part of efforts to tackle climate change, grid-scale battery energy storage systems (BESS) play an essential role in facilitating reliable and secure power system operation with variable renewable energy (VRE). BESS can balance time-varying electricity demand and supply in the spot market through energy arbitrage and in the frequency control ancillary services (FCAS) market through service enablement or delivery. Effective algorithms are needed for the optimal participation of BESS in multiple markets. Using deep reinforcement learning (DRL), we present a BESS bidding strategy in the joint spot and contingency FCAS markets, leveraging a transformer-based temporal feature extractor to exploit the temporal trends of volatile energy prices. We validate our strategy on real-world historical energy prices in the Australian National Electricity Market (NEM). We demonstrate that the novel DRL-based bidding strategy significantly outperforms benchmarks. The simulation also reveals that the joint bidding in both the spot and contingency FCAS markets can yield a much higher profit than in individual markets. Our work provides a viable use case for the BESS, contributing to the power system operation with high penetration of renewables.
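The paper does not detail its transformer-based temporal feature extractor here, but the idea of summarizing a window of volatile spot prices into a state feature can be sketched with a single self-attention layer. Everything below is a hypothetical illustration: the weights are random, the embedding dimension `d_model=8` and the example prices are invented for the sketch, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_features(prices, d_model=8, seed=0):
    """Embed a window of scalar spot prices, add sinusoidal positional
    encodings, and apply one self-attention layer; the last time step's
    representation serves as a temporal feature for the DRL state."""
    rng = np.random.default_rng(seed)  # random (untrained) weights, for illustration only
    T = len(prices)
    # Linear embedding of each scalar price into d_model dimensions.
    W_in = rng.standard_normal((1, d_model)) * 0.1
    x = np.asarray(prices, dtype=float).reshape(T, 1) @ W_in
    # Standard sinusoidal positional encoding.
    pos = np.arange(T).reshape(T, 1)
    i = np.arange(d_model).reshape(1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    x = x + np.where(i % 2 == 0, np.sin(angle), np.cos(angle))
    # Single-head scaled dot-product self-attention.
    Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(q @ k.T / np.sqrt(d_model))  # (T, T) attention weights
    out = attn @ v
    return out[-1]  # feature vector summarizing the price window

# Illustrative $/MWh spot prices, including one price spike.
window = [62.4, 58.1, 75.3, 301.9, 88.0, 64.2]
features = self_attention_features(window)
print(features.shape)  # (8,)
```

In the actual strategy the attention weights would be learned end-to-end with the DRL policy, and the resulting feature vector would be concatenated with the battery's state of charge and market observations to form the agent's state.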
Original language: English
Title of host publication: NeurIPS 2022 Workshop
Subtitle of host publication: Tackling Climate Change with Machine Learning
Place of publication: San Diego CA USA
Publisher: Neural Information Processing Systems (NIPS)
Number of pages: 8
Publication status: Published - 2022
Event: Tackling Climate Change with Machine Learning 2022 - Barcelona, Spain
Duration: 9 Dec 2022 – 9 Dec 2022


Conference: Tackling Climate Change with Machine Learning 2022
