Online learning approach for predictive real-time energy trading in Cloud-RANs

Wan Nur Suryani Firuz Wan Ariffin*, Xinruo Zhang, Mohammad Reza Nakhai, Hasliza A. Rahim, R. Badlishah Ahmad

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)


Constantly changing electricity demand has made variability and uncertainty inherent characteristics of both electric generation and cellular communication systems. This paper develops an online learning algorithm as a prescheduling mechanism to manage this variability and uncertainty and maintain cost-aware, reliable operation in cloud radio access networks (Cloud-RANs). The proposed algorithm employs a combinatorial multi-armed bandit model and minimizes the long-term energy cost at remote radio heads. The algorithm preschedules a set of cost-efficient energy packages to be purchased from an ancillary energy market for future time slots, both by learning from cooperative energy trading at previous time slots and by exploring new energy scheduling strategies at the current time slot. The simulation results confirm a significant performance gain of the proposed scheme in controlling the available power budgets and minimizing the overall energy cost compared with recently proposed approaches for real-time energy resources and energy trading in Cloud-RANs.
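The combinatorial multi-armed bandit idea behind the prescheduler can be illustrated with a small sketch. This is not the paper's algorithm, only a generic combinatorial-UCB-style loop under assumed conditions: `true_costs` models the unknown expected cost of each candidate energy package, `m` packages are prescheduled per time slot, and a lower confidence bound (the minimization counterpart of the usual UCB index) balances exploiting packages that have proven cheap against exploring rarely purchased ones.

```python
import math
import random

def cucb_schedule(true_costs, m, horizon, seed=0):
    """Combinatorial bandit sketch (hypothetical, illustration only):
    each round, preschedule the m packages with the lowest
    lower-confidence-bound cost estimates, observe noisy costs,
    and update per-package statistics."""
    rng = random.Random(seed)
    k = len(true_costs)
    counts = [0] * k        # times each package has been purchased
    means = [0.0] * k       # empirical mean cost of each package
    total_cost = 0.0
    for t in range(1, horizon + 1):
        # Optimistically low cost estimate for minimisation; unseen
        # packages get -inf so every package is tried at least once.
        lcb = [
            means[i] - math.sqrt(2.0 * math.log(t) / counts[i])
            if counts[i] > 0 else float("-inf")
            for i in range(k)
        ]
        # The "super-arm": the m packages with the lowest indices.
        chosen = sorted(range(k), key=lambda i: lcb[i])[:m]
        for i in chosen:
            cost = true_costs[i] + rng.gauss(0.0, 0.1)  # noisy observed cost
            counts[i] += 1
            means[i] += (cost - means[i]) / counts[i]
            total_cost += cost
    return total_cost, counts

# Toy run: 6 candidate energy packages, preschedule 2 per time slot.
total, counts = cucb_schedule([0.2, 0.5, 0.9, 0.3, 0.7, 0.4], m=2, horizon=2000)
```

Over the horizon, purchases concentrate on the packages with the lowest expected cost while the confidence term keeps occasionally re-checking the others, which is the learning/exploration trade-off the abstract describes.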

Original language: English
Article number: 2308
Issue number: 7
Publication status: Published - 1 Apr 2021


  • Cloud radio access network
  • Combinatorial multi-armed bandit
  • Energy trading
  • Online learning


