King's College London

Research portal

A Bandit Approach to Price-Aware Energy Management in Cellular Networks

Research output: Contribution to journal › Article › peer-review

Original language: English
Article number: 7887725
Pages (from-to): 1609-1612
Number of pages: 4
Journal: IEEE Communications Letters
Volume: 21
Issue number: 7
Early online date: 27 Mar 2017
DOIs
Accepted/In press: 27 Mar 2017
E-pub ahead of print: 27 Mar 2017
Published: 1 Jul 2017

Documents

  • A Bandit Approach to_ZHANG_Publishedonline27March2017_GREEN AAM (non-CC)

    final_version.pdf, 357 KB, application/pdf

    Uploaded date: 23 Jun 2017

    Version: Accepted author manuscript

    © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.


Abstract

We introduce a reinforcement learning algorithm, inspired by the combinatorial multi-armed bandit problem, to minimize the time-averaged energy cost at individual base stations (BSs) powered by various energy markets and local renewable energy sources, over a finite time horizon. The algorithm sustains traffic demands by enabling sparse beamforming to schedule dynamic user-to-BS allocation, and by proactive energy provisioning at BSs to make ahead-of-time price-aware energy management decisions. Simulation results indicate that the proposed algorithm achieves superior performance in reducing the overall energy cost compared with recently proposed cooperative energy management designs.
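To illustrate the bandit idea sketched in the abstract, the toy example below shows a single BS repeatedly choosing which energy market to buy from while the mean prices are unknown, using a UCB-style index adapted for cost minimisation. This is only a minimal sketch, not the paper's combinatorial algorithm: the number of markets, their mean prices, the noise model, and the horizon are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): one BS chooses, in each slot,
# which energy market to procure from ahead of time. True mean per-unit
# prices are unknown to the learner.
TRUE_MEAN_PRICE = np.array([0.8, 0.5, 0.65])   # assumed per-unit costs
N_ARMS = len(TRUE_MEAN_PRICE)
HORIZON = 5000

counts = np.zeros(N_ARMS)      # how often each market was chosen
mean_cost = np.zeros(N_ARMS)   # empirical mean per-unit cost per market

total_cost = 0.0
for t in range(1, HORIZON + 1):
    if t <= N_ARMS:
        arm = t - 1            # play each market once to initialise
    else:
        # Optimism for cost minimisation: subtract a confidence radius
        # from the empirical mean cost and pick the smallest index.
        bonus = np.sqrt(2.0 * np.log(t) / counts)
        arm = int(np.argmin(mean_cost - bonus))

    # Observed price is noisy around the (unknown) market mean.
    price = TRUE_MEAN_PRICE[arm] + 0.1 * rng.standard_normal()
    total_cost += price

    counts[arm] += 1
    mean_cost[arm] += (price - mean_cost[arm]) / counts[arm]

print(f"time-averaged cost: {total_cost / HORIZON:.3f}")
print(f"best single-market mean price: {TRUE_MEAN_PRICE.min():.3f}")

In this simplified setting the time-averaged cost approaches the cheapest market's mean price; the letter's algorithm additionally handles combinatorial decisions (user-to-BS allocation via sparse beamforming and ahead-of-time energy provisioning), which this sketch does not attempt to model.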


