King's College London

Research portal

Computation Offloading in Energy Harvesting Powered MEC Network

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Original language: English
Title of host publication: ICC 2021 - IEEE International Conference on Communications, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728171227
DOIs
Published: Jun 2021
Event: 2021 IEEE International Conference on Communications, ICC 2021 - Virtual, Online, Canada
Duration: 14 Jun 2021 – 23 Jun 2021

Publication series

Name: IEEE International Conference on Communications
ISSN (Print): 1550-3607

Conference

Conference: 2021 IEEE International Conference on Communications, ICC 2021
Country/Territory: Canada
City: Virtual, Online
Period: 14/06/2021 – 23/06/2021

Bibliographical note

Publisher Copyright: © 2021 IEEE. Copyright 2021 Elsevier B.V., all rights reserved.


Abstract

Mobile edge computing (MEC) is a promising technique that migrates computationally intensive tasks from smart devices to edge servers, increasing the computational capacity of smart devices while saving battery energy. In this paper, we consider an MEC network in which a smart device equipped with an energy harvesting module and electricity storage chooses an offloading rate to offload its computational task to one of the edge servers. We formulate the offloading problem as a long-term joint minimization of energy consumption and delay, subject to the smart device's quality-of-experience constraints. The challenge is that the time-varying renewable energy generation and the energy consumed by the current action both affect the next battery level. To address this problem, we use a reinforcement learning model that accounts for future dynamics of the environment. To this end, we develop an algorithm based on a noisy deep Q-network that automatically adjusts the noise level at the smart device for exploration, and hence replaces the epsilon-greedy policy traditionally used in the Q-learning algorithm. Simulation results show that the proposed algorithm achieves lower energy consumption and better quality of experience than the celebrated deep Q-learning algorithm and a random scheme.
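The key idea in the abstract — replacing the epsilon-greedy policy with learnable noise in the Q-network's layers — can be illustrated with a minimal NumPy sketch of a noisy linear layer with factorized Gaussian noise. This is an illustrative reconstruction of the general noisy-network technique, not the paper's actual implementation; the layer sizes, the state features (battery level, queue length, channel quality), and the action count are hypothetical.

```python
import numpy as np

class NoisyLinear:
    """Linear layer with learnable, factorized Gaussian noise on its weights.

    Exploration comes from sampling the weight noise at each forward pass,
    so no epsilon-greedy schedule is needed: the noise scales (sigma) are
    trained alongside the means (mu) and can shrink as learning progresses.
    """
    def __init__(self, in_dim, out_dim, sigma0=0.5, seed=0):
        self.rng = np.random.default_rng(seed)
        bound = 1.0 / np.sqrt(in_dim)
        # Learnable parameters: mean and noise scale for weights and biases.
        self.w_mu = self.rng.uniform(-bound, bound, (out_dim, in_dim))
        self.w_sigma = np.full((out_dim, in_dim), sigma0 * bound)
        self.b_mu = self.rng.uniform(-bound, bound, out_dim)
        self.b_sigma = np.full(out_dim, sigma0 * bound)
        self.in_dim, self.out_dim = in_dim, out_dim

    @staticmethod
    def _f(x):
        # Factorized-noise transform: f(x) = sign(x) * sqrt(|x|)
        return np.sign(x) * np.sqrt(np.abs(x))

    def forward(self, x, explore=True):
        if explore:
            # Factorized noise: one vector per input, one per output.
            eps_in = self._f(self.rng.standard_normal(self.in_dim))
            eps_out = self._f(self.rng.standard_normal(self.out_dim))
            w = self.w_mu + self.w_sigma * np.outer(eps_out, eps_in)
            b = self.b_mu + self.b_sigma * eps_out
        else:
            # Evaluation: use the mean weights only (deterministic policy).
            w, b = self.w_mu, self.b_mu
        return x @ w.T + b

# Greedy action over noisy Q-values: the weight noise itself drives
# exploration, so argmax is taken directly (no epsilon coin-flip).
layer = NoisyLinear(in_dim=4, out_dim=3)    # 3 hypothetical offloading actions
state = np.array([0.6, 0.2, 0.8, 0.1])      # e.g. battery, queue, channel, task size
q_values = layer.forward(state)             # noisy Q-value estimate
action = int(np.argmax(q_values))
```

In a full agent, a layer like this would replace the last (or last two) dense layers of the Q-network, and the sigma parameters would be updated by the same gradient step as the rest of the network, which is what lets the exploration level adapt automatically.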

