TY - GEN
T1 - Computation Offloading and Resource Allocation in IoT-Based Mobile Edge Computing Systems
AU - Hu, Bintao
AU - Gao, Yuan
AU - Zhang, Wenzhang
AU - Jia, Dongyao
AU - Liu, Hengyan
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023/11/1
Y1 - 2023/11/1
N2 - With the rise in popularity of artificial intelligence (AI) and Internet of Things (IoT) technologies, advanced AI techniques have been widely applied to support the delay-sensitive tasks of IoT-based user equipment (UE) in IoT systems, allowing IoT-based UEs to offload their tasks to a remote fog, edge, or cloud computing server. To reduce overall delay (which may include transmission, queueing, and processing delays) while efficiently allocating computation resources at a remote server, an efficient offloading decision scheme is needed. In this paper, we propose a two-layer IoT-based network system, where the bottom layer is the IoT-based UE layer, comprising multiple IoT-based UEs, and the top layer is the mobile edge computing (MEC) layer, comprising an edge node embedded in the base station. We propose a double Q-learning-based offloading decision and computation resource allocation optimisation algorithm (DQOCA), which jointly optimises the offloading decisions of all IoT-based UEs and the allocation of computation resources at the MEC server to reduce the maximum delay among all IoT-based UEs. Simulation results show that, compared with benchmarks (i.e., local processing and edge processing schemes), our proposed approach significantly reduces the maximum delay.
AB - With the rise in popularity of artificial intelligence (AI) and Internet of Things (IoT) technologies, advanced AI techniques have been widely applied to support the delay-sensitive tasks of IoT-based user equipment (UE) in IoT systems, allowing IoT-based UEs to offload their tasks to a remote fog, edge, or cloud computing server. To reduce overall delay (which may include transmission, queueing, and processing delays) while efficiently allocating computation resources at a remote server, an efficient offloading decision scheme is needed. In this paper, we propose a two-layer IoT-based network system, where the bottom layer is the IoT-based UE layer, comprising multiple IoT-based UEs, and the top layer is the mobile edge computing (MEC) layer, comprising an edge node embedded in the base station. We propose a double Q-learning-based offloading decision and computation resource allocation optimisation algorithm (DQOCA), which jointly optimises the offloading decisions of all IoT-based UEs and the allocation of computation resources at the MEC server to reduce the maximum delay among all IoT-based UEs. Simulation results show that, compared with benchmarks (i.e., local processing and edge processing schemes), our proposed approach significantly reduces the maximum delay.
KW - double Q-learning
KW - edge computing
KW - Internet of Things
KW - resource allocation
UR - http://www.scopus.com/inward/record.url?scp=85178502159&partnerID=8YFLogxK
U2 - 10.1109/SmartIoT58732.2023.00024
DO - 10.1109/SmartIoT58732.2023.00024
M3 - Conference Proceeding
AN - SCOPUS:85178502159
T3 - Proceedings - 2023 IEEE International Conference on Smart Internet of Things, SmartIoT 2023
SP - 119
EP - 123
BT - 2023 IEEE International Conference on Smart Internet of Things, SmartIoT 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 7th IEEE International Conference on Smart Internet of Things, SmartIoT 2023
Y2 - 25 August 2023 through 27 August 2023
ER -