Abstract
With the proliferation of vehicular user equipment (V-UE) in Internet-of-Vehicles (IoV) systems, cloud computing alone cannot process all V-UE tasks, especially latency-sensitive ones. Although static roadside fog nodes have been employed to offload computation from V-UEs, mobile fog nodes carried by vehicles, which have the potential to further improve computation offloading for vehicular tasks, have not been sufficiently studied in IoV systems. In this paper, we consider a mixed cloud/vehicular-fog computing (VFC) system that employs vehicle-carried fog nodes (V-FNs) in addition to cloud servers to offload tasks from V-UEs. To minimise the maximum service delay (comprising transmission, queueing, and processing delays) among all V-UEs, we jointly optimise the offloading decisions of all V-UEs, the computation resource allocation at all V-FNs, and the resource block (RB) and transmission power allocation for all V-UEs, while accounting for the mobility of V-UEs and V-FNs. The joint optimisation is solved by devising a fireworks algorithm-based offloading decision optimisation scheme, in conjunction with a bisection method-based V-FN computation resource allocation scheme and a clustering-based communication resource allocation scheme. Simulation results show that our proposed schemes outperform the benchmarks in terms of the maximum service delay among all V-UEs.
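The bisection method-based V-FN computation resource allocation mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a simplified model in which each offloaded task's processing delay is cycles divided by allocated CPU frequency, its transmission and queueing delays are fixed, and all function and variable names are hypothetical.

```python
def min_max_delay_bisection(cycles, fixed_delays, capacity, tol=1e-6):
    """Minimise the maximum service delay T over tasks sharing one V-FN.

    Hypothetical model: task i's service delay is fixed_delays[i]
    (transmission + queueing) plus cycles[i] / f_i (processing), subject
    to sum(f_i) <= capacity. For a target T, task i needs at least
    f_i = cycles[i] / (T - fixed_delays[i]); T is feasible iff those
    minimum frequencies fit within the V-FN capacity, so we bisect on T.
    """
    def required(T):
        # Minimum per-task frequencies to meet target T
        # (None if T does not even cover some task's fixed delay).
        if T <= max(fixed_delays):
            return None
        return [c / (T - d) for c, d in zip(cycles, fixed_delays)]

    lo = max(fixed_delays)            # never achievable (needs infinite frequency)
    hi = lo + sum(cycles) / capacity  # always achievable: each required
    # f_i <= capacity * cycles[i] / sum(cycles), which sums to <= capacity.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        req = required(mid)
        if req is not None and sum(req) <= capacity:
            hi = mid                  # feasible: tighten the upper bound
        else:
            lo = mid                  # infeasible: raise the lower bound
    return hi, required(hi)
```

For example, two tasks of 1 and 2 Gcycles with fixed delays of 0.1 s and 0.2 s on a 10 GHz V-FN yield a minimal maximum delay of about 0.473 s; the allocation equalises every task's total delay at T, which is the usual signature of a min-max optimum.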
Original language | English |
---|---|
Journal | IEEE Transactions on Mobile Computing |
DOIs | |
Publication status | Published - 28 Mar 2025 |
Keywords
- Cloud computing
- V2X communications
- computation offloading
- queueing delay
- resource allocation
- vehicular-fog computing
Fingerprint
Research topics of 'Computation Offloading and Resource Allocation in Mixed Cloud/Vehicular-fog Computing Systems'.

Projects
- Development of a Federated Learning-Based Edge Intelligence Framework for IoT Network Systems
  1/07/23 → 30/06/26
  Project: Internal Research Project (Active)