QoE Prediction in RIC-assisted Wireless Real-time Video Transmission System with Transformer

Yanzan Sun, Wanquan Xiong, Guangjin Pan*, Shugong Xu, Shunqing Zhang, Xiaojing Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Immersive communication is one of the key scenarios of 6G and requires the network to support real-time communication. In this paper, we exploit low-level wireless-network information for real-time video quality of experience (QoE) perception and prediction at the application layer. We propose a Transformer-based two-stage QoE perception and prediction (TTPP) algorithm, whose encoder and decoder handle perception and prediction, respectively. We further introduce two data augmentation methods to improve model robustness, and we deploy and evaluate the proposed algorithm in a prototype system. Test results show that, compared with the baseline algorithm, the proposed algorithm reduces the mean absolute error (MAE) by 12.5% to 44.0%.
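
The abstract describes an encoder-decoder Transformer in which the encoder perceives current QoE from low-level network measurements and the decoder predicts future QoE. The sketch below is only a minimal illustration of that kind of two-stage structure; the paper's actual feature set, window lengths, model sizes, and training procedure are not given in the abstract, so NUM_FEATURES, WINDOW, HORIZON, D_MODEL and the class name TwoStageQoEModel are all assumptions.

```python
# Illustrative sketch only: all sizes and names below are assumptions,
# not the TTPP implementation described in the paper.
import torch
import torch.nn as nn

NUM_FEATURES = 16   # assumed number of low-level radio metrics per time step
WINDOW = 32         # assumed length of the observed measurement window
HORIZON = 8         # assumed number of future QoE steps to predict
D_MODEL = 64        # assumed Transformer width


class TwoStageQoEModel(nn.Module):
    """Encoder-decoder Transformer: the encoder estimates current QoE from
    low-level measurements (perception); the decoder predicts future QoE."""

    def __init__(self):
        super().__init__()
        self.in_proj = nn.Linear(NUM_FEATURES, D_MODEL)
        self.transformer = nn.Transformer(
            d_model=D_MODEL, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            dim_feedforward=128, batch_first=True,
        )
        # Learned decoder queries, one per future prediction step.
        self.query = nn.Parameter(torch.zeros(HORIZON, D_MODEL))
        self.percep_head = nn.Linear(D_MODEL, 1)  # stage 1: current QoE estimate
        self.pred_head = nn.Linear(D_MODEL, 1)    # stage 2: future QoE trajectory

    def forward(self, x):
        # x: (batch, WINDOW, NUM_FEATURES) window of low-level measurements
        h = self.in_proj(x)
        memory = self.transformer.encoder(h)
        qoe_now = self.percep_head(memory[:, -1])               # (batch, 1)
        tgt = self.query.unsqueeze(0).expand(x.size(0), -1, -1)
        dec = self.transformer.decoder(tgt, memory)
        qoe_future = self.pred_head(dec).squeeze(-1)            # (batch, HORIZON)
        return qoe_now, qoe_future


if __name__ == "__main__":
    model = TwoStageQoEModel()
    batch = torch.randn(4, WINDOW, NUM_FEATURES)
    now, future = model(batch)
    print(now.shape, future.shape)  # torch.Size([4, 1]) torch.Size([4, 8])
```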

Original language: English
Journal: IEEE Transactions on Vehicular Technology
DOIs
Publication status: Accepted/In press - 2025
Externally published: Yes

Keywords

  • deep learning
  • prediction
  • QoE
  • wireless extended reality
