TY - GEN
T1 - Multi-scenario time-domain channel extrapolation
T2 - 24th IEEE International Conference on Communication Technology, ICCT 2024
AU - Yu, Wenjun
AU - Jiang, Jun
AU - Gao, Yuan
AU - Xu, Shugong
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Time-domain channel prediction offers a promising solution for obtaining Channel State Information (CSI) in high-mobility communication systems, while minimizing overhead. However, current deep learning-based channel prediction models face significant challenges in generalization, often performing poorly when applied to scenarios different from their training data. To address this challenge, we propose a novel transformer-based time-domain channel prediction framework that generalizes effectively across multiple scenarios. Extensive simulations demonstrate that our proposed framework substantially outperforms conventional models in terms of generalization capability across various scenarios and signal-to-noise ratios (SNR). The models we compared include Long Short-Term Memory (LSTM) networks, Gated Recurrent Units (GRUs), Bidirectional GRU (BiGRU), and standard Transformer architectures. Our results underscore the potential of this approach to significantly advance the field of channel prediction in dynamic communication environments.
AB - Time-domain channel prediction offers a promising solution for obtaining Channel State Information (CSI) in high-mobility communication systems, while minimizing overhead. However, current deep learning-based channel prediction models face significant challenges in generalization, often performing poorly when applied to scenarios different from their training data. To address this challenge, we propose a novel transformer-based time-domain channel prediction framework that generalizes effectively across multiple scenarios. Extensive simulations demonstrate that our proposed framework substantially outperforms conventional models in terms of generalization capability across various scenarios and signal-to-noise ratios (SNR). The models we compared include Long Short-Term Memory (LSTM) networks, Gated Recurrent Units (GRUs), Bidirectional GRU (BiGRU), and standard Transformer architectures. Our results underscore the potential of this approach to significantly advance the field of channel prediction in dynamic communication environments.
KW - Channel prediction
KW - position encoding
KW - Transformer
UR - http://www.scopus.com/inward/record.url?scp=105003115154&partnerID=8YFLogxK
U2 - 10.1109/ICCT62411.2024.10946280
DO - 10.1109/ICCT62411.2024.10946280
M3 - Conference Proceeding
AN - SCOPUS:105003115154
T3 - International Conference on Communication Technology Proceedings, ICCT
SP - 1446
EP - 1450
BT - 2024 IEEE 24th International Conference on Communication Technology, ICCT 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 18 October 2024 through 20 October 2024
ER -