B. Liu, X. Liu, S. Gao, X. Cheng and L. Yang, "LLM4CP: Adapting Large Language Models for Channel Prediction," in Journal of Communications and Information Networks, vol. 9, no. 2, pp. 113-125, June 2024, doi: 10.23919/JCIN.2024.10582829. [paper]
- Python 3.8 (Anaconda is recommended)
- PyTorch 2.0.0
- NVIDIA GPU + CUDA
- Python packages: `pip install -r requirements.txt`
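A quick way to confirm your environment matches the requirements above (a minimal sketch; the expected outputs assume PyTorch 2.0.0 on a CUDA-enabled machine):

```python
# Sanity check for the requirements listed above.
import torch

print(torch.__version__)          # expect something like "2.0.0"
print(torch.cuda.is_available())  # expect True on a correctly configured NVIDIA GPU + CUDA setup
```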
The datasets used in this paper can be downloaded from the following links:
- [Training Dataset]
- [Testing Dataset]
We generate the datasets via QuaDRiGa. To assist researchers in the field of channel prediction, we provide a runnable demo file in the `data_generation` folder. For more detailed information about the QuaDRiGa generator, please refer to its user documentation, `quadriga_documentation_v2.8.1-0.pdf`.
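As a convenience, here is a minimal sketch for inspecting a generated channel file before training. The file name `H_U_his_train.mat` comes from the instructions below; the MATLAB save format and the variable names inside the file are assumptions, so print the keys first and adapt accordingly.

```python
# Inspect the variables stored in a QuaDRiGa-generated .mat file.
# Handles both older MATLAB formats (scipy) and v7.3/HDF5 files (h5py).
import scipy.io
import h5py

def inspect_mat(path):
    try:
        data = scipy.io.loadmat(path)  # works for MATLAB <= v7.2 files
        keys = [k for k in data if not k.startswith("__")]
        print(path, "->", {k: data[k].shape for k in keys})
    except NotImplementedError:
        with h5py.File(path, "r") as f:  # MATLAB v7.3 files are HDF5 containers
            print(path, "->", {k: f[k].shape for k in f.keys()})

inspect_mat("H_U_his_train.mat")  # path is a placeholder; point it at your download
```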
The training and testing code is in the current folder.
- The code for training is in `train.py`, while the code for testing is in `test_tdd_full.py` and `test_fdd_full.py`. We also provide our pretrained models in [Weights].
- For full-shot training, set the file paths in the main function to match your training dataset. For example, to run a full-shot experiment in the TDD scenario, modify `train_TDD_r_path` and `train_TDD_t_path` in `train.py` to the locations of your downloaded `H_U_his_train.mat` and `H_U_pre_train.mat`, respectively, then run `train.py` (see the configuration sketch below this list).
- For few-shot training, likewise set the file paths in the main function to match your training dataset. Then set `is_few=1` when creating the training set in `train.py`, like this: `train_set = Dataset_Pro(train_TDD_r_path, train_TDD_t_path, is_few=1)`, and run `train.py`.
- For testing, set the file paths in the main function to match your testing dataset. Then run `test_tdd_full.py` to obtain the results in Figure 7 of the paper, and `test_fdd_full.py` to obtain the results in Figure 8. You can also load the data under `Testing Dataset/Umi` to evaluate the models' zero-shot performance.
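As referenced in the full-shot and few-shot steps above, the path setup in `train.py` might look like the sketch below. Only `train_TDD_r_path`, `train_TDD_t_path`, `Dataset_Pro`, and the `is_few` flag come from the instructions; the directory layout and the import location are placeholders for your own setup.

```python
# Hypothetical configuration inside train.py's main function.
from data import Dataset_Pro  # assumption: import from wherever Dataset_Pro is defined in this repo

# Point these at your downloaded TDD training files.
train_TDD_r_path = "./dataset/TDD/H_U_his_train.mat"  # historical CSI (model input)
train_TDD_t_path = "./dataset/TDD/H_U_pre_train.mat"  # future CSI (prediction target)

# Full-shot training set.
train_set = Dataset_Pro(train_TDD_r_path, train_TDD_t_path)

# Few-shot training set: pass is_few=1 instead.
# train_set = Dataset_Pro(train_TDD_r_path, train_TDD_t_path, is_few=1)
```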
If you find this repo helpful, please cite our paper.
@article{liu2024llm4cp,
  title={LLM4CP: Adapting Large Language Models for Channel Prediction},
  author={Liu, Boxun and Liu, Xuanyu and Gao, Shijian and Cheng, Xiang and Yang, Liuqing},
  journal={arXiv preprint arXiv:2406.14440},
  year={2024}
}