LLM4WM: Adapting LLM for Wireless Multi-Tasking

Liu, Xuanyu, et al. "LLM4WM: Adapting LLM for Wireless Multi-Tasking." IEEE Transactions on Machine Learning in Communications and Networking (2025). [paper]

Dependencies and Installation

  • Python 3.8 (Anaconda recommended)
  • PyTorch 2.0.0
  • NVIDIA GPU + CUDA
  • Python packages: pip install -r requirements.txt
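Before installing, it can help to confirm the environment matches the list above. The following is a minimal preflight sketch (not part of the repository): the Python and PyTorch versions come from the dependency list, and the torch/CUDA probe is guarded so the script also runs before PyTorch is installed.

```python
import sys

def preflight(min_python=(3, 8)):
    """Report whether the running environment matches the recommended setup."""
    report = {"python_ok": sys.version_info[:2] >= min_python}
    try:
        import torch  # PyTorch 2.0.0 is the version recommended above
        report["torch_version"] = torch.__version__
        report["cuda_available"] = torch.cuda.is_available()
    except ImportError:
        # PyTorch not installed yet; install it before running the repo
        report["torch_version"] = None
        report["cuda_available"] = False
    return report

print(preflight())
```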

Dataset Preparation

The test datasets used in this paper are generated by QuaDRiGa and can be downloaded from the following link. [Testing Dataset]

Get Started

Step 1: Prepare the Files

  • Dataset: Download the dataset and place it under the data/ folder in the root directory.
  • GPT-2 Weights: Download the GPT-2 weights and put them into the pretrain/ folder.
  • LLM4WM Weights: Download our provided pretrained weights of LLM4WM and store them in the Weights/ folder.

Step 2: Run Inference

Once all the required files are in place, you can evaluate our pretrained model with:

python inference.py

Citation

If you find this repo helpful, please cite our paper.

@article{liu2025llm4wm,
  title={LLM4WM: Adapting LLM for Wireless Multi-Tasking},
  author={Liu, Xuanyu and Gao, Shijian and Liu, Boxun and Cheng, Xiang and Yang, Liuqing},
  journal={IEEE Transactions on Machine Learning in Communications and Networking},
  year={2025}
}
