# CSI-BERT2

This description was generated by Grok 3.
## Model Details

- **Model Name**: CSI-BERT2
- **Model Type**: BERT-inspired transformer for CSI prediction and classification
- **Version**: 2.0
- **Release Date**: August 2025
- **Developers**: Zijian Zhao
- **Organization**: SRIBD, SYSU
- **License**: Apache License 2.0
- **Paper**: [CSI-BERT2: A BERT-inspired Framework for Efficient CSI Prediction and Classification in Wireless Communication and Sensing](https://arxiv.org/abs/2412.06861)
- **Citation**:

  ```
  @article{zhao2024mining,
    title={CSI-BERT2: A BERT-inspired Framework for Efficient CSI Prediction and Classification in Wireless Communication and Sensing},
    author={Zhao, Zijian and Meng, Fanyi and Lyu, Zhonghao and Li, Hang and Li, Xiaoyang and Zhu, Guangxu},
    journal={arXiv preprint arXiv:2412.06861},
    year={2024}
  }
  ```

- **Contact**: zhaozj28@mail2.sysu.edu.cn
- **Repository**: https://github.com/RS2002/CSI-BERT2
- **Previous Version**: [CSI-BERT](https://github.com/RS2002/CSI-BERT)
## Model Description

CSI-BERT2 is an upgraded BERT-inspired transformer model for Channel State Information (CSI) prediction and classification in wireless communication and sensing. It improves upon [CSI-BERT](https://github.com/RS2002/CSI-BERT) with an optimized model and code structure, supporting tasks such as CSI recovery, CSI prediction, gesture recognition, fall detection, people identification, and people number estimation. The model processes CSI amplitude data and supports adversarial training with a GAN-based discriminator.

- **Architecture**: BERT-based transformer with optional GAN discriminator
- **Input Format**: CSI amplitude (batch_size, length, receiver_num * carrier_dim), attention mask (batch_size, length), optional timestamp (batch_size, length)
- **Output Format**: Hidden states of dimension (batch_size, length, hidden_dim)
- **Hidden Size**: 128
- **Training Objective**: MLM pre-training with an optional GAN, followed by task-specific fine-tuning
- **Tasks Supported**: CSI recovery, CSI prediction, CSI classification

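Since the model consumes a (batch_size, length) attention mask alongside padded amplitude batches, padding variable-length CSI captures can be sketched as below. This is a hypothetical NumPy illustration: the helper name `pad_batch` and the mask convention (0.0 for real frames, 1.0 for padding) are assumptions and should be checked against the CSI-BERT2 repository.

```python
import numpy as np

def pad_batch(csi_list, max_len, feature_dim):
    """Pad variable-length CSI sequences to a fixed length and build the
    (batch_size, length) attention mask. Hypothetical helper: the mask
    convention (0.0 = real frame, 1.0 = padding) is an assumption."""
    batch = np.zeros((len(csi_list), max_len, feature_dim), dtype=np.float32)
    mask = np.ones((len(csi_list), max_len), dtype=np.float32)  # start fully padded
    for i, seq in enumerate(csi_list):
        n = min(len(seq), max_len)
        batch[i, :n] = seq[:n]   # copy real CSI frames
        mask[i, :n] = 0.0        # mark real frames as unpadded
    return batch, mask

# Two captures of different lengths, 52 subcarriers each
seqs = [np.random.rand(80, 52), np.random.rand(100, 52)]
csi, attention_mask = pad_batch(seqs, max_len=100, feature_dim=52)
print(csi.shape, attention_mask.shape)  # (2, 100, 52) (2, 100)
```

The short capture keeps 20 masked tail positions, matching the input format listed above.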
## Training Data

The model was trained on the following datasets:

- **Public Datasets:**
  - [WiGesture](http://www.sdp8.net/Dataset?id=5d4ee7ca-d0b0-45e3-9510-abb6e9cdebf9): Gesture recognition, people identification
  - [WiFall](https://github.com/RS2002/KNN-MMD/tree/main/WiFall): Action recognition, fall detection, people identification
- **Proposed Dataset:**
  - [WiCount](https://github.com/RS2002/CSI-BERT2/tree/main/WiCount): People number estimation
- **Data Structure:**
  - **Amplitude**: (batch_size, length, receiver_num * carrier_dim)
  - **Timestamp**: (batch_size, length) (optional)
  - **Label**: (batch_size)
- **Note**: Refer to [CSI-BERT](https://github.com/RS2002/CSI-BERT) for data preparation details. Custom dataloaders may be needed for specific tasks.

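The batch shapes above can be illustrated with a minimal collate sketch. `collate` and the record dictionary keys are hypothetical names, not part of the repository; real tasks may need the custom dataloaders mentioned in the note.

```python
import numpy as np

def collate(records):
    """Stack per-sample records into the documented batch tensors:
    amplitude (batch_size, length, receiver_num * carrier_dim),
    timestamp (batch_size, length), label (batch_size,).
    Hypothetical collate function for illustration only."""
    amp = np.stack([r["amplitude"] for r in records]).astype(np.float32)
    ts = np.stack([r["timestamp"] for r in records]).astype(np.float32)
    labels = np.array([r["label"] for r in records], dtype=np.int64)
    return amp, ts, labels

records = [
    {"amplitude": np.random.rand(100, 52), "timestamp": np.linspace(0, 1, 100), "label": 3},
    {"amplitude": np.random.rand(100, 52), "timestamp": np.linspace(0, 1, 100), "label": 0},
]
amp, ts, labels = collate(records)
print(amp.shape, ts.shape, labels.shape)  # (2, 100, 52) (2, 100) (2,)
```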
## Usage

### Installation

```shell
git clone https://huggingface.co/RS2002/CSI-BERT2
```

### Example Code

```python
import torch
from model import CSI_BERT2

model = CSI_BERT2.from_pretrained("RS2002/CSI-BERT2")
csi = torch.rand((2, 100, 52))          # (batch_size, length, receiver_num * carrier_dim)
time_stamp = torch.rand((2, 100))       # (batch_size, length)
attention_mask = torch.zeros((2, 100))  # (batch_size, length)
y = model(csi, time_stamp, attention_mask)
print(y.shape)  # [2, 100, hidden_dim]
```
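For the classification tasks, a downstream head on top of the returned hidden states might look like the following sketch: masked mean-pooling over time followed by a linear layer. This is an assumed design in NumPy for illustration; `classify`, the stand-in weights `W`/`b`, and the six-class output are all hypothetical, and the repository's fine-tuning heads may differ.

```python
import numpy as np

def classify(hidden, mask, W, b):
    """Hypothetical classification head: masked mean-pooling of the
    (batch_size, length, hidden_dim) hidden states, then a linear layer.
    Assumes mask uses 0.0 for real frames and 1.0 for padding."""
    valid = 1.0 - mask                                  # 1.0 at real frames
    pooled = (hidden * valid[..., None]).sum(axis=1) / valid.sum(axis=1, keepdims=True)
    return pooled @ W + b                               # (batch_size, num_classes) logits

hidden = np.random.rand(2, 100, 128)   # stand-in for the model's output
mask = np.zeros((2, 100))              # no padding in this toy batch
W = np.random.rand(128, 6) * 0.01      # e.g. 6 gesture classes (hypothetical)
b = np.zeros(6)
logits = classify(hidden, mask, W, b)
print(logits.shape)  # (2, 6)
```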