RS2002 committed · Commit 37f17fb · verified · 1 parent: a3d0be2

Update README.md

Files changed (1): README.md (+91, −9)
README.md CHANGED
@@ -1,9 +1,91 @@
- ---
- tags:
- - model_hub_mixin
- - pytorch_model_hub_mixin
- ---
-
- This model has been pushed to the Hub using the [PytorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
- - Library: [More Information Needed]
- - Docs: [More Information Needed]
# CSI-BERT

This description was generated by Grok 3.
## Model Details

- **Model Name**: CSI-BERT
- **Model Type**: Transformer-based model for wireless sensing and data recovery
- **Version**: 1.0
- **Release Date**: May 2024
- **Developer**: Zijian Zhao
- **Organization**: SRIBD, SYSU
- **License**: Apache License 2.0
- **Paper**: [Finding the Missing Data: A BERT-Inspired Approach Against Package Loss in Wireless Sensing](https://ieeexplore.ieee.org/document/10620769), IEEE INFOCOM DeepWireless Workshop 2024
- **Citation**:

```bibtex
@INPROCEEDINGS{10620769,
  author={Zhao, Zijian and Chen, Tingwei and Meng, Fanyi and Li, Hang and Li, Xiaoyang and Zhu, Guangxu},
  booktitle={IEEE INFOCOM 2024 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS)},
  title={Finding the Missing Data: A BERT-Inspired Approach Against Package Loss in Wireless Sensing},
  year={2024},
  volume={},
  number={},
  pages={1-6},
  doi={10.1109/INFOCOMWKSHPS61880.2024.10620769}
}
```

- **Contact**: [email protected]
- **Repository**: https://github.com/RS2002/CSI-BERT
- **Updated Version**: [CSI-BERT2](https://github.com/RS2002/CSI-BERT2)
## Model Description

CSI-BERT is a BERT-inspired transformer model for wireless sensing, designed to address packet loss in Channel State Information (CSI) data. It processes amplitude and timestamp data to recover missing information and supports downstream tasks such as action and person classification. The model can operate with or without time and position embeddings, making it flexible for various wireless sensing applications. Phase information is not used in the current version because it degraded downstream task performance, though the model can recover phase data if needed.

- **Architecture**: BERT-based transformer
- **Input Format**: CSI amplitude (batch_size, length, receiver_num * carrier_dim), timestamp (batch_size, length), attention mask (batch_size, length)
- **Output Format**: hidden states of shape (batch_size, length, 64)
- **Hidden Size**: 64
- **Training Objective**: masked language modeling (MLM) pre-training for data recovery, followed by task-specific fine-tuning
- **Tasks Supported**: CSI data recovery, CSI classification
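To make the recovery setting concrete, the following sketch simulates packet loss on a random CSI amplitude tensor and builds a matching (batch_size, length) attention mask. The mask convention used here (1 marks a lost packet) and the 20% loss rate are assumptions for illustration, not taken from the repository.

```python
import torch

# Toy CSI amplitude batch: 2 sequences, 100 packets, 52 features
batch_size, length, feat_dim = 2, 100, 52
csi = torch.rand(batch_size, length, feat_dim)

# Simulate ~20% packet loss (assumed convention: 1 = lost packet)
lost = torch.rand(batch_size, length) < 0.2
attention_mask = lost.float()  # (batch_size, length)

# Zero out the lost packets; a recovery model reconstructs these frames
csi_lost = csi.masked_fill(lost.unsqueeze(-1), 0.0)
```

A recovery model would then be trained to reconstruct `csi` from `csi_lost` at the masked positions, which mirrors the MLM objective described above.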
## Training Data

The model was trained on the dynamic part of the **WiGesture Dataset**:

- **Dataset Source**: [WiGesture Dataset](http://www.sdp8.net/Dataset?id=5d4ee7ca-d0b0-45e3-9510-abb6e9cdebf9)
- **Data Structure**:
  - **Amplitude**: (batch_size, length, receiver_num * carrier_dim)
  - **Timestamp**: (batch_size, length)
  - **Label**: (batch_size)
- **Note**: Phase information is not used in the current version but can be concatenated to the amplitude data if needed. Custom dataloaders are required for user-specific tasks.
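Since custom dataloaders are required for user-specific tasks, here is a minimal sketch of a PyTorch dataset following the shapes listed above. The class name and the random toy data are illustrative assumptions.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CSIDataset(Dataset):
    """Minimal sketch of a user-specific CSI dataset (hypothetical name).

    Per-sample shapes follow the README: amplitude (length, receiver_num *
    carrier_dim), timestamp (length,), label (scalar).
    """
    def __init__(self, amplitude, timestamp, label):
        self.amplitude = amplitude  # (N, length, feat_dim)
        self.timestamp = timestamp  # (N, length)
        self.label = label          # (N,)

    def __len__(self):
        return len(self.label)

    def __getitem__(self, i):
        return self.amplitude[i], self.timestamp[i], self.label[i]

# Toy example with random data (8 samples, 100 packets, 52 features)
ds = CSIDataset(torch.rand(8, 100, 52),
                torch.rand(8, 100),
                torch.zeros(8, dtype=torch.long))
loader = DataLoader(ds, batch_size=2)
amp, ts, y = next(iter(loader))  # batched to (2, 100, 52), (2, 100), (2,)
```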
## Usage

### Installation

```shell
git clone https://huggingface.co/RS2002/CSI-BERT
```
### Example Code

```python
import torch
from model import CSI_BERT

# Load the pre-trained model from the Hub
model = CSI_BERT.from_pretrained("RS2002/CSI-BERT")

# Example input: batch of 2 sequences, 100 packets, 52 amplitude features
csi = torch.rand((2, 100, 52))
time_stamp = torch.rand((2, 100))
attention_mask = torch.zeros((2, 100))  # attention mask over the 100 packets

# Forward pass returns per-packet hidden states
y = model(csi, attention_mask, time_stamp)
print(y.shape)  # Output: [2, 100, 64]
```
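For the fine-tuning stage mentioned above, a downstream classifier can sit on top of the (batch_size, length, 64) hidden states. The head below (mean-pooling over packets plus a linear layer) is a hypothetical sketch, not the head used in the paper; `ActionHead` and `num_classes` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ActionHead(nn.Module):
    """Hypothetical classification head over CSI-BERT hidden states."""
    def __init__(self, hidden_size=64, num_classes=6):
        super().__init__()
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, hidden_states):       # (batch, length, hidden)
        pooled = hidden_states.mean(dim=1)  # average over packets
        return self.fc(pooled)              # (batch, num_classes)

hidden = torch.rand(2, 100, 64)  # stands in for the CSI-BERT output above
logits = ActionHead()(hidden)
```

In practice the head would be trained jointly with (or on top of frozen) CSI-BERT features for tasks such as action or person classification.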