lukecq committed on
Commit f200db2 · 1 Parent(s): 01d2282

Update README.md

Files changed (1):
  1. README.md +2 -1
README.md CHANGED
@@ -7,13 +7,14 @@ tags:
 # Zero-shot text classification (base-sized model) trained with self-supervised tuning
 
 Zero-shot text classification model trained with self-supervised tuning (SSTuning).
-It was introduced in the paper Zero-Shot Text Classification via Self-Supervised Tuning by
+It was introduced in the paper [Zero-Shot Text Classification via Self-Supervised Tuning](https://arxiv.org/abs/2305.11442) by
 Chaoqun Liu, Wenxuan Zhang, Guizhen Chen, Xiaobao Wu, Anh Tuan Luu, Chip Hong Chang, Lidong Bing
 and first released in [this repository](https://github.com/DAMO-NLP-SG/SSTuning).
 
 The model backbone is RoBERTa-base.
 
 ## Model description
+
 The model is tuned with unlabeled data using a learning objective called first sentence prediction (FSP).
 The FSP task is designed by considering both the nature of the unlabeled corpus and the input/output format of classification tasks.
 The training and validation sets are constructed from the unlabeled corpus using FSP.
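
The README text in this diff describes FSP only at a high level. As a rough, hypothetical illustration of the idea (not part of this commit, and simplified relative to the paper), an FSP training sample can be built from an unlabeled corpus by using the first sentence of a paragraph as the correct option and the first sentences of other paragraphs as distractors, so the input/output format mimics a multiple-choice classification task:

```python
import random


def build_fsp_example(paragraphs, index, num_options=5, seed=0):
    """Sketch of a first-sentence-prediction (FSP) sample.

    `paragraphs`: list of paragraphs from an unlabeled corpus, each already
    split into a list of sentences. The first sentence of paragraph `index`
    is the correct option; first sentences drawn from other paragraphs act
    as distractors. This illustrates the concept only and is not the exact
    recipe or input format used by SSTuning.
    """
    rng = random.Random(seed)
    first_sentence = paragraphs[index][0]                 # correct option
    rest_of_paragraph = " ".join(paragraphs[index][1:])   # text the model reads

    # Sample distractor first sentences from other paragraphs.
    other_indices = [i for i in range(len(paragraphs)) if i != index]
    distractors = [paragraphs[i][0] for i in rng.sample(other_indices, num_options - 1)]

    options = distractors + [first_sentence]
    rng.shuffle(options)
    label = options.index(first_sentence)                 # index of the true first sentence

    # Enumerate the options in front of the text, classification-style.
    option_str = " ".join(f"({chr(ord('A') + i)}) {opt}" for i, opt in enumerate(options))
    return {"text": f"{option_str} {rest_of_paragraph}", "label": label}
```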