---
language: pl
license: apache-2.0
---

<h1 align="center">polish-roberta-base-v2</h1>

An encoder model based on the RoBERTa architecture, pre-trained on a large corpus of Polish texts.
More information can be found in our [GitHub repository](https://github.com/sdadas/polish-roberta) and in the publication [Pre-training polish transformer-based language models at scale](https://arxiv.org/pdf/2006.04229).

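As a quick-start sketch, the model can be loaded with the Hugging Face `transformers` library to produce contextual token embeddings. The model identifier `sdadas/polish-roberta-base-v2` is assumed from this card's title and namespace, and the hidden size of 768 is the usual value for a base-size RoBERTa model:

```python
# Minimal embedding-extraction sketch (model id assumed from the card title).
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "sdadas/polish-roberta-base-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

text = "Zażółć gęślą jaźń."  # sample Polish sentence
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per input token; last dimension is the hidden size
# (768 for a base-size RoBERTa model).
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```

For sentence-level representations, the token vectors are typically mean-pooled or the first (`<s>`) token's vector is used, depending on the downstream task.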
## Citation

```bibtex
@inproceedings{dadas2020pre,
  title={Pre-training polish transformer-based language models at scale},
  author={Dadas, S{\l}awomir and Pere{\l}kiewicz, Micha{\l} and Po{\'s}wiata, Rafa{\l}},
  booktitle={International Conference on Artificial Intelligence and Soft Computing},
  pages={301--314},
  year={2020},
  organization={Springer}
}
```