---
language: pl
license: apache-2.0
---

<h1 align="center">polish-roberta-base-v2</h1>

An encoder model based on the RoBERTa architecture, pre-trained on a large corpus of Polish texts.
More information can be found in our [GitHub repository](https://github.com/sdadas/polish-roberta) and in the publication [Pre-training Polish Transformer-Based Language Models at Scale](https://arxiv.org/pdf/2006.04229).
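
## Usage

The model can be loaded with the Hugging Face `transformers` library. Below is a minimal fill-mask sketch; the Hub model id is an assumption inferred from the model's name, so adjust it if the actual repository path differs.

```python
# Minimal sketch, not an official example: the Hub id below is assumed
# from the model's name and may need to be adjusted.
from transformers import pipeline

model_id = "sdadas/polish-roberta-base-v2"  # assumed repository path

# The fill-mask pipeline loads the tokenizer and masked-LM head automatically.
fill_mask = pipeline("fill-mask", model=model_id)

# "Warszawa jest stolicą <mask>." = "Warsaw is the capital of <mask>."
text = f"Warszawa jest stolicą {fill_mask.tokenizer.mask_token}."
for prediction in fill_mask(text):
    print(prediction["token_str"], round(prediction["score"], 4))
```

Reading the mask token from `fill_mask.tokenizer.mask_token` avoids hard-coding it; RoBERTa-style tokenizers typically use `<mask>`.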

## Citation

```bibtex
@inproceedings{dadas2020pre,
  title={Pre-training polish transformer-based language models at scale},
  author={Dadas, S{\l}awomir and Pere{\l}kiewicz, Micha{\l} and Po{\'s}wiata, Rafa{\l}},
  booktitle={International Conference on Artificial Intelligence and Soft Computing},
  pages={301--314},
  year={2020},
  organization={Springer}
}
```