---
language: pl
license: apache-2.0
---

<h1 align="center">polish-roberta-base-v2</h1>

An encoder model based on the RoBERTa architecture, pre-trained on a large corpus of Polish texts.
More information can be found in our [GitHub repository](https://github.com/sdadas/polish-roberta) and in the publication [Pre-training Polish Transformer-based Language Models at Scale](https://arxiv.org/pdf/2006.04229).
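
## Usage

A minimal sketch of loading the model with the Hugging Face `transformers` library to extract contextual embeddings. The repository id `sdadas/polish-roberta-base-v2` is assumed from this model card's title; adjust it if the model is hosted under a different name.

```python
from transformers import AutoModel, AutoTokenizer

# Repo id assumed from the model card title.
model_id = "sdadas/polish-roberta-base-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a Polish sentence and run it through the encoder.
inputs = tokenizer("Zdanie w języku polskim.", return_tensors="pt")
outputs = model(**inputs)

# Contextual token embeddings: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```

As an encoder-only model, it is typically fine-tuned for downstream tasks (e.g. classification via `AutoModelForSequenceClassification`) rather than used for text generation.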

## Citation

```bibtex
@inproceedings{dadas2020pre,
  title={Pre-training polish transformer-based language models at scale},
  author={Dadas, S{\l}awomir and Pere{\l}kiewicz, Micha{\l} and Po{\'s}wiata, Rafa{\l}},
  booktitle={International Conference on Artificial Intelligence and Soft Computing},
  pages={301--314},
  year={2020},
  organization={Springer}
}
```