---
language: cop
widget:
- text: ⲁⲗⲗⲁ ⲁⲛⲟⲕ ⲁⲓⲥⲉⲧⲡⲧⲏⲩⲧⲛ ·
---
This is a [MicroBERT](https://github.com/lgessler/microbert) model for Coptic. |
* Its suffix is **-mx**, which means it was pretrained with supervision from masked language modeling and XPOS tagging.

* The unlabeled Coptic data was taken from version 4.2.0 of the [Coptic SCRIPTORIUM corpus](https://github.com/copticscriptorium/corpora), totaling 970,642 tokens.

* The labeled data was the UD treebank [UD_Coptic-Scriptorium](https://github.com/UniversalDependencies/UD_Coptic-Scriptorium), v2.9, totaling 48,632 tokens.
Please see [the repository](https://github.com/lgessler/microbert) and [the paper](https://github.com/lgessler/microbert/raw/master/MicroBERT__MRL_2022_.pdf) for more details.