Upload model
README.md
CHANGED
@@ -1,8 +1,8 @@
 ---
 tags:
 - adapterhub:af/cc100
-- xmod
 - adapters
+- xmod
 language:
 - af
 license: "mit"
@@ -42,6 +42,8 @@ For more information on architecture and training, please refer to the original
 
 ## Citation
 
+[Lifting the Curse of Multilinguality by Pre-training Modular Transformers](http://dx.doi.org/10.18653/v1/2022.naacl-main.255) (Pfeiffer et al., 2022)
+
 ```
 @inproceedings{pfeiffer-etal-2022-lifting,
     title = "Lifting the Curse of Multilinguality by Pre-training Modular Transformers",
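The reordered tags point to the `adapters` library and the X-MOD architecture. A minimal sketch of how an adapter published under these tags might be loaded, assuming the `facebook/xmod-base` backbone; the adapter repo id below is a hypothetical placeholder, not confirmed by this diff:

```python
# Minimal sketch, assuming the `adapters` library and the facebook/xmod-base
# backbone; "AdapterHub/xmod-base-af_CC100" is a hypothetical placeholder id.
from adapters import AutoAdapterModel

# Load the X-MOD base model with adapter support.
model = AutoAdapterModel.from_pretrained("facebook/xmod-base")

# Load the Afrikaans (af/cc100) adapter from the Hub and activate it.
adapter_name = model.load_adapter("AdapterHub/xmod-base-af_CC100", set_active=True)
print(adapter_name)
```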