Update README.md
README.md CHANGED
@@ -123,8 +123,6 @@ This model has been fine-tuned on the downstream tasks of the [Catalan Language
| CatalanQA | QA | 21,427 | 17,135 | 2,157 | 2,135 |
| XQuAD-ca | QA | - | - | - | 1,189 |

-
-
### Evaluation results

This is how it compares to its teacher when fine-tuned on the aforementioned downstream tasks:
@@ -132,7 +130,7 @@ This is how it compares to its teacher when fine-tuned on the aforementioned dow
| Model \ Task |NER (F1)|POS (F1)|STS-ca (Comb)|TeCla (Acc.)|TEca (Acc.)|CatalanQA (F1/EM)| XQuAD-ca <sup>1</sup> (F1/EM) |
| ------------------------|:-------|:-------|:------------|:-----------|:----------|:----------------|:------------------------------|
| RoBERTa-base-ca-v2 | 89.29 | 98.96 | 79.07 | 74.26 | 83.14 | 89.50/76.63 | 73.64/55.42 |
-| DistilRoBERTa-base-ca
+| DistilRoBERTa-base-ca | 87.88 | 98.83 | 77.26 | 73.20 | 76.00 | 84.07/70.77 | xx.xx/xx.xx |

<sup>1</sup> : Trained on CatalanQA, tested on XQuAD-ca.

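For context on the comparison above, here is a minimal sketch of how the distilled student could be loaded alongside its teacher with Hugging Face Transformers for a quick size and fill-mask check. The Hub model IDs are placeholders (assumptions, not confirmed repository names), and this does not reproduce the fine-tuned downstream results in the table.

```python
# Sketch: compare the distilled student with its teacher.
# NOTE: the model IDs below are assumptions; substitute the actual Hub IDs.
from transformers import AutoModelForMaskedLM, pipeline

TEACHER_ID = "projecte-aina/roberta-base-ca-v2"      # assumed teacher ID
STUDENT_ID = "projecte-aina/distilroberta-base-ca"   # assumed student ID (this model)

for model_id in (TEACHER_ID, STUDENT_ID):
    model = AutoModelForMaskedLM.from_pretrained(model_id)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{model_id}: {n_params / 1e6:.1f}M parameters")

# Quick qualitative check on a Catalan sentence (RoBERTa-style <mask> token).
unmasker = pipeline("fill-mask", model=STUDENT_ID)
for pred in unmasker("El clima de Barcelona és <mask>.")[:3]:
    print(pred["token_str"], round(pred["score"], 3))
```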
@@ -160,8 +158,28 @@ This work was funded by the [Departament de la Vicepresidència i de Polítiques

### Citation information

+There is no publication for this specific model, but you can cite the paper where the teacher model was presented:
+
```bibtex
-
+@inproceedings{armengol-estape-etal-2021-multilingual,
+    title = "Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? {A} Comprehensive Assessment for {C}atalan",
+    author = "Armengol-Estap{\'e}, Jordi and
+      Carrino, Casimiro Pio and
+      Rodriguez-Penagos, Carlos and
+      de Gibert Bonet, Ona and
+      Armentano-Oller, Carme and
+      Gonzalez-Agirre, Aitor and
+      Melero, Maite and
+      Villegas, Marta",
+    booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
+    month = aug,
+    year = "2021",
+    address = "Online",
+    publisher = "Association for Computational Linguistics",
+    url = "https://aclanthology.org/2021.findings-acl.437",
+    doi = "10.18653/v1/2021.findings-acl.437",
+    pages = "4933--4946",
+}
```

### Disclaimer