Commit 22d19c7 · Parent: 0a36376
Update app.py
app.py CHANGED
@@ -87,8 +87,7 @@ The English-Spanish pretraining improves BLEU and Chrf, and leads to faster conv
 of transfer learning with a unified Text-to-Text transformer.
 
 Ximena Gutierrez-Vasques, Gerardo Sierra, and Hernandez Isaac. 2016. Axolotl: a web accessible parallel corpus for Spanish-Nahuatl. In International Conference on Language Resources and Evaluation (LREC).
-
-For more details visit [(our model approach description)](https://huggingface.co/hackathon-pln-es/t5-small-spanish-nahuatl)
+
 '''
 
 model = AutoModelForSeq2SeqLM.from_pretrained('hackathon-pln-es/t5-small-spanish-nahuatl')
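The unchanged tail of the hunk loads the checkpoint with transformers' `AutoModelForSeq2SeqLM.from_pretrained`. A minimal sketch of how such a T5-style checkpoint is typically loaded and used for translation follows; the tokenizer call, the task prefix `'translate Spanish to Nahuatl: '`, and the generation settings are assumptions for illustration, not taken from app.py:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the checkpoint referenced in the diff (downloaded from the Hugging Face Hub).
model = AutoModelForSeq2SeqLM.from_pretrained('hackathon-pln-es/t5-small-spanish-nahuatl')
tokenizer = AutoTokenizer.from_pretrained('hackathon-pln-es/t5-small-spanish-nahuatl')

# T5-style models condition on a task prefix; this exact prefix is an assumption,
# not confirmed by the diff.
inputs = tokenizer('translate Spanish to Nahuatl: hola', return_tensors='pt')
outputs = model.generate(**inputs, max_new_tokens=64)
translation = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(translation)
```

`generate` returns token IDs, so the output must be decoded back to text; `skip_special_tokens=True` drops the `</s>` end-of-sequence marker from the decoded string.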