Commit 6bada3c
Parent(s): 65772b6
Update README.md

README.md CHANGED

@@ -28,6 +28,8 @@ widget:
 
 # T5 for Belarusian language
 
+
+
 This model is based on T5-small with a sequence length of 128 tokens. The model was trained from scratch on a single RTX 3090 (24 GB).
 
 # Supported tasks:
@@ -59,4 +61,7 @@ x = tokenizer.encode('<extra_id_1>да зорак праз цяжкасці', re
 result = model.generate(x, return_dict_in_generate=True, output_scores=True, max_length=128)
 print(tokenizer.decode(result["sequences"][0]))
 ```
-</details>
+</details>
+
+# References
+- [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://jmlr.org/papers/volume21/20-074/20-074.pdf)