---
license: mit
language:
- pt
---
Model based on: [tgsc/ult5-pt-small](https://huggingface.co/tgsc/ult5-pt-small)
Finetuned on a self-made instruction dataset. Portuguese only.
Trained with an input context length of 1024 tokens and an output context length of 512 tokens. In practice, T5 models can generally accept inputs longer than the training context length, since they use relative position embeddings.
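As a sketch of how a finetuned T5 model like this one could be loaded and queried with the `transformers` library (the repository ID `user/model-name` is a placeholder, since this card does not state the model's actual Hugging Face ID):

```python
# Hypothetical usage sketch for a Portuguese instruction-tuned T5 model.
# "user/model-name" is a placeholder, not this model's actual repository ID.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "user/model-name"  # replace with the real repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Prompt in Portuguese, since the model was finetuned for Portuguese only.
prompt = "Explique o que é aprendizado de máquina."
# Truncate to the 1024-token input length the model was trained with.
inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=1024)

# Cap generation at the 512-token output length the model was trained with.
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Longer inputs may still work thanks to T5's relative position embeddings, but quality can degrade beyond the trained 1024-token window.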