How to use Roy029/mpyt5_e20 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Roy029/mpyt5_e20")
model = AutoModelForSeq2SeqLM.from_pretrained("Roy029/mpyt5_e20")
```
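Once loaded, the checkpoint can be run like any other seq2seq model. A minimal generation sketch; the prompt and generation settings below are illustrative, not taken from the model card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Roy029/mpyt5_e20")
model = AutoModelForSeq2SeqLM.from_pretrained("Roy029/mpyt5_e20")

# Illustrative prompt; the card does not specify an expected input format.
inputs = tokenizer("def add(a, b):", return_tensors="pt")
outputs = model.generate(**inputs, max_length=128)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```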
Model Card for mpyt5_e20
A model pre-trained not only on natural language but also on Python code.
Training Details
Training Data
Python code (1.05 GB)
Training Procedure
- MLM (masked language modeling)
- Python vocabulary (https://huggingface.co/kkuramitsu/mt5-pytoken)
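The MLM objective above presumably follows T5-style span corruption, in which a contiguous span of tokens is replaced by a sentinel token and the model is trained to reconstruct the span. A minimal, self-contained sketch of that corruption step; the sentinel names mirror mT5's `<extra_id_*>` tokens, and the span length and example input are illustrative, not from the card:

```python
import random

def span_corrupt(tokens, span_len=3, seed=0):
    """T5-style span corruption: replace one contiguous span of
    `span_len` tokens with a sentinel, emitting the span as the target."""
    rng = random.Random(seed)
    start = rng.randrange(len(tokens) - span_len)
    source = tokens[:start] + ["<extra_id_0>"] + tokens[start + span_len:]
    target = ["<extra_id_0>"] + tokens[start:start + span_len] + ["<extra_id_1>"]
    return source, target

# Corrupt a short, whitespace-tokenized Python snippet.
tokens = "def add ( a , b ) : return a + b".split()
src, tgt = span_corrupt(tokens)
print(src)
print(tgt)
```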
Preprocessing
mT5 + Python
Speeds, Sizes, Times
- mT5-small (300M parameters)
- max_length = 128
Model Version
- epoch 5: https://huggingface.co/Roy029/mpyt5_e5
- epoch 10: https://huggingface.co/Roy029/mpyt5_e10
- epoch 15: https://huggingface.co/Roy029/mpyt5_e15
- epoch 20: This Model