Model Card for mpyt5_e15
A model pre-trained not only on natural language but also on Python code.
Training Details
Training Data
Python Code (1.05GB)
Training Procedure
- MLM
- python vocab (https://huggingface.co/kkuramitsu/mt5-pytoken)
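For T5-family models, the MLM objective above is span corruption: contiguous token spans are replaced by sentinel tokens (`<extra_id_0>`, `<extra_id_1>`, …) and the model learns to regenerate them. A minimal single-span sketch of the idea — `span_corrupt` is a hypothetical helper, not this model's actual preprocessing pipeline, which the card does not document:

```python
# Toy T5-style span corruption (single span, for clarity).
# Assumption: this mirrors the general T5 MLM recipe, not the exact
# preprocessing used to train mpyt5.
import random

def span_corrupt(tokens, noise_density=0.15, seed=0):
    """Replace one contiguous span with a sentinel; the span becomes the target.

    Returns (inputs, targets): inputs contain <extra_id_0> where the span was,
    targets list the masked tokens framed by sentinels, as in T5 pretraining.
    """
    rng = random.Random(seed)
    n_mask = max(1, round(len(tokens) * noise_density))
    start = rng.randrange(0, len(tokens) - n_mask + 1)
    inputs = tokens[:start] + ["<extra_id_0>"] + tokens[start + n_mask:]
    targets = ["<extra_id_0>"] + tokens[start:start + n_mask] + ["<extra_id_1>"]
    return inputs, targets

# Example on a whitespace-tokenized line of Python source:
code = "def add ( a , b ) : return a + b".split()
inputs, targets = span_corrupt(code)
```

During pre-training the model sees `inputs` and is trained to emit `targets`; at 15% noise density a 12-token line yields a 2-token masked span.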
Preprocessing
mT5 + Python
Speeds, Sizes, Times
- mT5-small (300M parameters)
- max_length = 128
Model Version
- epoch5: https://huggingface.co/Roy029/mpyt5_e5
- epoch10: https://huggingface.co/Roy029/mpyt5_e10
- epoch15: This Model
- epoch20: https://huggingface.co/Roy029/mpyt5_e20