Instructions to use Python/ACROSS-m2o-eng-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Python/ACROSS-m2o-eng-base with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Python/ACROSS-m2o-eng-base")
model = AutoModelForSeq2SeqLM.from_pretrained("Python/ACROSS-m2o-eng-base")
```
- Notebooks
- Google Colab
- Kaggle
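Once the tokenizer and model are loaded as shown above, inference follows the standard Transformers seq2seq pattern: tokenize the input, call `generate`, and decode the result. Below is a minimal sketch; the example sentence and generation settings are assumptions for illustration (the model name suggests many-to-one translation into English, but check the model card for the expected input languages and any required formatting).

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Python/ACROSS-m2o-eng-base")
model = AutoModelForSeq2SeqLM.from_pretrained("Python/ACROSS-m2o-eng-base")

# Example source sentence (assumed German input); replace with text in a
# language the model supports.
inputs = tokenizer("Guten Morgen!", return_tensors="pt")

# max_new_tokens is an assumed setting, not a model-card recommendation.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

`generate` returns a batch of token-ID tensors, so `outputs[0]` selects the single translation and `skip_special_tokens=True` strips padding and end-of-sequence markers from the decoded string.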