Instructions for using vertigo23/salomon_translation_model_V0 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
How to use vertigo23/salomon_translation_model_V0 with Transformers (a short inference sketch follows the lists below):
```python
# Load the tokenizer and seq2seq model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("vertigo23/salomon_translation_model_V0")
model = AutoModelForSeq2SeqLM.from_pretrained("vertigo23/salomon_translation_model_V0")
```
- Notebooks
  - Google Colab
  - Kaggle
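Once the model is loaded, translation runs through the standard `generate` API. Below is a minimal sketch; because the repository has no model card, the source and target languages are undocumented, so the input sentence and generation settings are illustrative assumptions rather than documented behavior of this model.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("vertigo23/salomon_translation_model_V0")
model = AutoModelForSeq2SeqLM.from_pretrained("vertigo23/salomon_translation_model_V0")

# The model card does not state the translation direction; the English input
# below is an arbitrary placeholder for illustration.
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```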
This repository has no model card.
- Inference Providers
This model is not deployed by any inference provider.