Instructions to use gszabo/large_subtoken10 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
- Notebooks
  - Google Colab
  - Kaggle

How to use gszabo/large_subtoken10 with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="gszabo/large_subtoken10")
```
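Once the pipeline is created, you can call it on a sentence containing the tokenizer's mask token. The sentence and `top_k` value below are illustrative only; the mask token is read from `pipe.tokenizer` rather than hard-coded, since it depends on the underlying tokenizer:

```python
# Minimal usage sketch: fill in the masked word and print the top candidates.
# The example sentence is an assumption, not taken from the model card.
sentence = f"The capital of France is {pipe.tokenizer.mask_token}."
for prediction in pipe(sentence, top_k=3):
    # Each prediction is a dict with the filled-in sequence and its score.
    print(prediction["sequence"], round(prediction["score"], 4))
```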
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("gszabo/large_subtoken10")
model = AutoModelForMaskedLM.from_pretrained("gszabo/large_subtoken10")
```
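With the tokenizer and model loaded directly, a forward pass returns logits over the vocabulary for every position, and the highest-scoring tokens at the masked position are the model's fill-mask candidates. The snippet below is a minimal sketch assuming a PyTorch backend; the example sentence and top-k value are illustrative, not taken from the model card:

```python
# Minimal inference sketch (assumes PyTorch is installed).
import torch

text = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and decode the five highest-scoring candidate tokens.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_index].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```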