## How to use from the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="GianTan/CBERTo")
```
```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("GianTan/CBERTo")
model = AutoModelForMaskedLM.from_pretrained("GianTan/CBERTo")
```

## How to Get Started with the Model

Use the code below to get started with the model.

```python
from transformers import pipeline

unmasker = pipeline("fill-mask", model="GianTan/CBERTo")
unmasker("mura kag [MASK].")
```
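Under the hood, a fill-mask pipeline scores every vocabulary token at the `[MASK]` position and returns the top candidates ranked by softmax probability, each as a dict with fields such as `token_str` and `score`. A minimal sketch of that ranking step in plain Python, using a toy vocabulary and made-up logits (the tokens and values are illustrative, not real CBERTo outputs):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy candidate tokens for the [MASK] position, with illustrative logits.
vocab = ["bata", "gwapa", "buotan", "dako"]
logits = [2.0, 3.5, 1.0, 0.5]

probs = softmax(logits)
# Rank candidates by probability, mirroring the pipeline's output shape.
ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
top = [{"token_str": tok, "score": round(p, 4)} for tok, p in ranked[:2]]
```

The real pipeline performs the same ranking over the model's full vocabulary and also returns the filled-in `sequence` and the token id for each candidate.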

Model size: 66.4M parameters (Safetensors, F32)