Use with the Transformers library
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("malper/taatiknet")
model = AutoModelForSeq2SeqLM.from_pretrained("malper/taatiknet")
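Once loaded, the model can be run like any other seq2seq Transformers model via `generate`. A minimal sketch (the Hebrew input string, beam count, and token limit below are illustrative assumptions, not values prescribed by the model card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("malper/taatiknet")
model = AutoModelForSeq2SeqLM.from_pretrained("malper/taatiknet")

# Illustrative input; any Hebrew text can be substituted.
inputs = tokenizer("שלום", return_tensors="pt")

# Generation settings here are assumptions, not official recommendations.
outputs = model.generate(**inputs, max_new_tokens=50, num_beams=5)
preds = tokenizer.batch_decode(outputs, skip_special_tokens=True)
print(preds)
```

See the GitHub repo linked below for the author's recommended inference settings.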
Quick Links

Please see this model's GitHub repo for more information.

Downloads last month: 48
Model size: 0.3B params
Tensor type: F32
Format: Safetensors
