# mlem

Wanna hear some meows? Run this:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load tokenizer and model from the local folder.
# from_pretrained reads the weights itself, so no manual torch.load is needed.
tokenizer = AutoTokenizer.from_pretrained("./meomeo", local_files_only=True)
model = AutoModelForCausalLM.from_pretrained("./meomeo", local_files_only=True)

# Simple inference
text = "Mèo Méo Meo Mèo Meo"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
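If `from_pretrained` fails, the local folder is usually missing a file. A minimal sketch of a pre-flight check (the file names below are typical for a PyTorch Transformers checkpoint and are assumptions; adjust them to whatever `./meomeo` actually contains):

```python
from pathlib import Path

def check_local_model(folder: str) -> list[str]:
    """Return the names of expected checkpoint files missing from `folder`.

    The file list is a common Transformers layout (assumed, not exhaustive):
    model config, weights, and tokenizer config.
    """
    expected = ["config.json", "pytorch_model.bin", "tokenizer_config.json"]
    root = Path(folder)
    return [name for name in expected if not (root / name).exists()]

missing = check_local_model("./meomeo")
if missing:
    print("missing files:", ", ".join(missing))
```

An empty result means the basic files are in place; otherwise the returned list tells you what to re-download.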