# lua-mistral-2L-tiny
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("nilq/lua-mistral-2L-tiny")
model = AutoModelForCausalLM.from_pretrained("nilq/lua-mistral-2L-tiny")
```

This model was fine-tuned on the nilq/small-lua-stack dataset. It achieves the following results on the evaluation set:
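Once loaded, the model and tokenizer can be used for generation directly. A minimal sketch follows; the Lua prompt and decoding settings are illustrative assumptions, not part of this card:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("nilq/lua-mistral-2L-tiny")
model = AutoModelForCausalLM.from_pretrained("nilq/lua-mistral-2L-tiny")

# Encode an arbitrary Lua fragment as the prompt (illustrative choice).
inputs = tokenizer("local function add(a, b)", return_tensors="pt")

# Greedy decoding; max_new_tokens is an arbitrary illustrative value.
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
completion = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(completion)
```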
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
### Training hyperparameters

The following hyperparameters were used during training:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="nilq/lua-mistral-2L-tiny")
```
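The pipeline can then be called on a prompt string. A small usage sketch, where the Lua prompt and `max_new_tokens` value are illustrative assumptions:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="nilq/lua-mistral-2L-tiny")

# Illustrative Lua prompt; any snippet works. Greedy decoding for determinism.
result = pipe("local t = {}", max_new_tokens=16, do_sample=False)

# text-generation pipelines return a list of dicts with "generated_text".
print(result[0]["generated_text"])
```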