```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("nilq/baby-python-mistral-1L-tiny-base")
model = AutoModelForCausalLM.from_pretrained("nilq/baby-python-mistral-1L-tiny-base")
```

This model was trained on the nilq/baby-python dataset. It is the base model in the paper *Tracking Universal Features Through Fine-Tuning and Model Merging*. It achieves the following results on the evaluation set:
More information needed
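With the tokenizer and model loaded as in the snippet above, greedy generation can be sketched as follows; the prompt and generation length are illustrative assumptions, not specified by the card:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("nilq/baby-python-mistral-1L-tiny-base")
model = AutoModelForCausalLM.from_pretrained("nilq/baby-python-mistral-1L-tiny-base")

# Prompt is an illustrative assumption (the model was trained on Python code)
inputs = tokenizer("def hello():", return_tensors="pt")

# Greedy decoding, continuing the prompt for a handful of tokens
outputs = model.generate(**inputs, max_new_tokens=16, do_sample=False)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```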
The following hyperparameters were used during training:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="nilq/baby-python-mistral-1L-tiny-base")
```
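A brief usage sketch for the pipeline helper; the prompt and generation settings are illustrative assumptions:

```python
from transformers import pipeline

# Text-generation pipeline wrapping the tiny 1-layer base model
pipe = pipeline("text-generation", model="nilq/baby-python-mistral-1L-tiny-base")

# Prompt is an illustrative assumption; the pipeline returns a list of dicts
out = pipe("def fibonacci(n):", max_new_tokens=16, do_sample=False)
print(out[0]["generated_text"])
```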