```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Danielbrdz/CodeBarcenas-7b")
model = AutoModelForCausalLM.from_pretrained("Danielbrdz/CodeBarcenas-7b")
```
CodeBarcenas is a model specialized in the Python language. It is based on the model WizardLM/WizardCoder-Python-7B-V1.0 and was trained with the dataset mlabonne/Evol-Instruct-Python-26k.
Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Danielbrdz/CodeBarcenas-7b")
```
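For completeness, the pipeline above can be wrapped in a small helper that formats a coding task and requests a completion. This is a minimal sketch: the instruction-style prompt template and the generation settings below are assumptions for illustration, not something the model card documents.

```python
def build_prompt(task: str) -> str:
    # Instruction-style prompt; this template is an assumption, not from the model card.
    return f"### Instruction:\n{task}\n\n### Response:\n"

def generate_code(task: str, max_new_tokens: int = 256) -> str:
    # Import here so the prompt helper above stays usable without transformers installed.
    from transformers import pipeline

    pipe = pipeline("text-generation", model="Danielbrdz/CodeBarcenas-7b")
    # Greedy decoding for reproducible output; settings are illustrative.
    out = pipe(build_prompt(task), max_new_tokens=max_new_tokens, do_sample=False)
    return out[0]["generated_text"]
```

A call such as `generate_code("Write a Python function that reverses a string.")` would then return the prompt followed by the model's completion.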