---
license: mit
datasets:
- PsiPi/CodeAlpaca_20k_NoBlanks
base_model:
- apple/OpenELM-450M
pipeline_tag: text-generation
tags:
- code
---
# 🧸 OpenELM-450M LoRA Adapter — Fine-Tuned on CodeAlpaca_20k

This is a LoRA adapter for Apple's OpenELM-450M base model, fine-tuned on the CodeAlpaca_20k dataset.
## Model Details

- **Base model:** apple/OpenELM-450M
- **Adapter type:** LoRA via PEFT (float32)
- **Trained on:** CodeAlpaca_20k
- **Languages:** English
- **License:** MIT
## How to Use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# OpenELM ships custom modeling code, so trust_remote_code is required
base_model = AutoModelForCausalLM.from_pretrained(
    "apple/OpenELM-450M", trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")

# Load the LoRA adapter on top of the base model
# (replace the placeholder with this adapter's repository id)
model = PeftModel.from_pretrained(base_model, "<this-adapter-repo-id>")
```
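Since the adapter was trained on CodeAlpaca, prompts at inference time should likely follow the same instruction template the dataset uses. The helper below is a sketch that assumes the standard Alpaca format (the exact template used during training is not stated on this card, so treat it as an assumption):

```python
def build_prompt(instruction: str, context: str = "") -> str:
    """Format a request in the Alpaca-style instruction template
    assumed to match the CodeAlpaca training data."""
    if context:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{context}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_prompt("Write a Python function that reverses a string.")
```

The resulting `prompt` string can then be tokenized and passed to `model.generate(...)` as usual.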