---
license: mit
datasets:
- databricks/databricks-dolly-15k
language:
- en
base_model:
- apple/OpenELM-450M
pipeline_tag: text-generation
library_name: peft
---
# 🧸 OpenELM-450M LoRA Adapter — Fine-Tuned on Dolly
This repository contains a LoRA adapter trained on the databricks-dolly-15k dataset, using Apple's OpenELM-450M as the base model.
## Model Details

- Base model: apple/OpenELM-450M
- Adapter type: LoRA via PEFT (float32)
- Trained on: databricks-dolly-15k (question answering)
- Languages: English
- License: MIT
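For background on the adapter format: LoRA freezes the base model's weights and learns a low-rank update, so each adapted layer's effective weight is the frozen matrix plus the product of two small trainable matrices. A minimal NumPy sketch of the idea (the hidden size and rank below are illustrative, not this adapter's actual configuration):

```python
import numpy as np

d, r = 8, 2  # hypothetical hidden size and LoRA rank
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))  # frozen base weight (not trained)
A = rng.standard_normal((r, d))  # trainable down-projection
B = np.zeros((d, r))             # trainable up-projection, zero-initialized

# Effective weight seen by the forward pass: base plus low-rank update
W_eff = W + B @ A

# Because B starts at zero, the adapter is a no-op before training
assert np.allclose(W_eff, W)
```

Only A and B (2·d·r parameters per layer instead of d²) are saved in the adapter checkpoint, which is why LoRA adapters are so small relative to the base model.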
## How to Use
OpenELM ships custom modeling code, so loading the base model requires `trust_remote_code=True`. The adapter itself is loaded on top with PEFT:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# OpenELM uses custom modeling code, so trust_remote_code=True is required
base_model = AutoModelForCausalLM.from_pretrained("apple/OpenELM-450M", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")

# Attach the LoRA adapter to the base model (replace with this repo's id)
model = PeftModel.from_pretrained(base_model, "path/to/this-adapter")
```