---
license: mit
datasets:
- PsiPi/CodeAlpaca_20k_NoBlanks
base_model:
- apple/OpenELM-450M
pipeline_tag: text-generation
tags:
- code
---

# 🧸 OpenELM-450M LoRA Adapter — Fine-Tuned on CodeAlpaca_20k

This is a **LoRA adapter** trained on the [CodeAlpaca](https://huggingface.co/datasets/PsiPi/CodeAlpaca_20k_NoBlanks) dataset, using [Apple's OpenELM-450M](https://huggingface.co/apple/OpenELM-450M) as the base model.

---

## Model Details

- **Base model**: [`apple/OpenELM-450M`](https://huggingface.co/apple/OpenELM-450M)
- **Adapter type**: [LoRA](https://arxiv.org/abs/2106.09685) via [PEFT](https://github.com/huggingface/peft) (float32)
- **Trained on**: [CodeAlpaca_20k_NoBlanks](https://huggingface.co/datasets/PsiPi/CodeAlpaca_20k_NoBlanks)
- **Languages**: English
- **License**: MIT
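
Conceptually, a LoRA adapter leaves the base weights frozen and learns a low-rank update, `W' = W + (α/r)·B·A`, where only the small matrices `A` and `B` are trained. A minimal numeric sketch of that update in plain Python (illustrative values only, not this adapter's actual weights or shapes):

```python
# LoRA sketch: the frozen base weight W is augmented by the low-rank
# product B @ A, scaled by alpha / r. Only A and B are trained.
def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

r, alpha = 2, 4
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weight (2x2, toy example)
A = [[0.5, 0.0], [0.0, 0.5]]   # trainable low-rank factor (r x d)
B = [[1.0, 0.0], [0.0, 1.0]]   # trainable low-rank factor (d x r)

delta = matmul(B, A)
W_adapted = [[w + (alpha / r) * d for w, d in zip(w_row, d_row)]
             for w_row, d_row in zip(W, delta)]
```

At inference, PEFT applies exactly this kind of additive update to the targeted base-model layers, which is why the adapter file is tiny compared to the 450M-parameter base.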

---

## How to Use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# OpenELM ships custom modeling code, so trust_remote_code is required.
base_model = AutoModelForCausalLM.from_pretrained("apple/OpenELM-450M", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")

# Attach this LoRA adapter to the frozen base model
# (replace the placeholder with this adapter's Hugging Face repo id).
model = PeftModel.from_pretrained(base_model, "your-username/openelm-450m-codealpaca-lora")
```
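
CodeAlpaca is an Alpaca-style instruction dataset, so inference prompts typically follow the same template. A sketch, assuming the standard Alpaca format was used during training (verify against the actual training setup):

```python
# Standard Alpaca instruction template (assumed, not confirmed by this card).
TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

prompt = TEMPLATE.format(instruction="Write a Python function that reverses a string.")
```

Pass `prompt` through the tokenizer and `model.generate(...)` as usual; the model's answer follows the `### Response:` marker.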