---
license: mit
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
tags:
- llama
- mmlu
- fine-tuned
- windyfllm
datasets:
- cais/mmlu
---

# WindyFLLM 2.2 🌪️

This is the meta-llama/Meta-Llama-3.1-8B-Instruct model fine-tuned on the MMLU dataset.

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model, attach the WindyFLLM adapter, and load the matching tokenizer
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "tklohj/windyfllm2.2")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
```
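Since the adapter was fine-tuned on MMLU, multiple-choice prompts in the MMLU format are a natural input. Below is a minimal sketch of such a prompt builder; the helper name and example question are illustrative, not part of the model's API. The resulting string can be tokenized and passed to `model.generate` as usual.

```python
def format_mmlu_prompt(question, choices):
    """Build an MMLU-style multiple-choice prompt (options A-D, answer cue at the end)."""
    letters = ["A", "B", "C", "D"]
    lines = [question]
    lines += [f"{letter}. {choice}" for letter, choice in zip(letters, choices)]
    lines.append("Answer:")
    return "\n".join(lines)

prompt = format_mmlu_prompt(
    "Which planet is known as the Red Planet?",
    ["Venus", "Mars", "Jupiter", "Saturn"],
)
print(prompt)
```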
|