---
license: cc-by-nc-4.0
language:
- ko
library_name: peft
pipeline_tag: text-generation
---
# Raphael21/Raphael21-SOLAR-10.7B
## Model Details

**Training Hardware**

A100 × 8 (1 node)

**Base Model**

[Edentns/DataVortexS-10.7B-dpo-v1.11](https://huggingface.co/Edentns/DataVortexS-10.7B-dpo-v1.11)
## Implementation Code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

repo = "Raphael21/Raphael21-SOLAR-10.7B"

# Load in half precision; device_map="auto" lets Accelerate place
# layers across the available GPUs/CPU.
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)

# Example generation (the Korean prompt below is illustrative).
inputs = tokenizer("안녕하세요.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```