---
library_name: transformers
tags: []
---
# πŸ€— How to Use the Model
```python
from peft import PeftConfig, PeftModel
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the LoRA adapter config, then attach the adapter to its base model
peft_config = PeftConfig.from_pretrained("bkk21/triper2_KoAlpaca-6B")
tokenizer = AutoTokenizer.from_pretrained("bkk21/triper2_KoAlpaca-6B", trust_remote_code=True)
base_model = AutoModelForCausalLM.from_pretrained(peft_config.base_model_name_or_path, device_map="auto")
model = PeftModel.from_pretrained(base_model, "bkk21/triper2_KoAlpaca-6B")
```
# πŸ“‘ Inference Helper Function
```python
import re

# Inference helper: builds the prompt, generates, and extracts the answer
def gen(x):
    # System prompt (kept in Korean since the model is Korean-tuned). Summary:
    # "You are a travel writer who knows Seoul well; only recommend places in
    # Seoul, never other regions. Supported topics are restaurants (λ§›μ§‘),
    # cafes (카페), hot spots (ν•«ν”Œ), lodging (μˆ™μ†Œ), and activities (놀거리).
    # For a district, recommend one place per topic; for a single topic,
    # recommend five places. Always include each place's address and
    # business hours."
    system = """
    λ„ˆλŠ” μ„œμšΈμ— λŒ€ν•΄ 잘 μ•Œκ³  μžˆλŠ” μ—¬ν–‰ μž‘κ°€μ•Ό.
    μ„œμšΈμ— λŒ€ν•΄ 잘 μ•Œκ³  μžˆμ–΄μ„œ μ‚¬μš©μžκ°€ μΆ”μ²œμ„ 해달라고 ν•˜λ©΄ μ μ ˆν•œ μΆ”μ²œμ„ ν•  수 μžˆμ–΄. 단, μ„œμšΈμ΄ μ•„λ‹Œ λ‹€λ₯Έ μ§€μ—­μ˜ μž₯μ†ŒλŠ” μΆ”μ²œν•˜λ©΄ μ•ˆ 돼.
    μ„œμšΈμ˜ 행정ꡬ역 λ³„λ‘œ μ•Œκ³  있고, μΆ”μ²œν•  수 μžˆλŠ” μ£Όμ œλŠ” ["λ§›μ§‘", "카페", "ν•«ν”Œ", "μˆ™μ†Œ", "놀거리"]μ•Ό.
    예λ₯Ό λ“€μ–΄, μš©μ‚°λ™ μž₯μ†Œλ₯Ό μΆ”μ²œν•΄λ‹¬λΌκ³  ν•˜λ©΄, μš©μ‚°λ™μ˜ λ§›μ§‘ 1개, 카페 1개, ν•«ν”Œ 1개, μˆ™μ†Œ 1개, 놀거리 1개λ₯Ό ν•„μˆ˜λ‘œ μΆ”μ²œν•΄μ€˜.
    λ§Œμ•½ ν•œ 주제만 μΆ”μ²œν•΄λ‹¬λΌκ³  ν•˜λ©΄ ν•˜λ‚˜μ˜ μ£Όμ œμ— λŒ€ν•΄ 5개 μΆ”μ²œν•΄μ€˜. 그리고 각각의 μž₯μ†ŒλŠ” μ£Όμ†Œμ™€ μ˜μ—…μ •λ³΄λ₯Ό κΌ­ μ•Œλ €μ€˜μ•Ό ν•΄.
    """
    gened = model.generate(
        **tokenizer(
            f"###instruction: {system}\n\n### input: {x}\n\n### output:",
            return_tensors="pt",
            return_token_type_ids=False,
        ).to(model.device),
        max_new_tokens=512,
        do_sample=True,
        eos_token_id=2,
    )
    output_text = tokenizer.decode(gened[0])
    print(output_text)
    # Return only the text after the "### output:" marker
    output_only = re.search(r"### output:\s*(.*)", output_text, re.DOTALL)
    if output_only:
        return output_only.group(1).strip()
```
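Because `do_sample=True` yields free-form text, a little post-processing can make the answer easier to consume. The sketch below is a hypothetical helper (not part of the model card) that splits a generated answer into per-topic entries, assuming the model lists each topic on its own line:

```python
# Hypothetical post-processing sketch: split a generated recommendation into
# per-topic entries. Assumes each topic ("λ§›μ§‘", "카페", ...) starts its own line.
TOPICS = ["λ§›μ§‘", "카페", "ν•«ν”Œ", "μˆ™μ†Œ", "놀거리"]

def split_by_topic(output_only):
    entries = {}
    for line in output_only.splitlines():
        stripped = line.strip()
        for topic in TOPICS:
            if stripped.startswith(topic):
                entries[topic] = stripped
    return entries

sample = "λ§›μ§‘: ν•œμ‹λ‹Ή (μ„œμšΈ μš©μ‚°κ΅¬, 11:00-21:00)\n카페: 카페온 (μ„œμšΈ μš©μ‚°κ΅¬, 10:00-22:00)"
print(split_by_topic(sample))
```

If the model's line format varies in practice, the matching rule would need to be adjusted accordingly.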
# 😎 Run the Function and Check the Result
```python
text = "μš©μ‚°κ΅¬ ν•œμ‹ 맛집을 μΆ”μ²œν•΄μ€˜"  # "Recommend Korean restaurants in Yongsan-gu"
gen(text)
```
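For reference, the prompt template that `gen()` assembles can be reproduced standalone; this is just the f-string from the function above shown in isolation, with a placeholder system prompt:

```python
# The instruction/input/output template used inside gen(), shown standalone
def build_prompt(system, x):
    return f"###instruction: {system}\n\n### input: {x}\n\n### output:"

prompt = build_prompt("(system prompt here)", "μš©μ‚°κ΅¬ ν•œμ‹ 맛집을 μΆ”μ²œν•΄μ€˜")
print(prompt)
```

The model completes the text after the final `### output:` marker, which is why `gen()` extracts everything following it.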
---