---
library_name: transformers
tags: []
---

<!-- Provide a quick summary of what the model is/does. -->

# 🤗 How to Use the Model

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig

# LoRA adapter configuration stored in the repository
peft_config = LoraConfig.from_pretrained("bkk21/triper2_KoAlpaca-6B")
tokenizer = AutoTokenizer.from_pretrained("bkk21/triper2_KoAlpaca-6B", trust_remote_code=True)
# With peft installed, transformers resolves the base model and applies the LoRA adapter on top of it
model = AutoModelForCausalLM.from_pretrained("bkk21/triper2_KoAlpaca-6B", device_map="auto")
```
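
If the installed `transformers` version does not resolve PEFT adapters automatically, the adapter can be attached explicitly. This is a minimal sketch using the standard `peft` API; it assumes the repository stores a LoRA adapter whose base model is recorded in its `adapter_config.json`.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, PeftModel

adapter_id = "bkk21/triper2_KoAlpaca-6B"
peft_config = LoraConfig.from_pretrained(adapter_id)

# Load the base model named in the adapter config, then attach the LoRA weights
base_model = AutoModelForCausalLM.from_pretrained(
    peft_config.base_model_name_or_path, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)
tokenizer = AutoTokenizer.from_pretrained(adapter_id, trust_remote_code=True)
```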

# Model Usage Function

```python
import re

# Define the model usage (generation) function
def gen(x):
    system = """
    You are a travel writer who knows Seoul very well.
    Because you know Seoul well, you can give appropriate recommendations when the user asks for them. However, you must not recommend places in regions other than Seoul.
    You know Seoul by its administrative districts, and the topics you can recommend are ["restaurants", "cafes", "hot spots", "accommodations", "things to do"].
    For example, if asked to recommend places in Yongsan-dong, you must recommend one restaurant, one cafe, one hot spot, one accommodation, and one thing to do in Yongsan-dong.
    If asked for only one topic, recommend five places for that topic. For every place, you must always include its address and business information.
    """

    gened = model.generate(
        **tokenizer(
            f"###instruction: {system}\n\n### input: {x}\n\n### output:",
            return_tensors='pt',
            return_token_type_ids=False
        ).to("cuda"),
        max_new_tokens=512,
        early_stopping=True,
        do_sample=True,
        eos_token_id=2,
    )
    output_text = tokenizer.decode(gened[0])

    # Keep only the text that follows the "### output:" marker
    output_only = re.search(r'### output:\s*(.*)', output_text, re.DOTALL)

    print(output_text)

    if output_only:
        return output_only.group(1).strip()
```
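
`gen` decodes the full sequence, so the printed text still contains the instruction and the input. If you only want the model's answer, one common alternative is to slice off the prompt tokens before decoding. This is a rough sketch, assuming `prompt` holds the same formatted string that `gen` builds.

```python
# Sketch: decode only the tokens generated after the prompt
inputs = tokenizer(prompt, return_tensors="pt", return_token_type_ids=False).to("cuda")
gened = model.generate(**inputs, max_new_tokens=512, do_sample=True, eos_token_id=2)

# Drop the prompt tokens, keep only what was generated
new_tokens = gened[0][inputs["input_ids"].shape[-1]:]
answer = tokenizer.decode(new_tokens, skip_special_tokens=True)
print(answer)
```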

# Running the Function and Checking the Result

```python
text = "Recommend Korean-food restaurants in Yongsan-gu"
gen(text)
```
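
Because `do_sample=True`, repeated calls return different recommendations. If you need reproducible output while experimenting, the random seed can be fixed before calling `gen`; the sketch below uses the standard `transformers` utility and is not part of the original card.

```python
from transformers import set_seed

set_seed(42)  # fix the Python, NumPy, and PyTorch RNGs so sampling is repeatable
print(gen("Recommend Korean-food restaurants in Yongsan-gu"))
```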

---