---
license: llama3
---

# LLaMA3 8B Instruct - Fine-Tuned Model

이 λͺ¨λΈμ€ LLaMA3 8B Instruct 기반으둜 AIHub 데이터셋과 자체 μ œμž‘ν•œ μ»€μŠ€ν…€ 데이터셋을 ν™œμš©ν•΄ νŒŒμΈνŠœλ‹ν•œ λ²„μ „μž…λ‹ˆλ‹€.
민감 데이터λ₯Ό 포함할 수 μžˆμœΌλ―€λ‘œ μ‚¬μš© μ‹œ μ£Όμ˜κ°€ ν•„μš”ν•©λ‹ˆλ‹€.


## 📘 Model Details

- **Base Model:** LLaMA3 8B Instruct
- **Trainable Parameters:** roughly 10% of the full parameter count is set trainable (LoRA, QLoRA, or another PEFT method may be used)
- **Fine-tuning Data:**
  - Public AIHub datasets
  - Domain-specific data collected and curated in-house
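The card states only that about 10% of the parameters were trainable via LoRA, QLoRA, or another PEFT method; the exact setup is not published. The following is a purely hypothetical sketch of what such a LoRA configuration might look like, with every value an illustrative assumption:

```python
# Hypothetical LoRA hyperparameters; the card only says ~10% of parameters
# were trainable via some PEFT method. None of these values are published.
lora_settings = {
    "r": 64,                 # adapter rank (illustrative)
    "lora_alpha": 128,       # scaling factor (illustrative)
    "lora_dropout": 0.05,
    "target_modules": ["q_proj", "k_proj", "v_proj", "o_proj"],
    "task_type": "CAUSAL_LM",
}

# With the peft library, this would be applied roughly as:
# from peft import LoraConfig, get_peft_model
# model = get_peft_model(base_model, LoraConfig(**lora_settings))
```

Higher ranks and wider `target_modules` coverage push the trainable fraction toward the ~10% figure quoted above.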

βš™οΈ Generation Configuration

| Parameter | Value |
| --- | --- |
| `max_new_tokens` | 1024 |
| `temperature` | 0.75 |
| `repetition_penalty` | 1.1 – 1.2 |
| `do_sample` | True |
| `top_k` | 5 |
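The settings above can be collected into a kwargs dict and passed straight to Hugging Face `generate`; a minimal sketch, assuming the model and tokenizer are already loaded via `transformers`:

```python
# Generation settings from the table above, as kwargs for `model.generate`.
gen_kwargs = {
    "max_new_tokens": 1024,
    "temperature": 0.75,
    "repetition_penalty": 1.15,  # card recommends the 1.1–1.2 range
    "do_sample": True,
    "top_k": 5,
}

# Usage sketch (assumes `model`, `tokenizer`, and `inputs` exist, e.g.
# loaded with transformers.AutoModelForCausalLM / AutoTokenizer):
# outputs = model.generate(**inputs, **gen_kwargs)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With `do_sample=True` and `top_k=5`, decoding samples from only the five most likely tokens at each step, which keeps outputs varied but constrained.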

⚠️ μ£Όμ˜μ‚¬ν•­

  • 이 λͺ¨λΈμ€ ν•™μŠ΅ 데이터에 민감 정보λ₯Ό ν¬ν•¨ν•˜κ³  μžˆμ„ κ°€λŠ₯성이 μžˆμœΌλ―€λ‘œ, μ‹€μ œ μ„œλΉ„μŠ€λ‚˜ 응닡 ν™œμš© μ‹œ 데이터 λ³΄μ•ˆ 및 ν”„λΌμ΄λ²„μ‹œ λ³΄ν˜Έμ— μœ μ˜ν•΄ μ£Όμ„Έμš”.

Model tree for sel303/llama3-diverce-v4.5