---
library_name: transformers
language:
- ko
license: apache-2.0
---
**Input** Models input text only.

**Output** Models generate text only.

**Model Architecture**

An auto-regressive language model based on the LLaMA2 transformer architecture.
**Base Model**

hyunseoki/ko-en-llama2-13b
**Training Objective**

The final weights were obtained by linearly interpolating the weights of instruction-finetuned models, each trained on a different dataset.
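The linear interpolation of checkpoint weights described above can be sketched as follows. This is an illustrative example only, not the authors' actual merging script: the function name, the plain-dict checkpoint representation, and the blending coefficient `alpha` are all assumptions made for clarity.

```python
def interpolate_weights(state_a, state_b, alpha=0.5):
    """Blend two checkpoints (param name -> list of floats) elementwise:
    w = alpha * w_a + (1 - alpha) * w_b for every parameter tensor."""
    # Both checkpoints must share the same architecture (same param names).
    assert state_a.keys() == state_b.keys(), "checkpoints must match"
    merged = {}
    for name in state_a:
        merged[name] = [
            alpha * a + (1.0 - alpha) * b
            for a, b in zip(state_a[name], state_b[name])
        ]
    return merged

# Toy example with two tiny hypothetical "checkpoints"
ckpt_a = {"layer.weight": [1.0, 2.0], "layer.bias": [0.0]}
ckpt_b = {"layer.weight": [3.0, 4.0], "layer.bias": [2.0]}
print(interpolate_weights(ckpt_a, ckpt_b, alpha=0.5))
# {'layer.weight': [2.0, 3.0], 'layer.bias': [1.0]}
```

In practice the same elementwise blend would be applied to the tensors in each model's `state_dict`; with more than two source models, the interpolation generalizes to a weighted average of all checkpoints.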