---
base_model:
- openai-community/gpt2
datasets:
- databricks/databricks-dolly-15k
language:
- en
license: apache-2.0
metrics:
- rouge
pipeline_tag: text-generation
library_name: transformers
---
# SFT-gpt2-120M

SFT-gpt2-120M is a GPT-2 base (120M) model supervised fine-tuned on the databricks-dolly-15k dataset. It is used as a baseline for MiniLLM.
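Since the checkpoint is a standard `transformers` causal language model (`pipeline_tag: text-generation`), it can be loaded and queried as in the minimal sketch below. The repo id `MiniLLM/SFT-gpt2-120M`, the prompt, and the generation settings are assumptions for illustration, not taken from this card.

```python
# Minimal usage sketch; the repo id below is an assumption -- replace it
# with the actual Hugging Face repo id for this model if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "MiniLLM/SFT-gpt2-120M"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Example instruction-style prompt (format is illustrative only).
prompt = "Explain what supervised fine-tuning is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding of up to 64 new tokens.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```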
## Other Baselines

## Citation
```bibtex
@inproceedings{minillm,
  title={MiniLLM: Knowledge Distillation of Large Language Models},
  author={Gu, Yuxian and Dong, Li and Wei, Furu and Huang, Minlie},
  booktitle={Proceedings of ICLR},
  year={2024}
}
```