---
license: apache-2.0
datasets:
- databricks/databricks-dolly-15k
language:
- en
metrics:
- rouge
base_model:
- openai-community/gpt2-medium
pipeline_tag: text-generation
---
# init-gpt2-340M

[paper](https://arxiv.org/abs/2306.08543) | [code](https://github.com/microsoft/LMOps/tree/main/minillm)
**init-gpt2-340M** is a gpt2-medium (340M) model supervised fine-tuned on [databricks-dolly-15k](https://huggingface.co/datasets/aisquared/databricks-dolly-15k).
It is used as the initialization for training [MiniLLM](https://huggingface.co/MiniLLM/MiniLLM-gpt2-340M).
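## Usage

A minimal sketch of loading the checkpoint with `transformers` for greedy text generation. The repo id `MiniLLM/init-gpt2-340M` and the plain instruction/response prompt format are assumptions (the exact prompt template used for SFT is defined in the MiniLLM code linked above); adjust them to match your setup.

```python
# Minimal usage sketch. Assumptions: the checkpoint is hosted as
# "MiniLLM/init-gpt2-340M", and a plain instruction/response prompt is
# acceptable; the official prompt template lives in the MiniLLM repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MiniLLM/init-gpt2-340M"  # hypothetical repo id; adjust if needed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Instruction: Explain what photosynthesis is.\nResponse:"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding keeps the sketch deterministic; tune max_new_tokens as needed.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```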
## Citation

```
@inproceedings{minillm,
  title={MiniLLM: Knowledge Distillation of Large Language Models},
  author={Gu, Yuxian and Dong, Li and Wei, Furu and Huang, Minlie},
  booktitle={Proceedings of ICLR},
  year={2024}
}
```