---
language:
- en
pipeline_tag: text-generation
---
Mistral-7B fine-tuned on a dataset of BTS fanfiction.

This model uses the `alpaca` format:
```
{"instruction": "An interaction between a user providing instructions, and an imaginative assistant providing responses.", "input": "...", "output": "..."}
```
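As a sketch, a prompt for this format might be assembled like the function below. The separators and field order are assumptions based on the common alpaca template, not confirmed by this card; check the model repository for the canonical template.

```python
# Sketch: build a prompt string from an alpaca-style record.
# Assumption: instruction and (optional) input are joined with blank lines;
# the "output" field is what the model is trained to generate, so it is
# omitted from the prompt at inference time.
def build_prompt(record: dict) -> str:
    parts = [record["instruction"]]
    if record.get("input"):
        parts.append(record["input"])
    return "\n\n".join(parts) + "\n\n"

example = {
    "instruction": "An interaction between a user providing instructions, "
                   "and an imaginative assistant providing responses.",
    "input": "Write a short scene where the members plan a surprise party.",
    "output": "...",
}
print(build_prompt(example))
```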
Note: this model uses a RoPE scaling factor of 4.0 with RoPE scaling type `linear`, which extends the effective context window.
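For reference, linear RoPE scaling with factor 4.0 is typically expressed in the model's `config.json` like this (a sketch of the usual `transformers` layout, not copied from this repository):

```json
{
  "rope_scaling": {
    "type": "linear",
    "factor": 4.0
  }
}
```

When loading with `transformers`, the same setting can usually be passed as a config override, e.g. `AutoModelForCausalLM.from_pretrained(model_id, rope_scaling={"type": "linear", "factor": 4.0})`, where `model_id` stands in for this repository's name.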