---
language:
- en
tags:
- text-generation-inference
---

# Model Card for Mistral-7B pre-trained

### Model Description

<!-- Provide a longer summary of what this model is. -->

This model is a **Mistral-7B** model fine-tuned on a collection of public domain fiction books.
- **Language(s) (NLP):** English
- **Finetuned from model:** [Mistral-7B](https://huggingface.co/mistralai/Mistral-7B-v0.1)
- **Dataset used for fine-tuning:** Public Domain Gutenberg Corpus fiction books
This model was fine-tuned on a collection of public domain books from [Project Gutenberg](https://www.gutenberg.org/).
It is intended to be further fine-tuned on task-specific data related to narrative content.
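
As a minimal usage sketch, the model can be loaded for text generation with the `transformers` library. The repository id below is a placeholder, since this card does not state the model's actual Hub name:

```python
# Placeholder repo id (assumption): replace with the model's actual Hub name.
MODEL_ID = "your-org/mistral-7b-gutenberg"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a continuation of `prompt` with the fine-tuned model."""
    # Imports are deferred so the sketch can be read and imported
    # without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Once upon a time"))
```

Since the model is trained on narrative text, open-ended story prompts such as the one above are a natural fit; for other tasks, further fine-tuning is expected.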