Update README.md
# Edgar Allen Poe LLM
EAP is a language model fine-tuned on the [EAP dataset](https://huggingface.co/datasets/nroggendorff/eap) with Supervised Fine-Tuning (SFT) using the TRL (Transformer Reinforcement Learning) library. It is based on the [Mistral 7B Instruct model](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3).
## Features
## Usage
To use the LLM, you can load the model using the Hugging Face Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
```