## Intended uses & limitations

#### How to use

You can use this model with the Transformers library. The lines between loading and decoding were collapsed in the diff, so the block below is a minimal reconstruction; the repository id and prompt are placeholders, and the decode step matches the original.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: substitute the full Hub repository id of this model.
model_id = "mistral_7b_yo_instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "..."  # your instruction
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=256)

# Keep only the newly generated tokens, skipping the echoed prompt.
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
print(response)
```

|
| 45 |
|
| 46 |
+
|
| 47 |
+
#### Example outputs
|
| 48 |
+
|
| 49 |
+
```
|
| 50 |
+
Ilana (Instruction): '...'
|
| 51 |
+
|
| 52 |
+
mistral_7b_yo_instruct: '...'
|
| 53 |
+
```
#### Eval results

Coming soon

#### Limitations and bias

This model is limited by its training dataset of entity-annotated news articles from a specific span of time, and may not generalize well to use cases in other domains.

#### Training data

This model is fine-tuned on 60k+ instruction-following demonstrations built from an aggregation of datasets ([AfriQA](https://huggingface.co/datasets/masakhane/afriqa), [XLSum](https://huggingface.co/datasets/csebuetnlp/xlsum), [MENYO-20k](https://huggingface.co/datasets/menyo20k_mt)) and translations of [Alpaca-gpt4](https://huggingface.co/datasets/vicgalle/alpaca-gpt4).

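Each demonstration pairs a natural-language instruction with a reference response. As a rough sketch of how a QA record can be turned into such a pair (the template and the field names here are illustrative assumptions, not the recipe used for this model):

```python
def to_demonstration(record):
    """Format one QA record as an instruction-following demonstration.

    The template and the field names ("question", "context", "answer")
    are illustrative assumptions, not the exact recipe used here.
    """
    prompt = f"Instruction: {record['question']}\nContext: {record['context']}"
    return {"prompt": prompt, "response": record["answer"]}


# Hypothetical record shaped like a QA example from one of the source datasets.
demo = to_demonstration({"question": "...", "context": "...", "answer": "..."})
```

The same idea extends to the summarization and translation sources: each record is rendered into a prompt/response pair before the datasets are pooled for fine-tuning.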
### Use and safety

We emphasize that mistral_7b_yo_instruct is intended for research purposes only and is not ready to be deployed for general use, because we have not designed adequate safety measures.

### BibTeX entry and citation info

```
@article{
```