- **Purpose**: Generate engaging titles from input text
- **Base Model**: [google/flan-t5-small](https://huggingface.co/google/flan-t5-small)

Flan-T5 is an enhanced version of T5, fine-tuned on over 1,000 additional tasks across multiple languages. This makes it better at a wide range of tasks like reasoning, question answering, and few-shot learning, even compared to much larger models.
## Intended Uses & Limitations

### Intended Uses
To use the model, follow these steps:

1. Input format: `topic||text`
2. The model will generate an attention-grabbing title based on the input text
3. Always review the output for relevance and appropriateness
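As a concrete illustration of step 1, the `topic||text` input format can be built with a small helper. This is a minimal sketch; the function name and sample strings are illustrative assumptions, not part of the model's API:

```python
def build_input(topic: str, text: str) -> str:
    # Join a topic and its source text in the model's expected "topic||text" format.
    return f"{topic}||{text}"

# Hypothetical example values for illustration.
prompt = build_input("nature", "The forest was calm and the river flowed gently.")
print(prompt)  # nature||The forest was calm and the river flowed gently.
```

The resulting string is what you pass to the tokenizer as the model's input text.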

```
generated_title = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_title)  # The Serenity of Nature: A Symbol of Peace and Harmony
```

## License

This model is released under the Apache 2.0 license.