---
language:
- en
pipeline_tag: text2text-generation
---

# News2Topic-T5-base
## Model Details

### Model Description

- **Model type:** Text-to-Text Generation
- **Language(s) (NLP):** English
- **License:** MIT License
## Usage

```python
from transformers import pipeline

pipe = pipeline("text2text-generation", model="textgain/News2Topic-T5-base")

# Example usage
news_text = "Your news text here."
print(pipe(news_text))
```
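A `text2text-generation` pipeline returns a list with one dict per input, with the output under the `generated_text` key. A minimal sketch of pulling out the predicted topics (the sample strings below are illustrative, not actual model output):

```python
# Illustrative shape of a text2text-generation pipeline result; the actual
# topic strings depend on the model's predictions for your inputs.
results = [{"generated_text": "Economy"}, {"generated_text": "Sports"}]

# Each input article yields one dict; the predicted topic is under "generated_text".
topics = [r["generated_text"] for r in results]
print(topics)  # → ['Economy', 'Sports']
```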
## Training Details

### Training Data

The News2Topic T5-base model was trained on a 21K sample of the "newsroom" dataset, annotated with synthetic topic labels generated by GPT-3.5-turbo.
### Training Procedure

The model was trained for 3 epochs, with a learning rate of 0.00001, a maximum sequence length of 512, and a training batch size of 12.
## Citation