How to use Suchinthana/T5-Base-Wikigen with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Suchinthana/T5-Base-Wikigen")
model = AutoModelForSeq2SeqLM.from_pretrained("Suchinthana/T5-Base-Wikigen")
```
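With the tokenizer and model loaded as above, a minimal generation sketch follows. The prompt prefix `"writeWiki: "` comes from the model card below; the topic string and the sampling settings (`max_length`, `do_sample`, `top_p`) are illustrative assumptions, not values from the card:

```python
# Sketch: generate an article with the loaded model.
# The "writeWiki: " prefix is required per the model card;
# sampling parameters here are illustrative, not prescribed.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Suchinthana/T5-Base-Wikigen")
model = AutoModelForSeq2SeqLM.from_pretrained("Suchinthana/T5-Base-Wikigen")

inputs = tokenizer("writeWiki: Solar System", return_tensors="pt")
outputs = model.generate(**inputs, max_length=256, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because sampling is enabled, repeated runs produce different text for the same prompt.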
Commit 3ca9ebe (parent: baa557e): Update README.md
### Fine-tuned T5 base model with Simple English Wikipedia Dataset

This model is fine-tuned with articles from Simple English Wikipedia for article generation. Around 25,000 articles were used for training.

### How to use

Each prompt must begin with the **"writeWiki: "** prefix.

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
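The pipeline example appears truncated in this copy. A minimal sketch, assuming the `text2text-generation` pipeline task (the usual choice for T5 seq2seq models) and illustrative sampling settings:

```python
# Sketch: text generation via a Transformers pipeline.
# Task name and sampling settings are assumptions, not from the card;
# the "writeWiki: " prompt prefix is required per the model card.
from transformers import pipeline

generator = pipeline("text2text-generation", model="Suchinthana/T5-Base-Wikigen")
result = generator("writeWiki: Sri Lanka", max_length=256, do_sample=True)
print(result[0]["generated_text"])
```

With `do_sample=True`, each run draws a different sequence, matching the behavior described above.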