Instructions to use abhitopia/question-answer-generation with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
- Notebooks
  - Google Colab
  - Kaggle

How to use abhitopia/question-answer-generation with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("abhitopia/question-answer-generation")
model = AutoModelForSeq2SeqLM.from_pretrained("abhitopia/question-answer-generation")
```
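Once loaded, the model can be run like any other seq2seq checkpoint. The sketch below is an assumption, not documented usage: the `generate questions:` task prefix and the sample passage are illustrative guesses at the input format, so check the repository's `code/` directory for the exact prompt the model was trained with.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and model from the Hub
tokenizer = AutoTokenizer.from_pretrained("abhitopia/question-answer-generation")
model = AutoModelForSeq2SeqLM.from_pretrained("abhitopia/question-answer-generation")

# Hypothetical task prefix -- the real format may differ
text = "generate questions: Python is a programming language created by Guido van Rossum."
inputs = tokenizer(text, return_tensors="pt")

# Generate and decode a single output sequence
output_ids = model.generate(**inputs, max_length=64)
result = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(result)
```

The decoded string is the model's generated text; for batched inputs, pass a list of strings to the tokenizer with `padding=True`.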
Committed by abhitopia

Commit · 3bd8adb · "bug fix"
Parent(s): d355235
Files changed: code/requirements.txt (+0 -2)
code/requirements.txt CHANGED

```diff
@@ -1,6 +1,4 @@
 wheel
-torch==1.12.1
-transformers==4.21.2
 nltk==3.7
 sentencepiece==0.1.97
 protobuf==3.20
```