Instructions to use curious008/BertForStorySkillClassification with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use curious008/BertForStorySkillClassification with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="curious008/BertForStorySkillClassification", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("curious008/BertForStorySkillClassification", trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained("curious008/BertForStorySkillClassification", trust_remote_code=True)
```

- Notebooks
- Google Colab
- Kaggle
update readme.md, add context

README.md changed (Model Architecture section, lines 32-38):

## Model Architecture

- **Base Model**: `bert-base-uncased`
- **Classification Layer**: A fully connected layer on top of BERT for 7-class classification.
- **Input**: Question text (e.g., "Who is the main character in the story?"), QA text (e.g., "why could n't alice get a doll as a child ? \<SEP> because her family was very poor"), or a QA pair plus context (e.g., "why could n't alice get a doll as a child ? \<SEP> because her family was very poor \<context> alice is ...")
- **Output**: Predicted label and confidence score.

---