Update README.md
---

### Model summary

`bert-sci-am` is a BERT-family model trained for argument mining in scientific literature. At a low level it performs sequence classification. This version is trained for 3-class classification on [david-inf/am-nlp-abstrct](https://huggingface.co/datasets/david-inf/am-nlp-abstrct), forked from the [pie/abstrct](https://huggingface.co/datasets/pie/abstrct/tree/main) dataset.

### How to use
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer


def load_model():
    """Load model from hub"""
    checkpoint = "david-inf/bert-sci-am"
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=3)
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    return model, tokenizer
```
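A minimal inference sketch (not part of the original card) showing how the loaded checkpoint might classify a single sentence. The example sentence is invented, and the printed class name comes from the checkpoint's `id2label` config rather than from this card:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the fine-tuned 3-class checkpoint from the Hub.
checkpoint = "david-inf/bert-sci-am"
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=3)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model.eval()

# Invented example sentence from a clinical abstract.
inputs = tokenizer("Treatment X significantly reduced pain scores.",
                   return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, 3)
pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label[pred_id])
```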