---
language:
- en
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- medical
library_name: transformers
---
### Model summary
`bert-sci-am` is a BERT-family model trained for argument mining on scientific literature. At a low level it performs sequence classification. This version was trained for 3-class classification on [david-inf/am-nlp-abstrct](https://huggingface.co/datasets/david-inf/am-nlp-abstrct), a fork of the [pie/abstrct](https://huggingface.co/datasets/pie/abstrct/tree/main) dataset.
### How to use
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

def load_model():
    """Load model and tokenizer from the Hub"""
    checkpoint = "david-inf/bert-sci-am"
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=3)
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    return model, tokenizer

model, tokenizer = load_model()
```
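
Once loaded, the model outputs raw logits over the 3 classes; a minimal inference sketch follows. The example sentence is purely illustrative, and the human-readable label names are read from `model.config.id2label` rather than assumed:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "david-inf/bert-sci-am"
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=3)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model.eval()

# Illustrative input sentence
text = "The treatment significantly reduced mortality in the trial cohort."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, 3)

probs = torch.softmax(logits, dim=-1)
pred = probs.argmax(dim=-1).item()
print(model.config.id2label[pred], round(probs[0, pred].item(), 3))
```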