---
language:
  - en
metrics:
  - accuracy
pipeline_tag: text-classification
tags:
  - medical
library_name: transformers
---

## Model summary

bert-sci-am is a BERT-family model fine-tuned for argument mining in scientific literature. At a low level it performs sequence classification. This version was trained for 3-class classification on [david-inf/am-nlp-abstrct](https://huggingface.co/datasets/david-inf/am-nlp-abstrct), a fork of the pie/abstrct dataset.

## How to use

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

def load_model():
    """Load model and tokenizer from the Hugging Face Hub"""
    checkpoint = "david-inf/bert-sci-am"
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=3)
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    return model, tokenizer

model, tokenizer = load_model()
```
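
As a minimal sketch of inference, the snippet below runs the loaded model on a single example sentence. The sentence is hypothetical, and the class names are not listed in this card, so look them up in `model.config.id2label` rather than assuming a particular mapping.

```python
import torch

# Hypothetical sentence from a clinical abstract
text = "The treatment group showed a significant reduction in symptoms."

# Tokenize and run a forward pass without tracking gradients
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring class; model.config.id2label holds the actual names
pred_id = logits.argmax(dim=-1).item()
print(pred_id, model.config.id2label.get(pred_id, "unknown"))
```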