---
language:
  - en
metrics:
  - accuracy
pipeline_tag: text-classification
tags:
  - medical
library_name: transformers
---

## Model summary

bert-sci-am is a BERT-family model trained for argument mining on scientific literature. Under the hood, it performs binary sequence classification.

## How to use

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer


def load_model():
    """Load the model and tokenizer from the Hugging Face Hub."""
    checkpoint = "david-inf/bert-sci-am"
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=2)
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    return model, tokenizer


model, tokenizer = load_model()
```
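Once loaded, the model and tokenizer can be used for inference in the usual way: tokenize a batch of texts, run a forward pass, and take the argmax over the two logits. A minimal sketch (the `classify` helper is illustrative, not part of the model card, and the card does not document which class id maps to which label):

```python
import torch


def classify(texts, model, tokenizer):
    """Return a predicted class id (0 or 1) for each input text."""
    inputs = tokenizer(texts, padding=True, truncation=True,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.argmax(dim=-1).tolist()


# Example call, using the model and tokenizer loaded above:
# preds = classify(["The results support the hypothesis."], model, tokenizer)
```

Batching inputs through the tokenizer with `padding=True` and `truncation=True` keeps the tensors rectangular and within the model's maximum sequence length.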