Improving reference mining in patents with BERT
Paper: arXiv:2101.01039
How to use kaesve/BERT_patent_reference_extraction with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="kaesve/BERT_patent_reference_extraction")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("kaesve/BERT_patent_reference_extraction")
model = AutoModelForMaskedLM.from_pretrained("kaesve/BERT_patent_reference_extraction")
```
This repository contains a fine-tuned BERT model that extracts references to scientific literature from patents.
See https://github.com/kaesve/patent-citation-extraction and https://arxiv.org/abs/2101.01039 for more information.
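Since the card exposes the model through a `fill-mask` head, one way to probe it is to mask a token in a patent-style citation string and inspect the model's top predictions. The sentence below is an invented example, not from the paper; the full reference-extraction workflow is described in the linked repository. Note that running this downloads the model weights from the Hugging Face Hub.

```python
# Sketch: querying the model through the fill-mask pipeline.
from transformers import pipeline

pipe = pipeline("fill-mask", model="kaesve/BERT_patent_reference_extraction")

# A patent-style citation sentence with one token masked;
# [MASK] is BERT's standard mask token.
sentence = "The method is described in Smith et al., Journal of [MASK] Chemistry, 1998."
predictions = pipe(sentence)

# Each prediction carries the filled-in token string and a confidence score.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```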