How to use nlpaueb/sec-bert-base with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="nlpaueb/sec-bert-base")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForPreTraining

tokenizer = AutoTokenizer.from_pretrained("nlpaueb/sec-bert-base")
model = AutoModelForPreTraining.from_pretrained("nlpaueb/sec-bert-base")
```
Hi, I am trying to train a similar model, but for summarization only. Is it possible to see the dataset used, or the code used to train the bert-uncased model?