Instructions to use nlpaueb/sec-bert-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use nlpaueb/sec-bert-base with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="nlpaueb/sec-bert-base")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForPreTraining

tokenizer = AutoTokenizer.from_pretrained("nlpaueb/sec-bert-base")
model = AutoModelForPreTraining.from_pretrained("nlpaueb/sec-bert-base")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
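A minimal sketch of the pipeline route above: sec-bert-base is a BERT model pre-trained on SEC filings, so the standard `[MASK]` token applies. The example sentence is an illustrative assumption, not taken from the model card.

```python
from transformers import pipeline

# Load the fill-mask pipeline for sec-bert-base (downloads ~440 MB on first run)
pipe = pipeline("fill-mask", model="nlpaueb/sec-bert-base")

# Hypothetical financial sentence with one masked token
results = pipe("Total net sales [MASK] 2% or $5.4 billion during 2019.")

# Each result carries the predicted token and its score, highest first
for r in results:
    print(r["token_str"], round(r["score"], 4))
```

The pipeline returns the top candidate tokens sorted by score; pass `top_k=` to change how many predictions are returned.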