Instructions for using SAP/BERT-Large-Contrastive-Self-Supervised-ACL2020 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use SAP/BERT-Large-Contrastive-Self-Supervised-ACL2020 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="SAP/BERT-Large-Contrastive-Self-Supervised-ACL2020")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("SAP/BERT-Large-Contrastive-Self-Supervised-ACL2020")
model = AutoModelForMaskedLM.from_pretrained("SAP/BERT-Large-Contrastive-Self-Supervised-ACL2020")
```

- Notebooks
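As a quick sanity check after loading, the pipeline can be called directly on a sentence containing BERT's `[MASK]` token. This is a minimal usage sketch; the example sentence is an assumption, not from the model card.

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="SAP/BERT-Large-Contrastive-Self-Supervised-ACL2020")

# Hypothetical example sentence; replace with your own text.
# The pipeline returns a list of candidate fills, each with a score.
predictions = pipe("The invoice was sent to the [MASK] department.")
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dict with `token_str` (the proposed fill), `score` (its probability), and the completed `sequence`.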
- Google Colab
- Kaggle
- Xet hash: eb5bee6c0e6e462c7b9f385f5718028f3bfe590a6c607d1fa1c71f61dfc7ac71
- Size of remote file: 1.34 GB
- SHA256: 8b2f345c1921617a7f91bb154086f28dfe0f2b64c3f4acf95e585fa0c7e68469