Instructions for using Cameron/BERT-mdgender-convai-binary with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Cameron/BERT-mdgender-convai-binary with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Cameron/BERT-mdgender-convai-binary")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Cameron/BERT-mdgender-convai-binary")
model = AutoModelForSequenceClassification.from_pretrained("Cameron/BERT-mdgender-convai-binary")
```
- Notebooks
- Google Colab
- Kaggle
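As a quick sketch of running inference with the pipeline shown above — the sample sentence is illustrative, and the exact label names returned depend on this model's configuration:

```python
from transformers import pipeline

# Load the classifier; model weights are downloaded on first use
pipe = pipeline("text-classification", model="Cameron/BERT-mdgender-convai-binary")

# Classify a sample sentence (text chosen for illustration only)
result = pipe("My sister is a great engineer.")

# The pipeline returns a list of {"label": ..., "score": ...} dicts,
# one per input text, with score in [0, 1]
print(result)
```

The same `pipe` object accepts a list of strings for batched classification.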
- Xet hash: 3da2b33a4376cb164935f510e382f3d3213a1b9a9f99c670c87bd4129cddefc1
- Size of remote file: 433 MB
- SHA256: 624408a8fdfeadbc0dfae8b50f9ac46b3cc2d6949813e7ad804352fbb393518e
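To verify a downloaded file against the SHA256 checksum above, a small streaming helper like the following works; the `model.safetensors` filename in the usage comment is a hypothetical example, not taken from this page:

```python
import hashlib

# Checksum listed on the model page
EXPECTED_SHA256 = "624408a8fdfeadbc0dfae8b50f9ac46b3cc2d6949813e7ad804352fbb393518e"

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so large files never load fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (filename is hypothetical):
# assert sha256_of_file("model.safetensors") == EXPECTED_SHA256
```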
Xet efficiently stores large files inside Git by splitting them into unique, content-defined chunks, which deduplicates storage and accelerates uploads and downloads.
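The chunk-splitting idea can be illustrated with a toy content-defined chunker. This is a simplified sketch, not Xet's real algorithm: the rolling-hash formula, mask, and minimum chunk size below are made-up parameters for demonstration.

```python
def chunk_bytes(data: bytes, mask: int = 0x3F, min_size: int = 16) -> list[bytes]:
    """Split data wherever a simple rolling hash hits a boundary pattern.

    Because boundaries depend only on the content since the last cut,
    identical regions of different files tend to split into identical
    chunks, which is what makes deduplicated storage possible.
    """
    chunks = []
    start = 0
    h = 0
    for i, byte in enumerate(data):
        h = ((h * 31) + byte) & 0xFFFFFFFF
        # Cut when the hash's low bits are all zero and the chunk is big enough
        if i - start + 1 >= min_size and (h & mask) == 0:
            chunks.append(data[start:i + 1])
            start = i + 1
            h = 0
    if start < len(data):
        chunks.append(data[start:])
    return chunks

data = bytes(range(256)) * 8
chunks = chunk_bytes(data)
assert b"".join(chunks) == data  # chunking is lossless
```

Uploading only chunks the server has not seen before is what accelerates transfers of large, mostly-unchanged files.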