Instructions to use unitary/toxic-bert with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use unitary/toxic-bert with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="unitary/toxic-bert")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("unitary/toxic-bert")
model = AutoModelForSequenceClassification.from_pretrained("unitary/toxic-bert")
```

- Inference
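toxic-bert was trained as a multi-label toxicity classifier on the Jigsaw toxic-comment data, so per-label probabilities are usually obtained by applying a sigmoid to each logit independently, rather than a softmax across labels. The sketch below illustrates that post-processing step with made-up logit values; the label order shown is an assumption, and a real run would take the logits from `model(**tokenizer(text, return_tensors="pt")).logits` instead.

```python
import math

def sigmoid(x: float) -> float:
    """Logistic function: maps a logit to an independent per-label probability."""
    return 1.0 / (1.0 + math.exp(-x))

# Jigsaw toxicity label set (assumed order); logits are illustrative, not model output.
labels = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]
logits = [2.0, -3.0, -1.0, -4.0, 0.5, -2.5]

# Each label is scored independently, so several can exceed a 0.5 threshold at once.
scores = {label: sigmoid(z) for label, z in zip(labels, logits)}
```

With these example logits, only `toxic` and `insult` would cross a 0.5 threshold, which is the kind of overlapping output a softmax could not produce.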
- Notebooks
- Google Colab
- Kaggle
- Xet hash: c66f8d14210dfeea371c17c91e1ff954955d3401a8414d5c1f42140c65deb776
- Size of remote file: 438 MB
- SHA256: 0d0f2f0f91fba48e031aa7afa281c4b57fea20bbff32cef6d01d3e8ac74ba1d6
Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.