# johnpaulbin/toxic-MiniLM-L6-H384-uncased
Tests whether a sentence is toxic. Works only for English sentences.
## Usage
Basic binary classification with two labels: `NOT TOXIC` and `TOXIC`.

First install setfit:

`pip install setfit`
```python
from setfit import SetFitModel

# Load the pretrained SetFit classifier
model = SetFitModel.from_pretrained("johnpaulbin/beanbox-toxic")

inpt = "your sentence here"

# predict_proba returns per-class probabilities: [NOT TOXIC, TOXIC]
out = model.predict_proba([inpt])

if out[0][0] > out[0][1]:
    print("Not toxic")
else:
    print("Toxic!")

print(f"NOT TOXIC: {out[0][0]}\nTOXIC: {out[0][1]}")
```
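If you only need a hard label rather than probabilities, the comparison above can be wrapped in a small helper. This is an illustrative sketch, not part of the setfit API; `label_from_proba` and its `threshold` parameter are assumptions for this example.

```python
def label_from_proba(probs, threshold=0.5):
    """Map a [NOT TOXIC, TOXIC] probability pair to a string label.

    probs: sequence of two floats as returned for one input by
    model.predict_proba (index 0 = NOT TOXIC, index 1 = TOXIC).
    """
    return "TOXIC" if probs[1] >= threshold else "NOT TOXIC"


# Example with hand-written probability pairs:
print(label_from_proba([0.9, 0.1]))  # NOT TOXIC
print(label_from_proba([0.2, 0.8]))  # TOXIC
```

Raising `threshold` trades recall for precision: the classifier only flags a sentence as toxic when it is more confident.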