# johnpaulbin/toxic-MiniLM-L6-H384-uncased
Test whether a sentence is toxic. Works only for English sentences.
## Usage
Basic binary classification. Labels, in output order: `NOT TOXIC`, `TOXIC`.
First install setfit:

`pip install setfit`
```python
from setfit import SetFitModel

# Load the fine-tuned SetFit toxicity classifier.
model = SetFitModel.from_pretrained("johnpaulbin/beanbox-toxic")

sentence = "Have a great day!"  # replace with the sentence to test
out = model.predict_proba([sentence])

# out[0] holds [P(NOT TOXIC), P(TOXIC)] for the single input sentence.
if out[0][0] > out[0][1]:
    print("Not toxic")
else:
    print("Toxic!")
print(f"NOT TOXIC: {out[0][0]}\nTOXIC: {out[0][1]}")
```
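For batch inputs, `predict_proba` returns one probability pair per sentence. A minimal sketch of turning those pairs into labels (the `to_labels` helper and the probability values are illustrative, not part of the library):

```python
# Map [P(NOT TOXIC), P(TOXIC)] pairs to string labels.
# Index order follows the model card: index 0 = NOT TOXIC, index 1 = TOXIC.
def to_labels(probs, threshold=0.5):
    return ["TOXIC" if p[1] >= threshold else "NOT TOXIC" for p in probs]

# Example with made-up probabilities (not real model output):
print(to_labels([[0.93, 0.07], [0.12, 0.88]]))  # → ['NOT TOXIC', 'TOXIC']
```

Raising `threshold` above 0.5 makes the classifier more conservative about flagging a sentence as toxic.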