borodache committed on
Commit
c6058eb
·
verified ·
1 Parent(s): e0eeaba

Update README.md

Files changed (1): README.md (+7 −6)
@@ -15,15 +15,16 @@ tags: []
 
 <!-- Provide a longer summary of what this model is. -->
 
-This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
+This is a multi-label encoder model. It classifies text for NSFW, Hate Speech, and Bullying. For each label, the output is a probability between 0 and 1 indicating whether that label applies.
+A probability of 0.5 or above means the label is True; otherwise it is False. If all labels are False, the text is considered non-toxic.
 
-- **Developed by:** [More Information Needed]
+- **Developed by:** Eli Borodach & Yoav Yosef
 - **Funded by [optional]:** [More Information Needed]
 - **Shared by [optional]:** [More Information Needed]
-- **Model type:** [More Information Needed]
-- **Language(s) (NLP):** [More Information Needed]
+- **Model type:** Multi-Label Encoder
+- **Language(s) (NLP):** English
 - **License:** [More Information Needed]
-- **Finetuned from model [optional]:** [More Information Needed]
+- **Finetuned from model [optional]:** distilBERT
 
 ### Model Sources [optional]
 
@@ -31,7 +32,7 @@ This is the model card of a 🤗 transformers model that has been pushed on the
 
 - **Repository:** [More Information Needed]
 - **Paper [optional]:** [More Information Needed]
-- **Demo [optional]:** [More Information Needed]
+- **Demo [optional]:** [Telegram chat](https://t.me/distilBERT_toxic_detector)
 
 ## Uses
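The decision rule described in the updated summary (one probability per label, 0.5 cutoff, non-toxic only if all labels are False) can be sketched as below. The `interpret` helper and the example probabilities are illustrative assumptions, not part of the model's actual API:

```python
# Hedged sketch of the thresholding rule from the model card:
# each label gets an independent probability; >= 0.5 means the label applies.
LABELS = ["NSFW", "Hate Speech", "Bullying"]
THRESHOLD = 0.5

def interpret(probabilities):
    """Map per-label probabilities to boolean flags and an overall verdict."""
    flags = {label: p >= THRESHOLD for label, p in zip(LABELS, probabilities)}
    is_toxic = any(flags.values())  # non-toxic only if every label is False
    return flags, is_toxic

flags, is_toxic = interpret([0.91, 0.12, 0.40])
print(flags)     # {'NSFW': True, 'Hate Speech': False, 'Bullying': False}
print(is_toxic)  # True
```

Because the labels are scored independently (multi-label, not multi-class), the probabilities need not sum to 1 and more than one label can be True for the same text.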