borodache committed on
Commit 04ccddd · verified · 1 Parent(s): c6058eb

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -15,7 +15,7 @@ tags: []
 
 <!-- Provide a longer summary of what this model is. -->
 
- This is a multi-label encoder model. It classifies text for NSFW, Hate Speech, and Bullying. The output for each label is a probability between 0 and 1 indicating whether that label applies.
+ This is a multi-label encoder SLM. It classifies text for NSFW, Hate Speech, and Bullying. The output for each label is a probability between 0 and 1 indicating whether that label applies.
 A probability of 0.5 or above means the label is True; otherwise it is False. If all labels are False, the text is free of toxicity.
 
 - **Developed by:** Eli Borodach & Yoav Yosef
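
The README's thresholding rule (a per-label probability, True at 0.5 or above, toxicity-free when every label is False) can be sketched as follows. This is a minimal illustration, not the model's actual API: the label names follow the README, but the `probs` dictionary and `interpret` helper are hypothetical.

```python
# Hypothetical sketch of the thresholding described in the README.
# `probs` stands in for whatever per-label probabilities the model returns.

LABELS = ["NSFW", "Hate Speech", "Bullying"]

def interpret(probs, threshold=0.5):
    """Map per-label probabilities to booleans; report overall toxicity."""
    # A label is True when its probability meets the 0.5 threshold.
    flags = {label: probs.get(label, 0.0) >= threshold for label in LABELS}
    # The text is toxicity-free only if every label is False.
    is_toxic = any(flags.values())
    return flags, is_toxic

flags, is_toxic = interpret({"NSFW": 0.12, "Hate Speech": 0.07, "Bullying": 0.03})
print(flags, is_toxic)  # all labels False -> text is toxicity-free
```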