Transformers · Safetensors · English · bart · text2text-generation
tteofili committed (verified) · Commit c1ea051 · 1 Parent(s): 8b613d4

Update README.md

Files changed (1):
  1. README.md +2 -2
README.md CHANGED
@@ -1,7 +1,7 @@
 ---
 license: apache-2.0
 datasets:
-- jigsaw_toxicity_pred
+- raquiba/Sarcasm_News_Headline
 language:
 - en
 metrics:
@@ -14,7 +14,7 @@ This model is a `facebook/bart-large` fine-tuned on sarcastic comments from `raq
 ## Model Details
 
 This model is not intended to be used for plain inference as it is very likely to predict sarcastic content.
-It is intended to be used instead as "utility model" for detecting and fixing toxic content as its token probability distributions will likely differ from comparable models not trained/fine-tuned over sarcastic data.
+It is intended instead to be used as a "utility model" for detecting and fixing sarcastic content, as its token probability distributions will likely differ from those of comparable models not trained/fine-tuned on sarcastic data.
 Its name `gminus` refers to the _G-_ model in [Detoxifying Text with MARCO: Controllable Revision with Experts and Anti-Experts](https://aclanthology.org/2023.acl-short.21.pdf).
 
 ### Model Description
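The "utility model" role described in the README text can be sketched with the expert/anti-expert ensemble idea from the cited MARCO/DExperts line of work, where an anti-expert like _G-_ (fine-tuned on sarcastic data) steers decoding away from the style it was trained on. The sketch below is illustrative only: it uses a toy 3-token vocabulary and hand-picked logits rather than real BART outputs, and the combination rule `base + alpha * (expert - antiexpert)` and the value of `alpha` are assumptions about how such an ensemble is typically wired, not this repository's exact decoding code.

```python
import math

def combine_logits(base, expert, antiexpert, alpha=1.0):
    """Ensemble next-token logits: base + alpha * (expert - antiexpert).

    Tokens the anti-expert (e.g. a sarcasm-fine-tuned model like gminus)
    rates as likely get penalized, steering decoding away from them.
    """
    return [b + alpha * (e - a) for b, e, a in zip(base, expert, antiexpert)]

def softmax(logits):
    # Numerically stable softmax over a plain list of floats.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [x / total for x in exps]

# Toy 3-token vocabulary. The anti-expert assigns high probability to
# token 0 (standing in for a "sarcastic" continuation).
base_logits = [2.0, 1.0, 0.5]
expert_logits = [1.0, 1.5, 1.0]
antiexpert_logits = [3.0, 0.5, 0.5]

plain = softmax(base_logits)
combined = softmax(combine_logits(base_logits, expert_logits, antiexpert_logits))

# The combined distribution shifts probability mass away from the token
# the anti-expert prefers.
print(combined[0] < plain[0])  # → True
```

In practice `base`, `expert`, and `antiexpert` would be the next-token logits of three comparable language models (the README suggests the anti-expert's distribution will differ most on sarcastic tokens, which is exactly what the subtraction exploits).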