caush committed
Commit 6e54b50 · 1 Parent(s): 32b904d

Update README.md

Files changed (1):
  1. README.md +9 -4
README.md CHANGED
@@ -12,23 +12,28 @@ should probably proofread and complete it, then remove this comment. -->
 
 # Clickbait1
 
-This model is a fine-tuned version of [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384) on the None dataset.
+This model is a fine-tuned version of [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384) on the [Webis-Clickbait-17](https://zenodo.org/record/5530410) dataset.
 It achieves the following results on the evaluation set:
 - Loss: 0.0257
 
 ## Model description
 
-More information needed
+MiniLM is a distilled model from the paper "MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers".
+
+We fine-tune this model to estimate (as a regression task) the clickbait strength of news titles.
 
 ## Intended uses & limitations
 
-More information needed
+The model resembles the one described in the paper [Predicting Clickbait Strength in Online Social Media](https://aclanthology.org/2020.coling-main.425/) by Vijayasaradhi Indurthi, Bakhtiyar Syed, Manish Gupta, and Vasudeva Varma.
+
+The model was trained on English titles only.
 
 ## Training and evaluation data
 
-More information needed
+We trained the model on the official training data for the challenge (clickbait17-train-170630.zip, 894 MiB, 19538 posts), plus another set that only became available after the end of the challenge (clickbait17-train-170331.zip, 157 MiB, 2459 posts).
 
 ## Training procedure
+Code can be found on [GitHub](https://github.com/caush/Clickbait).
 
 ### Training hyperparameters
 
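The card reports a single evaluation loss for the regression objective. Assuming the metric is mean squared error between predicted and annotated clickbait strengths (an assumption; the card does not name the loss function, though Webis-Clickbait-17 annotations are mean crowd judgments in [0, 1]), the computation can be sketched as:

```python
def mse(predicted, gold):
    """Mean squared error between predicted and annotated clickbait strengths."""
    assert len(predicted) == len(gold) and predicted
    return sum((p - g) ** 2 for p, g in zip(predicted, gold)) / len(predicted)

# Hypothetical scores in [0, 1], in the style of Webis-Clickbait-17 annotations.
predictions = [0.10, 0.75, 0.33]
annotations = [0.00, 0.80, 0.40]
print(round(mse(predictions, annotations), 4))  # → 0.0058
```

On this reading, the reported 0.0257 would correspond to an average per-title error of roughly 0.16 on the [0, 1] strength scale.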