caush committed
Commit 8d65c33 · 1 Parent(s): 43f8fe5

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -20,13 +20,13 @@ It achieves the following results on the evaluation set:
 
 MiniLM is a distilled model from the paper "MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers".
 
-We fine tune this model to evaluate (regression) the clickbait level of a title news.
+We fine-tune this model to evaluate (regression) the clickbait level of news titles.
 
 ## Intended uses & limitations
 
-Model was designed to test the possibilities of Transformers in this cas of NLP problem (like in the paper "Predicting Clickbait Strength in Online Social Media" by Indurthi Vijayasaradhi, Syed Bakhtiyar, Gupta Manish, Varma Vasudeva).
+The model was designed to test Transformers on this kind of NLP problem (as in the paper "Predicting Clickbait Strength in Online Social Media" by Indurthi Vijayasaradhi, Syed Bakhtiyar, Gupta Manish, Varma Vasudeva).
 
-The model wa trained in english.
+The model was trained on English titles.
 
 ## Training and evaluation data
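For context, the updated model card describes a MiniLM checkpoint with a single-output regression head that scores how clickbait-y a news title is. A minimal inference sketch, with two loudly labeled assumptions: the `model_id` below is a placeholder (the checkpoint name is not stated in this diff), and the label scale is taken to be [0, 1], as in the Webis Clickbait Challenge data used by the cited paper.

```python
def clamp_score(x: float, lo: float = 0.0, hi: float = 1.0) -> float:
    # Clamp the raw regression output to the assumed [0, 1] annotation range.
    return max(lo, min(hi, x))

def clickbait_score(title: str, model_id: str) -> float:
    """Score one headline; higher means more clickbait-y (assumed scale [0, 1])."""
    # Imports kept local so the clamp helper above stays dependency-free.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # A checkpoint fine-tuned with num_labels=1 yields a regression head.
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(title, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits  # shape (1, 1) for a single-label head
    return clamp_score(logits.squeeze().item())
```

Substitute the actual checkpoint name from this repository for `model_id` before use.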