ianporada committed
Commit d6fd3bd · 1 Parent(s): 7f684ca

Update README.md

Files changed (1): README.md +30 -0
README.md CHANGED
@@ -1,3 +1,33 @@
 ---
 license: unknown
 ---
+
+# The Knowref 60K Dataset
+
+The second version of the Knowref dataset.
+From: https://github.com/aemami1/KnowRef60k
+
+
+
+Citation:
+```
+@inproceedings{emami-etal-2020-analysis,
+    title = "An Analysis of Dataset Overlap on {W}inograd-Style Tasks",
+    author = "Emami, Ali and
+      Suleman, Kaheer and
+      Trischler, Adam and
+      Cheung, Jackie Chi Kit",
+    editor = "Scott, Donia and
+      Bel, Nuria and
+      Zong, Chengqing",
+    booktitle = "Proceedings of the 28th International Conference on Computational Linguistics",
+    month = dec,
+    year = "2020",
+    address = "Barcelona, Spain (Online)",
+    publisher = "International Committee on Computational Linguistics",
+    url = "https://aclanthology.org/2020.coling-main.515",
+    doi = "10.18653/v1/2020.coling-main.515",
+    pages = "5855--5865",
+    abstract = "The Winograd Schema Challenge (WSC) and variants inspired by it have become important benchmarks for common-sense reasoning (CSR). Model performance on the WSC has quickly progressed from chance-level to near-human using neural language models trained on massive corpora. In this paper, we analyze the effects of varying degrees of overlaps that occur between these corpora and the test instances in WSC-style tasks. We find that a large number of test instances overlap considerably with the pretraining corpora on which state-of-the-art models are trained, and that a significant drop in classification accuracy occurs when models are evaluated on instances with minimal overlap. Based on these results, we provide the WSC-Web dataset, consisting of over 60k pronoun disambiguation problems scraped from web data, being both the largest corpus to date, and having a significantly lower proportion of overlaps with current pretraining corpora.",
+}
+```
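
The cited paper evaluates models by classification accuracy on two-candidate pronoun disambiguation instances. A minimal sketch of that evaluation loop is below; the field names (`sentence`, `candidates`, `label`) are illustrative assumptions, not the dataset's actual schema.

```python
# Minimal sketch of evaluating a model on Winograd-style pronoun
# disambiguation instances. Field names here are assumptions for
# illustration, not the KnowRef 60K schema.

def evaluate(instances, predict):
    """Return classification accuracy of `predict` over `instances`."""
    correct = sum(1 for ex in instances if predict(ex) == ex["label"])
    return correct / len(instances)

# A toy instance in the two-candidate format used by WSC-style tasks.
sample = [
    {
        "sentence": "The trophy didn't fit in the suitcase because it was too big.",
        "candidates": ["the trophy", "the suitcase"],
        "label": 0,  # index of the correct antecedent
    },
]

# A trivial baseline that always picks the first candidate.
print(evaluate(sample, lambda ex: 0))
```

A real model would replace the lambda with a scorer that compares the pronoun against each candidate antecedent and returns the index of the higher-scoring one.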