Amo999 committed · Commit 48aa5f8 · verified · 1 parent: 242f8cd

Update README.md

Files changed (1): README.md (+174, −57)
README.md CHANGED
@@ -132,66 +132,76 @@ The COLLE dataset suite hosts multiple French NLP benchmark tasks for evaluating
132
 
133
  ## Task Descriptions
134
 
135
- ## Allo-ciné:
136
Allo-ciné tests language understanding through sentiment classification: given movie reviews that are either positive or negative, the task is to predict the correct sentiment for each review.
137
- ## paws_x:
138
This task tests paraphrase identification: given two sentences, predict a label indicating whether or not they are equivalent in meaning.
139
 
140
- ## fquad:
141
FQuAD is a question/answer-pair dataset built on high-quality Wikipedia articles. The model's goal in this task is to predict whether the answer to the question can actually be found in the provided passage.
142
 
143
- ## gqnli:
144
  The dataset consists of carefully constructed premise-hypothesis pairs that involve quantifier logic (e.g., most, at least, more than half). The goal is to evaluate the model's ability to reason about these expressions and determine whether the hypothesis logically follows from the premise, contradicts it, or is neutral.
145
 
146
- ## piaf:
147
This task consists of pairs of questions and answer texts, annotated with the position of the truly relevant information within the text.
148
 
149
- ## sickfr:
150
This task also uses sentence pairs, rating them on two dimensions: relatedness and entailment. Relatedness is scored on a scale from 1 to 5, while entailment is a choice between entails, contradicts, or neutral.
151
 
152
- ## xnli:
153
This task consists of sentence pairs where the goal is to determine the relation between the two sentences: entailment, neutral, or contradiction.
154
 
155
- ## qfrcola:
156
QFrCoLA is a French dataset built from multiple French-language sites such as académie-française.fr and vitrinelinguistique.com. It tests a model's ability to judge a sentence's acceptability in French with respect to grammar and syntax. The answer is a binary label indicating whether the sentence is correct.
157
 
158
- ## qfrblimp:
159
This task gives the model sentence pairs; the goal is to determine whether the sentences are semantically equivalent, or, put more simply, whether they mean the same thing despite slightly different syntax and wording.
160
 
161
- ## sts22:
162
This task evaluates whether pairs of news articles, written in different languages, cover the same story. It focuses on document-level similarity, where systems rate article pairs on a 4-point scale from most to least similar.
163
 
164
- ## wino_x_lm
165
  Pronoun resolution task: choose between two referents in a sentence with an ambiguous pronoun.
166
 
167
- ## wino_x_mt
168
  Translation-based pronoun resolution: choose which of two French translations uses the correct gendered pronoun.
169
 
170
- ## expressions_quebecoises
171
- Given a Quebec proverb, predict its corresponding French equivalent.
172
 
173
- ## termes_quebecoises
174
- Translate a Quebec French term to standard French or English based on a phrase.
175
 
176
- ## daccord
177
  Determine if a French sentence makes sense semantically (binary label).
178
 
179
- ## french_boolq
180
  Boolean question answering in French: answer true/false based on context.
181
 
182
- ## mnli-nineeleven-fr-mt
183
  French machine-translated version of MNLI using 9/11 context, for entailment classification.
184
 
185
- ## rte3-french
186
  French version of RTE3 for textual entailment recognition.
187

188

189
 
190
  ## Language
191
The language data in COLLE is in French.
192
 
193
  ### Dataset structure
194
- ## Allo-ciné:
195
  ```json
196
 
197
  {
@@ -199,7 +209,7 @@ The language data in COLLE is in French
199
  "label": 1
200
  }
201
  ```
202
- ## paws_x:
203
  ```json
204
  {
205
  "id": 12,
@@ -208,7 +218,7 @@ The language data in COLLE is in French
208
  "label": 0
209
  }
210
  ```
211
- ## fquad:
212
  ```json
213
  {
214
  "title": "pégase_23_3",
@@ -228,7 +238,7 @@ The language data in COLLE is in French
228
  "sent2": "Le gâteau à la glace."
229
  }
230
  ```
231
- ## gqnli:
232
  ```json
233
  {
234
  "uid": 214,
@@ -240,7 +250,7 @@ The language data in COLLE is in French
240
  "hypothesis_original": "One beige bear runs."
241
  }
242
  ```
243
- ## piaf:
244
  ```json
245
  {
246
 
@@ -251,7 +261,7 @@ The language data in COLLE is in French
251
  "answer_start":[222]}
252
  }
253
  ```
254
- ## sickfr:
255
  ```json
256
  {
257
  "Unnamed: 0": 5,
@@ -261,7 +271,7 @@ The language data in COLLE is in French
261
  "sentence_B": "Il n'y a pas de lutte et d'étreinte de chiens."
262
  }
263
  ```
264
- ## xnli:
265
  ```json
266
 
267
  {
@@ -270,8 +280,7 @@ The language data in COLLE is in French
270
  "label": 1
271
  }
272
  ```
273
-
274
- ## qfrcola:
275
  ```json
276
  {
277
  "label": 1,
@@ -280,7 +289,7 @@ The language data in COLLE is in French
280
  "category": "anglicism"
281
  }
282
  ```
283
- ## wino_x_lm:
284
  ```json
285
  {
286
  "qID": "3UDTAB6HH8D37OQL3O6F3GXEEOF09Z-1",
@@ -296,7 +305,7 @@ The language data in COLLE is in French
296
  "context_referent_of_option2_fr": "vase"
297
  }
298
  ```
299
- ## wino_x_mt
300
  ```json
301
  {
302
  "qID": "3FULMHZ7OUVKJ7S9R0LMS753751M44-1",
@@ -314,7 +323,7 @@ The language data in COLLE is in French
314
  "false_translation_referent_of_pronoun2_fr": "arme"
315
  }
316
  ```
317
- ## rte3-french
318
  ```json
319
  {
320
  "id": "1",
@@ -329,7 +338,7 @@ The language data in COLLE is in French
329
 
330
  ```
331
 
332
- ## daccord
333
  ```json
334
  {
335
  "id": "a001",
@@ -341,7 +350,7 @@ The language data in COLLE is in French
341
  "genre": "conflit ukrainien-russe"
342
  }
343
  ```
344
- ## qfrblimp:
345
  ```json
346
  {
347
  "id": 250,
@@ -359,7 +368,7 @@ The language data in COLLE is in French
359
  "answer": "accept"
360
  }
361
  ```
362
- ## sts22:
363
  ```json
364
  {
365
  "id": "1559147599_1558534688",
@@ -368,7 +377,7 @@ The language data in COLLE is in French
368
  "sentence2": "Le décret n° 2020-293 du 23 mars 2020..."
369
  }
370
  ```
371
- ## french_boolq
372
  ```json
373
 
374
  {
@@ -377,7 +386,7 @@ The language data in COLLE is in French
377
  "label": 1
378
  }
379
  ```
380
- ## expressions_quebecoises
381
 
382
  ```json
383
  {
@@ -398,7 +407,55 @@ The language data in COLLE is in French
398
  "reference": "https://canada-media.ca/expressions-quebecoises/"
399
  }
400
  ```
401
- ## mnli-nineeleven-fr-mt
402
  ```json
403
  {
404
  "premise": "La faillite du nationalisme laïque et autocratique était évidente dans le monde musulman à la fin des années 1970.",
@@ -412,31 +469,58 @@ The language data in COLLE is in French
412
  }
413
 
414
  ```
415
- ## Allo-ciné:
416
  | split | # examples |
417
  |------------|-----------:|
418
  | train | |
419
  | validation | 20000 |
420
  | test | 20000 |
421
- ## paws_x:
422
  | split | # examples |
423
  |------------|-----------:|
424
  | train | 49401 |
425
  | validation | 2000 |
426
  | test | 2000 |
427
- ## fquad:
428
  | split | # examples |
429
  |------------|-----------:|
430
  | validation | 100 |
431
  | test | 400 |
432
 
433
- ## gqnli:
434
  | split | # examples |
435
  |------------|-----------:|
436
  | train | 243 |
437
  | validation | 27 |
438
  | test | 30 |
439
- ## piaf:
440
  | split | # examples |
441
  |------------|-----------:|
442
  | train | 3105 |
@@ -448,49 +532,51 @@ The language data in COLLE is in French
448
  | train | 4439 |
449
  | validation | 495 |
450
  | test | 4906 |
451
- ## xnli:
452
  | split | # examples |
453
  |------------|-----------:|
454
- | train | NA |
455
  | validation | 2490 |
456
  | test | 5010 |
457
- ## qfrcola:
458
  | split | # examples |
459
  |------------|-----------:|
460
  | train | 15846 |
461
  | validation | 1761 |
462
  | test | 7546 |
463
- ## qfrblimp:
464
  | split | # examples |
465
  |------------|-----------:|
466
  | train | NA |
467
  | validation | 2061 |
468
  | test | 2290 |
469
- ## sts22:
470
 
471
  | split | # examples |
472
  |------------|-----------:|
473
  | train | 101 |
474
  | test | 72 |
475
 
476
- ## french_boolq
477
 
478
  | split | # examples |
479
  |------------|-----------:|
480
  | test | 178 |
481
- ## sts22:
482
 
483
  | split | # examples |
484
  |------------|-----------:|
485
- | train | 101 |
486
- | test | 72 |
487
- ## mnli-nineeleven-fr-mt
488
 
489
  | split | # examples |
490
  |------------|-----------:|
491
  | test | 2000 |
492
 
493
- ## daccord
494
  | split | # examples |
495
  |------------|-----------:|
496
  | test | 1034 |
@@ -498,15 +584,46 @@ The language data in COLLE is in French
498
  | split | # examples |
499
  |------------|-----------:|
500
  | test | 800 |
501
- | dev | 800 |
502
- ## wino_x_lm
503
  | split | # examples |
504
  |------------|-----------:|
505
  | test | 2793 |
506
- ## wino_x_mt
507
  | split | # examples |
508
  |------------|-----------:|
509
| test | 2988 |
510
 
511
 
512
 
 
132
 
133
  ## Task Descriptions
134
 
135
+ ## Allocine.fr:
136
Allo-ciné tests language understanding through sentiment classification: given movie reviews that are either positive or negative, the task is to predict the correct sentiment for each review.
137
+ ## PAWS-X:
138
This task tests paraphrase identification: given two sentences, predict a label indicating whether or not they are equivalent in meaning.
139
 
140
+ ## FQuAD:
141
FQuAD is a question/answer-pair dataset built on high-quality Wikipedia articles. The model's goal in this task is to predict whether the answer to the question can actually be found in the provided passage.
142
 
143
+ ## GQNLI-fr:
144
  The dataset consists of carefully constructed premise-hypothesis pairs that involve quantifier logic (e.g., most, at least, more than half). The goal is to evaluate the model's ability to reason about these expressions and determine whether the hypothesis logically follows from the premise, contradicts it, or is neutral.
145
 
146
+ ## PIAF:
147
This task consists of pairs of questions and answer texts, annotated with the position of the truly relevant information within the text.
148
 
149
+ ## SICK-fr:
150
This task also uses sentence pairs, rating them on two dimensions: relatedness and entailment. Relatedness is scored on a scale from 1 to 5, while entailment is a choice between entails, contradicts, or neutral.
151
 
152
+ ## XNLI-fr:
153
This task consists of sentence pairs where the goal is to determine the relation between the two sentences: entailment, neutral, or contradiction.
154
 
155
+ ## QFrCoLA:
156
QFrCoLA is a French dataset built from multiple French-language sites such as académie-française.fr and vitrinelinguistique.com. It tests a model's ability to judge a sentence's acceptability in French with respect to grammar and syntax. The answer is a binary label indicating whether the sentence is correct.
157
 
158
+ ## QFrBLiMP:
159
This task gives the model sentence pairs; the goal is to determine whether the sentences are semantically equivalent, or, put more simply, whether they mean the same thing despite slightly different syntax and wording.
160
 
161
+ ## STS22:
162
This task evaluates whether pairs of news articles, written in different languages, cover the same story. It focuses on document-level similarity, where systems rate article pairs on a 4-point scale from most to least similar.
163
 
164
+ ## Wino-X-LM
165
  Pronoun resolution task: choose between two referents in a sentence with an ambiguous pronoun.
166
 
167
+ ## Wino-X-MT
168
  Translation-based pronoun resolution: choose which of two French translations uses the correct gendered pronoun.
169

170
 
171
+ ## QFrCoRE
172
+ QFrCoRE is a definition matching task where the model selects the correct standard French definition for a Quebec French expression from a list of candidates.
173
+ ## QFrCoRT
174
+ QFrCoRT is a definition matching task where the model selects the correct standard French definition for a Quebec French term from a list of candidates.
175
+
176
+ ## FraCaS
177
+
178
+ FraCaS is a natural language inference (NLI) task where the model must classify the relationship between a premise and a hypothesis (entailment, contradiction, or neutral) based on complex linguistic phenomena such as quantifiers, plurality, anaphora, and ellipsis.
179
 
180
+ ## DACCORD
181
  Determine if a French sentence makes sense semantically (binary label).
182
 
183
+ ## Fr-BoolQ
184
  Boolean question answering in French: answer true/false based on context.
185
 
186
+ ## MNLI-nineeleven-Fr-MT
187
  French machine-translated version of MNLI using 9/11 context, for entailment classification.
188
 
189
+ ## RTE3-Fr
190
  French version of RTE3 for textual entailment recognition.
191
 
192
+ ## MultiBLiMP-Fr
193
+ MultiBLiMP-Fr is a grammatical judgment task where the model must identify the grammatically correct sentence from a minimal pair differing by a single targeted feature, thereby assessing its knowledge of French syntax, morphology, and agreement.
194
 
195
+ ## MMS-fr
196
+ MMS-fr is a sentiment analysis task where the model classifies a French text as positive (2), neutral (1), or negative (0), assessing its ability to detect sentiment across diverse domains and sources.
197
+ ## WSD-Fr
198
+ WSD-Fr is a word sense disambiguation task where the model must identify the correct meaning of an ambiguous verb in context, as part of the FLUE benchmark.
199
 
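Most of the tasks above reduce to predicting a label from one or two text fields, so a single accuracy loop covers them. Below is a minimal sketch, assuming each example exposes a `label` field as in the schemas that follow; `predict_label` is a hypothetical stand-in for an actual model, not part of COLLE:

```python
# Generic accuracy loop for the classification-style tasks above
# (sentiment, NLI, acceptability). predict_label is a hypothetical
# model call mapping one example dict to a predicted label.
def accuracy(examples, predict_label):
    correct = sum(predict_label(ex) == ex["label"] for ex in examples)
    return correct / len(examples)
```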
200
  ## Language
201
The language data in COLLE is in French.
202
 
203
  ### Dataset structure
204
+ ## Allocine.fr:
205
  ```json
206
 
207
  {
 
209
  "label": 1
210
  }
211
  ```
212
+ ## PAWS-X:
213
  ```json
214
  {
215
  "id": 12,
 
218
  "label": 0
219
  }
220
  ```
221
+ ## FQuAD:
222
  ```json
223
  {
224
  "title": "pégase_23_3",
 
238
  "sent2": "Le gâteau à la glace."
239
  }
240
  ```
241
+ ## GQNLI-fr:
242
  ```json
243
  {
244
  "uid": 214,
 
250
  "hypothesis_original": "One beige bear runs."
251
  }
252
  ```
253
+ ## PIAF:
254
  ```json
255
  {
256
 
 
261
  "answer_start":[222]}
262
  }
263
  ```
264
+ ## SICK-fr:
265
  ```json
266
  {
267
  "Unnamed: 0": 5,
 
271
  "sentence_B": "Il n'y a pas de lutte et d'étreinte de chiens."
272
  }
273
  ```
274
+ ## XNLI-fr:
275
  ```json
276
 
277
  {
 
280
  "label": 1
281
  }
282
  ```
283
+ ## QFrCoLA:
 
284
  ```json
285
  {
286
  "label": 1,
 
289
  "category": "anglicism"
290
  }
291
  ```
292
+ ## Wino-X-LM:
293
  ```json
294
  {
295
  "qID": "3UDTAB6HH8D37OQL3O6F3GXEEOF09Z-1",
 
305
  "context_referent_of_option2_fr": "vase"
306
  }
307
  ```
308
+ ## Wino-X-MT
309
  ```json
310
  {
311
  "qID": "3FULMHZ7OUVKJ7S9R0LMS753751M44-1",
 
323
  "false_translation_referent_of_pronoun2_fr": "arme"
324
  }
325
  ```
326
+ ## RTE3-Fr
327
  ```json
328
  {
329
  "id": "1",
 
338
 
339
  ```
340
 
341
+ ## DACCORD
342
  ```json
343
  {
344
  "id": "a001",
 
350
  "genre": "conflit ukrainien-russe"
351
  }
352
  ```
353
+ ## QFrBLiMP:
354
  ```json
355
  {
356
  "id": 250,
 
368
  "answer": "accept"
369
  }
370
  ```
371
+ ## STS22:
372
  ```json
373
  {
374
  "id": "1559147599_1558534688",
 
377
  "sentence2": "Le décret n° 2020-293 du 23 mars 2020..."
378
  }
379
  ```
380
+ ## Fr-BoolQ
381
  ```json
382
 
383
  {
 
386
  "label": 1
387
  }
388
  ```
389
+ ## QFrCoRE
390
 
391
  ```json
392
  {
 
407
  "reference": "https://canada-media.ca/expressions-quebecoises/"
408
  }
409
  ```
410
+ ## QFrCoRT
411
+
412
+ ```json
413
+ {
414
+ "terme": "Avoir la chienne",
415
+ "choices": [
416
+ "Prendre une chaise et s'asseoir.",
417
+ "Avoir du plaisir, parfois avec une connotation sexuelle.",
418
+ "Prépare-toi, ça va brasser.",
419
+ "Tomber amoureux.",
420
+ "Être en pleine forme.",
421
+ "Critiquer sévèrement.",
422
+ "Personne inefficace, qui ne travaille pas bien.",
423
+ "Il se comporte mal en public.",
424
+ "Se détendre, arrêter de s'énerver.",
425
+ "Avoir peur."
426
+ ],
427
+ "correct_index": 9,
428
+ "reference": "https://canada-media.ca/expressions-quebecoises/"
429
+ }
430
+ ```
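For illustration, answering a QFrCoRT (or QFrCoRE) item reduces to scoring each candidate definition against the term and picking the best-scoring one. A minimal sketch; `score_pair` is a hypothetical stand-in for any similarity or LM-based scorer, and the `terme` field name follows the example above:

```python
# Hypothetical multiple-choice selection for a QFrCoRT item.
# score_pair(term, definition) stands in for any model-based scorer.
def pick_definition(example, score_pair):
    scores = [score_pair(example["terme"], c) for c in example["choices"]]
    predicted = max(range(len(scores)), key=scores.__getitem__)
    return predicted == example["correct_index"]
```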
431
+ ## FraCaS
432
+ ```json
433
+ {
434
+ "id": "1",
435
+ "premise": "Un Italien est devenu le plus grand ténor du monde.",
436
+ "hypothesis": "Il y a eu un Italien qui est devenu le plus grand ténor du monde.",
437
+ "label": "0",
438
+ "question": "Y a-t-il eu un Italien qui soit devenu le plus grand ténor du monde ?",
439
+ "answer": "yes",
440
+ "premises_original": "An Italian became the world's greatest tenor.",
441
+ "premise1": "Un Italien est devenu le plus grand ténor du monde.",
442
+ "premise1_original": "An Italian became the world's greatest tenor.",
443
+ "premise2": "",
444
+ "premise2_original": "",
445
+ "premise3": "",
446
+ "premise3_original": "",
447
+ "premise4": "",
448
+ "premise4_original": "",
449
+ "premise5": "",
450
+ "premise5_original": "",
451
+ "hypothesis_original": "There was an Italian who became the world's greatest tenor.",
452
+ "question_original": "Was there an Italian who became the world's greatest tenor?",
453
+ "note": "",
454
+ "topic": "GENERALIZED QUANTIFIERS"
455
+ }
456
+ ```
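Since FraCaS spreads multi-sentence premises across the numbered `premise1` through `premise5` fields (empty strings when unused), here is a small sketch of rebuilding the full premise text before classification:

```python
# Hypothetical helper that joins the non-empty numbered premise
# fields of a FraCaS example into one premise string.
def full_premise(example):
    parts = [example[f"premise{i}"] for i in range(1, 6)]
    return " ".join(p for p in parts if p)
```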
457
+
458
+ ## MNLI-nineeleven-Fr-MT
459
  ```json
460
  {
461
  "premise": "La faillite du nationalisme laïque et autocratique était évidente dans le monde musulman à la fin des années 1970.",
 
469
  }
470
 
471
  ```
472
+ ## MultiBLiMP-Fr
473
+
474
+ ```json
475
+ {
476
+ "sentence_a": "C'est le genre à lequel appartiennent les espèces de kiwi.",
477
+ "sentence_b": "C'est le genre à lequel appartenez les espèces de kiwi.",
478
+ "label": 0
479
+ }
480
+ ```
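A minimal pair like this is typically judged by whether the model scores the grammatical sentence higher. A minimal sketch, assuming `label: 0` means `sentence_a` is the correct one (an inference from this example, not stated above); `sentence_logprob` is a hypothetical language-model scorer:

```python
# Hypothetical minimal-pair judgment for MultiBLiMP-Fr.
# sentence_logprob(s) stands in for a language model's total
# log-probability of sentence s; higher means more fluent.
def judge_pair(example, sentence_logprob):
    prefers_a = sentence_logprob(example["sentence_a"]) > sentence_logprob(example["sentence_b"])
    predicted = 0 if prefers_a else 1
    return predicted == example["label"]
```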
481
+ ## WSD-Fr
482
+
483
+ ```json
484
+ {
485
+ "sentence": "Il rend hommage au roi de France et des négociations aboutissent au traité du Goulet , formalisant la paix entre les deux pays .",
486
+ "labels_idx": [10],
487
+ "label": "négociations"
488
+ }
489
+
490
+ ```
491
+ ## MMS-fr
492
+
493
+ ```json
494
+ {
495
+ "text": "Cadeaux pour ma fille.",
496
+ "label": 2
497
+ }
498
+ ```
499
+ ## Allocine.fr:
500
  | split | # examples |
501
  |------------|-----------:|
502
  | train | |
503
  | validation | 20000 |
504
  | test | 20000 |
505
+ ## PAWS-X:
506
  | split | # examples |
507
  |------------|-----------:|
508
  | train | 49401 |
509
  | validation | 2000 |
510
  | test | 2000 |
511
+ ## FQuAD:
512
  | split | # examples |
513
  |------------|-----------:|
514
  | validation | 100 |
515
  | test | 400 |
516
 
517
+ ## GQNLI-fr:
518
  | split | # examples |
519
  |------------|-----------:|
520
  | train | 243 |
521
  | validation | 27 |
522
  | test | 30 |
523
+ ## PIAF:
524
  | split | # examples |
525
  |------------|-----------:|
526
  | train | 3105 |
 
532
  | train | 4439 |
533
  | validation | 495 |
534
  | test | 4906 |
535
+ ## XNLI-fr:
536
  | split | # examples |
537
  |------------|-----------:|
538
+ | train | 393,000 |
539
  | validation | 2490 |
540
  | test | 5010 |
541
+ ## QFrCoLA:
542
  | split | # examples |
543
  |------------|-----------:|
544
  | train | 15846 |
545
  | validation | 1761 |
546
  | test | 7546 |
547
+ ## QFrBLiMP:
548
  | split | # examples |
549
  |------------|-----------:|
550
  | train | NA |
551
  | validation | 2061 |
552
  | test | 2290 |
553
+ ## STS22:
554
 
555
  | split | # examples |
556
  |------------|-----------:|
557
  | train | 101 |
558
  | test | 72 |
559
 
560
+ ## Fr-BoolQ
561
 
562
  | split | # examples |
563
  |------------|-----------:|
564
  | test | 178 |
565
+ ## SICK-fr:
566
 
567
  | split | # examples |
568
  |------------|-----------:|
569
+ | train | 4,439 |
570
+ | test | 2,000 |
571
+ | validation | 2,000 |
572
+
573
+ ## MNLI-nineeleven-Fr-MT
574
 
575
  | split | # examples |
576
  |------------|-----------:|
577
  | test | 2000 |
578
 
579
+ ## DACCORD
580
  | split | # examples |
581
  |------------|-----------:|
582
  | test | 1034 |
 
584
  | split | # examples |
585
  |------------|-----------:|
586
  | test | 800 |
587
+ | validation | 800 |
588
+ ## Wino-X-LM
589
  | split | # examples |
590
  |------------|-----------:|
591
  | test | 2793 |
592
+ ## Wino-X-MT
593
  | split | # examples |
594
  |------------|-----------:|
595
  | test | 2988 |
596
+ ## QFrCoRE
597
+ | split | # examples |
598
+ |------------|-----------:|
599
+ | test | 4,633 |
600
+ ## QFrCoRT
601
+ | split | # examples |
602
+ |------------|-----------:|
603
+ | test | 201 |
604
+ ## MultiBLiMP-Fr:
605
+
606
+ | split | # examples |
607
+ |------------|-----------:|
608
+ | train | 160 |
609
+ | test | 77 |
610
+ | validation | 18 |
611
+ ## MMS-fr:
612
+
613
+ | split | # examples |
614
+ |------------|-----------:|
615
+ | train | 132,696 |
616
+ | test | 63,190 |
617
+ | validation | 14,745 |
618
+ ## FraCaS
619
+ | split | # examples |
620
+ |------------|-----------:|
621
+ | test | 346 |
622
+ ## RTE3-Fr
623
+ | split | # examples |
624
+ |------------|-----------:|
625
+ | test | 3,121 |
626
+ | train | 269,821 |
627
 
628
 
629
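The split tables above correspond to per-task configurations of the suite. Below is a minimal loading sketch with the Hugging Face `datasets` library; the repository id `Amo999/COLLE` and the config name `allocine` are assumptions for illustration, not confirmed by this README:

```python
# Minimal loading sketch for one COLLE task. The repo id and config
# name below are assumptions; substitute the actual values.
from datasets import load_dataset

ds = load_dataset("Amo999/COLLE", "allocine", split="test")
print(ds[0])        # one record, matching the JSON schema above
print(ds.features)  # field names and types for this task
```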