ImanAndrea committed
Commit 23987e1 · verified · 1 Parent(s): 8bdc896

Update annotations for Iman/paper_11.txt

Files changed (1)
  1. annotations/Iman/paper_11.txt.json +0 -8
annotations/Iman/paper_11.txt.json CHANGED
@@ -1,12 +1,4 @@
  [
- {
-   "file": "paper_11.txt",
-   "start": 1960,
-   "end": 2457,
-   "label": "Coherence",
-   "user": "Iman",
-   "text": " Can a highperformance CLIR model be trained that can operate without having to rely on MT? To answer the question, instead of viewing the MT-based approach as a competing one, we propose to leverage its strength via knowledge distillation (KD) into an end-to-end CLIR model. KD (Hinton et al., 2014) is a powerful supervision technique typically used to distill the knowledge of a large teacher model about some task into a smaller student model (Mukherjee and Awadallah, 2020;Turc et al., 2020)."
- },
  {
    "file": "paper_11.txt",
    "start": 1025,
 