mrapacz committed on
Commit bd235d3 · verified · 1 Parent(s): 78b9ea7

Update README.md

Files changed (1):
  1. README.md +29 -1

README.md CHANGED
@@ -45,6 +45,8 @@ dataset_info:
   num_examples: 794
   download_size: 2799395
   dataset_size: 10620651
+tags:
+- interlinear-translation
 ---
 
 # Dataset Card for Ancient Greek Interlinear Translations Dataset
@@ -92,4 +94,30 @@ Each entry contains:
 
 ## Dataset Card Authors
 
-Maciej Rapacz
+Maciej Rapacz
+
+## Citation
+
+```bibtex
+@inproceedings{rapacz-smywinski-pohl-2025-low,
+    title = "Low-Resource Interlinear Translation: Morphology-Enhanced Neural Models for {A}ncient {G}reek",
+    author = "Rapacz, Maciej and
+      Smywi{\'n}ski-Pohl, Aleksander",
+    editor = "Hettiarachchi, Hansi and
+      Ranasinghe, Tharindu and
+      Rayson, Paul and
+      Mitkov, Ruslan and
+      Gaber, Mohamed and
+      Premasiri, Damith and
+      Tan, Fiona Anting and
+      Uyangodage, Lasitha",
+    booktitle = "Proceedings of the First Workshop on Language Models for Low-Resource Languages",
+    month = jan,
+    year = "2025",
+    address = "Abu Dhabi, United Arab Emirates",
+    publisher = "Association for Computational Linguistics",
+    url = "https://aclanthology.org/2025.loreslm-1.11/",
+    pages = "145--165",
+    abstract = "Contemporary machine translation systems prioritize fluent, natural-sounding output with flexible word ordering. In contrast, interlinear translation maintains the source text's syntactic structure by aligning target language words directly beneath their source counterparts. Despite its importance in classical scholarship, automated approaches to interlinear translation remain understudied. We evaluated neural interlinear translation from Ancient Greek to English and Polish using four transformer-based models: two Ancient Greek-specialized (GreTa and PhilTa) and two general-purpose multilingual models (mT5-base and mT5-large). Our approach introduces novel morphological embedding layers and evaluates text preprocessing and tag set selection across 144 experimental configurations using a word-aligned parallel corpus of the Greek New Testament. Results show that morphological features through dedicated embedding layers significantly enhance translation quality, improving BLEU scores by 35{\%} (44.67 {\textrightarrow} 60.40) for English and 38{\%} (42.92 {\textrightarrow} 59.33) for Polish compared to baseline models. PhilTa achieves state-of-the-art performance for English, while mT5-large does so for Polish. Notably, PhilTa maintains stable performance using only 10{\%} of training data. Our findings challenge the assumption that modern neural architectures cannot benefit from explicit morphological annotations. While preprocessing strategies and tag set selection show minimal impact, the substantial gains from morphological embeddings demonstrate their value in low-resource scenarios."
+}
+```
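For context, the added `tags` entry lives at the top level of the dataset card's YAML front matter, next to the existing `dataset_info` block. A minimal sketch of the resulting metadata after this commit (the field values are taken from the diff above; the enclosing `splits` structure and split name are assumptions, since the diff only shows the trailing lines of `dataset_info`):

```yaml
---
dataset_info:
  splits:
  - name: train            # split name assumed; not visible in the diff
    num_examples: 794
  download_size: 2799395
  dataset_size: 10620651
tags:
- interlinear-translation  # tag added by this commit
---
```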