This model is part of a series of models trained for the ML4AL paper “Gotta ca…”
### Model Sources

- **Repository:** [https://github.com/NER-AncientLanguages/NERAncientGreekML4AL.git] (for data and training scripts)
- **Paper:** [https://aclanthology.org/2024.ml4al-1.16]

## Training Details

### Training Data

**Repository:** [https://github.com/NER-AncientLanguages/NERAncientGreekML4AL.git] (for data and training scripts). \
We thank the following projects for providing the training data: \
Digital Periegesis: [https://www.periegesis.org/en] \
Josh Kemp, annotated Odyssey: [https://medium.com/pelagios/beyond-translation-building-better-greek-scholars-561ab331a1bc] \
the STEPBible project: [https://github.com/STEPBible/STEPBible-Data] \
Perseus Digital Library, Deipnosophistae: [urn:cts:greekLit:tlg0008.tlg001.perseus-grc4] \
Learning Rate: Sampled uniformly between 1e-6 and 1e-4 \
Weight Decay: One of [0.1, 0.01, 0.001] \
Number of Training Epochs: One of [3, 4, 5, 6] \

For the final training of this model, the hyperparameters were: \
Learning Rate: 9.889410158465026e-05 \
Weight Decay: 0.1 \
Number of Training Epochs: 5
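The search space above can be sketched as follows; this is a minimal illustration in plain Python (the card does not state which search tooling was used, and the function name and seed are ours):

```python
import random

def sample_hyperparameters(rng: random.Random) -> dict:
    """Draw one trial from the search space described in this card."""
    return {
        # Learning rate: sampled uniformly between 1e-6 and 1e-4
        "learning_rate": rng.uniform(1e-6, 1e-4),
        # Weight decay: one of [0.1, 0.01, 0.001]
        "weight_decay": rng.choice([0.1, 0.01, 0.001]),
        # Number of training epochs: one of [3, 4, 5, 6]
        "num_train_epochs": rng.choice([3, 4, 5, 6]),
    }

rng = random.Random(0)  # illustrative seed
print(sample_hyperparameters(rng))
```

Each trial is then used to train and evaluate a candidate model, and the best-performing trial (here, learning rate ≈ 9.89e-05, weight decay 0.1, 5 epochs) is kept for the final training run.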