radinplaid committed · Commit 6c17a1e · verified · 1 parent: a81dfb3

Update README.md


Add Aya and Hunyuan MT models to metrics table

Files changed (1): README.md (+10 -9)
README.md CHANGED
@@ -105,12 +105,13 @@ The model is in `ctranslate2` format, and the tokenizers are `sentencepiece`, so
 
 `bleu` and `chrf2` are calculated with [sacrebleu](https://github.com/mjpost/sacrebleu) on the [Flores200 `devtest` test set](https://huggingface.co/datasets/facebook/flores). `comet22` is calculated with the [`comet`](https://github.com/Unbabel/COMET) library and the [default model](https://huggingface.co/Unbabel/wmt22-comet-da). "Time (s)" is the time in seconds to translate the flores-devtest dataset (1012 sentences) on an RTX 4070s GPU with batch size 32.
 
-
- | | bleu | chrf2 | comet22 | Time (s) |
- |:---------------------------------|-------:|--------:|----------:|-----------:|
- | quickmt/quickmt-ar-en | 44.11 | 67.96 | 87.64 | 1.11 |
- | Helsinki-NLP/opus-mt-ar-en | 34.22 | 61.26 | 84.5 | 3.67 |
- | facebook/nllb-200-distilled-600M | 39.13 | 64.14 | 86.22 | 21.76 |
- | facebook/nllb-200-distilled-1.3B | 42.29 | 66.55 | 87.55 | 37.7 |
- | facebook/m2m100_418M | 29.41 | 57.68 | 82.21 | 18.53 |
- | facebook/m2m100_1.2B | 29.77 | 56.7 | 80.77 | 36.23 |
+ | | bleu | chrf2 | comet22 | Time (s) |
+ |:--------------------------------------------|-------:|--------:|----------:|-----------:|
+ | quickmt/quickmt-ar-en | 44.11 | 67.96 | 87.64 | 1.11 |
+ | Helsinki-NLP/opus-mt-ar-en | 34.22 | 61.26 | 84.5 | 3.67 |
+ | facebook/nllb-200-distilled-600M | 39.13 | 64.14 | 86.22 | 21.76 |
+ | facebook/nllb-200-distilled-1.3B | 42.29 | 66.55 | 87.55 | 37.7 |
+ | facebook/m2m100_418M | 29.41 | 57.68 | 82.21 | 18.53 |
+ | facebook/m2m100_1.2B | 29.77 | 56.7 | 80.77 | 36.23 |
+ | tencent/Hunyuan-MT-7B-fp8 | 29.48 | 61.62 | 88.37 | 28 |
+ | CohereLabs/aya-expanse-8b (vllm, bnb quant) | 39.90 | 65.57 | 89.1 | 74 |
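For reference, the paragraph in the diff describes how the table's scores are computed, and a minimal sketch of that computation is below. This is not the repo's actual evaluation script: it assumes a model's translations have already been written to a hypothetical `hyps.txt` (one detokenized sentence per line, aligned with Flores200 `devtest`), and it uses only the `sacrebleu`, `comet`, and `datasets` APIs named in the README text.

```python
# Minimal sketch of the metric computation described above. Assumption:
# hypotheses for one model already sit in a hypothetical hyps.txt, one
# detokenized sentence per line, aligned with Flores200 devtest.
import sacrebleu
from comet import download_model, load_from_checkpoint
from datasets import load_dataset

# Flores200 devtest for ar->en: 1012 aligned sentence pairs.
flores = load_dataset(
    "facebook/flores", "arb_Arab-eng_Latn",
    split="devtest", trust_remote_code=True,  # script-based dataset
)
sources = flores["sentence_arb_Arab"]
references = flores["sentence_eng_Latn"]

with open("hyps.txt", encoding="utf-8") as f:  # hypothetical model output
    hypotheses = [line.strip() for line in f]

# bleu / chrf2: sacrebleu takes a list of hypotheses plus a list of
# reference streams; its CHRF defaults (beta=2, no word n-grams) are chrF2.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
chrf2 = sacrebleu.corpus_chrf(hypotheses, [references])

# comet22: the default Unbabel/wmt22-comet-da model. system_score is on a
# 0-1 scale, so scale by 100 to match the table.
model = load_from_checkpoint(download_model("Unbabel/wmt22-comet-da"))
data = [{"src": s, "mt": h, "ref": r}
        for s, h, r in zip(sources, hypotheses, references)]
comet22 = 100 * model.predict(data, batch_size=32, gpus=1).system_score

print(f"bleu={bleu.score:.2f} chrf2={chrf2.score:.2f} comet22={comet22:.2f}")
```

The "Time (s)" column is not covered by this sketch; per the README text it is simply the wall-clock time to translate the 1012 devtest sentences at batch size 32 on an RTX 4070s.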