Update README.md
README.md CHANGED

@@ -8,6 +8,7 @@ language:
 - trp
 - njz
 - pnr
+- nag
 - eng
 - hin
 tags:
@@ -60,7 +61,6 @@ We evaluated NE-BERT against industry-standard multilingual models on a held-out
 | Model | Perplexity (PPL) | Verdict |
 | :--- | :--- | :--- |
 | **mBERT** (Google) | 9.46 | Poor Context |
-| **XLM-RoBERTa** (Meta) | 1.49 | Average |
 | **IndicBERT** (AI4Bharat) | 26.29 | High Confusion |
 | **NE-BERT (Ours)** | **5.28** | **Native-Level Fluency** |
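The perplexity (PPL) figures compared in the table are conventionally the exponential of the mean per-token negative log-likelihood on held-out text. A minimal sketch of that relationship (the `perplexity` helper and its inputs are illustrative, not part of this repository):

```python
import math

def perplexity(nll_per_token):
    """Perplexity = exp(mean negative log-likelihood per token, in nats)."""
    return math.exp(sum(nll_per_token) / len(nll_per_token))

# A uniform model over a 2-word vocabulary scores NLL = ln(2) on every token,
# so its perplexity is 2: it is exactly as "confused" as a fair coin flip.
print(perplexity([math.log(2)] * 3))  # ~ 2.0
```

Lower is better: a PPL of 5.28 means the model is, on average, about as uncertain as choosing uniformly among ~5 equally likely next tokens.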