## Model Performance

| Metric | Baseline (Twitter-RoBERTa) | **Twitch-RoBERTa (This Model)** |
| :--- | :--- | :--- |
| **Perplexity** | ~21,375 | **~5.5** |
| **Loss** | 9.97 | **1.7** |
**Result:** A **~83% reduction in loss** (9.97 → 1.7), driving perplexity from ~21,375 down to ~5.5 and effectively teaching the model the specific vocabulary, syntax, and emote usage patterns of the Twitch community.
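As a sanity check (a minimal sketch, not part of the original README): for language-model training, perplexity is simply `exp(cross-entropy loss)`, so the perplexity figures in the table follow directly from the reported losses.

```python
import math

# Perplexity = exp(cross-entropy loss).
# The losses below are the values reported in the table above.
baseline_loss = 9.97   # Twitter-RoBERTa on Twitch chat
finetuned_loss = 1.7   # Twitch-RoBERTa (this model)

baseline_ppl = math.exp(baseline_loss)    # ≈ 21,375
finetuned_ppl = math.exp(finetuned_loss)  # ≈ 5.47

print(f"Baseline perplexity:  {baseline_ppl:,.0f}")
print(f"Finetuned perplexity: {finetuned_ppl:.2f}")
```

Note that the perplexities match the table: exp(9.97) ≈ 21,375 and exp(1.7) ≈ 5.5.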