Update README.md
README.md CHANGED
@@ -28,6 +28,6 @@ We are thrilled to introduce a specialized collection of **68 large language mod
 - **RoBERTa Architecture:** The suite includes both a **base model** with **125 million parameters** and a **smaller variant** with **51 million parameters**—totalling 34 pre-trained models. 🎯
 - **Two distinct pre-training durations:** We also introduce a series of models to explore the impact of further pre-training. These models are pre-trained for an additional 5 epochs, extending the total pre-training epochs to 10.
 
-Stay tuned for
+Stay tuned for upcoming updates and new features for FinText. We expect to launch stages 2 and 3 within the next 9 months. 🎉
 
 </div>
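The "125 million parameters" quoted for the base model follows from the standard RoBERTa-base hyperparameters (12 layers, hidden size 768, FFN size 3072, a 50,265-token vocabulary). A minimal sketch of that arithmetic, assuming those standard values; the `transformer_params` helper below is illustrative and not part of FinText:

```python
def transformer_params(vocab=50265, max_pos=514, hidden=768,
                       layers=12, intermediate=3072, type_vocab=1):
    """Rough parameter count for a RoBERTa-style encoder
    (embeddings + encoder layers + pooler; biases and LayerNorms included)."""
    emb = (vocab * hidden            # token embeddings
           + max_pos * hidden        # position embeddings
           + type_vocab * hidden     # token-type embeddings
           + 2 * hidden)             # embedding LayerNorm (weight + bias)
    per_layer = (4 * (hidden * hidden + hidden)          # Q, K, V, output projections
                 + 2 * hidden                            # attention LayerNorm
                 + hidden * intermediate + intermediate  # FFN up-projection
                 + intermediate * hidden + hidden        # FFN down-projection
                 + 2 * hidden)                           # output LayerNorm
    pooler = hidden * hidden + hidden
    return emb + layers * per_layer + pooler

print(f"{transformer_params() / 1e6:.1f}M")  # → 124.6M, commonly rounded to 125M
```

The 51M "smaller variant" would come from shrinking the same formula's inputs (fewer layers and/or a smaller hidden size); its exact configuration is not stated in this diff, so no values are assumed here.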