We are thrilled to introduce a specialized collection of **68 large language models**.
- **Domain-Specific Training:** FinText utilises diverse financial datasets such as news articles, regulatory filings, IP records, corporate speeches (ECB, FED), and more.
- **Time-Period Specific Models:** Separate models are pre-trained for each year from **2007 to 2023**, ensuring precision and historical relevance.
- **RoBERTa Architecture:** The suite includes both a **base model** with **125 million parameters** and a **smaller variant** with **51 million parameters**.
- **Two distinct pre-training durations:** We also introduce a series of models to explore the impact of further pre-training.

Stay tuned for upcoming updates and new features for FinText. We expect to launch stages 2 and 3 within the next 9 months. 🎉
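Because a separate model exists for each year, downstream code has to pick the checkpoint matching a document's date (e.g. to avoid look-ahead bias in backtests). A minimal sketch of that selection logic is below; the `FinText/FinText-{size}-{year}` repo id format is a hypothetical naming scheme, not the published one — substitute the actual Hugging Face model ids.

```python
from datetime import date

# FinText publishes one model per year, 2007-2023 (per the list above).
FIRST_YEAR, LAST_YEAR = 2007, 2023


def fintext_checkpoint(doc_date: date, size: str = "base") -> str:
    """Pick the year-specific FinText checkpoint for a document's date.

    Years outside the covered range are clamped to the nearest endpoint.
    The returned repo id format is a hypothetical placeholder.
    """
    year = min(max(doc_date.year, FIRST_YEAR), LAST_YEAR)
    return f"FinText/FinText-{size}-{year}"  # hypothetical repo id


print(fintext_checkpoint(date(2015, 6, 1)))            # FinText/FinText-base-2015
print(fintext_checkpoint(date(2030, 1, 1), "small"))   # clamped to 2023
```

The resulting id would then be passed to something like `transformers.AutoModel.from_pretrained(...)` once the real repository names are known.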