A 60M parameter language model trained on the FineWeb dataset.
aixsim-60M is a transformer-based language model with approximately 60 million parameters (embedding layer parameters excluded). It uses RMSNorm for normalization and is trained on the FineWeb dataset.

- **Developed by:** AICrossSim
- **Funded by:** [ARIA](https://www.aria.org.uk/)
- **Model type:** Transformer Language Model
- **Language(s) (NLP):** English
- **License:** odc-by
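For reference, RMSNorm differs from standard LayerNorm in that it skips mean-centering and rescales activations by their root mean square alone, multiplied by a learned gain. A minimal NumPy sketch of the idea (a hypothetical illustration, not the model's actual implementation):

```python
import numpy as np

def rms_norm(x, weight, eps=1e-6):
    # Root-mean-square normalization over the last (feature) dimension:
    # divide by RMS(x), then apply a learned per-feature gain `weight`.
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return x / rms * weight

# Example: normalize a batch of hidden states of width 8.
hidden = np.random.randn(4, 8)
gain = np.ones(8)  # learned parameter, initialized to 1
normed = rms_norm(hidden, gain)
```

After normalization, each row of `normed` has an RMS of approximately 1 (before the gain is learned away from its initial value).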