Cheng98 committed (verified)
Commit 882e462 · Parent(s): 9c241e7

Update README.md

Files changed (1): README.md (+1 −0)
README.md CHANGED
@@ -20,6 +20,7 @@ A 60M parameter language model trained on the FineWeb dataset.
 aixsim-60M is a transformer-based language model with approximately 60 million parameters (embedding layer params excluded). It uses RMSNorm for normalization and is trained on the FineWeb dataset.
 
 - **Developed by:** AICrossSim
+- **Funded by:** [ARIA](https://www.aria.org.uk/)
 - **Model type:** Transformer Language Model
 - **Language(s) (NLP):** English
 - **License:** odc-by