zzqsmall committed
Commit f8425ce · verified · 1 Parent(s): 63c5063

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -12,7 +12,7 @@ library_name: transformers
 
 # Ring-1T: Flow State Leads to Sudden Enlightenment
 
-Today, we officially launch the trillion-parameter thinking model, Ring-1T. It is open-source upon release—developers can download the model weights from Hugging Face and ModelScope, or experience direct chat interactions and API calls via the Ling Chat page and [ZenMux](https://zenmux.ai/inclusionai/ring-1t?utm_source=hf_inclusionAI) (links provided at the end of the article).
+Today, we officially launch the trillion-parameter thinking model, Ring-1T. It is open-source upon release—developers can download the model weights from Hugging Face and ModelScope, or experience direct chat interactions and API calls via the [Ling Chat](https://ling.tbox.cn/chat) page and [ZenMux](https://zenmux.ai/inclusionai/ring-1t?utm_source=hf_inclusionAI) (links provided at the end of the article).
 
 Building upon the preview version released at the end of last month, Ring-1T has undergone continued scaling with large-scale verifiable reward reinforcement learning (RLVR) training, further unlocking the natural language reasoning capabilities of the trillion-parameter foundation model. Through RLHF training, the model's general abilities have also been refined, making this release of Ring-1T more balanced in performance across various tasks.
 