tongsim committed commit 964a3da · verified · 1 parent: 6ea97e1

Update README.md

Files changed (1):
  1. README.md +4 -1
README.md CHANGED

@@ -16,9 +16,12 @@ viewer: false
 [![GitHub](https://img.shields.io/badge/GitHub-TongSIM-blue)](https://github.com/bigai-ai/tongsim)
 [![Paper](https://img.shields.io/badge/Paper-arXiv-red)](https://arxiv.org/abs/2512.20206)
 
-
 </div>
 
+# GitHub
+
+Visit GitHub: https://tongsim-platform.github.io/tongsim
+
 # What is TongSIM-Asset?
 
 As artificial intelligence (AI) rapidly advances, especially in multimodal large language models, research focus is shifting from single-modality text processing to the more complex domains of multimodal and embodied AI. Embodied intelligence focuses on training agents within realistic simulated environments, leveraging physical interaction and action feedback rather than conventionally labeled datasets. To foster embodied AI research, we introduce [`TongSIM`](https://github.com/bigai-ai/tongsim), a high-fidelity, general-purpose platform for training and evaluating embodied agents.