Link model card to technical report and project page

#22
by nielsr - opened
Files changed (1)
  1. README.md +12 -8
README.md CHANGED
@@ -2,16 +2,17 @@
  language:
  - en
  - ko
+ library_name: transformers
  license: other
  license_name: solar-apache-2.0
+ pipeline_tag: text-generation
  tags:
  - upstage
  - solar
  - moe
  - 100b
  - llm
- library_name: transformers
- pipeline_tag: text-generation
+ arxiv: 2601.07022
  ---

  <p align="center">
@@ -22,6 +23,8 @@ pipeline_tag: text-generation

  **Solar Open** is Upstage's flagship **102B-parameter** large language model, trained **entirely from scratch** and released under the **Solar-Apache License 2.0** (see [LICENSE](#license) for details). As a **Mixture-of-Experts (MoE)** architecture, it delivers enterprise-grade performance in reasoning, instruction-following, and agentic capabilities—all while prioritizing transparency and customization for the open-source community.

+ [**Technical Report**](https://huggingface.co/papers/2601.07022) | [**Project Page**](https://upstage.ai)
+
  ## Highlights

  * **MoE Architecture (102B / 12B):** Built on a Mixture-of-Experts architecture with **102B total / 12B active parameters**. This design delivers the knowledge depth of a massive model with the inference speed and cost-efficiency of a much smaller model.
@@ -42,7 +45,7 @@ pipeline_tag: text-generation
  * **Hardware Requirements:**
  * **Minimum:** 4x NVIDIA A100 (80GB)

- For more details, please refer to [Solar Open Technical Report](solar-open-technical-report.pdf).
+ For more details, please refer to the [Solar Open Technical Report](https://huggingface.co/papers/2601.07022).

  ## License
  This repository contains both model weights and code,
@@ -222,10 +225,11 @@ The official API service for Solar Open is scheduled to launch publicly on **Jan
  If you use Solar Open in your research, please cite:

  ```bibtex
- @misc{solar-open-2025,
- title={Solar Open: Scaling Upstage's LLM Capabilities with MoE},
- author={Upstage AI},
- year={2025},
- url={https://huggingface.co/Upstage/Solar-Open-100B}
+ @article{park2025solar,
+ title={Solar Open Technical Report},
+ author={Sungrae Park and Sanghoon Kim and Jungho Cho and Gyoungjin Gim and Dawoon Jung and Mikyoung Cha and Eunhae Choo and Taekgyu Hong and Minbyul Jeong and SeHwan Joo and Minsoo Khang and Eunwon Kim and Minjeong Kim and Sujeong Kim and Yunsu Kim and Hyeonju Lee and Seunghyun Lee and Sukyung Lee and Siyoung Park and Gyungin Shin and Inseo Song and Wonho Song and Seonghoon Yang and Seungyoun Yi and Sanghoon Yoon and Jeonghyun Ko and Seyoung Song and Keunwoo Choi and Hwalsuk Lee and Sunghun Kim and Du-Seong Chang and Kyunghyun Cho and Junsuk Choe and Hwaran Lee and Jae-Gil Lee and KyungTae Lim and Alice Oh},
+ journal={arXiv preprint arXiv:2601.07022},
+ year={2025},
+ url={https://huggingface.co/papers/2601.07022}
  }
  ```
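The frontmatter in this change keeps `library_name: transformers` and `pipeline_tag: text-generation`, so the card implies the weights load through the standard Transformers text-generation pipeline. A minimal sketch, assuming the repo id `Upstage/Solar-Open-100B` (taken from the previous citation URL) and multi-GPU hardware in line with the card's stated 4x A100 (80GB) minimum:

```python
# Minimal sketch, not an official usage snippet: it only reflects the card's
# `library_name: transformers` / `pipeline_tag: text-generation` metadata.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Upstage/Solar-Open-100B",  # repo id assumed from the old citation URL
    torch_dtype="auto",               # let Transformers pick the checkpoint dtype
    device_map="auto",                # shard the 102B MoE weights across available GPUs
)

output = generator("Solar Open is", max_new_tokens=32)
print(output[0]["generated_text"])
```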