siyoungpark committed on
Commit b87796b · verified · 1 Parent(s): c90f8f3

Update README.md from tmp

Files changed (1)
  1. README.md +5 -5
README.md CHANGED
@@ -4,7 +4,7 @@ language:
 - ko
 library_name: transformers
 license: other
-license_name: solar-apache-2.0
+license_name: upstage-solar-license
 pipeline_tag: text-generation
 tags:
 - upstage
@@ -21,7 +21,7 @@ arxiv: 2601.07022
 
 # **Solar Open**
 
-**Solar Open** is Upstage's flagship **102B-parameter** large language model, trained **entirely from scratch** and released under the **Solar-Apache License 2.0** (see [LICENSE](#license) for details). As a **Mixture-of-Experts (MoE)** architecture, it delivers enterprise-grade performance in reasoning, instruction-following, and agentic capabilities—all while prioritizing transparency and customization for the open-source community.
+**Solar Open** is Upstage's flagship **102B-parameter** large language model, trained **entirely from scratch** and released under the **Upstage Solar License** (see [LICENSE](#license) for details). As a **Mixture-of-Experts (MoE)** architecture, it delivers enterprise-grade performance in reasoning, instruction-following, and agentic capabilities—all while prioritizing transparency and customization for the open-source community.
 
 [**Technical Report**](https://huggingface.co/papers/2601.07022) | [**Project Page**](https://upstage.ai)
 
@@ -41,7 +41,7 @@ arxiv: 2601.07022
 * **Pre-training Tokens:** 19.7 Trillion
 * **Context Length:** 128k
 * **Training Hardware:** NVIDIA B200 GPUs
-* **License:** **Solar-Apache License 2.0** (See [LICENSE](#license))
+* **License:** **Upstage Solar License** (See [LICENSE](#license))
 * **Hardware Requirements:**
 * **Minimum:** 4x NVIDIA A100 (80GB)
 
@@ -52,7 +52,7 @@ This repository contains both model weights and code,
 which are licensed under different terms:
 
 1. MODEL WEIGHTS (*.safetensors)
-Licensed under **Solar-Apache License 2.0**
+Licensed under **Upstage Solar License**
 See: https://huggingface.co/upstage/Solar-Open-100B/blob/main/LICENSE
 
 2. CODE (*.py, *.json, *.jinja files)
@@ -215,7 +215,7 @@ vllm serve upstage/Solar-Open-100B \
 
 ## Public API Access
 
-The official API service for Solar Open is scheduled to launch publicly on **January**.
+The official API service for Solar Open is scheduled to launch publicly in **January**.
 
 * **Access:** Upstage Console (TBA)
 * **Documentation:** Upstage Console (TBA)
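
The hardware figure in the diff context above (minimum 4x NVIDIA A100 80GB for a 102B-parameter model) can be sanity-checked with back-of-the-envelope arithmetic. A rough sketch, assuming bf16 weights and ignoring KV cache, activations, and runtime overhead:

```python
# Back-of-the-envelope memory check for serving a 102B-parameter model.
# Assumptions: bf16 weights (2 bytes/param); KV cache, activations, and
# framework overhead are ignored.
PARAMS = 102e9
BYTES_PER_PARAM = 2                            # bf16 / fp16
weights_gb = PARAMS * BYTES_PER_PARAM / 1e9    # ~204 GB of raw weights

GPUS, GPU_GB = 4, 80                           # 4x NVIDIA A100 (80GB)
available_gb = GPUS * GPU_GB                   # 320 GB total

headroom_gb = available_gb - weights_gb        # ~116 GB left over
print(f"weights ~{weights_gb:.0f} GB vs {available_gb} GB available")
```

The weights alone leave roughly 116 GB across the four GPUs for the KV cache and activations, which is what the 128k context window has to fit into.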
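
The `license_name` change in the first hunk lives in the README's YAML front matter, which the Hub reads to populate the model card. A minimal sketch of how those keys can be extracted (hypothetical helper, not part of this repo; a real consumer would use a full YAML parser):

```python
def read_front_matter(text: str) -> dict:
    """Extract simple `key: value` pairs from the YAML block between the
    first two '---' fences (hypothetical helper; handles scalar keys only)."""
    _, _, rest = text.partition("---\n")      # drop everything before the opening fence
    block, _, _ = rest.partition("\n---")     # keep everything before the closing fence
    meta = {}
    for line in block.splitlines():
        key, sep, value = line.partition(":")
        if sep and value.strip() and not line.startswith("-"):
            meta[key.strip()] = value.strip()
    return meta

# Front matter as it appears after this commit.
readme = """---
language:
- ko
library_name: transformers
license: other
license_name: upstage-solar-license
pipeline_tag: text-generation
---
# **Solar Open**
"""
meta = read_front_matter(readme)
print(meta["license"], meta["license_name"])  # → other upstage-solar-license
```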