hancheolp committed on
Commit 54028f4 · verified · 1 Parent(s): 3eee1a6

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -18,7 +18,7 @@ tags:
 
 # **Solar-Open-100B-NotaMoeQuant-Int4**
 
- This repository provides **Upstage’s flagship model, [Solar-Open-100B](https://huggingface.co/upstage/Solar-Open-100B)**, packaged with [**Nota AI**](https://www.nota.ai/)’s proprietary quantization technique specifically developed for **Mixture-of-Experts (MoE)-based LLMs**. Unlike conventional quantization methods, this approach incorporates a **novel method** designed to mitigate **representation distortion** that can occur when experts are mixed under quantization in MoE architectures.
+ This repository provides **Upstage’s flagship model, [Solar-Open-100B](https://huggingface.co/upstage/Solar-Open-100B)**, packaged with [**Nota AI**](https://www.nota.ai/)’s proprietary quantization technique specifically developed for Mixture-of-Experts (MoE)-based LLMs. Unlike conventional quantization methods, this approach incorporates a novel method designed to mitigate representation distortion that can occur when experts are mixed under quantization in MoE architectures.
 
 ## Overview
 