# **Solar-Open-100B-NotaMoeQuant-Int4**

This repository provides **Upstage’s flagship model, [Solar-Open-100B](https://huggingface.co/upstage/Solar-Open-100B)**, packaged with [**Nota AI**](https://www.nota.ai/)’s proprietary quantization technique, developed specifically for Mixture-of-Experts (MoE)-based LLMs. Unlike conventional quantization methods, this approach is designed to mitigate the representation distortion that can occur when experts are mixed under quantization in MoE architectures.
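For intuition only, the sketch below shows what Int4 weight quantization means in the generic case: a simple symmetric per-channel scheme that maps float weights to integers in [-8, 7] with one scale per output channel. This is an illustrative baseline, not Nota AI's proprietary MoE-aware method, and all names here are hypothetical.

```python
import numpy as np

def quantize_int4(w, axis=0):
    """Symmetric per-channel int4 quantization: map floats to integers in [-8, 7]."""
    max_abs = np.max(np.abs(w), axis=axis, keepdims=True)
    scale = np.maximum(max_abs / 7.0, 1e-12)  # 7 = largest positive int4 value; guard zeros
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int4 codes and per-channel scales."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 8)).astype(np.float32)
q, scale = quantize_int4(w)
w_hat = dequantize(q, scale)
# Reconstruction error is bounded by scale / 2 per channel.
print(np.max(np.abs(w - w_hat)))
```

A per-channel scheme like this keeps each expert's weight range intact individually; the distortion the README refers to arises because, in an MoE layer, outputs of several independently quantized experts are mixed, which is what a quantization method tuned for MoE has to account for.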
## Overview