---
license: other
license_name: modelcloud
license_link: LICENSE
language:
- en
base_model:
- MiniMaxAI/MiniMax-M2
pipeline_tag: text-generation
tags:
- gptqmodel
- modelcloud
- chat
- minimax-m2
- minimax
- instruct
- int4
- gptq
- 4bit
- w4a16
---

This 4bit W4A16 model was quantized by [@Qubitium](https://x.com/qubitium) at ModelCloud using [GPTQModel](https://github.com/ModelCloud/GPTQModel).
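
A minimal loading sketch, assuming the GPTQModel Python API (`GPTQModel.load`, `model.generate`). The repo id below is a hypothetical placeholder; substitute this model's actual Hugging Face id.

```python
# Hypothetical placeholder -- replace with this model's real Hugging Face repo id.
MODEL_ID = "ModelCloud/MiniMax-M2-GPTQ-4bit"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the quantized checkpoint and return a completion for `prompt`.

    Downloads the quantized weights on first call and requires
    `pip install gptqmodel`, so the import is deferred to call time.
    """
    from gptqmodel import GPTQModel  # deferred: heavy optional dependency

    model = GPTQModel.load(MODEL_ID)
    out = model.generate(prompt, max_new_tokens=max_new_tokens)
    return model.tokenizer.decode(out[0], skip_special_tokens=True)
```

Calling `generate()` fetches the full quantized checkpoint, so it needs a GPU host with enough memory for the 4-bit weights.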
```
SPDX-License-Identifier: NonCommercial-ModelCloud
Copyright (c) 2024–2025 ModelCloud.ai
Released for non-profit use only.
Commercial use requires permission from [@qubitium](https://x.com/qubitium).
```
```
[ Passed Validation ] ARC_CHALLENGE: 0.697098976109215
```