Upload README.md with huggingface_hub
README.md CHANGED
@@ -1,90 +1,67 @@
 ---
 library_name: transformers
 license: apache-2.0
-license_link: https://huggingface.co/
 pipeline_tag: text-generation
-base_model:
-- Qwen/Qwen3-Coder-480B-A35B-Instruct
 tags:
-**Usage Warnings**
-
-- **Risk of Sensitive or Controversial Outputs**: This model’s safety filtering has been significantly reduced, potentially generating sensitive, controversial, or inappropriate content. Users should exercise caution and rigorously review generated outputs.
-
-- **Not Suitable for All Audiences**: Due to limited content filtering, the model’s outputs may be inappropriate for public settings, underage users, or applications requiring high security.
-
-- **Legal and Ethical Responsibilities**: Users must ensure their usage complies with local laws and ethical standards. Generated content may carry legal or ethical risks, and users are solely responsible for any consequences.
-
-- **Research and Experimental Use**: It is recommended to use this model for research, testing, or controlled environments, avoiding direct use in production or public-facing commercial applications.
-
-- **Monitoring and Review Recommendations**: Users are strongly advised to monitor model outputs in real-time and conduct manual reviews when necessary to prevent the dissemination of inappropriate content.
-
-- **No Default Safety Guarantees**: Unlike standard models, this model has not undergone rigorous safety optimization. huihui.ai bears no responsibility for any consequences arising from its use.
-
 ---

-#
-
-This is a crude, proof-of-concept implementation to remove refusals from an LLM model without using TransformerLens.
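The refusal-removal ("abliteration") technique the removed README refers to is commonly implemented by projecting a learned "refusal direction" out of the model's weight matrices. A minimal NumPy sketch of that idea — illustrative only, with a random matrix and direction standing in for real model weights, not the repository's actual code:

```python
import numpy as np

def ablate_direction(W: np.ndarray, d: np.ndarray) -> np.ndarray:
    """Remove the component of each row of W along direction d,
    so the layer can no longer write along d."""
    d = d / np.linalg.norm(d)       # unit "refusal direction"
    return W - np.outer(W @ d, d)   # project d out of every row

# Toy stand-ins for a weight matrix and a refusal direction.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))
d = rng.normal(size=4)
W_abl = ablate_direction(W, d)

# After ablation, no row of W_abl has any component along d.
print(np.allclose(W_abl @ (d / np.linalg.norm(d)), 0.0))  # True
```

In practice the direction is estimated from activation differences between refused and complied prompts, and the projection is applied to the relevant attention/MLP output matrices.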
-
-```
-ollama run huihui_ai/qwen3-coder-abliterated:480b-a35b-instruct-q3_K_M --verbose
-```
-
-```
-ollama run huihui_ai/qwen3-coder-abliterated:480b-a35b-instruct-q4_K_M --verbose
-```
-
 ```
-huggingface-cli download huihui-ai/Huihui-Qwen3-Coder-480B-A35B-Instruct-abliterated-GGUF --local-dir ./huihui-ai/Huihui-Qwen3-Coder-480B-A35B-Instruct-abliterated-GGUF --token xxx
-
-mkdir huihui-ai/Huihui-Qwen3-Coder-480B-A35B-Instruct-abliterated-GGUF/Q3_K_M-GGUF
 ```

-##
-
-- **Risk of Sensitive or Controversial Outputs**: This model’s safety filtering has been significantly reduced, potentially generating sensitive, controversial, or inappropriate content. Users should exercise caution and rigorously review generated outputs.
-
-- **Not Suitable for All Audiences**: Due to limited content filtering, the model’s outputs may be inappropriate for public settings, underage users, or applications requiring high security.
-
-### Donation
-
-If you like it, please click 'like' and follow us for more updates.
-You can follow [x.com/support_huihui](https://x.com/support_huihui) to get the latest model information from huihui.ai.
-
-##### Your donation helps us continue our further development and improvement, a cup of coffee can do it.
-- bitcoin(BTC):
-```
-bc1qqnkhuchxw0zqjh2ku3lu4hq45hc6gy84uk70ge
-```
-- Support our work on Ko-fi (https://ko-fi.com/huihuiai)!

 ---
 library_name: transformers
 license: apache-2.0
+license_link: https://huggingface.co/zooai/coder-1-gguf/blob/main/LICENSE
 pipeline_tag: text-generation
 tags:
+- zoo
+- coder
+- coding
+- a3b
+- gguf
+- quantized
 ---

+# Zoo Coder-1 GGUF (Quantized Coding Model)
+
+<a href="https://zoo.ngo/" target="_blank" style="margin: 2px;">
+  <img alt="Zoo AI" src="https://img.shields.io/badge/💻%20Zoo%20Coder--1%20-EF4444" style="display: inline-block; vertical-align: middle;"/>
+</a>
+<a href="https://zoo.ngo/" target="_blank" style="margin: 2px;">
+  <img alt="501(c)(3)" src="https://img.shields.io/badge/501(c)(3)-Nonprofit-blue" style="display: inline-block; vertical-align: middle;"/>
+</a>
+
+## Overview

+**Zoo Coder-1 GGUF** provides quantized versions of our enterprise-grade coding AI model. These GGUF-formatted models enable efficient deployment across various hardware configurations while maintaining excellent coding capabilities.
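The quantization the new README describes can be illustrated in a few lines: GGUF-style quantizers group weights into blocks and store each block as low-bit integers plus one floating-point scale. A simplified sketch of that idea — not the actual GGUF Q4 format, which is more elaborate:

```python
# Simplified block quantization in the spirit of GGUF's 4-bit variants:
# each block of weights becomes signed 4-bit ints plus one shared scale.
# Illustrative only; the real k-quant formats add sub-block scales.

def quantize_block(block):
    """Map floats to signed 4-bit ints in [-8, 7] with a shared scale."""
    scale = max(abs(v) for v in block) / 7 or 1.0
    q = [max(-8, min(7, round(v / scale))) for v in block]
    return q, scale

def dequantize_block(q, scale):
    return [v * scale for v in q]

weights = [0.12, -0.53, 0.98, -0.06, 0.41, -0.88, 0.33, 0.02]
q, scale = quantize_block(weights)
restored = dequantize_block(q, scale)

# Each restored value is within half a quantization step of the original.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(max_err <= scale / 2)  # True
```

Storing 4-bit codes plus a per-block scale is what shrinks the file to roughly an eighth of its fp32 size while keeping reconstruction error bounded per block.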

+## Model Details
+
+- **Base**: Qwen3-Coder with A3B technology
+- **Format**: GGUF quantized
+- **Context**: 32K tokens (extensible to 128K)
+- **Languages**: Python, JavaScript, TypeScript, Go, Rust, Java, C++, and 50+ more

+## Available Quantizations
+
+| Variant | Size   | RAM Required | Use Case                       |
+|---------|--------|--------------|--------------------------------|
+| Q2_K    | ~2GB   | 4GB          | Edge devices, prototyping      |
+| Q3_K_M  | ~2.5GB | 5GB          | Mobile, lightweight servers    |
+| Q4_K_M  | ~3.2GB | 6GB          | **Recommended** - Best balance |
+| Q5_K_M  | ~4GB   | 7GB          | High-quality production        |
+| Q6_K    | ~5GB   | 8GB          | Maximum quality                |
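Size figures like those in the table scale roughly with bits-per-weight: file size ≈ parameters × bits ÷ 8, with runtime RAM adding KV-cache and overhead on top. A back-of-the-envelope helper — the bits-per-weight values are approximate community figures for llama.cpp k-quants, not official numbers:

```python
# Rough GGUF file-size estimate: parameters * bits-per-weight / 8.
# Bits-per-weight below are approximate figures for llama.cpp k-quants
# (assumed, not measured from this repository's files).
APPROX_BPW = {"Q2_K": 2.6, "Q3_K_M": 3.9, "Q4_K_M": 4.8, "Q5_K_M": 5.7, "Q6_K": 6.6}

def estimated_size_gb(n_params_billions: float, variant: str) -> float:
    """Approximate on-disk size in GB for a given quantization."""
    return n_params_billions * APPROX_BPW[variant] / 8

# Example: a hypothetical 7B-parameter model at Q4_K_M.
print(round(estimated_size_gb(7, "Q4_K_M"), 1))  # 4.2
```

This estimates disk size only; plan extra RAM for the context window and runtime buffers, as the table's "RAM Required" column does.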

+## Quick Start
+
+### With llama.cpp
+```bash
+./main -m Q4_K_M-GGUF/Q4_K_M-GGUF-00001-of-00032.gguf \
+  -p "Write a Python function to calculate fibonacci numbers"
 ```

+### With Zoo Desktop
+```bash
+zoo model download coder-1-gguf
 ```

+## About Zoo AI
+
+Zoo Labs Foundation Inc is a 501(c)(3) nonprofit organization pioneering accessible AI infrastructure.
+
+- **Website**: [zoo.ngo](https://zoo.ngo)
+- **HuggingFace**: [huggingface.co/zooai](https://huggingface.co/zooai)
+
+## License
+
+Apache 2.0