---
license: apache-2.0
library_name: transformers
language:
- en
pipeline_tag: image-text-to-text
tags:
- text-generation
- instruct
- coding
- research
- qwen
- hyze
- Hitesh
metrics:
- accuracy
base_model:
- Qwen/Qwen3-VL-30B-A3B-Instruct
---

<p align="center">
<img src="https://i.imgur.com/ePJMLNp.png" alt="Hyze Logo" width="370"/>
</p>

<p align="center">
<img src="https://img.alicdn.com/imgextra/i4/O1CN01a6pmNi24dfWQwmMp3_!!6000000007414-2-tps-270-90.png" alt="Qwen Logo" width="220"/>
</p>

<h1 align="center">HyzeQwenInstruct-30B</h1>

<p align="center">
A high-performance instruction-tuned model by <b>Hyze AI</b>, built for coding and research.
</p>

<p align="center">
<a href="https://hyzeai.vercel.app">hyzeai.vercel.app</a> •
<a href="https://hyzedocs.vercel.app">hyzedocs.vercel.app</a> •
<a href="https://hyzecode.vercel.app">hyzecode.vercel.app</a>
</p>

---

## Overview

**HyzeQwenInstruct-30B** is a 30-billion-parameter instruction-tuned large language model optimized for:

- Advanced code generation
- Technical research & reasoning
- Deep, structured explanations
- Strong instruction following

It is designed for developers, engineers, and researchers who need powerful AI assistance.

---

## Training Focus

HyzeQwenInstruct-30B was optimized for:

### Coding
- Python, JavaScript, C++, and more
- Code completion & generation
- Debugging & refactoring
- Algorithm explanations

### Research & Technical Reasoning
- Structured, academic-style answers
- Scientific explanations
- Step-by-step reasoning
- Long-form responses

### Instruction Tuning
- Precise intent following
- Context retention
- Clean output formatting

---

## Benchmarks: Technical Comparison

| Model | Size | Notes |
|-------|------|-------|
| **HyzeQwenInstruct-30B** | 30B | Optimized for development + research |
| Qwen-30B-Instruct | 30B | Strong base alignment |
| GPT-NeoX-20B | 20B | Smaller parameter count |
| GPT-1 | 117M | Early-generation model |

### Performance Characteristics

- Strong code structure generation
- Clear technical explanations
- High instruction accuracy
- Suitable for professional workflows

> Comparisons are based on internal qualitative evaluation, not standardized benchmark scores.

---

## Usage

### Transformers (Python)

```python
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="HyzeAI/HyzeQwenInstruct-30B",
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # shard the 30B weights across available devices
)

print(generator(
    "Write a Python function to implement quicksort:",
    max_new_tokens=512,
))