---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- text-generation-inference
- code
- Hitesh_V_Founder
---
<p align="center">
<img src="https://i.imgur.com/ePJMLNp.png" alt="Hyze Logo" width="405"/>
</p>
<h1 align="center">HyzeMini</h1>
<p align="center">
A lightweight text-generation model by <b>Hyze AI</b>
</p>
<p align="center">
  <a href="https://hyzebot.vercel.app">hyzebot.vercel.app</a> •
  <a href="https://hyzedocs.vercel.app">hyzedocs.vercel.app</a> •
  <a href="https://hyzecode.vercel.app">hyzecode.vercel.app</a>
</p>
---
## Overview
**HyzeMini** is a compact and efficient **text-generation transformer model** optimized for **Space & Astronomy knowledge** and **General Chat**.
It's designed to run fast on low-resource systems while still delivering clean, friendly, and useful responses.
- **Model type:** Transformer-based LLM
- **Parameters:** ~0.1B
- **Precision:** BF16
- **Language:** English
- **License:** Apache-2.0
---
## Training Focus
HyzeMini was trained on a curated mixture of **publicly available English datasets**, with emphasis on:
- **Space & Astronomy**
- Planets, stars, galaxies
- Rockets, missions, and space science
- Beginner to intermediate explanations
- **General Chat**
- Casual conversation
- Q&A-style prompts
- Friendly assistant tone
---
## About the Founder
Hyze AI was created by Hitesh Vinothkumar, a 12-year-old developer.
Hyze focuses on learning, experimentation, and open access, blending software engineering with curiosity about the universe.
---
## Benchmarks (Qualitative Comparison)
HyzeMini focuses on **speed, coherence, and domain knowledge** rather than raw reasoning power.
| Model | Size | Strengths | Tradeoffs |
|-----|-----|---------|----------|
| **HyzeMini** | ~0.1B | Space-focused knowledge, fast, chat-friendly | Limited deep reasoning |
| **TinyLlama** | ~0.1B | Solid general generation | More generic responses |
| **GPT-Neo 125M** | ~0.125B | Better general reasoning | Slower, higher memory |
| **GPT-1** | ~0.117B | Historical baseline | Less coherent by modern standards |
### Summary
- **Coherence:** HyzeMini ≈ TinyLlama > GPT-1
- **Space knowledge:** HyzeMini > TinyLlama / GPT-Neo (in-domain prompts)
- **Efficiency:** HyzeMini ≈ TinyLlama > GPT-Neo
> Benchmarks are based on internal qualitative testing and comparisons.
---
## Usage
### Transformers (Python)
```python
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="HyzeAI/HyzeMini",
)

# pipeline() returns a list of dicts; grab the generated text
print(generator("Tell me a cool space fact:")[0]["generated_text"])
```
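For finer control, a hedged sketch of loading the model with the lower-level `transformers` API is shown below. The repo id `HyzeAI/HyzeMini` is taken from the usage example above; the chat-style prompt wrapper and all generation parameters (`max_new_tokens`, `top_p`) are illustrative assumptions, not documented values for this model.

```python
# Sketch: loading HyzeMini in bfloat16 (matching the BF16 precision listed
# in the Overview) and generating with explicit sampling parameters.
# NOTE: build_prompt() and the generation settings are illustrative
# assumptions, not part of the official model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


def build_prompt(question: str) -> str:
    """Wrap a user question in a simple chat-style prompt (illustrative)."""
    return f"User: {question}\nAssistant:"


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("HyzeAI/HyzeMini")
    model = AutoModelForCausalLM.from_pretrained(
        "HyzeAI/HyzeMini",
        torch_dtype=torch.bfloat16,  # BF16, as listed in the Overview
    )
    inputs = tokenizer(build_prompt("What is a neutron star?"), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The `__main__` guard keeps the download out of imports; on a CPU-only machine you can drop `torch_dtype` and let the default float32 weights load instead.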