---
language:
- en
license: apache-2.0
library_name: gguf
tags:
- ruvltra
- sona
- adaptive-learning
- gguf
- quantized
- edge-device
- embedded
- iot
pipeline_tag: text-generation
---

<div align="center">

# RuvLTRA Small

[License: Apache-2.0](https://opensource.org/licenses/Apache-2.0)
[Model on Hugging Face](https://huggingface.co/ruv/ruvltra-small)
[GGUF format](https://github.com/ggerganov/ggml/blob/master/docs/gguf.md)

**📱 Compact Model Optimized for Edge Devices**

[Quick Start](#-quick-start) • [Use Cases](#-use-cases) • [Integration](#-integration)

</div>

---

## Overview

RuvLTRA Small is a compact 0.5-billion-parameter model designed for edge deployment, making it a good fit for mobile apps, IoT devices, and other resource-constrained environments.

## Model Card

| Property | Value |
|----------|-------|
| **Parameters** | 0.5 billion |
| **Quantization** | Q4_K_M |
| **Context length** | 4,096 tokens |
| **File size** | ~398 MB |
| **Min RAM** | 1 GB |
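
As a rough cross-check of the table above, the file size can be estimated from the parameter count. This is a back-of-the-envelope sketch: the ~4.8 bits-per-weight figure for Q4_K_M is an approximation, and real files run larger because K-quant formats store block scales and keep some tensors at higher precision.

```python
# Back-of-the-envelope size estimate for a Q4_K_M GGUF file.
# ASSUMPTION: Q4_K_M averages roughly 4.8 bits per weight; the real
# file (~398 MB here) is larger because some tensors (e.g. embeddings)
# are stored at higher precision.
PARAMS = 0.5e9           # 0.5B parameters
BITS_PER_WEIGHT = 4.8    # approximate effective rate for Q4_K_M

estimated_mb = PARAMS * BITS_PER_WEIGHT / 8 / 1e6
print(f"~{estimated_mb:.0f} MB of quantized weights")  # → ~300 MB of quantized weights
```

The gap between this estimate and the actual ~398 MB is expected overhead from mixed-precision tensors and GGUF metadata.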

## 🚀 Quick Start

```bash
# Download the quantized model
wget https://huggingface.co/ruv/ruvltra-small/resolve/main/ruvltra-0.5b-q4_k_m.gguf

# Run with llama.cpp
./llama-cli -m ruvltra-0.5b-q4_k_m.gguf -p "Hello, I am" -n 64
```
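
If you script the download, the `wget` URL above follows the Hub's `resolve/<revision>/<filename>` pattern. A tiny helper that rebuilds it (`gguf_url` is a hypothetical name for illustration, not part of any library):

```python
def gguf_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Build the direct-download URL for a file hosted on the Hugging Face Hub."""
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

print(gguf_url("ruv/ruvltra-small", "ruvltra-0.5b-q4_k_m.gguf"))
# → https://huggingface.co/ruv/ruvltra-small/resolve/main/ruvltra-0.5b-q4_k_m.gguf
```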

## 💡 Use Cases

- **Mobile apps**: on-device AI assistants
- **IoT**: smart-home device intelligence
- **Edge computing**: local inference without a cloud round-trip
- **Prototyping**: quick model experimentation

## 🔧 Integration

### Rust (RuvLLM)
```rust
use ruvllm::hub::ModelDownloader;

// Fetch the model from the Hugging Face Hub; `download` is async and
// `?` requires an error-propagating async context.
let path = ModelDownloader::new()
    .download("ruv/ruvltra-small", None)
    .await?;
```

### Python
```python
from huggingface_hub import hf_hub_download

# Returns the local path of the cached GGUF file
model_path = hf_hub_download("ruv/ruvltra-small", "ruvltra-0.5b-q4_k_m.gguf")
```
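
Partial or interrupted downloads are a common failure mode on constrained devices. Per the GGUF specification, every valid file starts with the 4-byte magic `GGUF`, so a minimal sanity check (a sketch, not part of `huggingface_hub`) is:

```python
def looks_like_gguf(path: str) -> bool:
    """Return True if the file starts with the GGUF magic bytes b'GGUF'."""
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"
```

Run it on the path returned by `hf_hub_download` before handing the file to your inference runtime; a truncated HTML error page, for example, will fail this check immediately.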

## Hardware Support

- ✅ Apple Silicon (M1/M2/M3)
- ✅ NVIDIA CUDA
- ✅ CPU (x86/ARM)
- ✅ Raspberry Pi 4/5

---

**License**: Apache 2.0 | **GitHub**: [ruvnet/ruvector](https://github.com/ruvnet/ruvector)