Carnice-9B: hipfire quantized

kai-os/Carnice-9b quantized for hipfire, a Rust-native inference engine for AMD RDNA GPUs.

Carnice is a Hermes tool-use finetune of Qwen3.5-9B with browser, terminal, and reasoning capabilities.

Files

File            Quant        Size    Speed (RX 5700 XT)  Notes
carnice-9b.hf4  HF4 (4-bit)  4.5 GB  ~41 tok/s           Faster; fits in 8 GB VRAM
carnice-9b.hf6  HF6 (6-bit)  6.9 GB  ~32 tok/s           Better quality; needs 8 GB VRAM

Usage

# Install hipfire
curl -L https://raw.githubusercontent.com/Kaden-Schutt/hipfire/master/scripts/install.sh | bash

# Pull and run
hipfire pull carnice:9b        # HF4 (default)
hipfire pull carnice:9b-hf6    # HF6 (higher quality)
hipfire run carnice:9b

About

Carnice uses the Hermes chat template with <tool_call> XML tags for function calling. hipfire passes these through as regular tokens; the model generates tool calls naturally within the ChatML format.
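Because hipfire emits the tags verbatim rather than parsing them, callers extract tool calls from the raw output themselves. A minimal sketch of that parsing step, assuming the standard Hermes convention of a JSON object inside each <tool_call> tag (the `get_weather` tool, the sample output, and the helper function are hypothetical, not part of hipfire):

```python
import json
import re

def extract_tool_calls(text: str) -> list[dict]:
    """Pull every <tool_call>...</tool_call> JSON payload out of model output."""
    return [
        json.loads(payload)
        for payload in re.findall(r"<tool_call>(.*?)</tool_call>", text, re.DOTALL)
    ]

# Hypothetical Carnice output using the Hermes tool-call convention.
generated = (
    "Let me check the weather for you.\n"
    '<tool_call>{"name": "get_weather", "arguments": {"city": "Berlin"}}</tool_call>'
)

for call in extract_tool_calls(generated):
    print(call["name"], call["arguments"])
```

The DOTALL flag lets a tool call span multiple lines, which Hermes-style models sometimes produce when pretty-printing the JSON arguments.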

Carnice keeps the same hybrid DeltaNet architecture as base Qwen3.5. hipfire JIT-compiles kernels for any AMD GPU (RDNA 1-4, including APUs).

License

Apache 2.0
