Original model: Stellar-Odyssey-12b-v0.0 by LyraNovaHeart

Available ExLlamaV3 (release v0.0.18) quantizations

Requirements: a Python installation with the huggingface-hub module, in order to use the download CLI.
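As a sketch of the download step, the commands below install the CLI and fetch one quantization from this repository. The branch name `4.0bpw` is a hypothetical example; check the repository's branch list for the actual quantization revisions available.

```shell
# Install huggingface_hub with the CLI extra (assumption: a pip-based environment)
pip install -U "huggingface_hub[cli]"

# Download one quantization revision into a local directory.
# "4.0bpw" is a placeholder revision name -- substitute a real branch from the repo.
huggingface-cli download DeathGodlike/Stellar-Odyssey-12b-v0.0_EXL3 \
  --revision 4.0bpw \
  --local-dir Stellar-Odyssey-12b-v0.0_EXL3-4.0bpw
```

Downloading per revision keeps each bits-per-weight variant in its own directory, so multiple quantizations can coexist locally.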

Licensing: the license for the provided quantized models is derived from the original model (see the source link above).


Model repository: DeathGodlike/Stellar-Odyssey-12b-v0.0_EXL3