Source model
Provided quantized models
ExLlamaV3: release v0.0.18
| Type | Size | CLI |
|---|---|---|
| H8-4.0BPW | 7.49 GB | Copy-paste the line / Download the batch file |
| H8-6.0BPW | 10.22 GB | Copy-paste the line / Download the batch file |
| H8-8.0BPW | 12.95 GB | Copy-paste the line / Download the batch file |
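As a rough sanity check on the sizes above: a quantized checkpoint occupies approximately parameters × bits-per-weight ÷ 8 bytes, with the listed files somewhat larger due to format overhead and tensors stored at higher precision. A minimal sketch (the ~12.25B parameter count is an assumption, not stated on this card):

```python
def est_size_gb(n_params, bpw):
    """Rough quantized checkpoint size: parameters * bits-per-weight / 8 bytes."""
    return n_params * bpw / 8 / 1e9

# Assumed ~12.25B parameters for a Mistral-Nemo-class 12B model.
for bpw in (4.0, 6.0, 8.0):
    print(f"{bpw} BPW ~ {est_size_gb(12.25e9, bpw):.2f} GB")
```

The estimates (about 6.1, 9.2, and 12.25 GB) track the table's 7.49, 10.22, and 12.95 GB once header and embedding overhead is accounted for.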
Requirements: a Python installation with the `huggingface-hub` package is needed to use the CLI.
Licensing
License detected: unknown
The license for the provided quantized models is inherited from the source model, which in turn incorporates the license of its original base model. For definitive licensing information, refer first to the pages of the source and base models. File and page backups of the source model are provided below.
Backups
Date: 17.01.2026
Source page
Nova-Mythra-12B
Overview
Nova-Mythra-12B was created through a multi-stage merge involving Hollow-Aether-12B, KiloNovaSynth-12B, NoctyxCosma-12B, Violet-Lyra-Gutenberg-v2, Lunar-Twilight-12B, and Tlacuilo-12B using a custom method.
Multi-stage Merge configuration
```yaml
name: First
models:
  - model: Marcjoni/KiloNovaSynth-12B
merge_method: sm2f
base_model: Vortex5/Hollow-Aether-12B
dtype: bfloat16
tokenizer:
  source: Vortex5/Hollow-Aether-12B
---
name: Second
models:
  - model: Vortex5/NoctyxCosma-12B
merge_method: sm2f
base_model: ohyeah1/Violet-Lyra-Gutenberg-v2
dtype: bfloat16
tokenizer:
  source: ohyeah1/Violet-Lyra-Gutenberg-v2
---
name: Third
models:
  - model: Vortex5/Lunar-Twilight-12B
merge_method: sm2f
base_model: allura-org/Tlacuilo-12B
dtype: bfloat16
tokenizer:
  source: allura-org/Tlacuilo-12B
---
# Final
models:
  - model: First
  - model: Second
  - model: Third
merge_method: karcher
chat_template: auto
dtype: bfloat16
parameters:
  tol: 1e-9
  max_iter: 1000
tokenizer:
  source: Vortex5/NoctyxCosma-12B
```
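The final stage's `karcher` method computes a Karcher (Fréchet) mean of the three intermediate merges, iterating until the update norm drops below `tol` or `max_iter` is reached. As an illustrative sketch only (not mergekit's implementation), here is a Karcher-mean iteration for unit vectors on the hypersphere:

```python
import numpy as np

def karcher_mean_sphere(points, tol=1e-9, max_iter=1000):
    """Karcher (Frechet) mean of unit vectors on the hypersphere.

    Illustrative sketch: repeatedly log-map the points into the tangent
    space at the current estimate, average, and exp-map back.
    """
    pts = np.array([p / np.linalg.norm(p) for p in points])
    # Initialize at the normalized Euclidean mean (assumes it is nonzero).
    mu = pts.mean(axis=0)
    mu /= np.linalg.norm(mu)
    for _ in range(max_iter):
        tangents = []
        for x in pts:
            c = np.clip(x @ mu, -1.0, 1.0)
            v = x - c * mu              # component of x orthogonal to mu
            nv = np.linalg.norm(v)
            # Log map: scale the unit tangent by the geodesic distance arccos(c).
            tangents.append((np.arccos(c) / nv) * v if nv > 1e-12
                            else np.zeros_like(mu))
        step = np.mean(tangents, axis=0)
        ns = np.linalg.norm(step)
        if ns < tol:
            break
        # Exp map: walk distance ns along the geodesic in direction step.
        mu = np.cos(ns) * mu + np.sin(ns) * (step / ns)
        mu /= np.linalg.norm(mu)
    return mu
```

With `tol=1e-9` and `max_iter=1000` as in the config above, the loop matches the convergence parameters the merge used; the actual merge applies this idea per weight tensor.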
Intended Use
For creative writing, roleplay, and imaginative scenarios.
Model tree for DeathGodlike/Vortex5_Nova-Mythra-12B_EXL3
Base model
Vortex5/Nova-Mythra-12B