# Nova-Mythra-12B

## Overview

Nova-Mythra-12B was created through a multi-stage merge of Hollow-Aether-12B, KiloNovaSynth-12B, NoctyxCosma-12B, Violet-Lyra-Gutenberg-v2, Lunar-Twilight-12B, and Tlacuilo-12B, using a custom merge method.

## Multi-Stage Merge Configuration

```yaml
name: First
models:
  - model: Marcjoni/KiloNovaSynth-12B
merge_method: sm2f
base_model: Vortex5/Hollow-Aether-12B
dtype: bfloat16
tokenizer:
  source: Vortex5/Hollow-Aether-12B
---
name: Second
models:
  - model: Vortex5/NoctyxCosma-12B
merge_method: sm2f
base_model: ohyeah1/Violet-Lyra-Gutenberg-v2
dtype: bfloat16
tokenizer:
  source: ohyeah1/Violet-Lyra-Gutenberg-v2
---
name: Third
models:
  - model: Vortex5/Lunar-Twilight-12B
merge_method: sm2f
base_model: allura-org/Tlacuilo-12B
dtype: bfloat16
tokenizer:
  source: allura-org/Tlacuilo-12B
---
# Final
models:
  - model: First
  - model: Second
  - model: Third
merge_method: karcher
chat_template: auto
dtype: bfloat16
parameters:
  tol: 1e-9
  max_iter: 1000
tokenizer:
  source: Vortex5/NoctyxCosma-12B
```
      
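A multi-document configuration like the one above can be run with mergekit's CLI. This is an illustrative sketch, not the author's documented build steps: the file name `merge.yaml` and output path are assumptions, and `mergekit-multi` is mergekit's entry point for staged multi-document configs (if your mergekit version lacks it, each stage can be run separately with `mergekit-yaml`).

```shell
# Illustrative invocation; file names and flags are assumptions.
pip install mergekit

# Save the multi-document YAML above as merge.yaml, then run all stages:
mergekit-multi merge.yaml ./Nova-Mythra-12B --cuda
```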

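The final stage combines the three intermediate merges with the `karcher` method, which iterates toward the Riemannian (Karcher/Fréchet) mean, stopping once the update norm falls below `tol` or `max_iter` iterations are reached. The sketch below illustrates that iteration for unit vectors on a hypersphere only; mergekit's actual implementation operates on full weight tensors and differs in detail:

```python
import math

def karcher_mean_sphere(points, tol=1e-9, max_iter=1000):
    """Karcher (Riemannian) mean of unit vectors on the hypersphere.

    `tol` and `max_iter` mirror the parameters in the merge config.
    Illustrative only: real merges apply this idea to weight tensors.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def norm(a):
        return math.sqrt(dot(a, a))

    def normalize(a):
        n = norm(a)
        return [x / n for x in a]

    dim = len(points[0])
    # Start from the normalized arithmetic mean of the inputs.
    x = normalize([sum(p[i] for p in points) / len(points) for i in range(dim)])

    for _ in range(max_iter):
        # Average the log-map images of all points in the tangent space at x.
        t = [0.0] * dim
        for p in points:
            c = max(-1.0, min(1.0, dot(x, p)))
            theta = math.acos(c)
            u = [p[i] - c * x[i] for i in range(dim)]
            un = norm(u)
            if un > 1e-12:
                t = [t[i] + theta * u[i] / un for i in range(dim)]
        t = [ti / len(points) for ti in t]
        tn = norm(t)
        if tn < tol:  # converged: mean tangent update is negligible
            break
        # Exponential map: step along the averaged tangent direction.
        x = normalize([math.cos(tn) * x[i] + math.sin(tn) * t[i] / tn
                       for i in range(dim)])
    return x
```

For two points on the unit circle, the iteration returns their geodesic midpoint.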
## Intended Use

For writing, roleplay, and imagination:

- 📜 **Storytelling**: long-form narrative worlds
- 🎭 **Roleplay**: character-focused interaction
- 🌙 **Creative Writing**: ideas, drafts, and scenes
