---
base_model:
  - Vortex5/Lunar-Twilight-12B
base_model_relation: quantized
pipeline_tag: text-generation
library_name: safetensors
tags:
  - exl3
  - 4-bit
  - 6-bit
  - 8-bit
---

## Source model

Lunar-Twilight-12B by Vortex5


## Provided quantized models

Quantized with ExLlamaV3, release v0.0.18.

Requirements: a Python installation with the `huggingface_hub` module to use the CLI.
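As a sketch of the download step, the `huggingface_hub` module mentioned above can fetch one quantized revision programmatically. The repository id and revision name in the example are placeholders, not confirmed by this card — substitute the actual values from this repository's branch list:

```python
def fetch_quant(repo_id: str, revision: str, local_dir: str) -> str:
    """Download one quantized revision (e.g. a 4/6/8-bit branch) of an EXL3 repo.

    Thin wrapper over huggingface_hub.snapshot_download; the import lives
    inside the function so this sketch loads even without the package installed.
    Returns the local path of the downloaded snapshot.
    """
    from huggingface_hub import snapshot_download
    return snapshot_download(repo_id=repo_id, revision=revision, local_dir=local_dir)

# Hypothetical usage -- repo id and branch name are placeholders:
# fetch_quant("DeathGodlike/Lunar-Twilight-12B-EXL3", "8.0bpw", "./model-8bpw")
```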

## Licensing

License detected: unknown

The license for the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, refer first to the pages of the source and base models. File and page backups of the source model are provided below.


## Backups

Date: 09.01.2026

### Source files

### Source page

### Lunar-Twilight-12B

#### Overview

Lunar-Twilight-12B was created by merging Starlit-Shadow-12B, Red-Synthesis-12B, Tlacuilo-12B, and Mystic-Matron-12B using a custom merge method.

#### Merge configuration

```yaml
base_model: Vortex5/Starlit-Shadow-12B
models:
  - model: Vortex5/Starlit-Shadow-12B
  - model: Vortex5/Red-Synthesis-12B
  - model: allura-org/Tlacuilo-12B
  - model: Vortex5/Mystic-Matron-12B
merge_method: hpq
chat_template: auto
parameters:
  strength: 0.73
  flavor: 0.36
  steps: 12
  cube_dims: 22
  paradox: 0.43
  boost: 0.56
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
```

#### Intended Use

- 📜 Storytelling
- 🎭 Roleplay
- 🌙 Creative Writing