Source model

Magistaroth-24B-v1 by DarkArtsForge


Provided quantized models

ExLlamaV3: release v0.0.22

Requirements: a Python installation with the huggingface-hub module to use the CLI.
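As a sketch, a quantized revision can be fetched with the huggingface-cli tool that ships with huggingface-hub. The repo id is taken from this page; `main` is used as a placeholder revision, so substitute the branch matching the bitrate you want:

```shell
# Install the CLI extra of huggingface-hub, then download one revision
pip install -U "huggingface_hub[cli]"
huggingface-cli download DeathGodlike/DarkArtsForge_Magistaroth-24B-v1_EXL3 \
  --revision main \
  --local-dir ./Magistaroth-24B-v1_EXL3
```

The `--local-dir` flag places the files in a plain directory instead of the shared HF cache, which is convenient for pointing a loader at the model.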

Licensing

License detected: apache-2.0

The license for the provided quantized models is inherited from the source model (which incorporates the license of its original base model). For definitive licensing information, please refer first to the pages of the source or base models. File and page backups of the source model are provided below.


Backups

Date: 23.02.2026

Source files

Source page

⚠️ Warning: This model can produce narratives and RP that contain violent and graphic erotic content. Adjust your system prompt accordingly, and use the Mistral Tekken chat template.

🌌 Magistaroth 24B v1

Magistaroth

A highly creative merge. It produces some refusals, but you can use jailbreaks or ablate the model. A normtrue (normalize: true) version was also tested; normfalse did better overall: it was slightly less censored, more detailed, and more creative.

Scores 14152 at Q0 Bench (Pass Q0G).

This model was merged using the DELLA merge method with the following configuration:

architecture: MistralForCausalLM
models:
  - model: B:\24B\!models--mistralai--Magistral-Small-2509\textonly
  - model: B:\24B\!models--Gryphe--Tiamat-24B-Magistral\textonly
    parameters:
      density: 0.9
      weight: 0.4
      epsilon: 0.099
  - model: B:\24B\!models--TheDrummer--Magidonia-24B-v4.3
    parameters:
      density: 0.9
      weight: 0.4
      epsilon: 0.099
  - model: B:\24B\!models--TheDrummer--Precog-24B-v1
    parameters:
      density: 0.9
      weight: 0.4
      epsilon: 0.099
  - model: B:\24B\!models--zerofata--MS3.2-PaintedFantasy-v3-24B
    parameters:
      density: 0.9
      weight: 0.4
      epsilon: 0.099
  - model: B:\24B\!models--zerofata--MS3.2-PaintedFantasy-v4.1-24B
    parameters:
      density: 0.9
      weight: 0.4
      epsilon: 0.099
# Seed: 420
merge_method: della
base_model: B:\24B\!models--mistralai--Magistral-Small-2509\textonly
parameters:
  lambda: 1.0
  normalize: false
  int8_mask: false
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: B:\24B\!models--TheDrummer--Magidonia-24B-v4.3
# chat_template: auto
name: 🌌 Magistaroth-24B-v1
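The configuration above follows mergekit's schema (DELLA is a mergekit merge method), so it would typically be applied with the mergekit-yaml CLI. The config filename and output directory below are placeholders; the local model paths from the config must exist on disk:

```shell
# Hypothetical invocation: run the merge config above with mergekit
pip install mergekit
mergekit-yaml magistaroth-della.yaml ./Magistaroth-24B-v1 \
  --cuda   # optional: use a GPU for the merge
```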

MagiAudit

Model tree for DeathGodlike/DarkArtsForge_Magistaroth-24B-v1_EXL3

Quantized
(5)
this model

Paper for DeathGodlike/DarkArtsForge_Magistaroth-24B-v1_EXL3