---
base_model:
  - Vortex5/Starlit-Shadow-12B
  - Vortex5/Shining-Prism-12B
  - yamatazen/EtherealAurora-12B
  - yamatazen/EsotericSage-12B
  - Vortex5/Hollow-Aether-12B
library_name: transformers
tags:
  - mergekit
  - merge
  - roleplay
---

# Eclipsed-Prism-12B

## Overview

Eclipsed-Prism-12B was created through a four-stage mergekit merge of Starlit-Shadow-12B, Shining-Prism-12B, EtherealAurora-12B, EsotericSage-12B, and Hollow-Aether-12B, using custom merge methods.

## Multi-stage merge configuration

```yaml
name: First
merge_method: acl
base_model: Vortex5/Starlit-Shadow-12B
models:
  - model: Vortex5/Shining-Prism-12B
  - model: yamatazen/EtherealAurora-12B
parameters:
  strength: 0.75
  selectivity: 0.95
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
---
name: Second
merge_method: amsf
models:
  - model: First
  - model: yamatazen/EsotericSage-12B
  - model: Vortex5/Hollow-Aether-12B
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
---
name: Third
merge_method: saef
models:
  - model: Second
  - model: Vortex5/Shining-Prism-12B
  - model: yamatazen/EtherealAurora-12B
parameters:
  paradox: 0.45
  strength: 0.9
  boost: 0.55
  modes: 2
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
---
# no name needed for the final model
merge_method: sm2f
base_model: Third
models:
  - model: Vortex5/Starlit-Shadow-12B
parameters:
  focus: 0.55
  trust: 0.60
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
```
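
Each stage above is a separate YAML document in a single mergekit config file, separated by `---`. As a quick sanity check of the stage ordering, the stream can be parsed document by document; a minimal sketch, assuming PyYAML is installed and trimming the config to its first two stages for brevity:

```python
import yaml  # PyYAML; `pip install pyyaml` if needed

# Trimmed two-stage excerpt of the multi-document merge config above.
CONFIG = """\
name: First
merge_method: acl
base_model: Vortex5/Starlit-Shadow-12B
models:
  - model: Vortex5/Shining-Prism-12B
  - model: yamatazen/EtherealAurora-12B
dtype: bfloat16
---
name: Second
merge_method: amsf
models:
  - model: First
  - model: yamatazen/EsotericSage-12B
dtype: bfloat16
"""

# Parse each stage in order and list its name, method, and input models.
stages = list(yaml.safe_load_all(CONFIG))
for stage in stages:
    inputs = [m["model"] for m in stage["models"]]
    print(f"{stage.get('name', '<final>')}: {stage['merge_method']} <- {inputs}")
```

Note that later stages reference earlier ones by `name` (e.g. `First`, `Second`), so the stages must be run in order, with each stage's output available as an input to the next.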
      

## Intended Use

- 🌒 Storytelling
- 🎭 Roleplay
- ✨ Creative writing