Use with the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Vortex5/Sunlit-Shadow-12B")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Vortex5/Sunlit-Shadow-12B")
model = AutoModelForCausalLM.from_pretrained("Vortex5/Sunlit-Shadow-12B")
```
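Once the tokenizer and model are loaded, generation follows the standard Transformers API. A minimal sketch; the prompt and sampling settings below are illustrative choices, not recommendations from the model author:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM


def generate(model_id: str, prompt: str, max_new_tokens: int = 100) -> str:
    """Load model_id and sample a continuation of prompt."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    # do_sample/temperature are example settings for creative output
    outputs = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.8
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example call (downloads ~12B of weights on first use):
# print(generate("Vortex5/Sunlit-Shadow-12B", "Once upon a time"))
```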
Sunlit-Shadow-12B

Overview

Sunlit-Shadow-12B is a merge of Abyssal-Seraph-12B, Dreamstar-12B, Chaos-Unknown-12B, Aurora-SCE-12B, and Violet-Lyra-Gutenberg-v2, combined with a custom merge method (amsf).

YAML config:

```yaml
models:
  - model: Vortex5/Abyssal-Seraph-12B
  - model: Vortex5/Dreamstar-12B
  - model: Vortex5/Chaos-Unknown-12b
  - model: yamatazen/Aurora-SCE-12B
  - model: ohyeah1/Violet-Lyra-Gutenberg-v2
merge_method: amsf
dtype: bfloat16
tokenizer_source: Vortex5/Abyssal-Seraph-12B
```

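The config above is in mergekit's YAML format. A small sketch that parses and sanity-checks it before merging; the `mergekit-yaml` invocation in the comment assumes a mergekit build that supports the custom `amsf` method:

```python
# Validate the merge config before running mergekit (assumes PyYAML is installed)
import yaml

config = """\
models:
  - model: Vortex5/Abyssal-Seraph-12B
  - model: Vortex5/Dreamstar-12B
  - model: Vortex5/Chaos-Unknown-12b
  - model: yamatazen/Aurora-SCE-12B
  - model: ohyeah1/Violet-Lyra-Gutenberg-v2
merge_method: amsf
dtype: bfloat16
tokenizer_source: Vortex5/Abyssal-Seraph-12B
"""

parsed = yaml.safe_load(config)
print(len(parsed["models"]))  # → 5 source models

# To run the merge (hypothetical paths; requires a mergekit build with amsf):
#   mergekit-yaml config.yaml ./Sunlit-Shadow-12B
```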
Intended Use

📜 Storytelling
🎭 Roleplay
🌠 Creative Writing

Credits
