# Amber-Starlight-12B

## Usage

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Vortex5/Amber-Starlight-12B")
model = AutoModelForCausalLM.from_pretrained("Vortex5/Amber-Starlight-12B")

messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```
## Overview
Amber-Starlight-12B was created by merging Strawberry_Smoothie-12B-Model_Stock, Lunar-Twilight-12B, Hollow-Aether-12B, Nova-Mythra-12B, and Shining-Seraph-12B using a custom merge method.
## Merge Configuration

```yaml
models:
  - model: DreadPoor/Strawberry_Smoothie-12B-Model_Stock
  - model: Vortex5/Lunar-Twilight-12B
  - model: Vortex5/Hollow-Aether-12B
  - model: Vortex5/Nova-Mythra-12B
  - model: Vortex5/Shining-Seraph-12B
merge_method: saef
chat_template: auto
parameters:
  paradox: 0.4
  strength: 0.9
  boost: 0.5
  modes: 2
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: Vortex5/Shining-Seraph-12B
```
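The configuration above follows the mergekit YAML layout, so — assuming a mergekit-style workflow, and noting that the custom `saef` method is not a built-in mergekit method and would need to be available in your installation — the merge could be run roughly as:

```shell
# Install mergekit (assumption: a build that provides the custom `saef` method)
pip install mergekit

# Run the merge described by the YAML config; the output directory
# receives the merged weights and the tokenizer named under `tokenizer:`
mergekit-yaml amber-starlight.yml ./Amber-Starlight-12B
```

This is a sketch of the workflow, not the author's exact invocation; the config filename `amber-starlight.yml` is illustrative.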
## Prose

To characterize its narrative style, I had an LLM summarize outputs generated from neutral prompts.
Amber-Starlight writes in a warm, readable, and emotionally sincere style. Scenes tend to unfold with gentle pacing, clear visuals, and a focus on small-scale relationships rather than spectacle or heavy genre tropes. It reads like a cozy realist storyteller—soft-spoken, human-focused, and grounded in everyday detail. The model handles sentiment and character interactions cleanly without sliding into purple prose or over-complex structure. While not experimental or edgy, it excels at approachable, heartfelt narrative that feels lived-in and sincere.
## Intended Use

Suited for creative writing tasks, in keeping with its imaginative lineage.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Vortex5/Amber-Starlight-12B")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```