Use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="OddTheGreat/Unity-12B")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("OddTheGreat/Unity-12B")
model = AutoModelForCausalLM.from_pretrained("OddTheGreat/Unity-12B")
messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
	messages,
	add_generation_prompt=True,
	tokenize=True,
	return_dict=True,
	return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
This model is a merge of pre-trained language models.

Main usage: RP, ERP, and chat assistant, in Russian and English.

This model is designed to work in SillyTavern.

The ChatML format works best; tested at temperature 1.01.
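For reference, this is roughly the prompt layout ChatML produces. The `to_chatml` helper below is a hypothetical illustration, not part of the model's files; in practice `tokenizer.apply_chat_template` (shown above) builds this string for you.

```python
# Sketch of the ChatML prompt layout: each turn is wrapped in
# <|im_start|>role ... <|im_end|> markers.
def to_chatml(messages, add_generation_prompt=True):
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    if add_generation_prompt:
        # Leave the assistant turn open so the model continues it.
        parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = to_chatml([
    {"role": "system", "content": "You are a helpful roleplay assistant."},
    {"role": "user", "content": "Who are you?"},
])
print(prompt)
```

To actually sample at temperature 1.01, pass `do_sample=True, temperature=1.01` to `model.generate`; without `do_sample=True`, Transformers decodes greedily and the temperature setting has no effect.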

Model details: Safetensors, 12B parameters, F16 tensor type. Merged from 2 models; 2 quantizations are available.