Chytrej2
A fine-tuned version of Chytrej2-Mini (20M parameters, LLaMA architecture), trained on conversational data. At this size, don't expect great answers.

Built by PingVortex Labs.

Fine-tuned on the HuggingFaceTB/everyday-conversations-llama3.1-2k dataset.
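To peek at the training data, you can load it with the `datasets` library (a minimal sketch; split names vary per dataset, so printing the `DatasetDict` shows what's available):

```python
from datasets import load_dataset

# Load every split of the fine-tuning dataset and show split names / row counts.
ds = load_dataset("HuggingFaceTB/everyday-conversations-llama3.1-2k")
print(ds)
```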
Quick start with Transformers:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_path = "pvlabs/Chytrej2-Mini-It"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=torch.float16)
model.eval()

# Single-turn ChatML prompt; the open assistant header cues the model's reply.
prompt = "<|im_start|>user\nHello<|im_end|>\n<|im_start|>assistant\n"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=200,
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
        # Stop when the model emits the ChatML end-of-turn marker.
        eos_token_id=tokenizer.convert_tokens_to_ids("<|im_end|>"),
        pad_token_id=tokenizer.eos_token_id,
    )

# Decode only the newly generated tokens (everything after the prompt).
generated = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=False)
print(generated)
```
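With `skip_special_tokens=False`, the decoded text keeps the `<|im_end|>` marker. A one-line cleanup (a sketch, not part of the original snippet) cuts the reply at the first end-of-turn token:

```python
# Keep only the text before the first ChatML end-of-turn marker.
reply = generated.split("<|im_end|>")[0].strip()
print(reply)
```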
The model uses the standard ChatML format:
```
<|im_start|>user
Your message here<|im_end|>
<|im_start|>assistant
```
For multi-turn conversations, chain the turns and end with an open assistant header:

```
<|im_start|>user
Hi!<|im_end|>
<|im_start|>assistant
Hello! How can I help you today?<|im_end|>
<|im_start|>user
What's 2+2?<|im_end|>
<|im_start|>assistant
```
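To build these prompts programmatically, a minimal helper could look like this (a sketch; `build_chatml_prompt` is a hypothetical name, not part of the model's API). If the tokenizer ships a chat template, `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)` should produce an equivalent string.

```python
# Hypothetical helper: formats a message list into a ChatML prompt string.
def build_chatml_prompt(messages):
    # messages: list of {"role": "user" | "assistant", "content": str}
    prompt = ""
    for m in messages:
        prompt += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    # End with an open assistant header so the model generates the next reply.
    prompt += "<|im_start|>assistant\n"
    return prompt

messages = [
    {"role": "user", "content": "Hi!"},
    {"role": "assistant", "content": "Hello! How can I help you today?"},
    {"role": "user", "content": "What's 2+2?"},
]
print(build_chatml_prompt(messages))
```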
Base model: pvlabs/Chytrej2-Mini