Chytrej
The first model in the Chytrej series. A fully custom pretrained language model built from scratch on the LLaMA architecture.
Chytrej (Czech slang for "clever/smart") is a long-term model series by PingVortex Labs. Every model in the series is fully custom pretrained from scratch; an instruction fine-tuned variant may then be built on top of the custom base. The ongoing goal: every release must at least know the capital of France.
Built by PingVortex Labs.
Evaluated with lm-evaluation-harness (0-shot):
| Task | Metric | Score |
|---|---|---|
| ARC-Easy | acc | 39.73% |
| ARC-Easy | acc_norm | 34.47% |
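The scores above should be reproducible with the lm-evaluation-harness CLI. The card does not state the exact flags used, so the batch size and device below are assumptions, not the original settings:

```shell
# Hypothetical reproduction command; --batch_size and --device are assumptions.
lm_eval --model hf \
  --model_args pretrained=pvlabs/Chytrej1-90M-Base \
  --tasks arc_easy \
  --num_fewshot 0 \
  --batch_size 8 \
  --device cuda:0
```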
```python
from transformers import LlamaForCausalLM, PreTrainedTokenizerFast

model = LlamaForCausalLM.from_pretrained("pvlabs/Chytrej1-90M-Base")
tokenizer = PreTrainedTokenizerFast.from_pretrained("pvlabs/Chytrej1-90M-Base")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, repetition_penalty=1.3)
print(tokenizer.decode(outputs[0]))
# response: The capital of France is the city of Paris...
```
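The `repetition_penalty=1.3` passed to `generate` follows the CTRL-style rule that transformers implements in `RepetitionPenaltyLogitsProcessor`: at each step, the logit of every token that has already appeared is divided by the penalty if positive and multiplied by it if negative, so repeated tokens become less likely. A minimal pure-Python sketch of that rule (the helper name is ours, not a transformers API):

```python
def apply_repetition_penalty(logits, previous_token_ids, penalty):
    """Scale logits of already-generated tokens (CTRL-style rule).

    Positive logits are divided by `penalty`, negative ones multiplied by it,
    so any penalty > 1.0 pushes repeated tokens toward lower probability.
    """
    out = list(logits)
    for token_id in set(previous_token_ids):
        if out[token_id] > 0:
            out[token_id] /= penalty
        else:
            out[token_id] *= penalty
    return out

# Token 0 was already generated: its logit 2.6 drops to 2.0 with penalty 1.3.
print(apply_repetition_penalty([2.6, -1.0, 0.5], [0], 1.3))
```

A penalty of 1.3 is on the strong side; it helps small models like this 90M one avoid degenerate loops at the cost of some fluency.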