A simple model, around 100M parameters, trained on simonko912/chan-shitpost-2.5. The training data was a JSONL file where `content` held the post text and `role` held the username.
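A minimal sketch of what one training record might look like, assuming the field names `role` and `content` from the description above (the exact record shape and any example values here are assumptions, not taken from the dataset):

```python
import json

# Hypothetical example of a single JSONL training line, following the
# card's description: "content" is the post text, "role" is the username.
line = '{"role": "Anonymous", "content": "example post text"}'

record = json.loads(line)
print(record["role"], "->", record["content"])
```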

Perplexity results:

| Dataset | Size | Perplexity | Note |
|---|---|---|---|
| simonko912/chan-shitpost-2.5 | Half | 76.48317487258386 | Trained on |
| simonko912/chan-shitpost-2.5 | Fourth | 75.62517223207546 | Trained on |
| wikitext-2-raw-v, wikitext | Fourth | 859.5996453917919 | Not trained on |
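The card does not state how these numbers were computed, but perplexity is conventionally the exponential of the mean per-token negative log-likelihood. A minimal sketch of that conversion (the helper name is hypothetical):

```python
import math

def perplexity(mean_nll: float) -> float:
    # Conventional definition: perplexity = exp(mean per-token
    # negative log-likelihood / cross-entropy loss in nats).
    return math.exp(mean_nll)

# Round-trip check: the reported perplexity of ~76.48 corresponds to a
# mean loss of ln(76.48...) ~= 4.34 nats per token.
loss = math.log(76.48317487258386)
print(round(loss, 3), "->", perplexity(loss))
```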
