A simple model of around 100M parameters, trained on simonko912/chan-shitpost-2.5. The training data was a JSONL file where each record's `content` field held the post text and the `role` field held the username.
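The exact record layout is not shown in this card; a minimal sketch of parsing one line of the assumed JSONL format (the sample line itself is invented, only the `role`/`content` field names come from the description above):

```python
import json

# Invented example line: post text in "content", poster's username in "role".
sample_line = '{"role": "Anonymous", "content": "example post text"}'

record = json.loads(sample_line)
username = record["role"]
text = record["content"]
print(username, text)
```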
Perplexity results:

| Dataset | Eval split size | Perplexity | Note |
|---|---|---|---|
| simonko912/chan-shitpost-2.5 | Half of dataset | 76.48 | In training data |
| simonko912/chan-shitpost-2.5 | Fourth of dataset | 75.63 | In training data |
| wikitext-2-raw-v, wikitext | Fourth of dataset | 859.60 | Not trained on |
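Perplexity is the exponential of the mean per-token negative log-likelihood. A minimal sketch of the metric (not the card's actual evaluation script; the log-probabilities here are invented):

```python
import math

def perplexity(token_logprobs):
    # perplexity = exp(-mean log-probability per token)
    avg_nll = -sum(token_logprobs) / len(token_logprobs)
    return math.exp(avg_nll)

# Invented example: a uniform model over a 1000-token vocabulary assigns
# each token log(1/1000), so its perplexity is exactly 1000.
uniform_logprobs = [math.log(1 / 1000)] * 50
print(perplexity(uniform_logprobs))  # ~1000.0
```

Lower is better: the large gap between the in-domain scores (~76) and wikitext (~860) reflects how far wikitext's register is from the training distribution.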