gpt2-brainrot is a version of gpt2 fine-tuned on data I pulled from a couple of subreddits. Its only purpose is to generate ridiculous stuff.

Obviously, since the data comes from Reddit, don't trust anything it says.

The model has 124,442,112 parameters, about 7.4 million more than the 117 million originally reported for gpt2 (the actual weight count of gpt2 small is slightly under 124.44 million).
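For what it's worth, that count is consistent with the stock GPT-2 small architecture plus a few extra vocabulary tokens. Here is a back-of-the-envelope sketch of the arithmetic, assuming the standard GPT-2 small hyperparameters (12 layers, 768-dim embeddings, 1024 positions, tied input/output embeddings); the 50,260-token vocab below is my guess at where the extra 2,304 weights come from, not something read from the checkpoint. (GPT-2's widely quoted 117M figure undercounts; the actual small model has 124,439,808 weights.)

```python
# Parameter-count arithmetic for a GPT-2 small model.
# Assumption: standard GPT-2 small shape (12 layers, 768-dim,
# 1024 positions, lm_head tied to the token embeddings).

def gpt2_small_params(vocab_size, n_layer=12, n_embd=768, n_ctx=1024):
    """Count the weights of a GPT-2 small model with the given vocab size."""
    wte = vocab_size * n_embd                 # token embeddings (tied with lm_head)
    wpe = n_ctx * n_embd                      # learned position embeddings
    per_block = (
        2 * 2 * n_embd                        # ln_1 and ln_2 (weight + bias each)
        + n_embd * 3 * n_embd + 3 * n_embd    # c_attn: fused q/k/v projection
        + n_embd * n_embd + n_embd            # attention output projection
        + n_embd * 4 * n_embd + 4 * n_embd    # mlp c_fc
        + 4 * n_embd * n_embd + n_embd        # mlp c_proj
    )
    ln_f = 2 * n_embd                         # final layer norm
    return wte + wpe + n_layer * per_block + ln_f

print(gpt2_small_params(50257))  # stock gpt2 vocab -> 124439808
print(gpt2_small_params(50260))  # 3 extra tokens  -> 124442112
```

The difference between the two is exactly 3 × 768 = 2,304 weights, i.e. three added vocabulary entries, which is what a fine-tune that registers a few special tokens would produce.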

Model tree for real-chatgpt/gpt2-brainrot: fine-tuned from gpt2.