gpt2-brainrot is a version of gpt2 fine-tuned on data I pulled from a couple of subreddits. Its only purpose is to generate ridiculous text.
Obviously, since the data comes from Reddit, don't trust anything it says.
The model has 124,442,112 parameters, about 7.4 million more than the 117M figure reported in the original GPT-2 paper. (The released gpt2 small checkpoint actually has 124,439,808 parameters, so the real difference from fine-tuning is much smaller.)
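For reference, the 124,439,808-parameter count of the stock gpt2 small checkpoint can be worked out by hand from its config (12 layers, 768-dim embeddings, 50257-token vocab, 1024 positions, tied input/output embeddings). The gap to 124,442,112 then comes out to exactly 3 × 768, consistent with three extra embedding rows, e.g. a few added special tokens (that interpretation is a guess, but the arithmetic is not):

```python
# Parameter count of the gpt2 "small" architecture, computed from its config.
n_layer, n_embd, n_vocab, n_ctx = 12, 768, 50257, 1024

wte = n_vocab * n_embd  # token embeddings (tied with the LM head, so counted once)
wpe = n_ctx * n_embd    # learned position embeddings
ln = 2 * n_embd         # one LayerNorm: weight + bias

# attention: fused qkv projection (n_embd -> 3*n_embd) plus output projection
attn = (n_embd * 3 * n_embd + 3 * n_embd) + (n_embd * n_embd + n_embd)
# MLP: expand to 4*n_embd, then project back, each with a bias
mlp = (n_embd * 4 * n_embd + 4 * n_embd) + (4 * n_embd * n_embd + n_embd)

block = 2 * ln + attn + mlp               # one transformer block (two LayerNorms)
total = wte + wpe + n_layer * block + ln  # plus the final LayerNorm

print(total)               # 124439808 -- the stock gpt2 checkpoint
print(124_442_112 - total) # 2304 == 3 * 768, i.e. three extra vocab rows
```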
Model tree for real-chatgpt/gpt2-brainrot
- Base model: openai-community/gpt2