# CursedGPT2
This model is a purposely crippled version of GPT-2.
- Only 2 layers instead of 12.
- Attention heads zeroed out.
- Tokenizer vocab shuffled into nonsense.
- Model weights replaced with noise.
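The exact recipe used to cripple this model is not documented, but the list above can be sketched roughly as follows (the layer truncation, noise scale, and which attention tensors get zeroed are all assumptions for illustration):

```python
# Sketch only: an assumed way to produce a "cursed" GPT-2, not the authors' actual script.
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Only 2 layers instead of the usual 12.
config = GPT2Config(n_layer=2)
model = GPT2LMHeadModel(config)

with torch.no_grad():
    # Replace every weight with small Gaussian noise.
    for param in model.parameters():
        param.copy_(torch.randn_like(param) * 0.02)

    # Zero the fused QKV projection so the attention heads contribute nothing.
    for block in model.transformer.h:
        block.attn.c_attn.weight.zero_()
        block.attn.c_attn.bias.zero_()
```

Shuffling the tokenizer vocabulary would be a separate step applied to the tokenizer files rather than the model weights.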
## Why?
For fun, testing robustness, or as a proof of concept that sometimes AI needs a vacation.
## Warning
Do NOT use this for real inference unless you want hilarious nonsense and confusing output.
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "powzyx/cursed-gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, use_safetensors=True)

# Expect gibberish: the weights are noise and the vocab is shuffled.
prompt = "Hello, AI!"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```
## License
Creative Commons NonCommercial-ShareAlike
## Base model
openai-community/gpt2