---
license: apache-2.0
language:
- en
tags:
- code
- sarcasm
- chandler-bing
- lora
- transformers
metrics:
- code_eval
pipeline_tag: text-generation
base_model: dgtalbug/stable-code-instruct-3b-base
---

# Stephen

> **STEPHEN** — *Sarcastically Trained Engine Pretending to Humor Every Nonsense*
> *"Because your nonsense deserves world-class sarcasm."* 😏

![Stephen Banner](https://placehold.co/1200x400?text=Stephen+Sarcastically+Trained+Engine+Pretending+to+Humor+Every+Nonsense)

---

## Model Description

**Stephen** is a fine-tuned variant of `stable-code-instruct-3b` with a personality inspired by:

- **Chandler Bing** (*Friends*) — sarcastic wit
- **Deadpool** — meta humor & breaking the fourth wall
- **Senior Dev energy** — opinionated code roasting

Stephen is trained on:

- *Friends* transcripts (dialogue style)
- Reddit jokes datasets
- Sarcasm headlines
- Coding & programming humor datasets

---

## Intended Use

- Writing sarcastic code comments
- Generating humorous coding explanations
- Adding playful banter to code reviews
- Conversational AI with a strong personality

⚠ **Not for serious enterprise documentation unless you enjoy snarky footnotes.**

---

## Training Details

- **Base Model**: `dgtalbug/stable-code-instruct-3b-base`
- **Fine-tuning Method**: LoRA + PEFT
- **Framework**: Transformers, BitsAndBytes
- **Datasets**: *Friends* transcripts, Reddit jokes, sarcasm headlines, programming humor

---

## Example Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "dgtalbug/stephen"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16
).eval()

prompt = "Explain bubble sort as if I am a junior dev who just broke production."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

---

## Citation

```bibtex
@misc{stephen,
  title        = {Stephen: Sarcastically Trained Engine Pretending to Humor Every Nonsense},
  author       = {dgtalbug},
  year         = {2025},
  howpublished = {\url{https://huggingface.co/dgtalbug/stephen}}
}
```
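---

## How LoRA Works (Sketch)

The fine-tuning method above (LoRA) freezes the base model's weights and trains only a low-rank correction: for a weight matrix `W`, the adapted weight is `W + (alpha / r) * B @ A`, where `A` and `B` are small trainable matrices of rank `r`. The numpy sketch below illustrates just this update rule; the dimensions, rank, and scaling are illustrative placeholders, not Stephen's actual training configuration.

```python
import numpy as np

# Illustrative sizes only -- not Stephen's real hyperparameters.
d_out, d_in, r, alpha = 8, 8, 2, 4
rng = np.random.default_rng(0)

W = rng.normal(size=(d_out, d_in))     # frozen base weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))               # trainable up-projection, zero-initialized

# Effective weight after LoRA fine-tuning: W + (alpha / r) * B @ A
W_eff = W + (alpha / r) * B @ A

x = rng.normal(size=(d_in,))
# Because B starts at zero, the adapter is initially a no-op on the base model.
assert np.allclose(W_eff @ x, W @ x)
```

Only `A` and `B` (roughly `r * (d_in + d_out)` parameters per adapted matrix) need gradients, which is why LoRA plus quantized base weights (via BitsAndBytes) makes fine-tuning a 3B model feasible on modest hardware.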