# AraGPT2 Arabic Humor Generator
This model is a fine-tuned version of aubmindlab/aragpt2-medium specifically optimized for generating Arabic jokes and sarcastic humor based on keyword pairs.
## Model Description
The model was trained to solve the "Keyword-to-Joke" task. It uses a specific prompt format:
`word1 word2 | [Generated Joke]`
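As a concrete illustration, the prompt can be assembled from a keyword pair like this (the helper name `build_prompt` and the example keywords are hypothetical, not part of the model's API):

```python
def build_prompt(word1: str, word2: str) -> str:
    # Two keywords separated by a space, followed by " | ";
    # the model completes the joke after the pipe.
    return f"{word1} {word2} | "

print(build_prompt("قط", "مطعم"))  # keywords: "cat", "restaurant"
```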
## Training Results
During the final training phase, the model achieved:
- Training Loss: 3.28
- Validation Loss: 4.49
- Epochs: 5
## How to Use
You can use this model with the Hugging Face `transformers` library:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

repo_id = "FatimahEmadEldin/AraGPT2-Arabic-Humor-Generator"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Keyword pair: "محشش" (stoner) and "مدرسة" (school)
prompt = "محشش مدرسة | "
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, repetition_penalty=1.3)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
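Because the decoded output echoes the prompt, you may want to keep only the text after the `|` separator. A minimal post-processing sketch (the helper name `extract_joke` is hypothetical):

```python
def extract_joke(generated: str) -> str:
    # The generated string includes the original prompt; keep only
    # the text after the first "|" separator (the joke itself).
    _, _, joke = generated.partition("|")
    return joke.strip()

joke = extract_joke("محشش مدرسة | نص النكتة هنا")
```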
## Intended Use & Limitations
This model is designed for creative writing and humor generation.
- Repetition: Use a `repetition_penalty` of 1.2 or higher to avoid loops.
- Safety: While the training data was filtered, AI humor can occasionally produce unexpected or inappropriate results.
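Putting that advice into practice, a decoding configuration along these lines could be passed to `generate` (only the `repetition_penalty` floor comes from the model card; the other values are illustrative assumptions, not tuned settings):

```python
# Suggested decoding settings; tune for your use case.
generation_kwargs = {
    "max_new_tokens": 50,
    "do_sample": True,           # sampling tends to suit creative text
    "top_p": 0.95,               # assumption: nucleus sampling
    "temperature": 0.9,          # assumption: mild randomness
    "repetition_penalty": 1.3,   # keep >= 1.2 to avoid loops
}

# Usage: outputs = model.generate(**inputs, **generation_kwargs)
```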