---
language:
- en
tags:
- gpt2
- keras-nlp
- text-generation
- bhagavad-gita
license: mit
datasets:
- custom
pipeline_tag: text-generation
---

# 📖 Fine-tuned GPT-2 on the Bhagavad Gita

This repository contains a **fine-tuned GPT-2 model** trained on the *Bhagavad Gita* (English meanings/verses). It generates text inspired by the teachings and style of the Bhagavad Gita.

---

## 🧾 Model Card

| Attribute | Details |
|------------------|---------|
| **Base Model** | [GPT-2 (124M)](https://huggingface.co/openai-community/gpt2) |
| **Architecture** | Decoder-only Transformer |
| **Framework** | TensorFlow / KerasNLP |
| **Dataset** | Bhagavad Gita (English meanings) |
| **Languages** | English |
| **Dataset Size** | ~700 verses |
| **Tokenizer** | GPT-2 tokenizer (inherited vocabulary) |

---

## 📊 Training Details

- **Epochs**: 3
- **Batch Size**: 8
- **Learning Rate**: 3e-5
- **Optimizer**: AdamW
- **Loss Function**: Causal language modeling (cross-entropy)
- **Preprocessing**: Cleaned, tokenized, and formatted into text sequences

---

## 🚀 Usage

You can load and run the model using Hugging Face `transformers`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load tokenizer & model
tokenizer = AutoTokenizer.from_pretrained("AP6621/Bhagawatgitagpt")
model = AutoModelForCausalLM.from_pretrained("AP6621/Bhagawatgitagpt")

# Generate text
inputs = tokenizer("Arjuna asked:", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=100,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

---

## ✨ Example

**Prompt**:

```
Arjuna asked:
```

**Generated Response**:

*"O Arjuna, when the mind is detached from desires, the soul attains peace and wisdom.
Such a person sees all beings with equal vision."*

---

## ⚖️ Limitations & Bias

- Small dataset → the model may **repeat phrases** or **hallucinate content**
- Not an official or scholarly translation
- Should not be used as a **religious authority**

---

## ✅ Intended Use

✔️ Educational experiments
✔️ Creative/spiritual-inspired text generation
✔️ AI-assisted storytelling

❌ Not for religious/spiritual authority
❌ Not for official translations
❌ Not for sensitive decision-making

---

## 📌 Citation

If you use this model, please cite:

```bibtex
@misc{gpt2-bhagavadgita,
  title = {Fine-tuned GPT-2 on Bhagavad Gita Dataset},
  author = {Your Name},
  year = {2025},
  publisher = {Hugging Face},
  howpublished = {\url{https://huggingface.co/AP6621/Bhagawatgitagpt}}
}
```

---

## 🙏 Acknowledgements

- [OpenAI](https://huggingface.co/openai-community) for GPT-2
- Public Bhagavad Gita dataset
- Hugging Face community for tools & inspiration

---
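## 🔍 How Sampling Works

If you are curious what the `do_sample`, `top_p`, and `temperature` arguments in the usage snippet actually do, here is a minimal, framework-free sketch of temperature scaling followed by nucleus (top-p) sampling over a toy next-token distribution. The vocabulary and logit values below are made up for illustration; `transformers` implements the real thing internally during `generate`.

```python
import math
import random

def sample_next_token(logits, temperature=0.8, top_p=0.9, rng=random):
    """Temperature scaling + nucleus (top-p) sampling over raw logits."""
    # 1. Temperature: divide logits before softmax (lower T -> sharper).
    scaled = [x / temperature for x in logits]
    # 2. Numerically stable softmax to turn logits into probabilities.
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # 3. Nucleus: keep the smallest set of highest-probability tokens
    #    whose cumulative probability reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    nucleus, cum = [], 0.0
    for i in order:
        nucleus.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # 4. Sample one token id from the renormalised nucleus.
    mass = sum(probs[i] for i in nucleus)
    r, acc = rng.random() * mass, 0.0
    for i in nucleus:
        acc += probs[i]
        if r <= acc:
            return i
    return nucleus[-1]

# Toy vocabulary and logits (purely illustrative).
vocab = ["peace", "wisdom", "duty", "fear"]
logits = [2.0, 1.5, 0.5, -1.0]
print(vocab[sample_next_token(logits)])
```

Lower `temperature` makes generations more repetitive but coherent, while a smaller `top_p` truncates the unlikely tail of the vocabulary; the values in the usage example (0.8 and 0.9) are a common middle ground for creative text.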