---
language:
- en
tags:
- gpt2
- keras-nlp
- text-generation
- bhagavad-gita
license: mit
datasets:
- custom
pipeline_tag: text-generation
---

# 🕉️ Fine-tuned GPT-2 on the Bhagavad Gita

This repository contains a **fine-tuned GPT-2 model** trained on English verses and meanings from the *Bhagavad Gita*. It generates text inspired by the teachings and style of the Bhagavad Gita.

---

## 🧾 Model Card

| Attribute        | Details |
|------------------|---------|
| **Base Model**   | [GPT-2 (124M)](https://huggingface.co/openai-community/gpt2) |
| **Architecture** | Decoder-only Transformer |
| **Framework**    | TensorFlow / KerasNLP |
| **Dataset**      | Bhagavad Gita (English meanings) |
| **Languages**    | English |
| **Dataset Size** | ~700 verses |
| **Tokenizer**    | GPT-2 tokenizer (inherited vocabulary) |

---

## 📊 Training Details

- **Epochs**: 3
- **Batch Size**: 8
- **Learning Rate**: 3e-5
- **Optimizer**: AdamW
- **Loss Function**: Cross-entropy over next-token predictions (causal language modeling)
- **Preprocessing**: Verses cleaned, tokenized, and packed into fixed-length text sequences
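
The preprocessing step can be sketched roughly as follows. This is an illustrative reconstruction, not the exact pipeline used for this model: the verse-number pattern, the whitespace split (standing in for the GPT-2 tokenizer), and the sequence length are all assumptions.

```python
import re

def clean_verse(text: str) -> str:
    """Strip a leading chapter.verse marker like '2.47' and normalize whitespace."""
    text = re.sub(r"^\s*\d+\.\d+\s*", "", text)  # drop a leading verse number, if present
    return re.sub(r"\s+", " ", text).strip()

def pack_sequences(verses, seq_len=32):
    """Concatenate cleaned verses and split the token stream into fixed-length chunks."""
    tokens = []
    for v in verses:
        tokens.extend(clean_verse(v).split())  # whitespace split stands in for the GPT-2 tokenizer
    return [tokens[i:i + seq_len] for i in range(0, len(tokens), seq_len)]

verses = [
    "2.47  You have a right to perform your prescribed duty,",
    "but you are not entitled to the fruits of action.",
]
sequences = pack_sequences(verses, seq_len=8)
print(len(sequences), sequences[0])
```

In the actual pipeline, the `transformers` GPT-2 tokenizer would replace the whitespace split, and sequence lengths are typically much longer (e.g., 512 tokens).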

---

## 🚀 Usage

You can load and run the model using Hugging Face `transformers`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load tokenizer & model
tokenizer = AutoTokenizer.from_pretrained("AP6621/Bhagawatgitagpt")
model = AutoModelForCausalLM.from_pretrained("AP6621/Bhagawatgitagpt")

# Generate text
inputs = tokenizer("Arjuna asked:", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
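
The `temperature` and `top_p` arguments shape the sampling distribution: logits are divided by the temperature before the softmax (lower values sharpen the distribution), and top-p (nucleus) sampling keeps only the smallest set of highest-probability tokens whose cumulative mass reaches `p`. A toy illustration with made-up logits, not taken from this model:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: lower temperature sharpens the distribution."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_filter(probs, p=0.9):
    """Keep the smallest high-probability set whose mass reaches p, then renormalize."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += probs[i]
        if mass >= p:
            break
    total = sum(probs[i] for i in kept)
    return {i: probs[i] / total for i in kept}

logits = [2.0, 1.0, 0.5, -1.0]     # toy logits for a 4-token vocabulary
probs = softmax(logits, temperature=0.8)
print(top_p_filter(probs, p=0.9))  # the lowest-probability token is dropped
```

With `temperature=0.8` and `top_p=0.9`, generation stays coherent while still varying between runs; raising the temperature or `top_p` makes outputs more diverse but less faithful to the training style.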

---

## ✨ Example

**Prompt**:

```
Arjuna asked:
```

**Generated Response**:

*"O Arjuna, when the mind is detached from desires, the soul attains peace and wisdom. Such a person sees all beings with equal vision."*

---

## ⚠️ Limitations & Bias

- Trained on a small dataset (~700 verses), so it may **repeat phrases** or **hallucinate content**
- Not an official or scholarly translation
- Should not be treated as a **religious authority**

---

## ✅ Intended Use

- ✔️ Educational experiments
- ✔️ Creative or spiritually inspired text generation
- ✔️ AI-assisted storytelling

- ❌ Not a source of religious or spiritual authority
- ❌ Not an official translation
- ❌ Not for sensitive decision-making

---

## 📖 Citation

If you use this model, please cite:

```bibtex
@misc{gpt2-bhagavadgita,
  title        = {Fine-tuned GPT-2 on Bhagavad Gita Dataset},
  author       = {Your Name},
  year         = {2025},
  publisher    = {Hugging Face},
  howpublished = {\url{https://huggingface.co/your-username/gpt2-bhagavad-gita}}
}
```

---

## 🙏 Acknowledgements

- [OpenAI](https://huggingface.co/openai-community) for GPT-2
- Public Bhagavad Gita dataset
- Hugging Face community for tools & inspiration