# Banalata — banalata_2026-04-12_iter6900_val3.7835

Decoder-only transformer trained from scratch on public-domain Bengali literary text (the bangla_sahitya dataset).

## Model details

- Architecture: GPT (decoder-only) with RoPE, RMSNorm, and SwiGLU
- Layers / embedding dim / heads: 8 / 512 / 8
- Tokenizer: SentencePiece BPE, vocab=5000, trained on Bengali text only
- Training: 6900 iterations, best validation loss 3.7835

## Inference

```bash
python s05_generate.py --author "রবীন্দ্রনাথ ঠাকুর"
python s05_generate.py --prompt "আকাশ ভরা সূর্য তারা"
```

## License

Trained on public-domain Bengali literature. Model weights: Apache 2.0.
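The RMSNorm + SwiGLU combination named in the model details can be sketched in NumPy. This is an illustrative sketch only: the hidden dimension, weight shapes, and init scale below are assumptions for demonstration, not values taken from the actual training code.

```python
import numpy as np

def silu(x):
    # SiLU (swish) activation: x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def rms_norm(x, eps=1e-6):
    # RMSNorm: rescale by root-mean-square; no mean subtraction,
    # no learned gain shown here (real layers usually have one)
    return x / np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)

def swiglu_ffn(x, w_gate, w_up, w_down):
    # SwiGLU feed-forward: SiLU-gated linear unit, then down-projection
    return (silu(x @ w_gate) * (x @ w_up)) @ w_down

# Toy shapes: model dim 512 as in the card; the hidden dim 1376 is a
# hypothetical choice just to show the wiring.
rng = np.random.default_rng(0)
d, h = 512, 1376
x = rng.standard_normal((2, d))
out = swiglu_ffn(rms_norm(x),
                 rng.standard_normal((d, h)) * 0.02,
                 rng.standard_normal((d, h)) * 0.02,
                 rng.standard_normal((h, d)) * 0.02)
print(out.shape)  # (2, 512)
```

In a pre-norm transformer block, `rms_norm` is applied before attention and before the feed-forward, with residual connections around each.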