---
title: Legion Coder
emoji: ⚡
colorFrom: red
colorTo: purple
sdk: docker
app_port: 7860
pinned: false
license: mit
---
# ⚡ Legion Coder
**A 44M Parameter Transformer for Code Generation**
[![Made with by DEATH LEGION](https://img.shields.io/badge/MADE%20WITH%20BY-DEATH%20LEGION-ff0040?style=for-the-badge)](https://huggingface.co/dineth554/legion-coder-8m)
[![Powered by nvdya-kit](https://img.shields.io/badge/POWERED%20BY-nvdya--kit-7c4dff?style=for-the-badge)]()
---
## 🚀 About
Legion Coder is a compact yet powerful 44M parameter transformer model optimized for coding tasks. Built with precision by **DEATH LEGION** and powered by **nvdya-kit**, this model delivers high-quality code generation in a lightweight package.
## ✨ Features
- 📝 **Clean Code Generation** - PEP 8 compliant Python and more
- 🐛 **Debug Assistance** - Help identify and fix code issues
- 📚 **Code Explanation** - Understand complex programming concepts
- 💡 **Multi-language Support** - Python, JavaScript, and more
- ⚡ **Fast Inference** - Optimized for CPU deployment
## 📊 Model Specifications
| Attribute | Value |
|-----------|-------|
| **Parameters** | 44,341,632 (~44M) |
| **Architecture** | GPT-style Transformer |
| **Hidden Size** | 576 |
| **Layers** | 13 |
| **Attention Heads** | 16 |
| **Context Length** | 1,024 tokens |
| **Vocabulary** | 16,000 tokens |
| **Format** | Safetensors |
## 🎯 Use Cases
- **Code Completion** - Finish partial code snippets
- **Function Generation** - Create functions from descriptions
- **Debugging** - Find and fix errors in code
- **Learning** - Get explanations for programming concepts
- **Prototyping** - Quickly generate code scaffolding
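
## 💻 Usage

A minimal usage sketch (not part of the original card): it assumes the checkpoint loads through the standard `transformers` text-generation pipeline, and falls back to the bare prompt if the library or model is unavailable so the snippet still runs offline.

```python
# Hedged sketch: assumes the checkpoint is compatible with the standard
# transformers text-generation pipeline; repo id taken from the Links section.
MODEL_ID = "dineth554/legion-coder-8m"

prompt = 'def fibonacci(n):\n    """Return the n-th Fibonacci number."""\n'

try:
    from transformers import pipeline

    # Keep max_new_tokens well under the 1,024-token context window.
    generator = pipeline("text-generation", model=MODEL_ID)
    completion = generator(prompt, max_new_tokens=64)[0]["generated_text"]
except Exception:
    # transformers not installed or the model is unreachable offline;
    # fall back to the unmodified prompt so the script still runs.
    completion = prompt

print(completion)
```

The `text-generation` pipeline returns the prompt plus the continuation, so `completion` always begins with the original snippet.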
## 🛠️ Technical Details
### Training Data
- Python code from The Stack v2 dataset
- GitHub code repositories (filtered for quality)
- Code-specific preprocessing for indentation and special tokens
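
The indentation handling mentioned above can be sketched as follows. Note the `<|indent|>`/`<|dedent|>` token names are hypothetical stand-ins: the card does not specify which special tokens the tokenizer actually uses.

```python
def encode_indentation(source: str,
                       indent_token: str = "<|indent|>",
                       dedent_token: str = "<|dedent|>") -> str:
    """Replace changes in leading whitespace with explicit special tokens.

    Hypothetical preprocessing sketch: collapses Python-style indentation
    into single tokens so the model does not spend vocabulary on runs of
    spaces. Token names are assumptions, not the model's real vocabulary.
    """
    out, stack = [], [0]
    for line in source.splitlines():
        stripped = line.lstrip(" ")
        if not stripped:              # keep blank lines untouched
            out.append(line)
            continue
        depth = len(line) - len(stripped)
        if depth > stack[-1]:         # deeper than before: one indent token
            out.append(indent_token + stripped)
            stack.append(depth)
        else:                         # emit one dedent token per level closed
            tokens = ""
            while depth < stack[-1]:
                tokens += dedent_token
                stack.pop()
            out.append(tokens + stripped)
    return "\n".join(out)
```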
### Training Procedure
- **Optimizer**: AdamW
- **Learning Rate**: 5e-4 with cosine decay
- **Batch Size**: 4 with gradient accumulation
- **Training Steps**: 10,000
- **Precision**: float32 (CPU-optimized)
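
The learning-rate schedule above can be sketched in plain Python: a cosine decay from the 5e-4 peak over the 10,000 training steps. The card does not say whether a warmup phase was used, so none is included here.

```python
import math

PEAK_LR = 5e-4        # peak learning rate from the Training Procedure section
TOTAL_STEPS = 10_000  # total training steps from the Training Procedure section

def cosine_lr(step: int) -> float:
    """Cosine decay from PEAK_LR at step 0 down to 0 at TOTAL_STEPS."""
    progress = min(step, TOTAL_STEPS) / TOTAL_STEPS
    return 0.5 * PEAK_LR * (1.0 + math.cos(math.pi * progress))
```

The schedule starts at the peak, passes through half the peak at the midpoint, and reaches zero at the final step.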
## 📝 License
This model is released under the **MIT License**.
## 🔗 Links
- **Model Repository**: [dineth554/legion-coder-8m](https://huggingface.co/dineth554/legion-coder-8m)
- **Space**: This Space
---
<div align="center">
### 🔥 MADE WITH ❤️ BY DEATH LEGION 🔥
**Powered by nvdya-kit**
*© 2024 DEATH LEGION. All rights reserved.*
</div>