---
title: Legion Coder
emoji: ⚡
colorFrom: red
colorTo: purple
sdk: docker
app_port: 7860
pinned: false
license: mit
---
# ⚡ Legion Coder
**A 44M Parameter Transformer for Code Generation**
[Model: dineth554/legion-coder-8m](https://huggingface.co/dineth554/legion-coder-8m)
---
## 🚀 About
Legion Coder is a compact yet powerful 44M parameter transformer model optimized for coding tasks. Built with precision by **DEATH LEGION** and powered by **nvdya-kit**, this model delivers high-quality code generation in a lightweight package.
## ✨ Features
- 📝 **Clean Code Generation** - PEP 8 compliant Python and more
- 🐛 **Debug Assistance** - Help identify and fix code issues
- 📚 **Code Explanation** - Understand complex programming concepts
- 💡 **Multi-language Support** - Python, JavaScript, and more
- ⚡ **Fast Inference** - Optimized for CPU deployment
## 📊 Model Specifications
| Attribute | Value |
|-----------|-------|
| **Parameters** | 44,341,632 (~44M) |
| **Architecture** | GPT-style Transformer |
| **Hidden Size** | 576 |
| **Layers** | 13 |
| **Attention Heads** | 16 |
| **Context Length** | 1,024 tokens |
| **Vocabulary** | 16,000 tokens |
| **Format** | Safetensors |
## 🎯 Use Cases
- **Code Completion** - Finish partial code snippets
- **Function Generation** - Create functions from descriptions
- **Debugging** - Find and fix errors in code
- **Learning** - Get explanations for programming concepts
- **Prototyping** - Quickly generate code scaffolding
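
As a sketch of how code completion might look in practice, the snippet below loads the checkpoint with 🤗 Transformers. Only the repository ID comes from this card; the `AutoModel` classes, the prompt format, and the `solution()` scaffold are illustrative assumptions, not a published API.

```python
MODEL_ID = "dineth554/legion-coder-8m"
MAX_CONTEXT = 1024  # context length from the spec table above


def build_prompt(description: str) -> str:
    """Frame a natural-language task as a Python completion prompt
    (this prompt shape is an assumption of the example)."""
    return f"# Task: {description}\ndef solution():\n"


def generate(description: str, max_new_tokens: int = 128) -> str:
    # Imports are deferred so defining this helper needs no downloads.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)  # float32, CPU
    inputs = tokenizer(
        build_prompt(description),
        return_tensors="pt",
        truncation=True,
        max_length=MAX_CONTEXT,
    )
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)


# Example (downloads the checkpoint on first call):
# print(generate("reverse a linked list"))
```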
## 🛠️ Technical Details
### Training Data
- Python code from The Stack v2 dataset
- GitHub code repositories (filtered for quality)
- Code-specific preprocessing for indentation and special tokens
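
The preprocessing code itself is not published with this card; a minimal sketch of what indentation-aware preprocessing might look like is below. The `<indent>` marker and the four-space step are illustrative assumptions, not the model's actual special tokens.

```python
INDENT = "<indent>"  # hypothetical special token; the real vocabulary is unpublished


def encode_indentation(line: str, step: int = 4) -> str:
    """Replace each leading group of `step` spaces with one INDENT marker,
    so the model sees indentation depth as discrete tokens."""
    depth = 0
    while line.startswith(" " * step):
        line = line[step:]
        depth += 1
    return INDENT * depth + line


def preprocess(source: str) -> str:
    """Apply indentation encoding line by line to a source file."""
    return "\n".join(encode_indentation(line) for line in source.splitlines())
```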
### Training Procedure
- **Optimizer**: AdamW
- **Learning Rate**: 5e-4 with cosine decay
- **Batch Size**: 4 with gradient accumulation
- **Training Steps**: 10,000
- **Precision**: float32 (CPU-optimized)
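
The cosine decay above can be written out explicitly. This sketch assumes decay from the 5e-4 peak to zero over the full 10,000 steps, with no warmup, since the card does not mention one:

```python
import math

PEAK_LR = 5e-4
TOTAL_STEPS = 10_000


def cosine_lr(step: int) -> float:
    """Learning rate at `step`: starts at PEAK_LR, follows a half-cosine
    curve, and reaches 0 at TOTAL_STEPS."""
    progress = min(step, TOTAL_STEPS) / TOTAL_STEPS
    return 0.5 * PEAK_LR * (1.0 + math.cos(math.pi * progress))
```

In a training loop, the optimizer's learning rate would be set to `cosine_lr(step)` once per optimizer step, i.e. after each group of accumulated gradients.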
## 📝 License
This model is released under the **MIT License**.
## 🔗 Links
- **Model Repository**: [dineth554/legion-coder-8m](https://huggingface.co/dineth554/legion-coder-8m)
- **Space**: This Space
---
<div align="center">
### 🔥 MADE BY DEATH LEGION 🔥
**Powered by nvdya-kit**
*© 2024 DEATH LEGION. All rights reserved.*
</div>