---
title: amkyaw-coder
emoji: 🤖
colorFrom: blue
colorTo: purple
sdk: gradio
app_file: app.py
pinned: false
---

# amkyawdev/amkyaw-dev-v1

## Model Overview

- **Model Name**: amkyaw-coder-1.5b-instruct
- **Type**: Code Generation / Instruction Following
- **Size**: 1.5B parameters
- **Format**: GGUF (quantized)

## Quick Start

```bash
# Pull and run the model with Ollama
ollama run amkyawdev/amkyaw-dev-v1
```

## Features

- Code generation
- Instruction following
- Burmese language support
- English language support

## System Requirements

- Ollama installed
- At least 2 GB of available RAM
- No GPU required (runs on CPU)

## Configuration

| Parameter | Value |
|-----------|-------|
| Temperature | 0.8 |
| Top P | 0.9 |
| Top K | 40 |
| Context Length | 4096 |

## Usage Examples

```python
import ollama

# Generate a completion from the locally served model
response = ollama.generate(
    model='amkyawdev/amkyaw-dev-v1',
    prompt='Write a Python function to calculate factorial'
)
print(response['response'])
```

## License

See the [Hugging Face model page](https://huggingface.co/amkyawdev/amkyaw-dev-v1) for license information.

## Troubleshooting

If you encounter issues:

1. Make sure the Ollama server is running: `ollama serve`
2. Check that the model is installed: `ollama list`
3. Restart Ollama: `pkill ollama && ollama serve`
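The defaults in the Configuration table above can be made persistent by baking them into an Ollama Modelfile. A minimal sketch, assuming the model has already been pulled and that `my-amkyaw` is a name you choose for the derived model:

```
# Modelfile — hypothetical example applying the Configuration defaults
FROM amkyawdev/amkyaw-dev-v1

PARAMETER temperature 0.8
PARAMETER top_p 0.9
PARAMETER top_k 40
PARAMETER num_ctx 4096
```

Build and run the customized model with:

```bash
ollama create my-amkyaw -f Modelfile
ollama run my-amkyaw
```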