---
license: mit
base_model:
- Qwen/Qwen3-1.7B
pipeline_tag: text-generation
---

# Ollama DevOps Agent

A lightweight AI-powered DevOps automation tool using a fine-tuned Qwen3-1.7B model with Ollama and SmolAgents. **Specialized for Docker and Kubernetes workflows**, with sequential tool execution and structured reasoning.

## Features

- **Sequential Tool Execution**: Calls ONE tool at a time, waits for results, then proceeds
- **Structured Reasoning**: Uses explicit reasoning tags to show its thought process
- **Validation-Aware**: Checks command outputs for errors before proceeding
- **Multi-Step Tasks**: Handles complex workflows requiring multiple tool calls
- **Approval Mode**: Asks for user confirmation before executing each tool call, for enhanced safety (enabled by default)
- **Resource Efficient**: Optimized for local development (1 GB GGUF model)
- **Fast**: Completes typical DevOps tasks in ~10 seconds

## What's Special About This Model?

This model is fine-tuned specifically for DevOps automation, with improved reasoning capabilities:

- **Docker & Kubernetes expert**: Trained on 300+ Docker and Kubernetes workflows (90% of the training data)
- **One tool at a time**: Unlike base models that try to call all tools at once, this model executes sequentially
- **Explicit planning**: Shows its reasoning before acting
- **Uses actual values**: Extracts and reuses real values from tool responses in subsequent calls
- **Error handling**: Validates each step and tries alternative approaches on failure

### Training Data Focus

The model has been trained on:

- **Docker workflows**: building images, running containers, Docker Compose, optimization
- **Kubernetes operations**: pods, deployments, services, configurations
- **General DevOps**: file operations, system commands, basic troubleshooting

⚠️ **Note**: The model has limited training on cloud-specific CLIs (gcloud, AWS CLI, Azure CLI). For best results, use it for Docker and Kubernetes tasks.
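The sequential, one-tool-at-a-time loop described above can be sketched in a few lines of Python. This is an illustrative sketch, not the agent's actual implementation: it models only the `bash` and `final_answer` tool calls shown in this card, and the function names (`run_tool`, `sequential_loop`) are hypothetical.

```python
import json
import subprocess

def run_tool(call: dict) -> str:
    """Execute one parsed tool call and return its output (sketch only)."""
    if call["name"] == "bash":
        # Run the shell command and return combined stdout/stderr so the
        # model can validate the result before the next step.
        result = subprocess.run(
            call["arguments"]["command"],
            shell=True, capture_output=True, text=True,
        )
        return result.stdout + result.stderr
    if call["name"] == "final_answer":
        return call["arguments"]["answer"]
    raise ValueError(f"unknown tool: {call['name']}")

def sequential_loop(tool_calls: list[str]) -> str:
    """Execute model-emitted JSON tool calls ONE at a time, in order."""
    observation = ""
    for raw in tool_calls:
        call = json.loads(raw)
        observation = run_tool(call)
        if call["name"] == "final_answer":
            break  # the agent stops once it emits final_answer
    return observation
```

The key property is that each call's `observation` is available before the next call is made, which is what lets the model feed real values from one tool response into the next.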
### Example Output

```
Task: Get all pods in default namespace

Step 1: Execute kubectl command
{"name": "bash", "arguments": {"command": "kubectl get pods -n default"}}

[Receives pod list]

Step 2: Provide summary
{"name": "final_answer", "arguments": {"answer": "Successfully retrieved 10 pods in default namespace..."}}
```

## Quick Start

### 🎯 Recommended: Native Installation

For the best experience with full DevOps capabilities:

```bash
curl -fsSL https://raw.githubusercontent.com/ubermorgenland/devops-agent/main/install.sh | bash
```

This will automatically:

- Install Ollama (if not present)
- Install Python dependencies
- Download the model from Hugging Face
- Create the Ollama model
- Set up the `devops-agent` CLI command

**Why native installation?**

- ✅ **Full system access** - manage real infrastructure
- ✅ **No credential mounting** - works with your existing setup
- ✅ **Better performance** - no container overhead
- ✅ **Simpler usage** - just run `devops-agent`

---
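The approval mode listed under Features (confirm each tool call before it runs) can be expressed as a small gate in front of the execution step. This is a behavioral sketch under assumed names — the real CLI's prompt text and function names may differ; the `ask` parameter is injectable purely so the behavior is testable.

```python
def approve(call: dict, ask=input) -> bool:
    """Approval mode sketch: ask the user before executing a tool call.

    Returns True only on an explicit 'y'; anything else (including just
    pressing Enter) is treated as a refusal, matching a safe-by-default
    [y/N] convention.
    """
    prompt = f"Run tool '{call['name']}' with {call['arguments']}? [y/N] "
    return ask(prompt).strip().lower() == "y"
```

In the agent loop, a refusal would simply skip the tool call (or abort the task) instead of executing it, so no command touches your infrastructure without confirmation.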