
⚡ AESIR-Coder-7B

AESIR-Coder-7B is a high-performance, reasoning-dense language model developed by ÆSIR Unlimited. By merging the world-class syntax precision of Qwen2.5-Coder with the rigorous Chain-of-Thought (CoT) logic of DeepSeek-R1, we have engineered a "pocket-sized architect" capable of handling complex software engineering and Web3 tasks on consumer-grade hardware.

🛠️ Architecture & Methodology

This model was engineered using the TIES (Trim, Elect Sign & Merge) method via mergekit. This approach resolves weight conflicts between the two parent models, ensuring that DeepSeek's reasoning logic doesn't "break" Qwen's coding-specific syntax.
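A mergekit TIES recipe for this pairing would look roughly like the sketch below. The parent model names come from this card; the `density`, `weight`, and `dtype` values are illustrative assumptions, since the actual merge configuration is not published here.

```yaml
# Hypothetical mergekit config -- density/weight values are assumptions
models:
  - model: Qwen/Qwen2.5-Coder-7B-Instruct
    parameters:
      density: 0.5   # fraction of weights kept after trimming
      weight: 0.5    # contribution to the merged model
  - model: deepseek-ai/DeepSeek-R1-Distill-Qwen-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Qwen/Qwen2.5-Coder-7B-Instruct
parameters:
  normalize: true
dtype: bfloat16
```

Run with `mergekit-yaml config.yaml ./output-model` to produce the merged checkpoint.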

  • Base Engine: Qwen2.5-Coder-7B-Instruct (Optimized for 90+ programming languages)
  • Reasoning Layer: DeepSeek-R1-Distill-Qwen-7B (Trained on massive reasoning traces)
  • Merge Method: TIES
  • Hardware Profile: Optimized for 8GB-16GB RAM environments (Local First)

🚀 Key Agentic Capabilities

  1. Chain-of-Thought Auditing: Unlike standard models that just write code, AESIR-Coder "thinks" through the logic. It is ideal for identifying logic flaws in smart contracts and complex Python agentic systems.
  2. Web3 Native: Deep knowledge of Solidity, Vyper, and the TON/Solana ecosystems, combined with the ability to reason about decentralized state machines.
  3. Structured Intelligence: Highly stable at generating JSON schemas and YAML configurations required for the ÆSIR Protocol agentic handshakes.
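To make use of structured output in an agentic pipeline, model responses should be parsed and checked before acting on them. The sketch below shows one minimal way to do that; the field names are hypothetical illustrations only, not the actual ÆSIR Protocol handshake schema, which this card does not publish.

```python
import json

def validate_handshake(text: str, required: set) -> dict:
    """Parse model output as JSON and check that required keys are present."""
    data = json.loads(text)          # raises ValueError on malformed JSON
    missing = required - data.keys()
    if missing:
        raise KeyError(f"missing fields: {sorted(missing)}")
    return data

# Hypothetical model output -- NOT the real AESIR Protocol schema
raw_output = '{"agent": "aesir-coder", "action": "audit", "target": "Vault.sol"}'

msg = validate_handshake(raw_output, {"agent", "action", "target"})
print(msg["action"])
```

Rejecting malformed or incomplete responses at this boundary keeps a single bad generation from propagating through the rest of an agent handshake.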

💻 Local Execution (ÆSIR Unlimited Standards)

To run this model on your local machine with limited RAM, we recommend using LM Studio or Ollama with a Q4_K_M GGUF quantization.

| RAM Availability | Recommended Quantization |
| --- | --- |
| 8GB | Q4_K_M (fast, high quality) |
| 12GB+ | Q6_K or Q8_0 (near-lossless) |
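With Ollama, a local GGUF file can be wrapped in a Modelfile like the sketch below. The GGUF filename and the sampling parameter are assumptions for illustration; adjust them to the quantization you downloaded.

```
# Modelfile -- the GGUF filename below is hypothetical
FROM ./aesir-coder-7b-q4_k_m.gguf
PARAMETER temperature 0.6
```

Then register and run it with `ollama create aesir-coder -f Modelfile` followed by `ollama run aesir-coder`. In LM Studio, the same GGUF file can simply be loaded from the local models folder.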

Developed by Bugg-Moran as part of the ÆSIR Unlimited mission to bridge the gap between AI and Decentralized Infrastructure.
