---
license: apache-2.0
base_model:
- Qwen/Qwen3-Coder-Next
tags:
- code
---
Iapetus-v2-Kernel is an 80-billion-parameter coding model from Kronos Labs for Assembly (ptxas, ARM, and x86), CUDA, C, and C++, fine-tuned on top of Qwen3-Coder-Next's hybrid attention layout.

It was trained on closed-source datasets of synthetically generated and verified code. We hope it spurs interest in LLMs capable of performant low-level programming.
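Below is a minimal generation sketch with Transformers. It assumes the model loads through the same standard auto-class path as its Qwen3-Coder-Next base and ships a chat template; the repository id `KronosLabs/Iapetus-v2-Kernel` is a placeholder, not a confirmed Hub path.

```python
# Minimal usage sketch. Assumes the model loads like its Qwen3-Coder-Next
# base via the standard transformers auto classes; the repo id below is a
# placeholder, not a confirmed Hub path.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "KronosLabs/Iapetus-v2-Kernel"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native dtype
    device_map="auto",    # shard across available GPUs
)

messages = [
    {"role": "user",
     "content": "Write an x86-64 assembly routine that sums a float array."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```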
The model uses the following layout:
- Size: 80B parameters, A3B (3 billion active)
- Depth: 48 layers
- Hybrid layout: 12 * (3 * (Gated DeltaNet -> MoE) -> 1 * (Gated Attention -> MoE))
- MoE: 512 experts, 10 activated per token, plus 1 shared expert
- Context length: 262,144 tokens, native
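As a quick sanity check on the layout arithmetic, the sketch below expands the repeating 12 * (3 Gated DeltaNet + 1 Gated Attention) pattern into a per-layer schedule and confirms it yields the stated 48-layer depth. The layer labels are illustrative, not the checkpoint's actual module names.

```python
# Sketch: expand the hybrid layout into a per-layer mixer schedule and
# verify the depth. Labels are illustrative, not real module names.
NUM_BLOCKS = 12          # repeating macro-blocks
DELTANET_PER_BLOCK = 3   # linear-attention (Gated DeltaNet) layers per block
ATTN_PER_BLOCK = 1       # full (Gated Attention) layers per block

schedule = []
for _ in range(NUM_BLOCKS):
    # Each token mixer is followed by an MoE FFN:
    # 512 experts, 10 routed per token, plus 1 shared expert.
    schedule += ["gated_deltanet + moe"] * DELTANET_PER_BLOCK
    schedule += ["gated_attention + moe"] * ATTN_PER_BLOCK

assert len(schedule) == 48  # 12 * (3 + 1) matches the stated 48-layer depth
print(schedule[:4])         # one full macro-block
```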