# micro-kiki Router v4: 34-Domain Multi-Label Classifier
MiniLM-L6 embedding + MLP head. Classifies a user prompt into up to 34 domains (sigmoid, not softmax: domains are not mutually exclusive). Used at inference time by the micro-kiki serving layer to pick the top-K LoRA adapters to load alongside the Qwen3.6-35B-A3B base.
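A minimal sketch of the routing step described above. The hidden size, layer names, and ReLU activation are assumptions (the card only specifies a MiniLM-L6 embedding feeding an MLP head with 34 sigmoid outputs); random weights stand in for `router.safetensors`.

```python
import numpy as np

# Assumed dimensions: MiniLM-L6 emits 384-dim sentence embeddings;
# the hidden width (256) is a placeholder, not documented by the card.
EMB_DIM, HIDDEN, N_DOMAINS = 384, 256, 34

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.02, size=(EMB_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.02, size=(HIDDEN, N_DOMAINS))
b2 = np.zeros(N_DOMAINS)

def route(embedding: np.ndarray, k: int = 3) -> list[int]:
    """Return indices of the top-k domains by sigmoid probability."""
    h = np.maximum(embedding @ W1 + b1, 0.0)      # ReLU hidden layer (assumed)
    logits = h @ W2 + b2
    probs = 1.0 / (1.0 + np.exp(-logits))         # independent sigmoid per domain
    return np.argsort(probs)[::-1][:k].tolist()   # top-k, highest first

emb = rng.normal(size=EMB_DIM)                    # stand-in for a MiniLM embedding
top_k = route(emb, k=3)                           # indices -> LoRA adapters to load
```

Sigmoid outputs let several domains fire at once, which is why the serving layer can load more than one adapter per prompt.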
## Numbers (2026-04-22)
| Metric | Value |
|---|---|
| Train / val examples | 21 510 / 5 377 |
| Top-1 accuracy | 0.464 |
| Top-3 accuracy | 0.696 |
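One common way to score top-k accuracy for a multi-label router is to count an example as correct when the top-k predicted domains hit at least one gold domain; the card does not state which definition it used, so this is a hedged sketch on toy data, not the evaluation script.

```python
import numpy as np

def top_k_accuracy(probs: np.ndarray, gold: list, k: int) -> float:
    """Fraction of examples whose top-k predicted domains intersect
    the gold domain set (one common multi-label convention)."""
    hits = 0
    for p, g in zip(probs, gold):
        top = set(np.argsort(p)[::-1][:k].tolist())
        hits += bool(top & g)
    return hits / len(gold)

# Toy example: 4 prompts, 5 domains (stand-in for the real 34).
probs = np.array([
    [0.9, 0.1, 0.2, 0.0, 0.3],
    [0.1, 0.8, 0.7, 0.0, 0.2],
    [0.2, 0.15, 0.1, 0.9, 0.0],
    [0.5, 0.4, 0.3, 0.2, 0.1],
])
gold = [{0}, {2}, {1}, {4}]
acc1 = top_k_accuracy(probs, gold, k=1)  # only example 0 hits -> 0.25
acc3 = top_k_accuracy(probs, gold, k=3)  # examples 0-2 hit    -> 0.75
```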
## Files
- `router.safetensors`: MLP head weights
- `label_map.json`: domain → index map
- `meta.json`: training config snapshot
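At serving time the top-k indices have to be translated back into adapter names via `label_map.json`. The schema `{"domain_name": index}` is an assumption (the card only says the file maps domain to index), and the three domain names below are a toy stand-in for the 34 real ones:

```python
import json

# Toy stand-in for the contents of label_map.json (schema assumed).
raw = json.dumps({"python": 0, "sql": 1, "devops": 2})
label_map = json.loads(raw)
idx_to_domain = {v: k for k, v in label_map.items()}  # invert to index -> name

def names_for(top_k_indices):
    """Translate the router's top-k indices into domain/adapter names."""
    return [idx_to_domain[i] for i in top_k_indices]

picked = names_for([2, 0])  # -> ["devops", "python"]
```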
## Source
Training & serving pipeline at `hypneum-lab/micro-kiki`.
Trained by `scripts/train_router_v4.py`.
## 🇪🇺 EU AI Act transparency
This adapter is provided as a fine-tuned LoRA under the AI Act framework (Regulation (EU) 2024/1689). Compliance metadata:
| Field | Value |
|---|---|
| Provider | L'Électron Rare (clemsail / electron-rare) |
| Role under AI Act | GPAI provider for this adapter |
| Base model | sentence-transformers/all-MiniLM-L6-v2 (see upstream provenance) |
| Adapter type | LoRA / PEFT (adapter weights only; base unchanged) |
| Training data origin | L'Électron Rare proprietary technical corpus + curated public docs |
| License | Apache-2.0 (adapter). Upstream base licence applies separately. |
| Intended use | Domain-routing classifier |
| Out of scope | Healthcare diagnosis, legal advice, autonomous safety-critical decisions, generation of malicious code |
| Risk classification | Limited risk; Article 50 transparency obligations apply |
| Copyright respect | Training data does not include scraped copyrighted material. Opt-out signals (robots.txt, ai.txt) are honoured for web-sourced data. |
| Full provenance | https://github.com/L-electron-Rare/eu-kiki/tree/main/docs/provenance |
| Contact | postmaster@saillant.cc (biased output reports, copyright concerns, etc.) |
⚠️ You are using an AI model. Outputs may be inaccurate, biased or fabricated. Do not act on them without independent verification, especially in regulated domains.