Cross-link: DistilQwen collection spotlight — 2026-03-29
README.md
CHANGED
@@ -115,3 +115,19 @@ Please cite "SymbioticLM" when using symbolic memory components in research or a
 
 
 *Last updated: 2026-03-28 12:57 UTC*
+
+<!-- CIX-CROSSLINK-START -->
+
+---
+
+## From the Convergent Intelligence Portfolio
+
+**[DistilQwen Collection](https://huggingface.co/collections/reaperdoesntknow/distilqwen-69bf40ec669117e3f069ef1c)** — Proof-weighted distillation from Qwen3-30B-A3B → 1.7B and 0.6B. Three teacher variants (Instruct, Thinking, Coder), nine models, 2,788 combined downloads. Structure beats scale.
+
+Top model: [Qwen3-1.7B-Coder-Distilled-SFT](https://huggingface.co/reaperdoesntknow/Qwen3-1.7B-Coder-Distilled-SFT) — 508 downloads
+
+Full methodology: [Structure Over Scale (DOI: 10.57967/hf/8165)](https://doi.org/10.57967/hf/8165)
+
+*Convergent Intelligence LLC: Research Division*
+
+<!-- CIX-CROSSLINK-END -->