Tom-Neverwinter
AI & ML interests
Making improvements to help the world.
Recent Activity
reacted to danielhanchen's post with 🔥 about 8 hours ago
We made a guide on how to run open LLMs in Claude Code, Codex and OpenClaw.
Use Gemma 4 and Qwen3.6 GGUFs for local agentic coding on 24GB RAM
Run with self-healing tool calls, code execution, web search via the Unsloth API endpoint and llama.cpp
Guide: https://unsloth.ai/docs/basics/api
reacted to danielhanchen's post with ❤️ about 8 hours ago
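The local setup described in the post above can be sketched with a couple of illustrative commands, assuming llama.cpp's llama-server binary is installed and a GGUF file is on disk. The model path and port are placeholders, not values from the guide; see the linked Unsloth docs for exact settings.

```shell
# Serve a local GGUF with llama.cpp's OpenAI-compatible server, then point
# any OpenAI-style client (an agentic coding tool, for example) at it.
# "./models/your-model.gguf" is a placeholder path, not a real checkpoint.
llama-server -m ./models/your-model.gguf --port 8080 --ctx-size 8192 &

# Quick smoke test against the local OpenAI-compatible endpoint:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```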
reacted to SeaWolf-AI's post with 🔥 23 days ago
🧬 Darwin-27B-Opus: 86.9% on GPQA Diamond, World #5, Zero Training
We are excited to share Darwin-27B-Opus, a 27B model that achieved 86.9% on GPQA Diamond, ranking #5 globally on the HuggingFace leaderboard, without a single gradient update.
How? Darwin breeds pretrained models through evolutionary FFN crossbreeding. The father (Qwen3.5-27B) provides the reasoning architecture; the mother (Claude 4.6 Opus Reasoning Distilled) contributes structured chain-of-thought knowledge. CMA-ES automatically discovers optimal per-layer blending ratios; no human tuning required.
The result surpasses the original Qwen3.5-27B (85.5%), GLM-5.1 (744B, 86.2%), and Qwen3.5-122B (86.6%). A 27B model outperforming 744B, with zero training, zero data, one GPU, in about 2 hours.
We also confirmed hybrid vigor on Korean benchmarks: Darwin-27B-KR (a 2nd-generation offspring) surpassed both parents on CLIcK, winning 7 out of 11 categories. The evolutionary optimizer independently assigned 93% of the FFN from the Korean-specialized mother while preserving 93% of the attention from the reasoning-specialized father, autonomously validating our core principle: FFN carries knowledge, attention carries reasoning.
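The per-layer blending idea can be illustrated with a small toy sketch, assuming parent checkpoints reduce to per-layer FFN weight matrices. A simple (1+1) evolution strategy stands in for CMA-ES here, and all names, sizes, and the fitness proxy are illustrative, not the authors' actual pipeline.

```python
# Toy sketch of evolutionary per-layer FFN blending between two parents.
# A (1+1) evolution strategy stands in for CMA-ES; everything is illustrative.
import numpy as np

rng = np.random.default_rng(0)
N_LAYERS, DIM = 4, 8

# Toy "parents": one FFN weight matrix per layer (stand-ins for real models).
father = [rng.normal(size=(DIM, DIM)) for _ in range(N_LAYERS)]
mother = [rng.normal(size=(DIM, DIM)) for _ in range(N_LAYERS)]
# Hidden "ideal child" so the toy fitness has something to optimize toward.
target = [0.5 * f + 0.5 * m for f, m in zip(father, mother)]

def blend(ratios):
    """ratios[i] = fraction of the mother's FFN used at layer i."""
    return [(1 - r) * f + r * m for r, f, m in zip(ratios, father, mother)]

def fitness(ratios):
    """Proxy benchmark score: negative distance of the child to the optimum."""
    return -sum(np.linalg.norm(c - t) for c, t in zip(blend(ratios), target))

# (1+1)-ES loop: perturb the per-layer ratios, keep the child if it improves.
ratios = np.zeros(N_LAYERS)  # start as a pure copy of the father
best = fitness(ratios)
for _ in range(300):
    cand = np.clip(ratios + rng.normal(scale=0.2, size=N_LAYERS), 0.0, 1.0)
    score = fitness(cand)
    if score > best:
        ratios, best = cand, score

print(np.round(ratios, 2))  # per-layer blend discovered by the optimizer
```

In a real merge the fitness function would be a benchmark score for the blended checkpoint, and CMA-ES would adapt its search distribution instead of using a fixed mutation scale.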
🚀 Public release: 10 days → 300+ community derivatives, 120K+ downloads.
🔗 Links:
Darwin-27B-Opus: https://huggingface.co/FINAL-Bench/Darwin-27B-Opus
article: https://huggingface.co/blog/FINAL-Bench/darwin-gpqa
Darwin Family Collection: https://huggingface.co/collections/FINAL-Bench/darwin-family
If foundation models are raw ore, Darwin is the forge. We are just getting started. 🔥
Organizations
None yet