AI & ML interests
Anything that can run on ~3GB of memory is an instant thumbs up from me
Recent Activity
reacted to kelsend's post 20 days ago

Tencent Open-Sources Hunyuan 3D World Model 2.0: Generate Editable 3D Game Worlds with One Sentence, Compatible with Unity/UE
Tencent has officially released and open-sourced Hunyuan 3D World Model 2.0 (HY-World 2.0), enabling AI to evolve from video generation to creating playable, editable 3D worlds.
Core Highlights
Text / Image / Video → directly generate exportable 3D assets (Mesh / 3DGS / Point Cloud)
Seamlessly integrates with Unity / Unreal Engine for game maps and level prototyping
One-click reconstruction of digital twin scenes from single images/videos, no camera parameters required
Spatial Agent for intelligent navigation trajectories: no wall penetration, consistent spatial height
All-new HY-Pano-2.0 + WorldMirror 2.0 architecture, achieving SOTA in 3D reconstruction and novel view synthesis
Key Breakthrough
Unlike Genie 3 and Hunyuan 1.5, which only output videos, HY-World 2.0 generates re-editable 3D worlds that support collision, interaction, and engine import.
Application Scenarios
Game Development, Indoor Preview, Urban Planning, Digitalization of Cultural Heritage, Embodied AI Simulation

reacted to SeaWolf-AI's post 22 days ago

Darwin-27B-Opus: 86.9% on GPQA Diamond — World #5, Zero Training
We are excited to share Darwin-27B-Opus, a 27B model that achieved 86.9% on GPQA Diamond — ranking #5 globally on the HuggingFace leaderboard — without a single gradient update.
How? Darwin breeds pretrained models through evolutionary FFN crossbreeding. The father (Qwen3.5-27B) provides the reasoning architecture; the mother (Claude 4.6 Opus Reasoning Distilled) contributes structured chain-of-thought knowledge. CMA-ES automatically discovers optimal per-layer blending ratios — no human tuning required.
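The per-layer blending search in that recipe can be sketched on toy tensors. The snippet below uses a simple (1+λ) evolution strategy as a stand-in for full CMA-ES, and random matrices as stand-ins for the two parents' FFN weights; every name, size, and ratio here is illustrative, not taken from the Darwin release.

```python
import numpy as np

rng = np.random.default_rng(0)
n_layers = 4

# Hypothetical stand-ins for the two parents' per-layer FFN weight matrices.
father = [rng.normal(size=(8, 8)) for _ in range(n_layers)]
mother = [rng.normal(size=(8, 8)) for _ in range(n_layers)]

# A hidden "ideal" blend the search should recover (toy ground truth).
target_alphas = np.array([0.9, 0.1, 0.7, 0.3])
target = [a * f + (1 - a) * m for a, f, m in zip(target_alphas, father, mother)]

def blend(alphas):
    """Interpolate each FFN layer between the two parents."""
    return [a * f + (1 - a) * m for a, f, m in zip(alphas, father, mother)]

def fitness(alphas):
    """Lower is better: distance of the blended child from the target behavior."""
    return sum(np.linalg.norm(c - t) for c, t in zip(blend(alphas), target))

# (1+lambda) evolution strategy: a simplified stand-in for CMA-ES.
best = np.full(n_layers, 0.5)   # start halfway between the parents
sigma = 0.2                     # mutation step size
for gen in range(200):
    children = np.clip(best + sigma * rng.normal(size=(16, n_layers)), 0.0, 1.0)
    scores = np.array([fitness(c) for c in children])
    if scores.min() < fitness(best):
        best = children[scores.argmin()]
    sigma *= 0.98               # anneal the step size

print(np.round(best, 2))  # should approach target_alphas
```

In the real setting the fitness would be benchmark accuracy of the merged model rather than a distance to a known target, and CMA-ES would additionally adapt a full covariance matrix over the search directions instead of the fixed isotropic noise used here.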
The result surpasses the original Qwen3.5-27B (85.5%), GLM-5.1 (744B, 86.2%), and Qwen3.5-122B (86.6%). A 27B model outperforming 744B — with zero training, zero data, one GPU, ~2 hours.
We also confirmed hybrid vigor on Korean benchmarks: Darwin-27B-KR (2nd generation offspring) surpassed both parents on CLIcK, winning 7 out of 11 categories. The evolutionary optimizer independently assigned 93% of FFN from the Korean-specialized mother while preserving 93% of attention from the reasoning-specialized father โ autonomously validating our core principle: FFN carries knowledge, Attention carries reasoning.
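That module-level split (FFN mostly from one parent, attention mostly from the other) can be illustrated on a toy state dict. The tensor names, constant-valued weights, and ratios below are placeholders for illustration, not the actual Darwin checkpoints or merge code:

```python
import numpy as np

# Hypothetical flat state dicts for the two parents (names are illustrative).
father = {  # reasoning-specialized parent
    "layers.0.attn.qkv": np.ones((4, 4)),
    "layers.0.ffn.up":   np.ones((4, 4)),
}
mother = {  # knowledge-specialized parent
    "layers.0.attn.qkv": np.zeros((4, 4)),
    "layers.0.ffn.up":   np.zeros((4, 4)),
}

def crossbreed(father, mother, ffn_ratio=0.93, attn_ratio=0.07):
    """Blend FFN weights mostly from the mother, attention mostly from the father.

    The ratio is the fraction of each tensor taken from the mother.
    """
    child = {}
    for name in father:
        a = attn_ratio if ".attn." in name else ffn_ratio
        child[name] = a * mother[name] + (1 - a) * father[name]
    return child

child = crossbreed(father, mother)
```

With the father's tensors all ones and the mother's all zeros, the merged FFN tensor ends up at 0.07 (93% mother) and the attention tensor at 0.93 (93% father), matching the split described in the post; in practice the evolutionary optimizer discovers such ratios per layer rather than applying two global constants.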
Public release: 10 days → 300+ community derivatives, 120K+ downloads.
Links:
Darwin-27B-Opus: https://huggingface.co/FINAL-Bench/Darwin-27B-Opus
article: https://huggingface.co/blog/FINAL-Bench/darwin-gpqa
Darwin Family Collection: https://huggingface.co/collections/FINAL-Bench/darwin-family
If foundation models are raw ore, Darwin is the forge. We are just getting started.

Organizations
None yet