docling-project/SmolDocling-256M-preview - Image-Text-to-Text - Updated Sep 17, 2025 - 26.2k downloads - 1.61k likes
The Ultra-Scale Playbook (Space, 3.83k likes) - The ultimate guide to training LLMs on large GPU clusters
Article: "SmolVLM Grows Smaller - Introducing the 256M & 500M Models!" - Jan 23, 2025 - 192 upvotes