Efficient MoE-based LLM Collection: Mixture-of-Experts Large Language Models with Advanced Quantization • 5 items • Updated 3 days ago
nota-ai/Solar-Open-100B-NotaMoEQuant-NVFP4: Text Generation • 59B • Updated 3 days ago
MMFineReason Collection: High-quality STEM reasoning dataset for Multimodal LLM post-training • 8 items • Updated 12 days ago
Efficient Large Vision-Language Model Collection: ERGO, an LVLM trained with RL on efficiency objectives; https://github.com/nota-github/ERGO • 3 items • Updated 20 days ago
Compressed Stable Diffusion 🌟 (Space): Compare image generation results from original and compressed AI models