---
license: mit
tags:
- moe
- eve-swarm
- specialist
base_model:
- anthonym21/Eve-2-MoE-IT-272M
datasets:
- bitext/Bitext-customer-support-llm-chatbot-training-dataset
---
# Eve-2-MoE-NanoRouter-272M

Part of the Eve-2 Swarm. Handles intent classification and routing.

## Training

- Hardware: NVIDIA H200 SXM
- Method: Full Fine-Tuning (FFT)
- Samples: 25,000
- Time: 4.4 min
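## Routing sketch

To illustrate the router's role in the swarm, here is a minimal sketch of intent-based dispatch. The intent labels, specialist model names, and the keyword classifier below are hypothetical stand-ins, not part of the released model; the real NanoRouter uses its fine-tuned classification head instead of keyword matching.

```python
# Hypothetical sketch of intent-based routing in an Eve-2-style swarm.
# Intent labels, specialist names, and the keyword matcher are
# illustrative assumptions standing in for the learned classifier.

ROUTES = {
    "billing": "eve-2-billing-specialist",
    "shipping": "eve-2-shipping-specialist",
    "account": "eve-2-account-specialist",
}

KEYWORDS = {
    "billing": ("invoice", "refund", "charge"),
    "shipping": ("delivery", "track", "package"),
    "account": ("password", "login", "profile"),
}

def classify_intent(query: str) -> str:
    """Stand-in for the router's classifier: naive keyword match."""
    q = query.lower()
    for intent, words in KEYWORDS.items():
        if any(w in q for w in words):
            return intent
    return "account"  # fallback intent

def route(query: str) -> str:
    """Map a user query to the specialist model that should answer it."""
    return ROUTES[classify_intent(query)]

print(route("Where is my package?"))  # eve-2-shipping-specialist
```

In a deployed swarm, `route` would return a model handle and the query would be forwarded to that specialist for generation.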