MOE - Reasoning - Gated IQ Multi-Tier Models Collection: MoE models using a unique custom internal structure that augments reasoning/thinking by up to 300%, then switches to generation mode. • 9 items
Zephyr 7B Collection: models, datasets, and demos associated with Zephyr 7B. For the code used to train the models, see https://github.com/huggingface/alignment-handbook • 8 items • Updated Mar 2