SuperbEmphasis/Viloet-Eclipse-2x12B-v0.2-MINI

#1052
by SuperbEmphasis - opened

Can I please get your deluxe quant treatment of
SuperbEmphasis/Viloet-Eclipse-2x12B-v0.2-MINI
https://huggingface.co/SuperbEmphasis/Viloet-Eclipse-2x12B-v0.2-MINI

And
SuperbEmphasis/Viloet-Eclipse-2x12B-v0.2-MINI-Reasoning
https://huggingface.co/SuperbEmphasis/Viloet-Eclipse-2x12B-v0.2-MINI-Reasoning

Pretty please!

Thank you for your awesome work.

They are both queued! :D

You can check for progress at http://hf.tst.eu/status.html or regularly check the model
summary page at https://hf.tst.eu/model#Viloet-Eclipse-2x12B-v0.2-MINI-GGUF and https://hf.tst.eu/model#Viloet-Eclipse-2x12B-v0.2-MINI-Reasoning-GGUF for quants to appear.

Thank you for your awesome work.

Thank you for your awesome work as well. Model creators like you are what keep the entire open-source AI community alive. Creating and testing amazing models requires far more effort than quantizing them. I really like that you went with MoE for higher inference speed. MoE is quite underrated in my opinion.

Absolutely! I've been having a lot of fun with them, and learning about fine-tuning and creating datasets from Claude and DeepSeek.

The MoE architecture has always fascinated me. So when someone with 16 GB asked me about it and my 4x12B wouldn't fit, it was trivial to slice it down!
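For anyone wondering why a sparse MoE is fast at inference, here is a minimal illustrative sketch (the shapes, expert count, and router here are made-up toy values, not details of this model): a router scores all experts per token but only the top-k expert layers actually run, so per-token compute stays at k matmuls no matter how many experts you store. That is also why slicing a 4-expert model down to 2 experts mostly shrinks weights on disk and in VRAM rather than changing per-token cost.

```python
# Toy sketch of sparse top-k MoE routing (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

n_experts, top_k, d = 4, 2, 16           # e.g. 4 expert MLPs, 2 active per token
router_w = rng.normal(size=(d, n_experts))
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]

def moe_forward(x):
    """Route one token vector x through only its top-k experts."""
    logits = x @ router_w                 # one score per expert, shape (n_experts,)
    chosen = np.argsort(logits)[-top_k:]  # indices of the top-k experts
    gates = np.exp(logits[chosen])
    gates /= gates.sum()                  # softmax over just the chosen k
    # Only top_k expert matmuls execute; the other n_experts - top_k are skipped.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, chosen))

y = moe_forward(rng.normal(size=d))
print(y.shape)  # (16,)
```

Dropping experts from the list shrinks total parameters roughly in proportion, while the forward pass still does exactly `top_k` expert matmuls per token.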
