Muse Mell 12B

A SLERP merge of MagMell with Muse.

Note: I accidentally used float16 instead of bfloat16, which might weaken the merge slightly. Unless this combination turns out to be really good, I likely won't bother remerging it.
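A SLERP merge interpolates each pair of weight tensors along the arc between them rather than along the straight line, preserving their magnitudes better than plain averaging. Below is a minimal NumPy sketch of per-tensor SLERP; it is an illustration of the technique, not the actual mergekit implementation, and the interpolation factor `t` and the float16 cast mirror the setup described above as assumptions.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t: interpolation factor in [0, 1] (0 -> v0, 1 -> v1).
    Accumulates in float64, then casts back to v0's dtype
    (e.g. float16, as in the merge described above).
    """
    v0f = v0.flatten().astype(np.float64)
    v1f = v1.flatten().astype(np.float64)
    # Normalize to unit vectors to measure the angle between them.
    dot = np.clip(
        np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f)),
        -1.0, 1.0,
    )
    if abs(dot) > 1.0 - eps:
        # Nearly (anti)parallel tensors: fall back to linear interpolation.
        return ((1 - t) * v0f + t * v1f).reshape(v0.shape).astype(v0.dtype)
    theta = np.arccos(dot)
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0f + s1 * v1f).reshape(v0.shape).astype(v0.dtype)
```

Running this per tensor across two checkpoints and saving the result is the essence of a slerp merge; doing the final cast to bfloat16 instead of float16 keeps the wider exponent range the base models were trained with.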

Format: Safetensors · Model size: 12B params · Tensor type: F16

Model tree for Naphula/Muse-Mell-12B: merge model · Quantizations: 3 models
