exllamav3 quantizations of MiniMaxAI/MiniMax-M2.5.
2.00 bpw h6 (Quantizing)
3.00 bpw h6 81.613 GiB (Uploading)
4.00 bpw h6 (Quantizing)
5.00 bpw h6 (Quantizing)
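Once a quant finishes uploading, it can be fetched with the `huggingface_hub` CLI. A minimal sketch, assuming each bitrate is stored on its own branch named after the bpw size (e.g. `3.00bpw`) — the revision names are an assumption, not confirmed by this card, so check the repository's branch list first:

```shell
# Download the 3.00 bpw quant into a local directory.
# NOTE: the revision name "3.00bpw" is assumed, not confirmed;
# list the repo's branches on Hugging Face to find the real one.
huggingface-cli download MikeRoz/MiniMax-M2.5-exl3 \
    --revision 3.00bpw \
    --local-dir MiniMax-M2.5-exl3-3.00bpw
```

At roughly 81.6 GiB for the 3.00 bpw weights, make sure the target disk has enough free space before starting the download.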
Model tree for MikeRoz/MiniMax-M2.5-exl3
Base model
MiniMaxAI/MiniMax-M2.5