MiniMax-M2-5-REAP-50 - GGUF

This repository contains GGUF format model files for Akicou/MiniMax-M2-5-REAP-50.

⚠️ Experimental Warning & Disclaimer

Please note that this is an experimental release. I am new to making GGUF models and generated these quantizations on Google Colab, so the files may not be perfect.

Additionally, due to hardware constraints, I have only been able to test the Q2_K quantization, as my local system cannot run the higher quants.

You are more than welcome to download and try the other quantizations. However, keep in mind that they are entirely untested on my end, and there is no guarantee of output quality or stability. Feel free to experiment, but adjust your expectations accordingly.

Available Quantizations

  • minimax-m2-5-reap-50-q2_k.gguf
  • minimax-m2-5-reap-50-q3_k_m.gguf
  • minimax-m2-5-reap-50-q4_0.gguf
  • minimax-m2-5-reap-50-q4_k_m.gguf
  • minimax-m2-5-reap-50-q5_k_m.gguf
  • minimax-m2-5-reap-50-q8_0.gguf
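
Before picking a file, it can help to estimate how much disk and memory each quantization needs. The sketch below is a rough back-of-the-envelope calculation for the 116B parameter count listed for this model; the bits-per-weight figures are approximate community-reported averages for these llama.cpp quant types, not values measured from these specific files.

```python
# Rough size estimates for a 116B-parameter model at common llama.cpp
# quantization levels. The bits-per-weight (bpw) figures are approximate
# averages (assumption), not measured from these files.

PARAMS = 116e9  # parameter count reported for this model

# Approximate average bits per weight for each quant type (assumption).
BPW = {
    "Q2_K": 2.6,
    "Q3_K_M": 3.9,
    "Q4_0": 4.5,
    "Q4_K_M": 4.8,
    "Q5_K_M": 5.7,
    "Q8_0": 8.5,
}

def est_size_gb(params: float, bpw: float) -> float:
    """Estimated weight size in gigabytes: params * bits / 8 bits per byte."""
    return params * bpw / 8 / 1e9

for quant, bpw in BPW.items():
    print(f"{quant:7s} ~{est_size_gb(PARAMS, bpw):6.1f} GB")
```

Actual file sizes will differ somewhat, since different tensors within a model are quantized at different precisions, and runtime memory also needs room for the KV cache and activations.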
Model details

  • Format: GGUF
  • Model size: 116B params
  • Architecture: minimax-m2
  • Downloads last month: 874

