EXL3 quantization of Jan-v2-VL-high at 4 bits per weight (bpw), for use with the ExLlamaV3 inference library.
