isogen/Jan-v2-VL-high-exl3-4bpw
Tags: Safetensors · qwen3_vl · 4-bit precision · exl3
EXL3 quantization of Jan-v2-VL-high, 4 bits per weight.
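A quick back-of-the-envelope sketch of what 4 bits per weight implies for disk/VRAM footprint. The ~8B parameter count is an assumption inferred from the base model name (Qwen3-VL-8B-Thinking); real EXL3 files carry some extra overhead for quantization metadata, so treat the result as a lower bound on weight storage.

```python
def quantized_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Weight storage in GB: params * bpw / 8 bytes, using 1 GB = 1e9 bytes."""
    return n_params * bits_per_weight / 8 / 1e9

# Assumed ~8e9 parameters at 4.0 bpw -> roughly 4 GB of weights,
# versus ~16 GB for the same weights in 16-bit formats.
print(quantized_size_gb(8e9, 4.0))   # 4.0
print(quantized_size_gb(8e9, 16.0))  # 16.0
```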
Downloads last month: 1
Inference Providers
This model isn't deployed by any Inference Provider.
Model tree for isogen/Jan-v2-VL-high-exl3-4bpw
Base model: Qwen/Qwen3-VL-8B-Thinking
Finetuned: janhq/Jan-v2-VL-high
Quantized (14): this model