# OpenMed-SynthVision-MedVL-AIO-GGUF

OpenMed's MedVL series (Qwen2.5-3B-MedVL, Qwen3.5-2B-MedVL, and Ministral-3B-MedVL) is a family of lightweight, specialized medical vision-language models, each fine-tuned from a strong open-source base (Qwen2.5-VL-3B, Qwen3.5-2B, and Ministral-3B, respectively) for clinical applications such as radiology report generation, medical VQA, pathology slide analysis, dermatology lesion identification, and multimodal diagnostics across X-rays, CT/MRI scans, histopathology images, and ophthalmic fundus photos. At roughly 2-3B parameters, the models target edge deployment on laptops and single GPUs through efficient architectures (Qwen's dynamic-resolution ViT with Gated DeltaNet, Ministral's optimized SLM design). They support medical reasoning, anatomical localization, disease classification, and structured report generation while preserving layout and spatial awareness for real-world hospital workflows.

## Model Files

### Qwen2.5-3B-MedVL

| File Name | Quant Type | File Size | File Link |
|---|---|---|---|
| Qwen2.5-3B-MedVL.BF16.gguf | BF16 | 6.18 GB | Download |
| Qwen2.5-3B-MedVL.F16.gguf | F16 | 6.18 GB | Download |
| Qwen2.5-3B-MedVL.F32.gguf | F32 | 12.3 GB | Download |
| Qwen2.5-3B-MedVL.Q8_0.gguf | Q8_0 | 3.29 GB | Download |
| Qwen2.5-3B-MedVL.mmproj-bf16.gguf | mmproj-bf16 | 1.34 GB | Download |
| Qwen2.5-3B-MedVL.mmproj-f16.gguf | mmproj-f16 | 1.34 GB | Download |
| Qwen2.5-3B-MedVL.mmproj-f32.gguf | mmproj-f32 | 2.67 GB | Download |
| Qwen2.5-3B-MedVL.mmproj-q8_0.gguf | mmproj-q8_0 | 848 MB | Download |

### Qwen3.5-2B-MedVL

| File Name | Quant Type | File Size | File Link |
|---|---|---|---|
| Qwen3.5-2B-MedVL.BF16.gguf | BF16 | 3.78 GB | Download |
| Qwen3.5-2B-MedVL.F16.gguf | F16 | 3.78 GB | Download |
| Qwen3.5-2B-MedVL.F32.gguf | F32 | 7.54 GB | Download |
| Qwen3.5-2B-MedVL.Q8_0.gguf | Q8_0 | 2.01 GB | Download |
| Qwen3.5-2B-MedVL.mmproj-bf16.gguf | mmproj-bf16 | 671 MB | Download |
| Qwen3.5-2B-MedVL.mmproj-f16.gguf | mmproj-f16 | 671 MB | Download |
| Qwen3.5-2B-MedVL.mmproj-f32.gguf | mmproj-f32 | 1.33 GB | Download |
| Qwen3.5-2B-MedVL.mmproj-q8_0.gguf | mmproj-q8_0 | 365 MB | Download |

### Ministral-3B-MedVL

| File Name | Quant Type | File Size | File Link |
|---|---|---|---|
| Ministral-3B-MedVL.BF16.gguf | BF16 | 6.87 GB | Download |
| Ministral-3B-MedVL.F16.gguf | F16 | 6.87 GB | Download |
| Ministral-3B-MedVL.F32.gguf | F32 | 13.7 GB | Download |
| Ministral-3B-MedVL.Q8_0.gguf | Q8_0 | 3.65 GB | Download |
| Ministral-3B-MedVL.mmproj-bf16.gguf | mmproj-bf16 | 850 MB | Download |
| Ministral-3B-MedVL.mmproj-f16.gguf | mmproj-f16 | 850 MB | Download |
| Ministral-3B-MedVL.mmproj-f32.gguf | mmproj-f32 | 1.68 GB | Download |
| Ministral-3B-MedVL.mmproj-q8_0.gguf | mmproj-q8_0 | 461 MB | Download |
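The file sizes in the tables above follow directly from parameter count and bits per weight, which is useful when picking a quant for a given RAM/VRAM budget. A minimal sketch (the parameter count is inferred here from the F16 file size; the ~8.5 bits/weight figure for Q8_0 is an approximation that accounts for its per-block scale overhead, and real files add a little metadata on top):

```python
def gguf_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate on-disk size of a GGUF model, ignoring metadata overhead."""
    return n_params * bits_per_weight / 8 / 1e9

# Qwen2.5-3B-MedVL: the F16 file is 6.18 GB, implying ~3.09B weights at 2 bytes each.
n_params = 6.18e9 / 2

print(round(gguf_size_gb(n_params, 16), 2))   # F16  -> 6.18 (matches table)
print(round(gguf_size_gb(n_params, 32), 2))   # F32  -> ~12.4 (table: 12.3 GB)
print(round(gguf_size_gb(n_params, 8.5), 2))  # Q8_0 -> ~3.28 (table: 3.29 GB)
```

The same arithmetic explains why BF16 and F16 files are byte-identical in size (both are 16-bit formats) and why F32 is exactly double them.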

## Model Sources

| Model Name | Link |
|---|---|
| OpenMed/Qwen3.5-2B-MedVL | https://huggingface.co/OpenMed/Qwen3.5-2B-MedVL |
| OpenMed/Qwen2.5-3B-MedVL | https://huggingface.co/OpenMed/Qwen2.5-3B-MedVL |
| OpenMed/Ministral-3B-MedVL | https://huggingface.co/OpenMed/Ministral-3B-MedVL |
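Because these are vision-language models, each run needs two GGUF files from the tables above: the language model and its matching `mmproj` vision projector. A sketch of invoking them with llama.cpp's multimodal CLI (this assumes `llama-mtmd-cli` from a recent llama.cpp build is on your `PATH`, both files have been downloaded, and `chest_xray.png` is a hypothetical local image):

```shell
#!/usr/bin/env sh
# Pair the Q8_0 language model with an f16 vision projector (mmproj);
# mixing precisions like this is common since the projector is small.
MODEL="Qwen2.5-3B-MedVL.Q8_0.gguf"
MMPROJ="Qwen2.5-3B-MedVL.mmproj-f16.gguf"

# Compose the llama.cpp multimodal CLI invocation. Echoed here rather
# than executed, since it requires the downloaded .gguf files and a
# llama.cpp build with multimodal (mtmd) support.
CMD="llama-mtmd-cli -m $MODEL --mmproj $MMPROJ --image chest_xray.png -p 'Describe the findings in this radiograph.'"
echo "$CMD"
```

Swapping in the Qwen3.5-2B or Ministral-3B file pair works the same way; the only requirement is that the model and `mmproj` files come from the same row group in the tables above.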