# MiniCPM-V-4.5 (merged, schematic fine-tuned)

Fully merged model (base weights + LoRA adapter) for `openbmb/MiniCPM-V-4_5`, fine-tuned on PCB/schematic image descriptions. Weights are stored in BF16.

## Load

```python
from transformers import AutoModel, AutoTokenizer, AutoProcessor

repo = "foundation-models/minicpm-v-4.5-schematic-merged"
model = AutoModel.from_pretrained(repo, trust_remote_code=True, torch_dtype="bfloat16")
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
processor = AutoProcessor.from_pretrained(repo, trust_remote_code=True)
```
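Once loaded, the model can be queried on an image. A minimal sketch, assuming this repo exposes the same custom `chat()` helper as the upstream `openbmb/MiniCPM-V-4_5` card (the image path and prompt below are placeholders):

```python
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

repo = "foundation-models/minicpm-v-4.5-schematic-merged"
model = AutoModel.from_pretrained(
    repo, trust_remote_code=True, torch_dtype=torch.bfloat16
).eval().cuda()
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)

# Placeholder input: any RGB image of a board or schematic.
image = Image.open("board.png").convert("RGB")
msgs = [{"role": "user", "content": [image, "Describe this schematic."]}]

# MiniCPM-V ships a custom chat() method via trust_remote_code; the exact
# signature follows the upstream model card and may differ between releases.
answer = model.chat(msgs=msgs, tokenizer=tokenizer)
print(answer)
```

Note this requires a CUDA device and downloads the ~9B-parameter checkpoint on first use.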

## LoRA adapter

To use the adapter on its own, see `foundation-models/minicpm-v-4.5-schematic-lora`.
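As a sketch of the adapter-only path, the LoRA weights can be applied on top of the upstream base with `peft` (assumes `peft` is installed; the merged repo above makes this step unnecessary):

```python
from peft import PeftModel
from transformers import AutoModel, AutoTokenizer

# Load the upstream base model, then attach the schematic LoRA adapter.
base = AutoModel.from_pretrained(
    "openbmb/MiniCPM-V-4_5", trust_remote_code=True, torch_dtype="bfloat16"
)
model = PeftModel.from_pretrained(base, "foundation-models/minicpm-v-4.5-schematic-lora")

# Optionally bake the adapter into the base weights, reproducing the
# merged checkpoint published in this repo.
model = model.merge_and_unload()

tokenizer = AutoTokenizer.from_pretrained("openbmb/MiniCPM-V-4_5", trust_remote_code=True)
```

Keeping the adapter separate is useful when switching between the fine-tuned and base behavior; merging trades that flexibility for slightly faster inference.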

Model size: 9B parameters (Safetensors, BF16)
