Tiny Random MiniCPM-o-2_6

This is a tiny random version of openbmb/MiniCPM-o-2_6 generated for testing purposes in Optimum-Intel.

Model Details

  • Architecture: MiniCPMO (Qwen2 + SigLIP + Whisper + ChatTTS)
  • Hidden Size: 16
  • Num Layers: 2
  • Vocab Size: 151,700 (Same as original for tokenizer compatibility)
  • Parameters: ~5.02M
  • Size: ~9.6 MB (safetensors)
  • Tensor Type: BF16

Usage

This model contains random weights and is intended solely for integration testing in Optimum-Intel, not for inference. Because the weights are random, its outputs are meaningless.

How it was generated

Generated using a custom script that:

  1. Loads the original config.
  2. Shrinks hidden sizes to 16 and the number of layers to 1-2.
  3. Disables TTS initialization (the DVAE contains large hard-coded layers).
  4. Initializes random weights.