---
language:
- en
tags:
- tiny-random
- testing
- minicpmo
- optimum-intel
license: apache-2.0
---

# Tiny Random MiniCPM-o-2_6

This is a tiny random version of [openbmb/MiniCPM-o-2_6](https://huggingface.co/openbmb/MiniCPM-o-2_6), generated for testing purposes in [Optimum-Intel](https://github.com/huggingface/optimum-intel).

## Model Details

- **Architecture**: `MiniCPMO` (Qwen2 + SigLIP + Whisper + ChatTTS)
- **Hidden Size**: 16
- **Num Layers**: 2
- **Vocab Size**: 151,700 (same as the original, for tokenizer compatibility)
- **Parameters**: ~5M
- **Size**: ~9.6 MB (safetensors)

## Usage

This model contains random weights and is intended **only for integration testing**, not for inference. It produces random output.

### How it was generated

Generated with a custom script that:

1. Loads the original config.
2. Shrinks the hidden size to 16 and the number of layers to 1-2.
3. Disables TTS initialization (the DVAE contains large hardcoded layers).
4. Initializes random weights.
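The shrinking step can be sketched roughly as follows. This is an illustrative sketch only: the `shrink_config` helper and the exact config field names are assumptions, not the actual script used to produce this checkpoint.

```python
# Illustrative sketch of the config-shrinking step. Field names are
# assumptions; the real MiniCPM-o config may differ.

def shrink_config(config: dict) -> dict:
    """Return a copy of a config with tiny-random dimensions."""
    tiny = dict(config)
    tiny["hidden_size"] = 16
    tiny["num_hidden_layers"] = 2
    tiny["num_attention_heads"] = 2
    tiny["intermediate_size"] = 32
    tiny["init_tts"] = False  # skip ChatTTS/DVAE init (large hardcoded layers)
    # vocab_size is deliberately left untouched for tokenizer compatibility
    return tiny

original = {
    "hidden_size": 3584,
    "num_hidden_layers": 28,
    "vocab_size": 151700,
}
tiny = shrink_config(original)
print(tiny["hidden_size"], tiny["num_hidden_layers"], tiny["vocab_size"])
# → 16 2 151700
```

After shrinking, the weights are randomly initialized from the tiny config, so the checkpoint stays small while keeping the original architecture and tokenizer interface.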