openbmb/MiniCPM-o-2_6-int4

Any-to-Any
Transformers
Safetensors
multilingual
minicpmo
feature-extraction
minicpm-o
omni
vision
ocr
multi-image
video
custom_code
audio
speech
voice cloning
live streaming
realtime speech conversation
asr
tts
4-bit precision
gptq

Instructions to use openbmb/MiniCPM-o-2_6-int4 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.

  • Libraries
  • Transformers

    How to use openbmb/MiniCPM-o-2_6-int4 with Transformers:

    # Load the model directly; the repo ships custom code, so
    # trust_remote_code=True is required.
    from transformers import AutoModel

    model = AutoModel.from_pretrained(
        "openbmb/MiniCPM-o-2_6-int4",
        trust_remote_code=True,
        dtype="auto",  # older transformers releases call this torch_dtype
    )
  • Notebooks
  • Google Colab
  • Kaggle
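Loading is only half the story: the MiniCPM-o family's custom code exposes a `chat` method that takes a list of role/content messages, where content can mix text with images. The exact message shape below is an assumption based on the upstream MiniCPM-V/o model cards, not something stated on this page; verify against openbmb's documentation. A minimal sketch of building such a conversation:

```python
# Sketch of the conversation format for the MiniCPM-o family's custom
# `model.chat` API (assumed from the upstream model cards; verify before
# relying on it). `build_msgs` is a hypothetical helper, not part of the repo.
def build_msgs(question, image=None):
    """Build a single-turn message list; `image` is a PIL.Image or None."""
    content = [question] if image is None else [image, question]
    return [{"role": "user", "content": content}]

msgs = build_msgs("What is in the image?")
# The messages would then be passed to the loaded model, e.g.:
#   answer = model.chat(msgs=msgs, tokenizer=tokenizer)
print(msgs)
```

Text-only turns omit the image; multimodal turns put the image before the question in the content list.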
Community

AutoGPTQ does not support minicpmo

👍 4
#6 opened about 1 year ago by tarekmurad

Error when using cuda:1: Expected all tensors to be on the same device

1
#5 opened about 1 year ago by yg1031

AutoGPTQ has been moved to `ModelCloud/GPTQModel` and will not receive new features, only bug fixes.

👍 1
#4 opened over 1 year ago by hebangwen

Web server demo Int4

1
#3 opened over 1 year ago by lktinhtemp

Error while downloading model

8
#1 opened over 1 year ago by pranshu3105