Error When Loading Model

#1
by Sportsandfragrance - opened

Tried it in the terminal and in LM Studio, and I get the same error message: "Unsupported model type: glm4_vision."

I have updated MLX, but I'm not sure what's wrong. Thanks for the help and all you do, Prince & company!

MLX Community org

Yes, I get the same message using mlx-vlm 0.3.9 on the command line (as per the example in the model card):

File "mlx_vlm/models/glm4v/vision.py", line 270, in __init__
    raise ValueError(f"Unsupported model type: {self.model_type}")
ValueError: Unsupported model type: glm4v_vision
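The traceback shows mlx-vlm rejecting the checkpoint's model_type string because the installed release doesn't recognize it yet; support for this model landed later in the git version. A rough sketch of the kind of guard that fires (the supported set here is illustrative only, not mlx-vlm's real list):

```python
# Illustrative guard: older mlx-vlm releases don't know "glm4v_vision",
# so the vision tower's __init__ raises before anything loads.
KNOWN_VISION_TYPES = {"qwen2_vl", "llava", "idefics3"}  # example values only

def init_vision_tower(model_type: str) -> str:
    if model_type not in KNOWN_VISION_TYPES:
        raise ValueError(f"Unsupported model type: {model_type}")
    return f"loaded {model_type}"
```

Upgrading to a build that includes the new type in its registry is the only real fix; there is no flag to bypass this check.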
MLX Community org

When installing the latest mlx-vlm from GitHub, it instead gives:

The tokenizer you are loading from 'mlx-community_GLM-4.6V-Flash-4bit' with an incorrect regex pattern: https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503/discussions/84#69121093e8b480e709447d5e. This will lead to incorrect tokenization. You should set the fix_mistral_regex=True flag when loading this tokenizer to fix this issue.

ValueError: Failed to process inputs with error: PreTrainedTokenizerFast._batch_encode_plus() got an unexpected keyword argument 'images'
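This second error suggests the loader fell back to a plain text tokenizer instead of the model's multimodal processor, so image inputs were passed to a method that doesn't accept them. A minimal sketch of that mismatch (hypothetical function names, not mlx-vlm's actual code):

```python
# A plain tokenizer only accepts text; a multimodal processor also takes images.
def text_only_tokenizer(text, **kwargs):
    # Mirrors PreTrainedTokenizerFast: an unknown kwarg such as 'images' blows up.
    unexpected = set(kwargs) - {"padding", "truncation"}
    if unexpected:
        raise TypeError(
            f"_batch_encode_plus() got an unexpected keyword argument {unexpected.pop()!r}"
        )
    return text.split()

def multimodal_processor(text, images=None, **kwargs):
    # A proper processor routes both modalities into the model inputs.
    return {"input_ids": text.split(), "pixel_values": images}
```

When the checkpoint's processor class is too new for the installed transformers, the fallback path ends up on the text-only side, which is why updating transformers (below) resolves it.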

MLX Community org

This solved the problem:

pip install git+https://github.com/huggingface/transformers.git
