RedHatAI/gemma-3n-E4B-it-quantized.w4a16 is not a multimodal model

#1
by gabbo1995 - opened

Thanks for your work! I wanted to let you know, though, that with the example script you provided I receive the following exception:
RedHatAI/gemma-3n-E4B-it-quantized.w4a16 is not a multimodal model

I have vllm==0.10.0

I am getting the same error! @gabbo1995 were you able to find a solution?

Hello @Dan189, I am sorry, I have only just seen your message. I was not able to find a solution and I moved to another model :(
