
This is a quantization of Yi-VL-34B, including its vision transformer.

You currently need to apply this PR to make it work: https://github.com/ggerganov/llama.cpp/pull/5093. It adds the additional normalization steps to the projection.
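As a rough sketch, the PR can be checked out locally and the quantized model run with llama.cpp's `llava-cli`. The branch name and GGUF file names below are placeholders, not the actual file names in this repo:

```shell
# Build llama.cpp with PR #5093 applied (adds the projector normalization steps).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
git fetch origin pull/5093/head:yi-vl-projection   # fetch the PR into a local branch
git checkout yi-vl-projection
make llava-cli

# Run inference; the model and mmproj file names are placeholders.
./llava-cli \
  -m ./yi-vl-34b.Q2_K.gguf \
  --mmproj ./mmproj-yi-vl-34b.gguf \
  --image ./example.jpg \
  -p "Describe this image."
```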

Yi-VL-34B is prone to hallucinations; to me it looks like a rushed release, as if something went wrong in training. However, while the 6B variant was the second-worst LLaVA-style model I have tested, the 34B did show some strengths.

Model details:

- Format: GGUF (2-bit quantization)
- Model size: 34B params
- Architecture: llama