chatllm.cpp adds support for this model
#16 opened by J22
chatllm.cpp now supports this model.
Initial tests show that this model is strong at solving some tricky math problems.
```
server ---chat :step3-vl -ngl all --max-length 10000 +detect-thoughts
```
As a bonus, since this model can mathematically support native resolution using only the global view, I have added an option to test this: `--set native-resolution 1`.
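For reference, the two pieces above can be combined into a single invocation. This is only a sketch assembled from the flags quoted in this post; the binary name and flag syntax are taken verbatim from the command above, and I have not verified any additional options:

```shell
# Serve step3-vl with full GPU offload, a 10000-token context limit,
# thought detection, and native-resolution handling enabled (flags as
# given in the post above; native-resolution is the new test option).
server ---chat :step3-vl -ngl all --max-length 10000 +detect-thoughts \
    --set native-resolution 1
```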
