llama.cpp

#3
by decentralize303 - opened

Will it not work with llama.cpp (Jan)? I have 2 × RTX PRO 6000 and a bunch of RTX 3090s.
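For a mixed multi-GPU setup like this, llama.cpp can split a model across cards with its standard flags. A minimal sketch (the model path and the three-GPU split ratio are assumptions, not from the original post; `--n-gpu-layers`, `--split-mode`, and `--tensor-split` are real llama.cpp options):

```shell
# Offload all layers to GPU and split the model across three cards.
# --tensor-split gives per-GPU weight ratios, e.g. favoring the two
# larger cards over a 3090 (ratios here are illustrative).
./llama-server \
  -m /path/to/model.gguf \
  --n-gpu-layers 99 \
  --split-mode layer \
  --tensor-split 4,4,1
```

`CUDA_VISIBLE_DEVICES` can be set beforehand to control which GPUs llama.cpp sees and in what order.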
