Multi GPU support

#6
by andyproxis - opened

Is it possible to run this model across multiple GPUs and share VRAM? For example, I have 2x A2000 with 12 GB VRAM and 1x A16 (which has 4 GPUs with 16 GB VRAM each). I can't use this model right now because of insufficient VRAM.

Thank you for your answer.

That depends on the inference framework you use to run it.
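
For what it's worth, if the checkpoint loads through the Transformers library, installing `accelerate` and passing `device_map="auto"` will shard the weights across all visible GPUs (and spill to CPU if they still don't fit), so the VRAM of the A2000s and the A16's four GPUs can be pooled. A minimal sketch, assuming a standard causal-LM checkpoint; the model id below is a placeholder for this repo:

```python
# Minimal multi-GPU loading sketch. Requires `transformers` and `accelerate`.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "org/model-name"  # hypothetical placeholder; use this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",    # let accelerate shard layers across all visible GPUs
    torch_dtype="auto",   # keep the checkpoint's native precision
)

# Inputs go to the device that holds the first layers of the sharded model.
inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Dedicated inference servers (e.g. vLLM or text-generation-inference) have their own tensor-parallel options, so check the docs of whichever framework you actually deploy with.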
