VRAM required to run inference
#7 · opened by niktheod
Hi. I have 16 GB of VRAM available, but when I try to run the inference code you provide on Hugging Face, I run out of memory. Any idea why this could be? Usually, for models of a similar size, 16 GB of VRAM is more than enough.
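One common cause of this kind of out-of-memory error is that `transformers` loads weights in float32 by default, which doubles the footprint of a checkpoint stored in half precision. A back-of-the-envelope check, assuming a hypothetical 7B-parameter model (substitute the real parameter count), shows why 16 GB can fall short:

```python
# Rough VRAM estimate for holding model weights only
# (excludes activations, KV cache, and CUDA overhead).
def weight_memory_gb(num_params: float, bytes_per_param: int) -> float:
    """Memory in GiB needed to store the weights at a given precision."""
    return num_params * bytes_per_param / 1024**3

params = 7e9  # hypothetical 7B-parameter model, for illustration

fp32 = weight_memory_gb(params, 4)  # default float32 load
fp16 = weight_memory_gb(params, 2)  # float16 / bfloat16 load

print(f"fp32: {fp32:.1f} GB, fp16: {fp16:.1f} GB")
# → fp32: 26.1 GB, fp16: 13.0 GB
```

If that is the issue, passing `torch_dtype=torch.float16` (or `torch.bfloat16`) to `from_pretrained` typically brings a model of this size back under 16 GB, with the remainder available for activations and the KV cache.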
niktheod changed discussion status to closed