Does this work with low VRAM?
#3 opened by gio83dj
I have an RTX 5070 with 12 GB of VRAM.
Our current GitHub repo only supports inference that requires a minimum of 32 GB of VRAM. With the help of community support, we hope to make it work with a lower peak VRAM.
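To illustrate why peak VRAM can be much lower than total model size, here is a hypothetical back-of-the-envelope sketch (not the repo's actual code, and the layer count and sizes are made-up assumptions): if layers are streamed to the GPU one at a time instead of all being resident at once, peak usage drops to roughly one layer plus activations.

```python
# Hypothetical illustration only: compares peak GPU memory when the whole
# model is resident at once vs. when layers are offloaded to CPU RAM and
# copied in one at a time. The sizes below are assumptions, not the model's
# real footprint.

LAYER_SIZES_GB = [4.0] * 8  # assume 8 blocks of ~4 GB each -> 32 GB total

def peak_vram_all_at_once(layer_sizes):
    # Every layer is resident on the GPU simultaneously.
    return sum(layer_sizes)

def peak_vram_with_offload(layer_sizes, activation_gb=1.0):
    # Only the largest single layer plus its activations occupy the GPU
    # at any moment; the rest stay in CPU RAM until needed.
    return max(layer_sizes) + activation_gb

print(peak_vram_all_at_once(LAYER_SIZES_GB))   # 32.0
print(peak_vram_with_offload(LAYER_SIZES_GB))  # 5.0
```

The trade-off is speed: each layer must cross the PCIe bus every forward pass, so offloaded inference is slower but fits on cards like a 12 GB RTX 5070.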
chetwinlow1 changed discussion status to closed
Thank you.