Lama Minteo (Lamamanx)
AI & ML interests: None yet
Organizations: None yet
Old model that works well · 1 · #1 opened 4 months ago by Lamamanx
What is the difference between the quants? · 5 · 5 · #3 opened 5 months ago by tarruda
Not working · 2 · #1 opened 6 months ago by Lamamanx
GPU Requirements · 3 · #4 opened over 1 year ago by masuya
Will any 120b model currently fit on a single 24GB VRAM card through any app I can run on PC? (aka 4090) · 15 · #1 opened almost 2 years ago by clevnumb