adonishong
AI & ML interests: None yet
Organizations: None yet
How to make this model work with Claude Code for local deployment?
3 · #5 opened 3 months ago by adonishong
Is it possible to create a version without the MTP layer to save some VRAM?
1 · #1 opened 5 months ago by adonishong
Is it possible to create a version without the MTP layer to save some VRAM?
1 · 1 · #3 opened 5 months ago by adonishong
Looking forward to upgrading to kimi-vl-a3b-thinking-2506
1 · #1 opened 6 months ago by adonishong
Llama 4 not working with MLX VLM?
1 · #2 opened 9 months ago by leadangle
Please make V3-lite
❤️ 50 · 4 · #12 opened 12 months ago by rombodawg
How can we thank you enough, whale bros?
❤️ 56 · 10 · #1 opened about 1 year ago by KrishnaKaasyap
VRAM requirements?
1 · #2 opened over 1 year ago by joujiboi
Ollama support
1 · 1 · #9 opened over 1 year ago by Dao3