Amazing! 👍🏻
A question out of curiosity:
What is the biggest LLM that could run locally on a laptop with the following configuration, without degrading overall laptop performance:
- 1× NVIDIA RTX 5090 laptop GPU, 24 GB VRAM
- Intel Core Ultra 9 CPU
- 64 GB RAM
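For context, here is the rough back-of-envelope math I used to frame the question (all numbers are assumptions: ~0.5 bytes per parameter at 4-bit quantization, and ~20% of VRAM reserved for KV cache and runtime overhead):

```python
def max_params(vram_gb: float, bytes_per_param: float = 0.5,
               overhead_frac: float = 0.2) -> float:
    """Estimate the largest model (in parameters) that fits in GPU memory.

    Assumptions (not measured values):
    - bytes_per_param=0.5 corresponds to 4-bit quantization.
    - overhead_frac reserves a fraction of VRAM for KV cache, activations,
      and framework overhead.
    """
    usable_bytes = vram_gb * (1 - overhead_frac) * 1e9
    return usable_bytes / bytes_per_param

# 24 GB VRAM at 4-bit → roughly a 38B-parameter model, fully on-GPU.
print(f"{max_params(24) / 1e9:.0f}B parameters")
```

By this estimate, a ~30B-class model quantized to 4-bit should fit entirely in 24 GB of VRAM, with larger models needing CPU/RAM offload.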
Thanks in advance for sharing your insights!
Prince Arora