3B vs 120B parameter open-source LLMs running locally on a Mac. Your Mac might be more powerful than you think!
https://youtube.com/shorts/JdKtIdlnPs8?feature=share
I'm using an M4 Max chip with 128 GB of memory. I'm not running a local server; the model runs directly within Eric Chat.
Here's the source code if you'd like to learn more: