MLX Model Manager - A GUI for running MLX models locally (beta, looking for feedback!)

#29
by tumma72 - opened
MLX Community org

Hey everyone! 👋

I've been working on a little tool called MLX Model Manager and thought some of you might find it useful.

What is it?

It's a web-based GUI that makes it easier to discover, download, and run MLX models on your Mac. No terminal commands needed (though you can still use them if you prefer!).

What can you do with it?

  • Browse & download models - Search through MLX-optimized models right from the UI, see sizes before downloading, track download progress
  • Manage server profiles - Create different configurations (model, port, context length, etc.) and save them for quick access
  • Run multiple servers - Start/stop servers with a click, see real-time metrics (memory, CPU, uptime) for each running instance
  • Menubar app - Lives in your macOS menu bar for quick access
  • Auto-start on login - Set your favorite models to launch automatically when you boot up

How to try it

pip install mlx-manager
mlx-manager serve

Then open http://localhost:8080 in your browser. That's it!

Or if you prefer the menubar app:
mlx-manager menubar
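Once a server profile is running, you can sanity-check it from the command line. A minimal sketch, assuming the managed server exposes an OpenAI-compatible `/v1/chat/completions` endpoint (as `mlx_lm.server` does) and that your profile uses port 10240; adjust the port to match your own configuration:

```python
import json
import urllib.request

# Hypothetical example: build a chat request for a model server that
# MLX Model Manager started on port 10240 (assumed; use your profile's port).
payload = {
    "messages": [{"role": "user", "content": "Say hello"}],
    "max_tokens": 32,
}
req = urllib.request.Request(
    "http://localhost:10240/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once a server is actually running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

You can also just use the built-in chat in the UI, but hitting the endpoint directly is handy for scripting against your running models.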

This is beta!

I'm releasing v1.0.3, but consider it beta quality. It works well for my workflow, but I'm sure there are rough edges I haven't found yet.

If you try it out, I'd love to hear:

  • What works well for you
  • What's confusing or broken
  • What features would make it more useful

GitHub: https://github.com/tumma72/mlx-manager (issues/PRs welcome!)

Cheers! ๐ŸŽ


Thanks to everyone who sent feedback privately; I really appreciate it, in the spirit of open source and an active community!

I have implemented several improvements and released version 1.1.

You can find it here: https://github.com/tumma72/mlx-manager/

What's new in 1.1:

  • Easy installation via Homebrew, and the manager can now run as a service
  • Better Hugging Face search that surfaces only MLX models, from mlx-community and other trusted MLX repositories
  • Improved UI for starting servers and better profile handling, including a system prompt that is always sent as the first message when testing models in the built-in chat
  • Attachment support in the chat for testing both text and multimodal models, with up to 3 images or videos

Let's keep trying, testing and suggesting 😀

Cheers! ๐ŸŽ
