[NEW] HuggingChat Omni
Introducing: HuggingChat Omni
HuggingChat returns, and it's smarter and faster than ever.
Stop picking models. Start chatting.
- 115+ available models - https://huggingface.co/chat/models
- 15+ providers available - powered by Hugging Face Inference Providers.
- One chat interface: HuggingChat
Available now for all Hugging Face users. Free users can use their inference credits; PRO users get 20x more credits to use.
Omni: the new default routing model
When you send a message, Omni analyzes what you need and routes you to the best model for that specific task.
Each route uses the best model for its task. You see which model handled your request while it streams.
Examples
| What you ask | Route | Model |
|---|---|---|
| "Help me decide between two job offers. One pays 20% more but requires relocation." | decision_support | deepseek-ai/DeepSeek-R1-0528 |
| "Create a React component for an image carousel with lazy loading" | code_generation | Qwen/Qwen3-Coder-480B-A35B-Instruct |
| "Write a short mystery story set in a lighthouse during a storm" | creative_writing | moonshotai/Kimi-K2-Instruct-0905 |
| "Translate this to French: The meeting has been rescheduled to next Tuesday" | translation | CohereLabs/command-a-translate-08-2025 |
Under the hood
Omni uses a policy-based routing system. Each route has:
- A clear description of what it handles
- A primary model best suited for that task
- Fallback models if the primary is unavailable
The router model analyzes your conversation and picks the matching route. It runs on every message and is fast, with a 10-second timeout. Credits to Katanemo for their routing model: katanemo/Arch-Router-1.5B
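The route policy described above (a description, a primary model, and fallbacks per route) can be sketched roughly like this. This is a minimal illustration, not chat-ui's actual code: the route entries are taken from the examples table, but the `select_model` helper and the `available` check are hypothetical stand-ins for what the router does after Arch-Router picks a route.

```python
# Minimal sketch of policy-based routing with primary/fallback models.
# Route names come from the examples above; the selection logic is illustrative.

ROUTES = {
    "code_generation": {
        "description": "Writing, reviewing, or refactoring source code",
        "primary": "Qwen/Qwen3-Coder-480B-A35B-Instruct",
        "fallbacks": ["deepseek-ai/DeepSeek-R1-0528"],
    },
    "translation": {
        "description": "Translating text between languages",
        "primary": "CohereLabs/command-a-translate-08-2025",
        "fallbacks": ["moonshotai/Kimi-K2-Instruct-0905"],
    },
}

def select_model(route_name: str, available: set[str]) -> str:
    """Return the route's primary model, or the first available fallback."""
    route = ROUTES[route_name]
    for model in [route["primary"], *route["fallbacks"]]:
        if model in available:
            return model
    raise RuntimeError(f"No model available for route {route_name!r}")
```

For example, if the primary translation model is down, `select_model("translation", available)` would fall back to the next model in the list instead of failing the request.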
What else is new
- Background generation tracking: Multiple conversations can generate at the same time. Switch between tabs and the app tracks what's still generating. Updates appear automatically when responses finish.
- Better streaming: Text renders faster and smoother. The app only updates what changed instead of re-rendering everything. Less flickering, especially in long responses with code blocks.
- Better UX: the interface was refined throughout the app, with fewer bugs and rough edges: code previews, smoother streaming, and more polish and attention to detail everywhere.
- Speed optimizations: Sessions stay active longer with automatic token refresh. Response times improved across the board. The whole app feels faster.
Run it yourself
HuggingChat is of course still 100% open source. It has never been easier to self-host your own instance.
Quick setup:
```shell
git clone https://github.com/huggingface/chat-ui
cd chat-ui
npm install
npm run dev
```
Only three environment variables need to be set in `.env` to get it working:
- `MONGODB_URL` - your MongoDB connection string
- `OPENAI_API_KEY` - your API key
- `OPENAI_BASE_URL` - your endpoint URL
You can also configure your own routes in a JSON file. Each route defines which models to use for specific tasks.
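As a rough illustration, a route entry in such a JSON file might look like the sketch below. The field names here are assumptions based on the routing description above (description, primary model, fallbacks), not the repo's actual schema; check the chat-ui repository for the real format.

```json
[
  {
    "name": "code_generation",
    "description": "Writing, reviewing, or refactoring source code",
    "primary_model": "Qwen/Qwen3-Coder-480B-A35B-Instruct",
    "fallback_models": ["deepseek-ai/DeepSeek-R1-0528"]
  }
]
```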
Check out the repo: github.com/huggingface/chat-ui
Hope you are as excited as we are about HuggingChat Omni! Please share your feedback and ideas in this thread.
Is it possible to import my conversations from the previous version of HuggingChat?
Yeah, this dumbing down of the system was totally worth nuking everyone's logs and assistants...? The performance improvements are nice if true, but how can you call this a better UX when so many basic features from the last version are missing? Even simple settings are gone: there are no options to delete or edit output, no way to tweak temperature or repetition penalty settings, and no way to give different chats different system prompts.
wow, I'm kind of surprised it's back. feels like a bit of a downgrade, but I'm assuming that it was a complete rework? hoping that more QoL features will be reintroduced.
we're so back
edit:
never mind, can't delete the conversation branch like before
edit 2:
and it now has a limit. It's been over six hours and I still can't continue the conversation
Thanks for getting this running
The last time I saw something this dead was the Sonic community in the meta era, from Sonic Lost World until Sonic Forces. But I still like the new version of HuggingChat.
Wait we are migrating the billing system, we should be able to do cool things after it :)
There has not been a new release update since June 2025 (v0.9.5).
With the previous main contributor Nathan Sarrazin (nsarrazin) gone, will we see chat-ui continually updated from now on?
Victor Mustar, are you the main contributor now?
I use chat-ui hosted locally. Will we see local MCP implemented/fixed? e.g.
https://github.com/huggingface/chat-ui/discussions/2027
Hopefully chat-ui will continue to be upgraded and supported by Hugging Face. Is that the case?
thanks!
Wait we are migrating the billing system, we should be able to do cool things after it :)
Explain.
There has not been a new release update since June 2025 (v0.9.5)
I just made a new release
Victor Mustar, are you the main contributor now?
For now yes.
Is there any sort of ETA on when we can upload our old chats? Alternatively, is there any way I can read the chats I downloaded from the pre-Omni HuggingChat?
Just use venice.ai; it will let you do pretty much anything, with higher quality than any of the current models available here, and 10 free prompts a day. The only downside is no branching whatsoever (which means alternate versions are deleted when you retry or edit a prompt), but you'll have to learn to work around that.
