---
title: Chatbot
emoji: 💻
colorFrom: yellow
colorTo: green
sdk: gradio
sdk_version: 6.6.0
app_file: app.py
pinned: false
short_description: Chatbot
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# 🤖 Memory Chatbot
A multi-user, multi-thread AI chatbot with short-term and long-term memory, powered by LangGraph, LangChain, Groq (Qwen3-32B), and a Gradio UI. The chatbot can call a variety of tools and displays which tools were used directly in the chat interface.
## ✨ Features
| Feature | Description |
|---|---|
| 🧠 Short-term memory | Remembers conversation context within a thread using `InMemorySaver` |
| 💾 Long-term memory | Stores user preferences, profile, interests, and project info using `InMemoryStore` |
| 👥 Multi-user support | Each user has isolated memory and conversation threads |
| 🧵 Multi-thread support | Each user can have multiple independent chat sessions |
| 🔧 Tool use | 12 built-in tools (web search, weather, calculator, Python exec, and more) |
| 🏷️ Tool call badges | The UI shows which tools were called for each AI response |
| ⚡ Fast inference | Powered by Groq's ultra-fast LLM API |
## 🗂️ Project Structure
```
.
├── memory_chatbot.py   # LangGraph graph: memory nodes + agent chat node
├── app.py              # Gradio UI with multi-user / multi-thread support
├── tool.py             # All tool definitions (12 tools)
├── .env                # API keys (not committed)
└── README.md
```
## 🧩 Architecture
```
    User Message
         │
         ▼
┌─────────────────┐
│ memory_analyzer │ ← Detects if message contains info worth storing long-term
└────────┬────────┘
         │
         ▼
┌──────────────────┐
│ memory_retrieval │ ← Searches long-term store and injects relevant memories
└────────┬─────────┘
         │
         ▼
┌─────────────────┐
│    chat_node    │ ← LangChain agent with 12 tools; returns answer + tool log
└────────┬────────┘
         │
         ▼
     AI Response
  + Tool Call Badges
```
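The three-node flow above can be sketched as plain functions passing a shared state dict from node to node. This is an illustrative stand-in, not the project's actual code: the real graph is built with LangGraph's `StateGraph`, and the real nodes call the LLM rather than the toy keyword check and echo used here.

```python
def memory_analyzer(state: dict) -> dict:
    # Toy heuristic standing in for the LLM check: flag messages that
    # look like they contain info worth storing long-term.
    text = state["message"].lower()
    state["store_memory"] = any(kw in text for kw in ("i like", "my name is"))
    return state

def memory_retrieval(state: dict) -> dict:
    # Inject any previously stored memories into the prompt context.
    state["context"] = state.get("memories", [])
    return state

def chat_node(state: dict) -> dict:
    # The real node runs a LangChain agent with 12 tools; here we echo.
    state["response"] = f"Echo: {state['message']}"
    state["tool_log"] = []  # names of tools the agent called
    return state

def run_graph(state: dict) -> dict:
    # Nodes run in the fixed order shown in the diagram above.
    for node in (memory_analyzer, memory_retrieval, chat_node):
        state = node(state)
    return state

result = run_graph({"message": "My name is Ada"})
```

The key design point this illustrates is that each node only reads and writes the shared state, which is what lets LangGraph checkpoint it per `thread_id`.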
### Memory Types

- **Short-term (checkpointer):** Full conversation history per `thread_id`, managed automatically by LangGraph's `InMemorySaver`.
- **Long-term (store):** Key-value memories namespaced by `(user_id, category)`. Categories: `preferences`, `profile`, `interests`, `project`.
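The `(user_id, category)` namespacing can be pictured with a minimal stand-in for LangGraph's `InMemoryStore` (illustrative only; the real store has a richer API, including semantic search):

```python
from collections import defaultdict

class SimpleMemoryStore:
    """Toy namespaced key-value store mirroring the (user_id, category)
    layout used for long-term memory. Not LangGraph's actual InMemoryStore."""

    def __init__(self):
        self._data = defaultdict(dict)  # namespace tuple -> {key: value}

    def put(self, namespace: tuple, key: str, value: dict) -> None:
        self._data[namespace][key] = value

    def search(self, namespace: tuple) -> list:
        return list(self._data[namespace].items())

store = SimpleMemoryStore()
store.put(("user_abc", "preferences"), "units", {"value": "metric"})
store.put(("user_abc", "interests"), "topic", {"value": "astronomy"})

# Each lookup is scoped to one (user_id, category) namespace, which is
# what keeps users' memories isolated from each other.
prefs = store.search(("user_abc", "preferences"))
```

Because the namespace includes `user_id`, creating a new user automatically starts with an empty memory, while new threads for the same user still see that user's stored memories.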
## 🔧 Tools
| Tool | Description |
|---|---|
| `tool_tavily` | Web search via Tavily API |
| `time_date` | Returns today's date |
| `calculator` | Evaluates math expressions |
| `python_exec` | Executes arbitrary Python code |
| `get_weather` | Current weather for any city |
| `wikipedia_search` | Wikipedia summary search |
| `scrape_website` | Extracts text from a URL |
| `read_file` | Reads a local file |
| `format_json` | Pretty-prints JSON |
| `generate_sql` | Converts natural language to SQL |
| `system_info` | Returns OS/platform info |
| `save_user_preference` | Saves a user preference to memory |
## 🚀 Setup
### 1. Clone & install dependencies

```bash
git clone <your-repo-url>
cd memory-chatbot
pip install -r requirements.txt
```
Key dependencies:

```
langgraph
langchain
langchain-groq
langchain-tavily
gradio
wikipedia
beautifulsoup4
requests
python-dotenv
```
### 2. Configure API keys

Create a `.env` file in the project root:

```
GROQ_API_KEY=your_groq_api_key
TAVILY_API_KEY=your_tavily_api_key
```
Get your keys:
- Groq: https://console.groq.com
- Tavily: https://app.tavily.com
### 3. Run

```bash
python app.py
```

Then open http://localhost:7860 in your browser.
## 🖥️ UI Guide
```
┌─────────────────┬───────────────────────────────────────┐
│ 👤 Users        │                                       │
│ ┌────────────┐  │  [Chat history appears here]          │
│ │ user_abc   │  │                                       │
│ └────────────┘  │  User: What's the weather in Paris?   │
│ [+ New User]    │                                       │
│                 │  AI: It's 18°C and sunny in Paris.    │
│ 🧵 Threads      │  🔧 get_weather                       │
│ ● thread_123    │                                       │
│ ○ thread_456    │                                       │
│ [+ New Chat]    │  ┌───────────────────────────────┐    │
│                 │  │ Type message and press Enter  │    │
└─────────────────┴──┴───────────────────────────────┴────┘
```
- **New User** → creates an isolated user with fresh memory and a new thread
- **New Chat** → starts a new thread for the current user (long-term memory is preserved)
- **Tool badges** → appear below each AI reply showing which tools were called
## 🔍 How Tool Display Works
When the agent calls tools, `chat_node` inspects the returned messages for `tool_calls` and `ToolMessage` entries and builds a `tool_log` list. The Gradio UI renders these as colored HTML badges inline in the chat (Gradio's `Chatbot` renders HTML by default).

Example badge output:

🔧 get_weather  🔧 wikipedia_search
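The badge rendering can be sketched in a few lines. This is an illustrative version, not the app's exact markup: `render_tool_badges` and the inline styles are assumptions, but the idea is the same, since Gradio's `Chatbot` will display raw HTML placed in a message.

```python
def render_tool_badges(tool_log: list) -> str:
    """Turn a list of tool names into inline HTML badge spans."""
    badge = (
        '<span style="background:#eef;border-radius:8px;'
        'padding:2px 8px;margin-right:4px;">&#128295; {}</span>'
    )
    return "".join(badge.format(name) for name in tool_log)

# The tool_log built by chat_node might look like this after a turn
# that used the weather and Wikipedia tools:
html = render_tool_badges(["get_weather", "wikipedia_search"])
```

Appending this string to the AI message text is enough for the badges to show up beneath the reply in the chat window.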
## ⚙️ Configuration
| Parameter | Location | Default | Description |
|---|---|---|---|
| `model` | `memory_chatbot.py` | `qwen/qwen3-32b` | Groq model to use |
| `temperature` | `memory_chatbot.py` | `0` | LLM temperature |
| `max_results` | `tool.py` (Tavily) | `5` | Web search result count |
| memory categories | `memory_chatbot.py` | `preferences`, `profile`, `interests`, `project` | Long-term memory namespaces |
## ⚠️ Limitations
- Memory is in-process only: all memory resets when the server restarts. For persistence, replace `InMemorySaver` and `InMemoryStore` with database-backed alternatives (e.g., `PostgresSaver`, `RedisStore`).
- The `python_exec` tool executes arbitrary Python code with no sandboxing; use with caution in production.
- `scrape_website` does not handle JavaScript-rendered pages.
## 🛠️ Extending
- **Add a new tool:** Define it with `@tool` in `tool.py`, then add it to the `tools` list in `memory_chatbot.py`.
- **Persist memory to a database:** Replace `InMemorySaver()` with a LangGraph-compatible checkpointer and `InMemoryStore()` with a persistent store.
- **Change the LLM:** Swap `ChatGroq` for any LangChain-compatible chat model (OpenAI, Anthropic, etc.).
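The register-then-list pattern behind "add a new tool" can be shown with a dependency-free stand-in. The decorator below only mimics the shape of `langchain_core.tools.tool` (the real one wraps the function in a `Tool` object and reads its docstring as the description); the `shout` tool is a made-up example:

```python
tools = []  # stands in for the `tools` list passed to the agent

def tool(fn):
    """Toy stand-in for the @tool decorator: register fn and return it."""
    tools.append(fn)
    return fn

@tool
def shout(text: str) -> str:
    """Upper-cases the input (a trivial example of a new tool)."""
    return text.upper()

result = shout("hello")
```

In the real project the same two steps apply: decorate the function in `tool.py`, then make sure it appears in the `tools` list that `memory_chatbot.py` binds to the agent, or the LLM will never see it.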