# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
DeepSpaceSearch is an agentic deep research application built with Gradio and smolagents. It provides a chat interface where an AI agent can perform web searches, generate images, and deliver comprehensive answers by executing multi-step reasoning tasks. The application is designed to run on Hugging Face Spaces with OAuth authentication.
## Architecture
This is a single-file Gradio application (`app.py`) built on the smolagents framework. Key architectural components:
**Agent System** (`app.py:52-64`):
- Uses `CodeAgent` from smolagents as the core reasoning engine
- The agent orchestrates multi-step tasks using available tools
- Configured with `max_steps` to limit execution depth and `verbosity_level` for logging
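The interplay between the `max_steps` cap and the final-answer tool can be illustrated with a conceptual sketch in plain Python. This is not smolagents internals — just a toy loop showing why a bounded step budget needs an explicit final-answer signal to terminate with a result (all names here are illustrative):

```python
# Conceptual sketch of a bounded agent loop. NOT smolagents internals --
# just an illustration of how max_steps limits execution depth and why a
# final-answer tool is needed to terminate with a result.

def run_agent(task, plan_step, max_steps=6):
    """plan_step(task, history) returns ("final_answer", value) to finish,
    or ("tool", observation) for an intermediate step."""
    history = []
    for _ in range(max_steps):
        kind, payload = plan_step(task, history)
        if kind == "final_answer":   # the FinalAnswerTool equivalent
            return payload
        history.append(payload)      # record the tool observation
    return None                      # step budget exhausted, no answer


# A toy planner that "searches" twice, then answers.
def toy_planner(task, history):
    if len(history) < 2:
        return ("tool", f"search result {len(history) + 1}")
    return ("final_answer", f"answer to {task!r} using {len(history)} observations")
```

With the default budget, `run_agent("what is X?", toy_planner)` completes in three steps; with `max_steps=1` the loop exhausts its budget and returns `None`, which is the failure mode `max_steps` is meant to bound.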
**Tool Ecosystem** (`app.py:17-26`):
- `DuckDuckGoSearchTool`: Web search with rate limiting (5 results, 2.0s rate limit)
- `FinalAnswerTool`: Required for the agent to return structured final responses
- `image_generation_tool`: Remote tool loaded from the HF Space `black-forest-labs/FLUX.1-schnell` for image generation
- Tools are imported from smolagents or dynamically loaded using `Tool.from_space()`
**Model Layer** (`app.py:42-49`):
- Uses the `InferenceClientModel` wrapper around the Hugging Face Inference API
- Default model: `Qwen/Qwen2.5-Coder-32B-Instruct` (configurable via the `HF_MODEL_ID` env var)
- Model parameters (`max_tokens`, `temperature`, `top_p`) are user-configurable through the UI
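The default-with-override pattern for the model ID can be sketched as follows (variable names are illustrative; the exact code in `app.py` may differ):

```python
import os

# Model selection: the HF_MODEL_ID env var overrides the default.
DEFAULT_MODEL = "Qwen/Qwen2.5-Coder-32B-Instruct"
model_id = os.environ.get("HF_MODEL_ID", DEFAULT_MODEL)
```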
**Streaming Interface** (`app.py:66-71`):
- `stream_to_gradio()` converts agent execution steps into a Gradio-compatible message stream
- Messages are yielded as they're produced, showing agent reasoning in real time
- Uses `gr.ChatInterface` with the "messages" type format (role/content dictionaries)
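The "messages" format that `gr.ChatInterface` consumes is simply a stream of role/content dictionaries. A minimal stand-in for the streaming behavior (illustrative only — not `stream_to_gradio`'s actual implementation) looks like:

```python
# Minimal stand-in for streaming agent steps as Gradio "messages"-format
# dicts (role/content). Illustrative only -- not stream_to_gradio itself.

def fake_agent_steps():
    """Stand-in for an agent's intermediate reasoning steps."""
    yield "Searching the web..."
    yield "Found 5 results, summarizing..."
    yield "Final answer: 42"

def stream_messages(steps):
    """Yield each agent step as a role/content dict, the shape
    gr.ChatInterface with type="messages" expects."""
    for step in steps:
        yield {"role": "assistant", "content": step}

messages = list(stream_messages(fake_agent_steps()))
```

Because each dict is yielded as soon as the underlying step completes, the UI can render intermediate reasoning before the final answer arrives.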
**Authentication**:
- OAuth flow configured in README.md (`hf_oauth: true`, scope: `inference-api`)
- Token accessed via the `HF_TOKEN` environment variable (automatically set by HF Spaces with OAuth; set manually for local dev)
## Development Commands
**ALWAYS use a virtual environment before running locally.**
**Setup:**
```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```
**Run locally:**
```bash
python app.py
```
Launches the Gradio interface at http://127.0.0.1:7860.
**Environment Configuration:**

Create a `.env` file for local development:
```bash
HF_TOKEN=hf_...  # Your Hugging Face token with inference-api scope
HF_MODEL_ID=Qwen/Qwen2.5-Coder-32B-Instruct  # Optional: override the default model
```
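Since `python-dotenv` is among the dependencies, the `.env` file is presumably loaded at startup. A hedged sketch of that pattern (the guarded import is for illustration; `app.py` may load it unconditionally):

```python
import os

# python-dotenv is listed in requirements; fall back gracefully if absent.
try:
    from dotenv import load_dotenv
    load_dotenv()  # reads .env in local dev; harmless no-op on Spaces
except ImportError:
    pass

hf_token = os.environ.get("HF_TOKEN")  # None if not configured
```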
## Hugging Face Spaces Deployment
The app auto-deploys on HF Spaces via README.md metadata:
- `sdk: gradio` with `sdk_version: 5.42.0`
- `app_file: app.py`
- OAuth is handled automatically by the platform when `hf_oauth: true` is set
- The `HF_TOKEN` environment variable is automatically populated with the user's OAuth token
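Putting the fields above together, the README.md front matter looks roughly like this (the `title` value and the exact scope-list field layout are assumptions; only the fields listed above are confirmed by this document):

```yaml
---
title: DeepSpaceSearch
sdk: gradio
sdk_version: 5.42.0
app_file: app.py
hf_oauth: true
hf_oauth_scopes:
  - inference-api
---
```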
## Key Dependencies
- `smolagents[gradio]`: Agentic framework with Gradio streaming support
- `gradio[oauth]==5.42.0`: UI framework with OAuth support
- `duckduckgo_search` / `ddgs`: Web search backend
- `huggingface_hub`: Inference API client
- `python-dotenv`: Environment variable management for local development
## Adding New Tools
To add a tool to the agent:
**From smolagents built-ins:**
```python
from smolagents import YourTool

your_tool = YourTool()
agent = CodeAgent(tools=[search_tool, your_tool, final_answer], ...)
```
**From a Hugging Face Space:**
```python
your_tool = Tool.from_space(
    space_id="namespace/space-name",
    name="tool_name",
    description="What this tool does. Returns X.",
    api_name="/endpoint_name",
)
```
Always include `FinalAnswerTool()` in the tools list; it is required for the agent to return results.