---
title: Tool Calling Agent (LangGraph)
emoji: ⚒️
colorFrom: indigo
colorTo: green
sdk: docker
app_file: main.py
pinned: false
---

# LangGraph Tool-Calling Agent

A research assistant powered by LangGraph and Chainlit that can search the web and query arXiv papers to answer questions.

## Features

- Multi-tool agent with web search capabilities (Tavily and DuckDuckGo)
- Academic research with arXiv integration
- Interactive Chainlit web interface
- Streaming responses with real-time tool usage visibility

## Local Development

1. Clone the repository
2. Create a virtual environment: `python -m venv .venv`
3. Activate the virtual environment:
   - Windows: `.venv\Scripts\activate`
   - macOS/Linux: `source .venv/bin/activate`
4. Install dependencies: `pip install -r requirements.txt`
5. Create a `.env` file with your API keys:

   ```
   OPENAI_API_KEY=your_openai_key_here
   TAVILY_API_KEY=your_tavily_key_here
   LANGCHAIN_API_KEY=your_langchain_key_here
   LANGCHAIN_TRACING_V2=true
   LANGCHAIN_PROJECT=tool-calling-agent
   ```

6. Run the application: `chainlit run main.py`
7. Open your browser to http://localhost:8000 (Chainlit's default port)

## Deploying to Hugging Face Spaces

1. Create a new Space on Hugging Face (https://huggingface.co/new-space)
2. Choose "Docker" as the Space SDK
3. Clone your Space repository
4. Copy your project files to the cloned repository
5. Add your API keys as repository secrets in the Space settings
6. Push your changes to Hugging Face
7. Your app will build and deploy automatically

### Environment Variables for Hugging Face

Make sure to add these environment variables in your Hugging Face Space settings:

- `OPENAI_API_KEY`
- `TAVILY_API_KEY`
- `LANGCHAIN_API_KEY`
- `LANGCHAIN_TRACING_V2`
- `LANGCHAIN_PROJECT`

## Project Structure

- `main.py` - Entry point and Chainlit handler
- `config.py` - Configuration management
- `tools.py` - Tool definitions and setup
- `graph.py` - LangGraph agent implementation
- `chainlit.yaml` - Chainlit configuration
- `Dockerfile` - Container definition for deployment
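
## Appendix: Dockerfile Sketch

The `Dockerfile` defines the container for deployment. A minimal sketch of what it might look like for a Chainlit app on a Docker Space (the base image and Python version are assumptions; Hugging Face Docker Spaces serve on port 7860 by default):

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Hugging Face Docker Spaces expect the app on port 7860.
EXPOSE 7860
CMD ["chainlit", "run", "main.py", "--host", "0.0.0.0", "--port", "7860"]
```

Binding to `0.0.0.0` is required so the app is reachable from outside the container; the port flag overrides Chainlit's default of 8000 used in local development.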
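
## Appendix: Configuration Sketch

`config.py` handles configuration management. A minimal sketch of how it might read the keys from the `.env` example and fail fast when one is missing (the function name `load_settings` and the returned dictionary shape are illustrative assumptions, not the project's actual API):

```python
import os

# Keys the agent cannot run without (from the .env example above).
REQUIRED_VARS = ["OPENAI_API_KEY", "TAVILY_API_KEY"]


def load_settings() -> dict:
    """Read API keys and tracing options from the environment.

    Raises RuntimeError for missing required keys so the app fails at
    startup instead of erroring mid-conversation.
    """
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {
        "openai_api_key": os.environ["OPENAI_API_KEY"],
        "tavily_api_key": os.environ["TAVILY_API_KEY"],
        "langchain_tracing": os.environ.get("LANGCHAIN_TRACING_V2", "false") == "true",
        "langchain_project": os.environ.get("LANGCHAIN_PROJECT", "tool-calling-agent"),
    }
```

Locally the variables come from `.env` (loaded, e.g., by `python-dotenv`); on Hugging Face Spaces they come from the repository secrets, so the same loader works in both environments.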