---
title: LangGraph Agent
emoji: π
colorFrom: red
colorTo: purple
sdk: docker
pinned: false
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# LangGraph Agent – Modular Structure
A production-ready LangGraph application with 8 agentic checkpoints, modular architecture, and Docker support.
## Project Structure
```
langgraph_agent/
├── app/
│   ├── config.py            # All settings (env-driven)
│   ├── state.py             # AgentState TypedDict
│   ├── nodes/
│   │   ├── router.py        # ✅ Checkpoint 3 – Conditional routing
│   │   ├── rag.py           # ✅ Checkpoint 2 – RAG retrieval
│   │   ├── llm_node.py      # ✅ Checkpoint 4 – Retries
│   │   ├── tool_executor.py # ✅ Checkpoint 1 – Tool execution
│   │   ├── memory.py        # ✅ Checkpoint 5 – Memory
│   │   ├── hitl.py          # ✅ Checkpoint 6 – Human-in-the-Loop
│   │   ├── evaluation.py    # ✅ Checkpoint 7 – Evaluation
│   │   ├── guardrails.py    # ✅ Checkpoint 8 – Guardrails
│   │   └── output.py        # Final output node
│   ├── tools/
│   │   ├── calculator.py    # Math expression tool
│   │   └── weather.py       # Weatherstack API tool
│   ├── rag/
│   │   └── store.py         # FAISS vector store + retrieval
│   ├── graph/
│   │   └── builder.py       # Graph topology assembly
│   └── utils/
│       └── llm.py           # LLM singleton factory
├── tests/
│   └── test_nodes.py        # Unit tests (no API key needed)
├── main.py                  # CLI entry point
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
└── .env.example
```
## Quickstart

### Local

```bash
cp .env.example .env
# Fill in GROQ_API_KEY and WEATHER_API_KEY
pip install -r requirements.txt
python main.py
```
### Docker

```bash
cp .env.example .env
# Fill in your API keys in .env
docker compose up --build
```
### Run tests (no API keys needed)

```bash
pip install pytest
pytest tests/
```
## Adding a new tool

- Create `app/tools/my_tool.py` with a `@tool` function
- Import it in `app/tools/__init__.py` and add it to `ALL_TOOLS`
- Done – the router and LLM binding pick it up automatically
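As a concrete example of what a tool body might look like, here is a sketch of a safe math evaluator in the spirit of `calculator.py`, built only on the stdlib `ast` module. The function name and supported operators are assumptions; in the real project this would be wrapped with LangChain's `@tool` decorator and registered in `ALL_TOOLS`.

```python
import ast
import operator

# Whitelist of AST operator nodes -> Python functions.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}


def calculator(expression: str) -> float:
    """Evaluate a basic arithmetic expression without eval()."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError("unsupported expression")

    return walk(ast.parse(expression, mode="eval").body)


print(calculator("2 * (3 + 4)"))  # 14
```

Parsing with `ast` instead of calling `eval()` keeps arbitrary code (names, calls, attribute access) out of the tool, which matters when the input comes from an LLM.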
## Environment Variables

| Variable | Description | Default |
|---|---|---|
| `GROQ_API_KEY` | Groq API key | required |
| `WEATHER_API_KEY` | Weatherstack API key | required for weather tool |
| `LLM_MODEL` | Groq model name | `llama-3.3-70b-versatile` |
| `LLM_TEMPERATURE` | LLM temperature | `0` |
| `MAX_RETRIES` | Max LLM retry attempts | `3` |
| `EVAL_THRESHOLD` | Min quality score before retry | `0.6` |
| `HITL_ENABLED` | Enable human approval gate | `true` |
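The table above can be read as env-driven settings, roughly as `app/config.py` might load them. The variable names and defaults come from the table; the function name and returned structure are assumptions about the real module.

```python
import os


def load_config() -> dict:
    """Read settings from the environment, falling back to the
    documented defaults. Structure is illustrative, not the repo's
    actual config module."""
    return {
        "groq_api_key": os.getenv("GROQ_API_KEY"),        # required
        "weather_api_key": os.getenv("WEATHER_API_KEY"),  # required for weather tool
        "llm_model": os.getenv("LLM_MODEL", "llama-3.3-70b-versatile"),
        "llm_temperature": float(os.getenv("LLM_TEMPERATURE", "0")),
        "max_retries": int(os.getenv("MAX_RETRIES", "3")),
        "eval_threshold": float(os.getenv("EVAL_THRESHOLD", "0.6")),
        "hitl_enabled": os.getenv("HITL_ENABLED", "true").lower() == "true",
    }


config = load_config()
```

Numeric values arrive from the environment as strings, so they are cast explicitly; the boolean `HITL_ENABLED` is compared case-insensitively against `"true"`.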