---
title: LangGraph Agent
emoji: 🐠
colorFrom: red
colorTo: purple
sdk: docker
pinned: false
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

# LangGraph Agent - Modular Structure

A production-ready LangGraph application with 8 agentic checkpoints,
modular architecture, and Docker support.

## Project Structure

```
langgraph_agent/
├── app/
│   ├── config.py              # All settings (env-driven)
│   ├── state.py               # AgentState TypedDict
│   ├── nodes/
│   │   ├── router.py          # ✅ Checkpoint 3 - Conditional routing
│   │   ├── rag.py             # ✅ Checkpoint 2 - RAG retrieval
│   │   ├── llm_node.py        # ✅ Checkpoint 4 - Retries
│   │   ├── tool_executor.py   # ✅ Checkpoint 1 - Tool execution
│   │   ├── memory.py          # ✅ Checkpoint 5 - Memory
│   │   ├── hitl.py            # ✅ Checkpoint 6 - Human-in-the-Loop
│   │   ├── evaluation.py      # ✅ Checkpoint 7 - Evaluation
│   │   ├── guardrails.py      # ✅ Checkpoint 8 - Guardrails
│   │   └── output.py          # Final output node
│   ├── tools/
│   │   ├── calculator.py      # Math expression tool
│   │   └── weather.py         # Weatherstack API tool
│   ├── rag/
│   │   └── store.py           # FAISS vector store + retrieval
│   ├── graph/
│   │   └── builder.py         # Graph topology assembly
│   └── utils/
│       └── llm.py             # LLM singleton factory
├── tests/
│   └── test_nodes.py          # Unit tests (no API key needed)
├── main.py                    # CLI entry point
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
└── .env.example
```
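The retries noted for `llm_node.py` (Checkpoint 4) follow a standard pattern. A minimal pure-Python sketch is below; the decorator name and backoff schedule are illustrative, not the repo's actual code:

```python
import time
from functools import wraps


def with_retries(max_retries: int = 3, base_delay: float = 0.5):
    """Retry a flaky call, backing off exponentially between attempts (illustrative)."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(max_retries):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == max_retries - 1:
                        raise  # out of attempts: surface the error to the graph
                    time.sleep(base_delay * 2 ** attempt)
        return wrapper
    return decorator
```

A node function wrapped this way re-raises only after the final attempt, which is the kind of behavior a `MAX_RETRIES` setting would control.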

## Quickstart

### Local

```bash
cp .env.example .env
# Fill in GROQ_API_KEY and WEATHER_API_KEY

pip install -r requirements.txt
python main.py
```

### Docker

```bash
cp .env.example .env
# Fill in your API keys in .env

docker compose up --build
```

### Run tests (no API keys needed)

```bash
pip install pytest
pytest tests/
```

## Adding a new tool

1. Create `app/tools/my_tool.py` with a `@tool` function
2. Import it in `app/tools/__init__.py` and add to `ALL_TOOLS`
3. Done - the router and LLM binding pick it up automatically

## Environment Variables

| Variable         | Description                        | Default                     |
|------------------|------------------------------------|-----------------------------|
| GROQ_API_KEY     | Groq API key                       | required                    |
| WEATHER_API_KEY  | Weatherstack API key               | required for weather tool   |
| LLM_MODEL        | Groq model name                    | llama-3.3-70b-versatile     |
| LLM_TEMPERATURE  | LLM temperature                    | 0                           |
| MAX_RETRIES      | Max LLM retry attempts             | 3                           |
| EVAL_THRESHOLD   | Min quality score before retry     | 0.6                         |
| HITL_ENABLED     | Enable human approval gate         | true                        |