---
title: Agent From Scratch
emoji: 🤖
colorFrom: green
colorTo: blue
sdk: docker
app_port: 7860
---

# AI Agent Framework

A flexible framework for building AI agents with tool support, Model Context Protocol (MCP) integration, and multi-step reasoning.

## Features

- **Agent System**: Multi-step reasoning with tool execution
- **Tool Framework**: Easy tool creation and integration
- **MCP Integration**: Load tools from MCP servers
- **LLM Client**: Unified interface for LLM API calls via LiteLLM
- **Modular Design**: Clean, organized package structure
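
Conceptually, the multi-step reasoning loop lets the model alternate between requesting tool calls and producing a final answer, feeding each tool's result back in. A minimal, self-contained sketch of that idea (purely illustrative; `run_agent` and `fake_llm` are invented here and are not part of the framework's API):

```python
# Illustrative agent loop (NOT the framework's actual code): the model
# either requests a tool call or returns a final answer, and tool
# results are appended to the conversation until it finishes.

def run_agent(llm, tools, question, max_steps=5):
    """llm(messages) -> dict with either 'tool'/'args' or 'answer'."""
    messages = [{"role": "user", "content": question}]
    for _ in range(max_steps):
        reply = llm(messages)
        if "answer" in reply:            # model is done
            return reply["answer"]
        tool = tools[reply["tool"]]      # model asked for a tool
        result = tool(**reply["args"])
        messages.append({"role": "tool", "content": str(result)})
    raise RuntimeError("max steps exceeded")

# A scripted fake model: first asks for a tool, then answers with it.
def fake_llm(messages):
    if messages[-1]["role"] == "user":
        return {"tool": "multiply", "args": {"a": 6, "b": 7}}
    return {"answer": messages[-1]["content"]}

print(run_agent(fake_llm, {"multiply": lambda a, b: a * b}, "What is 6*7?"))
# 42
```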

## Installation

### Prerequisites

Install `uv` (recommended package manager):
- macOS/Linux:
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
- Homebrew (macOS):
```bash
brew install uv
```

### Setup

1. Create a virtual environment:
```bash
uv venv
source .venv/bin/activate
```

2. Install dependencies:
```bash
uv pip install -r requirements.txt
```

3. Set up environment variables:
```bash
cp .env.example .env
# Edit .env and add your API keys (OPENAI_API_KEY, TAVILY_API_KEY, etc.)
```
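
Under the hood, loading a `.env` file just means exporting each `KEY=value` line into the process environment. A minimal stdlib-only sketch (illustrative only; in practice the `python-dotenv` package's `load_dotenv` does this more robustly and is the usual choice):

```python
import os
from pathlib import Path

# Minimal .env loader for illustration: skip blanks and comments,
# split each remaining line on the first '=', and export the pair
# without overwriting variables already set in the environment.
def load_env_file(path: str = ".env") -> None:
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```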

## Quick Start

```python
import asyncio

from agent_framework import Agent, FunctionTool, LlmClient

# Define a tool
def calculator(expression: str) -> float:
    """Calculate mathematical expressions."""
    # NOTE: eval runs arbitrary Python; fine for a demo, but use a
    # restricted parser for untrusted input.
    return eval(expression)

# Create the agent
agent = Agent(
    model=LlmClient(model="gpt-5-mini"),
    tools=[FunctionTool(calculator)],
    instructions="You are a helpful assistant.",
)

# Agent.run is a coroutine, so call it from an async context
async def main():
    result = await agent.run("What is 1234 * 5678?")
    print(result.output)  # "7006652"

asyncio.run(main())
```
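
Since `eval` in the example above will execute arbitrary Python, it is risky when the expression comes from a model. A safer drop-in calculator (a sketch; `safe_calculator` is not part of the framework) restricts evaluation to arithmetic via the `ast` module:

```python
# A safer alternative to the eval-based calculator tool: parse the
# expression with ast and allow only numeric literals and arithmetic
# operators, rejecting everything else (names, calls, attributes...).
import ast
import operator

_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def safe_calculator(expression: str) -> float:
    """Calculate mathematical expressions without eval."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"unsupported expression: {expression!r}")
    return walk(ast.parse(expression, mode="eval").body)

print(safe_calculator("1234 * 5678"))  # 7006652
```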

## Package Structure

```
agent_framework/
β”œβ”€β”€ __init__.py      # Package exports
β”œβ”€β”€ models.py        # Core data models (Message, ToolCall, Event, ExecutionContext)
β”œβ”€β”€ tools.py         # BaseTool and FunctionTool classes
β”œβ”€β”€ llm.py           # LlmClient and request/response models
β”œβ”€β”€ agent.py         # Agent and AgentResult classes
β”œβ”€β”€ mcp.py           # MCP tool loading utilities
└── utils.py         # Helper functions for tool definitions
```

## Usage Examples

### Using the @tool Decorator

```python
from agent_framework import tool

@tool
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

# multiply is now a FunctionTool instance
```
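
Conceptually, such a decorator inspects the function's signature and docstring to build an LLM-facing tool definition. A rough, self-contained sketch of that idea (illustrative only; `tool_definition` is invented here, and the real `FunctionTool` may build its schema differently):

```python
# Sketch of deriving a tool definition from a Python function: map
# annotated parameter types to JSON Schema types and take the
# description from the docstring.
import inspect

_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def tool_definition(fn):
    params = {
        name: {"type": _JSON_TYPES.get(p.annotation, "string")}
        for name, p in inspect.signature(fn).parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {"type": "object", "properties": params,
                       "required": list(params)},
    }

def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

print(tool_definition(multiply)["parameters"]["properties"])
# {'a': {'type': 'number'}, 'b': {'type': 'number'}}
```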

### MCP Tool Integration

```python
import asyncio
import os

from agent_framework import Agent, LlmClient, load_mcp_tools

connection = {
    "command": "npx",
    "args": ["-y", "tavily-mcp@latest"],
    "env": {"TAVILY_API_KEY": os.getenv("TAVILY_API_KEY")},
}

# load_mcp_tools is a coroutine, so call it from an async context
async def main():
    mcp_tools = await load_mcp_tools(connection)
    agent = Agent(
        model=LlmClient(model="gpt-5-mini"),
        tools=mcp_tools,
    )

asyncio.run(main())
```

## Documentation

See `agent_framework/README.md` for detailed API documentation.

## License

See LICENSE file for details.