---
title: Agent From Scratch
emoji: 🤖
colorFrom: green
colorTo: blue
sdk: docker
app_port: 7860
---

# AI Agent Framework

A flexible framework for building AI agents with tool support, MCP integration, and multi-step reasoning.

## Features

- **Agent System**: Multi-step reasoning with tool execution
- **Tool Framework**: Easy tool creation and integration
- **MCP Integration**: Load tools from MCP servers
- **LLM Client**: Unified interface for LLM API calls via LiteLLM
- **Modular Design**: Clean, organized package structure

## Installation

### Prerequisites

Install uv (the recommended package manager):

- macOS/Linux:

  ```bash
  curl -LsSf https://astral.sh/uv/install.sh | sh
  ```

- Homebrew (macOS):

  ```bash
  brew install uv
  ```

### Setup

1. Create a virtual environment:

   ```bash
   uv venv
   source .venv/bin/activate
   ```

2. Install dependencies:

   ```bash
   uv pip install -r requirements.txt
   ```

3. Set up environment variables:

   ```bash
   cp .env.example .env
   # Edit .env and add your API keys (OPENAI_API_KEY, TAVILY_API_KEY, etc.)
   ```
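
After copying `.env.example`, it can help to confirm the keys are actually visible to the Python process. A small stdlib-only check (the key names below are the ones this README mentions; adjust the list for your setup):

```python
import os

# Keys mentioned in this README; extend for any other providers you use.
REQUIRED_KEYS = ["OPENAI_API_KEY", "TAVILY_API_KEY"]

def missing_keys(env=os.environ):
    """Return the required keys that are unset or empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]

if __name__ == "__main__":
    missing = missing_keys()
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
    print("All required API keys are set.")
```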

## Quick Start

```python
from agent_framework import Agent, LlmClient, FunctionTool

# Define a tool
def calculator(expression: str) -> float:
    """Calculate mathematical expressions."""
    return eval(expression)  # demo only; avoid eval on untrusted input

# Create the agent
agent = Agent(
    model=LlmClient(model="gpt-5-mini"),
    tools=[FunctionTool(calculator)],
    instructions="You are a helpful assistant.",
)

# Run the agent (agent.run is a coroutine, so call it from an async
# context, e.g. inside a function executed with asyncio.run)
result = await agent.run("What is 1234 * 5678?")
print(result.output)  # "7006652"
```

## Package Structure

```
agent_framework/
├── __init__.py      # Package exports
├── models.py        # Core data models (Message, ToolCall, Event, ExecutionContext)
├── tools.py         # BaseTool and FunctionTool classes
├── llm.py           # LlmClient and request/response models
├── agent.py         # Agent and AgentResult classes
├── mcp.py           # MCP tool loading utilities
└── utils.py         # Helper functions for tool definitions
```

## Usage Examples

### Using the `@tool` Decorator

```python
from agent_framework import tool

@tool
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

# multiply is now a FunctionTool instance
```
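
For intuition, a decorator like this can be built by wrapping the function and reading its signature and docstring. The sketch below is an assumption about the general approach, not the framework's actual `FunctionTool` implementation:

```python
import inspect

# Sketch of a FunctionTool-style wrapper (assumed, simplified): it keeps the
# callable and derives a name, description, and parameter listing from the
# function's metadata so an LLM can be told how to call it.

class FunctionTool:
    def __init__(self, fn):
        self.fn = fn
        self.name = fn.__name__
        self.description = inspect.getdoc(fn) or ""
        # Map each parameter to the name of its type annotation
        self.parameters = {
            p.name: getattr(p.annotation, "__name__", "any")
            for p in inspect.signature(fn).parameters.values()
        }

    def __call__(self, *args, **kwargs):
        return self.fn(*args, **kwargs)

def tool(fn):
    """Decorator form: @tool turns a plain function into a FunctionTool."""
    return FunctionTool(fn)
```

With this sketch, `multiply.name`, `multiply.description`, and `multiply.parameters` are available for building the tool definition sent to the model, while `multiply(3, 4)` still executes the original function.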

### MCP Tool Integration

```python
import os
from agent_framework import Agent, LlmClient, load_mcp_tools

# Stdio server spec: the command to launch the MCP server, its arguments,
# and the environment variables it needs
connection = {
    "command": "npx",
    "args": ["-y", "tavily-mcp@latest"],
    "env": {"TAVILY_API_KEY": os.getenv("TAVILY_API_KEY")}
}

# load_mcp_tools is a coroutine; await it from an async context
mcp_tools = await load_mcp_tools(connection)
agent = Agent(
    model=LlmClient(model="gpt-5-mini"),
    tools=mcp_tools,
)
```
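
Under the hood, a loader like `load_mcp_tools` has to turn each tool the MCP server lists (a name, description, and JSON input schema) into a callable the agent can invoke. A rough, transport-agnostic sketch of that adaptation step (all names assumed; the real loader speaks the MCP protocol over stdio):

```python
# Rough sketch of adapting an MCP tool listing into agent-callable functions.
# call_tool(name, arguments) stands in for the real MCP request to the server.

def wrap_mcp_tools(listed_tools, call_tool):
    wrapped = {}
    for spec in listed_tools:
        def make_caller(name):
            # Bind the tool name now so each closure calls the right tool
            return lambda **kwargs: call_tool(name, kwargs)
        wrapped[spec["name"]] = make_caller(spec["name"])
    return wrapped
```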

## Documentation

See [`agent_framework/README.md`](agent_framework/README.md) for detailed API documentation.

## License

See the [LICENSE](LICENSE) file for details.