---
title: Customer Support Agent
emoji: 🤖
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 5.12.0
app_file: app.py
pinned: false
runtime: docker
python_version: '3.11'
---
# 🧑‍💻 Intelligent Customer Support Agent with LangGraph 🤖
## 📖 Project Overview
This repository presents an intelligent customer support agent powered by LangGraph, a robust framework for building intricate workflows with language models. The agent classifies customer queries, evaluates sentiment, and delivers suitable responses or escalates inquiries when necessary. In this version, the agent leverages the browser-use library to consult online resources and uses a Gradio interface for testing.
## 💡 Purpose
Efficient and accurate customer support is essential in today's competitive business landscape. By automating initial customer interactions, response times can be significantly shortened, thereby enhancing customer satisfaction. This project demonstrates how advanced language models combined with a graph-based workflow can be used to create an effective and scalable support system capable of handling diverse customer concerns.
## 🧩 Core Components
- State Management: Utilizes `TypedDict` to track and manage the state of each customer interaction.
- Query Categorization: Classifies queries into categories such as Technical, Billing, or General.
- Sentiment Analysis: Assesses the emotional tone of customer queries.
- Response Generation: Constructs responses based on query category and sentiment.
- Escalation Protocol: Automatically escalates queries with negative sentiment to a human agent.
- Browser-Based Assistance: Employs the browser-use library to consult online sources for informed, up-to-date responses.
- Workflow Graph: Leverages LangGraph to build an adaptable and extendable workflow.
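As a concrete illustration of the state management and routing described above, here is a minimal sketch. The field names and routing rule are illustrative stand-ins (the real agent uses an LLM for categorization and sentiment), not the project's exact code:

```python
from typing import TypedDict

# Hypothetical state schema; field names are illustrative.
class SupportState(TypedDict):
    query: str
    category: str
    sentiment: str
    response: str

# Stand-in routing rule: escalate on negative sentiment,
# otherwise route by category.
def route_query(state: SupportState) -> str:
    if state["sentiment"] == "Negative":
        return "escalate"
    return {
        "Technical": "handle_technical",
        "Billing": "handle_billing",
    }.get(state["category"], "handle_general")
```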
## ⚙️ How It Works
- Setup: Install the necessary libraries and prepare the Python environment.
- State Design: Create a structure to store query data (category, sentiment, response).
- Node Functions: Define functions for categorization, sentiment analysis, and generating responses.
- Graph Design: Use `StateGraph` to establish the workflow, incorporating nodes and edges for the complete support process.
- Routing Logic: Conditionally route the query based on its category and sentiment.
- Graph Compilation: Compile the workflow into an executable application.
- Execution: Process customer queries through the workflow and present results via the Gradio UI.
## 📊 Workflow Diagram
The following diagram illustrates the workflow of the customer support agent, from query categorization and sentiment analysis to routing and escalation:
## 🌐 Browser Use
## ✅ Project Summary
This project showcases the versatility and power of LangGraph in creating AI-driven workflows. By merging natural language processing with graph-based techniques and augmenting capabilities with browser-based data fetching, we've built a customer support agent capable of efficiently addressing a wide range of queries. The system is easily extendable and can be integrated into existing support pipelines or databases.
## 🚀 Getting Started
Follow these steps to set up the project and run the customer support agent application.
### 1. Clone the Repository
Clone this repository to your local machine:

```bash
git clone https://github.com/alphatechlogics/LangGraphAgent.git
cd LangGraphAgent
```
### 2. Create a Virtual Environment
We recommend using uv to set up your Python environment:

```bash
uv venv --python 3.11
```

### 3. Activate the Virtual Environment

```bash
source .venv/bin/activate
```

Note: your prompt should now show the virtual environment name (e.g., `(LangGraphAgent)`).
### 4. Install Required Dependencies
Install the necessary packages. We use `uv pip` to ensure packages are installed within the virtual environment:

```bash
uv pip install langgraph langchain_openai gradio
uv pip install browser-use
playwright install
```
### 5. Set Up Your OpenAI API Key
Create a `.env` file in the project root directory and add your OpenAI API key. This is required to interact with the OpenAI models:

```bash
OPENAI_API_KEY=your_openai_api_key_here
```
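For reference, here is a stdlib-only sketch of how an app might read such a `.env` file at startup (the project may well use a library like python-dotenv instead; `load_env_file` is a hypothetical helper):

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Minimal .env loader: one KEY=VALUE per line; '#' comments and blanks ignored."""
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                # Don't override variables already set in the environment.
                os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # fall back to whatever is already in the environment
```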
### 6. Run the Application
After installing the dependencies and setting up the `.env` file, you can run the application:

```bash
python customer_support_gradio.py
```
### 7. Access the App
Once the app is running, open your browser and navigate to the URL shown in the terminal (typically http://127.0.0.1:7860) to interact with the customer support agent.

