---
title: LanggraphAgenticAI
emoji: π¨
colorFrom: blue
colorTo: red
sdk: streamlit
sdk_version: 1.42.0
app_file: app.py
pinned: false
---
# Project Setup

## Steps
1. **Open a New Folder**
2. **Create a New Virtual Environment**
3. **Install Requirements**

```bash
pip install -r requirements.txt
```
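Steps 2 and 3 can be sketched as shell commands (assuming a Unix-like shell; on Windows the activation path is `venv\Scripts\activate`):

```shell
# Create and activate an isolated environment for the project
python3 -m venv venv
. venv/bin/activate
# pip install -r requirements.txt   # run once requirements.txt exists
```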
4. **Define Folder Structure & Commit Code to GitHub**

   a. Create a new repository:

```bash
git init
```

   b. Add `.gitignore` and include `venv/`.

   c. Commit one file to establish the connection with GitHub.

   d. Verify the initial commit on GitHub.
5. **Create `src` Folder and Make It a Package**

```bash
mkdir src
```

Add `__init__.py` to make it a package.
6. **Define Project Structure**

Inside `src/`, create a folder `langgraphagenticai/` and add `__init__.py`. Inside it, create:

- `LLMs/` with `__init__.py`
- `Nodes/` with `__init__.py`
- `Graph/` with `__init__.py`
- `State/` with `__init__.py`
- `Tools/` with `__init__.py`
- `Vectorstore/` with `__init__.py`
- `ui/` with `__init__.py`
- Lastly, add `main.py`
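The layout above can be scaffolded with a short script; a sketch using only the standard library (folder names copied from the structure described here):

```python
from pathlib import Path

# Subpackage names mirror the README's project structure
SUBPACKAGES = ["LLMs", "Nodes", "Graph", "State", "Tools", "Vectorstore", "ui"]

def scaffold(root: str) -> None:
    """Create src/langgraphagenticai/ with one __init__.py per subpackage."""
    base = Path(root) / "src" / "langgraphagenticai"
    base.mkdir(parents=True, exist_ok=True)
    (Path(root) / "src" / "__init__.py").touch()
    (base / "__init__.py").touch()
    for name in SUBPACKAGES:
        pkg = base / name
        pkg.mkdir(exist_ok=True)
        (pkg / "__init__.py").touch()
    (base / "main.py").touch()

# scaffold(".")  # run from the project root
```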
7. **Create `app.py` Outside `src/`**

This is the main execution entry point.
## Start Writing Code
8. **LLM Setup**

   a. Inside `LLMs/`, create `qroqllm.py` to load the LLM via the Streamlit UI.
9. **UI Setup**

   a. Inside `ui/`, create:
      - `loadui.py`
      - `display_result.py`
   b. Create a config file: inside `ui/`, create `uiconfigfile.ini` (plain-text key-value pairs).
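For illustration, a minimal `uiconfigfile.ini` might look like this (the keys and values below are assumptions, not the project's actual settings):

```ini
[DEFAULT]
PAGE_TITLE = LangGraph: Agentic AI
LLM_OPTIONS = Groq
USECASE_OPTIONS = Basic Chatbot
```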
   c. Create a config parser: inside `ui/streamlit_ui/`, create `uiconfigfile.py` to read from `uiconfigfile.ini`.
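A sketch of what the parser could look like, using Python's standard `configparser` (the `Config` class name matches the import used later; the option names are illustrative assumptions):

```python
import configparser

class Config:
    """Reads UI options from uiconfigfile.ini."""

    def __init__(self, config_file: str = "uiconfigfile.ini"):
        self.config = configparser.ConfigParser()
        self.config.read(config_file)

    def get_page_title(self) -> str:
        # Fall back to a default title if the key is missing
        return self.config["DEFAULT"].get("PAGE_TITLE", "LangGraph App")

    def get_llm_options(self) -> list[str]:
        # Comma-separated values become a list of options
        raw = self.config["DEFAULT"].get("LLM_OPTIONS", "")
        return [item.strip() for item in raw.split(",") if item.strip()]
```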
   d. Implement `loadui.py`:

```python
from src.langgraphagenticai.ui.uiconfigfile import Config
```

      - Load sidebar options from the config file.
      - Store session state.
      - Load UI components (left & right sections).
10. **Running the UI**

Run the UI to check that it loads correctly:

```bash
streamlit run app.py
```
11. **Implement `main.py`**

   a. Execution starts here based on user choices.

   b. Load the LLM in `qroqllm.py`.
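A possible shape for the loader in `qroqllm.py`, sketched without the `langchain_groq` dependency (the real `ChatGroq` call is commented out, and the control-key names are assumptions):

```python
# from langchain_groq import ChatGroq   # real dependency, commented out here

class GroqLLM:
    """Builds an LLM from selections collected in the Streamlit sidebar."""

    def __init__(self, user_controls: dict):
        self.user_controls = user_controls

    def get_llm_model(self):
        api_key = self.user_controls.get("GROQ_API_KEY")
        model = self.user_controls.get("selected_groq_model")
        if not api_key or not model:
            raise ValueError("Groq API key and model selection are required")
        # return ChatGroq(api_key=api_key, model=model)
        return {"api_key": api_key, "model": model}  # stand-in return value
```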
   c. Implement `graph_builder.py` inside `Graph/`.
      - Import `State` inside `graph_builder.py`.
   d. Implement nodes, starting with `BasicChatbotNode`:

```python
from src.langgraphagenticai.nodes.basic_chatbot_node import BasicChatbotNode
```

      - Define edges, start, and end functions.
   e. Connect the graph in `main.py`:

```python
from src.langgraphagenticai.graph.graph_builder import GraphBuilder
```

      - Invoke the graph based on the use case.
      - Compile and stream results.
   f. Update display results: modify `display_result.py` to update UI results and call it in `main.py`.

```python
from src.langgraphagenticai.ui.streamlitui.display_result import DisplayResultStreamlit
```
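To make the node/edge/invoke pattern concrete, here is a dependency-free sketch of the idea. This is not the LangGraph API itself, just an illustration of node functions transforming a shared state along edges from start to end:

```python
from typing import Callable

START, END = "__start__", "__end__"

class MiniGraph:
    """Toy stand-in for a state graph: named nodes connected by edges."""

    def __init__(self):
        self.nodes: dict[str, Callable[[dict], dict]] = {}
        self.edges: dict[str, str] = {}

    def add_node(self, name: str, fn: Callable[[dict], dict]) -> None:
        self.nodes[name] = fn

    def add_edge(self, src: str, dst: str) -> None:
        self.edges[src] = dst

    def invoke(self, state: dict) -> dict:
        # Walk the edges from START to END, applying each node to the state
        current = self.edges[START]
        while current != END:
            state = self.nodes[current](state)
            current = self.edges[current]
        return state

def basic_chatbot_node(state: dict) -> dict:
    # Placeholder for an LLM call
    state["messages"].append("echo: " + state["messages"][-1])
    return state

graph = MiniGraph()
graph.add_node("chatbot", basic_chatbot_node)
graph.add_edge(START, "chatbot")
graph.add_edge("chatbot", END)
```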
12. **Run the Application**

```bash
streamlit run app.py
```
# CI/CD Setup: GitHub to Hugging Face

## 1. Set Up CI/CD Between GitHub and Hugging Face

- Create a new Space on Hugging Face; a unique URL will be generated for your Space.
- Open `main.yml` and update the last line: replace it with your Hugging Face Space's unique URL (it will include `$HF_TOKEN`).
- Update the README: the first section should contain the Streamlit configuration.
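A typical `main.yml` for this pattern looks roughly like the fragment below (the username, Space name, and secret name are placeholders you must replace with your own):

```yaml
name: Sync to Hugging Face Space
on:
  push:
    branches: [main]
jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0   # full history so the push is not rejected
      - name: Push to Hugging Face
        env:
          HF_TOKEN: ${{ secrets.HF_TOKEN }}
        # Last line: your Space's unique URL, authenticated with $HF_TOKEN
        run: git push https://<username>:$HF_TOKEN@huggingface.co/spaces/<username>/<space-name> main
```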
## 2. Deploy on Streamlit

- Go to the Streamlit website and add your GitHub repo and branch details.
- Deployment example: LangGraph Streamlit App

Your CI/CD pipeline should now be set up for seamless deployment!
# Chatbot with Tool Integration

## Steps to Implement
### 1. Add a Search Tool

- Navigate to the `Tools/` folder.
- Create a new file: `search_tool.py`.
- Add Tavily as a tool inside this file.
### 2. Create a Chatbot with Tool Node

- Navigate to the `Nodes/` folder.
- Create a new file: `Chatbot_with_tool_node.py`.
- Implement chatbot logic with tool integration inside this file.
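Conceptually, the chatbot-with-tool node either answers directly or hands off to the search tool. A dependency-free sketch of that branching (the trigger convention and function names are invented for illustration; real code binds Tavily through LangGraph's tool machinery):

```python
def search_tool(query: str) -> str:
    # Stand-in for a Tavily search call
    return "search results for: " + query

def chatbot_with_tool_node(state: dict) -> dict:
    """Route to the tool when the last message requests a search."""
    last = state["messages"][-1]
    if last.startswith("search:"):
        # Tool branch: run the search and append its result
        result = search_tool(last.removeprefix("search:").strip())
        state["messages"].append(result)
    else:
        # Direct-answer branch: placeholder for an LLM response
        state["messages"].append("echo: " + last)
    return state
```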
### 3. Modify Graph Builder

- Open `graph_builder.py`.
- Edit the file to include tool functionality.
### 4. Update the UI

- Modify the following UI components:
  - `Load_streamlit_ui.py`
  - `uiconfig.ini`
  - `display_results.py`
### 5. Run the Application

To start the chatbot, run the following command:

```bash
streamlit run app.py
```

Now your chatbot with tool integration should be live!
