Spaces:
Runtime error
Upload folder using huggingface_hub
- .Rhistory +0 -0
- .gitignore +13 -0
- MealPlans.pdf +0 -0
- README.md +67 -12
- README.txt +0 -0
- app.py +102 -0
- db_mealplans/index.faiss +0 -0
- db_mealplans/index.pkl +3 -0
- desktop.ini +10 -0
- images/DrMudassirAzeezKhan.jpg +0 -0
- images/faizan.jpeg +0 -0
- images/faizan.jpg +0 -0
- images/univ.jpg +0 -0
- rag.py +86 -0
- rag2.py +34 -0
- requirements.txt +39 -0
.Rhistory
ADDED
File without changes
.gitignore
ADDED
@@ -0,0 +1,13 @@
+# Python
+__pycache__/
+*.py[cod]
+*.env
+
+# Logs
+*.log
+
+# Streamlit Configs
+*.stale
+
+# Other
+.DS_Store
MealPlans.pdf
ADDED
Binary file (7.96 kB)
README.md
CHANGED
@@ -1,12 +1,67 @@
----
-title:
-
-
-
-
-
-
-
-
-
-
+---
+title: Diabetes_Diet_Bot
+app_file: app.py
+sdk: gradio
+sdk_version: 5.12.0
+---
+# 🍛 **DiabetesDietBot** - South Indian Meal Planner for Type 2 Diabetes
+
+## **Introduction**
+DiabetesDietBot is an **AI-powered chatbot** that provides **personalized meal plans** for individuals with **Type 2 Diabetes**, focusing on **South Indian dietary preferences**. The chatbot uses **Retrieval-Augmented Generation (RAG)** to retrieve relevant **meal plans** from a structured knowledge base (`MealPlans.pdf`), ensuring customized, nutritionally balanced recommendations.
+
+---
+
+## **🎯 Features**
+✔ **Personalized Meal Plans** – Based on **age, gender, dietary preference, and caloric needs**
+✔ **South Indian Focus** – Meal plans aligned with traditional **vegetarian, non-vegetarian, and no-onion/no-garlic** diets
+✔ **AI-Powered Recommendations** – Uses **FAISS indexing and the OpenAI GPT API** to surface the most relevant meal plans
+✔ **Gradio-Based Chat Interface** – User-friendly, mobile-friendly chatbot for instant diet advice
+✔ **Fast Retrieval** – FAISS-based vector search enables quick meal plan suggestions
+
+---
+
+## **🛠 Technologies Used**
+### **Backend:**
+- **LangChain** – RAG-based retrieval and AI-assisted meal planning
+- **FAISS** – Vector search for quick and relevant meal plan retrieval
+- **OpenAI GPT API** – AI-powered chatbot responses
+
+### **Frontend:**
+- **Gradio** – Interactive chatbot UI with a sleek, modern interface
+
+### **Data Source:**
+- **MealPlans.pdf** – Contains 40+ structured meal plans tailored for Type 2 Diabetes in South Indian contexts
+
+---
+
+## **📂 Project Files**
+- `app.py` → Main chatbot script (Gradio-based UI)
+- `rag.py` → Text processing & vector indexing with FAISS
+- `rag2.py` → Alternate FAISS indexing method
+- `MealPlans.pdf` → Core dataset for AI meal plan recommendations
+- `requirements.txt` → Dependencies for Hugging Face deployment
+
+---
+
+## **🚀 Deployment on Hugging Face**
+You can use **Hugging Face Spaces** to deploy DiabetesDietBot:
+
+### **1️⃣ Manual Deployment**
+1. **Go to [Hugging Face Spaces](https://huggingface.co/spaces)**
+2. Click **"New Space"** → Choose **Gradio** as the SDK
+3. **Upload the following files manually:**
+   - `app.py`
+   - `rag.py`
+   - `rag2.py`
+   - `MealPlans.pdf`
+   - `requirements.txt`
+4. **Restart the Space** and your chatbot will be live!
+
+---
+
+## **🔧 Local Installation & Running**
+To run the chatbot locally:
+
+### **1️⃣ Install Dependencies**
+```bash
+pip install -r requirements.txt
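The retrieve-then-prompt flow the README describes (find the most relevant meal-plan chunks, then hand them to the model as context) can be sketched without the FAISS/OpenAI dependencies. This is an illustration only: plain word overlap stands in for vector similarity, and the sample knowledge-base strings are made up.

```python
import re

def _words(s):
    # Lowercased alphabetic tokens, ignoring punctuation.
    return set(re.findall(r"[a-z]+", s.lower()))

def word_overlap(a, b):
    # Crude stand-in for embedding similarity: count shared words.
    return len(_words(a) & _words(b))

def retrieve(query, chunks, k=3):
    # Rank knowledge-base chunks against the query; keep the top k.
    return sorted(chunks, key=lambda c: word_overlap(query, c), reverse=True)[:k]

def build_prompt(query, chunks, k=3):
    # Retrieved chunks become the context portion of the LLM prompt,
    # mirroring the message construction in app.py.
    context = "\n\n".join(retrieve(query, chunks, k))
    return f"Provide a meal plan based on this query: {query}\n\n{context}"

kb = [
    "Vegetarian breakfast: idli with sambar, low glycemic load.",
    "Non-vegetarian lunch: grilled fish with brown rice.",
    "Vegetarian dinner: ragi dosa with tomato chutney.",
]
print(retrieve("vegetarian dinner options", kb, k=1)[0])
```

In the real app, FAISS's `similarity_search` plays the role of `retrieve` over embedding vectors rather than word counts.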
README.txt
ADDED
File without changes
app.py
ADDED
@@ -0,0 +1,102 @@
+import gradio as gr
+import os
+from dotenv import load_dotenv
+from langchain_community.vectorstores import FAISS
+from langchain_openai import OpenAIEmbeddings
+import openai
+import fitz  # PyMuPDF imports as `fitz`, not `PyMuPDF`
+
+# Load environment variables
+load_dotenv()
+
+# Load and process the MealPlans.pdf
+pdf_path = "MealPlans.pdf"
+
+def load_pdf_text(pdf_path):
+    text = ""
+    doc = fitz.open(pdf_path)  # fitz.open takes a file path directly
+    for page in doc:
+        text += page.get_text("text") + "\n"
+    doc.close()
+    return text
+
+if os.path.exists(pdf_path):
+    meal_plans_text = load_pdf_text(pdf_path)
+else:
+    raise FileNotFoundError("⚠️ Meal Plans PDF not found!")
+
+# Initialize OpenAI API Key
+openai_api_key = os.getenv("OPENAI_API_KEY")
+if not openai_api_key:
+    raise ValueError("⚠️ OpenAI API key not found! Please set the OPENAI_API_KEY environment variable.")
+
+# openai>=1.0 (pinned in requirements.txt) uses a client object
+client = openai.OpenAI(api_key=openai_api_key)
+
+# Initialize FAISS Vector Store with the Meal Plans
+try:
+    embeddings = OpenAIEmbeddings()
+    vector_store = FAISS.from_texts([meal_plans_text], embedding=embeddings)
+except Exception as e:
+    raise Exception(f"⚠️ Error initializing FAISS vector store: {str(e)}")
+
+# Function to Retrieve Personalized Meal Plans
+def get_diet_plan(user_input):
+    """Retrieve personalized meal plans based on user input."""
+    try:
+        docs = vector_store.similarity_search(user_input, k=3)
+        meal_suggestions = '\n\n'.join([doc.page_content for doc in docs])
+
+        messages = [
+            {"role": "system", "content": "You are a diabetes dietitian providing personalized South Indian meal plans."},
+            {"role": "user", "content": f"Provide a meal plan based on this query: {user_input}\n\n{meal_suggestions}"}
+        ]
+
+        response = client.chat.completions.create(
+            model="gpt-3.5-turbo",
+            messages=messages,
+            temperature=0.5
+        )
+
+        return response.choices[0].message.content
+    except Exception as e:
+        return f"⚠️ Error generating diet plan: {str(e)}"
+
+# 🎨 Fancy Gradio UI with Custom Styling
+css = """
+h1 {
+    text-align: center;
+    color: #008000;
+    font-size: 2.5rem;
+}
+body {
+    background: linear-gradient(to right, #f3f4f6, #e0f7fa);
+}
+#chatbot {
+    font-size: 1.2rem;
+    background: #ffffff;
+    padding: 10px;
+    border-radius: 10px;
+    box-shadow: 0px 4px 10px rgba(0, 0, 0, 0.1);
+}
+.gradio-container {
+    max-width: 700px;
+    margin: auto;
+}
+"""
+
+# Gradio Interface
+with gr.Blocks(css=css) as app:
+    gr.Markdown("# 🍛 **DiabetesDietBot** - South Indian Meal Planner for Type 2 Diabetes")
+
+    gr.Markdown("👋 Welcome! Enter details like **'Male, 40s, Vegetarian, Moderate Calories'** to get a **customized diabetes-friendly meal plan!**")
+
+    # ChatInterface calls fn(message, history); `chatbot` expects a component
+    chatbot = gr.ChatInterface(
+        fn=lambda message, history: get_diet_plan(message),
+        chatbot=gr.Chatbot(height=400),
+        textbox=gr.Textbox(placeholder="Enter your dietary needs...")
+    )
+
+    with gr.Row():
+        img = gr.Image("meal_plan_example.jpg", width=400, height=250, interactive=False)
+        gr.Markdown("### Example: South Indian Diabetes-Friendly Meal Plan")
+
+    gr.Markdown("### 🔗 Developed by [Dr. Syed Faizan](https://www.linkedin.com/in/drsyedfaizanmd/)")
+
+# Launch Gradio App
+app.launch()
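Gradio's `ChatInterface` invokes its `fn` with two arguments, the new message and the running history, while a stateless responder like `get_diet_plan` needs only the message. The bridge can be sketched as a plain-Python adapter (the adapter name is illustrative, not a Gradio API):

```python
# ChatInterface calls fn(message, history); wrap a single-argument
# responder so it fits that contract (name `as_chat_fn` is made up).
def as_chat_fn(responder):
    def chat_fn(message, history):
        # `history` holds prior turns; a stateless responder ignores it.
        return responder(message)
    return chat_fn

# Usage with any single-argument function in place of the real bot:
reply = as_chat_fn(lambda q: f"Meal plan for: {q}")("Male, 40s, Vegetarian", [])
print(reply)
```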
db_mealplans/index.faiss
ADDED
Binary file (49.2 kB)
db_mealplans/index.pkl
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fd6841681ca462d9b4c8b7e9bd7be999efdcda2a1bfc3f4d00314abac116b079
+size 4483
desktop.ini
ADDED
@@ -0,0 +1,10 @@
+[LocalizedFileNames]
+consolidatedengagedcitizenry.xlsx=@consolidatedengagedcitizenry,0
+consolidatedhousing.xlsx=@consolidatedhousing,0
+consolidatedcommunityservicesdata.xlsx=@consolidatedcommunityservicesdata,0
+Housing Stability Dashboard 2.pdf=@Housing Stability Dashboard 2,0
+Housing Stability Dashboard 1.pdf=@Housing Stability Dashboard 1,0
+Financial_Stability_Dashboard.pdf=@Financial_Stability_Dashboard,0
+Engaged and Connected Residents Dashboard.pdf=@Engaged and Connected Residents Dashboard,0
+Demographics Dashboard.pdf=@Demographics Dashboard,0
+Community Social Services Dashboard.pdf=@Community Social Services Dashboard,0
images/DrMudassirAzeezKhan.jpg
ADDED
images/faizan.jpeg
ADDED
images/faizan.jpg
ADDED
images/univ.jpg
ADDED
rag.py
ADDED
@@ -0,0 +1,86 @@
+# %% Packages
+import os
+import re
+from dotenv import load_dotenv
+from pypdf import PdfReader
+from langchain.text_splitter import RecursiveCharacterTextSplitter
+from langchain_community.vectorstores import FAISS
+from langchain_openai import OpenAIEmbeddings
+import openai
+
+# Load environment variables from .env file
+load_dotenv()
+
+# Ensure OpenAI API key is set
+openai_api_key = os.getenv("OPENAI_API_KEY")
+if not openai_api_key:
+    raise ValueError("⚠️ OpenAI API Key is missing. Set it in your .env file or environment variables.")
+
+# Create OpenAI client (openai v1.x API)
+client = openai.OpenAI(api_key=openai_api_key)
+
+# Load the Meal Plans PDF
+pdf_path = "MealPlans.pdf"
+
+if not os.path.exists(pdf_path):
+    raise FileNotFoundError("⚠️ Meal Plans PDF not found!")
+
+# Extract text from PDF
+reader = PdfReader(pdf_path)
+meal_texts = [page.extract_text().strip() for page in reader.pages if page.extract_text()]
+
+# Clean extracted text: drop a page-number line and the line after it
+cleaned_texts = [re.sub(r'\d+\n.*?\n', '', text) for text in meal_texts]
+
+# Split text into chunks
+char_splitter = RecursiveCharacterTextSplitter(
+    separators=["\n\n", "\n", ". ", " ", ""],
+    chunk_size=500,
+    chunk_overlap=50
+)
+
+# `document_chunks` must be a flat list of strings
+document_chunks = char_splitter.split_text("\n\n".join(cleaned_texts))  # Join pages into a single string first
+print(f"✅ Number of text chunks: {len(document_chunks)}")
+
+# Generate embeddings
+embeddings = OpenAIEmbeddings()
+
+# FAISS expects a list of strings (not nested lists)
+vector_store = FAISS.from_texts(document_chunks, embedding=embeddings)
+
+# Save FAISS index
+vector_store.save_local("db_mealplans")
+print("✅ FAISS index successfully created and saved for DiabetesDietBot.")
+
+# Define RAG Query Function
+def rag(query, n_results=5):
+    """Retrieve personalized meal plans based on dietary preferences."""
+    try:
+        # Query FAISS vector store
+        docs = vector_store.similarity_search(query, k=n_results)
+        retrieved_text = "; ".join([doc.page_content for doc in docs])
+
+        # Prepare AI prompt
+        messages = [
+            {"role": "system", "content": "You are a diabetes dietitian specializing in South Indian meal planning."},
+            {"role": "user", "content": f"Generate a personalized meal plan: {query}\n\n{retrieved_text}"}
+        ]
+
+        # openai v1.x chat completion call
+        response = client.chat.completions.create(
+            model="gpt-3.5-turbo",
+            messages=messages
+        )
+
+        return response.choices[0].message.content
+
+    except Exception as e:
+        return f"⚠️ Error generating diet plan: {str(e)}"
+
+# Example Query Test
+if __name__ == "__main__":
+    test_query = "Male, 50s, Vegetarian, Moderate Calories"
+    response = rag(query=test_query, n_results=5)
+    print("\n🛑 Sample Response:\n")
+    print(response)
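rag.py cleans each extracted page with `re.sub(r'\d+\n.*?\n', '', text)`. A small demonstration (the sample page text is made up) shows what that pattern removes: a run of digits up to a newline, plus the entire following line.

```python
import re

# Made-up page text in the shape pypdf tends to extract: a page number
# and a running header precede the actual meal plan content.
page = "12\nMeal Plans for Type 2 Diabetes\nBreakfast: idli with sambar\nLunch: ragi mudde\n"

# Same pattern as rag.py: digits + newline + the next line are dropped.
# Note the pattern is not anchored to a line start, so digits inside a
# content line that end just before a newline would also trigger it.
cleaned = re.sub(r'\d+\n.*?\n', '', page)
print(cleaned)
```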
rag2.py
ADDED
@@ -0,0 +1,34 @@
+from langchain_community.vectorstores import FAISS
+from langchain_openai.embeddings import OpenAIEmbeddings
+from PyPDF2 import PdfReader
+
+from dotenv import load_dotenv
+load_dotenv()
+
+# Function to extract text from PDF
+def extract_text_from_pdf(pdf_path):
+    reader = PdfReader(pdf_path)
+    text = ""
+    for page in reader.pages:
+        # extract_text() can return None for image-only pages
+        text += page.extract_text() or ""
+    return text
+
+# Path to the uploaded document
+pdf_path = r"C:\Users\sfaiz\OneDrive\Desktop\Official ALY 6080 Website\ALY_6080_Experential_learning_Group_1_Module_12_Capstone_Sponsor_Deliverable.pdf"
+
+# Extract text from the PDF
+document_text = extract_text_from_pdf(pdf_path)
+
+# Split the document into smaller chunks for FAISS
+chunk_size = 1000  # Adjust based on requirements
+document_chunks = [document_text[i:i+chunk_size] for i in range(0, len(document_text), chunk_size)]
+
+# Generate embeddings
+embeddings = OpenAIEmbeddings()
+
+# Create FAISS index
+vector_store = FAISS.from_texts(document_chunks, embeddings)
+
+# Save the index locally in the 'db' folder
+vector_store.save_local("db")
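Unlike rag.py's `RecursiveCharacterTextSplitter`, rag2.py slices the document into fixed 1000-character windows with no overlap, so a sentence can be cut across two chunks. A sketch of that slicing alongside an overlapping variant (the sizes below are toy values for illustration):

```python
def fixed_chunks(text, size=1000):
    # Non-overlapping fixed-size slices, as in rag2.py.
    return [text[i:i + size] for i in range(0, len(text), size)]

def overlapping_chunks(text, size=1000, overlap=100):
    # Each window starts `size - overlap` characters after the previous
    # one, so text cut at a boundary still appears whole in some chunk.
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

print(fixed_chunks("abcdefghij", size=4))  # ['abcd', 'efgh', 'ij']
print(overlapping_chunks("abcdefghij", size=4, overlap=2))
```

Overlap trades some index size for retrieval quality, which is why rag.py's splitter uses `chunk_overlap=50`.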
requirements.txt
ADDED
@@ -0,0 +1,39 @@
+# Hugging Face & Gradio
+huggingface_hub==0.25.2
+gradio==5.7.1
+
+# FAISS for Vector Search
+faiss-cpu==1.9.0
+
+# LangChain for Retrieval-Augmented Generation (RAG)
+langchain==0.3.15
+langchain-community==0.3.7
+langchain-huggingface==0.1.2
+langchain-openai==0.2.10
+
+# OpenAI API for LLM Responses
+openai==1.59.9
+
+# Data Handling & Processing
+numpy==1.26.4
+pydantic==2.9.2
+python-dotenv==1.0.1
+
+# PDF Processing
+PyMuPDF==1.21.1  # Make sure this version is supported
+
+# Search Functionality (Optional for Expanding Queries)
+duckduckgo-search==2.9.0
+
+# Interactive Shell for Debugging (Optional)
+ipython==8.29.0
+
+# File Type Detection (Optional)
+python-magic==0.4.27