Spaces: Sleeping

Deployment retry

Browse files
- Dockerfile +5 -5
- README.md +2 -64
Dockerfile
CHANGED

````diff
@@ -10,19 +10,19 @@ WORKDIR /app
 COPY requirements.txt .
 
 RUN pip install --no-cache-dir -r requirements.txt
-
+# ENV PLAYWRIGHT_BROWSERS_PATH=/ms-playwright
+# RUN mkdir -p /ms-playwright
+RUN python -m playwright install --with-deps chromium
+# RUN chmod -R a+rX /ms-playwright
 
 RUN useradd -m -u 1000 user
 USER user
 
 COPY --chown=user . /app
 
-EXPOSE 7680
-
-ENV PORT=7680
 ENV PYTHONUNBUFFERED=1
 
 RUN chmod -R a+rX /ms-playwright || true
 
 # RUN uvicorn app_gui:app --host 0.0.0.0 --port $PORT
-CMD ["uvicorn", "app_gui:app", "--host", "0.0.0.0", "--port", "
+CMD ["uvicorn", "app_gui:app", "--host", "0.0.0.0", "--port", "8001", "--proxy-headers", "--forwarded-allow-ips=*"]
````
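The new `CMD` adds `--proxy-headers` and `--forwarded-allow-ips=*`, which tell uvicorn to trust the `X-Forwarded-*` headers set by the Space's reverse proxy and report the real client address instead of the proxy's. A minimal sketch of that trust policy (illustrative only — the function name and simplified logic here are ours, not uvicorn's internals, which live in `uvicorn.middleware.proxy_headers` and also handle networks and header chains more carefully):

```python
from typing import Optional


def client_ip(remote_addr: str, x_forwarded_for: Optional[str], trusted: set) -> str:
    """Resolve the apparent client IP behind a reverse proxy.

    If the direct peer (remote_addr) is a trusted proxy -- or "*" trusts
    everyone, as in the Dockerfile's --forwarded-allow-ips=* -- believe the
    X-Forwarded-For chain and take its left-most entry. Otherwise the header
    could be spoofed by the peer, so ignore it.
    """
    if x_forwarded_for and ("*" in trusted or remote_addr in trusted):
        return x_forwarded_for.split(",")[0].strip()
    return remote_addr
```

`--forwarded-allow-ips=*` is convenient on Spaces because the proxy's address is not fixed, but it means any direct peer can set these headers, so it should only be used when the app is not reachable except through the platform's proxy.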
README.md
CHANGED

````diff
@@ -6,67 +6,5 @@ colorTo: red
 sdk: docker
 pinned: false
 license: unknown
-short_description: Shopping Assistant built using Pydantic AI & Google Gemini
----
-
-# Shopping Assistant
-Shopping Assistant built using Pydantic AI & Google Gemini LLM.
-
-## Demo Video
-
-https://github.com/user-attachments/assets/9c60d0e7-a388-4da1-8e8d-58cb99d0af10
-
-Helps:
-- Create the optimal search query
-- Retrieve the available filters for the product on the website
-- Filter out products based on the user's preferences
-- Recommend the most suitable product as well as the top 10 candidates
-
-## Table of Contents
-- Introduction
-- Demo Video
-- Setup
-- Workflow
-- Results
-
-## Introduction
-It is an agentic system built using Pydantic AI, with Pydantic for data validation, so all data is consistently validated. Google Gemini 2.5 Flash Lite is used as the base LLM, and Flipkart is currently the shopping site. The agent has access to various tools:
-- get_pro_class: Extract the product category & type from the user's query.
-- prompt_user0: Prompt the user for further details about the product.
-- rephrase_query: Generate a very concise product search query, combining the original query & user-provided details.
-- get_best_site: Find the best website to search for the product.
-- get_site_filters: Retrieve all the available filters on the best site for the product.
-- prompt_user1: Prompt the user for further details about the product using the available filters.
-- get_candidates: Return the best candidates found for the product.
-
-The agent chooses among them depending on the workflow, the user's demands & its own logic.
-
-Ideal Workflow:
-
-```mermaid
-graph TD;
-A[User Input] --> B[Agent];
-B --> C[Product Class];
-B --> D[Prompt User for Product Details];
-B --> E[Optimize Search Query with All Details];
-B --> F[Get Best Site];
-B --> G[Get All Site Filters];
-B --> H[Prompt User for Which Filters to Use];
-B --> I[Search, Filter & Return Products];
-```
-
-## Setup
-To run the project locally:
-
-1. Clone the repository:
-`git clone https://github.com/eshan1347/shop_agent`
-
-2. Install Python dependencies:
-`pip install -r requirements.txt`
-
-3. Run:
-`python pydantic_ai_agents.py`
-
-## Results
-
-This agentic system can be further improved by increasing the number of tools available, so that even more freedom is afforded to the agent. Web scraping adds too much latency to the system; we will explore whether any APIs are available. Support for other sites will be added so that products from several sites can be retrieved & a more holistic recommendation returned.
+short_description: Shopping Assistant built using Pydantic AI & Google Gemini LLM
+---
````
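The removed README described an "ideal workflow" in which the agent calls its tools in a fixed order. That sequence can be sketched as plain functions (the tool names come from that README; the bodies here are stand-in stubs for illustration, not the project's real implementations):

```python
# Stub tools named after the removed README's tool list. In the real project
# these call the LLM, the user, and the shopping site; here they return
# hypothetical canned values so the control flow is runnable.
def get_pro_class(query): return "electronics/laptop"
def prompt_user0(query): return "16 GB RAM, under 60000"
def rephrase_query(query, details): return f"{query}, {details}"
def get_best_site(query): return "flipkart.com"
def get_site_filters(site, query): return ["RAM", "Price"]
def prompt_user1(filters): return {"RAM": "16 GB", "Price": "<60000"}
def get_candidates(site, query, chosen): return [{"name": "demo laptop"}]


def run_ideal_workflow(user_query):
    """Follow the README's ideal tool order: classify, gather details,
    optimize the query, pick a site, fetch filters, let the user choose
    filters, then search and return candidates."""
    get_pro_class(user_query)
    details = prompt_user0(user_query)
    query = rephrase_query(user_query, details)
    site = get_best_site(query)
    filters = get_site_filters(site, query)
    chosen = prompt_user1(filters)
    return get_candidates(site, query, chosen)
```

In the actual agent the order is not hard-coded: per the README, the model picks tools "depending on the workflow, the user's demands & its own logic".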