Commit f887cdc · Parent: c02d04e · "adding ui design"

README.md CHANGED (@@ -3,56 +3,98 @@)
## The Problem

Retail investors face a massive information asymmetry compared to institutional hedge funds. They lack access to:

* Real-time aggregation of financial news and social media sentiment.
* Sophisticated tools to interpret raw financial data.
* Unbiased, data-driven analysis to counter emotional decision-making.

This platform was built to bridge that gap, providing a suite of AI-powered tools that mimic a team of hedge fund analysts, accessible to anyone for free.
## Features

* **Multi-Agent Pipeline:** A robust, asynchronous backend where specialized AI agents collaborate in a three-stage pipeline to build a complete analysis.
* **Data Agent:** Fetches real-time, comprehensive financial data, including key metrics, daily price action, and company officer information for any stock.
* **Intelligence Agent:** Scans the web in real time, scraping Google News, Yahoo Finance, and Reddit to gather and analyze market sentiment using a finance-tuned NLP model.
* **LLM Analyst Agent:** The "brain" of the operation. It uses **Google's Gemini 1.5 Flash** to synthesize all the collected quantitative and qualitative data, analyzing the last 100 days of price action and the latest news to generate a human-like investment thesis, complete with a 30-day forecast, bull/bear cases, and an actionable strategy.
* **Interactive Dashboard:** A clean, modern React frontend that visualizes the analysis, shows a real-time status of the agent pipeline, and displays the final report in a beautiful, easy-to-read format.
* **Job History:** Users can view and revisit all their past analyses, making the platform a persistent research tool.
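Conceptually, the three-stage pipeline behaves like three functions that each enrich a shared job record before handing it on. The sketch below is a minimal, dependency-free illustration of that control flow; in the real project each stage runs as a Celery task and the job record lives in PostgreSQL, and every field, status, and function name here is illustrative rather than the project's actual code:

```python
# Illustrative sketch of the three-stage agent pipeline. The job is a plain
# dict here so the control flow is easy to follow; all names are hypothetical.

def data_agent(job: dict) -> dict:
    # Stage 1: fetch fundamentals and recent price action (e.g. via yfinance).
    job["data"] = {"ticker": job["ticker"], "prices": [101.2, 102.8, 99.5]}
    job["status"] = "DATA_COMPLETE"
    return job

def intelligence_agent(job: dict) -> dict:
    # Stage 2: gather headlines and posts, then attach a sentiment score.
    job["sentiment"] = {"score": 0.42, "sources": ["google_news", "yahoo", "reddit"]}
    job["status"] = "INTEL_COMPLETE"
    return job

def analyst_agent(job: dict) -> dict:
    # Stage 3: synthesize everything into a thesis (the real agent calls Gemini).
    job["thesis"] = (
        f"{job['ticker']}: sentiment {job['sentiment']['score']:+.2f} "
        f"over {len(job['data']['prices'])} sessions."
    )
    job["status"] = "COMPLETE"
    return job

def run_pipeline(ticker: str) -> dict:
    job = {"ticker": ticker, "status": "PENDING"}
    for stage in (data_agent, intelligence_agent, analyst_agent):
        job = stage(job)  # each stage persists its update before the next runs
    return job

print(run_pipeline("AAPL")["status"])  # COMPLETE
```

Because each stage writes its status back before the next begins, the frontend can show live per-stage progress while a job runs.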
## Tech Stack & Architecture

This project is built with a modern, scalable, and containerized architecture.

### Frontend

* **React (Vite):** For a high-performance, modern user interface.
* **Tailwind CSS:** For professional, utility-first styling and a responsive design.
* **Recharts:** For creating beautiful and interactive data visualizations.
* **Axios:** For seamless and robust communication with the backend API.

### Backend

* **FastAPI:** A high-performance Python framework for building the core API.
* **Celery & Redis:** Manage the asynchronous, multi-agent pipeline, keeping the UI fast and responsive while the heavy lifting happens in the background.
* **PostgreSQL (Neon):** A scalable, serverless cloud database for storing all job data and results.
* **SQLAlchemy & Alembic:** The industry standard for robust database interaction and schema migrations.
* **LangChain & Google Gemini 1.5 Flash:** The core AI engine for the Analyst Agent, enabling sophisticated reasoning and report generation.
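The FastAPI + Celery + Redis combination above amounts to a queue that decouples the fast HTTP handler from the slow agent pipeline. Here is a stdlib-only sketch of that decoupling, where an in-memory dict stands in for the Neon database and `queue.Queue` stands in for Redis; every name is illustrative, not the project's actual code:

```python
# Hypothetical sketch of the API/worker decoupling (FastAPI + Celery in the
# real stack). The HTTP handler only records and enqueues; a background
# worker does the slow analysis.
import itertools
import queue
import threading

jobs = {}                   # stands in for the jobs table in Neon/PostgreSQL
task_queue = queue.Queue()  # stands in for the Redis broker
_ids = itertools.count(1)

def create_job(ticker: str) -> int:
    """Roughly what POST /jobs does: record the job, enqueue it, return at once."""
    job_id = next(_ids)
    jobs[job_id] = {"ticker": ticker, "status": "PENDING"}
    task_queue.put(job_id)
    return job_id

def worker() -> None:
    """Roughly what the Celery worker does: pull job ids and run the pipeline."""
    while True:
        job_id = task_queue.get()
        if job_id is None:  # shutdown sentinel for this demo
            task_queue.task_done()
            break
        jobs[job_id]["status"] = "COMPLETE"  # the real worker runs the 3 agents here
        task_queue.task_done()

threading.Thread(target=worker, daemon=True).start()
jid = create_job("MSFT")    # returns immediately; analysis happens in the background
task_queue.join()           # wait for the demo only; the real API never blocks
print(jobs[jid]["status"])  # COMPLETE
task_queue.put(None)        # stop the demo worker
```

The design choice this illustrates: the HTTP request finishes in milliseconds regardless of how long the analysis takes, which is why the UI stays responsive.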

### How the Architecture Works

The system is designed as a decoupled, multi-service application orchestrated by Docker Compose.
```mermaid
graph TD
    subgraph "User's Browser"
        A[React Frontend]
    end

    subgraph "Backend Services (Docker)"
        B(FastAPI Backend API)
        C[Redis Queue]
        D(Celery Worker)
    end

    subgraph "External Services"
        E[(Neon DB)]
        F{Data Sources<br/>(yfinance, Google, Reddit)}
        G{Google AI<br/>(Gemini 1.5 Flash)}
    end

    A -->|1. POST /jobs (ticker)| B;
    B -->|2. Creates Job Record| E;
    B -->|3. Dispatches Task| C;
    C -->|4. Worker Picks Up Task| D;
    D -->|5. Agent 1: Data| F;
    D -->|6. Updates DB| E;
    D -->|7. Agent 2: Intelligence| F;
    D -->|8. Updates DB| E;
    D -->|9. Agent 3: Analyst| G;
    D -->|10. Final Update| E;
    A -->|11. GET /jobs/{id} (Polling)| B;
    B -->|12. Reads Job Status/Result| E;
```
1. The user enters a ticker on the React Frontend.
2. A request is sent to the FastAPI Backend, which creates a job record in the Neon Database.
3. FastAPI dispatches the main analysis task to the Redis Queue.
4. The Celery Worker picks up the task and begins the three-stage pipeline.
5. It calls the Data Agent tools to fetch data from yfinance.
6. It then calls the Intelligence Agent tools to scrape Google News & Reddit.
7. Finally, it calls the Analyst Agent, which sends all the collected data in a detailed prompt to the Gemini 1.5 Flash API.
8. The worker saves the final, complete report to the database.
9. Meanwhile, the React frontend polls the API every few seconds, updating the UI with the live status of the pipeline until the final result is ready to be displayed.
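The frontend's polling against GET /jobs/{id} can be sketched independently of React (in Python, for consistency with the backend code). Below, `fetch_job` is a stand-in for the HTTP call, and the function, parameter, and status names are illustrative assumptions:

```python
# Illustrative polling loop; the real UI does this with Axios in React.
import time

def poll_job(fetch_job, job_id, interval=0.01, timeout=5.0):
    """Poll until the job reports COMPLETE or FAILED, then return it."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        job = fetch_job(job_id)  # stand-in for GET /jobs/{id}
        if job["status"] in ("COMPLETE", "FAILED"):
            return job
        time.sleep(interval)     # the real UI waits a few seconds between polls
    raise TimeoutError(f"job {job_id} did not finish within {timeout}s")

# Demo against a fake backend whose job finishes on the third poll:
_states = iter(["PENDING", "INTEL_COMPLETE", "COMPLETE"])
fake_fetch = lambda job_id: {"id": job_id, "status": next(_states)}
print(poll_job(fake_fetch, 1)["status"])  # COMPLETE
```

Each intermediate status (such as the hypothetical INTEL_COMPLETE above) is what lets the dashboard show which agent is currently running.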
## Local Setup & Installation

Follow these steps to run the project locally.

Prerequisites:

- Docker & Docker Compose: the easiest way to run the entire stack.
- Python 3.10+
- Git: for cloning the repository.
- Node.js & npm
- A Hugging Face account (needed only if you want to re-download the sentiment model).
1. Clone the repository:
2. Set up environment variables:

The project requires a .env file with your secret keys. Create one in the root of the project by copying the example file:

```bash
cp .env.example .env
```

Now, open the .env file and add your own API keys:

- DATABASE_URL: Your connection string from your Neon PostgreSQL project.
- GOOGLE_API_KEY: Your API key for the Gemini model from Google AI Studio.
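For reference, a finished .env might look like the following. The values are placeholders for illustration only; substitute your own credentials:

```bash
# .env (placeholder values)
DATABASE_URL=postgresql://user:password@your-neon-host/your-db-name
GOOGLE_API_KEY=your-gemini-api-key
```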

3. The Sentiment Model:

The sentiment analysis model (ProsusAI/finbert) is included in the ml_models directory to ensure the application works offline and avoids runtime download issues. If you need to re-download it, follow the instructions in the Hugging Face documentation.
4. Build and Run with Docker Compose:

This single command will build the Docker images for all services and start the entire platform:

```bash
docker-compose up --build -d
```

5. Access the applications:

```
Frontend: http://localhost:5173
|