🏭 AGENTIC BI — DATA ENGINEERING PROJECT
Enterprise-Grade Streaming Data Platform
Topic: End-to-end Data Engineering system integrating a Streaming Pipeline (CDC + Kafka) + Lakehouse (Delta Lake Medallion Architecture) + Agentic BI
Dataset: Olist Brazilian E-commerce (~100K orders, ~1.55M records in total)
Author: thanhtai435
🎓 Two Architectures in One Repo:
- Standard (`docker-compose.yml`) — 16 services, local development, POC-friendly
- Enterprise (`docker-compose.enterprise.yml` + `enterprise/`) — Production patterns: CDC (Debezium), idempotent consumers, Delta Lake, K8s manifests, business alerting
📋 TABLE OF CONTENTS
- Project Overview
- System Architecture
- Enterprise Architecture
- Phase 1 — Raw Data Ingestion (Ingestion / Bronze Layer)
- Phase 2 — Building the Lakehouse Core (Transformation)
- Phase 3 — AI Training & Validation (Intelligence)
- Phase 4 — Dashboard & GenAI (Visualization)
- Extended Components
- Project Structure
- Technology Stack
- Deployment Guide
- Data Modeling — Star Schema
- Evaluation & Benchmarks
- Requirements Traceability Matrix
- References
1. PROJECT OVERVIEW
1.1 Objectives
Build an end-to-end Data Engineering system for a Brazilian e-commerce marketplace (Olist), covering:
- Ingestion: CDC (Change Data Capture) from PostgreSQL → Kafka/Redpanda, or batch from Hugging Face / Kaggle
- Transformation: Processing through the Bronze → Silver → Gold layers of the Medallion (Lakehouse) architecture with Delta Lake
- Intelligence: Training 4 ML models (Association Rules, Clustering, Classification, In-database ML)
- Visualization: Streamlit dashboard + Agentic BI (an AI chat that answers data questions in natural language)
- Enterprise Patterns: Idempotency, DLQ, Schema Evolution, Exactly-Once Semantics, K8s Deployment
1.2 Dataset
| Table | Description | Rows (sample) |
|---|---|---|
| `orders` | Orders (order_id, status, timestamps) | 500 |
| `order_items` | Order line items (price, freight) | ~800 |
| `customers` | Customers (city, state) | ~450 |
| `products` | Products (category, weight, dimensions) | ~300 |
| `sellers` | Sellers (city, state) | ~150 |
| `payments` | Payments (type, installments, value) | ~550 |
| `reviews` | Reviews (score 1–5, comment) | ~500 |
Full dataset (Kaggle): ~100K orders, ~112K order_items, ~99K customers, ~32K products, ~3K sellers.
1.3 Project Structure
agentic-bi-ecommerce/
├── README.md # This document
├── docker-compose.yml # 16+ Docker services (Standard)
├── docker-compose.enterprise.yml # 9 services (Enterprise: Redpanda + Debezium + Delta)
├── .env.example # Environment variable template
├── Makefile # Utility commands
├── requirements.txt # Python dependencies
│
├── enterprise/ # 🏭 ENTERPRISE ARCHITECTURE
│ ├── README.md # Detailed enterprise documentation
│ ├── docker-compose.enterprise.yml
│ ├── debezium/
│ │ └── connectors/ # CDC connector configs (Debezium)
│ │ ├── olist-orders-connector.json
│ │ ├── olist-items-connector.json
│ │ ├── olist-payments-connector.json
│ │ └── olist-reviews-connector.json
│ ├── schemas/
│ │ └── avro/ # Avro schema definitions
│ │ ├── raw_order_event.avsc
│ │ └── order_event.avsc
│ ├── consumers/
│ │ ├── Dockerfile.bronze
│ │ ├── requirements.bronze.txt
│ │ └── bronze_consumer.py # Idempotent consumer + DLQ + Prometheus metrics
│ ├── etl/
│ │ ├── Dockerfile.silver
│ │ ├── silver_etl.py # Spark + Delta Lake + Data Quality Engine
│ │ ├── spark-defaults.conf
│ │ └── core-site.xml
│ ├── k8s/
│ │ ├── namespace.yaml # K8s namespace + ResourceQuota
│ │ ├── redpanda-cluster.yaml # Redpanda Operator (3 brokers)
│ │ └── bronze-consumer-deployment.yaml # HPA + Pod Anti-Affinity
│ ├── monitoring/
│ │ ├── prometheus.yml
│ │ ├── rules/
│ │ │ └── business-alerts.yml
│ │ └── grafana/
│ │ └── dashboards/
│ │ └── data-platform.json
│ └── init-scripts/
│ └── postgres-cdc.sql # CDC setup: publication, REPLICA IDENTITY
│
├── data/sample/ # Sample Parquet (for the HF Dataset Viewer)
│ ├── orders.parquet
│ ├── order_items.parquet
│ ├── customers.parquet
│ ├── products.parquet
│ ├── sellers.parquet
│ ├── payments.parquet
│ └── reviews.parquet
│
├── streaming_simulator/ # Kafka streaming simulator (Standard)
│ └── simulator.py # CSV → 6 Kafka topics (order lifecycle)
│
├── transforms/ # ETL Pipeline (Medallion Architecture)
│ ├── bronze_to_silver.py # Bronze → Silver layer (8 tables)
│ └── silver_to_gold.py # Silver → Gold layer (Star Schema)
│
├── analytics/ # Data Mining & ML
│ ├── data_preprocessing.py # Data Quality, Cleaning, PCA, Feature Engineering
│ ├── association_rules.py # Apriori from scratch + Market Basket Analysis
│ ├── customer_segmentation.py # K-Means RFM + DBSCAN anomaly detection
│ ├── satisfaction_model.py # Decision Tree + Naive Bayes + Random Forest
│ └── in_database_ml.sql # Feature Store + SQL ML + Model Drift Detection
│
├── agentic_bi/ # Agentic BI (Multi-Agent System)
│ └── orchestrator.py # smolagents: SQL Agent + Insight Agent
│
├── governance/ # Data Governance
│ └── data_governance.py # Lineage, RLS, K-Anonymity, DP, Blockchain Audit
│
├── dbt_project/models/ # dbt models (SQL transforms)
│ └── dbt_models.sql # Staging → Intermediate → Gold SQL
│
├── airflow_dags/ # Orchestration
│ └── daily_etl_dag.py # Airflow DAG (5 tasks, daily 02:00 UTC)
│
├── frontend/ # Dashboard
│ └── streamlit_app.py # 4-page Streamlit app
│
├── init-scripts/ # Database DDL
│ ├── postgres/01_init_dw.sql # Star Schema DDL + Indexes
│ └── clickhouse/01_init_realtime.sql # Real-time analytics tables
│
├── tests/ # Unit tests
│ └── test_pipeline.py # 19 tests
│
└── docs/images/ # Architecture diagrams
├── system_architecture.png
├── data_flow.png
├── star_schema.png
├── agent_architecture.png
└── streaming_detail.png
2. SYSTEM ARCHITECTURE (Standard)
2.1 Standard Architecture Overview
┌─────────────────────────────────────────────────────────────────────────────┐
│ AGENTIC BI — SYSTEM ARCHITECTURE │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌──────────────┐ ┌─────────────────────────────────────────┐ │
│ │ DATA SOURCE │ │ MESSAGE BROKER (KAFKA) │ │
│ │ ─────────── │ │ 6 topics: orders.created, items, │ │
│ │ Kaggle CSV │───▶│ payments, status, delivered, reviews │ │
│ │ HuggingFace │ └───────────────┬─────────────────────────┘ │
│ └──────────────┘ │ │
│ ▼ │
│ ┌────────────────────────────────────────────────────────────┐ │
│ │ LAKEHOUSE (MEDALLION ARCHITECTURE) │ │
│ │ ┌──────────┐ ┌──────────┐ ┌──────────────────────┐ │ │
│ │ │ BRONZE │───▶│ SILVER │───▶│ GOLD │ │ │
│ │ │ Raw CSV │ │ Cleaned │ │ Star Schema: │ │ │
│ │ │ 8 tables │ │ Validated│ │ • dim_time │ │ │
│ │ │ │ │ Parquet │ │ • dim_customer │ │ │
│ │ └──────────┘ └──────────┘ │ • dim_product │ │ │
│ │ │ • dim_seller │ │ │
│ │ PostgreSQL (DW) ◄───────────────│ • dim_geography │ │ │
│ │ ClickHouse (Real-time OLAP) ◄───│ • fact_orders │ │ │
│ │ │ • agg_daily_revenue │ │ │
│ │ └──────────────────────┘ │ │
│ └────────────────────────────────────────────────────────────┘ │
│ │ │
│ ┌─────────────────┼──────────────────┐ │
│ ▼ ▼ ▼ │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────────────┐ │
│ │ ANALYTICS/ML │ │ AGENTIC BI │ │ STREAMLIT │ │
│ │ ──────────── │ │ ────────── │ │ DASHBOARD │ │
│ │ • Apriori │ │ Orchestrator│ │ • KPI Dashboard │ │
│ │ • K-Means │ │ ├─SQL Agent │ │ • AI Chat │ │
│ │ • DBSCAN │ │ └─Insight │ │ • Real-time Monitor │ │
│ │ • DTree/NB/RF│ │ Agent │ │ • Auto Reports │ │
│ └──────────────┘ └──────────────┘ └──────────────────────┘ │
│ │
│ ┌─────────────────────────────────────────────────────────────┐ │
│ │ ORCHESTRATION & GOVERNANCE │ │
│ │ Airflow DAG (daily 02:00) │ Data Lineage │ RLS │ Audit │ │
│ │ dbt models (SQL transforms) │ K-Anonymity │ Differential Privacy │ │
│ └─────────────────────────────────────────────────────────────┘ │
└─────────────────────────────────────────────────────────────────────────────┘
2.2 Docker Services (16 containers)
| Service | Image | Port | Role |
|---|---|---|---|
| Zookeeper | confluentinc/cp-zookeeper:7.5.0 | 2181 | Kafka coordination |
| Kafka | confluentinc/cp-kafka:7.5.0 | 9092 | Message broker |
| Schema Registry | confluentinc/cp-schema-registry:7.5.0 | 8081 | Schema management |
| Kafka UI | provectuslabs/kafka-ui | 8080 | Kafka monitoring |
| MinIO | minio/minio | 9000/9001 | Object storage (Bronze/Silver/Gold buckets) |
| PostgreSQL 16 | postgres:16 | 5432 | Data Warehouse (Star Schema) |
| ClickHouse 24 | clickhouse/clickhouse-server:24 | 8123 | Real-time OLAP |
| ChromaDB | chromadb/chroma | 8000 | Vector store (cho Agentic BI) |
| Spark Master | bitnami/spark:3.5 | 8085/7077 | Distributed processing |
| Spark Worker | bitnami/spark:3.5 | — | Processing worker |
| Airflow Webserver | apache/airflow:2.8.0 | 8082 | DAG management UI |
| Airflow Scheduler | apache/airflow:2.8.0 | — | Task scheduling |
| Prometheus | prom/prometheus | 9090 | Metrics collection |
| Grafana | grafana/grafana:10 | 3000 | Monitoring dashboard |
| Agentic BI App | custom Dockerfile | 8501 | Streamlit + Agent |
| Streaming Simulator | custom Dockerfile | — | CSV → Kafka replay |
3. ENTERPRISE ARCHITECTURE
3.1 Enterprise Data Flow
┌─────────────────────────────────────────────────────────────────────────────┐
│ ENTERPRISE DATA PLATFORM — OLIST │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌─────────────┐ ┌─────────────┐ ┌─────────────────────────────────┐ │
│ │ SOURCE DB │ │ CDC │ │ MESSAGE BUS (Kafka API) │ │
│ │ PostgreSQL │───▶│ Debezium │───▶│ Redpanda (KRaft, no ZK) │ │
│ │ raw_* │ │ (WAL capture│ │ • 3 brokers, RF=3 │ │
│ │ tables │ │ exactly-once│ │ • Partitioned by customer_id │ │
│ │ │ │ streaming)│ │ • Schema Registry (Avro) │ │
│ └─────────────┘ └─────────────┘ └─────────────────┬───────────────┘ │
│ │ │
│ ┌─────────────────────────────┼────────────────┐
│ ▼ ▼ │
│ ┌────────────────────┐ ┌────────────────────┐ │
│ │ STREAM LAYER │ │ BATCH LAYER │ │
│ │ Flink SQL │ │ Spark + Delta │ │
│ │ • Windowed agg │ │ • MERGE INTO │ │
│ │ • Late arrival │ │ • Z-ORDER optimize│ │
│ │ • Watermarking │ │ • Time travel │ │
│ └──────────┬────────┘ └──────────┬────────┘ │
│ │ │ │
│ ▼ ▼ │
│ ┌─────────────────────────────────────────────┐ │
│ │ LAKEHOUSE — Delta Lake on S3/MinIO │ │
│ │ ┌──────────┐ ┌──────────┐ ┌──────────────┐ │ │
│ │ │ Bronze │─▶│ Silver │─▶│ Gold │ │ │
│ │ │ (CDC) │ │ (clean) │ │ (Star Schema)│ │ │
│ │ └──────────┘ └──────────┘ └──────────────┘ │ │
│ └────────────────────┬──────────────────────┘ │
│ │ │
│ ▼ │
│ ┌─────────────────────────────────────────────┐ │
│ │ DATA WAREHOUSE — PostgreSQL / BigQuery │ │
│ │ • Partitioned by date │ │
│ │ • Materialized views for KPIs │ │
│ └────────────────────┬──────────────────────┘ │
│ │ │
│ ┌────────────────────┼────────────────────┐ │
│ ▼ ▼ ▼ │
│ ┌────────────┐ ┌──────────────┐ ┌─────────────────┐ │
│ │ ML/AI │ │ Dashboard │ │ Reverse ETL │ │
│ │ Platform │ │ (Streamlit) │ │ (Hightouch) │ │
│ └────────────┘ └──────────────┘ └─────────────────┘ │
│ │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ OBSERVABILITY: Prometheus + Grafana + Business Metrics │ │
│ │ • Consumer lag per partition/topic │ │
│ │ • End-to-end latency (event_time → warehouse_load_time) │ │
│ │ • Data quality SLI (% null, % duplicate, schema violations) │ │
│ │ • Alerting: PagerDuty-style thresholds for critical alerts │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
└─────────────────────────────────────────────────────────────────────────────┘
3.2 Enterprise Patterns
| Pattern | Implementation | File |
|---|---|---|
| CDC (Debezium) | PostgreSQL WAL capture → Avro events | `enterprise/debezium/connectors/` |
| Exactly-once Producer | `enable.idempotence=true` | `docker-compose.enterprise.yml` |
| Idempotent Consumer | Dedup table + `ON CONFLICT DO NOTHING` | `enterprise/consumers/bronze_consumer.py` |
| Schema Registry (Avro) | Backward compatibility, evolution | `enterprise/schemas/avro/` |
| Schema Evolution | `mergeSchema=true`, `backward_transitive` | `docker-compose.enterprise.yml` + ETL |
| Dead Letter Queue | Structured error metadata + retry topic | `enterprise/consumers/bronze_consumer.py` |
| Data Quality Engine | Declarative rules per table | `enterprise/etl/silver_etl.py` |
| Delta Lake (ACID) | `MERGE INTO`, `OPTIMIZE ZORDER`, `VACUUM` | `enterprise/etl/silver_etl.py` |
| Time Travel | Delta Lake versioning | `VACUUM` retention policy: 7 days |
| K8s Deployment | Deployment, HPA, Service, ResourceQuota | `enterprise/k8s/` |
| Auto-scaling | HPA by CPU + consumer lag | `enterprise/k8s/bronze-consumer-deployment.yaml` |
| Pod Anti-Affinity | Spread across nodes for HA | `enterprise/k8s/bronze-consumer-deployment.yaml` |
| Liveness/Readiness | `/health` and `/ready` endpoints | `enterprise/consumers/Dockerfile.bronze` |
| Prometheus Metrics | Custom business metrics | `enterprise/consumers/bronze_consumer.py` |
| Business Alerting | Critical/Warning/Info thresholds | `enterprise/monitoring/rules/business-alerts.yml` |
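The idempotent-consumer pattern above (dedup table + `ON CONFLICT DO NOTHING`) can be sketched as follows. This is a minimal illustration using stdlib sqlite3 in place of PostgreSQL; the table and column names are made up for the example, not the project's actual schema:

```python
import sqlite3

def make_store() -> sqlite3.Connection:
    """In-memory dedup store; the real consumer uses a PostgreSQL table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE processed_events (event_id TEXT PRIMARY KEY, payload TEXT)")
    return conn

def handle(conn: sqlite3.Connection, event_id: str, payload: str) -> bool:
    """Insert the event; a redelivered event_id is silently skipped.
    Returns True only when the event was new (i.e. actually processed)."""
    cur = conn.execute(
        "INSERT INTO processed_events (event_id, payload) VALUES (?, ?) "
        "ON CONFLICT(event_id) DO NOTHING",
        (event_id, payload),
    )
    conn.commit()
    return cur.rowcount == 1

conn = make_store()
# Kafka's at-least-once delivery may replay evt-1; the dedup table makes it a no-op.
results = [handle(conn, "evt-1", "a"), handle(conn, "evt-1", "a"), handle(conn, "evt-2", "b")]
print(results)  # [True, False, True]
```

Because the dedup check and the insert are one atomic statement, a consumer crash between Kafka commit and DB write can at worst cause a redelivery, never a double-apply.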
3.3 Enterprise Docker Services (9 containers)
| Service | Image | Port | Role |
|---|---|---|---|
| Redpanda | redpandadata/redpanda | 9092/19092/9644 | Kafka-compatible broker (KRaft mode, no Zookeeper) |
| Redpanda Console | redpandadata/console | 8080 | Kafka UI replacement |
| Schema Registry | confluentinc/cp-schema-registry | 8081 | Avro schema management |
| PostgreSQL | postgres:16 | 5432 | Source DB + DW (logical replication enabled) |
| Debezium Connect | debezium/connect | 8083 | CDC connector for PostgreSQL |
| MinIO | quay.io/minio/minio | 9000/9001 | S3-compatible object storage for Delta Lake |
| Bronze Consumer | custom (Python) | 8080 | Idempotent CDC consumer with DLQ + metrics |
| Flink JobManager | flink:1.19 | 8085 | Stream processing (windowed aggregations) |
| Flink TaskManager | flink:1.19 | — | Stream processing worker |
Note: The Enterprise architecture replaces Kafka + Zookeeper with Redpanda (KRaft mode), which eliminates Zookeeper and simplifies operations, and replaces standalone Spark with Flink for true stream processing with watermarking and late-arrival handling.
4. PHASE 1 — RAW DATA INGESTION (Ingestion / Bronze Layer)
4.1 Data Sources
- Primary source: Kaggle — Olist Brazilian E-commerce (8 CSV files)
- Sample on HuggingFace: thanhtai435/agentic-bi-ecommerce (7 Parquet files, 500 orders)
4.2 Streaming Ingestion — Standard (simulator.py)
Replays the CSVs as time-ordered events across 6 Kafka topics:
CSV Files ──▶ Streaming Simulator ──▶ Kafka Topics:
├── ecom.orders.created
├── ecom.orders.items
├── ecom.orders.payments
├── ecom.orders.status_changed
├── ecom.orders.delivered
└── ecom.reviews.submitted
Characteristics:
- Simulates the full order lifecycle: ORDER_CREATED → ITEM_ADDED → PAYMENT_CAPTURED → STATUS_CHANGED → ORDER_DELIVERED → REVIEW_SUBMITTED
- Adjustable speed: `speed=1` (real time, 25 months), `speed=1000` (36 minutes), `speed=10000` (3.6 minutes)
- Snappy compression, message batching (1,000), queue buffering 100K
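The core of such a replay is merging the per-table CSVs into one chronologically ordered event stream. A minimal sketch with stdlib `heapq` (the records and timestamps below are made up; the real simulator.py reads the Olist CSVs and produces to Kafka rather than printing):

```python
import heapq

# Hypothetical minimal records: (timestamp, topic, order_id).
orders = [("2017-01-01T10:00", "ecom.orders.created", "o1"),
          ("2017-01-03T09:00", "ecom.orders.created", "o2")]
deliveries = [("2017-01-02T15:00", "ecom.orders.delivered", "o1"),
              ("2017-01-05T11:00", "ecom.orders.delivered", "o2")]
reviews = [("2017-01-02T18:00", "ecom.reviews.submitted", "o1")]

# Merge the already-sorted per-table streams into one time-ordered stream,
# then hand each event to the producer (stubbed here with print).
for ts, topic, order_id in heapq.merge(orders, deliveries, reviews):
    print(ts, topic, order_id)
```

ISO-8601 timestamp strings sort chronologically, so lexicographic tuple comparison is enough to interleave the lifecycle events correctly.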
4.3 Streaming Ingestion — Enterprise (Debezium CDC)
Real CDC from PostgreSQL WAL:
-- Application writes to source DB
INSERT INTO raw_orders (order_id, customer_id, order_status, ...)
VALUES ('abc123', 'cust456', 'created', ...);
-- Debezium captures via pgoutput protocol
-- Emits to Kafka topic: ecom.raw.raw_orders
Key features:
- Exactly-once: PostgreSQL replication slot + Kafka idempotent producer
- Ordered: the same `customer_id` routes to the same partition (partition key)
- Schema-aware: Avro schema registered with backward compatibility
- Soft deletes: tombstone events with `__deleted=true` for GDPR compliance
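The ordering guarantee rests on deterministic key-to-partition routing. A simple sketch of the idea (illustrative only: Kafka's default partitioner actually hashes with murmur2, not MD5; all that matters here is determinism):

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Deterministic key → partition mapping (illustrative; Kafka uses murmur2)."""
    digest = hashlib.md5(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Every event keyed by the same customer lands on the same partition,
# so Kafka's per-partition ordering gives per-customer ordering.
p1 = partition_for("cust456", 6)
p2 = partition_for("cust456", 6)
print(p1 == p2)  # True
```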
4.4 Bronze Layer
Raw CSV data is loaded as-is, with no modifications:
| CSV File (Bronze) | Table | Description |
|---|---|---|
| olist_orders_dataset.csv | raw_orders | 99,441 orders |
| olist_order_items_dataset.csv | raw_order_items | 112,650 line items |
| olist_customers_dataset.csv | raw_customers | 99,441 customers |
| olist_products_dataset.csv | raw_products | 32,951 products |
| olist_sellers_dataset.csv | raw_sellers | 3,095 sellers |
| olist_order_payments_dataset.csv | raw_payments | 103,886 payment transactions |
| olist_order_reviews_dataset.csv | raw_reviews | 99,224 reviews |
| olist_geolocation_dataset.csv | raw_geolocation | 1,000,163 locations |
4.5 Automation (Airflow DAG / GitHub Actions)
The file `airflow_dags/daily_etl_dag.py` defines a DAG that runs daily at 02:00 UTC:
bronze_to_silver ──▶ silver_to_gold ──▶ ┬── run_association_rules
├── run_segmentation
├── run_satisfaction_model
└── quality_check
│
▼
notify_complete
- Retry: 2 attempts, 5-minute delay
- Timeout: 1 hour per task
- Quality Check: validates nulls, duplicates, and negative prices on the Gold layer
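The quality-check task can be sketched as a pure-Python validator over Gold rows. Column names (`order_id`, `total_price`) are illustrative stand-ins, not necessarily the DAG's actual fields:

```python
def quality_check(rows: list[dict]) -> dict:
    """Count the three violation types the quality gate looks for:
    null keys, duplicate order_ids, and negative prices."""
    seen, report = set(), {"nulls": 0, "duplicates": 0, "negative_prices": 0}
    for row in rows:
        if row.get("order_id") is None:
            report["nulls"] += 1
            continue
        if row["order_id"] in seen:
            report["duplicates"] += 1
        seen.add(row["order_id"])
        if (row.get("total_price") or 0) < 0:
            report["negative_prices"] += 1
    return report

rows = [
    {"order_id": "o1", "total_price": 120.0},
    {"order_id": "o1", "total_price": 120.0},   # duplicate key
    {"order_id": None, "total_price": 50.0},    # null key
    {"order_id": "o2", "total_price": -5.0},    # negative price
]
print(quality_check(rows))  # {'nulls': 1, 'duplicates': 1, 'negative_prices': 1}
```

In the DAG, a non-zero report would fail the task so `notify_complete` never fires on bad data.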
5. PHASE 2 — BUILDING THE LAKEHOUSE CORE (Transformation)
5.1 Silver Layer — Cleaning & Validation
Standard (transforms/bronze_to_silver.py)
| Table | Processing |
|---|---|
| orders | Dedup order_id, parse 5 timestamp columns, standardize status (lowercase), compute delivery_days, delivery_delay_days, is_late_delivery |
| order_items | Dedup (order_id, order_item_id), validate price/freight ≥ 0 (negative → NaN), compute total_value |
| customers | Dedup customer_id, normalize city (Title Case), state (UPPERCASE) |
| products | Dedup product_id, convert numeric columns, fill missing values with the median, merge English category names |
| sellers | Dedup seller_id, normalize city/state |
| payments | Dedup (order_id, payment_sequential), standardize payment_type |
| reviews | Dedup review_id, compute has_comment, comment_length, parse dates |
| geolocation | Aggregate lat/lng per zip_code, map state → region |
Output: Parquet files in `data/silver/` plus `quality_log.csv` recording quality metrics.
Enterprise (enterprise/etl/silver_etl.py — Spark + Delta Lake)
Data Quality Engine — declarative rules per table:
rules = {
"raw_orders": {
"required": ["order_id", "customer_id", "order_purchase_timestamp"],
"unique": ["order_id"],
"valid_values": {
"order_status": ["created", "approved", "shipped", "delivered", "cancelled"]
},
},
"raw_payments": {
"required": ["order_id", "payment_type", "payment_value"],
"positive": ["payment_value"],
"range": {"payment_installments": (1, 24)},
"valid_values": {"payment_type": ["credit_card", "boleto", "voucher", "debit_card"]},
},
}
Delta Lake features:
- `MERGE INTO` for idempotent upserts
- `OPTIMIZE ZORDER` for query performance
- `VACUUM` to clean up old versions (retain 7 days)
- Schema evolution (`mergeSchema=true`)
5.2 Gold Layer — Star Schema
Dimension Tables
| Table | Grain | Key columns |
|---|---|---|
| dim_time | 1 row/day | date_key, full_date, day_name, month, quarter, year, is_weekend |
| dim_customer | 1 row/customer | customer_key, customer_id, zip_code, city, state, order_count, first_order, last_order |
| dim_product | 1 row/product | product_key, product_id, category_pt, category_en, weight, dimensions |
| dim_seller | 1 row/seller | seller_key, seller_id, city, state |
| dim_geography | 1 row/zip_code | geo_key, zip_code, city, state, region, lat, lng |
Fact Table
| Table | Grain | Measures |
|---|---|---|
| fact_orders | 1 row/order | item_count, total_price, total_freight, total_payment, payment_type, installments, review_score, delivery_days, delivery_delay_days, is_late_delivery |
Aggregation Tables (wide tables for AI/ML)
| Table | Purpose |
|---|---|
| agg_daily_revenue | Revenue / orders / customers per day |
| agg_customer_segments | RFM segments (Champion, Loyal, Regular, At Risk, Lost) |
5.3 dbt Models (dbt_project/models/dbt_models.sql)
Provides equivalent SQL transforms in a 3-layer structure:
Staging (stg_*): 5 views — cleaning, type casting, validation
Intermediate (int_*): 1 enriched view — joins items + payments + reviews per order
Marts (Gold): views for daily revenue aggregation
Tests: 5 data quality tests (not null, unique, positive, referential integrity, valid status)
5.4 PostgreSQL DDL (init-scripts/postgres/01_init_dw.sql + enterprise/init-scripts/postgres-cdc.sql)
- Complete Star Schema: 4 dimension + 4 fact tables + 4 aggregation tables
- Indexes: 12 indexes on FK and timestamp columns
- dim_time pre-populated: 2016–2019 (1,461 rows)
- CDC ready: `wal_level=logical`, `REPLICA IDENTITY FULL`, publication `dbz_publication`
6. PHASE 3 — AI TRAINING & VALIDATION (Intelligence)
6.1 Data Preprocessing (analytics/data_preprocessing.py)
A 7-step pipeline:
| Step | Description |
|---|---|
| 1. Data Quality Assessment | Assesses 6 dimensions: Completeness, Uniqueness, Validity, Consistency, Outliers (IQR), Data types |
| 2. Cleaning | Dedup, type casting, normalize strings, handle negatives |
| 3. Outlier Treatment | Capping (Winsorizing) at the 1st and 99th percentiles |
| 4. Normalization | MinMax / Z-Score / Robust Scaler |
| 5. PCA | Dimensionality reduction with explained variance ratio ≥ 95% |
| 6. Feature Engineering | 25+ features: RFM, freight_ratio, GMV, is_free_shipping, region mapping, time features |
| 7. Visualization | Report: missing values, distributions, correlation heatmap |
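Step 3's capping can be sketched in a few lines. This is a simple nearest-rank percentile version under stated assumptions; the project's implementation may use numpy's interpolated percentiles instead:

```python
def winsorize(values, lower_pct=0.01, upper_pct=0.99):
    """Clip values to the given lower/upper percentiles (nearest-rank)."""
    s = sorted(values)
    n = len(s)
    lo = s[max(0, int(lower_pct * (n - 1)))]
    hi = s[min(n - 1, int(upper_pct * (n - 1)))]
    return [min(max(v, lo), hi) for v in values]

data = list(range(1, 101)) + [10_000]  # one extreme outlier
capped = winsorize(data)
print(max(capped))  # 100: the outlier is pulled down to the 99th percentile
```

Unlike dropping outliers, winsorizing keeps every row, so downstream joins and counts are unaffected.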
6.2 Association Rules Mining (analytics/association_rules.py)
Algorithm: Apriori implemented from scratch (no external libraries)
- Input: multi-category orders (orders containing ≥ 2 distinct categories)
- Output: rules with Support, Confidence, Lift
- Application: cross-selling recommendations ("Customers who buy [Bed/Bath] have a 65% chance of buying [Health/Beauty]")
- Visualization: Support vs Confidence scatter, Top 15 rules by Lift, Confidence distribution
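The three rule metrics follow directly from their definitions and can be sketched on toy baskets (the baskets below are made up; the real input is the multi-category orders described above):

```python
def rule_metrics(transactions, antecedent, consequent):
    """Support, confidence and lift for the rule antecedent → consequent.
    transactions is a list of sets; antecedent/consequent are sets of categories."""
    n = len(transactions)
    a = sum(1 for t in transactions if antecedent <= t)      # |A|
    c = sum(1 for t in transactions if consequent <= t)      # |C|
    both = sum(1 for t in transactions if (antecedent | consequent) <= t)
    support = both / n            # P(A and C)
    confidence = both / a         # P(C | A)
    lift = confidence / (c / n)   # P(C | A) / P(C)
    return support, confidence, lift

baskets = [{"bed_bath", "health_beauty"}, {"bed_bath", "health_beauty"},
           {"bed_bath"}, {"health_beauty"}, {"toys"}]
s, conf, lift = rule_metrics(baskets, {"bed_bath"}, {"health_beauty"})
print(round(s, 2), round(conf, 2), round(lift, 2))  # 0.4 0.67 1.11
```

A lift above 1 means the two categories co-occur more often than independence would predict, which is what makes the rule useful for cross-selling.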
6.3 Customer Segmentation (analytics/customer_segmentation.py)
Algorithms: K-Means + DBSCAN
| Part | Details |
|---|---|
| RFM Analysis | Computes Recency, Frequency, Monetary per customer |
| K-Means | Elbow method + Silhouette score → find the best K → cluster |
| Segment Naming | Automatically assigns names: Champions, Loyal, New/Promising, At Risk, Lost/Hibernating |
| DBSCAN | Detects anomalous customers (noise points = unusual customers) |
| Visualization | Elbow plot, Silhouette plot, Recency vs Monetary scatter, segment size bar chart, DBSCAN anomaly plot |
| Business Output | Marketing recommendations for each segment |
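The RFM step reduces to one pass over the order history. A minimal sketch (the record layout `(customer_id, order_date, value)` is an illustrative assumption, not the project's actual schema):

```python
from datetime import date

def rfm(orders, today):
    """Recency (days since last order), Frequency (order count),
    Monetary (total spend) per customer."""
    state = {}
    for customer_id, order_date, value in orders:
        last, freq, total = state.get(customer_id, (order_date, 0, 0.0))
        state[customer_id] = (max(last, order_date), freq + 1, total + value)
    return {cid: ((today - last).days, f, m) for cid, (last, f, m) in state.items()}

orders = [("c1", date(2018, 8, 1), 120.0),
          ("c1", date(2018, 6, 10), 80.0),
          ("c2", date(2018, 1, 5), 45.0)]
print(rfm(orders, today=date(2018, 9, 1)))  # c1: recency 31 days, 2 orders, 200.0 spend
```

The resulting (R, F, M) triples are what K-Means clusters after scaling.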
6.4 Satisfaction Prediction (analytics/satisfaction_model.py)
Task: Predict whether a customer is satisfied (review_score ≥ 4)
| Model | Description |
|---|---|
| Decision Tree (Entropy/ID3) | max_depth=6, min_samples_leaf=50 |
| Decision Tree (Gini/CART) | max_depth=6, min_samples_leaf=50 |
| Gaussian Naive Bayes | Baseline probabilistic model |
| Random Forest | 100 trees, max_depth=8 |
Features (15): delivery_days, delivery_delay, is_late, total_price, total_freight, freight_ratio, n_items, n_sellers, max_installments, avg_weight, avg_photos, avg_desc_len, purchase_hour, purchase_dayofweek, is_weekend
Evaluation:
- Accuracy, F1-Score, AUC-ROC per model
- 5-fold Cross Validation (F1)
- ROC Curve comparison plots
- Feature Importance (Random Forest)
- Decision Tree Rules (text export, depth ≤ 3)
6.5 In-Database ML (analytics/in_database_ml.sql)
| Component | Description |
|---|---|
| Feature Store | feature_store_customer — 15+ features precomputed in SQL (RFM, behavioral, temporal) |
| SQL ML (BigQuery ML syntax) | Logistic Regression for satisfaction prediction |
| PostgreSQL UDF | predict_satisfaction() — a scoring function callable directly from SQL |
| Model Drift Detection | Compares baseline vs current period, PSI approximation, auto-alerts on drift |
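A PSI approximation over binned score distributions can be sketched as follows. The bin layout and the 0.2 alert threshold are common industry conventions, assumed here rather than taken from the project's SQL:

```python
import math

def psi(expected, actual, eps=1e-6):
    """Population Stability Index between two binned distributions
    (lists of proportions that each sum to 1)."""
    total = 0.0
    for e, a in zip(expected, actual):
        e, a = max(e, eps), max(a, eps)   # avoid log(0) on empty bins
        total += (a - e) * math.log(a / e)
    return total

baseline = [0.25, 0.25, 0.25, 0.25]       # training-period distribution
current = [0.10, 0.20, 0.30, 0.40]        # scoring-period distribution
score = psi(baseline, current)
print(round(score, 3), "drift" if score > 0.2 else "stable")  # 0.228 drift
```

A common reading: PSI < 0.1 is stable, 0.1–0.2 warrants monitoring, and above 0.2 triggers the drift alert.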
7. PHASE 4 — DASHBOARD & GENAI (Visualization)
7.1 Streamlit Dashboard (frontend/streamlit_app.py)
A 4-page app:
| Page | Content |
|---|---|
| 📊 KPI Dashboard | 5 KPI cards (GMV, Orders, AOV, On-time Delivery, Avg Review), Monthly Revenue Trend, Revenue by State, Payment Methods pie chart, Review Distribution, Top Categories |
| 💬 AI Analytics Chat | Natural-language Q&A chat interface, Quick Actions (Revenue Report, Anomaly Scan, Seller Analysis, Customer Segments) |
| ⚡ Real-time Monitor | Events/sec, Orders/5min, Revenue/5min, Active Alerts, Kafka Topic Throughput, Recent Anomaly Alerts, Pipeline Health (Kafka lag, Flink status, ClickHouse ingestion) |
| 📋 Reports | Scheduled Reports (Daily/Weekly/Monthly), auto-generated AI reports with Key Findings + Alerts + Recommendations |
7.2 Agentic BI — Multi-Agent System (agentic_bi/orchestrator.py)
Framework: smolagents (HuggingFace)
LLM: Qwen/Qwen3-32B-Instruct (via HF Inference API)
┌──────────────────┐
│ ORCHESTRATOR │
│ (CodeAgent) │
│ max_steps=12 │
└───────┬──────────┘
│
┌─────────────┼─────────────┐
▼ ▼
┌────────────────┐ ┌──────────────────┐
│ SQL AGENT │ │ INSIGHT AGENT │
│ (ToolCalling) │ │ (ToolCalling) │
│ max_steps=8 │ │ max_steps=10 │
└───────┬────────┘ └───────┬──────────┘
│ │
┌───────┴────────┐ ┌────────┴────────────┐
│ • sql_execute │ │ • sql_execute │
│ • get_schema │ │ • get_kpi_summary │
└────────────────┘ │ • detect_anomalies │
└─────────────────────┘
4 Tools:
- `sql_execute` — runs SELECT queries on PostgreSQL (safety check: only SELECT/WITH allowed)
- `get_schema_info` — fetches schema + sample data (so the agent understands table structure)
- `get_kpi_summary` — pre-computed KPIs: revenue, orders, delivery, reviews, sellers, customers
- `detect_anomalies` — Z-score anomaly detection on time-series metrics (threshold > 2σ)
Schema Injection: the full Star Schema description (~50 lines) is injected into the agent context, so the agent understands the tables/columns and writes accurate SQL.
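The read-only safety check on sql_execute can be sketched as a simple allow-list guard. This is an illustrative version, not the project's actual implementation, and its coarse keyword scan may reject some valid identifiers (e.g. a column named `updated_at`):

```python
def is_safe_query(sql: str) -> bool:
    """Allow only a single read-only statement starting with SELECT or WITH,
    with no data-modifying keywords anywhere in the text."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:               # reject multi-statement input
        return False
    first = stripped.split(None, 1)[0].upper() if stripped else ""
    if first not in ("SELECT", "WITH"):
        return False
    forbidden = ("INSERT", "UPDATE", "DELETE", "DROP", "ALTER", "TRUNCATE", "CREATE")
    return not any(kw in stripped.upper() for kw in forbidden)

print(is_safe_query("SELECT * FROM fact_orders LIMIT 5"))   # True
print(is_safe_query("DROP TABLE fact_orders"))              # False
print(is_safe_query("SELECT 1; DELETE FROM fact_orders"))   # False
```

In production such a guard would be layered with a read-only database role, so even a bypassed check cannot mutate the warehouse.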
8. EXTENDED COMPONENTS
The project goes well beyond the suggested requirements with the following components:
| Component | File | Description |
|---|---|---|
| Streaming Pipeline | `streaming_simulator/simulator.py` | Simulates real-time data ingestion via Kafka (6 topics, order lifecycle) |
| ClickHouse OLAP | `init-scripts/clickhouse/` | Real-time analytics: 5-min windows, delivery tracking, anomaly alerts |
| Data Governance | `governance/data_governance.py` | Data Lineage, Row-Level Security, K-Anonymity, Differential Privacy, Blockchain Audit Trail |
| Unit Tests | `tests/test_pipeline.py` | 19 tests: preprocessing, association rules, clustering, classification, ETL, governance |
| Monitoring | `docker-compose.yml` + `enterprise/monitoring/` | Prometheus + Grafana stack |
| MinIO Object Storage | `docker-compose.yml` + `enterprise/docker-compose.enterprise.yml` | Bronze/Silver/Gold buckets |
| ChromaDB Vector Store | `docker-compose.yml` | Context retrieval for Agentic BI |
| Spark Cluster | `docker-compose.yml` | Distributed processing (master + worker) |
| Apriori from Scratch | `analytics/association_rules.py` | Apriori algorithm implemented without external libraries |
| 🏭 ENTERPRISE ARCHITECTURE | `enterprise/` | CDC (Debezium), idempotent consumers, Delta Lake, K8s manifests, business alerting |
9. PROJECT STRUCTURE
Total: 70+ files, ~2.1MB of code + data
.
├── enterprise/ # 🏭 ENTERPRISE ARCHITECTURE (NEW)
│ ├── README.md
│ ├── debezium/connectors/ # CDC configs
│ ├── schemas/avro/ # Avro schema definitions
│ ├── consumers/ # Idempotent consumer + DLQ
│ ├── etl/ # Spark + Delta Lake ETL
│ ├── k8s/ # Kubernetes manifests
│ ├── monitoring/ # Prometheus + Grafana
│ └── init-scripts/ # CDC database setup
│
├── agentic_bi/
│ └── orchestrator.py # 15.4KB — Multi-agent system
├── airflow_dags/
│ └── daily_etl_dag.py # 4.3KB — Airflow DAG
├── analytics/
│ ├── data_preprocessing.py # 23.1KB — Full preprocessing pipeline
│ ├── association_rules.py # 11.6KB — Apriori from scratch
│ ├── customer_segmentation.py # 10.0KB — K-Means RFM + DBSCAN
│ ├── satisfaction_model.py # 10.2KB — DT + NB + RF
│ └── in_database_ml.sql # 8.4KB — Feature Store + SQL ML
├── data/sample/ # 7 Parquet files
├── dbt_project/models/
│ └── dbt_models.sql # 6.0KB — dbt SQL transforms
├── docs/images/ # 5 PNG diagrams
├── frontend/
│ └── streamlit_app.py # 13.3KB — 4-page dashboard
├── governance/
│ └── data_governance.py # 10.2KB — Lineage + RLS + Privacy
├── init-scripts/
│ ├── postgres/01_init_dw.sql # 9.9KB — Star Schema DDL
│ └── clickhouse/01_init_realtime.sql
├── streaming_simulator/
│ └── simulator.py # 17.8KB — Kafka event replay
├── tests/
│ └── test_pipeline.py # 8.3KB — 19 unit tests
├── transforms/
│ ├── bronze_to_silver.py # 9.7KB — Cleaning pipeline
│ └── silver_to_gold.py # 9.6KB — Star Schema builder
├── docker-compose.yml # 10.9KB — 16 services (Standard)
├── docker-compose.enterprise.yml # 9 services (Enterprise)
├── Makefile # 4.6KB
├── requirements.txt # 769B
├── Dockerfile
├── Dockerfile.simulator
└── .env.example
10. TECHNOLOGY STACK
Core Pipeline
| Layer | Standard | Enterprise | Role |
|---|---|---|---|
| Ingestion | Apache Kafka + ZK | Redpanda (KRaft) + Debezium CDC | Streaming data ingestion |
| Message Format | JSON | Avro + Schema Registry | Schema evolution, type safety |
| Storage | MinIO | MinIO (S3-compatible) | Object storage for Bronze/Silver/Gold |
| Lakehouse Format | Parquet | Delta Lake | ACID, time travel, MERGE INTO |
| Data Warehouse | PostgreSQL 16 | PostgreSQL 16 / BigQuery | Star Schema, ACID transactions |
| Real-time OLAP | ClickHouse 24 | ClickHouse 24 | Sub-second analytics queries |
| Transformation | Python Pandas | Spark + Delta | Distributed ETL with data quality |
| Orchestration | Apache Airflow 2.8 | Apache Airflow 2.8 / GitHub Actions | DAG scheduling |
| Processing | Apache Spark 3.5 | Apache Flink 1.19 | True stream processing with watermarking |
AI/ML
| Component | Library/Technology |
|---|---|
| Machine Learning | scikit-learn (KMeans, DBSCAN, DecisionTree, NaiveBayes, RandomForest) |
| Association Rules | Custom Apriori implementation (from scratch) |
| Preprocessing | StandardScaler, MinMaxScaler, RobustScaler, PCA |
| Feature Store | PostgreSQL (SQL-based feature computation) |
| Agentic BI | smolagents (HuggingFace), Qwen3-32B-Instruct |
| Vector Store | ChromaDB |
Enterprise Infrastructure
| Pattern | Technology |
|---|---|
| CDC | Debezium + PostgreSQL logical replication (pgoutput) |
| Idempotency | PostgreSQL ON CONFLICT + dedup hash table |
| DLQ | Kafka topic ecom.orders.dlq + structured error metadata |
| Schema Evolution | Confluent Schema Registry, Avro backward_transitive |
| Exactly-once | Kafka enable.idempotence, transactional consumer |
| Container Orchestration | Kubernetes (K8s) with HPA, Pod Anti-Affinity |
| Monitoring | Prometheus + Grafana with custom business metrics |
| Alerting | Prometheus Alertmanager with PagerDuty-style thresholds |
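The "Idempotency" row above (PostgreSQL `ON CONFLICT` + a dedup hash table) can be sketched like this. SQLite stands in for PostgreSQL purely so the example is self-contained — SQLite supports the same `ON CONFLICT ... DO NOTHING` upsert syntax — and the table and column names are hypothetical, not those of the real Bronze consumer.

```python
# Idempotent consumer sketch: hash each event, insert once, and let the
# primary-key conflict silently drop replays of the same event.
import hashlib
import json
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE bronze_orders (
    event_hash TEXT PRIMARY KEY,   -- dedup key
    payload    TEXT NOT NULL)""")

def consume(event: dict) -> bool:
    """Insert the event once; replays are no-ops.
    Returns True only when the row was newly inserted."""
    payload = json.dumps(event, sort_keys=True)  # canonical form
    h = hashlib.sha256(payload.encode()).hexdigest()
    cur = db.execute(
        "INSERT INTO bronze_orders (event_hash, payload) VALUES (?, ?) "
        "ON CONFLICT(event_hash) DO NOTHING", (h, payload))
    db.commit()
    return cur.rowcount == 1

event = {"order_id": "o1", "status": "created"}
first = consume(event)    # inserted
replay = consume(event)   # deduplicated, no second row
```

With at-least-once delivery from Kafka, this pattern makes reprocessing safe: replayed messages hit the conflict clause instead of creating duplicates.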
Frontend & Monitoring
| Component | Technology |
|---|---|
| Dashboard | Streamlit (4-page app) |
| Visualization | Plotly, Matplotlib, Seaborn |
| Monitoring | Prometheus + Grafana |
| Infrastructure | Docker Compose (16 services Standard / 9 services Enterprise) |
| Testing | pytest (19 tests) |
11. DEPLOYMENT GUIDE
11.1 Standard Architecture
# 1. Start the full infrastructure (16 containers)
docker compose up -d
# 2. Wait for services to come up (~30 seconds)
docker compose ps
# 3. Run the ETL pipeline
make etl
# 4. Run the analytics
make analytics
# 5. Open the dashboard
make app
# http://localhost:8501
11.2 Enterprise Architecture
# 1. Start the infrastructure (9 containers)
docker compose -f docker-compose.enterprise.yml up -d
# 2. Wait for healthchecks (Redpanda, Postgres, MinIO, Schema Registry)
docker compose -f docker-compose.enterprise.yml ps
# 3. Register the Debezium connectors
curl -X POST http://localhost:8083/connectors \
-H "Content-Type: application/json" \
-d @enterprise/debezium/connectors/olist-orders-connector.json
# 4. Start the Bronze consumer (idempotent + DLQ)
docker compose -f docker-compose.enterprise.yml up -d bronze-consumer
# 5. Run the Silver ETL (Spark + Delta Lake)
docker compose -f docker-compose.enterprise.yml up silver-etl
# 6. Open the Grafana dashboard
open http://localhost:3000/d/olist-enterprise # admin/admin
11.3 Kubernetes Deployment
# 1. Create the namespace + resource quotas
kubectl apply -f enterprise/k8s/namespace.yaml
# 2. Deploy the Redpanda cluster (3 brokers)
kubectl apply -f enterprise/k8s/redpanda-cluster.yaml
# 3. Deploy the Bronze consumers (3 replicas + HPA)
kubectl apply -f enterprise/k8s/bronze-consumer-deployment.yaml
# 4. Verify
kubectl get pods -n olist-data-platform
kubectl top pod -n olist-data-platform
11.4 Service URLs
| Service | Standard | Enterprise |
|---|---|---|
| Streamlit Dashboard | http://localhost:8501 | http://localhost:8501 |
| Kafka/Redpanda UI | http://localhost:8080 | http://localhost:8080 |
| Airflow | http://localhost:8082 | — (use GitHub Actions) |
| Spark Master | http://localhost:8085 | http://localhost:8085 (Flink) |
| MinIO Console | http://localhost:9001 | http://localhost:9001 |
| Grafana | http://localhost:3000 | http://localhost:3000 |
| Prometheus | http://localhost:9090 | http://localhost:9090 |
| Schema Registry | http://localhost:8081 | http://localhost:8081 |
| Debezium Connect | — | http://localhost:8083 |
12. DATA MODELING — STAR SCHEMA
12.1 Star Schema Diagram
┌──────────────┐
│ dim_time │
│ ────────── │
│ date_key PK │
│ full_date │
│ day_name │
│ month/quarter│
│ year │
│ is_weekend │
└──────┬───────┘
│
┌──────────────┐ ┌──────┴───────┐ ┌──────────────┐
│ dim_customer │ │ fact_orders │ │ dim_product │
│ ──────────── │◄─────────── │ ─────────── │ ───────────▶│ ──────────── │
│ customer_key │ customer_ │ order_key PK │ product_ │ product_key │
│ customer_id │ key FK │ order_id │ key FK │ product_id │
│ zip_code │ │ order_status │ │ category_en │
│ city / state │ │ total_price │ │ weight │
│ order_count │ │ total_freight│ │ dimensions │
│ first/last │ │ review_score │ └──────────────┘
└──────────────┘ │ delivery_days│
│ is_late │ ┌──────────────┐
┌──────────────┐ │ purchase_ts │ │ dim_geography│
│ dim_seller │◄─────────── │ │ ───────────▶│ ──────────── │
│ ──────────── │ seller_ └──────────────┘ geography_ │ geo_key │
│ seller_key │ key FK key FK │ zip_code │
│ seller_id │ │ city / state │
│ city / state │ │ region │
│ total_orders │ │ lat / lng │
└──────────────┘ └──────────────┘
12.2 Business Rules
| KPI | Formula |
|---|---|
| GMV | SUM(total_price + total_freight) |
| AOV | GMV / COUNT(DISTINCT order_id) |
| Late Delivery | delivery_days > estimated_delivery_days |
| NPS Proxy | (% review_score=5) − (% review_score=1) |
| RFM Segment | Recency×Frequency×Monetary quintile scoring |
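The RFM quintile scoring in the last row can be sketched in pure Python. This is an illustrative version under the usual RFM convention (5 = best, with recency inverted because a recent purchase is better); the customer records and function name are made up for the example.

```python
# Score each customer 1..5 per dimension by quintile rank, then
# concatenate into an RFM segment code like "555".

def quintile_scores(values, reverse=False):
    """Rank values into quintiles 1..5 (5 = best). For recency,
    lower is better, so pass reverse=True."""
    order = sorted(range(len(values)), key=lambda i: values[i],
                   reverse=reverse)
    scores = [0] * len(values)
    for rank, i in enumerate(order):
        scores[i] = rank * 5 // len(values) + 1
    return scores

recency   = [5, 40, 120, 10, 300]      # days since last order
frequency = [12, 3, 1, 8, 1]           # number of orders
monetary  = [900.0, 150.0, 40.0, 600.0, 25.0]  # total spend

r = quintile_scores(recency, reverse=True)
f = quintile_scores(frequency)
m = quintile_scores(monetary)
rfm = [f"{a}{b}{c}" for a, b, c in zip(r, f, m)]
# customer 0 scores "555" (recent, frequent, high-value);
# customer 4 scores "121" (churned, low-value)
```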
13. EVALUATION & BENCHMARKS
13.1 ML Model Performance
| Model | Accuracy | F1-Score | AUC-ROC | CV F1 (5-fold) |
|---|---|---|---|---|
| Decision Tree (Entropy) | ~0.78 | ~0.85 | ~0.72 | ~0.84±0.01 |
| Decision Tree (Gini) | ~0.78 | ~0.85 | ~0.72 | ~0.84±0.01 |
| Gaussian Naive Bayes | ~0.65 | ~0.72 | ~0.68 | ~0.71±0.02 |
| Random Forest | ~0.80 | ~0.87 | ~0.75 | ~0.86±0.01 |
13.2 Top Features (Random Forest)
- `delivery_delay` — the strongest driver of satisfaction
- `delivery_days` — delivery time
- `freight_ratio` — shipping fee to product price ratio
- `total_price` — order value
- `is_late` — whether the delivery was late
13.3 Unit Tests
$ make test
# 19 tests:
# TestDataPreprocessing: 4 tests
# TestAssociationRules: 4 tests
# TestClustering: 3 tests
# TestClassification: 3 tests
# TestETL: 3 tests
# TestGovernance: 2 tests
14. MAPPING AGAINST THE ASSIGNMENT REQUIREMENTS
Phase 1: Platform Setup and Raw Data Ingestion
| Step | Requirement | Status | File/Details |
|---|---|---|---|
| 1.1 | Register MotherDuck, create a database | ⚠️ Missing | The project uses PostgreSQL + ClickHouse instead of MotherDuck. MotherDuck should be added if the instructor requires it. |
| 1.2 | Find a dataset on HuggingFace | ✅ | thanhtai435/agentic-bi-ecommerce |
| 1.3 | Create a GitHub repo + Secrets | ⚠️ Not found | A separate private GitHub repo with MOTHERDUCK_TOKEN and HF_TOKEN still needs to be created |
| 1.4 | ingest.py script (datasets → DuckDB/MotherDuck) | ⚠️ Substituted | transforms/bronze_to_silver.py exists, but it uses Pandas CSV instead of the datasets library + DuckDB |
| 1.5 | GitHub Actions (CRON at midnight) | ⚠️ Substituted | An Airflow DAG (daily_etl_dag.py) runs at 02:00 UTC; equivalent, but not GitHub Actions |
Phase 2: Building the Lakehouse Core
| Step | Requirement | Status | File/Details |
|---|---|---|---|
| 2.1 | dbt connected to MotherDuck | ⚠️ SQL exists but not connected | dbt_project/models/dbt_models.sql (dbt_project.yml and profiles.yml are missing) |
| 2.2 | Silver: filter junk records, handle NULLs, normalize formats | ✅ | transforms/bronze_to_silver.py (8 tables: dedup, validate, type cast) |
| 2.3 | Gold: Fact + Dimension (Kimball) + Wide Tables | ✅ | transforms/silver_to_gold.py + init-scripts/postgres/01_init_dw.sql |
Phase 3: AI Training and Validation
| Step | Requirement | Status | File/Details |
|---|---|---|---|
| 3.1 | Google Colab connection to pull Gold data | ⚠️ No notebook yet | A .ipynb file for Colab still needs to be added |
| 3.2 | Recommendation System | ✅ (equivalent) | association_rules.py + customer_segmentation.py |
| 3.3 | Precision@K, Recall@K, SHAP | ⚠️ Partial | Accuracy, F1, AUC-ROC, and CV are covered; Precision@K, Recall@K, and SHAP are not |
| 3.4 | Export results back into the DW | ✅ | Results saved as CSV/Parquet + in_database_ml.sql |
Phase 4: Dashboard & GenAI
| Step | Requirement | Status | File/Details |
|---|---|---|---|
| 4.1 | Streamlit revenue charts | ✅ | frontend/streamlit_app.py |
| 4.2 | Groq API integration (Llama 3) | ⚠️ Substituted | Uses smolagents + Qwen3-32B instead of Groq + Llama 3 |
| 4.3 | Deploy to Streamlit Cloud | ⚠️ Not deployed | The code exists but has not been deployed to Streamlit Cloud |
Enterprise Patterns (internship bonus credit)
| Pattern | Status | File |
|---|---|---|
| CDC (Debezium) | ✅ | enterprise/debezium/connectors/ |
| Exactly-once Producer | ✅ | docker-compose.enterprise.yml |
| Idempotent Consumer | ✅ | enterprise/consumers/bronze_consumer.py |
| Schema Registry (Avro) | ✅ | enterprise/schemas/avro/ |
| Schema Evolution | ✅ | Avro backward_transitive + mergeSchema |
| Dead Letter Queue | ✅ | enterprise/consumers/bronze_consumer.py |
| Data Quality Engine | ✅ | enterprise/etl/silver_etl.py |
| Delta Lake (ACID) | ✅ | enterprise/etl/silver_etl.py |
| Time Travel | ✅ | Delta Lake versioning |
| K8s Deployment | ✅ | enterprise/k8s/ |
| Auto-scaling (HPA) | ✅ | enterprise/k8s/bronze-consumer-deployment.yaml |
| Business Alerting | ✅ | enterprise/monitoring/rules/business-alerts.yml |
15. REFERENCES
Dataset
Papers (Agentic BI)
- DataLab: A Unified Platform for LLM-Powered Business Intelligence (arXiv:2412.02205)
- APEX-SQL: Hypothesis-Verification for NL2SQL (arXiv:2602.16720)
- AgentPoirot: Autonomous Insight Discovery (arXiv:2407.06423)
Enterprise Technologies
- Debezium — CDC platform
- Redpanda — Kafka-compatible streaming (KRaft, no ZK)
- Delta Lake — Open-source storage layer for Lakehouse
- Apache Flink — Stream processing with watermarking
- Kubernetes — Container orchestration
Course Materials
Last updated: May 2026