---
title: Furniture Product Recommendation API
emoji: πŸ›‹οΈ
colorFrom: blue
colorTo: purple
sdk: docker
sdk_version: '3.11'
app_file: app/main.py
pinned: false
license: mit
---

πŸ›‹οΈ Furniture Product Recommendation API

AI-powered furniture product recommendation system built with FastAPI, free sentence-transformers embeddings (with optional OpenAI GPT), and the Pinecone vector database.

🌟 Features

  • πŸ” Semantic Search: Find products using natural language queries
  • πŸ’¬ Conversational Interface: Chat-based product discovery
  • πŸ€– AI Descriptions: Product descriptions generated with OpenAI GPT (optional; falls back to templates)
  • πŸ“Š Analytics Dashboard: Comprehensive product insights
  • 🎯 Similar Products: Find related items
  • πŸš€ Vector Search: Fast semantic search with Pinecone

πŸ› οΈ Tech Stack

  • Backend: FastAPI
  • AI/ML:
    • OpenAI GPT (optional, for generated descriptions)
    • Sentence Transformers (NLP)
    • PyTorch (Computer Vision)
    • LangChain (AI Orchestration)
  • Database: Pinecone Vector DB
  • NLP: HuggingFace Transformers

πŸ“‹ Prerequisites

  • Python 3.11.9
  • Pinecone API Key (Required)
  • OpenAI API Key (Optional - uses FREE sentence-transformers by default)

πŸš€ Quick Start

1. Clone the Repository

git clone https://huggingface.co/spaces/0504ankitsharma/furniture-recommendation-api
cd furniture-recommendation-api

2. Install Dependencies

# Create virtual environment
python -m venv venv

# Activate virtual environment
# Windows:
venv\Scripts\activate
# Linux/Mac:
source venv/bin/activate

# Upgrade pip
pip install --upgrade pip

# Install PyTorch (CPU version)
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu

# Install requirements
pip install -r requirements.txt

3. Configure Environment Variables

Create a .env file:

# OpenAI is OPTIONAL - leave blank to use FREE alternatives
OPENAI_API_KEY=
OPENAI_MODEL=gpt-4o-mini

PINECONE_API_KEY=your_pinecone_api_key_here
PINECONE_INDEX_NAME=ikarus
PINECONE_DIMENSION=1024
PINECONE_ENVIRONMENT=us-east-1-aws

DATA_PATH=./data/dataset.csv
EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2
CV_MODEL=efficientnet_b0
DEBUG=True
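
The variables above are presumably read by app/config.py; a minimal sketch of how such settings can be loaded with the standard library (the variable names match the .env keys above, but the loader itself is illustrative, not the actual config code):

```python
import os

def load_settings() -> dict:
    """Read app settings from environment variables, with the same
    defaults the .env template above suggests."""
    return {
        "openai_api_key": os.getenv("OPENAI_API_KEY", ""),      # optional
        "pinecone_api_key": os.getenv("PINECONE_API_KEY", ""),  # required
        "pinecone_index_name": os.getenv("PINECONE_INDEX_NAME", "ikarus"),
        "embedding_model": os.getenv(
            "EMBEDDING_MODEL", "sentence-transformers/all-MiniLM-L6-v2"
        ),
        "debug": os.getenv("DEBUG", "False").lower() == "true",
    }

settings = load_settings()
```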

4. Run the Application

uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload

Visit http://localhost:8000/docs for interactive API documentation.

πŸ“‘ API Endpoints

Recommendations

  • POST /api/recommendations/search - Search products
  • POST /api/recommendations/chat - Conversational search
  • GET /api/recommendations/similar/{product_id} - Get similar products

Analytics

  • GET /api/analytics/ - Get dataset analytics
  • GET /api/analytics/products - Get all products

Health

  • GET /health - Health check

πŸ’‘ Usage Examples

Search Products

import requests

response = requests.post(
    "http://localhost:8000/api/recommendations/search",
    json={
        "query": "modern dining chairs",
        "top_k": 5,
        "include_description": True
    }
)

print(response.json())

Chat Interface

response = requests.post(
    "http://localhost:8000/api/recommendations/chat",
    json={
        "message": "I need a comfortable office chair",
        "top_k": 3
    }
)

print(response.json())

Get Analytics

response = requests.get("http://localhost:8000/api/analytics/")
print(response.json())
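
Find Similar Products

The similar-products endpoint works the same way; the product ID below is a placeholder (use a real uniq_id from the dataset), and the top_k query parameter is assumed to behave like the POST routes above:

```python
# Requires `requests` and the API running locally (see Quick Start above).
product_id = "example-product-id"  # placeholder: use a real uniq_id from the dataset
url = f"http://localhost:8000/api/recommendations/similar/{product_id}"

# import requests
# response = requests.get(url, params={"top_k": 5})
# print(response.json())
```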

πŸ“ Project Structure

backend/
β”œβ”€β”€ app/
β”‚   β”œβ”€β”€ main.py                      # FastAPI application
β”‚   β”œβ”€β”€ config.py                    # Configuration
β”‚   β”œβ”€β”€ models.py                    # Pydantic models
β”‚   β”œβ”€β”€ database.py                  # Pinecone integration
β”‚   β”œβ”€β”€ services/
β”‚   β”‚   β”œβ”€β”€ embedding_service.py     # Text embeddings
β”‚   β”‚   β”œβ”€β”€ recommendation_service.py # Recommendation logic
β”‚   β”‚   β”œβ”€β”€ image_service.py         # Computer vision
β”‚   β”‚   └── genai_service.py         # OpenAI GPT
β”‚   β”œβ”€β”€ routes/
β”‚   β”‚   β”œβ”€β”€ recommendations.py       # Recommendation endpoints
β”‚   β”‚   └── analytics.py             # Analytics endpoints
β”‚   └── utils/
β”‚       └── data_loader.py           # Dataset loader
β”œβ”€β”€ data/
β”‚   └── dataset.csv                  # Product dataset
β”œβ”€β”€ .env                             # Environment variables
β”œβ”€β”€ requirements.txt                 # Dependencies
└── README.md                        # Documentation

πŸ”§ Configuration

All configuration is done through environment variables in .env:

  • OPENAI_API_KEY: OpenAI API key (OPTIONAL - uses FREE alternatives if not provided)
  • PINECONE_API_KEY: Pinecone API key (REQUIRED)
  • EMBEDDING_MODEL: Sentence-transformers model (FREE, runs locally, no API needed)
  • DATA_PATH: Path to the product dataset
  • DEBUG: Enable debug mode

πŸ“Š Dataset

The system uses a furniture product dataset with the following columns:

  • title, brand, description, price
  • categories, images, manufacturer
  • package_dimensions, country_of_origin
  • material, color, uniq_id
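
The real loader lives in app/utils/data_loader.py; a minimal standard-library sketch of reading such a CSV (the inline sample rows are illustrative, not real dataset entries):

```python
import csv
import io

# A tiny inline sample standing in for data/dataset.csv.
SAMPLE_CSV = """uniq_id,title,brand,price,material,color
p1,Modern Dining Chair,Acme,89.99,Oak,Natural
p2,Ergonomic Office Chair,Acme,199.00,Mesh,Black
"""

def load_products(fp) -> list[dict]:
    """Parse the product CSV into a list of row dicts keyed by column name."""
    return list(csv.DictReader(fp))

products = load_products(io.StringIO(SAMPLE_CSV))
print(products[0]["title"])  # Modern Dining Chair
```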

πŸ€– AI Features

1. Semantic Search (FREE)

Uses sentence-transformers/all-MiniLM-L6-v2 locally (no API needed, no quota limits) to create embeddings and find semantically similar products.
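
Conceptually, the ranking step reduces to cosine similarity between the query embedding and each product embedding. A toy sketch with hand-made 3-d vectors (real vectors come from the sentence-transformers model and are searched in Pinecone, not in a Python dict):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-d "embeddings"; the real model produces 384-d vectors.
product_vectors = {
    "dining chair": [0.9, 0.1, 0.0],
    "office chair": [0.7, 0.6, 0.1],
    "floor lamp":   [0.0, 0.2, 0.9],
}
query_vector = [0.8, 0.2, 0.0]  # e.g. the embedding of "wooden chair"

# Rank products by similarity to the query, highest first.
ranked = sorted(
    product_vectors,
    key=lambda name: cosine(query_vector, product_vectors[name]),
    reverse=True,
)
print(ranked[0])  # dining chair
```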

2. Generative Descriptions (Optional)

OpenAI GPT generates creative, engaging product descriptions. Falls back to template-based descriptions if no API key is provided.
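
The fallback path can be sketched as a simple template fill (the function name and template wording are illustrative, not the actual genai_service.py code):

```python
def template_description(product: dict) -> str:
    """Fallback used when no OpenAI API key is configured:
    fill a fixed template from the product's metadata."""
    return (
        f"{product.get('title', 'This product')} by {product.get('brand', 'an independent maker')} "
        f"is crafted from {product.get('material', 'quality materials')} "
        f"in a {product.get('color', 'versatile')} finish."
    )

print(template_description({
    "title": "Modern Dining Chair",
    "brand": "Acme",
    "material": "oak",
    "color": "natural",
}))  # Modern Dining Chair by Acme is crafted from oak in a natural finish.
```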

3. Image Classification

EfficientNet-based computer vision for product categorization.

4. Conversational AI

LangChain-powered chatbot for natural product discovery.

πŸ”’ Security Notes

  • Never commit .env file
  • Use environment variables for secrets
  • Enable CORS only for trusted origins in production
  • Rate limit API endpoints for production use

πŸ“ License

MIT License

πŸ‘¨β€πŸ’» Author

Ankit Sharma (@0504ankitsharma)

πŸ™ Acknowledgments

  • OpenAI
  • Pinecone
  • HuggingFace
  • FastAPI

πŸ“§ Contact

For questions or feedback, please open an issue on the repository.


Built with ❀️ using FastAPI and AI