Semantic Product Search Model

This model performs semantic product search using BERT embeddings and a dual-encoder neural network architecture.

Model Architecture

  • Base Model: BERT-base-uncased for text embeddings
  • Encoder: Dual-encoder architecture with separate query and product encoders
  • Similarity Network: Multi-layer perceptron for relevance scoring
  • Input Dimension: 768 (BERT embedding size)
  • Hidden Dimensions: [512, 256, 128]
  • Dropout: 0.3
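
As a concrete illustration of the similarity network above, here is a stdlib-only Python sketch of an MLP relevance scorer with the listed dimensions. It is a toy stand-in, not the trained model: the weights are random (the real ones live in pytorch_model.bin), dropout is omitted as it would be at inference, and the choice to feed the MLP the element-wise product of the two 768-d encoder outputs is our assumption, not stated in this card.

```python
import math
import random

rng = random.Random(0)

def make_layer(n_in, n_out):
    # He-style random initialization; placeholder for the trained weights
    scale = (2.0 / n_in) ** 0.5
    w = [[rng.gauss(0.0, scale) for _ in range(n_in)] for _ in range(n_out)]
    b = [0.0] * n_out
    return w, b

def linear(x, layer):
    w, b = layer
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi
            for row, bi in zip(w, b)]

def relu(x):
    return [max(0.0, v) for v in x]

def sigmoid(z):
    # Numerically stable sigmoid
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

# Dimensions from the model card: 768 -> 512 -> 256 -> 128 -> 1
dims = [768, 512, 256, 128, 1]
layers = [make_layer(dims[i], dims[i + 1]) for i in range(len(dims) - 1)]

def relevance_score(query_emb, product_emb):
    # Assumption: combine the two encoder outputs element-wise
    x = [q * p for q, p in zip(query_emb, product_emb)]
    for i, layer in enumerate(layers):
        x = linear(x, layer)
        if i < len(layers) - 1:
            x = relu(x)
    return sigmoid(x[0])

query_emb = [rng.gauss(0.0, 1.0) for _ in range(768)]
product_emb = [rng.gauss(0.0, 1.0) for _ in range(768)]
score = relevance_score(query_emb, product_emb)  # a value in (0, 1)
```

In the actual model the two 768-d inputs come from the separate BERT-based query and product encoders; the dual-encoder split is what allows product embeddings to be computed once and reused across queries.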

Usage

See the load_and_run_frontend.py script for loading and using this model.
This model is not intended for direct use with the transformers pipeline API; the custom dual-encoder head must be reconstructed from config.json before the weights are loaded.

Files

  • pytorch_model.bin: Model weights
  • config.json: Model configuration
  • tokenizer files: BERT tokenizer files
  • product_catalog.parquet: Product catalog for search
  • product_embeddings.npy: Precomputed product embeddings (optional)
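
A common way to use precomputed embeddings like product_embeddings.npy is a nearest-neighbor scan over the catalog. The stdlib-only sketch below illustrates the idea with cosine similarity on toy 3-d vectors; in practice you would load the real 768-d embeddings with numpy.load and embed the query with the query encoder (the similarity-MLP scoring path is separate and not shown here).

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_emb, product_embs, k=10):
    # Rank catalog entries by similarity to the query embedding
    scored = [(cosine(query_emb, emb), i) for i, emb in enumerate(product_embs)]
    scored.sort(reverse=True)
    return [i for _, i in scored[:k]]

# Toy 3-d embeddings stand in for the real 768-d vectors
products = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.7, 0.7, 0.0]]
query = [1.0, 0.1, 0.0]
best = top_k(query, products, k=2)  # indices of the two closest products
```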

Performance

Trained on the Amazon Shopping Queries Dataset, with the following evaluation metrics:

  • NDCG@10: ~0.54
  • MAP: ~0.54
  • Precision@10: ~0.50
  • Recall@10: ~0.54
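
NDCG@10, the headline metric above, compares the discounted gain of the model's ranking against the ideal ordering of the same items. A minimal stdlib computation on hypothetical graded relevance labels (the labels below are illustrative, not from the dataset):

```python
import math

def dcg(relevances, k=10):
    # Discounted cumulative gain over the top-k ranked positions
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg(ranked_relevances, k=10):
    # Normalize by the DCG of the ideal (descending-relevance) ordering
    ideal = dcg(sorted(ranked_relevances, reverse=True), k)
    return dcg(ranked_relevances, k) / ideal if ideal > 0 else 0.0

# Toy graded labels (2 = exact match, 1 = substitute, 0 = irrelevant),
# listed in the order the model ranked the products
ranking = [2, 0, 1, 2, 0, 1, 0, 0, 0, 0]
score_at_10 = ndcg(ranking, k=10)
```

A reported NDCG@10 of ~0.54 is the mean of this per-query score over the evaluation set.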