# Semantic Product Search Model
This model performs semantic product search using BERT embeddings and a dual-encoder neural network architecture.
## Model Architecture
- Base Model: BERT-base-uncased for text embeddings
- Encoder: Dual-encoder architecture with separate query and product encoders
- Similarity Network: Multi-layer perceptron for relevance scoring
- Input Dimension: 768 (BERT embedding size)
- Hidden Dimensions: [512, 256, 128]
- Dropout: 0.3
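The hyperparameters above can be sketched as a PyTorch module. This is a minimal illustration, not the shipped implementation: the shape of the scoring head (a concatenation of the two 128-dim tower outputs fed to a small MLP, here named `scorer`) is an assumption, since the card only states that an MLP produces the relevance score.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Projects a 768-dim BERT embedding through the hidden stack [512, 256, 128]."""
    def __init__(self, input_dim=768, hidden_dims=(512, 256, 128), dropout=0.3):
        super().__init__()
        layers, dim = [], input_dim
        for h in hidden_dims:
            layers += [nn.Linear(dim, h), nn.ReLU(), nn.Dropout(dropout)]
            dim = h
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

class DualEncoder(nn.Module):
    """Separate query and product towers plus a hypothetical MLP relevance head."""
    def __init__(self):
        super().__init__()
        self.query_encoder = Encoder()
        self.product_encoder = Encoder()
        # Assumed head: concatenated tower outputs (128 + 128) -> scalar score.
        self.scorer = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, query_emb, product_emb):
        q = self.query_encoder(query_emb)
        p = self.product_encoder(product_emb)
        return self.scorer(torch.cat([q, p], dim=-1)).squeeze(-1)

model = DualEncoder().eval()
scores = model(torch.randn(4, 768), torch.randn(4, 768))
print(scores.shape)  # one relevance score per query/product pair
```

In a dual-encoder setup like this, the product tower can be run offline over the whole catalog (hence the optional `product_embeddings.npy`), leaving only the query tower and scoring head for request time.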
## Usage
See the `load_and_run_frontend.py` script for loading and running this model.
## Files
- `pytorch_model.bin`: Model weights
- `config.json`: Model configuration
- Tokenizer files: BERT tokenizer files
- `product_catalog.parquet`: Product catalog for search
- `product_embeddings.npy`: Precomputed product embeddings (optional)
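If the precomputed `product_embeddings.npy` is used, retrieval reduces to a nearest-neighbor lookup over the catalog. A minimal sketch with NumPy cosine similarity, using random stand-in arrays (in practice you would load the `.npy` file and map the returned row indices back into `product_catalog.parquet`):

```python
import numpy as np

def top_k_products(query_emb, product_embs, k=10):
    """Rank catalog rows by cosine similarity to a query embedding."""
    q = query_emb / np.linalg.norm(query_emb)
    p = product_embs / np.linalg.norm(product_embs, axis=1, keepdims=True)
    sims = p @ q                      # cosine similarity per product
    idx = np.argsort(-sims)[:k]      # indices of the k best matches
    return idx, sims[idx]

# Stand-in data; replace with np.load("product_embeddings.npy") and
# a real query embedding from the query encoder.
rng = np.random.default_rng(0)
embs = rng.normal(size=(100, 768))
query = rng.normal(size=768)
idx, scores = top_k_products(query, embs, k=5)
print(idx, scores)
```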
## Performance
Trained on the Amazon Shopping Queries Dataset, with the following evaluation metrics:
- NDCG@10: ~0.54
- MAP: ~0.54
- Precision@10: ~0.50
- Recall@10: ~0.54