How to use juliensimon/imdb-demo-infinity with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="juliensimon/imdb-demo-infinity")

# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("juliensimon/imdb-demo-infinity", dtype="auto")
```
IMDB Sentiment – Infinity Optimized
An IMDB sentiment analysis model optimized for Hugging Face Infinity, a containerized inference solution for maximum throughput and lowest latency on Transformers models.
Video walkthrough: Optimize the prediction latency of Transformers with a single Docker command!
Model Details
| Detail | Value |
|---|---|
| Task | Binary sentiment classification (positive/negative) |
| Dataset | IMDB |
| Format | Infinity-optimized binary (infinity_model.bin) |
| Purpose | Demo model for Hugging Face Infinity benchmarks |
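At inference time, a binary sentiment head like the one described above emits two logits that are converted to probabilities with a softmax before a label is picked. A minimal sketch of that final step (the logit values and the negative/positive label order are illustrative assumptions, not read from this checkpoint's config):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits from a binary sentiment head:
# index 0 = negative, index 1 = positive (label order is an assumption).
logits = [-1.2, 2.3]
probs = softmax(logits)
label = "positive" if probs[1] > probs[0] else "negative"
print(label, round(probs[1], 3))
```

The `pipeline("text-classification", ...)` call shown earlier performs exactly this reduction internally and returns the winning label with its score.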