Instructions to use Sayan01/tiny-bert-mrpc-distilled with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Sayan01/tiny-bert-mrpc-distilled with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Sayan01/tiny-bert-mrpc-distilled")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Sayan01/tiny-bert-mrpc-distilled")
model = AutoModelForSequenceClassification.from_pretrained("Sayan01/tiny-bert-mrpc-distilled")
```

- Notebooks
- Google Colab
- Kaggle
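MRPC is a sentence-pair (paraphrase) task, so the pipeline above expects two sentences per input rather than one. A minimal sketch of an inference call, passing the pair as `text`/`text_pair` (the example sentences are invented for illustration):

```python
from transformers import pipeline

pipe = pipeline("text-classification", model="Sayan01/tiny-bert-mrpc-distilled")

# MRPC asks whether the two sentences are paraphrases of each other,
# so both members of the pair are supplied in a single input.
result = pipe({
    "text": "The company said its profits rose last quarter.",
    "text_pair": "Profits increased last quarter, the company reported.",
})
print(result)  # a label (e.g. LABEL_0/LABEL_1) with a confidence score
```

The label names depend on how the model's config maps class indices; check the repo's `config.json` (`id2label`) to interpret them.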
This model has 1 file scanned as suspicious.