Instructions for using optimum-intel-internal-testing/tiny-random-squeezebert with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use optimum-intel-internal-testing/tiny-random-squeezebert with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="optimum-intel-internal-testing/tiny-random-squeezebert")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("optimum-intel-internal-testing/tiny-random-squeezebert")
model = AutoModel.from_pretrained("optimum-intel-internal-testing/tiny-random-squeezebert")
```
- Notebooks
- Google Colab
- Kaggle
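The `feature-extraction` pipeline returns, for each input text, a nested list shaped `[1, num_tokens, hidden_size]` of per-token embeddings. A common follow-up step is to mean-pool over the token axis to get a single fixed-size sentence vector. A minimal sketch of that pooling, using a small dummy output in place of a real `pipe(...)` call (the shapes and values here are illustrative assumptions, not output from this model):

```python
# Dummy pipeline output standing in for pipe("some text"):
# shape [1, num_tokens=3, hidden_size=4].
features = [[[1.0, 2.0, 3.0, 4.0],
             [5.0, 6.0, 7.0, 8.0],
             [9.0, 10.0, 11.0, 12.0]]]

tokens = features[0]          # drop the batch axis -> [num_tokens, hidden_size]
hidden_size = len(tokens[0])

# Mean-pool over the token axis to get one sentence-level vector.
sentence_vector = [
    sum(tok[i] for tok in tokens) / len(tokens)
    for i in range(hidden_size)
]
print(sentence_vector)  # [5.0, 6.0, 7.0, 8.0]
```

The same pooling applies to the real pipeline output once the model has been downloaded; only the list dimensions change with the input length and the model's hidden size.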