Instructions for using hf-tiny-model-private/tiny-random-LayoutLMForSequenceClassification with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use hf-tiny-model-private/tiny-random-LayoutLMForSequenceClassification with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="hf-tiny-model-private/tiny-random-LayoutLMForSequenceClassification")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("hf-tiny-model-private/tiny-random-LayoutLMForSequenceClassification")
model = AutoModelForSequenceClassification.from_pretrained("hf-tiny-model-private/tiny-random-LayoutLMForSequenceClassification")
```

- Notebooks
- Google Colab
- Kaggle
- Xet hash: ad2f8bb78620b71a7141054fd2fcc97e3dfa0204888447522408c4041deca556
- Size of remote file: 891 kB
- SHA256: 7c0ab29f88a4eb743e168d6a41acdb935eba61f6b25b9d8be8c33ab7326ad26c
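After downloading the file, its integrity can be checked against the SHA256 listed above. A minimal sketch using Python's standard `hashlib` (the local path is hypothetical, not from this page):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading in 1 MiB blocks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

# Hypothetical local path; compare the result against the SHA256 shown above.
# sha256_of("pytorch_model.bin") == "7c0ab29f88a4eb743e168d6a41acdb935eba61f6b25b9d8be8c33ab7326ad26c"
```

Streaming in blocks keeps memory use constant regardless of file size.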
Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, accelerating uploads and downloads.
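The chunking idea above can be illustrated with a toy content-defined chunker: boundaries are chosen from the bytes themselves (via a rolling hash), so identical regions in different files split into identical, deduplicable chunks. This is a minimal sketch of the general technique, not Xet's actual algorithm or parameters:

```python
import hashlib

def chunk_bytes(data: bytes, mask: int = 0x3F, min_size: int = 8) -> list:
    """Split data at content-defined boundaries using a simple rolling hash.

    A boundary is declared when the low bits of the rolling hash are zero,
    giving chunks of ~(mask + 1) bytes on average. Toy illustration only.
    """
    chunks = []
    start = 0
    h = 0
    for i, b in enumerate(data):
        h = ((h << 1) + b) & 0xFFFFFFFF
        if i - start >= min_size and (h & mask) == 0:
            chunks.append(data[start:i + 1])
            start = i + 1
            h = 0
    if start < len(data):
        chunks.append(data[start:])
    return chunks

def unique_chunk_ids(data: bytes) -> set:
    """Identify chunks by hash so duplicates are stored only once."""
    return {hashlib.sha256(c).hexdigest() for c in chunk_bytes(data)}
```

Because boundaries depend on content rather than fixed offsets, inserting bytes near the start of a file only changes the chunks around the edit; the rest keep their hashes and need not be re-uploaded.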