Instructions for using hf-tiny-model-private/tiny-random-Blip2ForConditionalGeneration with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use hf-tiny-model-private/tiny-random-Blip2ForConditionalGeneration with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("visual-question-answering", model="hf-tiny-model-private/tiny-random-Blip2ForConditionalGeneration")
```

```python
# Load model directly
from transformers import AutoProcessor, AutoModelForVisualQuestionAnswering

processor = AutoProcessor.from_pretrained("hf-tiny-model-private/tiny-random-Blip2ForConditionalGeneration")
model = AutoModelForVisualQuestionAnswering.from_pretrained("hf-tiny-model-private/tiny-random-Blip2ForConditionalGeneration")
```

- Notebooks
- Google Colab
- Kaggle
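As a minimal sketch of calling the pipeline end to end, the snippet below builds an in-memory dummy image and asks a question about it. It assumes you have access to the private repo (the image size, question text, and use of a blank PIL image are illustrative choices, not requirements); since this checkpoint is randomly initialized, the answer itself is not meaningful.

```python
# Sketch: query the VQA pipeline with an in-memory image.
# Assumes access to the private model repo; any BLIP-2 VQA
# checkpoint id could be substituted.
from PIL import Image
from transformers import pipeline

pipe = pipeline(
    "visual-question-answering",
    model="hf-tiny-model-private/tiny-random-Blip2ForConditionalGeneration",
)

# A blank 64x64 RGB image stands in for real input.
image = Image.new("RGB", (64, 64), color="white")
question = "What color is the image?"

# Returns a list of answer dicts; with a tiny-random model the
# text is arbitrary, but the call shape is the same as for a
# fully trained BLIP-2 checkpoint.
result = pipe(image=image, question=question)
print(result)
```

The same `image`/`question` pair can be passed to the processor from the direct-load snippet above if you prefer to call `model.generate` yourself.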