```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("FriendliAI/deepseek-vl2-tiny", dtype="auto")
```

## Quick Links

- [deepseek-ai/deepseek-vl2-tiny](https://huggingface.co/deepseek-ai/deepseek-vl2-tiny)
- Model creator: deepseek-ai
- Original model: deepseek-vl2-tiny
## Differences
- Added missing chat template to tokenizer_config.json
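A chat template is a Jinja template stored in `tokenizer_config.json` that turns a list of role/content messages into a single prompt string. The snippet below is a minimal, self-contained sketch of that mechanism using `jinja2` directly; the template string here is hypothetical and is not the actual template added to this repository.

```python
# Sketch of how a chat template renders messages into a prompt.
# The template string below is illustrative only, not the one shipped
# in this repository's tokenizer_config.json.
from jinja2 import Template

chat_template = (
    "{% for message in messages %}"
    "<|{{ message['role'] }}|>: {{ message['content'] }}\n"
    "{% endfor %}"
    "<|assistant|>:"
)

messages = [
    {"role": "user", "content": "What animal is on the candy?"},
]

# Render the message list into the flat prompt the model consumes.
prompt = Template(chat_template).render(messages=messages)
print(prompt)
```

In practice you would not render the template yourself: `tokenizer.apply_chat_template(messages)` reads the template from `tokenizer_config.json` and does this for you, which is why the missing template had to be added.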
## License
Refer to the license of the original model card.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-text-to-text", model="FriendliAI/deepseek-vl2-tiny")
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/p-blog/candy.JPG"},
            {"type": "text", "text": "What animal is on the candy?"},
        ],
    },
]
pipe(text=messages)
```