RedHatAI/Magistral-Small-2506-FP8
24 languages · vllm · mistral · compressed-tensors · License: apache-2.0
Discussion: ValueError: RedHatAI/Magistral-Small-2506-FP8 is not a multimodal model
#2 opened Oct 7, 2025 by liamtoran

liamtoran (Oct 7, 2025):
Getting the error in the title when running your quant via vLLM. Thanks!