HyX3/Magnum-12B-OpenVINO-INT4

Text Generation · OpenVINO · mistral · int4 · magnum · quantized · conversational

License: apache-2.0
Magnum 12B OpenVINO INT4
This is an OpenVINO INT4 quantized version of Magnum 12B.

Optimization:
- Precision: INT4 (asymmetric)
- Framework: OpenVINO
- Goal: CPU-based inference acceleration without significant loss in quality.
Downloads last month: 55
Model tree for HyX3/Magnum-12B-OpenVINO-INT4
- Base model: anthracite-org/magnum-v4-12b
- Quantized versions of the base model: 9 (including this model)
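The card does not state how the quantization was produced; the command below is a hypothetical sketch of the kind of `optimum-cli` invocation typically used to export a Hub model to OpenVINO IR with INT4 weight compression.

```shell
# Hypothetical export sketch (not confirmed by the model card):
# converts the base model to OpenVINO IR with INT4 weight compression.
optimum-cli export openvino \
  --model anthracite-org/magnum-v4-12b \
  --weight-format int4 \
  Magnum-12B-OpenVINO-INT4
```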