Add license+base_model metadata; link to upstream

README.md (changed):

```diff
@@ -1,4 +1,6 @@
 ---
+license: apache-2.0
+base_model: mistralai/Voxtral-Mini-3B-2507
 tags:
 - voxtral
 - quantized
@@ -8,7 +10,7 @@ library_name: mlx
 
 # Voxtral 3B — Quantized (MLX)
 
-Public quantized weights of the
+Public quantized weights of the upstream model [`mistralai/Voxtral-Mini-3B-2507`](https://huggingface.co/mistralai/Voxtral-Mini-3B-2507). This repo contains MLX-ready variants only.
 
 ## Variants
 - MLX Q4: `mlx-q4/`
@@ -36,7 +38,7 @@ print(generate(model, tokenizer, prompt, max_tokens=64))
 - Embeddings are NOT quantized to preserve shape compatibility. As a result, any "bits per weight" metric may exceed the nominal target. This is informational, not an error.
 
 ## License
--
+- License: Apache-2.0 (see `LICENSE.txt`). Attribution: upstream model [`mistralai/Voxtral-Mini-3B-2507`](https://huggingface.co/mistralai/Voxtral-Mini-3B-2507).
 
 ## Issues
 If you notice any mismatch (missing files, wrong checksum), please open an issue.
```
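For reference, applying the first hunk yields the following merged frontmatter. Only the keys visible in the diff context are shown; the README may contain additional frontmatter keys outside the hunks:

```yaml
---
license: apache-2.0
base_model: mistralai/Voxtral-Mini-3B-2507
tags:
- voxtral
- quantized
library_name: mlx
---
```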
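The Issues section asks users to report checksum mismatches. As a minimal sketch of how a downloaded weight shard could be verified locally before filing an issue (the helper names and the idea of a published SHA-256 are assumptions for illustration, not part of this repo):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so large weight shards never load fully into RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_published_checksum(path: Path, expected_hex: str) -> bool:
    """True if the local file's SHA-256 equals the published hex digest."""
    return sha256_of(path) == expected_hex.lower()
```

If this returns `False` for a freshly downloaded file, that is exactly the kind of mismatch worth reporting.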