added dataset attribution and summary
README.md
CHANGED
@@ -1,5 +1,7 @@
 ---
 base_model: BSC-LT/salamandra-2b-instruct
+datasets:
+- oscar
 license: apache-2.0
 library_name: transformers
 pipeline_tag: text-generation
@@ -45,6 +47,8 @@ source repo: [BSC-LT/salamandra-2b-instruct](https://huggingface.co/BSC-LT/salamandra-2b-instruct)
 
 # Quantization summary
 
+The base model was quantized with a substantial importance matrix over all target languages (some 34x1000 samples, 96 MB of text), with samples from the [Open Super-large Crawled ALMAnaCH coRpus](/datasets/oscar-corpus/oscar) dataset.
+
 | **Quantization Type** | **PPL(Q)** | **ln(PPL(Q)/PPL(bf16))** | **File Size (G)** | **Notes** |
 |-----------------------|------------|------------------------|-------------------|----------------------------------------------------------------|
 | [**IQ3_M**](salamandra-2b-instruct_IQ3_M.gguf) | 16.774 | 0.086769 | 1.7 | Good size efficiency with acceptable PPL increase |
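The `ln(PPL(Q)/PPL(bf16))` column in the table above is the natural log of the quantized model's perplexity over the bf16 baseline's perplexity. A minimal sketch of that computation, assuming a hypothetical bf16 perplexity back-solved from the reported IQ3_M ratio (the baseline perplexity itself is not stated in this diff):

```python
import math

def log_ppl_ratio(ppl_q: float, ppl_bf16: float) -> float:
    """Natural-log perplexity ratio, as reported in the summary table."""
    return math.log(ppl_q / ppl_bf16)

# IQ3_M values from the table; the bf16 perplexity (~15.38) is
# back-solved from the reported ratio and is an assumption, not a
# number given in this README.
ppl_bf16 = 16.774 / math.exp(0.086769)
print(round(log_ppl_ratio(16.774, ppl_bf16), 6))  # → 0.086769
```

A value near 0 means the quantization barely hurt perplexity; here the ~0.087 log-ratio corresponds to roughly a 9% perplexity increase over bf16.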