Original model: Boreas-24B-v1.2 by Naphula
Available ExLlamaV3 (release v0.0.18) quantizations
| Type | Size |
|---|---|
| H8-4.0BPW | 13.16 GB |
| H8-6.0BPW | 18.72 GB |
| H8-8.0BPW | 24.27 GB |
Requirements: a Python installation with the `huggingface-hub` module to use the CLI.
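As a sketch, a single quantization can be fetched with the `huggingface-cli` tool that ships with `huggingface-hub`. The `--revision` branch name below is an assumption (EXL3 repos commonly keep one branch per bitrate); check the repository's branch list before running.

```shell
# Install the Hugging Face Hub client, then download one quantization
# into a local folder. The branch name "4.0bpw" is an assumption --
# verify the actual branch names on the repository page.
pip install huggingface-hub
huggingface-cli download DeathGodlike/Boreas-24B-v1.2_EXL3 \
  --revision 4.0bpw \
  --local-dir Boreas-24B-v1.2_EXL3-4.0bpw
```

Dropping `--revision` downloads the default branch instead.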
Licensing
License detected: apache-2.0
The license for the provided quantized models is derived from the original model. For additional information see the original model's page above, or, if unavailable, the files and the page backups below.
Backups
Original page
Boreas 1.2 - Radioactive Edition
The same components as v1.1, but uses the FLUX_v5 method from v1.0.
A 20-hour FLUX merge using 1000 iterations to find the perfect center.
v1.2 is a sub-component of v1.3 but seems to function very well on its own, so I am releasing it separately. It has none of the bugs associated with v1.
Compared to the RSCE method, all models were within 1% of each other, with Mullein having the highest magnitude at 8%, the rest closer to 7%.
Model tree for DeathGodlike/Boreas-24B-v1.2_EXL3
Base model
Naphula/Boreas-24B-v1.2
