MS3.2-24B-Penumbra-Aether / mergekit_config.yml
base_model: Vortex5/MS3.2-24B-Chaos-Skies
models:
- model: TheDrummer/Cydonia-24B-v4.3
- model: LatitudeGames/Hearthfire-24B
- model: Burnt-Toast/ms3.2-24b-longform
merge_method: hpq
parameters:
  strength: 0.78
  flavor: 0.48
  paradox: 0.45
  cube_dims: 20
  steps: 10
  boost: 0.50
dtype: bfloat16
tokenizer:
  source: Vortex5/MS3.2-24B-Chaos-Skies
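
A config like this is typically executed with mergekit's `mergekit-yaml` command. A minimal sketch, assuming the config above is saved locally as `mergekit_config.yml` and that the `hpq` merge method is available in the installed mergekit build (it is not part of upstream mergekit's standard method list, so a fork or plugin providing it may be required):

```shell
# Install mergekit (upstream; a fork may be needed for the hpq method)
pip install mergekit

# Run the merge: reads mergekit_config.yml, downloads the listed models
# from the Hugging Face Hub, and writes the merged model to ./MS3.2-24B-Penumbra-Aether
mergekit-yaml mergekit_config.yml ./MS3.2-24B-Penumbra-Aether \
  --cuda \              # use GPU for tensor operations if available
  --lazy-unpickle       # reduce peak RAM by loading tensors on demand
```

The output directory will contain the merged weights in safetensors format plus the tokenizer copied from the `tokenizer.source` model, ready to upload with `huggingface_hub`.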