Chem2

Experimental merge of GigaMag-Behemoth-123b and Behemoth-Tess-Maid-123b with Behemoth 1.2 as the base, just for fun.

GigaMag-Behemoth-123b = (Behemoth 1.2 & Gigaberg Slerp) + (Behemoth 1.2 & Magnum-v4 Slerp)

Behemoth-Tess-Maid-123b = (Behemoth 1.2 & Tess Slerp) + (Behemoth 1.2 & Lumimaid 0.2 Slerp)

Merge Details

Merge Method

This model was merged using the SLERP merge method.
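For context: SLERP (spherical linear interpolation) blends two checkpoints along the arc between their weight tensors on a hypersphere rather than along a straight line, which tends to preserve each model's weight geometry better than a plain weighted average. A minimal NumPy sketch of the per-tensor operation (illustrative only; the function name and the near-parallel fallback threshold are assumptions, not mergekit's exact internals):

import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors v0 and v1."""
    v0_flat = v0.ravel()
    v1_flat = v1.ravel()
    # Normalize copies to measure the angle between the two tensors
    v0_n = v0_flat / (np.linalg.norm(v0_flat) + eps)
    v1_n = v1_flat / (np.linalg.norm(v1_flat) + eps)
    dot = float(np.clip(np.dot(v0_n, v1_n), -1.0, 1.0))
    if abs(dot) > 0.9995:
        # Nearly parallel tensors: plain linear interpolation is numerically safer
        return (1.0 - t) * v0 + t * v1
    theta = np.arccos(dot)
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    # Interpolate the original (unnormalized) tensors along the arc
    return (s0 * v0_flat + s1 * v1_flat).reshape(v0.shape)

mergekit applies this tensor by tensor, with t controlling how far the result moves from the base model (t = 0) toward the other model (t = 1).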

Models Merged

The following models were included in the merge:

  • /workspace/Behemoth-Tess-Maid-123b
  • /workspace/cache/models--bruhzair--GigaMag-Behemoth-123b/snapshots/213921094eb5aa5b640db0cb8cd7fd416b8f1f22

Configuration

The following YAML configuration was used to produce this model:

base_model: /workspace/cache/models--bruhzair--GigaMag-Behemoth-123b/snapshots/213921094eb5aa5b640db0cb8cd7fd416b8f1f22
dtype: bfloat16
merge_method: slerp
parameters:
  t:
  - filter: self_attn
    value: [0.1, 0.2, 0.4, 0.8, 0.4, 0.2, 0.1]
  - filter: mlp
    value: [0.1, 0.2, 0.4, 0.8, 0.4, 0.2, 0.1]
  - value: 0.5
slices:
- sources:
  - layer_range: [0, 88]
    model: /workspace/cache/models--bruhzair--GigaMag-Behemoth-123b/snapshots/213921094eb5aa5b640db0cb8cd7fd416b8f1f22
  - layer_range: [0, 88]
    model: /workspace/Behemoth-Tess-Maid-123b
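The t schedule above is interpolated across the 88-layer range, so the middle layers pull harder toward Behemoth-Tess-Maid (t up to 0.8) while the first and last layers stay closer to the GigaMag base (t = 0.1). To reproduce the merge, the config can be fed to mergekit either via the mergekit-yaml CLI or its Python API. A hedged sketch of the latter (import paths and options reflect recent mergekit releases and may differ in yours; file and output paths are placeholders):

import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (path is a placeholder)
with open("merge-config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge; cuda=True assumes a GPU is available
run_merge(
    merge_config,
    out_path="./Chem2-merged",
    options=MergeOptions(
        cuda=True,
        copy_tokenizer=True,
        lazy_unpickle=True,  # stream tensors to keep host RAM usage down
    ),
)

The roughly equivalent CLI call would be: mergekit-yaml merge-config.yaml ./Chem2-merged --cuda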