This model is another step in my effort to create Eileithyia-20B. It was made by following the recipe below, then inverting it, and finally SLERP-merging the two resulting stacks back together at t=0.5, hopefully fusing them into one coherent block for use with Harmonia.

```yaml
slices:
  - sources:
      - model: microsoft/Orca-2-13b
        layer_range: [0, 16]
  - sources:
      - model: athirdpath/Eileithyia-13B
        layer_range: [8, 24]
  - sources:
      - model: microsoft/Orca-2-13b
        layer_range: [17, 32]
  - sources:
      - model: athirdpath/Eileithyia-13B
        layer_range: [25, 40]
merge_method: passthrough
dtype: float16
```

Thanks to Undi95 for pioneering the recipe.
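For clarity, the two remaining steps can be sketched as mergekit configs. This is a hypothetical reconstruction, not the exact configs used: I assume "inverting" means swapping which model contributes each slice, and the stack names `clevermommy-a` / `clevermommy-b` are placeholders for the two intermediate passthrough merges.

```yaml
# Step 2 (assumption): the inverted recipe, with the two source models swapped
slices:
  - sources:
      - model: athirdpath/Eileithyia-13B
        layer_range: [0, 16]
  - sources:
      - model: microsoft/Orca-2-13b
        layer_range: [8, 24]
  - sources:
      - model: athirdpath/Eileithyia-13B
        layer_range: [17, 32]
  - sources:
      - model: microsoft/Orca-2-13b
        layer_range: [25, 40]
merge_method: passthrough
dtype: float16

---
# Step 3 (sketch): SLERP the two 20B stacks back together at t=0.5
# ("clevermommy-a"/"clevermommy-b" are placeholder paths for the intermediates)
models:
  - model: clevermommy-a
  - model: clevermommy-b
merge_method: slerp
base_model: clevermommy-a
parameters:
  t: 0.5
dtype: float16
```

Because both intermediate stacks share the same 20B layer layout, the SLERP at t=0.5 interpolates each pair of corresponding weights halfway between the two orderings.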
