prototype-0.4x238

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the Multi-SLERP merge method.
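
Multi-SLERP generalizes spherical linear interpolation (SLERP) from two models to several: rather than averaging weight tensors directly, it blends them on the hypersphere, interpolating direction and magnitude separately. The snippet below is a minimal illustrative sketch of that idea in PyTorch (a normalized weighted mean of unit directions, with linearly blended norms); it is not mergekit's actual multislerp implementation, which handles edge cases and numerics more carefully.

import torch

def multislerp_sketch(tensors, weights, eps=1e-8):
    # Toy blend of several equally-shaped tensors on the hypersphere.
    w = torch.tensor(weights, dtype=torch.float32)
    w = w / w.sum()                           # normalize weights to sum to 1
    flat = torch.stack([t.flatten().float() for t in tensors])
    norms = flat.norm(dim=1, keepdim=True)
    unit = flat / norms.clamp_min(eps)        # project each tensor onto the unit sphere
    direction = (w.unsqueeze(1) * unit).sum(dim=0)
    direction = direction / direction.norm().clamp_min(eps)
    magnitude = w @ norms.squeeze(1)          # linearly interpolate magnitudes
    return (magnitude * direction).reshape(tensors[0].shape)

# Example: blend three random tensors with illustrative weights.
a, b, c = (torch.randn(4, 4) for _ in range(3))
merged = multislerp_sketch([a, b, c], weights=[0.4, 0.3, 0.3])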

Models Merged

The following models were included in the merge:

  • /workspace/prototype-0.4x232
  • /workspace/prototype-0.4x231
  • /workspace/prototype-0.4x233

Configuration

The following YAML configuration was used to produce this model. Each weight is a list of five values, which mergekit interpolates across the layer stack as a gradient, so each source model's contribution varies with depth:

models:
  - model: /workspace/prototype-0.4x233
    parameters:
      weight: [0.4, 0.2, 0.3, 0.6, 0.3]
  - model: /workspace/prototype-0.4x232
    parameters:
      weight: [0.3, 0.6, 0.2, 0.3, 0.3]
  - model: /workspace/prototype-0.4x231
    parameters:
      weight: [0.3, 0.2, 0.5, 0.3, 0.7]
merge_method: multislerp
tokenizer:
  source: /workspace/prototype-0.4x229
chat_template: llama3
pad_to_multiple_of: 8
int8_mask: true
dtype: bfloat16
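
To reproduce a merge like this one, save the configuration above as config.yaml and run it either with the mergekit-yaml CLI (mergekit-yaml config.yaml ./output-model-directory) or through mergekit's Python API. A minimal sketch of the latter follows; note that the /workspace/... source paths are local to the original author and would need to exist on your machine.

import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the merge recipe shown above (path is illustrative).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and write the result to ./prototype-0.4x238.
run_merge(
    merge_config,
    out_path="./prototype-0.4x238",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if present
        copy_tokenizer=True,             # copy the tokenizer from the configured source
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)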

Model Details

  • Model size: 71B params
  • Tensor type: BF16 (safetensors)
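
The merged model is stored in bfloat16 and configured with the llama3 chat template, so it loads like any Llama-3-style causal LM. A minimal sketch with transformers follows; the model path is illustrative, and at roughly 71B parameters the model needs multiple GPUs or CPU offload to load.

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "./prototype-0.4x238"  # illustrative: substitute the actual local path or Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",           # requires accelerate; shards across available devices
)

messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
out = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))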