# sean_test_merge_out

This model, mllm-dev/gpt2_m_experiment_linear, is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the linear merge method.
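For intuition, a linear merge is just a weighted average of corresponding parameters across the source models, with the weights normalized to sum to 1. A minimal sketch in plain Python, using dicts of floats to stand in for model state dicts (names and shapes here are illustrative, not mergekit's actual API):

```python
def linear_merge(state_dicts, weights):
    """Weighted average of corresponding parameters across models.

    state_dicts: list of {param_name: list of floats} (toy stand-ins
    for real model state dicts); weights: one weight per model,
    normalized so they sum to 1 before averaging.
    """
    total = sum(weights)
    norm = [w / total for w in weights]
    merged = {}
    for name in state_dicts[0]:
        n = len(state_dicts[0][name])
        merged[name] = [
            sum(w * sd[name][i] for w, sd in zip(norm, state_dicts))
            for i in range(n)
        ]
    return merged

# With all weights equal to 1.0 (as in the configuration below),
# normalization makes this a uniform average of the models.
a = {"w": [1.0, 2.0]}
b = {"w": [3.0, 4.0]}
print(linear_merge([a, b], [1.0, 1.0]))  # {'w': [2.0, 3.0]}
```

Because every model in this merge uses weight 1.0, the result is simply the element-wise mean of the ten source models' parameters.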

### Models Merged

The following models were included in the merge:

* mllm-dev/gpt2_f_experiment_0
* mllm-dev/gpt2_f_experiment_1
* mllm-dev/gpt2_f_experiment_2
* mllm-dev/gpt2_f_experiment_3
* mllm-dev/gpt2_f_experiment_4
* mllm-dev/gpt2_f_experiment_5
* mllm-dev/gpt2_f_experiment_6
* mllm-dev/gpt2_f_experiment_7
* mllm-dev/gpt2_f_experiment_8
* mllm-dev/gpt2_f_experiment_9

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float16
merge_method: linear
slices:
- sources:
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_0
    parameters:
      weight: 1.0
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_1
    parameters:
      weight: 1.0
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_2
    parameters:
      weight: 1.0
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_3
    parameters:
      weight: 1.0
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_4
    parameters:
      weight: 1.0
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_5
    parameters:
      weight: 1.0
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_6
    parameters:
      weight: 1.0
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_7
    parameters:
      weight: 1.0
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_8
    parameters:
      weight: 1.0
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_9
    parameters:
      weight: 1.0
```
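Assuming mergekit is installed, a configuration like the one above is typically applied with its `mergekit-yaml` command-line tool; the file and output-directory names below are illustrative:

```shell
# Install mergekit, then run the merge described by the YAML config.
# The config filename and output path are placeholders.
pip install mergekit
mergekit-yaml config.yml ./sean_test_merge_out
```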