---
base_model:
  - Doctor-Shotgun/L3.3-70B-Magnum-v4-SE
  - TheDrummer/Fallen-Llama-3.3-70B-v1
library_name: transformers
tags:
  - mergekit
  - merge
---

# merged

This is a merge of pre-trained language models created using mergekit.
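
Since the card declares `library_name: transformers`, the merged weights should load like any other Llama-style causal LM. Below is a minimal sketch, assuming the repository id is `CuriousCat29/l3-stack-mocha-2` (taken from the page header) and that enough GPU memory is available; note the stacked model is substantially deeper, and therefore larger, than either 70B donor.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository id is assumed from the page header; adjust if the model lives elsewhere.
repo_id = "CuriousCat29/l3-stack-mocha-2"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype below
    device_map="auto",           # shard across available GPUs
)

prompt = "Write a short scene set in a rain-soaked city."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```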

## Merge Details

### Merge Method

This model was merged using the passthrough merge method: each slice of transformer layers is copied verbatim from its source model and the slices are stacked end to end, so no weights are averaged or interpolated.

### Models Merged

The following models were included in the merge:

* [Doctor-Shotgun/L3.3-70B-Magnum-v4-SE](https://huggingface.co/Doctor-Shotgun/L3.3-70B-Magnum-v4-SE)
* [TheDrummer/Fallen-Llama-3.3-70B-v1](https://huggingface.co/TheDrummer/Fallen-Llama-3.3-70B-v1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: bfloat16
merge_method: passthrough
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 20]
        model: TheDrummer/Fallen-Llama-3.3-70B-v1
    - sources:
      - layer_range: [10, 30]
        model: Doctor-Shotgun/L3.3-70B-Magnum-v4-SE
    - sources:
      - layer_range: [20, 40]
        model: TheDrummer/Fallen-Llama-3.3-70B-v1
    - sources:
      - layer_range: [30, 50]
        model: Doctor-Shotgun/L3.3-70B-Magnum-v4-SE
    - sources:
      - layer_range: [40, 60]
        model: TheDrummer/Fallen-Llama-3.3-70B-v1
    - sources:
      - layer_range: [50, 70]
        model: Doctor-Shotgun/L3.3-70B-Magnum-v4-SE
    - sources:
      - layer_range: [60, 80]
        model: TheDrummer/Fallen-Llama-3.3-70B-v1
```
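
The configuration interleaves overlapping 20-layer slices from the two donors, so the output stack has 7 × 20 = 140 decoder layers instead of the donors' 80. The sketch below is a small bookkeeping check of that layout, assuming the YAML above is saved as `merge.yaml` (an illustrative file name); the same file can presumably be passed to mergekit's `mergekit-yaml` CLI to reproduce the merge.

```python
import yaml  # pip install pyyaml

# Illustrative file name; paste the configuration above into it.
with open("merge.yaml") as f:
    config = yaml.safe_load(f)

slices = config["modules"]["default"]["slices"]

total_layers = 0
for i, s in enumerate(slices):
    src = s["sources"][0]
    start, end = src["layer_range"]
    print(f"slice {i}: layers {start:2d}-{end:2d} from {src['model']}")
    total_layers += end - start

print(f"output depth: {total_layers} layers")  # 7 slices x 20 layers = 140
```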