---
base_model:
  - mlabonne/Hermes-3-Llama-3.1-70B-lorablated
  - Mawdistical/Lured-Lapine-70B
  - Sao10K/L3-70B-Euryale-v2.1
  - Sao10K/L3.1-70B-Hanami-x1
  - Sao10K/70B-L3.3-Cirrus-x1
library_name: transformers
tags:
  - mergekit
  - merge
---

# MERGE2

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
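
To reproduce the merge, mergekit can consume the YAML in the Configuration section directly. Below is a minimal sketch following the usage example in mergekit's README, assuming the config is saved as `config.yml`; the `MergeOptions` fields are mergekit's and may shift between versions. The `mergekit-yaml config.yml ./output` CLI is the equivalent one-liner.

```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML from the Configuration section below (saved as config.yml).
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge; with dtype float32 a 70B merge needs substantial RAM/VRAM.
run_merge(
    merge_config,
    out_path="./Amethyst-DL-V1-70B",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
        lazy_unpickle=True,  # reduces peak memory while loading shards
        low_cpu_memory=False,
    ),
)
```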

## Merge Details

### Merge Method

This model was merged using the [Linear DELLA](https://arxiv.org/abs/2406.11617) merge method, with [mlabonne/Hermes-3-Llama-3.1-70B-lorablated](https://huggingface.co/mlabonne/Hermes-3-Llama-3.1-70B-lorablated) as the base.
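
Linear DELLA operates on task vectors (each model's delta from the base): deltas are stochastically pruned with magnitude-dependent drop probabilities, rescaled to stay unbiased, and then combined linearly. The NumPy sketch below is an illustrative toy of that mechanism, not mergekit's implementation; the `density`, `epsilon`, `lam`, and `weights` names mirror the parameters in the configuration below, and the exact rank-to-probability mapping here is an assumption for illustration.

```python
import numpy as np

def della_linear_toy(base, experts, weights, density=0.7, epsilon=0.2, lam=1.1, seed=0):
    """Toy illustration of Linear DELLA on flat arrays (not mergekit's code)."""
    rng = np.random.default_rng(seed)
    merged_delta = np.zeros_like(base)
    for expert, w in zip(experts, weights):
        delta = expert - base  # task vector
        # Rank parameters by magnitude (0 = smallest, n-1 = largest).
        ranks = np.argsort(np.argsort(np.abs(delta)))
        # Keep-probabilities span [density - epsilon/2, density + epsilon/2],
        # with larger-magnitude deltas more likely to survive.
        p_keep = (density - epsilon / 2) + epsilon * ranks / max(len(delta) - 1, 1)
        mask = rng.random(delta.shape) < p_keep
        # Rescale survivors by 1/p so the pruned delta is unbiased, then scale by lambda.
        pruned = np.where(mask, delta / p_keep, 0.0) * lam
        merged_delta += w * pruned  # linear combination (no normalization, as in the config)
    return base + merged_delta

base = np.zeros(4)
experts = [np.array([0.4, -0.1, 0.0, 0.2]), np.array([-0.2, 0.3, 0.1, 0.0])]
print(della_linear_toy(base, experts, weights=[0.5, 0.5]))
```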

### Models Merged

The following models were included in the merge:

* [Mawdistical/Lured-Lapine-70B](https://huggingface.co/Mawdistical/Lured-Lapine-70B)
* [Sao10K/L3.1-70B-Hanami-x1](https://huggingface.co/Sao10K/L3.1-70B-Hanami-x1)
* [Sao10K/L3-70B-Euryale-v2.1](https://huggingface.co/Sao10K/L3-70B-Euryale-v2.1)
* [Sao10K/70B-L3.3-Cirrus-x1](https://huggingface.co/Sao10K/70B-L3.3-Cirrus-x1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Mawdistical/Lured-Lapine-70B
    parameters:
      weight: 0.20
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: Sao10K/L3.1-70B-Hanami-x1
    parameters:
      weight: 0.20
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: Sao10K/L3-70B-Euryale-v2.1
    parameters:
      weight: 0.20
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: Sao10K/70B-L3.3-Cirrus-x1
    parameters:
      weight: 0.20
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: mlabonne/Hermes-3-Llama-3.1-70B-lorablated
    parameters:
      weight: 0.20
      density: 0.7
      epsilon: 0.1
      lambda: 1.0
merge_method: della_linear
base_model: mlabonne/Hermes-3-Llama-3.1-70B-lorablated
parameters:
  normalize: false
  int8_mask: true
dtype: float32
out_dtype: bfloat16
chat_template: llama3
tokenizer:
  source: union
  pad_to_multiple_of: 8
```
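
Since the config writes the merged weights in `bfloat16` with a `llama3` chat template, the result loads like any Llama 3.1 70B checkpoint. A minimal sketch with the `transformers` library, assuming the repo id `Tarek07/Amethyst-DL-V1-70B` (inferred from this page's title; adjust if the model lives elsewhere):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Tarek07/Amethyst-DL-V1-70B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the merge config
    device_map="auto",           # a 70B model needs multiple GPUs or offloading
)

# chat_template: llama3 means the standard chat API applies.
messages = [{"role": "user", "content": "Write a two-sentence story about amethyst."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```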