---
base_model:
  - TheDrummer/Precog-24B-v1
  - TheDrummer/Cydonia-24B-v4.3
library_name: transformers
tags:
  - mergekit
  - merge
  - roleplay
language:
  - en
---

# PredoniaV2.1

This version uses a different merge method (TIES) and seems to perform better.

## Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: E:\AI\Precog
    parameters:
      density: [1.0, 0.75, 0.50] # density gradient
      weight: 1.0
  - model: E:\AI\Cydonia 4.3
    parameters:
      density: 0.35
      weight: [0, 0.3, 0.4, 0.5] # weight gradient
merge_method: ties
base_model: E:\AI\Precog
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
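The bracketed values are gradients: mergekit interpolates a list-valued parameter linearly across the model's layers, so Precog's density eases from 1.0 down to 0.50 while Cydonia's weight ramps from 0 up to 0.5. A minimal sketch of how that interpolation works, assuming a 40-layer model (the `gradient_value` helper is hypothetical, for illustration only, and is not mergekit's actual API):

```python
def gradient_value(gradient, layer_idx, num_layers):
    """Linearly interpolate a gradient list for one layer.

    The list entries are treated as evenly spaced anchor points
    spanning the layer range [0, num_layers - 1].
    """
    if len(gradient) == 1:
        return gradient[0]
    # Position of this layer in [0, 1].
    t = layer_idx / (num_layers - 1)
    # Which segment of the gradient we fall into, and how far along it.
    scaled = t * (len(gradient) - 1)
    lo = min(int(scaled), len(gradient) - 2)
    frac = scaled - lo
    return gradient[lo] * (1 - frac) + gradient[lo + 1] * frac


# Precog's density gradient from the config above, over 40 layers:
density = [1.0, 0.75, 0.50]
print(gradient_value(density, 0, 40))   # first layer -> 1.0
print(gradient_value(density, 39, 40))  # last layer -> 0.5
```

The first model therefore contributes densely in the early layers and more sparsely toward the output, while Cydonia's influence grows in the same direction.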