cesomething 1b test

The following YAML configuration was used to produce this model:

```yaml
base_model: NovaCorp/CULO-MoE
dtype: bfloat16
merge_method: model_stock
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 16]
        model: NovaCorp/CULO-MoE
      - layer_range: [0, 16]
        model: UmbrellaInc/T-Virus_Isolated.NE.Enhancement-3.2-1B
      - layer_range: [0, 16]
        model: hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B
parameters:
  t: [0.6, 0.6, 0.6]
```
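For intuition, model_stock interpolates each weight tensor between the base model and the average of the fine-tuned models. A minimal NumPy sketch using the fixed `t = 0.6` from the config above (an illustration only; the actual mergekit implementation derives `t` from the geometry of the task vectors):

```python
import numpy as np

def model_stock_merge(base, finetuned, t=0.6):
    """Interpolate between the base weights and the average of the
    fine-tuned weights: merged = (1 - t) * base + t * mean(finetuned)."""
    avg = np.mean(finetuned, axis=0)
    return (1.0 - t) * base + t * avg

# Toy 2x2 "weight tensors" standing in for real model parameters.
base = np.zeros((2, 2))
ft_a = np.ones((2, 2))
ft_b = 3 * np.ones((2, 2))
merged = model_stock_merge(base, [ft_a, ft_b], t=0.6)
# mean of the fine-tuned weights is 2.0, so merged = 0.6 * 2.0 = 1.2
print(merged[0, 0])
```

In the real merge this interpolation is applied per tensor across layers 0-16 of all three source models.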
Heretic
Disobedience rate: 5% (original: 27%)
KL divergence: 0.0074
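The KL divergence above measures how far the modified model's next-token distribution drifts from the original's; a small value like 0.0074 means the outputs stay close. A minimal sketch of the computation on one pair of logit vectors (illustrative only, not Heretic's actual evaluation code):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_divergence(p_logits, q_logits):
    """KL(P || Q) between two next-token distributions, in nats."""
    p = softmax(p_logits)
    q = softmax(q_logits)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

original = [2.0, 1.0, 0.1]
modified = [2.0, 1.1, 0.1]  # slightly perturbed logits
print(kl_divergence(original, modified))
```

Identical distributions give a KL of exactly zero; the reported 0.0074 is an average of this kind of quantity over evaluation tokens.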

Parameters:

```
direction_index = per layer
attn.o_proj.max_weight = 0.87
attn.o_proj.max_weight_position = 13.11
attn.o_proj.min_weight = 0.66
attn.o_proj.min_weight_distance = 2.14
mlp.down_proj.max_weight = 0.88
mlp.down_proj.max_weight_position = 10.85
mlp.down_proj.min_weight = 0.76
mlp.down_proj.min_weight_distance = 2.71
```
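Heretic works by ablating a learned "refusal direction" from selected weight matrices (here `attn.o_proj` and `mlp.down_proj`), with a per-layer ablation strength that varies between the min and max weights listed above. A hedged NumPy sketch of the core operation, projecting a direction `r` (hypothetical here) out of a weight matrix:

```python
import numpy as np

def ablate_direction(W, r, weight=0.87):
    """Remove the component of W's output along direction r, scaled by
    `weight` (cf. attn.o_proj.max_weight above):  W' = W - weight * r r^T W"""
    r = r / np.linalg.norm(r)          # work with a unit direction
    return W - weight * np.outer(r, r) @ W

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
r = rng.standard_normal(4)
W_ablated = ablate_direction(W, r, weight=1.0)

# With weight = 1.0 the output of W along r is removed entirely.
r_unit = r / np.linalg.norm(r)
print(np.allclose(r_unit @ W_ablated, 0.0))
```

With fractional weights (0.66-0.88 as in this model) the direction is only partially suppressed, which is what keeps the KL divergence from the original model low.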

Model size: 1B params (BF16, Safetensors)

Model tree for 0xemx9ed4y77/cesomething_1b