
Heretic-Cotype-Nano: an abliterated (decensored) version of MTSAIR/Cotype-Nano, produced with the Heretic tool.

For quantized versions, see the model tree below.

Refusals (this model): 10/100 test prompts
Refusals (original MTSAIR/Cotype-Nano): 98/100 test prompts
KL divergence from original: 0.0903 (lower means behavior on harmless prompts is better preserved)
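The two metrics above are conceptually simple: the refusal rate is the fraction of "unsafe" prompts the model declines, and the KL divergence compares the abliterated model's next-token probability distribution with the original's. A minimal NumPy sketch of the KL computation (the exact evaluation harness is Heretic's, not shown here):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two next-token probability distributions."""
    p = np.asarray(p, dtype=np.float64) + eps
    q = np.asarray(q, dtype=np.float64) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Identical distributions diverge by ~0.0; the reported 0.0903 means
# the abliterated model's outputs stay close to the original's.
p = [0.7, 0.2, 0.1]
print(kl_divergence(p, p))
print(kl_divergence(p, [0.5, 0.3, 0.2]))
```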

Parameters
direction_index = per layer
attn.o_proj.max_weight = 1.22
attn.o_proj.max_weight_position = 22.92
attn.o_proj.min_weight = 1.09
attn.o_proj.min_weight_distance = 14.28
mlp.down_proj.max_weight = 0.85
mlp.down_proj.max_weight_position = 17.27
mlp.down_proj.min_weight = 0.34
mlp.down_proj.min_weight_distance = 10.60
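The parameters above describe Heretic's per-layer ablation weight profile for each matrix family: the ablation strength peaks at `max_weight` near the (fractional) layer index `max_weight_position` and falls off to `min_weight` at `min_weight_distance` layers away. The exact kernel shape is internal to Heretic; the sketch below assumes a hypothetical linear decay and shows the standard abliteration step itself (projecting the refusal direction out of a weight matrix):

```python
import numpy as np

def ablation_weight(layer, max_w, max_pos, min_w, min_dist):
    """Hypothetical per-layer scale: peaks at max_w near max_pos and
    decays linearly to min_w at min_dist layers away (and beyond)."""
    d = abs(layer - max_pos)
    if d >= min_dist:
        return min_w
    return max_w - (max_w - min_w) * d / min_dist

def ablate(W, v, w):
    """Remove the refusal direction v from weight matrix W, scaled by w.
    With w = 1, the outputs of W lose all components along v."""
    v = v / np.linalg.norm(v)
    return W - w * np.outer(v, v) @ W

# attn.o_proj profile from the table above
for layer in (0, 17, 23, 30):
    print(layer, round(ablation_weight(layer, 1.22, 22.92, 1.09, 14.28), 3))

# Sanity check: full-strength ablation leaves no output along v
rng = np.random.default_rng(0)
W, v = rng.normal(size=(8, 8)), rng.normal(size=8)
print(np.allclose(v @ ablate(W, v, 1.0), 0.0))  # True
```

Note that `direction_index = per layer` above means a separate refusal direction `v` is estimated for each layer rather than one shared direction.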


Downloads last month: 3
Format: Safetensors
Model size: 2B params
Tensor type: BF16

Model tree for hereticness/Heretic-Cotype-Nano

Base model: MTSAIR/Cotype-Nano
β”” Finetuned (4 models), including this model
  β”” Quantizations of this model: 2 models