Heretic-Mimma-3-1b

An abliterated (refusal-suppressed) version of pankajmathur/Mimma-3-1b, produced with Heretic.

Evaluation
Refusals (this model): 12/100
Refusals (original pankajmathur/Mimma-3-1b): 81/100
KL divergence from original: 0.1252
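The KL divergence above measures how far the abliterated model's next-token distributions drift from the original's; lower means behavior outside refusals is better preserved. A minimal sketch of the metric (illustrative code, not Heretic's actual evaluation harness):

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Toy next-token distributions from the original (p) and abliterated (q) model.
p = [0.70, 0.20, 0.10]
q = [0.65, 0.25, 0.10]
print(kl_divergence(p, q))  # small positive value: distributions are close
```

In practice this would be averaged over many prompt positions, with p and q taken from the softmax outputs of the original and modified model respectively.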

Parameters
direction_index = per layer
attn.o_proj.max_weight = 1.13
attn.o_proj.max_weight_position = 24.33
attn.o_proj.min_weight = 1.11
attn.o_proj.min_weight_distance = 6.64
mlp.down_proj.max_weight = 0.98
mlp.down_proj.max_weight_position = 20.14
mlp.down_proj.min_weight = 0.64
mlp.down_proj.min_weight_distance = 8.92
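For context, abliteration removes a learned "refusal direction" from the output-side weight matrices (here attn.o_proj and mlp.down_proj), with a per-layer strength shaped by the max_weight / max_weight_position / min_weight / min_weight_distance parameters above. A simplified numpy sketch of directional ablation; the linear kernel and all names are assumptions for illustration, not Heretic's exact implementation:

```python
import numpy as np

def ablate(W, direction, alpha):
    """Remove the component of W's output along `direction`:
    W' = W - alpha * d d^T W, where d is the unit refusal direction."""
    d = direction / np.linalg.norm(direction)
    return W - alpha * np.outer(d, d) @ W

def layer_alpha(layer, max_w, max_pos, min_w, min_dist):
    """Assumed per-layer schedule: peak strength max_w at layer max_pos,
    falling linearly to a floor of min_w at distance min_dist."""
    t = min(abs(layer - max_pos) / min_dist, 1.0)
    return max_w + (min_w - max_w) * t

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))   # stand-in for an o_proj / down_proj matrix
d = rng.standard_normal(8)        # stand-in for the extracted refusal direction

# With alpha = 1 the refusal component of any output vanishes entirely.
W_abl = ablate(W, d, alpha=1.0)
x = rng.standard_normal(8)
d_unit = d / np.linalg.norm(d)
print(abs(d_unit @ (W_abl @ x)))  # ~0: output has no component along d
```

With the attn.o_proj values above, for example, the ablation strength would peak at 1.13 around layer 24.33 and taper toward 1.11 over a distance of 6.64 layers; direction_index = "per layer" means each layer uses its own extracted direction rather than one shared vector.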


Downloads last month: 5
Model size: 1.0B params
Tensor type: BF16 (Safetensors)

Model tree for hereticness/Heretic-Mimma-3-1b
Finetuned (1): this model
Merges: 2 models
Quantizations: 2 models