# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method
This model was merged using the SLERP merge method, using nbeerbower/HumanLlama-3.2-1B as the base model.
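SLERP (spherical linear interpolation) blends each pair of corresponding weight tensors along the arc between them on a hypersphere rather than along a straight line, which preserves the norm structure of the weights better than plain averaging. The sketch below is a minimal, self-contained illustration of the per-tensor math, not mergekit's actual implementation; the function name and the parallel-vector fallback are assumptions:

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    slerp(t; a, b) = sin((1 - t) * theta) / sin(theta) * a
                   + sin(t * theta)       / sin(theta) * b,
    where theta is the angle between the flattened tensors.
    """
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    # Angle between the two weight vectors.
    cos_theta = torch.clamp(
        (a_flat @ b_flat) / (a_flat.norm() * b_flat.norm() + eps), -1.0, 1.0
    )
    theta = torch.acos(cos_theta)
    if theta.abs() < eps:
        # Nearly parallel tensors: fall back to ordinary linear interpolation.
        return ((1.0 - t) * a_flat + t * b_flat).reshape(a.shape).to(a.dtype)
    sin_theta = torch.sin(theta)
    w_a = torch.sin((1.0 - t) * theta) / sin_theta  # weight on the base model
    w_b = torch.sin(t * theta) / sin_theta          # weight on the injected model
    return (w_a * a_flat + w_b * b_flat).reshape(a.shape).to(a.dtype)
```

With `t: 0.45`, the result sits slightly closer to the base model, matching the "partial infection" framing in the configuration below.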
### Models Merged
The following models were included in the merge:

* nbeerbower/HumanLlama-3.2-1B (base)
* UmbrellaInc/G-Virus.Injector_v2-3.2-1B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
# =========================
# G-Zombie-3.2-1B
# Human-class B.O.W. (Infected Host)
# =========================

base_model: nbeerbower/HumanLlama-3.2-1B
merge_method: slerp
dtype: bfloat16  # bfloat16 ensures numerical stability during spherical interpolation

parameters:
  # SLERP interpolation factor
  # 0.45 represents partial infection:
  # - Host cognition preserved
  # - Alignment degraded
  # - Behavioral instability introduced
  t: 0.45

models:
  - model: nbeerbower/HumanLlama-3.2-1B
    parameters:
      weight: 0.55  # Primary host (human cognitive baseline)
  - model: UmbrellaInc/G-Virus.Injector_v2-3.2-1B
    parameters:
      weight: 0.45
      # G-Virus acts as a destabilizing agent,
      # not a full cognitive override
```
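Assuming the configuration above is saved as `config.yaml`, the merge can be reproduced with mergekit's `mergekit-yaml config.yaml ./G-Zombie-3.2-1B` command, and the resulting checkpoint loads like any other Llama 3.2 model. A minimal usage sketch (the prompt is illustrative only):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "UmbrellaInc/G-Zombie-3.2-1B"  # or the local merge output directory

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16)

prompt = "The specimen opened its eyes and said:"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```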