# output

This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the SpecLERP v4 (`splerp_v4`) merge method, with ReadyArt/Omega-Evolution-9B-v2.0 as the base.
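The `splerp_v4` method is not documented here; assuming it behaves like the SLERP (spherical linear interpolation) family of merges, each tensor of the result is an interpolation along the arc between the two models' weights, with the `t` values in the configuration below selecting the blend point (t=0 is the base model, t=1 is the other model). A minimal, illustrative sketch of that interpolation:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    Illustrative only: at t=0 the result is v0, at t=1 it is v1, and
    intermediate t values move along the great-circle arc between the
    two directions rather than the straight line between them.
    """
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the two tensors, clamped for safety.
    dot = sum(a * b for a, b in zip(v0, v1)) / max(n0 * n1, eps)
    dot = max(-1.0, min(1.0, dot))
    omega = math.acos(dot)
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

With `t: 0.5` everywhere, as in this configuration, both parent models contribute equally to every tensor.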
### Models Merged

The following models were included in the merge:

* Dxniz/NaNovel-9B
### Configuration

The following YAML configuration was used to produce this model:
```yaml
merge_method: splerp_v4
base_model: ReadyArt/Omega-Evolution-9B-v2.0
dtype: bfloat16
tokenizer_source: base
architecture: qwen3_5
modules:
  text_decoder:
    architecture: qwen3_5_text
    models:
      - model: ReadyArt/Omega-Evolution-9B-v2.0
      - model: Dxniz/NaNovel-9B
    parameters:
      equal_weight: true
      t:
        - filter: ".*self_attn\\.q_proj.*"
          value: 0.5
        - filter: ".*self_attn\\.k_proj.*"
          value: 0.5
        - filter: ".*self_attn\\.v_proj.*"
          value: 0.5
        - filter: ".*self_attn\\.o_proj.*"
          value: 0.5
        - filter: ".*linear_attn.*"
          value: 0.5
        - filter: ".*(gate_proj|up_proj|down_proj).*"
          value: 0.5
        - value: 0.5
  vision_tower:
    architecture: qwen3_5_vision
    models:
      - model: ReadyArt/Omega-Evolution-9B-v2.0
      - model: Dxniz/NaNovel-9B
    parameters:
      t: 0.5
```
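Each `filter` entry above is a regular expression matched against tensor names; the first matching entry supplies the `t` value for that tensor, and the trailing bare `- value: 0.5` acts as the catch-all default. A minimal sketch of that lookup (the parameter names used in the test are hypothetical, assuming Qwen-style naming):

```python
import re

# (pattern, t) pairs in configuration order; the (None, 0.5) entry
# is the catch-all default for tensors no filter matches.
RULES = [
    (r".*self_attn\.q_proj.*", 0.5),
    (r".*self_attn\.k_proj.*", 0.5),
    (r".*self_attn\.v_proj.*", 0.5),
    (r".*self_attn\.o_proj.*", 0.5),
    (r".*linear_attn.*", 0.5),
    (r".*(gate_proj|up_proj|down_proj).*", 0.5),
    (None, 0.5),
]

def t_for(param_name):
    """Return the interpolation weight for a given parameter name."""
    for pattern, t in RULES:
        if pattern is None or re.match(pattern, param_name):
            return t
    raise KeyError(param_name)
```

Since every value here is 0.5, the merge is an even blend throughout; the per-filter structure only matters if individual `t` values are later tuned per projection type.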