---
base_model:
- Novaciano/Eurinoferus-3.2-1B
- cazzz307/Abliterated-Llama-3.2-1B-Instruct
library_name: transformers
tags:
- mergekit
- merge
datasets:
- TeichAI/brainstorm-v3.1-grok-4-fast-200x
- TeichAI/grok-code-fast-1-1000x
- reedmayhew/Grok-3-reasoning-100x
---
# merge

This is a merge of pre-trained language models created with [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Arcee Fusion](https://arcee.ai) merge method, with [Novaciano/Eurinoferus-3.2-1B](https://huggingface.co/Novaciano/Eurinoferus-3.2-1B) as the base.

### Models Merged

The following models were included in the merge:

* [cazzz307/Abliterated-Llama-3.2-1B-Instruct](https://huggingface.co/cazzz307/Abliterated-Llama-3.2-1B-Instruct)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float32
out_dtype: bfloat16
merge_method: arcee_fusion
base_model: Novaciano/Eurinoferus-3.2-1B
models:
  - model: Novaciano/Eurinoferus-3.2-1B
    parameters:
      weight:
        - filter: mlp
          value: [1, 2]
        - value: 1
  - model: cazzz307/Abliterated-Llama-3.2-1B-Instruct
    parameters:
      weight:
        - filter: lm_head
          value: 1
        - value: [1, 0.5]
```
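In this configuration, a list-valued parameter such as `value: [1, 0.5]` is a gradient: mergekit linearly interpolates between the listed anchor values across the model's layers, so the second model's non-`lm_head` weights taper from 1.0 at the first layer to 0.5 at the last. The sketch below is a minimal illustration of that interpolation, not mergekit's actual implementation; the helper `gradient_weight` is a hypothetical name introduced here.

```python
# Hypothetical helper illustrating mergekit's gradient semantics:
# a list of anchor values is spread evenly across the layer range
# and linearly interpolated in between.
def gradient_weight(values, layer_idx, num_layers):
    """Return the interpolated weight for a given layer."""
    if len(values) == 1:
        return values[0]
    # Normalized position of this layer in [0, 1].
    pos = layer_idx / (num_layers - 1) if num_layers > 1 else 0.0
    # Map the position onto the anchor segments.
    scaled = pos * (len(values) - 1)
    lo = min(int(scaled), len(values) - 2)
    frac = scaled - lo
    return values[lo] + frac * (values[lo + 1] - values[lo])

# For a 16-layer Llama-3.2-1B-style stack, `value: [1, 0.5]` yields
# weight 1.0 at layer 0 tapering linearly to 0.5 at layer 15.
weights = [gradient_weight([1, 0.5], i, 16) for i in range(16)]
```

Under this reading, `value: [1, 2]` on the base model's `mlp` filter ramps the MLP weight from 1 up to 2 toward the deeper layers, while scalar values like `value: 1` apply uniformly.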