---
base_model:
- allenai/Olmo-3.1-7B-RL-Zero-Math
- allenai/Olmo-3.1-7B-RL-Zero-Code
library_name: transformers
tags:
- mergekit
- merge
---

# merged-model

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Linear](https://arxiv.org/abs/2203.05482) merge method.

### Models Merged

The following models were included in the merge:

* [allenai/Olmo-3.1-7B-RL-Zero-Math](https://huggingface.co/allenai/Olmo-3.1-7B-RL-Zero-Math)
* [allenai/Olmo-3.1-7B-RL-Zero-Code](https://huggingface.co/allenai/Olmo-3.1-7B-RL-Zero-Code)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
# Linear merge of OLMo-3.1 Math and Code RL models
# Output = 0.5 * Math + 0.5 * Code
#
# Usage:
#   modal run modal_merge.py --config examples/olmo3.1-linear-merge.yaml --hf-repo pmahdavi/Olmo-3.1-7B-Math-Code
models:
  - model: allenai/Olmo-3.1-7B-RL-Zero-Math
    parameters:
      weight: 0.5
  - model: allenai/Olmo-3.1-7B-RL-Zero-Code
    parameters:
      weight: 0.5
merge_method: linear
dtype: bfloat16
```
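Conceptually, the linear merge method is a weighted average of the two checkpoints' parameters, computed tensor by tensor. The sketch below illustrates the arithmetic on toy parameter dicts; it is not mergekit's implementation, and the values and names are purely illustrative stand-ins for the real 7B state dicts.

```python
# Minimal sketch of a linear (weight-averaging) merge, assuming each model
# is represented as a dict mapping parameter names to values. Mergekit
# performs the same averaging per tensor across real checkpoints.

def linear_merge(models, weights):
    """Compute merged[k] = sum_i(w_i * models[i][k]) / sum_i(w_i)."""
    total = sum(weights)
    return {
        key: sum(w * m[key] for w, m in zip(weights, models)) / total
        for key in models[0]
    }

# Toy stand-ins for the Math and Code checkpoints (illustrative values)
math_model = {"layer.weight": 1.0, "layer.bias": 0.5}
code_model = {"layer.weight": 3.0, "layer.bias": 1.5}

# With weights 0.5 / 0.5 (as in the YAML above), this is a plain average
merged = linear_merge([math_model, code_model], weights=[0.5, 0.5])
# merged["layer.weight"] -> 2.0, merged["layer.bias"] -> 1.0
```

Because the two weights sum to 1.0, the normalization by `total` is a no-op here; it matters when weights are chosen that do not sum to one.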