---
base_model:
- mllm-dev/gpt2_f_experiment_0
- mllm-dev/merge_diff_data_DROID
- mllm-dev/merge_diff_data_YELP
library_name: transformers
tags:
- mergekit
- merge
---

# merge_out

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the linear [DARE](https://arxiv.org/abs/2311.03099) merge method, with [mllm-dev/gpt2_f_experiment_0](https://huggingface.co/mllm-dev/gpt2_f_experiment_0) as the base model.

### Models Merged

The following models were included in the merge:

* [mllm-dev/merge_diff_data_DROID](https://huggingface.co/mllm-dev/merge_diff_data_DROID)
* [mllm-dev/merge_diff_data_YELP](https://huggingface.co/mllm-dev/merge_diff_data_YELP)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model:
  model:
    path: mllm-dev/gpt2_f_experiment_0
dtype: float16
merge_method: dare_linear
parameters:
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/merge_diff_data_DROID
    parameters:
      weight: 0.5
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/merge_diff_data_YELP
    parameters:
      weight: 0.5
  - layer_range: [0, 12]
    model:
      model:
        path: mllm-dev/gpt2_f_experiment_0
```
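
## Usage

A minimal sketch of loading the merged model with `transformers`. The repository id `mllm-dev/merge_out` is an assumption based on this card's title and organization; the prompt and sampling settings are purely illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id (inferred from the card title); substitute the actual path if it differs.
repo_id = "mllm-dev/merge_out"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Sample a short continuation from the merged 12-layer GPT-2 model.
inputs = tokenizer("The food at this restaurant was", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

To reproduce the merge itself, the configuration above can be saved to a file (e.g. `config.yaml`) and run with mergekit's `mergekit-yaml` command.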