---
base_model:
- MrRobotoAI/158
- MrRobotoAI/153
- MrRobotoAI/155
- MrRobotoAI/Frigg-v2-8b-ACADEMIC-128K
- MrRobotoAI/154
- MrRobotoAI/Odin-v2-8b-NOVELIST-128K
library_name: transformers
tags:
- mergekit
- merge
---
# merge

13,482 14,559 LINES

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [MrRobotoAI/Odin-v2-8b-NOVELIST-128K](https://huggingface.co/MrRobotoAI/Odin-v2-8b-NOVELIST-128K) as the base model.

### Models Merged

The following models were included in the merge:
* [MrRobotoAI/158](https://huggingface.co/MrRobotoAI/158)
* [MrRobotoAI/153](https://huggingface.co/MrRobotoAI/153)
* [MrRobotoAI/155](https://huggingface.co/MrRobotoAI/155)
* [MrRobotoAI/Frigg-v2-8b-ACADEMIC-128K](https://huggingface.co/MrRobotoAI/Frigg-v2-8b-ACADEMIC-128K)
* [MrRobotoAI/154](https://huggingface.co/MrRobotoAI/154)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: MrRobotoAI/153
    parameters:
      density: 0.1666
      weight: 0.9
  - model: MrRobotoAI/154
    parameters:
      density: 0.1666
      weight: 0.9
  - model: MrRobotoAI/155
    parameters:
      density: 0.1666
      weight: 0.9
  - model: MrRobotoAI/158
    parameters:
      density: 0.1666
      weight: 0.9
  - model: MrRobotoAI/Frigg-v2-8b-ACADEMIC-128K
    parameters:
      density: 0.1666
      weight: 0.9
  - model: MrRobotoAI/Odin-v2-8b-NOVELIST-128K
    parameters:
      density: 0.1666
      weight: 0.9
merge_method: ties
base_model: MrRobotoAI/Odin-v2-8b-NOVELIST-128K
dtype: float16
```
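For intuition about what the `density` and `weight` parameters above control, here is a toy sketch of the TIES procedure (trim each task vector, elect a per-parameter sign, then disjoint-mean the agreeing values) on flat tensors. The `ties_merge` function is an illustrative assumption for this card, not mergekit's actual implementation; the defaults mirror the values in the configuration above.

```python
import torch

def ties_merge(base, finetuned, density=0.1666, weight=0.9):
    """Toy TIES-style merge (Yadav et al., 2023) over flat tensors.

    1) Compute scaled task vectors (deltas from the base).
    2) Trim each delta to its top-`density` fraction by magnitude.
    3) Elect a per-parameter sign from the summed trimmed deltas.
    4) Average only the sign-agreeing values and add back to the base.
    """
    deltas = [weight * (ft - base) for ft in finetuned]

    trimmed = []
    for d in deltas:
        # Keep the k largest-magnitude entries, zero the rest.
        k = max(1, int(density * d.numel()))
        thresh = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        trimmed.append(torch.where(d.abs() >= thresh, d, torch.zeros_like(d)))

    # Elect the dominant sign per parameter from the total trimmed mass.
    sign = torch.stack(trimmed).sum(dim=0).sign()

    # Zero out values that disagree with the elected sign.
    agree = [torch.where(t.sign() == sign, t, torch.zeros_like(t)) for t in trimmed]

    # Disjoint mean: divide by the number of models contributing a nonzero value.
    num = torch.stack(agree).sum(dim=0)
    cnt = torch.stack([(a != 0).float() for a in agree]).sum(dim=0).clamp(min=1.0)
    return base + num / cnt
```

With `density: 0.1666`, each model contributes only roughly the top sixth of its delta by magnitude, which is why six densely overlapping models can be merged without their updates drowning each other out.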