---
base_model: hibana2077/Pioneer-2x7B
inference: false
library_name: transformers
merged_models:
- HuggingFaceH4/mistral-7b-grok
- OpenPipe/mistral-ft-optimized-1218
pipeline_tag: text-generation
quantized_by: Suparious
tags:
- mergekit
- merge
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
---
# hibana2077/Pioneer-2x7B AWQ

- Model creator: [hibana2077](https://huggingface.co/hibana2077)
- Original model: [Pioneer-2x7B](https://huggingface.co/hibana2077/Pioneer-2x7B)
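
## Usage

This repository provides 4-bit AWQ weights of the merged model. The snippet below is a minimal loading sketch using `transformers`, which can load AWQ checkpoints when the `autoawq` package is installed; the repository id is a placeholder assumption and should be replaced with the actual path of this AWQ repository.

```python
# Minimal sketch: load the 4-bit AWQ checkpoint with transformers.
# Requires: pip install transformers accelerate autoawq
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Pioneer-2x7B-AWQ"  # placeholder: replace with the actual AWQ repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # spread layers across available GPUs
    low_cpu_mem_usage=True,
)

prompt = "Explain what a SLERP model merge is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```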

## Model Summary

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

This model was merged using the SLERP merge method. The following models were included in the merge:

* [HuggingFaceH4/mistral-7b-grok](https://huggingface.co/HuggingFaceH4/mistral-7b-grok)
* [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218)
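
The exact SLERP configuration used for this merge is not reproduced in this card. The sketch below is a hypothetical mergekit config showing the usual shape of a SLERP merge between two Mistral-7B checkpoints; the layer ranges, base model choice, and interpolation weights `t` are illustrative assumptions, not the original settings.

```yaml
# Hypothetical mergekit SLERP config (illustrative values, not the original settings)
slices:
  - sources:
      - model: OpenPipe/mistral-ft-optimized-1218
        layer_range: [0, 32]   # full 32-layer Mistral-7B stack (assumed)
      - model: HuggingFaceH4/mistral-7b-grok
        layer_range: [0, 32]
merge_method: slerp
base_model: OpenPipe/mistral-ft-optimized-1218   # assumed base; either source could serve
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]   # example per-layer interpolation curve for attention
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]   # example curve for MLP blocks
    - value: 0.5                     # default blend for all remaining tensors
dtype: bfloat16
```

A config of this shape is typically run with `mergekit-yaml config.yml ./output-dir`, producing the full-precision merged model that quantization (such as AWQ) is then applied to.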