---
base_model:
- AI-MO/Kimina-Prover-Preview-Distill-7B
- nvidia/OpenMath-Nemotron-7B
- Skywork/Skywork-OR1-Math-7B
library_name: transformers
tags:
- mergekit
- merge
---
# TBPMM
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the Passthrough merge method. Passthrough performs no weight arithmetic: the selected layer ranges from each source model are copied unchanged and stacked in sequence, producing a deeper "frankenmerge" model.
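As a rough conceptual sketch (not mergekit's actual implementation), passthrough stacking behaves like list concatenation over each model's transformer layers:

```python
# Conceptual sketch of a passthrough merge: layers are stacked in order,
# with no averaging or interpolation of weights. Names are illustrative.
model_a_layers = [f"A.layer{i}" for i in range(3)]
model_b_layers = [f"B.layer{i}" for i in range(3)]

# The merged model is simply the concatenation of the selected layer ranges.
merged_layers = model_a_layers + model_b_layers

print(len(merged_layers))  # 6
print(merged_layers[3])    # B.layer0 -- model B's first layer follows all of A
```

The resulting model's depth is the sum of the slice lengths, which is why stacking three 7B models yields a model substantially larger than 7B parameters.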
### Models Merged
The following models were included in the merge:
* [AI-MO/Kimina-Prover-Preview-Distill-7B](https://huggingface.co/AI-MO/Kimina-Prover-Preview-Distill-7B)
* [nvidia/OpenMath-Nemotron-7B](https://huggingface.co/nvidia/OpenMath-Nemotron-7B)
* [Skywork/Skywork-OR1-Math-7B](https://huggingface.co/Skywork/Skywork-OR1-Math-7B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 28]
    model: Skywork/Skywork-OR1-Math-7B
- sources:
  - layer_range: [0, 28]
    model: AI-MO/Kimina-Prover-Preview-Distill-7B
- sources:
  - layer_range: [0, 28]
    model: nvidia/OpenMath-Nemotron-7B
```