---
base_model:
- microsoft/Phi-3-mini-128k-instruct
library_name: transformers
tags:
- mergekit
- merge
---
# phi3-128k
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the passthrough merge method, with `D:\text-generation-webui\models\microsoft_Phi-3-mini-128k-instruct` as the base. Passthrough stacks the listed layer slices verbatim rather than averaging weights; assuming mergekit's half-open `layer_range` convention, the five slices below contribute 1 + 23 + 12 + 14 + 14 = 64 decoder layers, double the 32 layers of the base model.
### Models Merged

All slices are taken from the single base model above; no additional models were merged.
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
  - model: D:\text-generation-webui\models\microsoft_Phi-3-mini-128k-instruct # embed_tokens comes along for the ride with whatever is the first layer
    layer_range: [0, 1]
  - model: D:\text-generation-webui\models\microsoft_Phi-3-mini-128k-instruct # add dummy second model with 0 weight so tokenizer-based merge routine is invoked for embed_tokens
    layer_range: [0, 1]
- sources:
  - model: D:\text-generation-webui\models\microsoft_Phi-3-mini-128k-instruct
    layer_range: [1, 24]
- sources:
  - model: D:\text-generation-webui\models\microsoft_Phi-3-mini-128k-instruct
    layer_range: [8, 20]
- sources:
  - model: D:\text-generation-webui\models\microsoft_Phi-3-mini-128k-instruct
    layer_range: [18, 32]
  - model: D:\text-generation-webui\models\microsoft_Phi-3-mini-128k-instruct
    layer_range: [18, 32]
merge_method: passthrough
base_model: D:\text-generation-webui\models\microsoft_Phi-3-mini-128k-instruct
dtype: bfloat16
```
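To reproduce the merge, the configuration above can be saved to a file and run through mergekit, either via the `mergekit-yaml` CLI (`mergekit-yaml config.yaml ./phi3-128k`) or programmatically. The sketch below assumes mergekit's documented Python entry points (`MergeConfiguration`, `run_merge`); the config file name and output directory are placeholders.

```python
# Minimal sketch, assuming mergekit's Python API; config.yaml holds the YAML
# above, and ./phi3-128k is a placeholder output directory.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./phi3-128k",
    options=MergeOptions(copy_tokenizer=True),  # copy the base tokenizer into the output
)
```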
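The merged model should load like any other `transformers` causal LM. A minimal usage sketch; the repo id is assumed from this repository's name (substitute a local merge path if needed), and `trust_remote_code=True` matches the base Phi-3 checkpoints, which ship custom modeling code.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "win10/phi3-128k-6b"  # assumption: this repo, or a local merge output path

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge
    device_map="auto",
    trust_remote_code=True,
)

messages = [{"role": "user", "content": "Summarize what a passthrough merge does."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```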