---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---

# MN-12b-RP-Ink

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

I have removed several of the unused layers as a test. The model still works, but it can get caught in a loop. I am attempting to finetune the model on a longer conversational dataset to see whether that resolves the issue.

I would NOT use this model for anything real; it is for testing purposes only.
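
If you want to poke at it anyway, here is a minimal smoke-test sketch using transformers. The repo id below is a placeholder, not the actual location of this model, and the generation settings are just suggestions; a mild `repetition_penalty` may help with the looping behavior noted above.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id -- substitute the actual repo or local path for this merge.
model_id = "path/to/MN-12b-RP-Ink-pruned"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge was produced in bfloat16
    device_map="auto",
)

prompt = "Write a short greeting."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# A mild repetition penalty may reduce the looping described above.
outputs = model.generate(**inputs, max_new_tokens=100, repetition_penalty=1.1)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```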

## Merge Details

### Merge Method

This model was merged using the Passthrough merge method. Passthrough stitches layer slices from the source model together without altering any weights, which is what allows layers to be dropped outright.

### Models Merged

The following models were included in the merge:

* /storage/bases/MN-12b-RP-Ink

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: bfloat16
merge_method: passthrough
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 27]
            model: /storage/bases/MN-12b-RP-Ink
      - sources:
          - layer_range: [29, 30]
            model: /storage/bases/MN-12b-RP-Ink
      - sources:
          - layer_range: [32, 40]
            model: /storage/bases/MN-12b-RP-Ink
```
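
Since mergekit layer ranges are half-open, the three slices keep layers 0-26, layer 29, and layers 32-39, i.e. 36 of the base model's 40 layers (27-28 and 30-31 are dropped). The config can be re-run with mergekit's `mergekit-yaml` entry point. As a quick sanity check on the merged output (the path below is a placeholder for wherever the merge was saved):

```python
from transformers import AutoConfig

# Placeholder path -- point this at the merge output directory.
cfg = AutoConfig.from_pretrained("./MN-12b-RP-Ink-pruned")

# The slices above keep 27 + 1 + 8 = 36 layers of the original 40.
print(cfg.num_hidden_layers)  # expected: 36
```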