---
base_model:
- Sao10K/Fimbulvetr-11B-v2
- Joseph717171/Cerebrum-1.0-10.7B
library_name: transformers
tags:
- mergekit
- merge

---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the SLERP merge method.
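
SLERP (spherical linear interpolation) blends each pair of corresponding weight tensors along the arc between them rather than along the straight line, which tends to preserve the scale and direction of the weights better than plain averaging. As a rough illustration only (not mergekit's actual implementation; the tensor flattening and the linear fallback for near-parallel vectors are assumptions), the core formula looks like this:

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors, treated as flat vectors."""
    a = v0.flatten().float()
    b = v1.flatten().float()
    # Cosine of the angle between the two normalized weight vectors.
    cos_omega = torch.dot(a / (a.norm() + eps), b / (b.norm() + eps)).clamp(-1.0, 1.0)
    omega = torch.arccos(cos_omega)
    # Nearly parallel vectors: fall back to ordinary linear interpolation.
    if omega.abs() < eps:
        return (1 - t) * v0 + t * v1
    sin_omega = torch.sin(omega)
    w0 = torch.sin((1 - t) * omega) / sin_omega
    w1 = torch.sin(t * omega) / sin_omega
    return (w0 * v0 + w1 * v1).to(v0.dtype)
```

In the configuration below, `t` is the interpolation factor (t = 0 keeps the base model's weights, t = 1 the other model's), and giving `t` as a list defines a gradient of values across layer depth.
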
### Models Merged

The following models were included in the merge:
* [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2)
* [Joseph717171/Cerebrum-1.0-10.7B](https://huggingface.co/Joseph717171/Cerebrum-1.0-10.7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Joseph717171/Cerebrum-1.0-10.7B
  - model: Sao10K/Fimbulvetr-11B-v2

merge_method: slerp

tokenizer_merge_method: slerp

tokenizer_parameters:
  t: 0.4 # Increased slightly to better balance Fimbulvetr

base_model: Sao10K/Fimbulvetr-11B-v2
dtype: bfloat16

parameters:
  t: [0, 0.3, 0.5, 0.7, 0.5, 0.3, 0] # More gradual and balanced curve
  temp: 1.3 # Slightly reduced for greater precision

density:
  - threshold: 0.2
    t: 0.8
  - threshold: 0.6
    t: 0.5
  - threshold: 0.9
    t: 0.3

# Added an extra density level to smooth the transition between the models
additional_density:
  - threshold: 0.4
    t: 0.6
```
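
A configuration like the one above is run with mergekit itself, either via its `mergekit-yaml` command-line tool or from Python. The snippet below is a minimal sketch following mergekit's documented Python interface; the `config.yaml` filename and the output path are assumptions, and option names may differ between mergekit versions:

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML configuration shown above (assumed to be saved as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and write the resulting model to ./merged-model.
run_merge(
    merge_config,
    out_path="./merged-model",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # write a tokenizer into the output directory
        lazy_unpickle=True,              # lower peak memory while loading shards
    ),
)
```

The output directory can then be loaded like any other Hugging Face checkpoint, for example with `AutoModelForCausalLM.from_pretrained("./merged-model")`.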