---
base_model: []
library_name: transformers
tags:
- mergekit
- merge

---
# Eden's-Fall-L3.3-70b-0.3a (bad)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Multi-SLERP](https://goddard.blog/posts/multislerp-wow-what-a-cool-idea) merge method.
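
Multi-SLERP generalizes spherical linear interpolation from two endpoints to a weighted set of models, averaging parameter tensors on the hypersphere rather than in flat Euclidean space. The sketch below is an illustrative tangent-space construction of a spherical weighted average in PyTorch; it is not mergekit's exact `multislerp` implementation, and the magnitude handling and near-parallel fallback here are simplifying assumptions.

```python
import torch

def multislerp(tensors, weights, eps=1e-8, normalize_weights=False):
    """Spherical weighted average of several same-shaped tensors.

    Illustrative sketch only: log-map each direction into the tangent
    space at a base point, average there, then exp-map back.
    """
    w = torch.tensor(weights, dtype=tensors[0].dtype)
    if normalize_weights:
        w = w / w.sum()

    # SLERP acts on directions, so separate each tensor's norm.
    flats = [t.flatten() for t in tensors]
    norms = torch.stack([f.norm() for f in flats])
    units = [f / (n + eps) for f, n in zip(flats, norms)]

    # Base point: weighted Euclidean mean, re-projected onto the sphere.
    mean = sum(wi * u for wi, u in zip(w, units))
    base = mean / (mean.norm() + eps)

    # Log map each unit vector into the tangent space at `base`.
    tangents = []
    for u in units:
        cos = torch.clamp(torch.dot(base, u), -1.0, 1.0)
        theta = torch.arccos(cos)
        perp = u - cos * base
        perp_norm = perp.norm()
        if perp_norm > eps:
            tangents.append(theta * perp / perp_norm)
        else:
            # Nearly parallel to the base point: zero tangent.
            tangents.append(torch.zeros_like(u))

    # Weighted average in the tangent space, then exp map back.
    tangent_mean = sum(wi * t for wi, t in zip(w, tangents))
    angle = tangent_mean.norm()
    if angle > eps:
        result = torch.cos(angle) * base + torch.sin(angle) * tangent_mean / angle
    else:
        result = base

    # Restore a magnitude (assumption: weighted mean of the input norms).
    out_norm = (w * norms).sum() / (w.sum() + eps)
    return (result * out_norm).reshape(tensors[0].shape)

# Toy usage mirroring the 0.7 / 0.2 / 0.2 weighting used below.
a, b, c = (torch.randn(4, 4) for _ in range(3))
merged = multislerp([a, b, c], weights=[0.7, 0.2, 0.2])
```

In the configuration below, each model's `weight` is a list of five values rather than a scalar; mergekit interpolates such lists across the layer stack, so the blend ratio shifts gradually from the first layers to the last.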

### Models Merged

The following models were included in the merge:
* /workspace/cache/models--bruhzair--prototype-0.4x259/snapshots/708333670ebce8bcf5ce8511657f1b0a0b972423
* /workspace/cache/models--bruhzair--prototype-0.4x264/snapshots/77cba65aa7a79075bc434fa3a5c30463ff267be9
* /workspace/cache/models--bruhzair--prototype-0.4x263/snapshots/60ed0b327ef5c1af49d5f2e12347edba0d0cde95

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /workspace/cache/models--bruhzair--prototype-0.4x264/snapshots/77cba65aa7a79075bc434fa3a5c30463ff267be9
    parameters:
      weight: [0.2, 0.15, 0.2, 0.25, 0.2]
  - model: /workspace/cache/models--bruhzair--prototype-0.4x263/snapshots/60ed0b327ef5c1af49d5f2e12347edba0d0cde95
    parameters:
      weight: [0.2, 0.25, 0.2, 0.15, 0.2]
  - model: /workspace/cache/models--bruhzair--prototype-0.4x259/snapshots/708333670ebce8bcf5ce8511657f1b0a0b972423
    parameters:
      weight: [0.7, 0.65, 0.7, 0.65, 0.7]
merge_method: multislerp
tokenizer:
  source: /workspace/cache/models--bruhzair--prototype-0.4x257/snapshots/60a848fe9776f453b6f640662ca07493da8c1d12
chat_template: llama3
parameters:
  normalize_weights: false
  eps: 1e-8
pad_to_multiple_of: 8
int8_mask: true
dtype: bfloat16
```
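
To reproduce a merge from a configuration like this one, mergekit provides the `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yaml ./output-model`). The snippet below is a minimal sketch of loading the resulting checkpoint with transformers; the local path is a placeholder, and since the config sets `chat_template: llama3`, the tokenizer's chat template can be applied directly.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder path: wherever mergekit-yaml wrote the merged checkpoint.
merged_path = "./output-model"

tokenizer = AutoTokenizer.from_pretrained(merged_path)
model = AutoModelForCausalLM.from_pretrained(
    merged_path,
    torch_dtype=torch.bfloat16,  # matches `dtype: bfloat16` in the config
    device_map="auto",
)

messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```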