Envoid committed
Commit ae8f939 · 1 Parent(s): 5d52887

Update README.md

Files changed (1): README.md +1 −1
README.md CHANGED
@@ -6,7 +6,7 @@ This model started as a block-diagonal [frankenllama merge](https://huggingface.
 
 However due to some anomaly likely caused by the novel methods used by MythoMax I was unable to initiate the LoRA training needed to bring the resulting model back to order.
 
-Being a [Chronorctypus-Limarobormes](https://huggingface.co/chargoddard/Chronorctypus-Limarobormes-13b) enjoyer I decided to look further into the TIES-merging that it utilizes- as cited in [this arvix paper](https://huggingface.co/papers/2306.01708
+Being a [Chronorctypus-Limarobormes](https://huggingface.co/chargoddard/Chronorctypus-Limarobormes-13b) enjoyer I decided to look further into the TIES-merging that it utilizes- as cited in the arXiv paper: [Resolving Interference When Merging Models](https://huggingface.co/papers/2306.01708
 )
 
 I used [llama2-22b](https://huggingface.co/chargoddard/llama2-22b) as the base model upon which I merged the MythoMax/Enterredaas frankenmerge, [Dendrite-II](https://huggingface.co/Envoid/Dendrite-II-22B) and [Bacchus](https://huggingface.co/Envoid/Bacchus-22B)
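The TIES-merging procedure the README cites (trim, elect sign, disjoint merge, from "Resolving Interference When Merging Models") can be sketched on toy parameter arrays. This is a minimal NumPy illustration of the algorithm, not the actual merge tooling used for this model; the function name and `density` value are illustrative:

```python
import numpy as np

def ties_merge(base, finetuned, density=0.2):
    """Toy TIES-merge: trim, elect sign, disjoint mean of task vectors."""
    # Task vectors: parameter deltas of each fine-tuned model from the base.
    taus = [ft - base for ft in finetuned]

    # Trim: keep only the top-`density` fraction of each task vector by magnitude.
    trimmed = []
    for tau in taus:
        k = max(1, int(round(density * tau.size)))
        threshold = np.sort(np.abs(tau).ravel())[-k]
        trimmed.append(np.where(np.abs(tau) >= threshold, tau, 0.0))

    stacked = np.stack(trimmed)

    # Elect sign: per parameter, the sign of the summed trimmed values wins.
    elected = np.sign(stacked.sum(axis=0))

    # Disjoint merge: average only the values whose sign agrees with the elected sign.
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_tau = (stacked * agree).sum(axis=0) / counts

    return base + merged_tau
```

In the real merge, `base` corresponds to llama2-22b and `finetuned` to the models merged on top of it; each step operates per weight tensor rather than on small arrays.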