jefferylovely committed on
Commit 6616acd · verified · 1 Parent(s): b574ddb

Upload folder using huggingface_hub
README.md CHANGED
@@ -5,14 +5,14 @@ tags:
 - mergekit
 - lazymergekit
 - jefferylovely/ThetaMaven
-- FelixChao/WestSeverus-7B-DPO-v2
+- BarryFutureman/WildMarcoroni-Variant1-7B
 ---
 
-# jefferylovely/ThetaMaven2
+# jefferylovely/ThetaMaven3
 
-jefferylovely/ThetaMaven2 is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
+jefferylovely/ThetaMaven3 is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
 * [jefferylovely/ThetaMaven](https://huggingface.co/jefferylovely/ThetaMaven)
-* [FelixChao/WestSeverus-7B-DPO-v2](https://huggingface.co/FelixChao/WestSeverus-7B-DPO-v2)
+* [BarryFutureman/WildMarcoroni-Variant1-7B](https://huggingface.co/BarryFutureman/WildMarcoroni-Variant1-7B)
 
 ## 🧩 Configuration
 
@@ -21,7 +21,7 @@ slices:
   - sources:
       - model: jefferylovely/ThetaMaven
         layer_range: [0, 32]
-     - model: FelixChao/WestSeverus-7B-DPO-v2
+     - model: BarryFutureman/WildMarcoroni-Variant1-7B
         layer_range: [0, 32]
 merge_method: slerp
 base_model: jefferylovely/ThetaMaven
@@ -44,7 +44,7 @@ from transformers import AutoTokenizer
 import transformers
 import torch
 
-model = "jefferylovely/jefferylovely/ThetaMaven2"
+model = "jefferylovely/jefferylovely/ThetaMaven3"
 messages = [{"role": "user", "content": "What is a large language model?"}]
 
 tokenizer = AutoTokenizer.from_pretrained(model)
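The README's usage snippet builds a prompt from `messages` via the tokenizer before generation. As a rough illustration of what chat-template formatting does with such a message list — this is a toy Mistral-style template, not this model's actual template, which ships with its tokenizer — the transformation can be sketched as:

```python
def apply_instruct_template(messages):
    """Toy Mistral-style [INST] formatting; real models ship their own
    chat template and it is applied via tokenizer.apply_chat_template."""
    parts = []
    for m in messages:
        if m["role"] == "user":
            parts.append(f"[INST] {m['content']} [/INST]")
        else:  # assistant turns are appended verbatim
            parts.append(m["content"])
    return "<s>" + "".join(parts)

messages = [{"role": "user", "content": "What is a large language model?"}]
print(apply_instruct_template(messages))
# → <s>[INST] What is a large language model? [/INST]
```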
mergekit_config.yml CHANGED
@@ -3,7 +3,7 @@ slices:
   - sources:
       - model: jefferylovely/ThetaMaven
         layer_range: [0, 32]
-     - model: FelixChao/WestSeverus-7B-DPO-v2
+     - model: BarryFutureman/WildMarcoroni-Variant1-7B
         layer_range: [0, 32]
 merge_method: slerp
 base_model: jefferylovely/ThetaMaven
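The config change above swaps the second model in a `merge_method: slerp` configuration. For readers unfamiliar with the method, spherical linear interpolation between two flattened weight tensors can be sketched as follows — a minimal illustration of the idea, not mergekit's actual implementation:

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.
    t=0 returns a, t=1 returns b; intermediate t follows the arc between them."""
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)           # angle between the two directions
    if theta < eps:                  # nearly parallel: fall back to linear interp
        return (1 - t) * a + t * b
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b

# With orthogonal unit vectors, the midpoint stays on the unit sphere,
# unlike plain averaging which would shrink the norm to ~0.707.
w0 = np.array([1.0, 0.0])
w1 = np.array([0.0, 1.0])
print(slerp(0.5, w0, w1))  # → [0.70710678 0.70710678]
```

This norm-preserving behavior is the usual motivation for slerp over linear averaging when merging model weights.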
model-00001-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e5627fa69543673db63a31a510d55224532c48e32553f65bcd2796285bf5be68
+oid sha256:4e8c8963c3ccd45abe22de0d6cb36e49e0e80d43d89c461a2543445374cc2f86
 size 9942981696
model-00002-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:6c5fec0ddffa0fbe8a1630d0d8de579680542bcc2a3ad3ebe4acd4aeecc2f581
+oid sha256:4b21205e90d8b0e7330b0cd93faec22a8e1960988d7fd68ee272bdb74858fdb5
 size 4540516344