Upload folder using huggingface_hub

Changed files:
- README.md (+4 -5)
- mergekit_config.yml (+1 -1)
README.md CHANGED

@@ -4,17 +4,17 @@ tags:
 - merge
 - mergekit
 - lazymergekit
+- rhysjones/phi-2-orange
 - cognitivecomputations/dolphin-2_6-phi-2
 - mrm8488/phi-2-coder
-- rhysjones/phi-2-orange
 ---
 
 # phi-2-ties
 
 phi-2-ties is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
+* [rhysjones/phi-2-orange](https://huggingface.co/rhysjones/phi-2-orange)
 * [cognitivecomputations/dolphin-2_6-phi-2](https://huggingface.co/cognitivecomputations/dolphin-2_6-phi-2)
 * [mrm8488/phi-2-coder](https://huggingface.co/mrm8488/phi-2-coder)
-* [rhysjones/phi-2-orange](https://huggingface.co/rhysjones/phi-2-orange)
 
 ## 🧩 Configuration
 
@@ -23,7 +23,7 @@ models:
   - model: rhysjones/phi-2-orange
     parameters:
       density: 0.5
-      weight: 0.
+      weight: 0.3
   - model: cognitivecomputations/dolphin-2_6-phi-2
     parameters:
       density: 0.5
@@ -43,7 +43,7 @@ dtype: float16
 ## 💻 Usage
 
 ```python
-!pip install -qU transformers accelerate
+!pip install -qU transformers accelerate
 
 from transformers import AutoTokenizer
 import transformers
@@ -59,7 +59,6 @@ pipeline = transformers.pipeline(
     model=model,
     torch_dtype=torch.float16,
     device_map="auto",
-    trust_remote_code=True
 )
 
 outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
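For context on the last hunk: the sampling parameters in the usage snippet (`temperature=0.7, top_k=50, top_p=0.95`) apply in sequence — logits are scaled by the temperature, the distribution is truncated to the `top_k` most likely tokens, and then to the smallest prefix whose cumulative probability reaches `top_p`. A toy sketch of that filtering over a four-token vocabulary (illustrative only, not the actual `transformers` implementation, which works on tensors and masks discarded logits):

```python
import math

def filter_probs(logits, temperature, top_k, top_p):
    """Toy temperature / top-k / top-p filtering over a small logit list."""
    # Temperature-scale the logits, then softmax (max-subtracted for stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Top-k: keep only the k highest-probability token indices.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]
    # Top-p: keep the smallest prefix whose cumulative mass reaches top_p.
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalize over the surviving tokens.
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}

dist = filter_probs([2.0, 1.0, 0.2, -1.0], temperature=0.7, top_k=3, top_p=0.95)
```

With these toy logits, `top_k=3` drops the weakest token and `top_p=0.95` keeps the remaining three, so the pipeline samples from a renormalized three-token distribution.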
mergekit_config.yml CHANGED

@@ -3,7 +3,7 @@ models:
   - model: rhysjones/phi-2-orange
     parameters:
       density: 0.5
-      weight: 0.
+      weight: 0.3
   - model: cognitivecomputations/dolphin-2_6-phi-2
     parameters:
       density: 0.5
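The `density` and `weight` parameters being edited here control the TIES merge: each model's parameter delta from the base is trimmed to roughly the top `density` fraction of entries by magnitude, and the surviving deltas are combined into the base with the per-model `weight`. A minimal sketch of that trim-and-weight step on toy vectors (illustrative only — mergekit's real TIES implementation also resolves sign conflicts between models before summing):

```python
def trim(delta, density):
    """Keep only the top `density` fraction of entries by magnitude; zero the rest."""
    k = max(1, int(round(len(delta) * density)))
    keep = set(sorted(range(len(delta)), key=lambda i: abs(delta[i]), reverse=True)[:k])
    return [d if i in keep else 0.0 for i, d in enumerate(delta)]

def weighted_merge(base, deltas, weights, density):
    """Sum weighted, trimmed per-model deltas onto the base parameters."""
    merged = list(base)
    for delta, w in zip(deltas, weights):
        for i, d in enumerate(trim(delta, density)):
            merged[i] += w * d
    return merged

base = [0.0, 0.0, 0.0, 0.0]
delta_a = [0.8, 0.1, -0.6, 0.05]   # e.g. one fine-tune minus the base model
delta_b = [-0.4, 0.9, 0.2, 0.01]   # e.g. another fine-tune minus the base model

# density=0.5 zeroes the weaker half of each delta before the weighted sum.
merged = weighted_merge(base, [delta_a, delta_b], [0.3, 0.5], density=0.5)
```

Raising a model's `weight` (here `0.3` for phi-2-orange) scales how strongly its surviving deltas pull the merged parameters toward that model.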