Pinkstack committed
Commit afdcb70 · verified · 1 Parent(s): 0c59ffa

Update README.md

Files changed (1): README.md (+7 −6)
README.md CHANGED
@@ -8,16 +8,17 @@ tags:
 - mergekit
 - lazymergekit
 - microsoft/Phi-3.5-mini-instruct
+license: mit
 ---
+Should ideally be used to further fine-tune.
+# phi-3.5-6b
 
-# phi-3.5-7b
-
-phi-3.5-7b is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
+phi-3.5-6b is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
 * [microsoft/Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct)
 * [microsoft/Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct)
 * [microsoft/Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct)
 
-## 🧩 Configuration
+## Config
 
 ```yaml
 base_model: microsoft/Phi-3.5-mini-instruct
@@ -36,7 +37,7 @@ tokenizer_source: microsoft/Phi-3.5-mini-instruct
 dtype: float16
 ```
 
-## 💻 Usage
+## Usage
 
 ```python
 !pip install -qU transformers accelerate
@@ -45,7 +46,7 @@ from transformers import AutoTokenizer
 import transformers
 import torch
 
-model = "Pinkstack/phi-3.5-7b"
+model = "Pinkstack/phi-3.5-6b"
 messages = [{"role": "user", "content": "What is a large language model?"}]
 
 tokenizer = AutoTokenizer.from_pretrained(model)
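The usage snippet shown in the diff is truncated after the tokenizer is loaded. Below is a minimal sketch of how it presumably continues, following the standard LazyMergekit usage template (a hypothetical completion, not part of this commit). The `build_prompt` helper hand-rolls a Phi-3.5-style chat template purely for illustration; real code should use `tokenizer.apply_chat_template` instead, and `generate` is defined but not called here because it would download the model weights and assumes the `Pinkstack/phi-3.5-6b` repo is reachable.

```python
def build_prompt(messages):
    """Format chat messages in a Phi-3.5-style template (assumed format;
    prefer tokenizer.apply_chat_template in real code)."""
    parts = [f"<|{m['role']}|>\n{m['content']}<|end|>\n" for m in messages]
    return "".join(parts) + "<|assistant|>\n"


def generate(model="Pinkstack/phi-3.5-6b", max_new_tokens=256):
    """Sketch of the generation step; not executed here because it
    downloads the model weights and expects a GPU via device_map."""
    import torch
    import transformers
    from transformers import AutoTokenizer

    messages = [{"role": "user", "content": "What is a large language model?"}]
    tokenizer = AutoTokenizer.from_pretrained(model)
    # The tokenizer's own chat template is authoritative for prompt formatting.
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    pipe = transformers.pipeline(
        "text-generation",
        model=model,
        torch_dtype=torch.float16,
        device_map="auto",
    )
    out = pipe(
        prompt,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
        top_k=50,
        top_p=0.95,
    )
    return out[0]["generated_text"]


# Illustrative only: show the hand-rolled prompt for the README's message.
print(build_prompt([{"role": "user", "content": "What is a large language model?"}]))
```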