bibproj committed on
Commit 70bc4ba · verified · 1 parent: 18a3741

Update README.md

Files changed (1): README.md (+38, -2)
README.md CHANGED
@@ -1,7 +1,43 @@
 ---
-language: en
+library_name: transformers
+model_name: Shisa V2.1 14B
+license: mit
 pipeline_tag: text-generation
+language:
+- ja
+- en
 tags:
 - mlx
-library_name: mlx
+base_model:
+- shisa-ai/shisa-v2.1-unphi4-14b
+datasets:
+- shisa-ai/shisa-v2.1-sharegpt
 ---
+
+# mlx-community/shisa-v2.1-unphi4-14b-mlx-bf16
+
+The model [mlx-community/shisa-v2.1-unphi4-14b-mlx-bf16](https://huggingface.co/mlx-community/shisa-v2.1-unphi4-14b-mlx-bf16) was converted to MLX format from [shisa-ai/shisa-v2.1-unphi4-14b](https://huggingface.co/shisa-ai/shisa-v2.1-unphi4-14b) using mlx-lm version **0.28.4**.
+
+Other translation-related MLX model quants for Apple silicon can be found at https://huggingface.co/bibproj
+
+## Use with mlx
+
+```bash
+pip install mlx-lm
+```
+
+```python
+from mlx_lm import load, generate
+
+model, tokenizer = load("mlx-community/shisa-v2.1-unphi4-14b-mlx-bf16")
+
+prompt = "hello"
+
+if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
+    messages = [{"role": "user", "content": prompt}]
+    prompt = tokenizer.apply_chat_template(
+        messages, tokenize=False, add_generation_prompt=True
+    )
+
+response = generate(model, tokenizer, prompt=prompt, verbose=True)
+```
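The README snippet above only applies the chat template when the tokenizer provides one, and otherwise passes the raw prompt through. That branching can be illustrated standalone, without downloading the model; this is a minimal sketch in which `StubTokenizer` is hypothetical and only mimics the two attributes the snippet checks, not a real Hugging Face tokenizer:

```python
class StubTokenizer:
    # A non-None chat_template means the templated path is taken.
    chat_template = "{{ messages }}"

    def apply_chat_template(self, messages, tokenize=False, add_generation_prompt=True):
        # Real tokenizers render the Jinja template; this stub just wraps
        # the user text in illustrative markers.
        return "<user>" + messages[0]["content"] + "</user><assistant>"


def build_prompt(tokenizer, prompt):
    # Same control flow as the README snippet: use the chat template when
    # the tokenizer supplies one, otherwise return the raw prompt unchanged.
    if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
        messages = [{"role": "user", "content": prompt}]
        prompt = tokenizer.apply_chat_template(
            messages, tokenize=False, add_generation_prompt=True
        )
    return prompt


templated = build_prompt(StubTokenizer(), "hello")  # wrapped by the stub template
raw = build_prompt(object(), "hello")               # no template: raw fallback
```

With a real chat-tuned model such as this one, the templated path is the one that matters: passing the raw string instead of the rendered template usually degrades generation quality.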