bibproj committed
Commit eee3e85 · verified · 1 parent: 0d24335

Update README.md
---
library_name: mlx
license: apache-2.0
pipeline_tag: text-generation
language:
- en
- de
- es
- fr
- it
- pt
- pl
- nl
- tr
- sv
- cs
- el
- hu
- ro
- fi
- uk
- sl
- sk
- da
- lt
- lv
- et
- bg
- 'no'
- ca
- hr
- ga
- mt
- gl
- zh
- ru
- ko
- ja
- ar
- hi
tags:
- transformers
- mlx
- translation
base_model:
- utter-project/EuroLLM-22B-Instruct-2512
---

# mlx-community/EuroLLM-22B-Instruct-2512-mlx-6bit

The model [mlx-community/EuroLLM-22B-Instruct-2512-mlx-6bit](https://huggingface.co/mlx-community/EuroLLM-22B-Instruct-2512-mlx-6bit) was converted to MLX format from [utter-project/EuroLLM-22B-Instruct-2512](https://huggingface.co/utter-project/EuroLLM-22B-Instruct-2512) using mlx-lm version **0.28.4**.

Other translation-related MLX model quantizations for Apple silicon can be found at https://huggingface.co/bibproj.

Supported languages (35): Bulgarian, Croatian, Czech, Danish, Dutch, English, Estonian, Finnish, French, German, Greek, Hungarian, Irish, Italian, Latvian, Lithuanian, Maltese, Polish, Portuguese, Romanian, Slovak, Slovenian, Spanish, Swedish, Arabic, Catalan, Chinese, Galician, Hindi, Japanese, Korean, Norwegian, Russian, Turkish, and Ukrainian.

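The translation direction is expressed in the prompt text itself rather than through a dedicated API. A minimal sketch of a prompt builder, where the helper name, the wording (which mirrors this card's example prompt), and the language subset are illustrative assumptions rather than part of mlx-lm or this model:

```python
# Illustrative helper, not part of mlx-lm's API: builds a plain-text
# translation instruction in the style of this card's example prompt.
SUPPORTED_TARGETS = {"French", "German", "Spanish", "Japanese"}  # subset of the 35

def translation_prompt(text: str, target: str, source: str = "English") -> str:
    """Build a translation instruction string for the model."""
    if target not in SUPPORTED_TARGETS:
        raise ValueError(f"unsupported target language: {target}")
    return f"Translate from {source} to {target}: {text}"
```

The resulting string is what gets passed as `prompt` in the usage snippet below.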
## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/EuroLLM-22B-Instruct-2512-mlx-6bit")
prompt = "Translate from English to French: Hi there!"

# Wrap the prompt in the model's chat template when the tokenizer provides one.
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
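The `apply_chat_template` call above wraps the raw prompt in the model's chat format. As a rough illustration only, assuming a ChatML-style template (the authoritative template ships with the tokenizer's `chat_template` and may differ), the rendering behaves roughly like:

```python
# Sketch of a ChatML-style rendering, for illustration only; the actual
# formatting is defined by the tokenizer's chat_template.
def render_chatml(messages, add_generation_prompt=True):
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    ]
    if add_generation_prompt:
        parts.append("<|im_start|>assistant\n")  # cue the model to respond
    return "".join(parts)
```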