---
library_name: transformers
license: apache-2.0
base_model: Heralax/Augmentoolkit-DataSpecialist-v0.1
tags:
- axolotl
- generated_from_trainer
- mlx
- mlx-my-repo
datasets:
- 29_mil_asstr.jsonl
- 40mil_gutenberg.jsonl
- hle-1_formatted_2mil.jsonl
- 11_mil_fineweb.jsonl
- multiturn_segments_shard_01.json
- multiturn_segments_shard_02.json
- singleturn_segments_shard_01.json
- singleturn_segments_shard_02.json
- openhermes2_5_shard_01.json
- openhermes2_5_shard_02.json
- openthoughts-1.parquet
- openthoughts-2.parquet
- qwq_10million.jsonl
- bluemoon-6mil.json
model-index:
- name: datagen-sft-1
  results: []
---

# LukasF/Augmentoolkit-DataSpecialist-v0.1-mlx-4Bit

The model [LukasF/Augmentoolkit-DataSpecialist-v0.1-mlx-4Bit](https://huggingface.co/LukasF/Augmentoolkit-DataSpecialist-v0.1-mlx-4Bit) was converted to MLX format from [Heralax/Augmentoolkit-DataSpecialist-v0.1](https://huggingface.co/Heralax/Augmentoolkit-DataSpecialist-v0.1) using mlx-lm version **0.22.3**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("LukasF/Augmentoolkit-DataSpecialist-v0.1-mlx-4Bit")

prompt = "hello"

# Apply the chat template when the tokenizer provides one
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
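The chat-template guard above can be sketched without downloading the model. The stand-in tokenizer below is hypothetical (real tokenizers come from `mlx_lm.load` and render a Jinja chat template); it only illustrates how the branch rewrites a raw string into a templated prompt before it reaches `generate`:

```python
# Hypothetical stand-in for a chat-capable tokenizer; real ones come from mlx_lm.load.
class StubTokenizer:
    chat_template = "stub"  # any non-None value makes the guard take the template path

    def apply_chat_template(self, messages, tokenize=False, add_generation_prompt=True):
        # A real tokenizer renders its Jinja template; here we just wrap the user turn
        # in placeholder role markers to show the shape of the transformation.
        text = messages[0]["content"]
        rendered = f"<|user|>{text}"
        if add_generation_prompt:
            rendered += "<|assistant|>"
        return rendered

tokenizer = StubTokenizer()
prompt = "hello"

# Same guard as in the snippet above: only template when a chat template exists
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

print(prompt)  # the templated string that would be passed to generate()
```

Base models without a chat template skip the branch, so the raw prompt is passed through unchanged.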