---
license: apache-2.0
language:
- el
base_model:
- ilsp/Meltemi-7B-Instruct-v1.5
---
# kkLLM v0.1

kkLLM v0.1 is a LoRA fine-tuned version of [Meltemi 7B Instruct v1.5](https://huggingface.co/ilsp/Meltemi-7B-Instruct-v1.5), trained on a synthetic dataset derived from articles in the Greek daily newspaper "Rizospastis" spanning 2008 to 2024.


# Running the model with mlx on a Mac

```console
pip install mlx-lm
```

```console
python -m mlx_lm.generate --model model_kkLLM --prompt "Καλημέρα!" --temp 0.3
```


# Running the model on other systems

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # or "cpu"

model = AutoModelForCausalLM.from_pretrained("model_kkLLM")
tokenizer = AutoTokenizer.from_pretrained("model_kkLLM")
model.to(device)

messages = [
    {"role": "user", "content": "Καλημέρα!"},
]

# Build the chat prompt and tokenize it
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)
inputs = tokenizer(prompt, return_tensors="pt").to(device)

outputs = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    max_new_tokens=256,
    do_sample=True,
    temperature=0.3,
    use_cache=True,
    pad_token_id=tokenizer.eos_token_id,
)

print(tokenizer.batch_decode(outputs)[0])
```


# Ethical Considerations

Although this model has been aligned with human preferences, it may still generate misleading, harmful, or toxic content.