Goekdeniz-Guelmez committed · verified
Commit 1e8a9f6 · 1 parent: 12e2cc1

Create README.md

Files changed (1): README.md (+26, -0)
README.md ADDED
First beta test of training VLMs using MLX-VLM and my overhaul PR.

Terminal command used:

```shell
python -m mlx_vlm.lora --model-path mlx-community/Qwen2-VL-2B-Instruct-bf16 --dataset TIGER-Lab/VisualWebInstruct --dataset-config 'example' --output-path /Users/Goekdeniz.Guelmez@computacenter.com/Library/CloudStorage/OneDrive-COMPUTACENTER/Desktop/Qwen2-VL-2B-Instruct-bf16-VisualWebInstruct-lora --batch-size 1 --epochs 1 --learning-rate 1e-6 --grad-checkpoint --train-on-completions --steps-per-report 1
```
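
Before launching a run like this, it can help to peek at the dataset first. A minimal sketch using the Hugging Face `datasets` library, assuming the `example` config name taken from the `--dataset-config` flag above (the field names you'll see depend on the dataset itself):

```python
from datasets import load_dataset

# Pull the same dataset/config that the training command above points at.
# The "example" config name is taken from the --dataset-config flag.
ds = load_dataset("TIGER-Lab/VisualWebInstruct", "example")

# Print the available splits and peek at the first record of the first split
# to check which image/text fields the training loader will see.
print(ds)
first_split = next(iter(ds.values()))
print(first_split[0])
```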

Logs from the last training steps:

```text
Iter 990: Train loss 0.703, Learning Rate 1.000e-06, It/sec 5.188, Tokens/sec 166.003, Trained Tokens 31677, Peak mem 5.132 GB
Iter 991: Train loss 3.135, Learning Rate 1.000e-06, It/sec 5.019, Tokens/sec 160.618, Trained Tokens 31709, Peak mem 5.132 GB
Iter 992: Train loss 1.932, Learning Rate 1.000e-06, It/sec 5.112, Tokens/sec 163.598, Trained Tokens 31741, Peak mem 5.132 GB
Iter 993: Train loss 0.751, Learning Rate 1.000e-06, It/sec 5.159, Tokens/sec 165.081, Trained Tokens 31773, Peak mem 5.137 GB
Iter 994: Train loss 2.252, Learning Rate 1.000e-06, It/sec 5.103, Tokens/sec 163.304, Trained Tokens 31805, Peak mem 5.137 GB
Iter 995: Train loss 0.738, Learning Rate 1.000e-06, It/sec 5.175, Tokens/sec 165.601, Trained Tokens 31837, Peak mem 5.137 GB
Iter 996: Train loss 1.454, Learning Rate 1.000e-06, It/sec 5.202, Tokens/sec 166.455, Trained Tokens 31869, Peak mem 5.137 GB
Iter 997: Train loss 1.298, Learning Rate 1.000e-06, It/sec 5.048, Tokens/sec 161.523, Trained Tokens 31901, Peak mem 5.137 GB
Iter 998: Train loss 2.843, Learning Rate 1.000e-06, It/sec 5.209, Tokens/sec 166.696, Trained Tokens 31933, Peak mem 5.137 GB
Iter 999: Train loss 1.243, Learning Rate 1.000e-06, It/sec 5.118, Tokens/sec 163.765, Trained Tokens 31965, Peak mem 5.137 GB
Iter 1000: Train loss 1.513, Learning Rate 1.000e-06, It/sec 5.140, Tokens/sec 164.481, Trained Tokens 31997, Peak mem 5.137 GB
```
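
To get a quick summary of a run from these per-step reports, here is a small standalone sketch that parses lines in the format shown above and averages the loss and throughput (the `train.log` filename is just a placeholder for wherever the run output was saved):

```python
import re

# Matches the per-iteration report lines printed during training
# (assumes the exact format shown in the log above).
LINE_RE = re.compile(
    r"Iter (\d+): Train loss ([\d.]+), .* Tokens/sec ([\d.]+),"
)

def summarize(log_text: str) -> None:
    losses, tok_rates = [], []
    for m in LINE_RE.finditer(log_text):
        losses.append(float(m.group(2)))
        tok_rates.append(float(m.group(3)))
    if not losses:
        print("no iteration lines found")
        return
    print(f"steps parsed:    {len(losses)}")
    print(f"mean train loss: {sum(losses) / len(losses):.3f}")
    print(f"mean tokens/sec: {sum(tok_rates) / len(tok_rates):.1f}")

if __name__ == "__main__":
    # "train.log" is a placeholder; point it at the captured training output.
    with open("train.log") as f:
        summarize(f.read())
```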

It's really fast and works really well.