Improve language tag

#1
by lbourdois - opened
Files changed (1)
  1. README.md +81 -67
README.md CHANGED
@@ -1,68 +1,82 @@
- ---
- base_model:
- - bond005/meno-tiny-0.1
- - Qwen/Qwen2.5-1.5B-Instruct
- library_name: transformers
- tags:
- - mergekit
- - merge
- license: apache-2.0
- ---
- # **SpaceYL/ECE_Poirot**
-
- First model merged on the ECE intelligence Lab proprietary GPUs
-
- This model has been produced by:
- - **LALAIN Youri**, engineering student at French Engineering School ECE
- - **RAGE LILIAN**, engineering student at French Engineering School ECE
-
- Under the supervision of:
- - **Andre-Louis Rochet**, Lecturer at ECE, Co-founder at TW3 Partners
- - **Paul Lemaistre**, Lecturer at ECE, CTO at TW3 Partners
- - **Mohammed Mounir**, Solution Architect at Exaion
- - **Hervé Chibois**, Infrastructure Expert at Exaion
- - **Des Bontés Sonafouo**, Chef de projet IT at Omnes
-
- With the contribution of:
- - **ECE engineering school** as sponsor and financial contributor
- - **François STEPHAN** as director of ECE
- - **Gérard REUS** as acting director of iLAB
-
- ### Supervisory structure
- The iLab (intelligence Lab) is a structure created by the ECE and dedicated to artificial intelligence
-
- ### About ECE
- ECE, a multi-program, multi-campus, and multi-sector engineering school specializing in digital engineering, trains engineers and technology experts for the 21st century, capable of meeting the challenges of the dual digital and sustainable development revolutions.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [bond005/meno-tiny-0.1](https://huggingface.co/bond005/meno-tiny-0.1)
- * [Qwen/Qwen2.5-1.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-1.5B-Instruct)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- slices:
- - sources:
-   - model: bond005/meno-tiny-0.1
-     layer_range: [0, 28]
-   - model: Qwen/Qwen2.5-1.5B-Instruct
-     layer_range: [0, 28]
- merge_method: slerp
- base_model: Qwen/Qwen2.5-1.5B-Instruct
- parameters:
-   t:
-   - filter: self_attn
-     value: [0, 0.25, 0.5, 0.75, 1]
-   - filter: mlp
-     value: [1, 0.75, 0.5, 0.25, 0]
-   - value: 0.5
- dtype: bfloat16
-
- ```
+ ---
+ base_model:
+ - bond005/meno-tiny-0.1
+ - Qwen/Qwen2.5-1.5B-Instruct
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+ license: apache-2.0
+ language:
+ - zho
+ - eng
+ - fra
+ - spa
+ - por
+ - deu
+ - ita
+ - rus
+ - jpn
+ - kor
+ - vie
+ - tha
+ - ara
+ ---
+ # **SpaceYL/ECE_Poirot**
+
+ First model merged on the ECE intelligence Lab's proprietary GPUs.
+
+ This model has been produced by:
+ - **LALAIN Youri**, engineering student at the French engineering school ECE
+ - **RAGE LILIAN**, engineering student at the French engineering school ECE
+
+ Under the supervision of:
+ - **Andre-Louis Rochet**, Lecturer at ECE, Co-founder at TW3 Partners
+ - **Paul Lemaistre**, Lecturer at ECE, CTO at TW3 Partners
+ - **Mohammed Mounir**, Solution Architect at Exaion
+ - **Hervé Chibois**, Infrastructure Expert at Exaion
+ - **Des Bontés Sonafouo**, IT Project Manager at Omnes
+
+ With the contribution of:
+ - **ECE engineering school** as sponsor and financial contributor
+ - **François STEPHAN** as director of ECE
+ - **Gérard REUS** as acting director of iLAB
+
+ ### Supervisory structure
+ The iLab (intelligence Lab) is a structure created by ECE and dedicated to artificial intelligence.
+
+ ### About ECE
+ ECE, a multi-program, multi-campus, and multi-sector engineering school specializing in digital engineering, trains engineers and technology experts for the 21st century, capable of meeting the challenges of the dual digital and sustainable development revolutions.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [bond005/meno-tiny-0.1](https://huggingface.co/bond005/meno-tiny-0.1)
+ * [Qwen/Qwen2.5-1.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-1.5B-Instruct)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ slices:
+ - sources:
+   - model: bond005/meno-tiny-0.1
+     layer_range: [0, 28]
+   - model: Qwen/Qwen2.5-1.5B-Instruct
+     layer_range: [0, 28]
+ merge_method: slerp
+ base_model: Qwen/Qwen2.5-1.5B-Instruct
+ parameters:
+   t:
+   - filter: self_attn
+     value: [0, 0.25, 0.5, 0.75, 1]
+   - filter: mlp
+     value: [1, 0.75, 0.5, 0.25, 0]
+   - value: 0.5
+ dtype: bfloat16
+
+ ```
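For context on the configuration above: `merge_method: slerp` spherically interpolates each pair of weight tensors, and the `t` values under `parameters` set, per filter (`self_attn`, `mlp`) and per layer band, how far the result leans toward the non-base model. A minimal sketch of spherical linear interpolation on plain vectors, assuming two already-aligned weight vectors — this is an illustration of the formula only, not mergekit's actual implementation, which operates on full model tensors:

```python
import math

def slerp(t, v0, v1):
    """Spherical linear interpolation between two weight vectors.

    t = 0 returns v0, t = 1 returns v1; intermediate t follows the
    great-circle arc between the (normalized directions of the) two vectors.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, clamped against rounding error
    cos_omega = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    omega = math.acos(max(-1.0, min(1.0, cos_omega)))
    if omega < 1e-8:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

With `value: [0, 0.25, 0.5, 0.75, 1]` for `self_attn`, early layers keep the base model's attention weights (`t` near 0) while late layers take the other model's; the `mlp` schedule runs in the opposite direction, and the final `value: 0.5` blends everything else evenly.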