doberst committed
Commit 0754a39 · verified · 1 parent: 9afec3e

Upload 33 files

.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+tokenizer.json filter=lfs diff=lfs merge=lfs -text
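The new `.gitattributes` line routes `tokenizer.json` through Git LFS, alongside the existing `*.zip`, `*.zst`, and `*tfevents*` rules. As a rough sketch of how these patterns select files, the following uses Python's `fnmatch` as an approximation of gitattributes glob matching (a simplification: real gitattributes globs differ in details such as `/` handling).

```python
from fnmatch import fnmatch

# Patterns from the updated .gitattributes hunk (all LFS-tracked).
lfs_patterns = ["*.zip", "*.zst", "*tfevents*", "tokenizer.json"]

def is_lfs_tracked(path: str) -> bool:
    """Approximate gitattributes matching with fnmatch.

    This is an illustrative simplification, not Git's actual matcher.
    """
    return any(fnmatch(path, pattern) for pattern in lfs_patterns)

print(is_lfs_tracked("tokenizer.json"))           # True: matched literally
print(is_lfs_tracked("events.out.tfevents.123"))  # True: matches *tfevents*
print(is_lfs_tracked("config.json"))              # False: no pattern applies
```

The same rule could equivalently have been created with `git lfs track "tokenizer.json"`, which appends this exact line to `.gitattributes`.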
added_tokens.json ADDED
@@ -0,0 +1,12 @@
+{
+  "<|/tool_call|>": 200026,
+  "<|/tool|>": 200024,
+  "<|assistant|>": 200019,
+  "<|end|>": 200020,
+  "<|system|>": 200022,
+  "<|tag|>": 200028,
+  "<|tool_call|>": 200025,
+  "<|tool_response|>": 200027,
+  "<|tool|>": 200023,
+  "<|user|>": 200021
+}
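The ten special tokens added above occupy one contiguous ID block directly above the base vocabulary. A minimal sketch verifying that property from the file's contents:

```python
import json

# Contents of added_tokens.json as introduced by this commit.
added_tokens = {
    "<|/tool_call|>": 200026,
    "<|/tool|>": 200024,
    "<|assistant|>": 200019,
    "<|end|>": 200020,
    "<|system|>": 200022,
    "<|tag|>": 200028,
    "<|tool_call|>": 200025,
    "<|tool_response|>": 200027,
    "<|tool|>": 200023,
    "<|user|>": 200021,
}

# The ten special-token IDs form the contiguous range 200019..200028.
ids = sorted(added_tokens.values())
assert ids == list(range(200019, 200029))

# Round-trip through JSON to confirm the mapping is valid JSON content.
assert json.loads(json.dumps(added_tokens)) == added_tokens
```

Note that `bos_token_id` and `eos_token_id` in `config.json` are both 199999, i.e. one below this block, consistent with the special tokens being appended at the top of the vocabulary.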
config.json ADDED
@@ -0,0 +1,2613 @@
+{
+  "_attn_implementation_autoset": true,
+  "architectures": [
+    "Phi4MMForCausalLM"
+  ],
+  "attention_bias": false,
+  "attention_dropout": 0.0,
+  "audio_processor": {
+    "config": {
+      "activation": "swish",
+      "activation_checkpointing": "",
+      "attention_dim": 1024,
+      "attention_heads": 16,
+      "batch_norm": false,
+      "bias_in_glu": true,
+      "causal": true,
+      "chunk_size": -1,
+      "cnn_layer_norm": true,
+      "conv_activation": "swish",
+      "conv_glu_type": "swish",
+      "depthwise_multiplier": 1,
+      "depthwise_seperable_out_channel": 1024,
+      "dropout_rate": 0.0,
+      "encoder_embedding_config": {
+        "input_size": 80
+      },
+      "ext_pw_kernel_size": 1,
+      "ext_pw_out_channel": 1024,
+      "input_layer": "nemo_conv",
+      "input_size": 80,
+      "kernel_size": 3,
+      "left_chunk": 18,
+      "linear_units": 1536,
+      "nemo_conv_settings": {
+        "conv_channels": 1024
+      },
+      "num_blocks": 24,
+      "relative_attention_bias_args": {
+        "t5_bias_max_distance": 500,
+        "type": "t5"
+      },
+      "time_reduction": 8
+    },
+    "name": "cascades"
+  },
+  "auto_map": {
+    "AutoConfig": "configuration_phi4mm.Phi4MMConfig",
+    "AutoModelForCausalLM": "microsoft/phi-4-multimodal-instruct--modeling_phi4mm.Phi4MMForCausalLM",
+    "AutoTokenizer": "microsoft/phi-4-multimodal-instruct--Xenova/gpt-4o"
+  },
+  "bos_token_id": 199999,
+  "embd_layer": {
+    "audio_embd_layer": {
+      "compression_rate": 8,
+      "downsample_rate": 1,
+      "embedding_cls": "audio",
+      "enable_gradient_checkpointing": true,
+      "projection_cls": "mlp",
+      "use_conv_downsample": false,
+      "use_qformer": false
+    },
+    "embedding_cls": "image_audio",
+    "image_embd_layer": {
+      "crop_size": 448,
+      "embedding_cls": "tune_image",
+      "enable_gradient_checkpointing": true,
+      "hd_transform_order": "sub_glb",
+      "image_token_compression_cls": "avg_pool_2d",
+      "projection_cls": "mlp",
+      "use_hd_transform": true,
+      "with_learnable_separator": true
+    }
+  },
+  "embd_pdrop": 0.0,
+  "eos_token_id": 199999,
+  "full_attn_mod": 1,
+ "glb_GN": [
78
+ [
79
+ [
80
+ 0.0240478515625,
81
+ -0.03466796875,
82
+ -0.0201416015625,
83
+ 0.0208740234375,
84
+ -0.0042724609375,
85
+ -0.034423828125,
86
+ 0.01043701171875,
87
+ -0.03955078125,
88
+ -0.0103759765625,
89
+ 0.0791015625,
90
+ -0.0225830078125,
91
+ 0.0174560546875,
92
+ 0.006622314453125,
93
+ -0.003143310546875,
94
+ 0.0272216796875,
95
+ 0.0400390625,
96
+ 0.0166015625,
97
+ -0.034912109375,
98
+ 0.015869140625,
99
+ -0.021728515625,
100
+ -0.0106201171875,
101
+ 0.0400390625,
102
+ 0.0081787109375,
103
+ -0.009521484375,
104
+ 0.0107421875,
105
+ 0.000499725341796875,
106
+ 0.0439453125,
107
+ -0.02734375,
108
+ 0.0179443359375,
109
+ -0.012451171875,
110
+ 0.042724609375,
111
+ 0.00043487548828125,
112
+ -0.00213623046875,
113
+ -0.0164794921875,
114
+ 0.0152587890625,
115
+ 0.034912109375,
116
+ 0.0111083984375,
117
+ -0.0732421875,
118
+ -0.017822265625,
119
+ -0.02783203125,
120
+ -0.024658203125,
121
+ -0.0126953125,
122
+ 0.00433349609375,
123
+ -0.0225830078125,
124
+ -0.0294189453125,
125
+ -0.006561279296875,
126
+ 0.027587890625,
127
+ 0.0286865234375,
128
+ 0.0164794921875,
129
+ -0.048583984375,
130
+ -0.061279296875,
131
+ 0.006927490234375,
132
+ -0.0225830078125,
133
+ 0.01434326171875,
134
+ 0.00130462646484375,
135
+ -0.07080078125,
136
+ -0.006011962890625,
137
+ -0.0228271484375,
138
+ 0.01300048828125,
139
+ 0.00225830078125,
140
+ -0.0052490234375,
141
+ -0.0218505859375,
142
+ -0.0025177001953125,
143
+ 0.05078125,
144
+ -0.0283203125,
145
+ -0.033203125,
146
+ -0.0279541015625,
147
+ 0.01025390625,
148
+ -0.011962890625,
149
+ 0.015625,
150
+ -0.0021514892578125,
151
+ 0.013671875,
152
+ 0.0634765625,
153
+ -0.0014190673828125,
154
+ 0.006256103515625,
155
+ 0.000865936279296875,
156
+ 0.03662109375,
157
+ -0.024169921875,
158
+ 0.030517578125,
159
+ 0.035888671875,
160
+ 0.00396728515625,
161
+ -0.035400390625,
162
+ 0.0311279296875,
163
+ -0.015869140625,
164
+ -0.00531005859375,
165
+ 0.0235595703125,
166
+ 0.003143310546875,
167
+ -0.02099609375,
168
+ -0.07177734375,
169
+ -0.035888671875,
170
+ -0.03125,
171
+ 0.021240234375,
172
+ -0.04833984375,
173
+ -0.0299072265625,
174
+ -0.10791015625,
175
+ 0.023681640625,
176
+ 0.0291748046875,
177
+ 0.003936767578125,
178
+ -0.0255126953125,
179
+ 0.018310546875,
180
+ 0.005767822265625,
181
+ 0.01422119140625,
182
+ 0.00787353515625,
183
+ -0.0030059814453125,
184
+ 0.053466796875,
185
+ 0.02734375,
186
+ 0.024658203125,
187
+ -0.0081787109375,
188
+ 0.0419921875,
189
+ -0.0240478515625,
190
+ -0.0208740234375,
191
+ 0.004058837890625,
192
+ -0.03369140625,
193
+ 0.0439453125,
194
+ -0.0625,
195
+ 0.003082275390625,
196
+ 0.01007080078125,
197
+ -0.047119140625,
198
+ -0.0224609375,
199
+ 0.0181884765625,
200
+ 0.0196533203125,
201
+ -0.004608154296875,
202
+ -0.0458984375,
203
+ 0.04736328125,
204
+ -0.01513671875,
205
+ -0.08349609375,
206
+ -0.0576171875,
207
+ -0.0263671875,
208
+ -0.0341796875,
209
+ -0.017578125,
210
+ 0.0145263671875,
211
+ 0.06884765625,
212
+ 0.0291748046875,
213
+ -0.0164794921875,
214
+ 0.0859375,
215
+ -0.02685546875,
216
+ 0.003021240234375,
217
+ -0.0181884765625,
218
+ 0.041015625,
219
+ 0.018310546875,
220
+ -0.04638671875,
221
+ -0.08056640625,
222
+ -0.03759765625,
223
+ 0.0086669921875,
224
+ -0.0244140625,
225
+ 0.01385498046875,
226
+ -0.050048828125,
227
+ -0.037841796875,
228
+ -0.014404296875,
229
+ 0.0196533203125,
230
+ 0.048095703125,
231
+ -0.05029296875,
232
+ 0.000946044921875,
233
+ -0.003875732421875,
234
+ 0.0078125,
235
+ -0.00726318359375,
236
+ -0.01275634765625,
237
+ 0.00193023681640625,
238
+ -0.01556396484375,
239
+ -0.03857421875,
240
+ -0.024169921875,
241
+ -0.009765625,
242
+ -0.0208740234375,
243
+ -0.01141357421875,
244
+ -0.043701171875,
245
+ -0.005096435546875,
246
+ -0.045654296875,
247
+ 0.064453125,
248
+ 0.038818359375,
249
+ 0.0004215240478515625,
250
+ 0.0274658203125,
251
+ 0.00299072265625,
252
+ -0.003265380859375,
253
+ -0.00811767578125,
254
+ -0.034912109375,
255
+ -0.023681640625,
256
+ -0.0238037109375,
257
+ -0.0015106201171875,
258
+ -0.0225830078125,
259
+ 0.005706787109375,
260
+ 0.040283203125,
261
+ 0.047119140625,
262
+ 0.00872802734375,
263
+ -0.00933837890625,
264
+ -0.0546875,
265
+ -0.007476806640625,
266
+ -0.02099609375,
267
+ 0.056396484375,
268
+ 0.0189208984375,
269
+ 0.0184326171875,
270
+ -0.0400390625,
271
+ -0.0142822265625,
272
+ -0.0703125,
273
+ -0.035400390625,
274
+ -0.0086669921875,
275
+ -0.0517578125,
276
+ -0.0289306640625,
277
+ 0.04736328125,
278
+ 0.0028533935546875,
279
+ 0.0439453125,
280
+ 0.0301513671875,
281
+ 0.019287109375,
282
+ -0.0185546875,
283
+ -0.0185546875,
284
+ -0.033935546875,
285
+ 0.0159912109375,
286
+ 0.01434326171875,
287
+ -0.0128173828125,
288
+ -0.0225830078125,
289
+ 0.056884765625,
290
+ 0.0556640625,
291
+ -0.03466796875,
292
+ 0.0135498046875,
293
+ 0.0137939453125,
294
+ 0.0732421875,
295
+ -0.01116943359375,
296
+ -0.0128173828125,
297
+ -0.0004100799560546875,
298
+ 0.01434326171875,
299
+ 0.0299072265625,
300
+ -0.01446533203125,
301
+ -0.050048828125,
302
+ -0.036376953125,
303
+ -0.00775146484375,
304
+ 0.00439453125,
305
+ 0.00811767578125,
306
+ 0.0147705078125,
307
+ 0.01019287109375,
308
+ 0.0019683837890625,
309
+ -0.00830078125,
310
+ -0.007659912109375,
311
+ 0.029541015625,
312
+ -0.003509521484375,
313
+ 0.043701171875,
314
+ -0.007781982421875,
315
+ 0.0211181640625,
316
+ -0.0208740234375,
317
+ 0.039794921875,
318
+ -0.03759765625,
319
+ 0.0045166015625,
320
+ 0.050048828125,
321
+ 0.0196533203125,
322
+ 0.043701171875,
323
+ 0.00848388671875,
324
+ -0.043212890625,
325
+ -0.049560546875,
326
+ -0.062255859375,
327
+ 0.0272216796875,
328
+ 0.03662109375,
329
+ -0.034912109375,
330
+ -0.01336669921875,
331
+ 0.05419921875,
332
+ -0.042236328125,
333
+ 0.000705718994140625,
334
+ 0.003753662109375,
335
+ 0.0225830078125,
336
+ 0.021240234375,
337
+ -0.0181884765625,
338
+ 0.0257568359375,
339
+ 0.0238037109375,
340
+ 0.0034332275390625,
341
+ 0.045166015625,
342
+ 0.021728515625,
343
+ -0.0037384033203125,
344
+ -0.000598907470703125,
345
+ 0.017578125,
346
+ -0.012939453125,
347
+ 0.040771484375,
348
+ -0.05419921875,
349
+ -0.015380859375,
350
+ -0.040771484375,
351
+ -0.004974365234375,
352
+ -0.06689453125,
353
+ 0.0419921875,
354
+ -0.00043487548828125,
355
+ 0.042724609375,
356
+ 0.01361083984375,
357
+ -0.013671875,
358
+ -0.048095703125,
359
+ -0.00787353515625,
360
+ -0.03076171875,
361
+ 0.05078125,
362
+ 0.0269775390625,
363
+ 0.0028076171875,
364
+ -0.0233154296875,
365
+ -0.0023956298828125,
366
+ -0.02294921875,
367
+ -0.0517578125,
368
+ 0.04541015625,
369
+ 0.0035247802734375,
370
+ -0.004302978515625,
371
+ 0.019775390625,
372
+ 0.002777099609375,
373
+ -0.04150390625,
374
+ 0.0150146484375,
375
+ 0.0166015625,
376
+ 0.01104736328125,
377
+ 0.0252685546875,
378
+ 0.02587890625,
379
+ -0.0079345703125,
380
+ -0.00347900390625,
381
+ -0.01171875,
382
+ -0.06298828125,
383
+ -0.023193359375,
384
+ 0.0233154296875,
385
+ -0.0311279296875,
386
+ 0.016845703125,
387
+ -0.006561279296875,
388
+ 0.0257568359375,
389
+ 0.048583984375,
390
+ -0.00567626953125,
391
+ -0.049072265625,
392
+ 0.00119781494140625,
393
+ 0.01416015625,
394
+ -0.0111083984375,
395
+ -0.01556396484375,
396
+ -0.022705078125,
397
+ -0.0184326171875,
398
+ -0.044189453125,
399
+ 0.00469970703125,
400
+ -0.0281982421875,
401
+ 0.031494140625,
402
+ 0.00970458984375,
403
+ -0.00604248046875,
404
+ -0.00023937225341796875,
405
+ 0.00732421875,
406
+ -0.032958984375,
407
+ -0.0361328125,
408
+ -0.00909423828125,
409
+ 0.03857421875,
410
+ -0.06201171875,
411
+ -0.0283203125,
412
+ 0.0791015625,
413
+ -0.0108642578125,
414
+ -0.049072265625,
415
+ 0.01068115234375,
416
+ -0.049072265625,
417
+ -0.0380859375,
418
+ -0.048583984375,
419
+ -0.026123046875,
420
+ -0.00872802734375,
421
+ 0.0021209716796875,
422
+ 0.00140380859375,
423
+ -0.0260009765625,
424
+ 0.0050048828125,
425
+ 0.010986328125,
426
+ -0.0028228759765625,
427
+ 0.0390625,
428
+ -0.0205078125,
429
+ -0.00543212890625,
430
+ -0.0113525390625,
431
+ 0.045166015625,
432
+ 0.00762939453125,
433
+ -0.029541015625,
434
+ -0.0106201171875,
435
+ -0.021484375,
436
+ -0.000362396240234375,
437
+ -0.025146484375,
438
+ -0.0419921875,
439
+ -0.04736328125,
440
+ -0.0186767578125,
441
+ -0.0029144287109375,
442
+ -0.04052734375,
443
+ -0.02734375,
444
+ -0.009521484375,
445
+ 0.0189208984375,
446
+ 0.033935546875,
447
+ -0.031982421875,
448
+ -0.044189453125,
449
+ -0.036376953125,
450
+ -0.0035400390625,
451
+ -0.0191650390625,
452
+ 0.0184326171875,
453
+ -0.0133056640625,
454
+ -0.0240478515625,
455
+ -0.05712890625,
456
+ -0.005157470703125,
457
+ 0.0208740234375,
458
+ 0.0172119140625,
459
+ -0.0034332275390625,
460
+ 0.068359375,
461
+ -0.0191650390625,
462
+ 0.004425048828125,
463
+ 0.04150390625,
464
+ -0.06689453125,
465
+ -0.0224609375,
466
+ -0.002899169921875,
467
+ 0.0167236328125,
468
+ -0.032958984375,
469
+ 0.037353515625,
470
+ -0.0184326171875,
471
+ -0.053466796875,
472
+ -0.0125732421875,
473
+ -0.04296875,
474
+ -0.003143310546875,
475
+ -0.05810546875,
476
+ 0.068359375,
477
+ -0.04150390625,
478
+ -0.01275634765625,
479
+ -0.017333984375,
480
+ -0.06787109375,
481
+ -0.03466796875,
482
+ 0.01806640625,
483
+ -0.00408935546875,
484
+ 0.0294189453125,
485
+ -0.0498046875,
486
+ 0.038330078125,
487
+ -0.0615234375,
488
+ 0.072265625,
489
+ 0.0267333984375,
490
+ -0.055908203125,
491
+ 0.0284423828125,
492
+ -0.0159912109375,
493
+ -0.016845703125,
494
+ 0.051513671875,
495
+ -0.002105712890625,
496
+ 0.0023193359375,
497
+ -0.00592041015625,
498
+ -0.00012874603271484375,
499
+ 0.0247802734375,
500
+ -0.024169921875,
501
+ -0.031982421875,
502
+ -0.0020294189453125,
503
+ -0.06787109375,
504
+ -0.0128173828125,
505
+ 0.0057373046875,
506
+ 0.034912109375,
507
+ -0.01416015625,
508
+ 0.004638671875,
509
+ 0.0032806396484375,
510
+ -0.022705078125,
511
+ -0.015625,
512
+ 0.03564453125,
513
+ -0.0272216796875,
514
+ -0.042724609375,
515
+ -0.03271484375,
516
+ 0.035400390625,
517
+ 0.0419921875,
518
+ 0.00787353515625,
519
+ 0.0281982421875,
520
+ -0.0037841796875,
521
+ -0.01177978515625,
522
+ -0.03857421875,
523
+ 0.056884765625,
524
+ -0.0189208984375,
525
+ 0.061767578125,
526
+ -0.036865234375,
527
+ 0.04638671875,
528
+ 0.060302734375,
529
+ -0.0537109375,
530
+ 0.0439453125,
531
+ 0.00799560546875,
532
+ -0.0196533203125,
533
+ 0.0010528564453125,
534
+ 0.0036468505859375,
535
+ -0.021728515625,
536
+ 0.0032806396484375,
537
+ -0.006256103515625,
538
+ 0.017822265625,
539
+ -0.045166015625,
540
+ -0.0380859375,
541
+ 0.0140380859375,
542
+ 0.016357421875,
543
+ -0.109375,
544
+ -0.05859375,
545
+ 0.047607421875,
546
+ 0.01031494140625,
547
+ -0.01348876953125,
548
+ 0.03466796875,
549
+ -0.01177978515625,
550
+ -0.013916015625,
551
+ -0.0205078125,
552
+ -0.0439453125,
553
+ -0.01214599609375,
554
+ 0.035400390625,
555
+ -0.0184326171875,
556
+ -0.017822265625,
557
+ 0.0361328125,
558
+ -0.03662109375,
559
+ 0.0257568359375,
560
+ 0.0022430419921875,
561
+ -0.03125,
562
+ -0.0267333984375,
563
+ -0.03271484375,
564
+ -0.0260009765625,
565
+ 0.0216064453125,
566
+ 0.04443359375,
567
+ -0.007293701171875,
568
+ -0.0177001953125,
569
+ -0.00286865234375,
570
+ -0.0017242431640625,
571
+ -0.0927734375,
572
+ -0.0164794921875,
573
+ 0.029052734375,
574
+ 0.0242919921875,
575
+ 0.0040283203125,
576
+ 0.012939453125,
577
+ 0.03857421875,
578
+ 0.020263671875,
579
+ -0.041015625,
580
+ -0.0169677734375,
581
+ -0.0301513671875,
582
+ 0.043212890625,
583
+ 0.045654296875,
584
+ 0.01708984375,
585
+ 0.036376953125,
586
+ 0.0125732421875,
587
+ -0.07177734375,
588
+ 0.006011962890625,
589
+ -0.01239013671875,
590
+ -0.0029296875,
591
+ 0.035888671875,
592
+ -0.03173828125,
593
+ 0.028564453125,
594
+ 0.0308837890625,
595
+ -0.0517578125,
596
+ 0.021728515625,
597
+ -0.0179443359375,
598
+ 0.044189453125,
599
+ 0.02783203125,
600
+ -0.0007476806640625,
601
+ 0.0026397705078125,
602
+ 0.02587890625,
603
+ 0.0625,
604
+ 0.06640625,
605
+ 0.0113525390625,
606
+ 0.027099609375,
607
+ 0.00119781494140625,
608
+ -0.021484375,
609
+ 0.0296630859375,
610
+ -0.0106201171875,
611
+ -0.023193359375,
612
+ 0.0322265625,
613
+ 0.03515625,
614
+ 0.00083160400390625,
615
+ -0.0238037109375,
616
+ 0.04443359375,
617
+ 0.013671875,
618
+ 0.011474609375,
619
+ -0.0205078125,
620
+ -0.0191650390625,
621
+ 0.04443359375,
622
+ -0.0225830078125,
623
+ -0.017822265625,
624
+ -0.0341796875,
625
+ 0.06494140625,
626
+ 0.0294189453125,
627
+ -0.040771484375,
628
+ -0.0235595703125,
629
+ 0.043701171875,
630
+ 0.01318359375,
631
+ -0.0277099609375,
632
+ 0.01055908203125,
633
+ -0.0081787109375,
634
+ -0.00714111328125,
635
+ 0.030029296875,
636
+ -0.032470703125,
637
+ -0.0030364990234375,
638
+ 0.01031494140625,
639
+ 0.0211181640625,
640
+ -0.095703125,
641
+ -0.0003795623779296875,
642
+ -0.01611328125,
643
+ 0.0205078125,
644
+ 0.004302978515625,
645
+ 0.00457763671875,
646
+ 0.0281982421875,
647
+ -0.03955078125,
648
+ 0.03369140625,
649
+ -0.011962890625,
650
+ -0.01348876953125,
651
+ 0.0081787109375,
652
+ 0.053955078125,
653
+ -0.02197265625,
654
+ -0.08935546875,
655
+ -0.0205078125,
656
+ 0.0269775390625,
657
+ -8.153915405273438e-05,
658
+ -0.0296630859375,
659
+ 0.034912109375,
660
+ -0.03369140625,
661
+ -0.001007080078125,
662
+ -0.045166015625,
663
+ -0.0093994140625,
664
+ 0.020263671875,
665
+ 0.0291748046875,
666
+ -0.026611328125,
667
+ -0.002197265625,
668
+ -0.030517578125,
669
+ 0.0244140625,
670
+ 0.0166015625,
671
+ 0.0272216796875,
672
+ -0.001312255859375,
673
+ -0.034912109375,
674
+ 0.035400390625,
675
+ 0.0257568359375,
676
+ 0.005279541015625,
677
+ 0.029052734375,
678
+ -0.0196533203125,
679
+ -0.0166015625,
680
+ -0.0002613067626953125,
681
+ -0.000545501708984375,
682
+ 0.0849609375,
683
+ -0.006103515625,
684
+ 0.0390625,
685
+ -0.0296630859375,
686
+ 0.041259765625,
687
+ 0.025634765625,
688
+ 0.01513671875,
689
+ -0.00555419921875,
690
+ 0.01348876953125,
691
+ 0.035400390625,
692
+ 0.01409912109375,
693
+ -0.01806640625,
694
+ -0.0302734375,
695
+ -0.060302734375,
696
+ -0.016845703125,
697
+ -0.016845703125,
698
+ 0.0189208984375,
699
+ -0.0311279296875,
700
+ -0.0537109375,
701
+ -0.0235595703125,
702
+ 0.0269775390625,
703
+ -0.0010223388671875,
704
+ 0.0299072265625,
705
+ 0.00140380859375,
706
+ 0.004974365234375,
707
+ 0.00982666015625,
708
+ 0.0028839111328125,
709
+ -0.0135498046875,
710
+ 0.0203857421875,
711
+ -0.0235595703125,
712
+ -0.0283203125,
713
+ 0.0018157958984375,
714
+ 0.01348876953125,
715
+ -0.0252685546875,
716
+ 0.0186767578125,
717
+ 0.04052734375,
718
+ -0.01324462890625,
719
+ 0.006866455078125,
720
+ 0.022705078125,
721
+ 0.0255126953125,
722
+ 0.012451171875,
723
+ -0.0189208984375,
724
+ -0.007476806640625,
725
+ 0.004425048828125,
726
+ 0.047607421875,
727
+ 0.0140380859375,
728
+ -0.06689453125,
729
+ 0.008056640625,
730
+ -0.0201416015625,
731
+ -0.034423828125,
732
+ 0.023193359375,
733
+ 0.0693359375,
734
+ 0.03125,
735
+ 0.0245361328125,
736
+ -0.029052734375,
737
+ 0.0252685546875,
738
+ -0.04150390625,
739
+ -0.007171630859375,
740
+ -0.0400390625,
741
+ 0.0166015625,
742
+ -0.025146484375,
743
+ -0.0162353515625,
744
+ -0.019287109375,
745
+ -0.0223388671875,
746
+ -0.0089111328125,
747
+ 0.02685546875,
748
+ -0.0634765625,
749
+ 0.050537109375,
750
+ 0.023193359375,
751
+ 0.04931640625,
752
+ 0.0111083984375,
753
+ 0.01275634765625,
754
+ 0.0380859375,
755
+ 0.05419921875,
756
+ -0.05859375,
757
+ -0.0208740234375,
758
+ -0.046142578125,
759
+ 0.01385498046875,
760
+ 0.0081787109375,
761
+ 0.0240478515625,
762
+ 0.0081787109375,
763
+ 0.04443359375,
764
+ -0.04736328125,
765
+ 0.021240234375,
766
+ -0.0084228515625,
767
+ -0.005767822265625,
768
+ 0.0140380859375,
769
+ -0.02587890625,
770
+ 0.0014190673828125,
771
+ -0.0179443359375,
772
+ -0.0267333984375,
773
+ -0.0322265625,
774
+ 0.036376953125,
775
+ -0.049560546875,
776
+ -0.005340576171875,
777
+ 0.021240234375,
778
+ 0.004913330078125,
779
+ 0.02490234375,
780
+ 0.007293701171875,
781
+ -0.0517578125,
782
+ 0.00799560546875,
783
+ -0.040771484375,
784
+ -0.03857421875,
785
+ -0.040283203125,
786
+ -0.007568359375,
787
+ -0.0250244140625,
788
+ -0.0230712890625,
789
+ 0.042724609375,
790
+ 0.0172119140625,
791
+ -0.0185546875,
792
+ -0.01446533203125,
793
+ 0.0296630859375,
794
+ 0.02099609375,
795
+ 0.030029296875,
796
+ 0.03515625,
797
+ -0.0277099609375,
798
+ -0.05029296875,
799
+ 0.031494140625,
800
+ -0.00262451171875,
801
+ -0.02001953125,
802
+ 0.033447265625,
803
+ 0.06103515625,
804
+ -0.0179443359375,
805
+ -0.03564453125,
806
+ -0.0194091796875,
807
+ -0.062255859375,
808
+ 0.0037994384765625,
809
+ 0.038330078125,
810
+ 0.0712890625,
811
+ -0.0380859375,
812
+ 0.00051116943359375,
813
+ 0.033203125,
814
+ 0.025634765625,
815
+ -0.02294921875,
816
+ 0.0247802734375,
817
+ 0.033935546875,
818
+ 0.03955078125,
819
+ -0.01397705078125,
820
+ -0.006103515625,
821
+ -0.062255859375,
822
+ -0.0322265625,
823
+ -0.004119873046875,
824
+ -0.017822265625,
825
+ 0.017333984375,
826
+ 0.04345703125,
827
+ -0.002471923828125,
828
+ 0.0277099609375,
829
+ -0.0162353515625,
830
+ 0.0751953125,
831
+ -0.005828857421875,
832
+ -0.017578125,
833
+ -0.0220947265625,
834
+ -0.0439453125,
835
+ -0.022705078125,
836
+ -0.028076171875,
837
+ -0.0164794921875,
838
+ 0.0260009765625,
839
+ -0.014892578125,
840
+ -0.01806640625,
841
+ -0.01141357421875,
842
+ -0.04248046875,
843
+ -0.0693359375,
844
+ 0.01141357421875,
845
+ 0.0211181640625,
846
+ 0.007415771484375,
847
+ -0.03466796875,
848
+ 0.024658203125,
849
+ 0.016357421875,
850
+ 0.04443359375,
851
+ 0.00830078125,
852
+ -0.033447265625,
853
+ 0.0012359619140625,
854
+ -0.036865234375,
855
+ 0.0286865234375,
856
+ -0.04150390625,
857
+ -0.0308837890625,
858
+ 0.059326171875,
859
+ -0.0213623046875,
860
+ 0.0140380859375,
861
+ 0.060302734375,
862
+ 0.0101318359375,
863
+ 0.052490234375,
864
+ 0.0242919921875,
865
+ -0.0213623046875,
866
+ 0.03857421875,
867
+ -0.000690460205078125,
868
+ 0.048583984375,
869
+ -0.01300048828125,
870
+ 0.006439208984375,
871
+ 0.005950927734375,
872
+ -0.06884765625,
873
+ -0.004364013671875,
874
+ 0.0302734375,
875
+ 0.021728515625,
876
+ 0.029541015625,
877
+ 0.0196533203125,
878
+ -0.0048828125,
879
+ -0.0172119140625,
880
+ 0.0009002685546875,
881
+ -0.0419921875,
882
+ -0.0185546875,
883
+ 0.06396484375,
884
+ -0.0028839111328125,
885
+ 0.0272216796875,
886
+ 0.0247802734375,
887
+ -0.018310546875,
888
+ 0.04052734375,
889
+ 0.06494140625,
890
+ 0.0233154296875,
891
+ -0.0001506805419921875,
892
+ -0.0250244140625,
893
+ -0.06103515625,
894
+ 0.00286865234375,
895
+ -0.00927734375,
896
+ -0.01025390625,
897
+ -0.03466796875,
898
+ -0.00116729736328125,
899
+ 0.029052734375,
900
+ 0.0150146484375,
901
+ 0.0130615234375,
902
+ 0.068359375,
903
+ 0.054931640625,
904
+ 0.037109375,
905
+ 0.025634765625,
906
+ -0.02587890625,
907
+ 0.0458984375,
908
+ 0.06591796875,
909
+ 0.01239013671875,
910
+ -0.0262451171875,
911
+ 0.10693359375,
912
+ -0.07421875,
913
+ -0.0174560546875,
914
+ -0.00604248046875,
915
+ -0.017578125,
916
+ 0.06103515625,
917
+ 0.0322265625,
918
+ -0.040771484375,
919
+ -0.0026397705078125,
920
+ 0.0037841796875,
921
+ -0.05859375,
922
+ -0.03662109375,
923
+ 0.0029449462890625,
924
+ -0.0245361328125,
925
+ 0.0179443359375,
926
+ 0.0220947265625,
927
+ 0.00726318359375,
928
+ -0.01458740234375,
929
+ 0.0054931640625,
930
+ 0.036376953125,
931
+ 0.02099609375,
932
+ 0.0162353515625,
933
+ -0.0250244140625,
934
+ 0.109375,
935
+ -0.024658203125,
936
+ -0.0206298828125,
937
+ -0.0269775390625,
938
+ -0.01043701171875,
939
+ -0.00994873046875,
940
+ -0.007720947265625,
941
+ -0.0002593994140625,
942
+ -0.01385498046875,
943
+ 0.01153564453125,
944
+ 0.0250244140625,
945
+ -0.017333984375,
946
+ -0.034912109375,
947
+ -0.004913330078125,
948
+ -0.0223388671875,
949
+ 0.053955078125,
950
+ 0.033447265625,
951
+ -0.01123046875,
952
+ -0.0213623046875,
953
+ 0.02880859375,
954
+ -0.0059814453125,
955
+ 0.00909423828125,
956
+ 0.0021820068359375,
957
+ -0.050048828125,
958
+ 0.044677734375,
959
+ -0.025390625,
960
+ -0.032958984375,
961
+ -0.033447265625,
962
+ -0.0250244140625,
963
+ -0.047607421875,
964
+ -0.02197265625,
965
+ -0.017333984375,
966
+ -0.00897216796875,
967
+ -0.037353515625,
968
+ -0.047607421875,
969
+ -0.006866455078125,
970
+ 0.0145263671875,
971
+ 0.0245361328125,
972
+ 0.0262451171875,
973
+ 0.01953125,
974
+ 0.036376953125,
975
+ 0.0859375,
976
+ -0.01177978515625,
977
+ -0.00994873046875,
978
+ -0.047119140625,
979
+ 0.0166015625,
980
+ -0.01025390625,
981
+ 0.0093994140625,
982
+ -0.0274658203125,
983
+ -0.0220947265625,
984
+ -0.03369140625,
985
+ -0.00518798828125,
986
+ -0.03466796875,
987
+ 0.00179290771484375,
988
+ 0.03173828125,
989
+ -0.0032958984375,
990
+ 0.036376953125,
991
+ 0.0927734375,
992
+ -0.01531982421875,
993
+ -0.037109375,
994
+ -0.0380859375,
995
+ -0.0147705078125,
996
+ 0.026611328125,
997
+ -0.01165771484375,
998
+ -0.0322265625,
999
+ 0.031005859375,
1000
+ -0.0147705078125,
1001
+ 0.00885009765625,
1002
+ 0.0262451171875,
1003
+ -0.01239013671875,
1004
+ 0.01226806640625,
1005
+ -0.0179443359375,
1006
+ 0.030029296875,
1007
+ -0.0234375,
1008
+ 0.0028076171875,
1009
+ -0.00665283203125,
1010
+ -0.0230712890625,
1011
+ -0.0029296875,
1012
+ -0.02783203125,
1013
+ -0.01190185546875,
1014
+ 0.00299072265625,
1015
+ -0.031982421875,
1016
+ -0.021728515625,
1017
+ 0.0262451171875,
1018
+ 0.04541015625,
1019
+ 0.00189208984375,
1020
+ 0.00811767578125,
1021
+ -0.030029296875,
1022
+ -0.0211181640625,
1023
+ 0.05615234375,
1024
+ 0.00994873046875,
1025
+ -0.0157470703125,
1026
+ 0.03369140625,
1027
+ 0.006683349609375,
1028
+ 0.000865936279296875,
1029
+ -0.0059814453125,
1030
+ -0.007476806640625,
1031
+ -0.0238037109375,
1032
+ 0.0458984375,
1033
+ -0.004119873046875,
1034
+ 0.0230712890625,
1035
+ 0.00732421875,
1036
+ 0.0225830078125,
1037
+ 0.0294189453125,
1038
+ -0.0302734375,
1039
+ -0.023681640625,
1040
+ 0.026123046875,
1041
+ 0.05029296875,
1042
+ 0.056640625,
1043
+ 0.00860595703125,
1044
+ 0.01104736328125,
1045
+ -0.01129150390625,
1046
+ -0.00092315673828125,
1047
+ 0.007293701171875,
1048
+ 0.040771484375,
1049
+ 0.002655029296875,
1050
+ 0.0174560546875,
1051
+ -0.0162353515625,
1052
+ 0.045166015625,
1053
+ -0.026123046875,
1054
+ 0.0022125244140625,
1055
+ 0.02685546875,
1056
+ 0.03173828125,
1057
+ 0.00830078125,
1058
+ -0.0556640625,
1059
+ -0.037109375,
1060
+ 0.0693359375,
1061
+ 0.0291748046875,
1062
+ 0.052490234375,
1063
+ 0.038818359375,
1064
+ 0.0152587890625,
1065
+ -0.03369140625,
1066
+ -0.0218505859375,
1067
+ 0.0157470703125,
1068
+ -0.0260009765625,
1069
+ 0.005706787109375,
1070
+ 0.005462646484375,
1071
+ 0.00494384765625,
1072
+ 0.00885009765625,
1073
+ 0.002044677734375,
1074
+ 0.057861328125,
1075
+ 0.029296875,
1076
+ -0.0311279296875,
1077
+ -0.03662109375,
1078
+ -0.01416015625,
1079
+ 0.007293701171875,
1080
+ 0.018798828125,
1081
+ -0.043701171875,
1082
+ 0.011962890625,
1083
+ 0.0296630859375,
1084
+ 0.00299072265625,
1085
+ -0.023681640625,
1086
+ -0.04443359375,
1087
+ 0.0233154296875,
1088
+ -0.031005859375,
1089
+ 0.0181884765625,
1090
+ 0.05517578125,
1091
+ -0.0010528564453125,
1092
+ -0.00075531005859375,
1093
+ 0.0157470703125,
1094
+ 0.015869140625,
1095
+ -0.0419921875,
1096
+ 0.00775146484375,
1097
+ -0.0159912109375,
1098
+ 0.0186767578125,
1099
+ -0.03857421875,
1100
+ 0.00115966796875,
1101
+ -0.01336669921875,
1102
+ 0.00933837890625,
1103
+ -0.01080322265625,
1104
+ -0.0556640625,
1105
+ 0.00433349609375,
1106
+ -0.0147705078125,
1107
+ 0.03466796875,
1108
+ -0.0308837890625,
1109
+ -0.00162506103515625,
1110
+ 0.050048828125,
1111
+ -0.04150390625,
1112
+ -0.0198974609375,
1113
+ -0.0155029296875,
1114
+ 0.0267333984375,
1115
+ 0.034423828125,
1116
+ 0.03466796875,
1117
+ -0.037841796875,
1118
+ 0.034912109375,
1119
+ 0.0017547607421875,
1120
+ 0.0260009765625,
1121
+ -0.0174560546875,
1122
+ -0.046630859375,
1123
+ -0.0159912109375,
1124
+ -0.0238037109375,
1125
+ 0.04150390625,
1126
+ -0.03759765625,
1127
+ 0.0093994140625,
1128
+ 0.0196533203125,
1129
+ -0.019287109375,
1130
+ 0.01214599609375,
1131
+ 0.01318359375,
1132
+ -0.0203857421875,
1133
+ -0.01318359375,
1134
+ -0.01904296875,
1135
+ 0.0235595703125,
1136
+ 0.0101318359375,
1137
+ 0.003326416015625,
1138
+ -0.04345703125,
1139
+ -0.003265380859375,
1140
+ 0.050537109375,
1141
+ -0.021240234375,
1142
+ 0.0281982421875,
1143
+ -0.004302978515625,
1144
+ 0.0595703125,
1145
+ -0.0062255859375,
1146
+ 0.0145263671875,
1147
+ 0.01214599609375,
1148
+ 0.00250244140625,
1149
+ -0.00909423828125,
1150
+ -0.01519775390625,
1151
+ -0.018310546875,
1152
+ 0.00946044921875,
1153
+ -0.064453125,
1154
+ 0.052490234375,
1155
+ 0.037353515625,
1156
+ 0.00823974609375,
1157
+ -0.0074462890625,
1158
+ -0.044189453125,
1159
+ 0.023193359375,
1160
+ 0.0400390625,
1161
+ 0.003143310546875,
1162
+ -0.00012493133544921875,
1163
+ -0.0230712890625,
1164
+ -0.0169677734375,
1165
+ -0.0032806396484375,
1166
+ -0.0269775390625,
1167
+ 0.01495361328125,
1168
+ 0.033203125,
1169
+ -0.0390625,
1170
+ 0.024169921875,
1171
+ -0.05517578125,
1172
+ 0.01416015625,
1173
+ -0.0057373046875,
1174
+ 0.052490234375,
1175
+ -0.00439453125,
1176
+ -0.039306640625,
1177
+ -0.08056640625,
1178
+ 0.049072265625,
1179
+ 0.002227783203125,
1180
+ 0.02197265625,
1181
+ -0.052978515625,
1182
+ -0.0203857421875,
1183
+ 0.034423828125,
1184
+ -0.0096435546875,
1185
+ 0.043212890625,
1186
+ 0.0361328125,
1187
+ -0.03662109375,
1188
+ 0.038330078125,
1189
+ -0.0380859375,
1190
+ -0.040283203125,
1191
+ 0.0213623046875,
1192
+ 0.02294921875,
1193
+ -0.00152587890625,
1194
+ -0.04296875,
1195
+ 0.0400390625,
1196
+ -0.01361083984375,
1197
+ -0.00872802734375,
1198
+ -0.03125,
1199
+ -0.007476806640625,
1200
+ 0.0267333984375,
1201
+ -0.000583648681640625,
1202
+ -0.06201171875,
1203
+ -0.048828125,
1204
+ 0.041015625,
1205
+ -0.000545501708984375,
1206
+ 0.041015625,
1207
+ 0.052001953125,
1208
+ 0.019287109375,
1209
+ -0.014892578125,
1210
+ 0.01434326171875,
1211
+ 0.0120849609375,
1212
+ 0.0059814453125,
1213
+ -0.0186767578125,
1214
+ 0.01483154296875,
1215
+ -0.02978515625,
1216
+ -0.024658203125,
1217
+ -0.0322265625,
1218
+ 0.056396484375,
1219
+ 0.061279296875,
1220
+ -0.02099609375,
1221
+ -0.0172119140625,
1222
+ 0.0279541015625,
1223
+ 0.02294921875,
1224
+ -0.02099609375,
1225
+ -0.04541015625,
1226
+ -0.00897216796875,
1227
+ -0.032470703125,
1228
+ 0.040283203125,
1229
+ -0.040283203125,
1230
+ -0.040771484375,
1231
+ -0.06787109375
1232
+ ]
1233
+ ]
1234
+ ],
+ "hd_transform_order": "sub_glb",
+ "hidden_act": "silu",
+ "hidden_size": 3072,
+ "image_size": 448,
+ "img_processor": {
+ "_attn_implementation_autoset": true,
+ "_flash_attn_2_enabled": false,
+ "_name_or_path": "",
+ "add_cross_attention": false,
+ "architectures": null,
+ "attention_dropout": 0.0,
+ "bad_words_ids": null,
+ "begin_suppress_tokens": null,
+ "bos_token_id": null,
+ "chunk_size_feed_forward": 0,
+ "cross_attention_hidden_size": null,
+ "decoder_start_token_id": null,
+ "diversity_penalty": 0.0,
+ "do_sample": false,
+ "early_stopping": false,
+ "encoder_no_repeat_ngram_size": 0,
+ "eos_token_id": null,
+ "exponential_decay_length_penalty": null,
+ "finetuning_task": null,
+ "forced_bos_token_id": null,
+ "forced_eos_token_id": null,
+ "hidden_act": "gelu_pytorch_tanh",
+ "hidden_size": 1152,
+ "id2label": {
+ "0": "LABEL_0",
+ "1": "LABEL_1"
+ },
+ "image_size": 448,
+ "intermediate_size": 4304,
+ "is_decoder": false,
+ "is_encoder_decoder": false,
+ "label2id": {
+ "LABEL_0": 0,
+ "LABEL_1": 1
+ },
+ "layer_norm_eps": 1e-06,
+ "length_penalty": 1.0,
+ "max_length": 20,
+ "min_length": 0,
+ "model_type": "siglip_vision_model",
+ "no_repeat_ngram_size": 0,
+ "num_attention_heads": 16,
+ "num_beam_groups": 1,
+ "num_beams": 1,
+ "num_channels": 3,
+ "num_hidden_layers": 27,
+ "num_return_sequences": 1,
+ "output_attentions": false,
+ "output_hidden_states": false,
+ "output_scores": false,
+ "pad_token_id": null,
+ "patch_size": 14,
+ "prefix": null,
+ "problem_type": null,
+ "pruned_heads": {},
+ "remove_invalid_values": false,
+ "repetition_penalty": 1.0,
+ "return_dict": true,
+ "return_dict_in_generate": false,
+ "sep_token_id": null,
+ "suppress_tokens": null,
+ "task_specific_params": null,
+ "temperature": 1.0,
+ "tf_legacy_loss": false,
+ "tie_encoder_decoder": false,
+ "tie_word_embeddings": true,
+ "tokenizer_class": null,
+ "top_k": 50,
+ "top_p": 1.0,
+ "torch_dtype": null,
+ "torchscript": false,
+ "transformers_version": "4.51.0",
+ "typical_p": 1.0,
+ "use_bfloat16": false
+ },
+ "initializer_range": 0.02,
+ "intermediate_size": 8192,
+ "interpolate_factor": 1,
+ "lm_head_bias": false,
+ "max_position_embeddings": 131072,
+ "mlp_bias": false,
+ "model_type": "phi4mm",
+ "num_attention_heads": 24,
+ "num_hidden_layers": 32,
+ "num_img_tokens": 256,
+ "num_key_value_heads": 8,
+ "original_max_position_embeddings": 4096,
+ "pad_token_id": 199999,
+ "partial_rotary_factor": 0.75,
+ "resid_pdrop": 0.0,
+ "rms_norm_eps": 1e-05,
+ "rope_scaling": {
+ "long_factor": [
+ 1,
+ 1.118320672,
+ 1.250641126,
+ 1.398617824,
+ 1.564103225,
+ 1.74916897,
+ 1.956131817,
+ 2.187582649,
+ 2.446418898,
+ 2.735880826,
+ 3.059592084,
+ 3.421605075,
+ 3.826451687,
+ 4.279200023,
+ 4.785517845,
+ 5.351743533,
+ 5.984965424,
+ 6.693110555,
+ 7.485043894,
+ 8.370679318,
+ 9.36110372,
+ 10.4687158,
+ 11.70738129,
+ 13.09260651,
+ 14.64173252,
+ 16.37415215,
+ 18.31155283,
+ 20.47818807,
+ 22.90118105,
+ 25.61086418,
+ 28.64115884,
+ 32.03,
+ 32.1,
+ 32.13,
+ 32.23,
+ 32.6,
+ 32.61,
+ 32.64,
+ 32.66,
+ 32.7,
+ 32.71,
+ 32.93,
+ 32.97,
+ 33.28,
+ 33.49,
+ 33.5,
+ 44.16,
+ 47.77
+ ],
+ "short_factor": [
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0,
+ 1.0
+ ],
+ "type": "longrope"
+ },
+ "rope_theta": 10000.0,
+ "sliding_window": 262144,
+ "speech_lora": {
+ "dp": 0.01,
+ "layer": "((layers.*self_attn\\.(qkv|o)_proj)|(layers.*mlp\\.(gate_up|down)_proj))",
+ "lora_alpha": 640,
+ "r": 320
+ },
+ "sub_GN": [
+ [
+ [
+ [
+ 0.01287841796875,
+ 0.01202392578125,
+ -0.0006866455078125,
+ -0.004180908203125,
+ -3.743171691894531e-05,
+ -0.000934600830078125,
+ 0.001434326171875,
+ 0.007476806640625,
+ -0.0035400390625,
+ -0.0196533203125,
+ 0.00775146484375,
+ 0.00098419189453125,
+ 0.00921630859375,
+ 3.218650817871094e-05,
+ 0.009765625,
+ -0.0120849609375,
+ -0.004241943359375,
+ 0.00994873046875,
+ 0.0013580322265625,
+ 0.0012054443359375,
+ 0.0047607421875,
+ -0.00185394287109375,
+ -0.0242919921875,
+ 0.01214599609375,
+ -0.0101318359375,
+ -0.00070953369140625,
+ -0.005126953125,
+ -0.004425048828125,
+ -0.01251220703125,
+ 0.004119873046875,
+ -0.00274658203125,
+ -0.01055908203125,
+ 0.00494384765625,
+ -0.0028228759765625,
+ 0.0024261474609375,
+ 0.0064697265625,
+ 0.000865936279296875,
+ -0.00103759765625,
+ -0.0025787353515625,
+ 0.0166015625,
+ -0.000675201416015625,
+ 0.01177978515625,
+ -0.00018024444580078125,
+ 0.00238037109375,
+ -0.003326416015625,
+ 0.00153350830078125,
+ -0.00086212158203125,
+ -0.00628662109375,
+ -6.079673767089844e-05,
+ 0.005828857421875,
+ 0.001495361328125,
+ -0.01275634765625,
+ -0.00909423828125,
+ 0.00592041015625,
+ 4.863739013671875e-05,
+ 0.0067138671875,
+ -0.003631591796875,
+ 0.0024871826171875,
+ -8.106231689453125e-05,
+ -0.00148773193359375,
+ -1.2993812561035156e-05,
+ 0.00982666015625,
+ 0.004669189453125,
+ -0.003570556640625,
+ 0.01092529296875,
+ 0.0174560546875,
+ -0.005645751953125,
+ 0.01263427734375,
+ 0.00909423828125,
+ -0.00494384765625,
+ 0.00604248046875,
+ -0.0164794921875,
+ -0.0016326904296875,
+ -0.00112152099609375,
+ 0.00177764892578125,
+ -0.00139617919921875,
+ -0.00653076171875,
+ 0.00982666015625,
+ 0.000370025634765625,
+ -0.0159912109375,
+ 0.00171661376953125,
+ 0.0164794921875,
+ -0.0074462890625,
+ -0.004638671875,
+ -0.01007080078125,
+ -0.004913330078125,
+ 0.0177001953125,
+ -0.00689697265625,
+ 0.0059814453125,
+ 0.014892578125,
+ -0.00927734375,
+ 0.025146484375,
+ 0.0042724609375,
+ -0.00060272216796875,
+ 0.0189208984375,
+ 0.007232666015625,
+ -0.002349853515625,
+ 0.01483154296875,
+ -0.005279541015625,
+ -0.00933837890625,
+ -0.000530242919921875,
+ -0.00811767578125,
+ 0.00848388671875,
+ 0.00225830078125,
+ -0.0026702880859375,
+ -0.016357421875,
+ 0.0034027099609375,
+ -0.006317138671875,
+ -0.00830078125,
+ -0.007476806640625,
+ 0.016357421875,
+ 0.00408935546875,
+ -0.0016632080078125,
+ -0.00872802734375,
+ -0.00787353515625,
+ -0.0021820068359375,
+ 0.00185394287109375,
+ -0.002685546875,
+ -0.013427734375,
+ -0.006744384765625,
+ 4.267692565917969e-05,
+ 0.00372314453125,
+ -0.005340576171875,
+ 0.0010223388671875,
+ -0.0078125,
+ -0.0021209716796875,
+ 0.00994873046875,
+ 0.00616455078125,
+ 0.0277099609375,
+ -0.0096435546875,
+ -0.01300048828125,
+ -0.0167236328125,
+ -0.01220703125,
+ -0.01214599609375,
+ -0.0016326904296875,
+ -0.002685546875,
+ 0.0016632080078125,
+ -0.0177001953125,
+ -0.01080322265625,
+ -0.009521484375,
+ 0.009765625,
+ 0.0107421875,
+ 0.007171630859375,
+ -0.0030364990234375,
+ 0.01141357421875,
+ -0.012451171875,
+ -0.004608154296875,
+ 0.004669189453125,
+ -0.003265380859375,
+ -0.00970458984375,
+ -0.00860595703125,
+ -0.0103759765625,
+ 0.003326416015625,
+ 0.0167236328125,
+ 0.0084228515625,
+ 0.000736236572265625,
+ -0.0032806396484375,
+ 0.0125732421875,
+ -0.004241943359375,
+ 0.0123291015625,
+ -0.0057373046875,
+ 0.0081787109375,
+ 0.0029296875,
+ -0.00872802734375,
+ -0.00150299072265625,
+ 0.01275634765625,
+ 0.0016937255859375,
+ -0.00616455078125,
+ 0.01275634765625,
+ -0.0007171630859375,
+ -0.0220947265625,
+ -0.0042724609375,
+ -0.000949859619140625,
+ 0.004486083984375,
+ 0.0029754638671875,
+ -0.004638671875,
+ 0.0076904296875,
+ 0.00070953369140625,
+ 0.0029449462890625,
+ 0.002227783203125,
+ -0.01544189453125,
+ -0.01080322265625,
+ -0.00057220458984375,
+ 0.00021648406982421875,
+ 0.019775390625,
+ -0.006317138671875,
+ -0.017333984375,
+ -0.015869140625,
+ -0.0032958984375,
+ 0.0120849609375,
+ 0.00518798828125,
+ 0.004669189453125,
+ 0.0164794921875,
+ 0.004119873046875,
+ -0.0007476806640625,
+ -0.0036773681640625,
+ -0.001953125,
+ -0.006805419921875,
+ 0.007537841796875,
+ 0.003265380859375,
+ -0.017822265625,
+ -0.00592041015625,
+ -0.00131988525390625,
+ 0.00714111328125,
+ 0.0079345703125,
+ -0.0015106201171875,
+ 0.004119873046875,
+ 0.0027008056640625,
+ 0.01531982421875,
+ -0.00537109375,
+ -0.00225830078125,
+ -0.0001583099365234375,
+ -0.005828857421875,
+ 0.01336669921875,
+ -0.0069580078125,
+ 0.01312255859375,
+ 0.0262451171875,
+ -0.0027923583984375,
+ 0.006103515625,
+ -0.0166015625,
+ 0.0074462890625,
+ 0.01092529296875,
+ 0.005859375,
+ -0.00921630859375,
+ 0.00640869140625,
+ -0.01007080078125,
+ 0.002105712890625,
+ 0.006072998046875,
+ -0.0093994140625,
+ 0.006011962890625,
+ -0.004425048828125,
+ -0.0164794921875,
+ -0.00909423828125,
+ -0.017333984375,
+ 0.00823974609375,
+ -0.007293701171875,
+ 0.006744384765625,
+ -0.005340576171875,
+ -0.004241943359375,
+ 0.00799560546875,
+ -0.0048828125,
+ -0.01513671875,
+ -0.011474609375,
+ -0.00897216796875,
+ 0.017578125,
+ -0.006683349609375,
+ 0.01025390625,
+ -0.0059814453125,
+ -8.153915405273438e-05,
+ 0.00750732421875,
+ 0.0020294189453125,
+ -0.0033721923828125,
+ 0.00250244140625,
+ 0.005523681640625,
+ -0.00150299072265625,
+ -0.00994873046875,
+ 0.00110626220703125,
+ 0.0084228515625,
+ -0.0098876953125,
+ -0.0245361328125,
+ -0.01495361328125,
+ -0.0078125,
+ -0.0137939453125,
+ -0.00093841552734375,
+ -0.00811767578125,
+ -0.003631591796875,
+ -0.010009765625,
+ -0.01519775390625,
+ 0.00677490234375,
+ 0.0140380859375,
+ -0.0064697265625,
+ -0.002349853515625,
+ 0.003021240234375,
+ -0.0032501220703125,
+ -0.001434326171875,
+ -0.0120849609375,
+ 0.00421142578125,
+ -0.0130615234375,
+ -0.001068115234375,
+ -0.0126953125,
+ 0.0022125244140625,
+ -0.000629425048828125,
+ -0.00140380859375,
+ 0.004669189453125,
+ 0.0062255859375,
+ 0.005584716796875,
+ 0.0018463134765625,
+ 0.01116943359375,
+ -0.0062255859375,
+ 0.0009918212890625,
+ 0.00122833251953125,
+ 0.01141357421875,
+ -0.009521484375,
+ 0.017578125,
+ 0.006561279296875,
+ 0.003875732421875,
+ -0.0107421875,
+ -0.00994873046875,
+ -0.0069580078125,
+ 0.01470947265625,
+ -0.00421142578125,
+ 0.006103515625,
+ 0.000392913818359375,
+ 0.004119873046875,
+ 0.0052490234375,
+ -0.00060272216796875,
+ -0.01080322265625,
+ -0.01068115234375,
+ -0.000774383544921875,
+ -0.0172119140625,
+ -0.000835418701171875,
+ -0.0096435546875,
+ 0.0022735595703125,
+ -0.001434326171875,
+ 0.003692626953125,
+ -0.00119781494140625,
+ 0.0026092529296875,
+ 0.02490234375,
+ 0.015380859375,
+ -0.0201416015625,
+ 0.0238037109375,
+ -0.0103759765625,
+ -0.009033203125,
+ -0.01348876953125,
+ 0.00125885009765625,
+ 0.016845703125,
+ -0.0028533935546875,
+ -0.005126953125,
+ -0.0130615234375,
+ -0.00970458984375,
+ 0.00933837890625,
+ 0.01611328125,
+ -0.0076904296875,
+ -0.002197265625,
+ 0.006988525390625,
+ -0.0223388671875,
+ 0.00445556640625,
+ -0.00433349609375,
+ 0.0084228515625,
+ -0.00762939453125,
+ -0.0064697265625,
+ 0.0150146484375,
+ 0.0150146484375,
+ -0.017333984375,
+ 0.017822265625,
+ 0.00177764892578125,
+ 0.00921630859375,
+ -0.00927734375,
+ 0.0028533935546875,
+ -2.2411346435546875e-05,
+ -0.00130462646484375,
+ -0.00433349609375,
+ -0.0013580322265625,
+ 0.01202392578125,
+ -0.0029754638671875,
+ -0.000385284423828125,
+ -0.004608154296875,
+ -0.0037841796875,
+ 0.002166748046875,
+ 0.01068115234375,
+ -0.00506591796875,
+ 0.001617431640625,
+ -0.0107421875,
+ -7.724761962890625e-05,
+ -0.005523681640625,
+ 0.012451171875,
+ -0.00341796875,
+ 0.00286865234375,
+ 0.0244140625,
+ 0.0032196044921875,
+ 0.0048828125,
+ 0.0177001953125,
+ -0.006072998046875,
+ 0.0087890625,
+ 0.00017833709716796875,
+ -0.00799560546875,
+ -0.0250244140625,
+ 0.003326416015625,
+ 0.0017242431640625,
+ 0.004791259765625,
+ -0.0159912109375,
+ -0.00177764892578125,
+ 0.019775390625,
+ -0.0086669921875,
+ 0.01422119140625,
+ -0.005950927734375,
+ 0.005035400390625,
+ -0.011474609375,
+ 0.00238037109375,
+ -0.004547119140625,
+ 0.01177978515625,
+ 0.0115966796875,
+ 0.0030517578125,
+ -8.7738037109375e-05,
+ -0.00335693359375,
+ 0.00592041015625,
+ 0.009033203125,
+ 0.00139617919921875,
+ -0.0185546875,
+ -0.004547119140625,
+ 0.00543212890625,
+ 0.02001953125,
+ -0.01019287109375,
+ -0.01275634765625,
+ 0.005950927734375,
+ 0.00921630859375,
+ 0.00131988525390625,
+ 2.2530555725097656e-05,
+ -0.00604248046875,
+ 0.00885009765625,
+ -0.000335693359375,
+ -0.00848388671875,
+ -0.0072021484375,
+ 0.0037841796875,
+ 0.00177764892578125,
+ -0.0113525390625,
+ -0.00909423828125,
+ 0.004669189453125,
+ -0.01153564453125,
+ 0.00390625,
+ 0.01116943359375,
+ -0.002288818359375,
+ -0.005615234375,
+ -0.00051116943359375,
+ 0.0029144287109375,
+ 0.0159912109375,
+ -0.017578125,
+ -0.01416015625,
+ 0.0017547607421875,
+ 0.00933837890625,
+ 0.000835418701171875,
+ 0.0064697265625,
+ -0.01080322265625,
+ 0.0172119140625,
+ -0.007659912109375,
+ 0.00159454345703125,
+ 0.006500244140625,
+ -0.00750732421875,
+ 0.002532958984375,
+ -0.00909423828125,
+ 0.006744384765625,
+ -0.0133056640625,
+ 0.002288818359375,
+ -0.00101470947265625,
+ 0.003753662109375,
+ -0.0128173828125,
+ 0.0081787109375,
+ 0.000247955322265625,
+ -0.004302978515625,
+ 0.01300048828125,
+ -0.0019989013671875,
+ 0.01031494140625,
+ 0.0015869140625,
+ 0.0135498046875,
+ -0.00323486328125,
+ -0.00021648406982421875,
+ 0.00927734375,
+ -0.01226806640625,
+ -0.00946044921875,
+ 0.011474609375,
+ -0.01031494140625,
+ -0.006927490234375,
+ -0.0118408203125,
+ 0.004913330078125,
+ 0.01446533203125,
+ 0.0174560546875,
+ -0.00153350830078125,
+ 0.005126953125,
+ 0.00113677978515625,
+ -0.000141143798828125,
+ 0.01373291015625,
+ 0.00738525390625,
+ -0.007415771484375,
+ -0.005615234375,
+ -0.00927734375,
+ 0.012939453125,
+ 0.00173187255859375,
+ -0.00043487548828125,
+ -0.012451171875,
+ 0.0101318359375,
+ -0.00150299072265625,
+ -0.006591796875,
+ 0.0107421875,
+ 0.025634765625,
+ 0.0003414154052734375,
+ -0.00017070770263671875,
+ -0.01171875,
+ 0.01806640625,
+ 0.006256103515625,
+ 0.00982666015625,
+ -0.0030670166015625,
+ -0.0091552734375,
+ -0.0179443359375,
+ 0.0020751953125,
+ 0.006744384765625,
+ -0.00445556640625,
+ -0.00335693359375,
+ -0.00543212890625,
+ -0.015869140625,
+ -0.005523681640625,
+ 0.0118408203125,
+ 0.0011138916015625,
+ -0.00543212890625,
+ -0.00013637542724609375,
+ -0.001617431640625,
+ 0.001617431640625,
+ 0.004150390625,
+ 0.00074005126953125,
+ -0.019287109375,
+ -0.0078125,
+ -0.016357421875,
+ -0.0146484375,
+ -0.003143310546875,
+ 0.0025787353515625,
+ -0.019287109375,
+ -0.005218505859375,
+ -0.00830078125,
+ 0.01080322265625,
+ -0.004180908203125,
+ -0.009765625,
+ -0.006927490234375,
+ -0.00823974609375,
+ -0.005035400390625,
+ -0.0185546875,
+ -0.019775390625,
+ 0.00011396408081054688,
+ -0.0020751953125,
+ -0.00927734375,
+ -0.006622314453125,
+ 0.0037078857421875,
+ -0.0027923583984375,
+ 0.0017242431640625,
+ 0.001983642578125,
+ -0.007080078125,
+ -0.00640869140625,
+ -0.007659912109375,
+ 0.0072021484375,
+ 0.002044677734375,
+ -0.01214599609375,
+ 0.00171661376953125,
+ -0.0003204345703125,
+ -0.0002765655517578125,
+ 0.00921630859375,
+ 0.00738525390625,
+ 0.00958251953125,
+ -0.000583648681640625,
+ -0.0169677734375,
+ 0.000453948974609375,
+ 0.006317138671875,
+ -0.0137939453125,
+ -0.018798828125,
+ 0.0196533203125,
+ 0.01434326171875,
+ 0.0030059814453125,
+ 0.006195068359375,
+ 0.01025390625,
+ 0.015625,
+ -0.00897216796875,
+ 0.004638671875,
+ -0.03466796875,
+ -0.0008697509765625,
+ -0.000835418701171875,
+ 0.0024261474609375,
+ -0.012939453125,
+ 0.00848388671875,
+ -0.000820159912109375,
+ -0.00927734375,
+ -0.015625,
+ 0.00567626953125,
+ -0.0016632080078125,
+ -0.0019989013671875,
+ -0.0028533935546875,
+ -0.002777099609375,
+ 0.0025482177734375,
+ 0.01055908203125,
+ 0.00714111328125,
+ -0.01055908203125,
+ 0.00162506103515625,
+ 0.0098876953125,
+ -0.00421142578125,
+ 0.0024261474609375,
+ 0.01373291015625,
+ 0.01611328125,
+ -0.0106201171875,
+ -0.0004405975341796875,
+ -0.0045166015625,
+ -0.0038909912109375,
+ 0.00145721435546875,
+ 0.01123046875,
+ 0.0022430419921875,
+ -0.0078125,
+ 0.01177978515625,
+ -0.00142669677734375,
+ -0.000701904296875,
+ -0.0009613037109375,
+ 0.01556396484375,
+ 0.01019287109375,
+ -0.0155029296875,
+ -0.00537109375,
+ 0.01483154296875,
+ -0.01043701171875,
+ 0.01165771484375,
+ -0.00799560546875,
+ -0.00390625,
+ -0.00174713134765625,
+ 0.009033203125,
+ 0.00372314453125,
+ -0.004852294921875,
+ -0.003082275390625,
+ 0.012939453125,
+ -0.01055908203125,
+ -0.0052490234375,
+ 0.0022125244140625,
+ 0.001556396484375,
+ -0.010498046875,
+ 0.0020599365234375,
+ 0.01611328125,
+ -0.00994873046875,
+ -0.0189208984375,
+ -0.007537841796875,
+ -0.00150299072265625,
+ 1.0192394256591797e-05,
+ -0.007598876953125,
+ 0.0047607421875,
+ -0.0096435546875,
+ -0.0166015625,
+ 0.0126953125,
+ -0.004547119140625,
+ -0.005828857421875,
+ 0.0007781982421875,
+ -0.0074462890625,
+ 0.000701904296875,
+ 0.0018768310546875,
+ 0.00396728515625,
+ 0.0107421875,
+ -0.0062255859375,
+ 0.0211181640625,
+ -0.0194091796875,
+ 0.004058837890625,
+ -0.005096435546875,
+ 0.0036773681640625,
+ 0.00726318359375,
+ -0.003662109375,
+ 0.00885009765625,
+ -0.008056640625,
+ 0.01446533203125,
+ -0.010009765625,
+ 0.002288818359375,
+ 0.000629425048828125,
+ 0.003814697265625,
+ 7.581710815429688e-05,
+ 0.001739501953125,
+ -0.0068359375,
+ 0.00640869140625,
+ 0.002655029296875,
+ 0.0115966796875,
+ -0.0062255859375,
+ -0.0032806396484375,
+ 0.01116943359375,
+ 0.000690460205078125,
+ -0.0062255859375,
+ -0.01043701171875,
+ 0.0003662109375,
+ 0.01519775390625,
+ -0.00384521484375,
+ 0.002227783203125,
+ -0.0027618408203125,
+ -0.01171875,
+ 0.00286865234375,
+ -0.001495361328125,
+ 0.00177764892578125,
+ -0.009033203125,
+ -0.006744384765625,
+ -0.0184326171875,
+ 0.0023193359375,
+ -0.01190185546875,
+ 0.006103515625,
+ 0.005218505859375,
+ 5.3882598876953125e-05,
+ 0.0013427734375,
+ 0.00360107421875,
+ -0.0031585693359375,
+ 0.0068359375,
+ 0.00156402587890625,
+ 0.0050048828125,
+ 0.02001953125,
+ -0.00323486328125,
+ -0.01165771484375,
+ -0.01275634765625,
+ 0.0002269744873046875,
+ 0.00104522705078125,
+ -0.0004177093505859375,
+ -0.006500244140625,
+ 0.0008087158203125,
+ -0.01123046875,
+ 0.00823974609375,
+ 0.00738525390625,
+ -0.0019683837890625,
+ -0.005340576171875,
+ -0.01214599609375,
+ -0.0027008056640625,
+ 0.0040283203125,
+ 0.01220703125,
+ -0.006988525390625,
+ -0.00579833984375,
+ 0.00372314453125,
+ -0.002197265625,
+ -0.007720947265625,
+ -0.005157470703125,
+ -0.003448486328125,
+ -0.011962890625,
+ 0.0125732421875,
+ -0.00125885009765625,
+ 0.0010223388671875,
+ 0.0012054443359375,
+ -0.0150146484375,
+ -0.00127410888671875,
+ 0.01007080078125,
+ 0.00445556640625,
+ -0.001190185546875,
+ 0.006866455078125,
+ 0.0164794921875,
+ -0.018310546875,
+ -0.00408935546875,
+ -0.0001392364501953125,
+ 0.00543212890625,
+ 0.0020294189453125,
+ 0.0003986358642578125,
+ 0.010498046875,
+ -0.0189208984375,
+ -0.01263427734375,
+ -0.000972747802734375,
+ -0.00787353515625,
+ 0.00811767578125,
+ -0.01263427734375,
+ -0.006500244140625,
+ -0.00689697265625,
+ 0.01263427734375,
+ -0.0024566650390625,
+ 0.0198974609375,
+ -0.006805419921875,
+ 0.00958251953125,
+ -0.0107421875,
+ -0.0031585693359375,
+ 0.021484375,
+ -0.0118408203125,
+ -0.001708984375,
+ 0.00982666015625,
+ -0.0022430419921875,
+ -0.01025390625,
+ -0.00762939453125,
+ -0.0162353515625,
+ -0.00057220458984375,
+ 0.00286865234375,
+ -0.0020904541015625,
+ -0.000255584716796875,
+ 0.01104736328125,
+ -0.006683349609375,
+ 0.0020751953125,
+ 0.000362396240234375,
+ -0.0052490234375,
+ 0.0011444091796875,
+ -0.021484375,
+ -0.00026702880859375,
+ 0.010009765625,
+ -0.0057373046875,
+ 0.0140380859375,
+ -0.00946044921875,
+ 0.0072021484375,
+ 0.0028076171875,
+ -0.0159912109375,
+ -0.00335693359375,
+ 0.0177001953125,
+ 0.0027923583984375,
+ 0.005706787109375,
+ 0.005584716796875,
+ 0.0084228515625,
+ -0.001434326171875,
+ -0.00958251953125,
+ -0.00848388671875,
+ -0.0093994140625,
+ -0.0093994140625,
+ 0.01214599609375,
+ -0.01312255859375,
+ -0.01287841796875,
+ -0.004638671875,
+ -0.002410888671875,
+ 0.005828857421875,
+ -0.004669189453125,
+ -0.006927490234375,
+ 0.002716064453125,
+ -0.0089111328125,
+ 0.004730224609375,
+ 0.0157470703125,
+ -0.00173187255859375,
+ 0.00823974609375,
+ -0.00106048583984375,
+ -0.01953125,
+ 0.0009918212890625,
+ 0.0026397705078125,
+ 0.01397705078125,
+ 0.003265380859375,
+ 0.001556396484375,
+ -0.00116729736328125,
+ -0.001617431640625,
+ 0.009033203125,
+ -0.00823974609375,
+ 0.00732421875,
+ -0.002197265625,
+ -0.01495361328125,
+ -0.019775390625,
+ 0.004058837890625,
+ 0.01513671875,
+ 0.008056640625,
+ -0.0111083984375,
+ 0.0068359375,
+ 0.004669189453125,
+ 0.01409912109375,
+ 0.0001277923583984375,
+ -0.0036773681640625,
+ -0.00555419921875,
+ 0.00408935546875,
+ -0.01531982421875,
+ 0.00081634521484375,
+ 0.007080078125,
+ -0.01080322265625,
+ 0.00665283203125,
+ -0.005584716796875,
+ -0.00457763671875,
+ -0.0125732421875,
+ 0.01141357421875,
+ -0.0108642578125,
+ 0.0277099609375,
+ -0.016845703125,
+ -0.01385498046875,
+ -0.0107421875,
+ -0.0123291015625,
+ -0.01483154296875,
+ -0.0005035400390625,
+ -0.00677490234375,
+ -0.006805419921875,
+ 0.0301513671875,
+ 0.00982666015625,
+ -0.00194549560546875,
+ 0.01519775390625,
+ 0.0028076171875,
+ -0.01531982421875,
+ -0.0076904296875,
+ 0.0048828125,
+ 0.00726318359375,
+ -0.004119873046875,
+ -0.008056640625,
+ 0.0037689208984375,
+ 0.01556396484375,
+ -0.022216796875,
+ -0.0079345703125,
+ 0.01446533203125,
+ 0.00933837890625,
+ 0.01129150390625,
+ -0.021240234375,
+ 0.0038604736328125,
+ -0.00396728515625,
+ -0.001678466796875,
+ 0.005706787109375,
+ -0.006683349609375,
+ -0.0009002685546875,
+ -0.00075531005859375,
+ -0.00020122528076171875,
+ -0.00127410888671875,
+ -0.016845703125,
+ 0.0011138916015625,
+ 0.0145263671875,
+ -0.002593994140625,
+ 0.00262451171875,
+ 0.0034027099609375,
+ -0.0010528564453125,
+ -0.0040283203125,
+ 0.0008392333984375,
+ -0.00054168701171875,
+ 0.005950927734375,
+ 0.0155029296875,
+ 0.0050048828125,
+ 0.000873565673828125,
+ 0.007476806640625,
+ -0.0206298828125,
+ 0.00135040283203125,
+ 0.000751495361328125,
+ 0.0057373046875,
+ 0.0016021728515625,
+ 0.0098876953125,
+ 0.0093994140625,
+ 0.00408935546875,
+ -0.0174560546875,
+ -0.01495361328125,
+ 0.00244140625,
+ 0.00836181640625,
+ -0.00213623046875,
+ 0.0004024505615234375,
+ 0.00640869140625,
+ -0.001953125,
+ 0.0089111328125,
+ -0.005584716796875,
+ 0.006591796875,
+ 0.004730224609375,
+ 0.0010223388671875,
+ 0.0125732421875,
+ 0.007476806640625,
+ -0.00058746337890625,
+ 0.004974365234375,
+ 0.01531982421875,
+ 0.003936767578125,
+ -0.005706787109375,
+ 0.005157470703125,
+ -0.00156402587890625,
+ 0.001983642578125,
+ 0.0115966796875,
+ -0.0272216796875,
+ -0.01953125,
+ -0.00025177001953125,
+ -0.003173828125,
+ -0.003173828125,
+ 0.00897216796875,
+ -0.01202392578125,
+ -0.002471923828125,
+ 0.01556396484375,
+ 0.001190185546875,
+ -0.0218505859375,
+ -2.9802322387695312e-05,
+ -0.015869140625,
+ 0.0118408203125,
+ -0.004974365234375,
+ -0.00347900390625,
+ -0.003997802734375,
+ -0.0029296875,
+ -0.00390625,
+ 0.0150146484375,
+ 0.00457763671875,
+ -0.00020313262939453125,
+ -0.005157470703125,
+ -0.010009765625,
+ -0.0022735595703125,
+ 0.006561279296875,
+ -0.0103759765625,
+ -0.01239013671875,
+ 0.0045166015625,
+ -0.0030670166015625,
+ -0.00933837890625,
+ -0.00616455078125,
+ -0.00250244140625,
+ 0.01031494140625,
+ 0.00193023681640625,
+ -0.0035247802734375,
+ 0.001251220703125,
+ 0.0022735595703125,
+ -0.006378173828125,
+ -0.00787353515625,
+ -0.0263671875,
+ -0.007537841796875,
+ -0.001953125,
+ 0.01177978515625,
+ -0.0037078857421875,
+ -0.01556396484375,
+ -0.00897216796875,
+ -0.0032958984375,
+ 0.00860595703125,
+ -0.002288818359375,
+ -0.002105712890625,
+ -0.0042724609375,
+ -0.0205078125,
+ 0.0069580078125,
+ -0.0028076171875,
+ 0.004302978515625,
+ -0.0146484375,
+ 0.00665283203125,
+ -0.0004367828369140625,
+ -0.01275634765625,
+ -0.001068115234375,
+ -0.007720947265625,
+ 0.01544189453125,
+ 0.0218505859375,
+ -0.01953125,
+ -0.00897216796875,
+ -0.0186767578125,
+ 0.0081787109375,
+ -0.001495361328125,
+ 0.007110595703125,
+ 0.01202392578125,
+ -0.0118408203125,
+ -0.007568359375,
+ -0.007080078125,
+ -0.00848388671875,
+ -0.004669189453125,
+ 0.00469970703125,
+ -0.0008392333984375,
+ 0.0022125244140625,
+ 0.0032958984375,
+ -0.01025390625,
+ 0.006072998046875,
+ 0.0030975341796875,
+ 0.002349853515625,
+ 0.00762939453125,
+ 0.0079345703125,
+ -0.0013427734375,
+ -0.00238037109375,
+ -0.003814697265625,
+ -0.001983642578125,
+ 0.0025177001953125,
+ -0.01513671875,
+ 0.005645751953125,
+ -0.00013065338134765625,
+ -0.0113525390625,
+ -0.0038299560546875,
+ -0.00927734375,
+ -0.0125732421875,
+ -0.004669189453125,
+ -0.0033416748046875,
+ -0.0035552978515625,
+ 0.0093994140625,
+ 0.00189971923828125,
2465
+ -9.250640869140625e-05,
2466
+ 0.000164031982421875,
2467
+ 0.000568389892578125,
2468
+ 0.00537109375,
2469
+ -0.005523681640625,
2470
+ 0.002899169921875,
2471
+ -0.0098876953125,
2472
+ -0.0137939453125,
2473
+ -0.0030059814453125,
2474
+ -0.00701904296875,
2475
+ -0.0084228515625,
2476
+ -0.000823974609375,
2477
+ 0.00799560546875,
2478
+ -0.005706787109375,
2479
+ 0.00823974609375,
2480
+ -0.00946044921875,
2481
+ -0.0030517578125,
2482
+ -0.0169677734375,
2483
+ 0.006378173828125,
2484
+ 0.0024566650390625,
2485
+ 0.00775146484375,
2486
+ 0.00101470947265625,
2487
+ -0.00848388671875,
2488
+ -0.003265380859375,
2489
+ -0.004608154296875,
2490
+ -0.004364013671875,
2491
+ 0.001312255859375,
2492
+ 0.0111083984375,
2493
+ 0.001312255859375,
2494
+ -0.0078125,
2495
+ 0.0003509521484375,
2496
+ -0.00131988525390625,
2497
+ -0.0024261474609375,
2498
+ 0.0047607421875,
2499
+ -0.01129150390625,
2500
+ 0.005645751953125,
2501
+ -0.0103759765625,
2502
+ 0.007232666015625,
2503
+ 0.000408172607421875,
2504
+ 0.006011962890625,
2505
+ 0.004547119140625,
2506
+ 0.00136566162109375,
2507
+ -0.01361083984375,
2508
+ -0.01055908203125,
2509
+ -0.000904083251953125,
2510
+ 0.003509521484375,
2511
+ 0.0037689208984375,
2512
+ -0.024658203125,
2513
+ 0.00909423828125,
2514
+ 0.0034942626953125,
2515
+ 0.0113525390625,
2516
+ 0.005859375,
2517
+ -0.0027313232421875,
2518
+ 0.0010528564453125,
2519
+ 0.0164794921875,
2520
+ -0.01226806640625,
2521
+ -0.013427734375,
2522
+ 0.00023746490478515625,
2523
+ 0.01409912109375,
2524
+ 0.01123046875,
2525
+ -0.00872802734375,
2526
+ -0.0010528564453125,
2527
+ 0.006011962890625,
2528
+ -0.004608154296875,
2529
+ 0.00738525390625,
2530
+ -0.00341796875,
2531
+ -0.00482177734375,
2532
+ 0.0024261474609375,
2533
+ 0.0089111328125,
2534
+ 0.0048828125,
2535
+ 0.007110595703125,
2536
+ 0.002899169921875,
2537
+ -0.004302978515625,
2538
+ 0.004486083984375,
2539
+ 0.00714111328125,
2540
+ 0.0035858154296875,
2541
+ -0.01092529296875,
2542
+ 0.0045166015625,
2543
+ 0.00148773193359375,
2544
+ 0.00118255615234375,
2545
+ 0.00439453125,
2546
+ -0.0135498046875,
2547
+ 0.005523681640625,
2548
+ -0.01055908203125,
2549
+ -0.004364013671875,
2550
+ -0.00567626953125,
2551
+ -0.0050048828125,
2552
+ -0.006011962890625,
2553
+ -0.00848388671875,
2554
+ -0.000545501708984375,
2555
+ -0.01153564453125,
2556
+ 0.00579833984375,
2557
+ 0.0064697265625,
2558
+ -0.004180908203125,
2559
+ -0.00311279296875,
2560
+ -0.000888824462890625,
2561
+ 0.0025177001953125,
2562
+ 0.0012054443359375,
2563
+ 0.0087890625,
2564
+ -0.005401611328125,
2565
+ 0.0032806396484375,
2566
+ -0.01190185546875,
2567
+ -0.009033203125,
2568
+ -0.0111083984375,
2569
+ -0.000640869140625,
2570
+ -0.009765625,
2571
+ -0.0167236328125,
2572
+ -0.0023956298828125,
2573
+ 0.00023937225341796875,
2574
+ -0.0189208984375,
2575
+ -0.007080078125,
2576
+ -0.00014019012451171875,
2577
+ -0.00958251953125,
2578
+ -0.0076904296875,
2579
+ -0.0027008056640625,
2580
+ 0.0047607421875,
2581
+ 0.0087890625,
2582
+ -0.0047607421875,
2583
+ -1.0967254638671875e-05,
2584
+ 0.010009765625,
2585
+ 0.003387451171875,
2586
+ 0.015869140625,
2587
+ 0.0096435546875,
2588
+ 0.010009765625,
2589
+ 1.1861324310302734e-05,
2590
+ 0.001678466796875,
2591
+ -0.00055694580078125,
2592
+ -0.00140380859375,
2593
+ -0.0031280517578125,
2594
+ -0.005645751953125,
2595
+ -0.00162506103515625,
2596
+ -0.003326416015625,
2597
+ 0.0181884765625
2598
+ ]
2599
+ ]
2600
+ ]
2601
+ ],
2602
+ "tie_word_embeddings": true,
2603
+ "torch_dtype": "bfloat16",
2604
+ "transformers_version": "4.51.0",
2605
+ "use_cache": true,
2606
+ "vision_lora": {
2607
+ "dp": 0.0,
2608
+ "layer": "layers.*((self_attn\\.(qkv_proj|o_proj))|(mlp\\.(gate_up|down)_proj))",
2609
+ "lora_alpha": 512,
2610
+ "r": 256
2611
+ },
2612
+ "vocab_size": 200064
2613
+ }
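The `vision_lora` block above specifies standard LoRA hyperparameters (`r=256`, `lora_alpha=512`, dropout `0.0`) for the attention and MLP projections matched by the regex. A minimal sketch of how such an adapter composes with a frozen weight, assuming the usual LoRA scaling of `alpha / r` (the dimensions below are toy values, not the model's real projection sizes):

```python
import numpy as np

# LoRA hyperparameters from the "vision_lora" block above
r, lora_alpha = 256, 512
scaling = lora_alpha / r  # conventional LoRA scaling factor alpha / r

rng = np.random.default_rng(0)
d_in, d_out = 64, 64  # toy dimensions for illustration only
W = rng.standard_normal((d_out, d_in))       # frozen base projection
A = rng.standard_normal((r, d_in)) * 0.01    # LoRA "down" matrix
B = np.zeros((d_out, r))                     # LoRA "up" matrix, zero-initialized

x = rng.standard_normal(d_in)
y = W @ x + scaling * (B @ (A @ x))  # LoRA-adapted forward pass

# With B initialized to zero, the adapter starts as a no-op
assert np.allclose(y, W @ x)
print(scaling)  # 2.0
```

With `lora_alpha=512` and `r=256`, the adapter update is scaled by 2.0 before being added to the frozen projection's output.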
configuration_phi4mm.py ADDED
@@ -0,0 +1,235 @@
1
+ # coding=utf-8
2
+ # Copyright 2024 Microsoft and the HuggingFace Inc. team. All rights reserved.
3
+ #
4
+ # Licensed under the Apache License, Version 2.0 (the "License");
5
+ # you may not use this file except in compliance with the License.
6
+ # You may obtain a copy of the License at
7
+ #
8
+ # http://www.apache.org/licenses/LICENSE-2.0
9
+ #
10
+ # Unless required by applicable law or agreed to in writing, software
11
+ # distributed under the License is distributed on an "AS IS" BASIS,
12
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13
+ # See the License for the specific language governing permissions and
14
+ # limitations under the License.
15
+
16
+ """ Phi-4-MM model configuration"""
17
+
18
+ from transformers.configuration_utils import PretrainedConfig
19
+ from transformers.utils import logging
20
+
21
+
22
+ logger = logging.get_logger(__name__)
23
+
24
+
25
+ class Phi4MMConfig(PretrainedConfig):
26
+ r"""
27
+ This is the configuration class to store the configuration of a [`Phi4MMModel`]. It is used to instantiate a Phi-4-MM
28
+ model according to the specified arguments, defining the model architecture.
29
+
30
+ Configuration objects inherit from [`PretrainedConfig`] and can be used to control the model outputs. Read the
31
+ documentation from [`PretrainedConfig`] for more information.
32
+
33
+ Args:
34
+ vocab_size (`int`, *optional*, defaults to 200064):
35
+ Vocabulary size of the Phi-4-MM model. Defines the number of different tokens that can be represented by the
36
+ `inputs_ids` passed when calling [`Phi4MMModel`].
37
+ hidden_size (`int`, *optional*, defaults to 3072):
38
+ Dimension of the hidden representations.
39
+ intermediate_size (`int`, *optional*, defaults to 8192):
40
+ Dimension of the MLP representations.
41
+ num_hidden_layers (`int`, *optional*, defaults to 32):
42
+ Number of hidden layers in the Transformer decoder.
43
+ num_attention_heads (`int`, *optional*, defaults to 32):
44
+ Number of attention heads for each attention layer in the Transformer decoder.
45
+ num_key_value_heads (`int`, *optional*):
46
+ This is the number of key_value heads that should be used to implement Grouped Query Attention. If
47
+ `num_key_value_heads=num_attention_heads`, the model will use Multi Head Attention (MHA), if
48
+ `num_key_value_heads=1` the model will use Multi Query Attention (MQA) otherwise GQA is used. When
49
+ converting a multi-head checkpoint to a GQA checkpoint, each group key and value head should be constructed
50
+ by mean-pooling all the original heads within that group. For more details, check out [this
51
+ paper](https://arxiv.org/pdf/2305.13245.pdf). If it is not specified, will default to
52
+ `num_attention_heads`.
53
+ resid_pdrop (`float`, *optional*, defaults to 0.0):
54
+ Dropout probability for mlp outputs.
55
+ embd_pdrop (`int`, *optional*, defaults to 0.0):
56
+ The dropout ratio for the embeddings.
57
+ attention_dropout (`float`, *optional*, defaults to 0.0):
58
+ The dropout ratio after computing the attention scores.
59
+ hidden_act (`str` or `function`, *optional*, defaults to `"silu"`):
60
+ The non-linear activation function (function or string) in the decoder.
61
+ max_position_embeddings (`int`, *optional*, defaults to 4096):
62
+ The maximum sequence length that this model might ever be used with.
63
+ original_max_position_embeddings (`int`, *optional*, defaults to 4096):
64
+ The maximum sequence length that this model was trained with. This is used to determine the size of the
65
+ original RoPE embeddings when using long scaling.
66
+ initializer_range (`float`, *optional*, defaults to 0.02):
67
+ The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
68
+ rms_norm_eps (`float`, *optional*, defaults to 1e-05):
69
+ The epsilon value used for the RMSNorm.
70
+ use_cache (`bool`, *optional*, defaults to `True`):
71
+ Whether or not the model should return the last key/values attentions (not used by all models). Only
72
+ relevant if `config.is_decoder=True`.
73
+ tie_word_embeddings (`bool`, *optional*, defaults to `False`):
74
+ Whether to tie weight embeddings.
75
+ rope_theta (`float`, *optional*, defaults to 10000.0):
76
+ The base period of the RoPE embeddings.
77
+ rope_scaling (`dict`, *optional*):
78
+ The scaling strategy for the RoPE embeddings. If `None`, no scaling is applied. If a dictionary, it must
79
+ contain the following keys: `type`, `short_factor` and `long_factor`. The `type` must be `longrope` and
80
+ the `short_factor` and `long_factor` must be lists of numbers with the same length as the hidden size
81
+ divided by the number of attention heads divided by 2.
82
+ partial_rotary_factor (`float`, *optional*, defaults to 1.0):
83
+ Percentage of the query and keys which will have rotary embedding.
84
+ bos_token_id (`int`, *optional*, defaults to 199999):
85
+ The id of the "beginning-of-sequence" token.
86
+ eos_token_id (`int`, *optional*, defaults to 199999):
87
+ The id of the "end-of-sequence" token.
88
+ pad_token_id (`int`, *optional*, defaults to 199999):
89
+ The id of the padding token.
90
+ sliding_window (`int`, *optional*):
91
+ Sliding window attention window size. If `None`, no sliding window is applied.
92
+
93
+ Example:
94
+
95
+ ```python
96
+ >>> from transformers import Phi4MMModel, Phi4MMConfig
97
+
98
+ >>> # Initializing a Phi-4-MM style configuration
99
+ >>> configuration = Phi4MMConfig.from_pretrained("TBA")
100
+
101
+ >>> # Initializing a model from the configuration
102
+ >>> model = Phi4MMModel(configuration)
103
+
104
+ >>> # Accessing the model configuration
105
+ >>> configuration = model.config
106
+ ```"""
107
+
108
+ model_type = "phi4mm"
109
+ keys_to_ignore_at_inference = ["past_key_values"]
110
+
111
+ def __init__(
112
+ self,
113
+ vocab_size=200064,
114
+ hidden_size=3072,
115
+ intermediate_size=8192,
116
+ num_hidden_layers=32,
117
+ num_attention_heads=32,
118
+ num_key_value_heads=None,
119
+ resid_pdrop=0.0,
120
+ embd_pdrop=0.0,
121
+ attention_dropout=0.0,
122
+ hidden_act="silu",
123
+ max_position_embeddings=4096,
124
+ original_max_position_embeddings=4096,
125
+ initializer_range=0.02,
126
+ rms_norm_eps=1e-5,
127
+ use_cache=True,
128
+ tie_word_embeddings=False,
129
+ rope_theta=10000.0,
130
+ rope_scaling=None,
131
+ partial_rotary_factor=1,
132
+ bos_token_id=199999,
133
+ eos_token_id=199999,
134
+ pad_token_id=199999,
135
+ sliding_window=None,
136
+ embd_layer: str = "default",
137
+ img_processor=None,
138
+ audio_processor=None,
139
+ vision_lora=None,
140
+ speech_lora=None,
141
+ **kwargs,
142
+ ):
143
+ self.embd_layer = embd_layer
144
+ self.img_processor = img_processor
145
+ self.audio_processor = audio_processor
146
+ self.vision_lora = vision_lora
147
+ self.speech_lora = speech_lora
148
+
149
+ self.vocab_size = vocab_size
150
+ self.hidden_size = hidden_size
151
+ self.intermediate_size = intermediate_size
152
+ self.num_hidden_layers = num_hidden_layers
153
+ self.num_attention_heads = num_attention_heads
154
+
155
+ if num_key_value_heads is None:
156
+ num_key_value_heads = num_attention_heads
157
+
158
+ self.num_key_value_heads = num_key_value_heads
159
+ self.resid_pdrop = resid_pdrop
160
+ self.embd_pdrop = embd_pdrop
161
+ self.attention_dropout = attention_dropout
162
+ self.hidden_act = hidden_act
163
+ self.max_position_embeddings = max_position_embeddings
164
+ self.original_max_position_embeddings = original_max_position_embeddings
165
+ self.initializer_range = initializer_range
166
+ self.rms_norm_eps = rms_norm_eps
167
+ self.use_cache = use_cache
168
+ self.rope_theta = rope_theta
169
+ self.rope_scaling = rope_scaling
170
+ self.partial_rotary_factor = partial_rotary_factor
171
+ self._rope_scaling_adjustment()
172
+ self._rope_scaling_validation()
173
+ self.sliding_window = sliding_window
174
+
175
+ super().__init__(
176
+ bos_token_id=bos_token_id,
177
+ eos_token_id=eos_token_id,
178
+ pad_token_id=pad_token_id,
179
+ tie_word_embeddings=tie_word_embeddings,
180
+ **kwargs,
181
+ )
182
+
183
+ def _rope_scaling_adjustment(self):
184
+ """
185
+ Adjust the `type` of the `rope_scaling` configuration for backward compatibility.
186
+ """
187
+ if self.rope_scaling is None:
188
+ return
189
+
190
+ rope_scaling_type = self.rope_scaling.get("type", None)
191
+
192
+ # For backward compatibility if previous version used "su" or "yarn"
193
+ if rope_scaling_type is not None and rope_scaling_type in ["su", "yarn"]:
194
+ self.rope_scaling["type"] = "longrope"
195
+
196
+ def _rope_scaling_validation(self):
197
+ """
198
+ Validate the `rope_scaling` configuration.
199
+ """
200
+ if self.rope_scaling is None:
201
+ return
202
+
203
+ if not isinstance(self.rope_scaling, dict) or len(self.rope_scaling) != 3:
204
+ raise ValueError(
205
+ "`rope_scaling` must be a dictionary with three fields, `type`, `short_factor` and `long_factor`, "
206
+ f"got {self.rope_scaling}"
207
+ )
208
+ rope_scaling_type = self.rope_scaling.get("type", None)
209
+ rope_scaling_short_factor = self.rope_scaling.get("short_factor", None)
210
+ rope_scaling_long_factor = self.rope_scaling.get("long_factor", None)
211
+ if rope_scaling_type is None or rope_scaling_type not in ["longrope"]:
212
+ raise ValueError(f"`rope_scaling`'s type field must be one of ['longrope'], got {rope_scaling_type}")
213
+ if not (
214
+ isinstance(rope_scaling_short_factor, list)
215
+ and all(isinstance(x, (int, float)) for x in rope_scaling_short_factor)
216
+ ):
217
+ raise ValueError(
218
+ f"`rope_scaling`'s short_factor field must be a list of numbers, got {rope_scaling_short_factor}"
219
+ )
220
+ rotary_ndims = int(self.hidden_size // self.num_attention_heads * self.partial_rotary_factor)
221
+ if not len(rope_scaling_short_factor) == rotary_ndims // 2:
222
+ raise ValueError(
223
+ f"`rope_scaling`'s short_factor field must have length {rotary_ndims // 2}, got {len(rope_scaling_short_factor)}"
224
+ )
225
+ if not (
226
+ isinstance(rope_scaling_long_factor, list)
227
+ and all(isinstance(x, (int, float)) for x in rope_scaling_long_factor)
228
+ ):
229
+ raise ValueError(
230
+ f"`rope_scaling`'s long_factor field must be a list of numbers, got {rope_scaling_long_factor}"
231
+ )
232
+ if not len(rope_scaling_long_factor) == rotary_ndims // 2:
233
+ raise ValueError(
234
+ f"`rope_scaling`'s long_factor field must have length {rotary_ndims // 2}, got {len(rope_scaling_long_factor)}"
235
+ )
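`_rope_scaling_validation` above requires `short_factor` and `long_factor` to have length `rotary_ndims // 2`, where `rotary_ndims` is derived from the hidden size, head count, and `partial_rotary_factor`. A small sketch computing that expected length for the defaults in this file:

```python
# Expected rope_scaling factor length for the defaults in configuration_phi4mm.py
hidden_size = 3072
num_attention_heads = 32
partial_rotary_factor = 1

# Mirrors the expression used in _rope_scaling_validation above
rotary_ndims = int(hidden_size // num_attention_heads * partial_rotary_factor)
expected_len = rotary_ndims // 2
print(rotary_ndims, expected_len)  # 96 48
```

So a valid `rope_scaling` dict for these defaults would need `short_factor` and `long_factor` lists of 48 numbers each.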
generation_config.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "_from_model_config": true,
3
+ "bos_token_id": 199999,
4
+ "eos_token_id": [
5
+ 200020,
6
+ 199999
7
+ ],
8
+ "pad_token_id": 199999,
9
+ "transformers_version": "4.51.0"
10
+ }
merges.txt ADDED
The diff for this file is too large to render. See raw diff
openvino_audio_embeddings_model.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:25c098aea6eeb8c3c8c66c17b1cf7decdd4bd0399451a26f701d9ebeb36458e8
3
+ size 640
openvino_audio_embeddings_model.xml ADDED
@@ -0,0 +1,128 @@
1
+ <?xml version="1.0"?>
2
+ <net name="Model0" version="11">
3
+ <layers>
4
+ <layer id="0" name="input_" type="Parameter" version="opset1">
5
+ <data shape="?,?,80" element_type="f32" />
6
+ <output>
7
+ <port id="0" precision="FP32" names="input_">
8
+ <dim>-1</dim>
9
+ <dim>-1</dim>
10
+ <dim>80</dim>
11
+ </port>
12
+ </output>
13
+ </layer>
14
+ <layer id="1" name="Constant_2102" type="Const" version="opset1">
15
+ <data element_type="f32" shape="1, 1, 80" offset="0" size="320" />
16
+ <output>
17
+ <port id="0" precision="FP32">
18
+ <dim>1</dim>
19
+ <dim>1</dim>
20
+ <dim>80</dim>
21
+ </port>
22
+ </output>
23
+ </layer>
24
+ <layer id="2" name="Multiply_2090" type="Multiply" version="opset1">
25
+ <data auto_broadcast="numpy" />
26
+ <input>
27
+ <port id="0" precision="FP32">
28
+ <dim>-1</dim>
29
+ <dim>-1</dim>
30
+ <dim>80</dim>
31
+ </port>
32
+ <port id="1" precision="FP32">
33
+ <dim>1</dim>
34
+ <dim>1</dim>
35
+ <dim>80</dim>
36
+ </port>
37
+ </input>
38
+ <output>
39
+ <port id="2" precision="FP32">
40
+ <dim>-1</dim>
41
+ <dim>-1</dim>
42
+ <dim>80</dim>
43
+ </port>
44
+ </output>
45
+ </layer>
46
+ <layer id="3" name="Constant_2103" type="Const" version="opset1">
47
+ <data element_type="f32" shape="1, 1, 80" offset="320" size="320" />
48
+ <output>
49
+ <port id="0" precision="FP32">
50
+ <dim>1</dim>
51
+ <dim>1</dim>
52
+ <dim>80</dim>
53
+ </port>
54
+ </output>
55
+ </layer>
56
+ <layer id="4" name="aten::mul/Multiply" type="Add" version="opset1">
57
+ <data auto_broadcast="numpy" />
58
+ <input>
59
+ <port id="0" precision="FP32">
60
+ <dim>-1</dim>
61
+ <dim>-1</dim>
62
+ <dim>80</dim>
63
+ </port>
64
+ <port id="1" precision="FP32">
65
+ <dim>1</dim>
66
+ <dim>1</dim>
67
+ <dim>80</dim>
68
+ </port>
69
+ </input>
70
+ <output>
71
+ <port id="2" precision="FP32" names="last_hidden_state">
72
+ <dim>-1</dim>
73
+ <dim>-1</dim>
74
+ <dim>80</dim>
75
+ </port>
76
+ </output>
77
+ </layer>
78
+ <layer id="5" name="Result_44" type="Result" version="opset1" output_names="last_hidden_state">
79
+ <input>
80
+ <port id="0" precision="FP32">
81
+ <dim>-1</dim>
82
+ <dim>-1</dim>
83
+ <dim>80</dim>
84
+ </port>
85
+ </input>
86
+ </layer>
87
+ </layers>
88
+ <edges>
89
+ <edge from-layer="0" from-port="0" to-layer="2" to-port="0" />
90
+ <edge from-layer="1" from-port="0" to-layer="2" to-port="1" />
91
+ <edge from-layer="2" from-port="2" to-layer="4" to-port="0" />
92
+ <edge from-layer="3" from-port="0" to-layer="4" to-port="1" />
93
+ <edge from-layer="4" from-port="2" to-layer="5" to-port="0" />
94
+ </edges>
95
+ <rt_info>
96
+ <Runtime_version value="2025.4.1-20426-82bbf0292c5-releases/2025/4" />
97
+ <conversion_parameters>
98
+ <framework value="pytorch" />
99
+ <is_python_object value="True" />
100
+ </conversion_parameters>
101
+ <nncf>
102
+ <friendly_names_were_updated value="True" />
103
+ <version value="2.19.0" />
104
+ <weight_compression>
105
+ <advanced_parameters value="{'statistics_path': None, 'lora_adapter_rank': 256, 'group_size_fallback_mode': 'error', 'min_adjusted_group_size': 32, 'awq_params': {'subset_size': 32, 'percent_to_apply': 0.002, 'alpha_min': 0.0, 'alpha_max': 1.0, 'steps': 100, 'prefer_data_aware_scaling': True}, 'scale_estimation_params': {'subset_size': 64, 'initial_steps': 5, 'scale_steps': 5, 'weight_penalty': -1.0}, 'gptq_params': {'damp_percent': 0.1, 'block_size': 128, 'subset_size': 128}, 'lora_correction_params': {'adapter_rank': 8, 'num_iterations': 3, 'apply_regularization': True, 'subset_size': 128, 'use_int8_adapters': True}, 'backend_params': {}, 'codebook': None}" />
106
+ <all_layers value="False" />
107
+ <awq value="False" />
108
+ <backup_mode value="int8_asym" />
109
+ <compression_format value="dequantize" />
110
+ <gptq value="False" />
111
+ <group_size value="-1" />
112
+ <ignored_scope value="[]" />
113
+ <lora_correction value="False" />
114
+ <mode value="int8_sym" />
115
+ <ratio value="1.0" />
116
+ <scale_estimation value="False" />
117
+ <sensitivity_metric value="weight_quantization_error" />
118
+ </weight_compression>
119
+ </nncf>
120
+ <optimum>
121
+ <nncf_version value="2.19.0" />
122
+ <optimum_intel_version value="1.27.0.dev0+132f70d" />
123
+ <optimum_version value="2.1.0.dev0" />
124
+ <pytorch_version value="2.9.1+cpu" />
125
+ <transformers_version value="4.51.0" />
126
+ </optimum>
127
+ </rt_info>
128
+ </net>
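The graph in `openvino_audio_embeddings_model.xml` above is a per-feature affine transform over the 80 mel bins: a `Multiply` by a `(1, 1, 80)` scale constant followed by an `Add` of a `(1, 1, 80)` bias, both broadcast over batch and time. A minimal NumPy sketch of the same computation; the scale and bias values here are hypothetical stand-ins (the real constants live in the 640-byte `.bin` file, two blocks of 80 float32 each):

```python
import numpy as np

# Per-mel-bin affine normalization, mirroring the Multiply -> Add graph above.
batch, frames, mel_bins = 2, 7, 80
x = np.random.default_rng(0).standard_normal((batch, frames, mel_bins)).astype(np.float32)

# Hypothetical stand-ins for the (1, 1, 80) constants stored in the .bin file
scale = np.full((1, 1, mel_bins), 0.5, dtype=np.float32)
bias = np.full((1, 1, mel_bins), 0.1, dtype=np.float32)

# NumPy broadcasting matches the graph's auto_broadcast="numpy" semantics
last_hidden_state = x * scale + bias
assert last_hidden_state.shape == (batch, frames, mel_bins)
```

Shapes are preserved end to end, matching the `?,?,80` input and `last_hidden_state` output ports in the IR.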
openvino_audio_encoder_model.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:80c4ea42f504a9f391f770ffdcf2d747dbe4b62dbfb908b3d10084fee8416ccc
3
+ size 431597364
openvino_audio_encoder_model.xml ADDED
The diff for this file is too large to render. See raw diff
 
openvino_audio_forward_embeddings_model.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:85cd06a467fd6cc9616c6d531ebdf092c8149395b624fcdf1dab7511d0bc1098
3
+ size 12647472
openvino_audio_forward_embeddings_model.xml ADDED
@@ -0,0 +1,1123 @@
1
+ <?xml version="1.0"?>
2
+ <net name="Model6" version="11">
3
+ <layers>
4
+ <layer id="0" name="audio_input" type="Parameter" version="opset1">
5
+ <data shape="?,?,80" element_type="f32" />
6
+ <output>
7
+ <port id="0" precision="FP32" names="audio_input">
8
+ <dim>-1</dim>
9
+ <dim>-1</dim>
10
+ <dim>80</dim>
11
+ </port>
12
+ </output>
13
+ </layer>
14
+ <layer id="1" name="11" type="Const" version="opset1">
15
+ <data element_type="i64" shape="" offset="0" size="8" />
16
+ <output>
17
+ <port id="0" precision="I64" names="11" />
18
+ </output>
19
+ </layer>
20
+ <layer id="2" name="__module.embed/aten::unsqueeze/Unsqueeze" type="Unsqueeze" version="opset1">
21
+ <input>
22
+ <port id="0" precision="FP32">
23
+ <dim>-1</dim>
24
+ <dim>-1</dim>
25
+ <dim>80</dim>
26
+ </port>
27
+ <port id="1" precision="I64" />
28
+ </input>
29
+ <output>
30
+ <port id="2" precision="FP32" names="14,input.1">
31
+ <dim>-1</dim>
32
+ <dim>1</dim>
33
+ <dim>-1</dim>
34
+ <dim>80</dim>
35
+ </port>
36
+ </output>
37
+ </layer>
38
+ <layer id="3" name="self.embed.conv.0.weight" type="Const" version="opset1">
39
+ <data element_type="i8" shape="1024, 1, 3, 3" offset="8" size="9216" />
40
+ <output>
41
+ <port id="0" precision="I8">
42
+ <dim>1024</dim>
43
+ <dim>1</dim>
44
+ <dim>3</dim>
45
+ <dim>3</dim>
46
+ </port>
47
+ </output>
48
+ </layer>
49
+ <layer id="4" name="Convert_2965993" type="Convert" version="opset1">
50
+ <data destination_type="f16" />
51
+ <input>
52
+ <port id="0" precision="I8">
53
+ <dim>1024</dim>
54
+ <dim>1</dim>
55
+ <dim>3</dim>
56
+ <dim>3</dim>
57
+ </port>
58
+ </input>
59
+ <output>
60
+ <port id="1" precision="FP16">
61
+ <dim>1024</dim>
62
+ <dim>1</dim>
63
+ <dim>3</dim>
64
+ <dim>3</dim>
65
+ </port>
66
+ </output>
67
+ </layer>
68
+ <layer id="5" name="self.embed.conv.0.weight/scale" type="Const" version="opset1">
69
+ <data element_type="f16" shape="1024, 1, 1, 1" offset="9224" size="2048" />
70
+ <output>
71
+ <port id="0" precision="FP16">
72
+ <dim>1024</dim>
73
+ <dim>1</dim>
74
+ <dim>1</dim>
75
+ <dim>1</dim>
76
+ </port>
77
+ </output>
78
+ </layer>
79
+ <layer id="6" name="self.embed.conv.0.weight/fq_weights_1" type="Multiply" version="opset1">
80
+ <data auto_broadcast="numpy" />
81
+ <input>
82
+ <port id="0" precision="FP16">
83
+ <dim>1024</dim>
84
+ <dim>1</dim>
85
+ <dim>3</dim>
86
+ <dim>3</dim>
87
+ </port>
88
+ <port id="1" precision="FP16">
89
+ <dim>1024</dim>
90
+ <dim>1</dim>
91
+ <dim>1</dim>
92
+ <dim>1</dim>
93
+ </port>
94
+ </input>
95
+ <output>
96
+ <port id="2" precision="FP16">
97
+ <dim>1024</dim>
98
+ <dim>1</dim>
99
+ <dim>3</dim>
100
+ <dim>3</dim>
101
+ </port>
102
+ </output>
103
+ </layer>
104
+ <layer id="7" name="self.embed.conv.0.weight/fq_weights_1/convert" type="Convert" version="opset1">
105
+ <data destination_type="f32" />
106
+ <input>
107
+ <port id="0" precision="FP16">
108
+ <dim>1024</dim>
109
+ <dim>1</dim>
110
+ <dim>3</dim>
111
+ <dim>3</dim>
112
+ </port>
113
+ </input>
114
+ <output>
115
+ <port id="1" precision="FP32">
116
+ <dim>1024</dim>
117
+ <dim>1</dim>
118
+ <dim>3</dim>
119
+ <dim>3</dim>
120
+ </port>
121
+ </output>
122
+ </layer>
123
+ <layer id="8" name="__module.embed.conv.0/aten::_convolution/Convolution" type="Convolution" version="opset1">
124
+ <data strides="2, 2" dilations="1, 1" pads_begin="1, 1" pads_end="1, 1" auto_pad="explicit" />
125
+ <input>
126
+ <port id="0" precision="FP32">
127
+ <dim>-1</dim>
128
+ <dim>1</dim>
129
+ <dim>-1</dim>
130
+ <dim>80</dim>
131
+ </port>
132
+ <port id="1" precision="FP32">
133
+ <dim>1024</dim>
134
+ <dim>1</dim>
135
+ <dim>3</dim>
136
+ <dim>3</dim>
137
+ </port>
138
+ </input>
139
+ <output>
140
+ <port id="2" precision="FP32">
141
+ <dim>-1</dim>
142
+ <dim>1024</dim>
143
+ <dim>-1</dim>
144
+ <dim>40</dim>
145
+ </port>
146
+ </output>
147
+ </layer>
148
+ <layer id="9" name="__module.embed.conv.0/aten::_convolution/Reshape" type="Const" version="opset1">
149
+ <data element_type="f32" shape="1, 1024, 1, 1" offset="11272" size="4096" />
150
+ <output>
151
+ <port id="0" precision="FP32">
152
+ <dim>1</dim>
153
+ <dim>1024</dim>
154
+ <dim>1</dim>
155
+ <dim>1</dim>
156
+ </port>
157
+ </output>
158
+ </layer>
159
+ <layer id="10" name="__module.embed.conv.0/aten::_convolution/Add" type="Add" version="opset1">
160
+ <data auto_broadcast="numpy" />
161
+ <input>
162
+ <port id="0" precision="FP32">
163
+ <dim>-1</dim>
164
+ <dim>1024</dim>
165
+ <dim>-1</dim>
166
+ <dim>40</dim>
167
+ </port>
168
+ <port id="1" precision="FP32">
169
+ <dim>1</dim>
170
+ <dim>1024</dim>
171
+ <dim>1</dim>
172
+ <dim>1</dim>
173
+ </port>
174
+ </input>
175
+ <output>
176
+ <port id="2" precision="FP32" names="26,input.3">
177
+ <dim>-1</dim>
178
+ <dim>1024</dim>
179
+ <dim>-1</dim>
180
+ <dim>40</dim>
181
+ </port>
182
+ </output>
183
+ </layer>
184
+ <layer id="11" name="__module.embed.conv.1/aten::relu/Relu" type="ReLU" version="opset1">
185
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>40</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="27,input.5">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>40</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="12" name="__module.embed.conv.2/aten::_convolution/Reshape" type="Const" version="opset1">
+ <data element_type="i8" shape="1024, 1, 1, 3, 3" offset="15368" size="9216" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="13" name="Convert_2966000" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="14" name="__module.embed.conv.2/aten::_convolution/Reshape/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="1024, 1, 1, 1, 1" offset="24584" size="2048" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="15" name="__module.embed.conv.2/aten::_convolution/Reshape/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="16" name="__module.embed.conv.2/aten::_convolution/Reshape/fq_weights_1/convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="17" name="__module.embed.conv.2/aten::_convolution/GroupConvolution" type="GroupConvolution" version="opset1">
+ <data strides="2, 2" pads_begin="1, 1" pads_end="1, 1" dilations="1, 1" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>40</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="18" name="__module.embed.conv.2/aten::_convolution/Reshape_1" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1024, 1, 1" offset="26632" size="4096" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="19" name="__module.embed.conv.2/aten::_convolution/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="34,input.7">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ </output>
+ </layer>
358
+ <layer id="20" name="self.embed.conv.3.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="1024, 1024, 1, 1" offset="30728" size="1048576" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="21" name="Convert_2961244" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="22" name="self.embed.conv.3.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="1024, 1, 1, 1" offset="1079304" size="2048" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="23" name="self.embed.conv.3.weight/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="24" name="self.embed.conv.3.weight/fq_weights_1/convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="25" name="__module.embed.conv.3/aten::_convolution/Convolution" type="Convolution" version="opset1">
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="26" name="__module.embed.conv.3/aten::_convolution/Reshape" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1024, 1, 1" offset="1081352" size="4096" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="27" name="__module.embed.conv.3/aten::_convolution/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="41,input.9">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="28" name="__module.embed.conv.1/aten::relu/Relu_1" type="ReLU" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="42,input.11">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ </output>
+ </layer>
522
+ <layer id="29" name="__module.embed.conv.5/aten::_convolution/Reshape" type="Const" version="opset1">
+ <data element_type="i8" shape="1024, 1, 1, 3, 3" offset="1085448" size="9216" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="30" name="Convert_2966007" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="31" name="__module.embed.conv.5/aten::_convolution/Reshape/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="1024, 1, 1, 1, 1" offset="1094664" size="2048" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="32" name="__module.embed.conv.5/aten::_convolution/Reshape/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="33" name="__module.embed.conv.5/aten::_convolution/Reshape/fq_weights_1/convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="34" name="__module.embed.conv.5/aten::_convolution/GroupConvolution" type="GroupConvolution" version="opset1">
+ <data strides="2, 2" pads_begin="1, 1" pads_end="1, 1" dilations="1, 1" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>20</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3</dim>
+ <dim>3</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="35" name="__module.embed.conv.5/aten::_convolution/Reshape_1" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1024, 1, 1" offset="1096712" size="4096" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="36" name="__module.embed.conv.5/aten::_convolution/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="49,input.13">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ </output>
+ </layer>
678
+ <layer id="37" name="self.embed.conv.6.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="1024, 1024, 1, 1" offset="1100808" size="1048576" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="38" name="Convert_2965986" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="39" name="self.embed.conv.6.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="1024, 1, 1, 1" offset="2149384" size="2048" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="40" name="self.embed.conv.6.weight/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="41" name="self.embed.conv.6.weight/fq_weights_1/convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="42" name="__module.embed.conv.6/aten::_convolution/Convolution" type="Convolution" version="opset1">
+ <data strides="1, 1" dilations="1, 1" pads_begin="0, 0" pads_end="0, 0" auto_pad="explicit" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="43" name="__module.embed.conv.6/aten::_convolution/Reshape" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1024, 1, 1" offset="2151432" size="4096" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="44" name="__module.embed.conv.6/aten::_convolution/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1024</dim>
+ <dim>1</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="56,input">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="45" name="__module.embed.conv.1/aten::relu/Relu_2" type="ReLU" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="57,x">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ </output>
+ </layer>
842
+ <layer id="46" name="__module.embed/aten::transpose/Constant" type="Const" version="opset1">
+ <data element_type="i32" shape="4" offset="2155528" size="16" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>4</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="47" name="__module.embed/aten::transpose/Transpose" type="Transpose" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>-1</dim>
+ <dim>10</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>4</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="60">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>10</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="48" name="Constant_97793" type="Const" version="opset1">
+ <data element_type="i64" shape="3" offset="2155544" size="24" />
+ <output>
+ <port id="0" precision="I64">
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="49" name="__module.embed/aten::reshape/Reshape" type="Reshape" version="opset1">
+ <data special_zero="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ <dim>10</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>3</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="62">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>10240</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="50" name="self.embed.out.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="1024, 10240" offset="2155568" size="10485760" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>10240</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="51" name="Convert_2956502" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>1024</dim>
+ <dim>10240</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>10240</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="52" name="self.embed.out.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="1024, 1" offset="12641328" size="2048" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="53" name="self.embed.out.weight/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>10240</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>1024</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>1024</dim>
+ <dim>10240</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="54" name="self.embed.out.weight/fq_weights_1/convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>1024</dim>
+ <dim>10240</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>10240</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="55" name="__module.embed.out/ov_ext::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>10240</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1024</dim>
+ <dim>10240</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
988
+ <layer id="56" name="Constant_97783" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 1024" offset="12643376" size="4096" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="57" name="__module.embed.out/ov_ext::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>1024</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="last_hidden_state">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="58" name="Result_95416" type="Result" version="opset1" output_names="last_hidden_state">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="2" to-port="0" />
+ <edge from-layer="1" from-port="0" to-layer="2" to-port="1" />
+ <edge from-layer="2" from-port="2" to-layer="8" to-port="0" />
+ <edge from-layer="3" from-port="0" to-layer="4" to-port="0" />
+ <edge from-layer="4" from-port="1" to-layer="6" to-port="0" />
+ <edge from-layer="5" from-port="0" to-layer="6" to-port="1" />
+ <edge from-layer="6" from-port="2" to-layer="7" to-port="0" />
+ <edge from-layer="7" from-port="1" to-layer="8" to-port="1" />
+ <edge from-layer="8" from-port="2" to-layer="10" to-port="0" />
+ <edge from-layer="9" from-port="0" to-layer="10" to-port="1" />
+ <edge from-layer="10" from-port="2" to-layer="11" to-port="0" />
+ <edge from-layer="11" from-port="1" to-layer="17" to-port="0" />
+ <edge from-layer="12" from-port="0" to-layer="13" to-port="0" />
+ <edge from-layer="13" from-port="1" to-layer="15" to-port="0" />
+ <edge from-layer="14" from-port="0" to-layer="15" to-port="1" />
+ <edge from-layer="15" from-port="2" to-layer="16" to-port="0" />
+ <edge from-layer="16" from-port="1" to-layer="17" to-port="1" />
+ <edge from-layer="17" from-port="2" to-layer="19" to-port="0" />
+ <edge from-layer="18" from-port="0" to-layer="19" to-port="1" />
+ <edge from-layer="19" from-port="2" to-layer="25" to-port="0" />
+ <edge from-layer="20" from-port="0" to-layer="21" to-port="0" />
+ <edge from-layer="21" from-port="1" to-layer="23" to-port="0" />
+ <edge from-layer="22" from-port="0" to-layer="23" to-port="1" />
+ <edge from-layer="23" from-port="2" to-layer="24" to-port="0" />
+ <edge from-layer="24" from-port="1" to-layer="25" to-port="1" />
+ <edge from-layer="25" from-port="2" to-layer="27" to-port="0" />
+ <edge from-layer="26" from-port="0" to-layer="27" to-port="1" />
+ <edge from-layer="27" from-port="2" to-layer="28" to-port="0" />
+ <edge from-layer="28" from-port="1" to-layer="34" to-port="0" />
+ <edge from-layer="29" from-port="0" to-layer="30" to-port="0" />
+ <edge from-layer="30" from-port="1" to-layer="32" to-port="0" />
+ <edge from-layer="31" from-port="0" to-layer="32" to-port="1" />
+ <edge from-layer="32" from-port="2" to-layer="33" to-port="0" />
+ <edge from-layer="33" from-port="1" to-layer="34" to-port="1" />
+ <edge from-layer="34" from-port="2" to-layer="36" to-port="0" />
+ <edge from-layer="35" from-port="0" to-layer="36" to-port="1" />
+ <edge from-layer="36" from-port="2" to-layer="42" to-port="0" />
+ <edge from-layer="37" from-port="0" to-layer="38" to-port="0" />
+ <edge from-layer="38" from-port="1" to-layer="40" to-port="0" />
+ <edge from-layer="39" from-port="0" to-layer="40" to-port="1" />
+ <edge from-layer="40" from-port="2" to-layer="41" to-port="0" />
+ <edge from-layer="41" from-port="1" to-layer="42" to-port="1" />
+ <edge from-layer="42" from-port="2" to-layer="44" to-port="0" />
+ <edge from-layer="43" from-port="0" to-layer="44" to-port="1" />
+ <edge from-layer="44" from-port="2" to-layer="45" to-port="0" />
+ <edge from-layer="45" from-port="1" to-layer="47" to-port="0" />
+ <edge from-layer="46" from-port="0" to-layer="47" to-port="1" />
+ <edge from-layer="47" from-port="2" to-layer="49" to-port="0" />
+ <edge from-layer="48" from-port="0" to-layer="49" to-port="1" />
+ <edge from-layer="49" from-port="2" to-layer="55" to-port="0" />
+ <edge from-layer="50" from-port="0" to-layer="51" to-port="0" />
+ <edge from-layer="51" from-port="1" to-layer="53" to-port="0" />
+ <edge from-layer="52" from-port="0" to-layer="53" to-port="1" />
+ <edge from-layer="53" from-port="2" to-layer="54" to-port="0" />
+ <edge from-layer="54" from-port="1" to-layer="55" to-port="1" />
+ <edge from-layer="55" from-port="2" to-layer="57" to-port="0" />
+ <edge from-layer="56" from-port="0" to-layer="57" to-port="1" />
+ <edge from-layer="57" from-port="2" to-layer="58" to-port="0" />
+ </edges>
+ <rt_info>
+ <Runtime_version value="2025.4.1-20426-82bbf0292c5-releases/2025/4" />
+ <conversion_parameters>
+ <framework value="pytorch" />
+ <is_python_object value="True" />
+ </conversion_parameters>
+ <nncf>
+ <friendly_names_were_updated value="True" />
+ <version value="2.19.0" />
+ <weight_compression>
+ <advanced_parameters value="{'statistics_path': None, 'lora_adapter_rank': 256, 'group_size_fallback_mode': 'error', 'min_adjusted_group_size': 32, 'awq_params': {'subset_size': 32, 'percent_to_apply': 0.002, 'alpha_min': 0.0, 'alpha_max': 1.0, 'steps': 100, 'prefer_data_aware_scaling': True}, 'scale_estimation_params': {'subset_size': 64, 'initial_steps': 5, 'scale_steps': 5, 'weight_penalty': -1.0}, 'gptq_params': {'damp_percent': 0.1, 'block_size': 128, 'subset_size': 128}, 'lora_correction_params': {'adapter_rank': 8, 'num_iterations': 3, 'apply_regularization': True, 'subset_size': 128, 'use_int8_adapters': True}, 'backend_params': {}, 'codebook': None}" />
+ <all_layers value="False" />
+ <awq value="False" />
+ <backup_mode value="int8_asym" />
+ <compression_format value="dequantize" />
+ <gptq value="False" />
+ <group_size value="-1" />
+ <ignored_scope value="[]" />
+ <lora_correction value="False" />
+ <mode value="int8_sym" />
+ <ratio value="1.0" />
+ <scale_estimation value="False" />
+ <sensitivity_metric value="weight_quantization_error" />
+ </weight_compression>
+ </nncf>
+ <optimum>
+ <nncf_version value="2.19.0" />
+ <optimum_intel_version value="1.27.0.dev0+132f70d" />
+ <optimum_version value="2.1.0.dev0" />
+ <pytorch_version value="2.9.1+cpu" />
+ <transformers_version value="4.51.0" />
+ </optimum>
+ </rt_info>
+ </net>
openvino_audio_speech_projection_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dd876f21e8ef990b4dab2ab8bea52dfe7922a6a1d4112d29b0faa4784b3b0668
+ size 12607488
openvino_audio_speech_projection_model.xml ADDED
@@ -0,0 +1,366 @@
+ <?xml version="1.0"?>
+ <net name="Model12" version="11">
+ <layers>
+ <layer id="0" name="input" type="Parameter" version="opset1">
+ <data shape="?,?,1024" element_type="f32" />
+ <output>
+ <port id="0" precision="FP32" names="input">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="self.0.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="3072, 1024" offset="0" size="3145728" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="2" name="Convert_4248226" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="3" name="self.0.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="3072, 1" offset="3145728" size="6144" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="4" name="self.0.weight/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="5" name="__module.0/ov_ext::linear/Convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="6" name="__module.0/ov_ext::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="7" name="self.0.bias" type="Const" version="opset1">
+ <data element_type="bf16" shape="3072" offset="3151872" size="6144" />
+ <output>
+ <port id="0" precision="BF16" names="self.0.bias">
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="8" name="__module.0/ov_ext::linear/Convert_1" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="BF16">
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="9" name="__module.0/ov_ext::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="11,input_1">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="10" name="__module.1/aten::gelu/Gelu" type="Gelu" version="opset7">
+ <data approximation_mode="ERF" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="13">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="11" name="self.2.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="3072, 3072" offset="3158016" size="9437184" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="12" name="Convert_4243483" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="13" name="self.2.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="3072, 1" offset="12595200" size="6144" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="14" name="self.2.weight/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="15" name="__module.2/ov_ext::linear/Convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
225
+ <dim>3072</dim>
226
+ <dim>3072</dim>
227
+ </port>
228
+ </input>
229
+ <output>
230
+ <port id="1" precision="FP32">
231
+ <dim>3072</dim>
232
+ <dim>3072</dim>
233
+ </port>
234
+ </output>
235
+ </layer>
236
+ <layer id="16" name="__module.2/ov_ext::linear/MatMul" type="MatMul" version="opset1">
237
+ <data transpose_a="false" transpose_b="true" />
238
+ <input>
239
+ <port id="0" precision="FP32">
240
+ <dim>-1</dim>
241
+ <dim>-1</dim>
242
+ <dim>3072</dim>
243
+ </port>
244
+ <port id="1" precision="FP32">
245
+ <dim>3072</dim>
246
+ <dim>3072</dim>
247
+ </port>
248
+ </input>
249
+ <output>
250
+ <port id="2" precision="FP32">
251
+ <dim>-1</dim>
252
+ <dim>-1</dim>
253
+ <dim>3072</dim>
254
+ </port>
255
+ </output>
256
+ </layer>
257
+ <layer id="17" name="self.2.bias" type="Const" version="opset1">
258
+ <data element_type="bf16" shape="3072" offset="12601344" size="6144" />
259
+ <output>
260
+ <port id="0" precision="BF16" names="self.2.bias">
261
+ <dim>3072</dim>
262
+ </port>
263
+ </output>
264
+ </layer>
265
+ <layer id="18" name="__module.2/ov_ext::linear/Convert_1" type="Convert" version="opset1">
266
+ <data destination_type="f32" />
267
+ <rt_info>
268
+ <attribute name="decompression" version="0" />
269
+ </rt_info>
270
+ <input>
271
+ <port id="0" precision="BF16">
272
+ <dim>3072</dim>
273
+ </port>
274
+ </input>
275
+ <output>
276
+ <port id="1" precision="FP32">
277
+ <dim>3072</dim>
278
+ </port>
279
+ </output>
280
+ </layer>
281
+ <layer id="19" name="__module.2/ov_ext::linear/Add" type="Add" version="opset1">
282
+ <data auto_broadcast="numpy" />
283
+ <input>
284
+ <port id="0" precision="FP32">
285
+ <dim>-1</dim>
286
+ <dim>-1</dim>
287
+ <dim>3072</dim>
288
+ </port>
289
+ <port id="1" precision="FP32">
290
+ <dim>3072</dim>
291
+ </port>
292
+ </input>
293
+ <output>
294
+ <port id="2" precision="FP32" names="last_hidden_state">
295
+ <dim>-1</dim>
296
+ <dim>-1</dim>
297
+ <dim>3072</dim>
298
+ </port>
299
+ </output>
300
+ </layer>
301
+ <layer id="20" name="Result_100412" type="Result" version="opset1" output_names="last_hidden_state">
302
+ <input>
303
+ <port id="0" precision="FP32">
304
+ <dim>-1</dim>
305
+ <dim>-1</dim>
306
+ <dim>3072</dim>
307
+ </port>
308
+ </input>
309
+ </layer>
310
+ </layers>
311
+ <edges>
312
+ <edge from-layer="0" from-port="0" to-layer="6" to-port="0" />
313
+ <edge from-layer="1" from-port="0" to-layer="2" to-port="0" />
314
+ <edge from-layer="2" from-port="1" to-layer="4" to-port="0" />
315
+ <edge from-layer="3" from-port="0" to-layer="4" to-port="1" />
316
+ <edge from-layer="4" from-port="2" to-layer="5" to-port="0" />
317
+ <edge from-layer="5" from-port="1" to-layer="6" to-port="1" />
318
+ <edge from-layer="6" from-port="2" to-layer="9" to-port="0" />
319
+ <edge from-layer="7" from-port="0" to-layer="8" to-port="0" />
320
+ <edge from-layer="8" from-port="1" to-layer="9" to-port="1" />
321
+ <edge from-layer="9" from-port="2" to-layer="10" to-port="0" />
322
+ <edge from-layer="10" from-port="1" to-layer="16" to-port="0" />
323
+ <edge from-layer="11" from-port="0" to-layer="12" to-port="0" />
324
+ <edge from-layer="12" from-port="1" to-layer="14" to-port="0" />
325
+ <edge from-layer="13" from-port="0" to-layer="14" to-port="1" />
326
+ <edge from-layer="14" from-port="2" to-layer="15" to-port="0" />
327
+ <edge from-layer="15" from-port="1" to-layer="16" to-port="1" />
328
+ <edge from-layer="16" from-port="2" to-layer="19" to-port="0" />
329
+ <edge from-layer="17" from-port="0" to-layer="18" to-port="0" />
330
+ <edge from-layer="18" from-port="1" to-layer="19" to-port="1" />
331
+ <edge from-layer="19" from-port="2" to-layer="20" to-port="0" />
332
+ </edges>
333
+ <rt_info>
334
+ <Runtime_version value="2025.4.1-20426-82bbf0292c5-releases/2025/4" />
335
+ <conversion_parameters>
336
+ <framework value="pytorch" />
337
+ <is_python_object value="True" />
338
+ </conversion_parameters>
339
+ <nncf>
340
+ <friendly_names_were_updated value="True" />
341
+ <version value="2.19.0" />
342
+ <weight_compression>
343
+ <advanced_parameters value="{'statistics_path': None, 'lora_adapter_rank': 256, 'group_size_fallback_mode': 'error', 'min_adjusted_group_size': 32, 'awq_params': {'subset_size': 32, 'percent_to_apply': 0.002, 'alpha_min': 0.0, 'alpha_max': 1.0, 'steps': 100, 'prefer_data_aware_scaling': True}, 'scale_estimation_params': {'subset_size': 64, 'initial_steps': 5, 'scale_steps': 5, 'weight_penalty': -1.0}, 'gptq_params': {'damp_percent': 0.1, 'block_size': 128, 'subset_size': 128}, 'lora_correction_params': {'adapter_rank': 8, 'num_iterations': 3, 'apply_regularization': True, 'subset_size': 128, 'use_int8_adapters': True}, 'backend_params': {}, 'codebook': None}" />
344
+ <all_layers value="False" />
345
+ <awq value="False" />
346
+ <backup_mode value="int8_asym" />
347
+ <compression_format value="dequantize" />
348
+ <gptq value="False" />
349
+ <group_size value="-1" />
350
+ <ignored_scope value="[]" />
351
+ <lora_correction value="False" />
352
+ <mode value="int8_sym" />
353
+ <ratio value="1.0" />
354
+ <scale_estimation value="False" />
355
+ <sensitivity_metric value="weight_quantization_error" />
356
+ </weight_compression>
357
+ </nncf>
358
+ <optimum>
359
+ <nncf_version value="2.19.0" />
360
+ <optimum_intel_version value="1.27.0.dev0+132f70d" />
361
+ <optimum_version value="2.1.0.dev0" />
362
+ <pytorch_version value="2.9.1+cpu" />
363
+ <transformers_version value="4.51.0" />
364
+ </optimum>
365
+ </rt_info>
366
+ </net>
openvino_audio_vision_projection_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dd876f21e8ef990b4dab2ab8bea52dfe7922a6a1d4112d29b0faa4784b3b0668
+ size 12607488
openvino_audio_vision_projection_model.xml ADDED
@@ -0,0 +1,366 @@
+ <?xml version="1.0"?>
+ <net name="Model9" version="11">
+ <layers>
+ <layer id="0" name="input" type="Parameter" version="opset1">
+ <data shape="?,?,1024" element_type="f32" />
+ <output>
+ <port id="0" precision="FP32" names="input">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="self.0.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="3072, 1024" offset="0" size="3145728" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="2" name="Convert_4238719" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="3" name="self.0.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="3072, 1" offset="3145728" size="6144" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="4" name="self.0.weight/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="5" name="__module.0/ov_ext::linear/Convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="6" name="__module.0/ov_ext::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1024</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>1024</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="7" name="self.0.bias" type="Const" version="opset1">
+ <data element_type="bf16" shape="3072" offset="3151872" size="6144" />
+ <output>
+ <port id="0" precision="BF16" names="self.0.bias">
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="8" name="__module.0/ov_ext::linear/Convert_1" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="BF16">
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="9" name="__module.0/ov_ext::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="11,input_1">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="10" name="__module.1/aten::gelu/Gelu" type="Gelu" version="opset7">
+ <data approximation_mode="ERF" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="13">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="11" name="self.2.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="3072, 3072" offset="3158016" size="9437184" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="12" name="Convert_4233976" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="13" name="self.2.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="3072, 1" offset="12595200" size="6144" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="14" name="self.2.weight/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="15" name="__module.2/ov_ext::linear/Convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="16" name="__module.2/ov_ext::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="17" name="self.2.bias" type="Const" version="opset1">
+ <data element_type="bf16" shape="3072" offset="12601344" size="6144" />
+ <output>
+ <port id="0" precision="BF16" names="self.2.bias">
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="18" name="__module.2/ov_ext::linear/Convert_1" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="BF16">
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="19" name="__module.2/ov_ext::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="last_hidden_state">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="20" name="Result_98102" type="Result" version="opset1" output_names="last_hidden_state">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="6" to-port="0" />
+ <edge from-layer="1" from-port="0" to-layer="2" to-port="0" />
+ <edge from-layer="2" from-port="1" to-layer="4" to-port="0" />
+ <edge from-layer="3" from-port="0" to-layer="4" to-port="1" />
+ <edge from-layer="4" from-port="2" to-layer="5" to-port="0" />
+ <edge from-layer="5" from-port="1" to-layer="6" to-port="1" />
+ <edge from-layer="6" from-port="2" to-layer="9" to-port="0" />
+ <edge from-layer="7" from-port="0" to-layer="8" to-port="0" />
+ <edge from-layer="8" from-port="1" to-layer="9" to-port="1" />
+ <edge from-layer="9" from-port="2" to-layer="10" to-port="0" />
+ <edge from-layer="10" from-port="1" to-layer="16" to-port="0" />
+ <edge from-layer="11" from-port="0" to-layer="12" to-port="0" />
+ <edge from-layer="12" from-port="1" to-layer="14" to-port="0" />
+ <edge from-layer="13" from-port="0" to-layer="14" to-port="1" />
+ <edge from-layer="14" from-port="2" to-layer="15" to-port="0" />
+ <edge from-layer="15" from-port="1" to-layer="16" to-port="1" />
+ <edge from-layer="16" from-port="2" to-layer="19" to-port="0" />
+ <edge from-layer="17" from-port="0" to-layer="18" to-port="0" />
+ <edge from-layer="18" from-port="1" to-layer="19" to-port="1" />
+ <edge from-layer="19" from-port="2" to-layer="20" to-port="0" />
+ </edges>
+ <rt_info>
+ <Runtime_version value="2025.4.1-20426-82bbf0292c5-releases/2025/4" />
+ <conversion_parameters>
+ <framework value="pytorch" />
+ <is_python_object value="True" />
+ </conversion_parameters>
+ <nncf>
+ <friendly_names_were_updated value="True" />
+ <version value="2.19.0" />
+ <weight_compression>
+ <advanced_parameters value="{'statistics_path': None, 'lora_adapter_rank': 256, 'group_size_fallback_mode': 'error', 'min_adjusted_group_size': 32, 'awq_params': {'subset_size': 32, 'percent_to_apply': 0.002, 'alpha_min': 0.0, 'alpha_max': 1.0, 'steps': 100, 'prefer_data_aware_scaling': True}, 'scale_estimation_params': {'subset_size': 64, 'initial_steps': 5, 'scale_steps': 5, 'weight_penalty': -1.0}, 'gptq_params': {'damp_percent': 0.1, 'block_size': 128, 'subset_size': 128}, 'lora_correction_params': {'adapter_rank': 8, 'num_iterations': 3, 'apply_regularization': True, 'subset_size': 128, 'use_int8_adapters': True}, 'backend_params': {}, 'codebook': None}" />
+ <all_layers value="False" />
+ <awq value="False" />
+ <backup_mode value="int8_asym" />
+ <compression_format value="dequantize" />
+ <gptq value="False" />
+ <group_size value="-1" />
+ <ignored_scope value="[]" />
+ <lora_correction value="False" />
+ <mode value="int8_sym" />
+ <ratio value="1.0" />
+ <scale_estimation value="False" />
+ <sensitivity_metric value="weight_quantization_error" />
+ </weight_compression>
+ </nncf>
+ <optimum>
+ <nncf_version value="2.19.0" />
+ <optimum_intel_version value="1.27.0.dev0+132f70d" />
+ <optimum_version value="2.1.0.dev0" />
+ <pytorch_version value="2.9.1+cpu" />
+ <transformers_version value="4.51.0" />
+ </optimum>
+ </rt_info>
+ </net>
openvino_config.json ADDED
@@ -0,0 +1,43 @@
+ {
+ "dtype": "int4",
+ "input_info": null,
+ "optimum_version": "2.1.0.dev0",
+ "quantization_config": {
+ "_dataset_kwargs": {},
+ "dataset": null,
+ "default_config": {
+ "quant_method": "default"
+ },
+ "ignored_scope": null,
+ "num_samples": null,
+ "processor": "microsoft/phi-4-multimodal-instruct",
+ "quantization_configs": {
+ "lm_model": {
+ "_dataset_kwargs": {},
+ "all_layers": null,
+ "backup_precision": null,
+ "bits": 4,
+ "dataset": null,
+ "dq_group_size": null,
+ "dtype": "int4",
+ "gptq": null,
+ "group_size": 64,
+ "group_size_fallback": null,
+ "ignored_scope": null,
+ "lora_correction": null,
+ "num_samples": null,
+ "processor": "microsoft/phi-4-multimodal-instruct",
+ "quant_method": "default",
+ "ratio": 1.0,
+ "scale_estimation": null,
+ "sensitivity_metric": null,
+ "statistics_path": null,
+ "sym": false,
+ "tokenizer": "microsoft/phi-4-multimodal-instruct"
+ }
+ },
+ "tokenizer": "microsoft/phi-4-multimodal-instruct"
+ },
+ "save_onnx_model": false,
+ "transformers_version": "4.51.0"
+ }
openvino_detokenizer.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:58ce47114b112bcfbd7b232bd32f57b3f19f19c4979b6fd4410aafa2635f9f8f
+ size 2998357
openvino_detokenizer.xml ADDED
@@ -0,0 +1,218 @@
+ <?xml version="1.0"?>
+ <net name="detokenizer" version="11">
+ <layers>
+ <layer id="0" name="Parameter_128" type="Parameter" version="opset1">
+ <data shape="?,?" element_type="i64" />
+ <output>
+ <port id="0" precision="I64" names="Parameter_128">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="Convert_327" type="Convert" version="opset1">
+ <data destination_type="i32" />
+ <input>
+ <port id="0" precision="I64">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="2" name="Constant_130" type="Const" version="opset1">
+ <data element_type="i32" shape="200029" offset="0" size="800116" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>200029</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="3" name="Constant_132" type="Const" version="opset1">
+ <data element_type="i32" shape="200029" offset="800116" size="800116" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>200029</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="4" name="Constant_134" type="Const" version="opset1">
+ <data element_type="u8" shape="1398089" offset="1600232" size="1398089" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>1398089</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="5" name="Slice_139" type="Const" version="opset1">
+ <data element_type="i32" shape="9" offset="2998321" size="36" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>9</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="6" name="VocabDecoder_141" type="VocabDecoder" version="extension">
+ <data skip_tokens="" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>200029</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>200029</dim>
+ </port>
+ <port id="3" precision="U8">
+ <dim>1398089</dim>
+ </port>
+ <port id="4" precision="I32">
+ <dim>9</dim>
+ </port>
+ </input>
+ <output>
+ <port id="5" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="6" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="7" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="8" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="9" precision="U8">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="7" name="FuzeRagged_142" type="FuzeRagged" version="extension">
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="4" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="5" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="8" name="UTF8Validate_143" type="UTF8Validate" version="extension">
+ <data replace_mode="true" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="U8">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="4" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="5" precision="U8">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="9" name="StringTensorPack_144" type="StringTensorPack" version="opset15">
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="U8">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="3" precision="STRING" names="Result_145,string_output">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="10" name="Result_145" type="Result" version="opset1" output_names="Result_145,string_output">
+ <input>
+ <port id="0" precision="STRING">
+ <dim>-1</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="1" to-port="0" />
+ <edge from-layer="1" from-port="1" to-layer="6" to-port="0" />
+ <edge from-layer="2" from-port="0" to-layer="6" to-port="1" />
+ <edge from-layer="3" from-port="0" to-layer="6" to-port="2" />
+ <edge from-layer="4" from-port="0" to-layer="6" to-port="3" />
+ <edge from-layer="5" from-port="0" to-layer="6" to-port="4" />
+ <edge from-layer="6" from-port="5" to-layer="7" to-port="0" />
+ <edge from-layer="6" from-port="6" to-layer="7" to-port="1" />
+ <edge from-layer="6" from-port="7" to-layer="7" to-port="2" />
+ <edge from-layer="6" from-port="8" to-layer="7" to-port="3" />
+ <edge from-layer="6" from-port="9" to-layer="8" to-port="2" />
+ <edge from-layer="7" from-port="4" to-layer="8" to-port="0" />
+ <edge from-layer="7" from-port="5" to-layer="8" to-port="1" />
+ <edge from-layer="8" from-port="3" to-layer="9" to-port="0" />
+ <edge from-layer="8" from-port="4" to-layer="9" to-port="1" />
+ <edge from-layer="8" from-port="5" to-layer="9" to-port="2" />
+ <edge from-layer="9" from-port="3" to-layer="10" to-port="0" />
+ </edges>
+ <rt_info>
+ <add_attention_mask value="True" />
+ <add_prefix_space />
+ <add_special_tokens value="True" />
+ <bos_token_id value="199999" />
+ <chat_template value="{% for message in messages %}{% if message['role'] == 'system' and 'tools' in message and message['tools'] is not none %}{{ '&lt;|' + message['role'] + '|>' + message['content'] + '&lt;|tool|>' + message['tools'] + '&lt;|/tool|>' + '&lt;|end|>' }}{% else %}{{ '&lt;|' + message['role'] + '|>' + message['content'] + '&lt;|end|>' }}{% endif %}{% endfor %}{% if add_generation_prompt %}{{ '&lt;|assistant|>' }}{% else %}{{ eos_token }}{% endif %}" />
+ <clean_up_tokenization_spaces />
+ <detokenizer_input_type value="i64" />
+ <eos_token_id value="199999" />
+ <handle_special_tokens_with_re />
+ <max_length />
+ <number_of_inputs value="1" />
+ <openvino_tokenizers_version value="2025.4.1.0-627-e79796a77f3" />
+ <openvino_version value="2025.4.1-20426-82bbf0292c5-releases/2025/4" />
+ <original_tokenizer_class value="&lt;class 'transformers.models.gpt2.tokenization_gpt2_fast.GPT2TokenizerFast'>" />
+ <pad_token_id value="199999" />
+ <skip_special_tokens value="True" />
+ <streaming_detokenizer value="False" />
+ <tokenizer_output_type value="i64" />
+ <tokenizers_version value="0.22.2" />
+ <transformers_version value="4.57.3" />
+ <use_max_padding value="False" />
+ <use_sentencepiece_backend value="False" />
+ <utf8_replace_mode value="replace" />
+ <with_detokenizer value="True" />
+ </rt_info>
+ </net>
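The `chat_template` recorded in the detokenizer's rt_info above defines the prompt format the model expects. As a minimal sketch, the same formatting rules can be reproduced in plain Python (the message content below is a made-up example, and `apply_chat_template` is a hypothetical helper, not part of any shipped API):

```python
def apply_chat_template(messages, add_generation_prompt=True, eos_token="<|end|>"):
    """Mirror the Jinja chat_template stored in the detokenizer rt_info.

    System messages may carry a "tools" field, which is wrapped in
    <|tool|>...<|/tool|> markers; every message is closed with <|end|>.
    """
    parts = []
    for m in messages:
        if m["role"] == "system" and m.get("tools") is not None:
            parts.append(
                f"<|{m['role']}|>{m['content']}<|tool|>{m['tools']}<|/tool|><|end|>"
            )
        else:
            parts.append(f"<|{m['role']}|>{m['content']}<|end|>")
    # With add_generation_prompt the prompt ends with the assistant header;
    # otherwise the template emits the eos token.
    parts.append("<|assistant|>" if add_generation_prompt else eos_token)
    return "".join(parts)

prompt = apply_chat_template([{"role": "user", "content": "Hello"}])
print(prompt)  # <|user|>Hello<|end|><|assistant|>
```

The special-token strings used here match the entries in `added_tokens.json` shown earlier in this commit.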
openvino_language_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4172e77c411f21f9c453efca9d324771f4717096100907baa356f35b01e11efa
+ size 2601147138
openvino_language_model.xml ADDED
The diff for this file is too large to render. See raw diff
openvino_text_embeddings_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:069b93628617a45d3d5719c7bdbe9f1f8d80f8f9498f927f6d9acc81f12236da
+ size 614996740
openvino_text_embeddings_model.xml ADDED
@@ -0,0 +1,179 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
+ <?xml version="1.0"?>
+ <net name="Model18" version="11">
+ <layers>
+ <layer id="0" name="input" type="Parameter" version="opset1">
+ <data shape="?,?" element_type="i64" />
+ <output>
+ <port id="0" precision="I64" names="input">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="self.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="200064, 3072" offset="0" size="614596608" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>200064</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="2" name="Convert_2190860" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>200064</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>200064</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="3" name="self.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="200064, 1" offset="614596608" size="400128" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>200064</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="4" name="self.weight/fq_weights_0" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>200064</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>200064</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>200064</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="5" name="ov_ext::embedding/Convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <rt_info>
+ <attribute name="decompression" version="0" />
+ </rt_info>
+ <input>
+ <port id="0" precision="FP16">
+ <dim>200064</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>200064</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="6" name="ov_ext::embedding/Convert_1" type="Convert" version="opset1">
+ <data destination_type="i32" />
+ <input>
+ <port id="0" precision="I64">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="7" name="ov_ext::embedding/Constant" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="614996736" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="8" name="ov_ext::embedding/Gather" type="Gather" version="opset8">
+ <data batch_dims="0" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>200064</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="I32" />
+ </input>
+ <output>
+ <port id="3" precision="FP32" names="inputs_embeds">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="9" name="Result_281828" type="Result" version="opset1" output_names="inputs_embeds">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="6" to-port="0" />
+ <edge from-layer="1" from-port="0" to-layer="2" to-port="0" />
+ <edge from-layer="2" from-port="1" to-layer="4" to-port="0" />
+ <edge from-layer="3" from-port="0" to-layer="4" to-port="1" />
+ <edge from-layer="4" from-port="2" to-layer="5" to-port="0" />
+ <edge from-layer="5" from-port="1" to-layer="8" to-port="0" />
+ <edge from-layer="6" from-port="1" to-layer="8" to-port="1" />
+ <edge from-layer="7" from-port="0" to-layer="8" to-port="2" />
+ <edge from-layer="8" from-port="3" to-layer="9" to-port="0" />
+ </edges>
+ <rt_info>
+ <Runtime_version value="2025.4.1-20426-82bbf0292c5-releases/2025/4" />
+ <conversion_parameters>
+ <framework value="pytorch" />
+ <is_python_object value="True" />
+ </conversion_parameters>
+ <nncf>
+ <friendly_names_were_updated value="True" />
+ <version value="2.19.0" />
+ <weight_compression>
+ <advanced_parameters value="{'statistics_path': None, 'lora_adapter_rank': 256, 'group_size_fallback_mode': 'error', 'min_adjusted_group_size': 32, 'awq_params': {'subset_size': 32, 'percent_to_apply': 0.002, 'alpha_min': 0.0, 'alpha_max': 1.0, 'steps': 100, 'prefer_data_aware_scaling': True}, 'scale_estimation_params': {'subset_size': 64, 'initial_steps': 5, 'scale_steps': 5, 'weight_penalty': -1.0}, 'gptq_params': {'damp_percent': 0.1, 'block_size': 128, 'subset_size': 128}, 'lora_correction_params': {'adapter_rank': 8, 'num_iterations': 3, 'apply_regularization': True, 'subset_size': 128, 'use_int8_adapters': True}, 'backend_params': {}, 'codebook': None}" />
+ <all_layers value="False" />
+ <awq value="False" />
+ <backup_mode value="int8_asym" />
+ <compression_format value="dequantize" />
+ <gptq value="False" />
+ <group_size value="-1" />
+ <ignored_scope value="[]" />
+ <lora_correction value="False" />
+ <mode value="int8_sym" />
+ <ratio value="1.0" />
+ <scale_estimation value="False" />
+ <sensitivity_metric value="weight_quantization_error" />
+ </weight_compression>
+ </nncf>
+ <optimum>
+ <nncf_version value="2.19.0" />
+ <optimum_intel_version value="1.27.0.dev0+132f70d" />
+ <optimum_version value="2.1.0.dev0" />
+ <pytorch_version value="2.9.1+cpu" />
+ <transformers_version value="4.51.0" />
+ </optimum>
+ </rt_info>
+ </net>
openvino_tokenizer.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0d28657f94f9cd47a609bb7e2a10f5c1f81341d6970e3c3fd9655db8213b3767
+ size 7592501
openvino_tokenizer.xml ADDED
@@ -0,0 +1,734 @@
+ <?xml version="1.0"?>
+ <net name="tokenizer" version="11">
+ <layers>
+ <layer id="0" name="Parameter_1" type="Parameter" version="opset1">
+ <data shape="?" element_type="string" />
+ <output>
+ <port id="0" precision="STRING" names="Parameter_1">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="Constant_7" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="0" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="2" name="StringTensorUnpack_2" type="StringTensorUnpack" version="opset15">
+ <input>
+ <port id="0" precision="STRING">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="3" precision="U8">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="3" name="ShapeOf_3" type="ShapeOf" version="opset3">
+ <data output_type="i64" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I64">
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="4" name="Constant_4" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="0" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="5" name="Constant_5" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="0" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="6" name="Gather_6" type="Gather" version="opset8">
+ <data batch_dims="0" />
+ <input>
+ <port id="0" precision="I64">
+ <dim>1</dim>
+ </port>
+ <port id="1" precision="I32" />
+ <port id="2" precision="I32" />
+ </input>
+ <output>
+ <port id="3" precision="I64" />
+ </output>
+ </layer>
+ <layer id="7" name="Constant_8" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="4" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="8" name="Range_9" type="Range" version="opset4">
+ <data output_type="i32" />
+ <input>
+ <port id="0" precision="I32" />
+ <port id="1" precision="I64" />
+ <port id="2" precision="I32" />
+ </input>
+ <output>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="9" name="Constant_10" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="4" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="10" name="Constant_11" type="Const" version="opset1">
+ <data element_type="i64" shape="" offset="8" size="8" />
+ <output>
+ <port id="0" precision="I64" />
+ </output>
+ </layer>
+ <layer id="11" name="Add_12" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="I64" />
+ <port id="1" precision="I64" />
+ </input>
+ <output>
+ <port id="2" precision="I64" />
+ </output>
+ </layer>
+ <layer id="12" name="Constant_13" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="4" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="13" name="Range_14" type="Range" version="opset4">
+ <data output_type="i32" />
+ <input>
+ <port id="0" precision="I32" />
+ <port id="1" precision="I64" />
+ <port id="2" precision="I32" />
+ </input>
+ <output>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="14" name="Constant_78" type="Const" version="opset1">
+ <data element_type="u8" shape="246" offset="16" size="246" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>246</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="15" name="SpecialTokensSplit_79" type="SpecialTokensSplit" version="extension">
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="4" precision="U8">
+ <dim>-1</dim>
+ </port>
+ <port id="5" precision="U8">
+ <dim>246</dim>
+ </port>
+ </input>
+ <output>
+ <port id="6" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="7" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="8" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="9" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="10" precision="U8">
+ <dim>-1</dim>
+ </port>
+ <port id="11" precision="BOOL">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="16" name="Constant_81" type="Const" version="opset1">
+ <data element_type="u8" shape="274" offset="262" size="274" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>274</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="17" name="RegexSplit_82" type="RegexSplit" version="extension">
+ <data behaviour="remove" invert="true" max_splits="-1" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="4" precision="U8">
+ <dim>-1</dim>
+ </port>
+ <port id="5" precision="BOOL">
+ <dim>-1</dim>
+ </port>
+ <port id="6" precision="U8">
+ <dim>274</dim>
+ </port>
+ </input>
+ <output>
+ <port id="7" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="8" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="9" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="10" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="11" precision="U8">
+ <dim>-1</dim>
+ </port>
+ <port id="12" precision="BOOL">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="18" name="Constant_84" type="Const" version="opset1">
+ <data element_type="i32" shape="200029" offset="536" size="800116" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>200029</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="19" name="Constant_86" type="Const" version="opset1">
+ <data element_type="i32" shape="200029" offset="800652" size="800116" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>200029</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="20" name="Constant_88" type="Const" version="opset1">
+ <data element_type="u8" shape="1398089" offset="1600768" size="1398089" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>1398089</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="21" name="Constant_96" type="Const" version="opset1">
+ <data element_type="i32" shape="199742" offset="2998857" size="798968" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>199742</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="22" name="Constant_98" type="Const" version="opset1">
+ <data element_type="i32" shape="199742" offset="3797825" size="798968" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>199742</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="23" name="Constant_100" type="Const" version="opset1">
+ <data element_type="u8" shape="718313" offset="4596793" size="718313" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>718313</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="24" name="Constant_102" type="Const" version="opset1">
+ <data element_type="i32" shape="199742" offset="5315106" size="798968" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>199742</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="25" name="Constant_104" type="Const" version="opset1">
+ <data element_type="i32" shape="199742" offset="6114074" size="798968" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>199742</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="26" name="Constant_106" type="Const" version="opset1">
+ <data element_type="u8" shape="679101" offset="6913042" size="679101" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>679101</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="27" name="Constant_90" type="Const" version="opset1">
+ <data element_type="i32" shape="14" offset="7592143" size="56" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>14</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="28" name="Constant_92" type="Const" version="opset1">
+ <data element_type="i32" shape="14" offset="7592199" size="56" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>14</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="29" name="Constant_94" type="Const" version="opset1">
+ <data element_type="u8" shape="164" offset="7592255" size="164" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>164</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="30" name="Constant_107" type="Const" version="opset1">
+ <data element_type="i32" shape="14" offset="7592419" size="56" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>14</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="31" name="BPETokenizer_108" type="BPETokenizer" version="extension">
+ <data unk_token="" fuse_unk="false" suffix_indicator="" end_suffix="" byte_fallback="false" cache_capacity="40003" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="4" precision="U8">
+ <dim>-1</dim>
+ </port>
+ <port id="5" precision="I32">
+ <dim>200029</dim>
+ </port>
+ <port id="6" precision="I32">
+ <dim>200029</dim>
+ </port>
+ <port id="7" precision="U8">
+ <dim>1398089</dim>
+ </port>
+ <port id="8" precision="I32">
+ <dim>199742</dim>
+ </port>
+ <port id="9" precision="I32">
+ <dim>199742</dim>
+ </port>
+ <port id="10" precision="U8">
+ <dim>718313</dim>
+ </port>
+ <port id="11" precision="I32">
+ <dim>199742</dim>
+ </port>
+ <port id="12" precision="I32">
+ <dim>199742</dim>
+ </port>
+ <port id="13" precision="U8">
+ <dim>679101</dim>
+ </port>
+ <port id="14" precision="I32">
+ <dim>14</dim>
+ </port>
+ <port id="15" precision="I32">
+ <dim>14</dim>
+ </port>
+ <port id="16" precision="U8">
+ <dim>164</dim>
+ </port>
+ <port id="17" precision="I32">
+ <dim>14</dim>
+ </port>
+ </input>
+ <output>
+ <port id="18" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="19" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="20" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="32" name="Constant_109" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="7592475" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="33" name="Constant_111" type="Const" version="opset1">
+ <data element_type="u8" shape="5" offset="7592479" size="5" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>5</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="34" name="Constant_113" type="Const" version="opset1">
+ <data element_type="u8" shape="13" offset="7592484" size="13" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>13</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="35" name="Truncate_114" type="Truncate" version="extension">
+ <data m_num_inputs="1" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="3" precision="I32" />
+ <port id="4" precision="U8">
+ <dim>5</dim>
+ </port>
+ <port id="5" precision="U8">
+ <dim>13</dim>
+ </port>
+ </input>
+ <output>
+ <port id="6" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="7" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="8" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="36" name="Constant_115" type="Const" version="opset1">
+ <data element_type="i32" shape="1" offset="0" size="4" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="37" name="CombineSegments_116" type="CombineSegments" version="extension">
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="3" precision="I32">
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="4" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="5" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="6" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="7" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="8" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="9" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="38" name="Subtract_117" type="Subtract" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="39" name="Constant_118" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="0" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="40" name="ReduceMax_119" type="ReduceMax" version="opset1">
+ <data keep_dims="false" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32" />
+ </input>
+ <output>
+ <port id="2" precision="I32" />
+ </output>
+ </layer>
+ <layer id="41" name="Constant_120" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="7592497" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="42" name="RaggedToDense_121" type="RaggedToDense" version="extension">
+ <data pad_right="true" m_pad_max_length="false" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="3" precision="I32" />
+ <port id="4" precision="I32" />
+ </input>
+ <output>
+ <port id="5" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ <port id="6" precision="BOOL">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="43" name="Convert_122" type="Convert" version="opset1">
+ <data destination_type="i32" />
+ <input>
+ <port id="0" precision="BOOL">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="44" name="Convert_122.0" type="Convert" version="opset1">
+ <data destination_type="i64" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I64" names="attention_mask">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="46" name="RaggedToDense_121.0" type="Convert" version="opset1">
+ <data destination_type="i64" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I64" names="input_ids">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="47" name="Result_125" type="Result" version="opset1" output_names="input_ids">
+ <input>
+ <port id="0" precision="I64">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ </layer>
+ <layer id="45" name="Result_127" type="Result" version="opset1" output_names="attention_mask">
+ <input>
+ <port id="0" precision="I64">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="2" to-port="0" />
+ <edge from-layer="1" from-port="0" to-layer="8" to-port="0" />
+ <edge from-layer="2" from-port="1" to-layer="3" to-port="0" />
+ <edge from-layer="2" from-port="1" to-layer="15" to-port="2" />
+ <edge from-layer="2" from-port="3" to-layer="15" to-port="4" />
+ <edge from-layer="2" from-port="2" to-layer="15" to-port="3" />
+ <edge from-layer="3" from-port="1" to-layer="6" to-port="0" />
+ <edge from-layer="4" from-port="0" to-layer="6" to-port="1" />
+ <edge from-layer="5" from-port="0" to-layer="6" to-port="2" />
+ <edge from-layer="6" from-port="3" to-layer="8" to-port="1" />
+ <edge from-layer="6" from-port="3" to-layer="11" to-port="0" />
+ <edge from-layer="7" from-port="0" to-layer="8" to-port="2" />
+ <edge from-layer="8" from-port="3" to-layer="15" to-port="0" />
+ <edge from-layer="9" from-port="0" to-layer="13" to-port="0" />
+ <edge from-layer="10" from-port="0" to-layer="11" to-port="1" />
+ <edge from-layer="11" from-port="2" to-layer="13" to-port="1" />
+ <edge from-layer="12" from-port="0" to-layer="13" to-port="2" />
+ <edge from-layer="13" from-port="3" to-layer="15" to-port="1" />
+ <edge from-layer="14" from-port="0" to-layer="15" to-port="5" />
+ <edge from-layer="15" from-port="6" to-layer="17" to-port="0" />
+ <edge from-layer="15" from-port="7" to-layer="17" to-port="1" />
+ <edge from-layer="15" from-port="8" to-layer="17" to-port="2" />
+ <edge from-layer="15" from-port="9" to-layer="17" to-port="3" />
+ <edge from-layer="15" from-port="10" to-layer="17" to-port="4" />
+ <edge from-layer="15" from-port="11" to-layer="17" to-port="5" />
+ <edge from-layer="16" from-port="0" to-layer="17" to-port="6" />
+ <edge from-layer="17" from-port="7" to-layer="31" to-port="0" />
+ <edge from-layer="17" from-port="8" to-layer="31" to-port="1" />
+ <edge from-layer="17" from-port="9" to-layer="31" to-port="2" />
+ <edge from-layer="17" from-port="10" to-layer="31" to-port="3" />
+ <edge from-layer="17" from-port="11" to-layer="31" to-port="4" />
+ <edge from-layer="18" from-port="0" to-layer="31" to-port="5" />
+ <edge from-layer="19" from-port="0" to-layer="31" to-port="6" />
+ <edge from-layer="20" from-port="0" to-layer="31" to-port="7" />
+ <edge from-layer="21" from-port="0" to-layer="31" to-port="8" />
+ <edge from-layer="22" from-port="0" to-layer="31" to-port="9" />
+ <edge from-layer="23" from-port="0" to-layer="31" to-port="10" />
+ <edge from-layer="24" from-port="0" to-layer="31" to-port="11" />
+ <edge from-layer="25" from-port="0" to-layer="31" to-port="12" />
+ <edge from-layer="26" from-port="0" to-layer="31" to-port="13" />
+ <edge from-layer="27" from-port="0" to-layer="31" to-port="14" />
+ <edge from-layer="28" from-port="0" to-layer="31" to-port="15" />
+ <edge from-layer="29" from-port="0" to-layer="31" to-port="16" />
+ <edge from-layer="30" from-port="0" to-layer="31" to-port="17" />
+ <edge from-layer="31" from-port="18" to-layer="35" to-port="0" />
+ <edge from-layer="31" from-port="19" to-layer="35" to-port="1" />
+ <edge from-layer="31" from-port="20" to-layer="35" to-port="2" />
+ <edge from-layer="32" from-port="0" to-layer="35" to-port="3" />
+ <edge from-layer="33" from-port="0" to-layer="35" to-port="4" />
+ <edge from-layer="34" from-port="0" to-layer="35" to-port="5" />
+ <edge from-layer="35" from-port="6" to-layer="37" to-port="0" />
+ <edge from-layer="35" from-port="7" to-layer="37" to-port="1" />
+ <edge from-layer="35" from-port="8" to-layer="37" to-port="2" />
+ <edge from-layer="36" from-port="0" to-layer="37" to-port="3" />
+ <edge from-layer="37" from-port="5" to-layer="38" to-port="0" />
+ <edge from-layer="37" from-port="4" to-layer="38" to-port="1" />
+ <edge from-layer="37" from-port="4" to-layer="42" to-port="0" />
+ <edge from-layer="37" from-port="5" to-layer="42" to-port="1" />
+ <edge from-layer="37" from-port="6" to-layer="42" to-port="2" />
+ <edge from-layer="38" from-port="2" to-layer="40" to-port="0" />
+ <edge from-layer="39" from-port="0" to-layer="40" to-port="1" />
+ <edge from-layer="40" from-port="2" to-layer="42" to-port="3" />
+ <edge from-layer="41" from-port="0" to-layer="42" to-port="4" />
+ <edge from-layer="42" from-port="6" to-layer="43" to-port="0" />
+ <edge from-layer="42" from-port="5" to-layer="46" to-port="0" />
+ <edge from-layer="43" from-port="1" to-layer="44" to-port="0" />
+ <edge from-layer="44" from-port="1" to-layer="45" to-port="0" />
+ <edge from-layer="46" from-port="1" to-layer="47" to-port="0" />
+ </edges>
+ <rt_info>
+ <add_attention_mask value="True" />
+ <add_prefix_space />
+ <add_special_tokens value="True" />
+ <bos_token_id value="199999" />
+ <chat_template value="{% for message in messages %}{% if message['role'] == 'system' and 'tools' in message and message['tools'] is not none %}{{ '&lt;|' + message['role'] + '|>' + message['content'] + '&lt;|tool|>' + message['tools'] + '&lt;|/tool|>' + '&lt;|end|>' }}{% else %}{{ '&lt;|' + message['role'] + '|>' + message['content'] + '&lt;|end|>' }}{% endif %}{% endfor %}{% if add_generation_prompt %}{{ '&lt;|assistant|>' }}{% else %}{{ eos_token }}{% endif %}" />
+ <clean_up_tokenization_spaces />
+ <detokenizer_input_type value="i64" />
+ <eos_token_id value="199999" />
+ <handle_special_tokens_with_re />
+ <max_length />
+ <number_of_inputs value="1" />
+ <openvino_tokenizers_version value="2025.4.1.0-627-e79796a77f3" />
+ <openvino_version value="2025.4.1-20426-82bbf0292c5-releases/2025/4" />
+ <original_tokenizer_class value="&lt;class 'transformers.models.gpt2.tokenization_gpt2_fast.GPT2TokenizerFast'>" />
+ <pad_token_id value="199999" />
+ <skip_special_tokens value="True" />
+ <streaming_detokenizer value="False" />
+ <tokenizer_output_type value="i64" />
+ <tokenizers_version value="0.22.2" />
+ <transformers_version value="4.57.3" />
+ <use_max_padding value="False" />
+ <use_sentencepiece_backend value="False" />
+ <utf8_replace_mode value="replace" />
+ <with_detokenizer value="True" />
+ </rt_info>
+ </net>
openvino_vision_embeddings_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:da8652db47d1521867a714125a5216013ab1c9a12b3ffef651961a2938471c87
+ size 399761132
openvino_vision_embeddings_model.xml ADDED
The diff for this file is too large to render. See raw diff
openvino_vision_projection_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:05d77053d6ba3be941efc10ec1adbb1f7d969ad631fb31ce7a6c15ba7fe45fd7
+ size 13012992
openvino_vision_projection_model.xml ADDED
@@ -0,0 +1,334 @@
+ <?xml version="1.0"?>
+ <net name="Model21" version="11">
+ <layers>
+ <layer id="0" name="input" type="Parameter" version="opset1">
+ <data shape="?,?,1152" element_type="f32" />
+ <output>
+ <port id="0" precision="FP32" names="input">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1152</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="self.0.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="3072, 1152" offset="0" size="3538944" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>1152</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="2" name="Convert_2951735" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>1152</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>1152</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="3" name="self.0.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="3072, 1" offset="3538944" size="6144" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="4" name="self.0.weight/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1152</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>3072</dim>
+ <dim>1152</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="5" name="self.0.weight/fq_weights_1/convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1152</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>1152</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="6" name="__module.0/ov_ext::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>1152</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>1152</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="7" name="Constant_286148" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 3072" offset="3545088" size="12288" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="8" name="__module.0/ov_ext::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="11,input_1">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="9" name="__module.1/aten::gelu/Gelu" type="Gelu" version="opset7">
+ <data approximation_mode="ERF" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="13">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="10" name="self.2.weight" type="Const" version="opset1">
+ <data element_type="i8" shape="3072, 3072" offset="3557376" size="9437184" />
+ <output>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="11" name="Convert_2946993" type="Convert" version="opset1">
+ <data destination_type="f16" />
+ <input>
+ <port id="0" precision="I8">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="12" name="self.2.weight/scale" type="Const" version="opset1">
+ <data element_type="f16" shape="3072, 1" offset="12994560" size="6144" />
+ <output>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="13" name="self.2.weight/fq_weights_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP16">
+ <dim>3072</dim>
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="14" name="self.2.weight/fq_weights_1/convert" type="Convert" version="opset1">
+ <data destination_type="f32" />
+ <input>
+ <port id="0" precision="FP16">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="15" name="__module.2/ov_ext::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>3072</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="16" name="Constant_286149" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 3072" offset="13000704" size="12288" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="17" name="__module.2/ov_ext::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="last_hidden_state">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="18" name="Result_284126" type="Result" version="opset1" output_names="last_hidden_state">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>3072</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="6" to-port="0" />
+ <edge from-layer="1" from-port="0" to-layer="2" to-port="0" />
+ <edge from-layer="2" from-port="1" to-layer="4" to-port="0" />
+ <edge from-layer="3" from-port="0" to-layer="4" to-port="1" />
+ <edge from-layer="4" from-port="2" to-layer="5" to-port="0" />
+ <edge from-layer="5" from-port="1" to-layer="6" to-port="1" />
+ <edge from-layer="6" from-port="2" to-layer="8" to-port="0" />
+ <edge from-layer="7" from-port="0" to-layer="8" to-port="1" />
+ <edge from-layer="8" from-port="2" to-layer="9" to-port="0" />
+ <edge from-layer="9" from-port="1" to-layer="15" to-port="0" />
+ <edge from-layer="10" from-port="0" to-layer="11" to-port="0" />
+ <edge from-layer="11" from-port="1" to-layer="13" to-port="0" />
+ <edge from-layer="12" from-port="0" to-layer="13" to-port="1" />
+ <edge from-layer="13" from-port="2" to-layer="14" to-port="0" />
+ <edge from-layer="14" from-port="1" to-layer="15" to-port="1" />
+ <edge from-layer="15" from-port="2" to-layer="17" to-port="0" />
+ <edge from-layer="16" from-port="0" to-layer="17" to-port="1" />
+ <edge from-layer="17" from-port="2" to-layer="18" to-port="0" />
+ </edges>
+ <rt_info>
+ <Runtime_version value="2025.4.1-20426-82bbf0292c5-releases/2025/4" />
+ <conversion_parameters>
+ <framework value="pytorch" />
+ <is_python_object value="True" />
+ </conversion_parameters>
+ <nncf>
+ <friendly_names_were_updated value="True" />
+ <version value="2.19.0" />
+ <weight_compression>
+ <advanced_parameters value="{'statistics_path': None, 'lora_adapter_rank': 256, 'group_size_fallback_mode': 'error', 'min_adjusted_group_size': 32, 'awq_params': {'subset_size': 32, 'percent_to_apply': 0.002, 'alpha_min': 0.0, 'alpha_max': 1.0, 'steps': 100, 'prefer_data_aware_scaling': True}, 'scale_estimation_params': {'subset_size': 64, 'initial_steps': 5, 'scale_steps': 5, 'weight_penalty': -1.0}, 'gptq_params': {'damp_percent': 0.1, 'block_size': 128, 'subset_size': 128}, 'lora_correction_params': {'adapter_rank': 8, 'num_iterations': 3, 'apply_regularization': True, 'subset_size': 128, 'use_int8_adapters': True}, 'backend_params': {}, 'codebook': None}" />
+ <all_layers value="False" />
+ <awq value="False" />
+ <backup_mode value="int8_asym" />
+ <compression_format value="dequantize" />
+ <gptq value="False" />
+ <group_size value="-1" />
+ <ignored_scope value="[]" />
+ <lora_correction value="False" />
+ <mode value="int8_sym" />
+ <ratio value="1.0" />
+ <scale_estimation value="False" />
+ <sensitivity_metric value="weight_quantization_error" />
+ </weight_compression>
+ </nncf>
+ <optimum>
+ <nncf_version value="2.19.0" />
+ <optimum_intel_version value="1.27.0.dev0+132f70d" />
+ <optimum_version value="2.1.0.dev0" />
+ <pytorch_version value="2.9.1+cpu" />
+ <transformers_version value="4.51.0" />
+ </optimum>
+ </rt_info>
+ </net>
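Read as a compute graph, the IR above is a small two-layer MLP: a Linear(1152→3072) MatMul (`transpose_b="true"`) with bias Add, an erf-mode GELU, then a Linear(3072→3072) with bias, producing `last_hidden_state`. Each weight is stored as int8 with a per-output-channel f16 scale (NNCF `int8_sym`) and is dequantized by the Convert→Multiply pair before its MatMul. Below is a rough NumPy restatement of that dataflow, with random placeholder weights: shapes come from the `<dim>` entries, but the numeric outputs are meaningless, so only the shapes and compute order mirror the IR.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def dequantize(w_i8: np.ndarray, scale: np.ndarray) -> np.ndarray:
    # Mirrors the Convert (i8 -> f16) + Multiply (per-output-channel f16 scale)
    # pair that feeds each MatMul in the graph.
    return (w_i8.astype(np.float16) * scale).astype(np.float32)

def gelu_erf(x: np.ndarray) -> np.ndarray:
    # Gelu with approximation_mode="ERF": x * 0.5 * (1 + erf(x / sqrt(2)))
    y = x * 0.5 * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))
    return y.astype(x.dtype)

# Placeholder weights with the shapes declared in the <data> nodes.
w0 = rng.integers(-128, 128, size=(3072, 1152), dtype=np.int8)   # self.0.weight
s0 = rng.random((3072, 1), dtype=np.float32).astype(np.float16)  # self.0.weight/scale
b0 = np.zeros((1, 1, 3072), dtype=np.float32)                    # Constant_286148
w2 = rng.integers(-128, 128, size=(3072, 3072), dtype=np.int8)   # self.2.weight
s2 = rng.random((3072, 1), dtype=np.float32).astype(np.float16)  # self.2.weight/scale
b2 = np.zeros((1, 1, 3072), dtype=np.float32)                    # Constant_286149

def vision_projection(x: np.ndarray) -> np.ndarray:
    # input: (batch, tokens, 1152) -> last_hidden_state: (batch, tokens, 3072)
    h = x @ dequantize(w0, s0).T + b0   # MatMul (transpose_b=true) + Add
    h = gelu_erf(h)                     # Gelu, ERF mode
    return h @ dequantize(w2, s2).T + b2

out = vision_projection(rng.standard_normal((1, 4, 1152), dtype=np.float32))
```

As a consistency check on the IR, the first Const holds 3072 × 1152 int8 values = 3,538,944 bytes, matching its `size` attribute, and its f16 scale is 3072 × 2 bytes = 6,144.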
preprocessor_config.json ADDED
@@ -0,0 +1,13 @@
+ {
+ "audio_compression_rate": 8,
+ "audio_downsample_rate": 1,
+ "audio_feat_stride": 1,
+ "auto_map": {
+ "AutoFeatureExtractor": "microsoft/phi-4-multimodal-instruct--processing_phi4mm.Phi4MMAudioFeatureExtractor",
+ "AutoImageProcessor": "microsoft/phi-4-multimodal-instruct--processing_phi4mm.Phi4MMImageProcessor",
+ "AutoProcessor": "microsoft/phi-4-multimodal-instruct--processing_phi4mm.Phi4MMProcessor"
+ },
+ "dynamic_hd": 36,
+ "image_processor_type": "Phi4MMImageProcessor",
+ "processor_class": "Phi4MMProcessor"
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
+ {
+ "bos_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4c1b9f641d4f8b7247b8d5007dd3b6a9f6a87cb5123134fe0d326f14d10c0585
+ size 15524479
tokenizer_config.json ADDED
@@ -0,0 +1,127 @@
+ {
+ "add_prefix_space": false,
+ "added_tokens_decoder": {
+ "199999": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200010": {
+ "content": "<|endoftext10|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200011": {
+ "content": "<|endoftext11|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200018": {
+ "content": "<|endofprompt|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "200019": {
+ "content": "<|assistant|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": true
+ },
+ "200020": {
+ "content": "<|end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": true
+ },
+ "200021": {
+ "content": "<|user|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": true
+ },
+ "200022": {
+ "content": "<|system|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": true
+ },
+ "200023": {
+ "content": "<|tool|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": false
+ },
+ "200024": {
+ "content": "<|/tool|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": false
+ },
+ "200025": {
+ "content": "<|tool_call|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": false
+ },
+ "200026": {
+ "content": "<|/tool_call|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": false
+ },
+ "200027": {
+ "content": "<|tool_response|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": false
+ },
+ "200028": {
+ "content": "<|tag|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": true,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|endoftext|>",
+ "chat_template": "{% for message in messages %}{% if message['role'] == 'system' and 'tools' in message and message['tools'] is not none %}{{ '<|' + message['role'] + '|>' + message['content'] + '<|tool|>' + message['tools'] + '<|/tool|>' + '<|end|>' }}{% else %}{{ '<|' + message['role'] + '|>' + message['content'] + '<|end|>' }}{% endif %}{% endfor %}{% if add_generation_prompt %}{{ '<|assistant|>' }}{% else %}{{ eos_token }}{% endif %}",
+ "clean_up_tokenization_spaces": false,
+ "eos_token": "<|endoftext|>",
+ "extra_special_tokens": {},
+ "model_max_length": 131072,
+ "pad_token": "<|endoftext|>",
+ "processor_class": "Phi4MMProcessor",
+ "tokenizer_class": "GPT2Tokenizer",
+ "unk_token": "<|endoftext|>"
+ }
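The `chat_template` in this config wraps each message as `<|role|>…<|end|>`, inlines system-message tool definitions between `<|tool|>` and `<|/tool|>`, and appends `<|assistant|>` when a generation prompt is requested (otherwise the eos token). In practice this is applied via `tokenizer.apply_chat_template(...)`; the pure-Python restatement below is only an illustration of what the Jinja template produces:

```python
def render_chat(messages, add_generation_prompt=True, eos_token="<|endoftext|>"):
    """Pure-Python re-statement of the chat_template above (illustrative only)."""
    out = []
    for m in messages:
        if m["role"] == "system" and m.get("tools") is not None:
            # System messages may carry tool definitions, inlined between tool tags.
            out.append(f"<|{m['role']}|>{m['content']}<|tool|>{m['tools']}<|/tool|><|end|>")
        else:
            out.append(f"<|{m['role']}|>{m['content']}<|end|>")
    out.append("<|assistant|>" if add_generation_prompt else eos_token)
    return "".join(out)

prompt = render_chat([
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi"},
])
# -> "<|system|>You are helpful.<|end|><|user|>Hi<|end|><|assistant|>"
```

The `rstrip: true` flag on `<|assistant|>`, `<|end|>`, `<|user|>`, and `<|system|>` in `added_tokens_decoder` means whitespace immediately after those tokens is stripped at tokenization time.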
vocab.json ADDED
The diff for this file is too large to render. See raw diff