hf-transformers-bot committed
Commit 0ed54d7 · verified · 1 Parent(s): e0975ef

Upload 2025-10-09/runs/5450-18379785792/ci_results_run_models_gpu/model_results.json with huggingface_hub

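For context, the commit message notes this artifact was pushed with the `huggingface_hub` client. A minimal sketch of the call involved is shown below; the `repo_id` and `repo_type` are placeholders/assumptions (the destination repository is not shown on this page), and authentication is assumed to come from a configured token.

from huggingface_hub import HfApi

# Illustrative sketch only: repo_id and repo_type are assumptions,
# not taken from this page. The paths match the commit message above.
api = HfApi()
api.upload_file(
    path_or_fileobj="model_results.json",  # local artifact produced by the CI run
    path_in_repo="2025-10-09/runs/5450-18379785792/ci_results_run_models_gpu/model_results.json",
    repo_id="<namespace>/<repo>",  # placeholder
    repo_type="dataset",           # assumption: CI dumps are commonly stored in dataset repos
    commit_message="Upload 2025-10-09/runs/5450-18379785792/ci_results_run_models_gpu/model_results.json with huggingface_hub",
)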
2025-10-09/runs/5450-18379785792/ci_results_run_models_gpu/model_results.json ADDED
@@ -0,0 +1,1630 @@
+ {
+     "models_auto": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 113,
+         "skipped": 5,
+         "time_spent": [
+             59.05
+         ],
+         "error": false,
+         "failures": {},
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397849"
+         },
+         "captured_info": {}
+     },
+     "models_bert": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 0,
+         "skipped": 0,
+         "time_spent": [],
+         "error": false,
+         "failures": {},
+         "job_link": {},
+         "captured_info": {}
+     },
+     "models_clip": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 456,
+         "skipped": 283,
+         "time_spent": [
+             142.17
+         ],
+         "error": false,
+         "failures": {},
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397841"
+         },
+         "captured_info": {}
+     },
+     "models_csm": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 1
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 120,
+         "skipped": 84,
+         "time_spent": [
+             133.68
+         ],
+         "error": false,
+         "failures": {
+             "multi": [
+                 {
+                     "line": "tests/models/csm/test_modeling_csm.py::CsmForConditionalGenerationTest::test_multi_gpu_data_parallel_forward",
+                     "trace": "(line 129) TypeError: 'DynamicCache' object is not iterable"
+                 }
+             ]
+         },
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397873"
+         },
+         "captured_info": {}
+     },
+     "models_detr": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 1,
+                 "multi": 1
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 179,
+         "skipped": 239,
+         "time_spent": [
+             72.19,
+             70.56
+         ],
+         "error": false,
+         "failures": {
+             "multi": [
+                 {
+                     "line": "tests/models/detr/test_modeling_detr.py::DetrModelTest::test_torch_export",
+                     "trace": "(line 745) torch._dynamo.exc.Unsupported: Attempted to call function marked as skipped"
+                 }
+             ],
+             "single": [
+                 {
+                     "line": "tests/models/detr/test_modeling_detr.py::DetrModelTest::test_torch_export",
+                     "trace": "(line 745) torch._dynamo.exc.Unsupported: Attempted to call function marked as skipped"
+                 }
+             ]
+         },
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397845",
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397788"
+         },
+         "captured_info": {}
+     },
+     "models_gemma3": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 2,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 287,
+         "skipped": 173,
+         "time_spent": [
+             1112.54
+         ],
+         "error": false,
+         "failures": {
+             "single": [
+                 {
+                     "line": "tests/models/gemma3/test_modeling_gemma3.py::Gemma3Vision2TextModelTest::test_flash_attn_2_from_config",
+                     "trace": "(line 777) ValueError: `token_type_ids` is required as a model input when training"
+                 },
+                 {
+                     "line": "tests/models/gemma3/test_modeling_gemma3.py::Gemma3Vision2TextModelTest::test_flash_attn_2_inference_equivalence",
+                     "trace": "(line 777) ValueError: `token_type_ids` is required as a model input when training"
+                 }
+             ]
+         },
+         "job_link": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397789"
+         },
+         "captured_info": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397789#step:16:1"
+         }
+     },
+     "models_gemma3n": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 13
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 153,
+         "skipped": 357,
+         "time_spent": [
+             246.06
+         ],
+         "error": false,
+         "failures": {
+             "multi": [
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nTextModelTest::test_flash_attn_2_equivalence",
+                     "trace": "(line 556) AssertionError: Tensor-likes are not close!"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nTextModelTest::test_flash_attn_2_inference_equivalence",
+                     "trace": "(line 3112) AssertionError: Tensor-likes are not close!"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nTextModelTest::test_flash_attn_2_inference_equivalence_right_padding",
+                     "trace": "(line 3112) AssertionError: Tensor-likes are not close!"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nTextModelTest::test_multi_gpu_data_parallel_forward",
+                     "trace": "(line 129) TypeError: 'DynamicCache' object is not iterable"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nIntegrationTest::test_generation_beyond_sliding_window_0_sdpa",
+                     "trace": "(line 15) TypeError: 'torchcodec.decoders.AudioDecoder' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nIntegrationTest::test_generation_beyond_sliding_window_1_eager",
+                     "trace": "(line 15) TypeError: 'torchcodec.decoders.AudioDecoder' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nIntegrationTest::test_generation_beyond_sliding_window_2_flash_attention_2",
+                     "trace": "(line 15) TypeError: 'torchcodec.decoders.AudioDecoder' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nIntegrationTest::test_generation_beyond_sliding_window_with_generation_config",
+                     "trace": "(line 15) TypeError: 'torchcodec.decoders.AudioDecoder' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nIntegrationTest::test_model_4b_batch",
+                     "trace": "(line 15) TypeError: 'torchcodec.decoders.AudioDecoder' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nIntegrationTest::test_model_4b_bf16",
+                     "trace": "(line 15) TypeError: 'torchcodec.decoders.AudioDecoder' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nIntegrationTest::test_model_4b_image",
+                     "trace": "(line 15) TypeError: 'torchcodec.decoders.AudioDecoder' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nIntegrationTest::test_model_4b_multiimage",
+                     "trace": "(line 15) TypeError: 'torchcodec.decoders.AudioDecoder' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gemma3n/test_modeling_gemma3n.py::Gemma3nIntegrationTest::test_model_with_audio",
+                     "trace": "(line 15) TypeError: 'torchcodec.decoders.AudioDecoder' object is not subscriptable"
+                 }
+             ]
+         },
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397857"
+         },
+         "captured_info": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397857#step:16:1"
+         }
+     },
+     "models_got_ocr2": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 2
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 255,
+         "skipped": 319,
+         "time_spent": [
+             121.94,
+             117.08
+         ],
+         "error": false,
+         "failures": {
+             "multi": [
+                 {
+                     "line": "tests/models/got_ocr2/test_modeling_got_ocr2.py::GotOcr2ModelTest::test_multi_gpu_data_parallel_forward",
+                     "trace": "(line 129) TypeError: 'DynamicCache' object is not iterable"
+                 },
+                 {
+                     "line": "tests/models/got_ocr2/test_modeling_got_ocr2.py::GotOcr2IntegrationTest::test_small_model_integration_test_got_ocr_batched",
+                     "trace": "(line 89) httpx.RemoteProtocolError: Server disconnected without sending a response."
+                 }
+             ]
+         },
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397884",
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397921"
+         },
+         "captured_info": {}
+     },
+     "models_gpt2": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 2,
+                 "multi": 3
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 470,
+         "skipped": 215,
+         "time_spent": [
+             141.87,
+             145.51
+         ],
+         "error": false,
+         "failures": {
+             "single": [
+                 {
+                     "line": "tests/models/gpt2/test_modeling_gpt2.py::GPT2ModelLanguageGenerationTest::test_contrastive_search_gpt2",
+                     "trace": "(line 286) TypeError: 'DynamicCache' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gpt2/test_modeling_gpt2.py::GPT2ModelLanguageGenerationTest::test_flash_attn_2_generate_padding_left",
+                     "trace": "(line 445) AssertionError: Lists differ: ['<|e[141 chars]ta', \"Hello this is a very long sentence. I'm [46 chars]rry\"] != ['<|e[141 chars]ta', 'Hello this is a very long sentence very [91 chars]ong']"
+                 }
+             ],
+             "multi": [
+                 {
+                     "line": "tests/models/gpt2/test_modeling_gpt2.py::GPT2ModelTest::test_multi_gpu_data_parallel_forward",
+                     "trace": "(line 129) TypeError: 'DynamicCache' object is not iterable"
+                 },
+                 {
+                     "line": "tests/models/gpt2/test_modeling_gpt2.py::GPT2ModelLanguageGenerationTest::test_contrastive_search_gpt2",
+                     "trace": "(line 286) TypeError: 'DynamicCache' object is not subscriptable"
+                 },
+                 {
+                     "line": "tests/models/gpt2/test_modeling_gpt2.py::GPT2ModelLanguageGenerationTest::test_flash_attn_2_generate_padding_left",
+                     "trace": "(line 445) AssertionError: Lists differ: ['<|e[141 chars]ta', \"Hello this is a very long sentence. I'm [46 chars]rry\"] != ['<|e[141 chars]ta', 'Hello this is a very long sentence very [91 chars]ong']"
+                 }
+             ]
+         },
+         "job_link": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397793",
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397852"
+         },
+         "captured_info": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397793#step:16:1",
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397852#step:16:1"
+         }
+     },
+     "models_gpt_oss": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 0,
+         "skipped": 0,
+         "time_spent": [],
+         "error": false,
+         "failures": {},
+         "job_link": {},
+         "captured_info": {}
+     },
+     "models_internvl": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 0,
+         "skipped": 0,
+         "time_spent": [],
+         "error": false,
+         "failures": {},
+         "job_link": {},
+         "captured_info": {}
+     },
+     "models_llama": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 250,
+         "skipped": 107,
+         "time_spent": [
+             269.84
+         ],
+         "error": false,
+         "failures": {},
+         "job_link": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397801"
+         },
+         "captured_info": {}
+     },
+     "models_llava": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 180,
+         "skipped": 66,
+         "time_spent": [
+             561.0
+         ],
+         "error": false,
+         "failures": {},
+         "job_link": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397812"
+         },
+         "captured_info": {}
+     },
+     "models_mistral3": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 2,
+                 "multi": 3
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 280,
+         "skipped": 247,
+         "time_spent": [
+             606.47,
+             595.94
+         ],
+         "error": false,
+         "failures": {
+             "single": [
+                 {
+                     "line": "tests/models/mistral3/test_modeling_mistral3.py::Mistral3IntegrationTest::test_mistral3_integration_batched_generate",
+                     "trace": "(line 362) AssertionError: 'Calm waters reflect\\nWooden path to distant shore\\nSilence in the woods' != \"Wooden path to calm,\\nReflections whisper secrets,\\nNature's peace unfolds.\""
+                 },
+                 {
+                     "line": "tests/models/mistral3/test_modeling_mistral3.py::Mistral3IntegrationTest::test_mistral3_integration_batched_generate_multi_image",
+                     "trace": "(line 438) AssertionError: \"Calm waters reflect\\nWooden path to distant shore\\nPeace in nature's hold\" != 'Calm waters reflect\\nWooden path to distant shore\\nSilence in the scene'"
+                 }
+             ],
+             "multi": [
+                 {
+                     "line": "tests/models/mistral3/test_modeling_mistral3.py::Mistral3ModelTest::test_multi_gpu_data_parallel_forward",
+                     "trace": "(line 129) TypeError: 'DynamicCache' object is not iterable"
+                 },
+                 {
+                     "line": "tests/models/mistral3/test_modeling_mistral3.py::Mistral3IntegrationTest::test_mistral3_integration_batched_generate",
+                     "trace": "(line 362) AssertionError: 'Calm waters reflect\\nWooden path to distant shore\\nSilence in the woods' != \"Wooden path to calm,\\nReflections whisper secrets,\\nNature's peace unfolds.\""
+                 },
+                 {
+                     "line": "tests/models/mistral3/test_modeling_mistral3.py::Mistral3IntegrationTest::test_mistral3_integration_batched_generate_multi_image",
+                     "trace": "(line 438) AssertionError: \"Calm waters reflect\\nWooden path to distant shore\\nPeace in nature's hold\" != 'Calm waters reflect\\nWooden path to distant shore\\nSilence in the scene'"
+                 }
+             ]
+         },
+         "job_link": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397816",
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397895"
+         },
+         "captured_info": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397816#step:16:1",
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397895#step:16:1"
+         }
+     },
+     "models_modernbert": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 180,
+         "skipped": 182,
+         "time_spent": [
+             127.04,
+             127.06
+         ],
+         "error": false,
+         "failures": {},
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397890",
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397804"
+         },
+         "captured_info": {}
+     },
+     "models_qwen2": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 236,
+         "skipped": 101,
+         "time_spent": [
+             176.22
+         ],
+         "error": false,
+         "failures": {},
+         "job_link": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397799"
+         },
+         "captured_info": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397799#step:16:1"
+         }
+     },
+     "models_qwen2_5_omni": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 2
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 313,
+         "skipped": 101,
+         "time_spent": [
+             173.83,
+             177.23
+         ],
+         "error": false,
+         "failures": {
+             "multi": [
+                 {
+                     "line": "tests/models/qwen2_5_omni/test_modeling_qwen2_5_omni.py::Qwen2_5OmniThinkerForConditionalGenerationModelTest::test_multi_gpu_data_parallel_forward",
+                     "trace": "(line 129) TypeError: 'DynamicCache' object is not iterable"
+                 },
+                 {
+                     "line": "tests/models/qwen2_5_omni/test_modeling_qwen2_5_omni.py::Qwen2_5OmniModelIntegrationTest::test_small_model_integration_test",
+                     "trace": "(line 287) http.client.RemoteDisconnected: Remote end closed connection without response"
+                 }
+             ]
+         },
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397886",
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397810"
+         },
+         "captured_info": {}
+     },
+     "models_qwen2_5_vl": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 2,
+                 "multi": 2
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 331,
+         "skipped": 97,
+         "time_spent": [
+             211.86,
+             203.11
+         ],
+         "error": false,
+         "failures": {
+             "multi": [
+                 {
+                     "line": "tests/models/qwen2_5_vl/test_modeling_qwen2_5_vl.py::Qwen2_5_VLIntegrationTest::test_small_model_integration_test_batch_different_resolutions",
+                     "trace": "(line 612) AssertionError: 'syst[73 chars]ant\\n addCriterion\\nThe dog in the picture app[95 chars]h is' != 'syst[73 chars]ant\\nThe dog in the picture appears to be a La[94 chars]t in'"
+                 },
+                 {
+                     "line": "tests/models/qwen2_5_vl/test_modeling_qwen2_5_vl.py::Qwen2_5_VLIntegrationTest::test_small_model_integration_test_batch_wo_image_flashatt2",
+                     "trace": "(line 683) AssertionError: Lists differ: ['sys[216 chars]in', 'system\\nYou are a helpful assistant.\\nus[29 chars]aks'] != ['sys[216 chars]in', \"system\\nYou are a helpful assistant.\\nus[162 chars]ing\"]"
+                 }
+             ],
+             "single": [
+                 {
+                     "line": "tests/models/qwen2_5_vl/test_modeling_qwen2_5_vl.py::Qwen2_5_VLIntegrationTest::test_small_model_integration_test_batch_different_resolutions",
+                     "trace": "(line 612) AssertionError: 'syst[73 chars]ant\\n addCriterion\\nThe dog in the picture app[95 chars]h is' != 'syst[73 chars]ant\\nThe dog in the picture appears to be a La[94 chars]t in'"
+                 },
+                 {
+                     "line": "tests/models/qwen2_5_vl/test_modeling_qwen2_5_vl.py::Qwen2_5_VLIntegrationTest::test_small_model_integration_test_batch_wo_image_flashatt2",
+                     "trace": "(line 683) AssertionError: Lists differ: ['sys[216 chars]in', 'system\\nYou are a helpful assistant.\\nus[29 chars]aks'] != ['sys[216 chars]in', \"system\\nYou are a helpful assistant.\\nus[162 chars]ing\"]"
+                 }
+             ]
+         },
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397871",
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397796"
+         },
+         "captured_info": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397871#step:16:1",
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397796#step:16:1"
+         }
+     },
+     "models_qwen2_audio": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 2
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 132,
+         "skipped": 72,
+         "time_spent": [
+             124.19
+         ],
+         "error": false,
+         "failures": {
+             "multi": [
+                 {
+                     "line": "tests/models/qwen2_audio/test_modeling_qwen2_audio.py::Qwen2AudioForConditionalGenerationModelTest::test_eager_matches_fa2_generate",
+                     "trace": "(line 165) RuntimeError: cu_seqlens_q must have shape (batch_size + 1)"
+                 },
+                 {
+                     "line": "tests/models/qwen2_audio/test_modeling_qwen2_audio.py::Qwen2AudioForConditionalGenerationModelTest::test_multi_gpu_data_parallel_forward",
+                     "trace": "(line 129) TypeError: 'DynamicCache' object is not iterable"
+                 }
+             ]
+         },
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397920"
+         },
+         "captured_info": {}
+     },
+     "models_smolvlm": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 1
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 264,
+         "skipped": 97,
+         "time_spent": [
+             99.17
+         ],
+         "error": false,
+         "failures": {
+             "multi": [
+                 {
+                     "line": "tests/models/smolvlm/test_modeling_smolvlm.py::SmolVLMForConditionalGenerationIntegrationTest::test_integration_test_video",
+                     "trace": "(line 589) AssertionError: 'User[310 chars]ideo depicts a large language model architectu[58 chars]ture' != 'User[310 chars]ideo showcases a large language model, specifi[56 chars] and'"
+                 }
+             ]
+         },
+         "job_link": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397883"
+         },
+         "captured_info": {
+             "multi": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397883#step:16:1"
+         }
+     },
+     "models_t5": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 1,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 286,
+         "skipped": 249,
+         "time_spent": [
+             190.95
+         ],
+         "error": false,
+         "failures": {
+             "single": [
+                 {
+                     "line": "tests/models/t5/test_modeling_t5.py::T5ModelIntegrationTests::test_contrastive_search_t5",
+                     "trace": "(line 286) TypeError: 'EncoderDecoderCache' object is not subscriptable"
+                 }
+             ]
+         },
+         "job_link": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397836"
+         },
+         "captured_info": {}
+     },
+     "models_table_transformer": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 2,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 52,
+         "skipped": 121,
+         "time_spent": [
+             48.99
+         ],
+         "error": false,
+         "failures": {
+             "single": [
+                 {
+                     "line": "tests/models/table_transformer/test_modeling_table_transformer.py::TableTransformerModelTest::test_torch_export",
+                     "trace": "(line 745) torch._dynamo.exc.Unsupported: Attempted to call function marked as skipped"
+                 },
+                 {
+                     "line": "tests/models/table_transformer/test_modeling_table_transformer.py::TableTransformerModelIntegrationTests::test_table_detection",
+                     "trace": "(line 571) AssertionError: Tensor-likes are not close!"
+                 }
+             ]
+         },
+         "job_link": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397833"
+         },
+         "captured_info": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397833#step:16:1"
+         }
+     },
+     "models_vit": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 0,
+         "skipped": 0,
+         "time_spent": [],
+         "error": false,
+         "failures": {},
+         "job_link": {},
+         "captured_info": {}
+     },
+     "models_wav2vec2": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 346,
+         "skipped": 187,
+         "time_spent": [
+             283.58
+         ],
+         "error": false,
+         "failures": {},
+         "job_link": {
+             "single": "https://github.com/huggingface/transformers/actions/runs/18379785792/job/52363397831"
+         },
+         "captured_info": {}
+     },
+     "models_whisper": {
+         "failed": {
+             "PyTorch": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Tokenizers": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Pipelines": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Trainer": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "ONNX": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Auto": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Quantization": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             },
+             "Unclassified": {
+                 "unclassified": 0,
+                 "single": 0,
+                 "multi": 0
+             }
+         },
+         "errors": 0,
+         "success": 0,
+         "skipped": 0,
+         "time_spent": [],
+         "error": false,
+         "failures": {},
+         "job_link": {},
+         "captured_info": {}
+     }
+ }
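Schema note: each top-level key (`models_auto`, `models_bert`, ...) is one per-model test suite from this CI run. `failed` breaks failure counts down by framework category and machine type, `failures` maps a machine type (`single` or `multi` GPU) to a list of records with a `line` (the pytest node ID) and a `trace` (the failing line and exception), and `job_link`/`captured_info` point back to the corresponding GitHub Actions jobs. A minimal sketch for summarizing a report with this schema (the local file path is an assumption):

import json

# Load the report and print a one-line summary plus failure traces for
# every suite that recorded at least one failing test.
with open("model_results.json") as f:
    results = json.load(f)

for model, entry in results.items():
    n_failed = sum(len(cases) for cases in entry["failures"].values())
    if n_failed:
        print(f"{model}: {n_failed} failed / {entry['success']} passed / {entry['skipped']} skipped")
        for machine, cases in entry["failures"].items():
            for case in cases:
                print(f"  [{machine}] {case['line']}")
                print(f"      {case['trace']}")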