YusuphaJuwara committed on
Commit 5edc433 · verified · 1 Parent(s): 86a198a

Push model using huggingface_hub.

Files changed (2)
  1. README.md +5 -1028
  2. model.safetensors +1 -1
README.md CHANGED
@@ -1,1032 +1,9 @@
  ---
  tags:
- - Diffusion
- - Data Generation
- language: en
- task: Data generation for computer vision tasks
- datasets: MNIST
- metrics:
-   epoch: 29
-   train_loss:
-   - 0.017333630472421646
-   - 0.013288658112287521
-   - 0.014131959527730942
-   [... several hundred further per-step train_loss values removed ...]
-   - 0.015748729929327965
-   - 0.01698893867433071
- license: unknown
- model-index:
- - name: diffusion-practice-v1
-   results:
-   - task:
-       type: nlp
-       name: Data Generation with Diffusion Model
-     dataset:
-       name: MNIST
-       type: mnist
-     metrics:
-     - type: loss
-       value: '0.02'
-       name: Loss
-       verified: false
  ---
 
- # NLI-FEVER Model
-
- This model is fine-tuned for Natural Language Inference (NLI) tasks using the FEVER dataset.
-
- ## Model description
-
- ## Intended uses & limitations
-
- This model is intended for use in NLI tasks, particularly those related to fact-checking and verifying information.
- It should not be used for tasks it wasn't explicitly trained for.
-
- ## Training and evaluation data
-
- The model was trained on the FEVER (Fact Extraction and VERification) dataset.
-
- ## Training procedure
-
- The model was trained for 29 epochs.
0.01500445231795311, 0.010292630642652512, 0.011287686415016651, 0.010131157003343105, 0.013747447170317173, 0.014640001580119133, 0.016291307285428047, 0.009836975485086441, 0.009369305334985256, 0.01337192952632904, 0.017312021926045418, 0.01569763571023941, 0.015083231031894684, 0.009576041251420975, 0.017608392983675003, 0.016406038776040077, 0.017029061913490295, 0.01601594313979149, 0.021131832152605057, 0.014295983128249645, 0.018255343660712242, 0.012127683497965336, 0.016475819051265717, 0.012101727537810802, 0.015622656792402267, 0.013654403388500214, 0.014168176800012589, 0.017973223701119423, 0.012863966636359692, 0.020493410527706146, 0.01402577105909586, 0.014867295511066914, 0.01224115677177906, 0.017477117478847504, 0.010653967969119549, 0.01298364158719778, 0.01671256124973297, 0.015526299364864826, 0.014925869181752205, 0.016414795070886612, 0.02012038230895996, 0.00936219748109579, 0.01281198300421238, 0.01668417453765869, 0.014464844949543476, 0.012710928916931152, 0.01374725066125393, 0.017560379579663277, 0.013117346912622452, 0.013731447979807854, 0.010643073357641697, 0.01845804788172245, 0.016197731718420982, 0.012650095857679844, 0.010339869186282158, 0.015053506940603256, 0.019049691036343575, 0.012722444720566273, 0.021629853174090385, 0.02189377322793007, 0.018102159723639488, 0.018124759197235107, 0.015199276618659496, 0.014299195259809494, 0.011958462186157703, 0.013961698859930038, 0.012612021528184414, 0.018045106902718544, 0.013044118881225586, 0.016424162313342094, 0.012477574869990349, 0.013037440367043018, 0.015926027670502663, 0.0176842138171196, 0.013084822334349155, 0.018889371305704117, 0.013848566450178623, 0.015453696250915527, 0.015553737990558147, 0.012461688369512558, 0.015496906824409962, 0.018106279894709587, 0.013074343092739582, 0.01853059232234955, 0.015158895403146744, 0.014679958112537861, 0.018050523474812508, 0.012020675465464592, 0.012696592137217522, 0.01475496031343937, 0.01765030063688755, 
0.01360783725976944, 0.01146648358553648, 0.014982355758547783, 0.01551226619631052, 0.010762779042124748, 0.014675485901534557, 0.014809508807957172, 0.009451356716454029, 0.015422300435602665, 0.019429028034210205, 0.017073918133974075, 0.016603562980890274, 0.01398480124771595, 0.013359147123992443, 0.011818716302514076, 0.01639053039252758, 0.017027592286467552, 0.01357841957360506, 0.01619878038764, 0.01626768335700035, 0.016250109300017357, 0.01230620127171278, 0.01024792529642582, 0.016911029815673828, 0.02147149108350277, 0.015476671978831291, 0.017730101943016052, 0.010438419878482819, 0.012225031852722168, 0.014002553187310696, 0.015893159434199333, 0.015178621746599674, 0.014046695083379745, 0.012262053787708282, 0.012009000405669212, 0.012601504102349281, 0.018276376649737358, 0.008967944420874119, 0.01598045416176319, 0.015678750351071358, 0.018675170838832855, 0.015139997936785221, 0.012568139471113682, 0.012870525941252708, 0.018368573859333992, 0.01459738053381443, 0.011436809785664082, 0.009900271892547607, 0.007469954900443554, 0.015696309506893158, 0.01766524650156498, 0.014384942129254341, 0.015850253403186798, 0.012514765374362469, 0.010037478059530258, 0.020646242424845695, 0.010820741765201092, 0.013820165768265724, 0.012688442133367062, 0.015065834857523441, 0.019048379734158516, 0.019603244960308075, 0.017720071598887444, 0.016139227896928787, 0.01498098112642765, 0.016578644514083862, 0.012104857712984085, 0.015515106730163097, 0.009655576199293137, 0.013652966357767582, 0.015815764665603638, 0.012256995774805546, 0.016371391713619232, 0.015710975974798203, 0.01938571222126484, 0.010568182915449142, 0.012457290664315224, 0.018121393397450447, 0.018976321443915367, 0.017733415588736534, 0.011717230081558228, 0.015544308349490166, 0.014414794743061066, 0.019137132912874222, 0.010987960733473301, 0.01528106164187193, 0.014228345826268196, 0.009589200839400291, 0.017499784007668495, 0.015450101345777512, 0.015299881808459759, 
0.00903781782835722, 0.019468164071440697, 0.012247482314705849, 0.013476804830133915, 0.014939687214791775, 0.016065072268247604, 0.018567383289337158, 0.011817574501037598, 0.017804346978664398, 0.017104292288422585, 0.01296243630349636, 0.010723194107413292, 0.012646554037928581, 0.015276512131094933, 0.019176820293068886, 0.019983038306236267, 0.017482733353972435, 0.013310551643371582, 0.012764478102326393, 0.013598884455859661, 0.01335060689598322, 0.017629235982894897, 0.015604191459715366, 0.013178203254938126, 0.010128206573426723, 0.01262409333139658, 0.014265517704188824, 0.01178326178342104, 0.01979014091193676, 0.013225648552179337, 0.015234929509460926, 0.015112904831767082, 0.014096125960350037, 0.012725974433124065, 0.010879795998334885, 0.012719851918518543, 0.01735844835639, 0.014749348163604736, 0.010376974008977413, 0.009538698010146618, 0.00996649544686079, 0.021506574004888535, 0.015249261632561684, 0.01273189764469862, 0.01661057211458683, 0.011376317590475082, 0.023469718173146248, 0.015995053574442863, 0.011175882071256638, 0.010422191582620144, 0.010957159101963043, 0.011366610415279865, 0.01954798959195614, 0.016317147761583328, 0.016325432807207108, 0.01599765755236149, 0.01575544849038124, 0.007038059178739786, 0.012063322588801384, 0.015736889094114304, 0.013678666204214096, 0.009853698313236237, 0.017307603731751442, 0.012014580890536308, 0.009932931512594223, 0.011424548923969269, 0.01599118858575821, 0.013508500531315804, 0.01540001854300499, 0.016566166654229164, 0.016172491014003754, 0.01712065190076828, 0.014992985874414444, 0.018751105293631554, 0.015973305329680443, 0.010924093425273895, 0.015129655599594116, 0.009636759757995605, 0.008269858546555042, 0.01651899702847004, 0.013600446283817291, 0.019705941900610924, 0.013589570298790932, 0.013534970581531525, 0.02242007665336132, 0.016660450026392937, 0.014255590736865997, 0.008913667872548103, 0.01189431082457304, 0.014090865850448608, 0.010125054977834225, 
0.01180209405720234, 0.017894700169563293, 0.015846332535147667, 0.010791189968585968, 0.01573799178004265, 0.012432614341378212, 0.022227119654417038, 0.015712173655629158, 0.013672086410224438, 0.013004968874156475, 0.013909653760492802, 0.014477739110589027, 0.011897847056388855, 0.009448765777051449, 0.015882251784205437, 0.019516658037900925, 0.014047859236598015, 0.010066770948469639, 0.01699071004986763, 0.013653088361024857, 0.013432429172098637, 0.014027731493115425, 0.01933608390390873, 0.016060229390859604, 0.013173962943255901, 0.01745576784014702, 0.014340981841087341, 0.01754864677786827, 0.013641875237226486, 0.01707356795668602, 0.014532758854329586, 0.011571956798434258, 0.015317556448280811, 0.014494989067316055, 0.013046471402049065, 0.016881175339221954, 0.022536374628543854, 0.01539967954158783, 0.015241493470966816, 0.019774286076426506, 0.016427915543317795, 0.015296381898224354, 0.01625637151300907, 0.01671222597360611, 0.013162882998585701, 0.01151960901916027, 0.012253244407474995, 0.012692741118371487, 0.014883951283991337, 0.015987901017069817, 0.010062171146273613, 0.012439112178981304, 0.017100248485803604, 0.01593669503927231, 0.013002506457269192, 0.012204378843307495, 0.020815391093492508, 0.017704259604215622, 0.016283998265862465, 0.01374803576618433, 0.01326970849186182, 0.013681869953870773, 0.01916087418794632, 0.014228479005396366, 0.015284082852303982, 0.013118225149810314, 0.016419051215052605, 0.015038863755762577, 0.01239532046020031, 0.019062280654907227, 0.012171631678938866, 0.014914407394826412, 0.013709590770304203, 0.011244338005781174, 0.019630681723356247, 0.01725885458290577, 0.01697155460715294, 0.021054275333881378, 0.016259025782346725, 0.009422065690159798, 0.01735103875398636, 0.01650640182197094, 0.010875373147428036, 0.013264614157378674, 0.01598810963332653, 0.012092385441064835, 0.013890203088521957, 0.015502367168664932, 0.012324106879532337, 0.013529404066503048, 0.014866003766655922, 
0.015617732889950275, 0.023216811940073967, 0.01205433625727892, 0.014513203874230385, 0.01249017845839262, 0.018115442246198654, 0.016401495784521103, 0.008781210519373417, 0.015552668832242489, 0.013504442758858204, 0.013529480434954166, 0.01829630881547928, 0.013242039829492569, 0.012767331674695015, 0.01085098646581173, 0.015984922647476196, 0.01622430421411991, 0.014977293089032173, 0.01617429219186306, 0.013590424321591854, 0.017472051084041595, 0.010097428224980831, 0.017874449491500854, 0.0138082941994071, 0.014818093739449978, 0.01580427587032318, 0.011258402839303017, 0.009780320338904858, 0.008917277678847313, 0.013946537859737873, 0.014117077924311161, 0.013454439118504524, 0.01596743054687977, 0.015738828107714653, 0.013158791698515415, 0.012628567405045033, 0.014631734229624271, 0.016542457044124603, 0.016338307410478592, 0.015391473658382893, 0.01225336454808712, 0.012100942432880402, 0.012476086616516113, 0.013983197510242462, 0.012865250930190086, 0.007715191226452589, 0.01578512042760849, 0.012592366896569729, 0.012816290371119976, 0.013900418765842915, 0.01595882512629032, 0.015227974392473698, 0.019105294719338417, 0.01876126416027546, 0.019027739763259888, 0.014572840183973312, 0.015690086409449577, 0.01635241135954857, 0.013527040369808674, 0.019538069143891335, 0.018505524843931198, 0.011950402520596981, 0.018318668007850647, 0.01488092914223671, 0.0143594266846776, 0.015413353219628334, 0.01464089285582304, 0.016019968315958977, 0.012322613038122654, 0.012920897454023361, 0.014625931158661842, 0.017360452562570572, 0.014710121788084507, 0.011658741161227226, 0.015851540490984917, 0.010808035731315613, 0.01432371512055397, 0.01630624569952488, 0.015824779868125916, 0.02195395901799202, 0.013746476732194424, 0.01332824770361185, 0.01349385641515255, 0.0163558479398489, 0.013516408391296864, 0.012082776054739952, 0.016824590042233467, 0.011953448876738548, 0.01093829981982708, 0.0142142865806818, 0.010084887035191059, 0.014252163469791412, 
0.011788444593548775, 0.012525035999715328, 0.015296266414225101, 0.010380174033343792, 0.012750339694321156, 0.014415226876735687, 0.012476591393351555, 0.013235586695373058, 0.014202125370502472, 0.011311601847410202, 0.015194435603916645, 0.02020971290767193, 0.015044353902339935, 0.015578138642013073, 0.020265983417630196, 0.01364318747073412, 0.01709783263504505, 0.010391856543719769, 0.022328035905957222, 0.015082002617418766, 0.010758745484054089, 0.013903602957725525, 0.012786509469151497, 0.01855980046093464, 0.017456097528338432, 0.012117626145482063, 0.014750443398952484, 0.012240667827427387, 0.015729811042547226, 0.012610701844096184, 0.014339456334710121, 0.017079828307032585, 0.014494889415800571, 0.01783207431435585, 0.014108117669820786, 0.021185392513871193, 0.013971948996186256, 0.015639817342162132, 0.015100396238267422, 0.013443170115351677, 0.00709156459197402, 0.019846085458993912, 0.013161527924239635, 0.009439036250114441, 0.011946340091526508, 0.014587686397135258, 0.018949003890156746, 0.014682353474199772, 0.015960445627570152, 0.015610297210514545, 0.010965105146169662, 0.01784789003431797, 0.011388910934329033, 0.01935971900820732, 0.013128170743584633, 0.01897837221622467, 0.011938890442252159, 0.013166290707886219, 0.016698339954018593, 0.01595478132367134, 0.013264144770801067, 0.01719718798995018, 0.016705019399523735, 0.011047291569411755, 0.013877559453248978, 0.01578536070883274, 0.012804252095520496, 0.015369860455393791, 0.017873162403702736, 0.016867324709892273, 0.014693135395646095, 0.011220413260161877, 0.012721297331154346, 0.01309678703546524, 0.01651678793132305, 0.017204713076353073, 0.010976647958159447, 0.01707112230360508, 0.016209879890084267, 0.022403232753276825, 0.014484064653515816, 0.012560160830616951, 0.013944897800683975, 0.01397325936704874, 0.01506958156824112, 0.0225509162992239, 0.014464533887803555, 0.015014613047242165, 0.013324749656021595, 0.013082613237202168, 0.012776761315762997, 
0.01873886026442051, 0.00917312502861023, 0.009365808218717575, 0.012312336824834347, 0.012824994511902332, 0.009552647359669209, 0.015907958149909973, 0.011724616400897503, 0.01663212850689888, 0.017701929435133934, 0.01721535436809063, 0.018392160534858704, 0.013118850998580456, 0.012717662379145622, 0.015175447799265385, 0.014076733030378819, 0.008925280533730984, 0.009806336835026741, 0.015773199498653412, 0.017806628718972206, 0.02067144215106964, 0.015171931125223637, 0.013003258965909481, 0.013786658644676208, 0.015748729929327965, 0.01698893867433071].
- 
- ## How to use
- 
- You can use this model directly with a pipeline for text classification:
- 
- ```python
- from transformers import pipeline
- 
- classifier = pipeline("text-classification", model="YusuphaJuwara/nli-fever")
- # NLI takes a premise/hypothesis pair
- result = classifier({"text": "premise", "text_pair": "hypothesis"})
- print(result)
- ```
- 
- ## Saved Metrics
- 
- This model repository includes a `metrics.json` file containing detailed training metrics.
- You can load these metrics using the following code:
- 
- ```python
- from huggingface_hub import hf_hub_download
- import json
- 
- metrics_file = hf_hub_download(repo_id="YusuphaJuwara/nli-fever", filename="metrics.json")
- with open(metrics_file, 'r') as f:
-     metrics = json.load(f)
- 
- # Now you can access metrics like:
- print("Last epoch: ", metrics['last_epoch'])
- print("Final validation loss: ", metrics['val_losses'][-1])
- print("Final validation accuracy: ", metrics['val_accuracies'][-1])
- ```
- 
- These metrics can be useful for continuing training from the last epoch or for detailed analysis of the training process.
- 
- ## Training results
- ![Training metrics plot](training_plot.png)
- 
- ## Limitations and bias
- This model may exhibit biases present in the training data. Always validate results and use the model responsibly.
- 
- ## Plots
- ![Label distribution plot](label_distribution.png)
- ![Loss plot](loss_plot.png)
- ![Accuracy plot](accuracy_plot.png)
- ![F1 score plot](f1_score_plot.png)
- ![Confusion matrix plot](confusion_matrix.png)
- ![Precision-recall curve plot](precision_recall_curve.png)
- ![ROC curve plot](roc_curve.png)
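The "Saved Metrics" section above notes that the recorded metrics can seed a resumed training run. As a hedged, self-contained illustration of that idea (the payload below is invented, not the repository's actual `metrics.json`):

```python
import json

# Invented stand-in for the metrics.json payload described above.
raw = '{"last_epoch": 29, "val_losses": [0.021, 0.018], "val_accuracies": [0.91, 0.93]}'
metrics = json.loads(raw)

# Resume one epoch past the last completed one, and recover the best
# validation loss seen so far for early-stopping bookkeeping.
start_epoch = metrics["last_epoch"] + 1
best_val_loss = min(metrics["val_losses"])
print(start_epoch, best_val_loss)  # 30 0.018
```

The same two lines work unchanged on the real file once it has been downloaded with `hf_hub_download`, provided the keys match those listed above.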
 
  ---
  tags:
+ - model_hub_mixin
+ - pytorch_model_hub_mixin
  ---
 
+ This model has been pushed to the Hub using the [PyTorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
+ - Library: [More Information Needed]
+ - Docs: [More Information Needed]
 
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:12f154ad471e303c085df2ed0dedb4d0a40d0b7200ad1dcf07cf58cefa3d0905
+ oid sha256:0d003564ed4f431623ec740b772846f6a79d2c66d77e0e2c5ce537a229f79ca7
  size 40785636