{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "04837caf",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Sat Apr  4 13:45:27 2026       \n",
      "+-----------------------------------------------------------------------------------------+\n",
      "| NVIDIA-SMI 580.82.07              Driver Version: 580.82.07      CUDA Version: 13.0     |\n",
      "+-----------------------------------------+------------------------+----------------------+\n",
      "| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |\n",
      "| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |\n",
      "|                                         |                        |               MIG M. |\n",
      "|=========================================+========================+======================|\n",
      "|   0  Tesla T4                       Off |   00000000:00:04.0 Off |                    0 |\n",
      "| N/A   57C    P8             10W /   70W |       0MiB /  15360MiB |      0%      Default |\n",
      "|                                         |                        |                  N/A |\n",
      "+-----------------------------------------+------------------------+----------------------+\n",
      "\n",
      "+-----------------------------------------------------------------------------------------+\n",
      "| Processes:                                                                              |\n",
      "|  GPU   GI   CI              PID   Type   Process name                        GPU Memory |\n",
      "|        ID   ID                                                               Usage      |\n",
      "|=========================================================================================|\n",
      "|  No running processes found                                                             |\n",
      "+-----------------------------------------------------------------------------------------+\n"
     ]
    }
   ],
   "source": [
    "!nvidia-smi"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "01fa72f5",
   "metadata": {},
   "source": [
    "# OPIUM: Masked Diffusion Language Model (MDLM)\n",
    "\n",
    "A from-scratch implementation of a **Masked Diffusion Language Model** based on [Sahoo et al., NeurIPS 2024](https://arxiv.org/abs/2406.07524).\n",
    "\n",
    "## How it works\n",
    "1. **Forward process**: Gradually mask tokens with probability increasing over time `t \u2208 [0,1]`\n",
    "2. **Reverse process**: A bidirectional transformer learns to predict masked tokens conditioned on timestep\n",
    "3. **Loss**: Weighted cross-entropy at masked positions \u2014 same as BERT's MLM but integrated over noise levels\n",
    "4. **Sampling**: Start from all `[MASK]` tokens, iteratively unmask via the learned reverse process\n",
    "\n",
    "## Architecture (~200M params)\n",
    "- Bidirectional Transformer (no causal mask) with timestep conditioning\n",
    "- 16 layers, 768 hidden dim, 12 attention heads\n",
    "- GPT-2 tokenizer (50,257 vocab)\n",
    "- RoPE positional embeddings\n",
    "- Log-linear noise schedule: `\u03b1(t) = 1 - t`"
   ]
  },
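  {
   "cell_type": "markdown",
   "id": "a9d3e7f1",
   "metadata": {},
   "source": [
    "A minimal sketch of the forward (masking) process above, to make the objective concrete before building the model. The names (`MASK_ID`, `alpha`, `forward_mask`) are illustrative, not the API used later in this notebook:\n",
    "\n",
    "```python\n",
    "import torch\n",
    "\n",
    "MASK_ID = 50257  # the added [MASK] token (matches the config below)\n",
    "\n",
    "def alpha(t):\n",
    "    return 1.0 - t  # log-linear schedule\n",
    "\n",
    "def forward_mask(x, t):\n",
    "    # mask each token independently with probability 1 - alpha(t) = t\n",
    "    mask = torch.rand(x.shape) < t.unsqueeze(-1)\n",
    "    z_t = torch.where(mask, torch.full_like(x, MASK_ID), x)\n",
    "    return z_t, mask\n",
    "\n",
    "x = torch.randint(0, 50257, (2, 8))  # toy batch of token ids\n",
    "t = torch.rand(2)                    # one timestep per sequence\n",
    "z_t, mask = forward_mask(x, t)\n",
    "# training: cross-entropy at `mask` positions, weighted by\n",
    "# alpha'(t) / (1 - alpha(t)) = -1/t for this schedule\n",
    "```"
   ]
  },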
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "3aa89227",
   "metadata": {},
   "outputs": [],
   "source": [
    "!pip install -q torch transformers datasets accelerate wandb einops"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "bed9312b",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Using device: cuda\n",
      "CUDA available: True\n",
      "GPU: Tesla T4\n",
      "Memory: 15.6 GB\n"
     ]
    }
   ],
   "source": [
    "import os\n",
    "os.environ['PYTORCH_CUDA_ALLOC_CONF'] = 'expandable_segments:True'\n",
    "\n",
    "import torch\n",
    "import torch.nn as nn\n",
    "import torch.nn.functional as F\n",
    "from torch.utils.data import DataLoader\n",
    "from torch.amp import autocast, GradScaler\n",
    "import math\n",
    "import time\n",
    "import numpy as np\n",
    "from dataclasses import dataclass\n",
    "from transformers import GPT2TokenizerFast\n",
    "from datasets import load_dataset\n",
    "from einops import rearrange\n",
    "\n",
    "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n",
    "print(f\"Using device: {device}\")\n",
    "print(f\"CUDA available: {torch.cuda.is_available()}\")\n",
    "if torch.cuda.is_available():\n",
    "    print(f\"GPU: {torch.cuda.get_device_name(0)}\")\n",
    "    print(f\"Memory: {torch.cuda.get_device_properties(0).total_memory / 1e9:.1f} GB\")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "c197184e",
   "metadata": {},
   "source": [
    "## Configuration"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "id": "82480781",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Estimated parameters: ~219M\n"
     ]
    }
   ],
   "source": [
    "@dataclass\n",
    "class MDLMConfig:\n",
    "    # Model\n",
    "    vocab_size: int = 50258        # GPT-2 vocab (50257) + 1 MASK token\n",
    "    mask_token_id: int = 50257     # Our added [MASK] token\n",
    "    hidden_dim: int = 768\n",
    "    num_heads: int = 12\n",
    "    num_layers: int = 12           # 12 layers fits T4 16GB\n",
    "    mlp_ratio: float = 4.0\n",
    "    dropout: float = 0.0\n",
    "\n",
    "    # Training\n",
    "    seq_len: int = 256\n",
    "    batch_size: int = 32           # T4 16GB \u2014 small batch, more accum\n",
    "    grad_accum_steps: int = 2      # Effective batch = 128\n",
    "    learning_rate: float = 3e-4\n",
    "    weight_decay: float = 0.01\n",
    "    warmup_steps: int = 1000\n",
    "    max_steps: int = 50_000       # ~8 hours on T4\n",
    "    ema_decay: float = 0.9999\n",
    "    max_grad_norm: float = 1.0\n",
    "\n",
    "    # Sampling\n",
    "    sampling_steps: int = 256      # Number of denoising steps at inference\n",
    "\n",
    "    # Logging\n",
    "    log_every: int = 100\n",
    "    sample_every: int = 2500\n",
    "    save_every: int = 5000\n",
    "\n",
    "config = MDLMConfig()\n",
    "\n",
    "n_params = (\n",
    "    config.vocab_size * config.hidden_dim +\n",
    "    config.num_layers * (4 * config.hidden_dim**2 + 4 * config.hidden_dim * int(config.hidden_dim * config.mlp_ratio)) +\n",
    "    config.hidden_dim * config.vocab_size\n",
    ")\n",
    "print(f'Estimated parameters: ~{n_params / 1e6:.0f}M')\n"
   ]
  },
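  {
   "cell_type": "markdown",
   "id": "b7c2d9e1",
   "metadata": {},
   "source": [
    "Working out the throughput this config implies: the effective batch is `batch_size \u00d7 grad_accum_steps = 32 \u00d7 2 = 64` sequences per optimizer step, i.e. `64 \u00d7 256 = 16,384` tokens. Over `max_steps = 50,000` that is roughly `0.82B` training tokens."
   ]
  },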
  {
   "cell_type": "markdown",
   "id": "e1d0beb6",
   "metadata": {},
   "source": [
    "## Noise Schedule\n",
    "\n",
    "Log-linear schedule: `\u03b1(t) = 1 - t` where `t \u2208 [0, 1]`\n",
    "\n",
    "- At `t=0`: `\u03b1=1`, all tokens are unmasked (clean data)\n",
    "- At `t=1`: `\u03b1=0`, all tokens are masked (pure noise)\n",
    "\n",
    "The MDLM paper proved the loss is **invariant** to the schedule form via change of variables, so simplest works best."
   ]
  },
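  {
   "cell_type": "markdown",
   "id": "c4e8f0a2",
   "metadata": {},
   "source": [
    "A quick numeric sketch of the schedule (variable names here are illustrative; the cell below does the actual plotting): near `t=0` few tokens are masked but each masked position carries a large loss weight `1/t`, while near `t=1` everything is masked with weight \u2248 1.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "t = np.linspace(0.1, 1.0, 4)\n",
    "alpha = 1.0 - t      # fraction of tokens kept clean\n",
    "p_mask = t           # per-token masking probability\n",
    "w = 1.0 / t          # |alpha'(t) / (1 - alpha(t))|, the NELBO weight magnitude\n",
    "for ti, a, wi in zip(t, alpha, w):\n",
    "    print(f't={ti:.2f}  alpha={a:.2f}  p_mask={ti:.2f}  weight={wi:.2f}')\n",
    "```"
   ]
  },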
  {
   "cell_type": "code",
   "execution_count": 5,
   "id": "47efd2e9",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAABdEAAAGGCAYAAACUkchWAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjAsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvlHJYcgAAAAlwSFlzAAAPYQAAD2EBqD+naQAAplBJREFUeJzs3XdYU2cbBvA77A0CAqIoOBFUQKiIglvBVXHUrbj3oNZa7eeorXW1WtxWrVZbratq68K6BfcCt6Ki4gAZCrIhOd8fgWgEFBVyAty/68p1yck5yZPQ8p48ec97SwRBEEBERERERERERERERHloiF0AEREREREREREREZG6YhOdiIiIiIiIiIiIiKgAbKITERERERERERERERWATXQiIiIiIiIiIiIiogKwiU5EREREREREREREVAA20YmIiIiIiIiIiIiICsAmOhERERERERERERFRAdhEJyIiIiIiIiIiIiIqAJvoREREREREREREREQFYBOdRDVq1Ci0bt26UPv27NkT3bt3L+aK1MeAAQNgZGQk2vMfO3YMEokE27dvF62GN0kkEnz33Xdil6H2fv/9d0gkEly4cKFYn4e/DyIqbjxHKBjPEZSVxjGJ4zkRkbgGDBgAe3v7jz5WzHG6KJ07dw46Ojp4+PDhe/cNDg6GkZERYmNji+05itP8+fPh6OgImUz23n0nT54MT0/PPNvj4+NhaGiIffv2FUeJJDI20Uk0kZGRWLNmDb799lvFtqdPn+K7775DWFhYnv2/+eYb/P333wgPD1dhlXlJJBLFTUNDA7a2tmjTpg2OHTsmal1vW758OX7//XexyyAiIjWVnJyMGTNmwM/PD+bm5pBIJGozbvAcoXjxHIGIiAqiqi/xCEhNTcV3332nFucJzZo1w4ABA/Js/9///odevXqhSpUqim0FnUf4+fmhevXqmDNnToHPs3v3bmhoaCA6Ovqdz3Hu3DmMGjUK7u7u0NbWhkQi+bgXVkhJSUmYN28evvnmG2hoyFul7/r9BAYGIjw8HP/++6/SdgsLCwwZMgTTpk0r1npJHGyik2gWLVoEBwcHNG/eXLHt6dOnmDlzZr4fkN3c3ODh4YEFCxaosMr8tW7dGn/88QfWr1+PESNG4MqVK2jRogX2798vdmkKpe0DclpaGqZOnSp2GUREpUZcXBy+//573Lx5Ey4uLmKXo4TnCMWL5whERETvtnr1aty+fbtYnyM1NRUzZ85UiyZ6fsLCwnDo0CGMGDFCafu7ziOGDx+OX3/9Fa9evcr3/r1798Ld3R02NjbvfI59+/ZhzZo1kEgkqFq16qe/mPdYu3YtsrOz0atXL8W2d/1+bGxs0KlTJ/z888957hsxYgQuXbqEI0eOFGfJJAI20UkUWVlZ2Lhx4wdfet29e3fs2LEDycnJxVRZ4dSsWRN9+/ZFv379MH36dBw8eBCCICAoKKjAY9LT0wt1WRDlT09PD1paWmKXQURUalSoUAHPnj3Dw4cP8dNPP4ldjgLPEehD8RyBiIiKmra2NnR1dcUuQ1Tr1q1D5cqV0bBhw0If07VrV2RkZGDbtm353r9v3z60b9/+vc8xcuRIJCYm4sKFC4Ve3u9TrFu3Dp9//jn09PQKfUz37t0RGhqK+/fvK22vXbs26tSpU6omLJAcm+hUpEJCQtCwYUPo6+vDwcEBy5YtAwD4+/ujT58+iv1CQ0MRFxeHVq1aKbYdO3YMn332GQBg4MCBisuh3/zD07p1a6SkpODgwYOqeUGFVLduXVhaWiIyMhLA67VCN2/ejKlTp6JixYowMDBAUlISAGDbtm1wd3eHvr4+LC0t0bdvXzx58iTfx75//z58fX1haGgIW1tbfP/99xAE4Z312Nvb4/r16zh+/LjifWzWrJnSY37xxRcwNzeHgYEBGjZsiL179773dWZkZKBDhw4wNTXFqVOnAAAymQxBQUFwdnaGnp4erK2tMXz4cLx48SJPTR06dEBoaCgaNGgAPT09VK1aFRs2bHjv8wJ51+x89eoVAgMDYW9vD11dXVhZWaF169a4dOnSOx+noLXtvvvuuzyXiEkkEowZMwa7du1CnTp1oKurC2dnZwQHB+d77J07d9C3b1+YmpqifPnymDZtGgRBQFRUFDp16gQTExPY2NjkmSmZmZmJ6dOnw93dHaampjA0NISPjw+OHj2ap87NmzfD3d0dxsbGMDExQd26dbFo0aJ3vuYXL16gQYMGqFSpkmI2RUZGBmbMmIHq1atDV1cXdnZ2mDRpEjIyMpSOzcjIwJdffony5cvD2NgYn3/+OR4/fvzO5yOikkFXV1cxC0gVeI7AcwR1P0fgeE5EVHiXL19G27ZtYWJiAiMjI7Rs2RJnzpxR2icrKwszZ85EjRo1oKenBwsLC3h7eyuN1dHR0Rg4cCAqVaoEXV1dVKhQAZ06dcKDBw8KfO5///0XEokEV65cUWz7+++/IZFI0KVLF6V9a9eujR49eiht+/PPPxVjrbm5OXr27ImoqCilffL73BgfH49+/frBxMQEZmZmCAgIQHh4eIFL4j158gT+/v4wMjJC+fLlMXHiREilUgDAgwcPUL58eQDAzJkzFWNy7niWlZWFW7du4dmzZwW+D4WxatUq1KhRA/r6+nB3d8fRo0chlUphaWmJ1atXv/PYXbt2oUWLFkqfk993HmFlZYV69erhn3/+yfN4V69eRVRUlFITPb/nAABra2vo6+t/5KuWu3btGlq3bg0DAwNUrFgR06dPhyAICAwMROPGjRX7RUZG4sqVK0rnnu/7/QBQ7J/fa23dujV279793vMyKlnYRKcic+rUKbRq1QrZ2dn46aef4OXlhTFjxmDHjh3477//0LFjR6V9JRIJ3NzcFNtq166N77//HgAwbNgw/PHHH/jjjz/QpEkTxT5OTk7Q19fHyZMnVffCCuHFixd48eIFLCwslLb/8MMP2Lt3LyZOnIjZs2dDR0cHv//+O7p37w5NTU3MmTMHQ4cOxY4dO+Dt7Y2XL18qHS+VSuHn5wdra2vMnz8f7u7umDFjBmbMmPHOeoKCglCpUiU4Ojoq3sf//e9/AICYmBg0atQIBw4cwKhRo/Djjz8iPT0dn3/+OXbu3FngY6alpaFjx444deoUDh06hEaNGgGQX6719ddfo3Hjxli0aBEGDhyIjRs3wtfXF1lZWUqPcffuXXTr1g2tW7fGggULUK5cOQwYMADXr18v7FutMGLECKxYsQJdu3bF8uXLMXHiROjr6+PmzZsf/FjvEhoailGjRqFnz56YP38+0tPT0bVrV8THx+fZt0ePHpDJZJg7dy48PT0xa9YsBAUFoXXr1qhYsSLmzZuH6tWrY+LEiThx4oTiuKSkJKxZswbNmjXDvHnz8N133yE2Nha+vr5KyxYcPHgQvXr1Qrly5TBv3jzMnTsXzZo1e+f/D3FxcWjRogViYmJw/Phx1KpVCzKZDJ9//jl+/vlndOzYEUuWLIG/vz9++eWXPCeYQ4YMQVBQENq0aYO5c+dCW1tb6aSHiKg
weI7Ac4SSdI7A8ZyI6N2uX78OHx8fhIeHY9KkSZg2bRoiIyPRrFkznD17VrHfd999h5kzZ6J58+ZYunQp/ve//6Fy5cpKX2p27doVO3fuxMCBA7F8+XKMGzcOr169wqNHjwp8fm9vb0gkEqW/wSEhIdDQ0EBoaKhiW2xsLG7duqV0vvDjjz+if//+qFGjBhYuXIjAwEAcPnwYTZo0yTPWvkkmk6Fjx47466+/EBAQgB9//BHPnj1DQEBAvvtLpVL4+vrCwsICP//8M5o2bYoFCxZg1apVAIDy5ctjxYoVAIDOnTsrxuTcLwGePHmC2rVrY8qUKQXW9D4LFizA8OHD4erqioULF0JTUxOdOnXC33//jYSEBHTo0KHAY588eYJHjx6hfv36StvfdR6Ry93dXfGF+pv27dsHKysreHh4vPM5isK9e/fg7e2NBw8eYPbs2ejcuTN++OEH/Prrr9i1a1eec08ASnW87/cDAKampqhWrVq+47e7uztevnz5UecxpMYEoiLSokULwcjISEhISBAEQRBkMpng6uoq2NjYCFpaWsKLFy8U+/bt21ewsLDI8xjnz58XAAjr1q0r8Hlq1qwptG3btqjLLzQAwuDBg4XY2Fjh+fPnwtmzZ4WWLVsKAIQFCxYIgiAIR48eFQAIVatWFVJTUxXHZmZmClZWVkKdOnWEtLQ0xfY9e/YIAITp06crtgUEBAgAhLFjxyq2yWQyoX379oKOjo4QGxv7zjqdnZ2Fpk2b5tkeGBgoABBCQkIU2169eiU4ODgI9vb2glQqVXoN27ZtE169eiU0bdpUsLS0FC5fvqw4LiQkRAAgbNy4Uek5goOD82yvUqWKAEA4ceKEYtvz588FXV1d4auvvnrnaxEE+fs+Y8YMxc+mpqbC6NGj33vc2wICAoQqVark2T5jxgzh7T+JAAQdHR3h7t27im3h4eECAGHJkiV5jh02bJhiW3Z2tlCpUiVBIpEIc+fOVWx/8eKFoK+vLwQEBCjtm5GRofTcL168EKytrYVBgwYpto0fP14wMTERsrOzC3x969atEwAI58+fF549eyY4OzsLVatWFR48eKDY548//hA0NDSU/hsQBEFYuXKlAEA4efKkIAiCEBYWJgAQRo0apbRf79698/w+iKhkK8z4+yl4jsBzhFzqfI7A8ZyISPnvT0H8/f0FHR0d4d69e4ptT58+FYyNjYUmTZootrm4uAjt27cv8HFevHghABB++umnD67T2dlZ6N69u+Ln+vXrC1988YUAQLh586YgCIKwY8cOAYAQHh4uCIIgPHjwQNDU1BR+/PFHpce6evWqoKWlpbT97c+Nf//9twBACAoKUmyTSqVCixYt8pyf5I7T33//vdLzuLm5Ce7u7oqfY2NjC/w7HBkZKQBQGmc+REpKimBkZCR4e3sLMplMEARBiImJEbS1tQUbGxvBw8PjnccfOnRIACDs3r07z30FnUfkmj17tgBAiImJUdru4+Oj9Hre9RxvGj16dJ7P6u8zaNAgQSKRCLdv31Zs69Spk2BjYyMAEK5du6bYPnXqVAGA8OrVK6XHeNfvJ1ebNm2E2rVr59l+6tQpAYCwZcuWD6qb1BtnolORyMrKQmhoKDp06IBy5coBkF9a26FDB0RHR8PHxwdmZmaK/ePj4xX7fahy5cohLi6uKMr+aL/99hvKly8PKysreHp64uTJk5gwYQICAwOV9gsICFC6BOnChQt4/vw5Ro0apbTWVvv27eHo6Jjv5dJjxoxR/Dt3eZHMzEwcOnToo2rft28fGjRoAG9vb8U2IyMjDBs2DA8ePMCNGzeU9k9MTESbNm1w69YtHDt2DK6uror7tm3bBlNTU7Ru3RpxcXGKm7u7O4yMjPJcvuzk5AQfHx/Fz+XLl0etWrXyrCFWGGZmZjh79iyePn36wcd+iFatWqFatWqKn+vVqwcTE5N8ax4yZIji35qamvDw8IAgCBg8eLBS3W+/Zk1NTejo6ACQz3BISEhAdnY2PDw8lGZpmJmZFXqpgsePH6Np06bIysrCiRMnlJLOt23bhtq1a8PR0VHp99aiRQsAUPze9u3bBwAYN26c0mO//d85EdG78ByB5wgl7RyB4zkRUcGkUin+++8/+Pv7KwU+VqhQAb1790ZoaKhiiTIzMzNcv34dERER+T6Wvr4+dHR0cOzYsTxLfb2Pj48PQkJCAMiX8QoPD8ewYcNgaWmp2B4SEgIzMzPUqVMHALBjxw7IZDJ0795d6e+mjY0NatSoke/yW7mCg4Ohra2NoUOHKrZpaGhg9OjRBR7zdlimj49Pocc1e3t7CILw0etqnz9/HsnJyejbt69iqZTcc5Po6Gilmdj5yb3y+mPOyXKPefOc7OXLlzh9+rTSVVCf8hzvc+TIEXh6eqJmzZqKbR07dkR0dDQcHBzg7OysVIeWlhaMjIw++HkKOvfM7z2gko8JPFQk4uLikJmZqfQHCoDiUuz8/kALH7k2lCAIedbLeltCQgIyMzM/6vHNzc0VH4AK0qlTJ4wZMwYSiQTGxsZwdnaGoaFhnv0cHByUfn748CEAoFatWnn2dXR0VLr0DJAPym8nUee+x+9aI+5dHj58CE9Pzzzba9eurbg/9yQDkH/ASk9Px+XLl5UGGgCIiIhAYmIirKys8n2u58+fK/1cuXLlPPuUK1fug0+YAGD+/PkICAiAnZ0d3N3d0a5dO/Tv37/Ik7s/pOa39zU1NYWenh4sLS3zbH97OZj169djwYIFuHXrltIl7m/+NzRq1Chs3boVbdu2RcWKFdGmTRt0794dfn5+eWrp168ftLS0cPPmzTxrHkdERODmzZuKNd7elvt7e/jwITQ0NJS+RADy/++XiMqWtLQ0JCYmKm0raH11niPwHCE/6nyOwPGciKhgsbGxSE1NzfdvSO3atSGTyRAVFQVnZ2d8//336NSpE2rWrIk6derAz88P/fr1Q7169QDI81nmzZuHr776CtbW1mjYsCE6dOiA/v37vze3xcfHBytXrsTdu3dx7949SCQSeHl5KZrrQ4cORUhICBo3bgwNDfn80YiICAiCgBo1auT7mNra2gU+38OHD1GhQgUYGBgoba9evXq+++vp6eX5+/yx49q7FHROlvtFcn7nX6Ghoe9touf6mHOy3GPePCc7cOAAAKBNmzZF8hy5oqOjlX42NTWFvr4+nj59qrSMD/Duc8+PVdC5Z37vAZV8bKJTkcidMfX2H4jcmWVv//GysLD46MHjxYsXBQ56ubp06YLjx49/1OMfPXpUKRgjP5UqVVIKnSjIpwZhqINOnTph8+bNmDt3LjZs2KA4AQHks6ysrKywcePGfI99+6RBU1Mz3/0+ZtDs3r07fHx8sHPnTvz333/46aefMG/ePOzYsQNt27Yt8LiCBrHcgJe3fUjN+e1bmOP//PNPDBgwAP7+/vj6669hZWWlWA/33r17iv2srKwQFhaGAwcOYP/+/di/fz/WrVuH/v37Y/369UqP36VLF2zYsAGLFi3CnDlzlO6TyWSoW7cuFi5cmG9tdnZ2+W4nIsq1Zc
sWDBw4UGlbQX/LeY6QP54jqN85wrtq4XhORPThmjRpgnv37uGff/7Bf//9hzVr1uCXX37BypUrFVf9BAYGomPHjti1axcOHDiAadOmYc6cOThy5IhSPsrbcq+aOnHiBO7fv4/69esrAp0XL16M5ORkXL58GT/++KPiGJlMBolEgv379+f7d/1jZiIXpKBxo6gVdE72rvMvY2Pjd763ABRZLh9zTpZ7zJtfPu/btw+NGzeGqalpkTxHrgoVKij9vG7dOgwYMAB6enofdO6ZnZ2NV69ewdjY+IOe/8WLF3m+ZM/dDiDf+6jkYhOdikS5cuVgaGiYJ/xj9+7dAOSBEe7u7ortjo6O2LhxIxITE5X+iL7vW7rs7GxERUXh888/f+d+CxYs+Og/xC4uLh91XGHkXoJ7+/ZtxaW2uW7fvq10iS4gH+Tv37+v9O3xnTt3ACBPUvjbCnovq1Spgtu3b+fZfuvWLaUac/n7+6NNmzYYMGAAjI2NFeEaAFCtWjUcOnQIjRs3FqUZUKFCBYwaNQqjRo3C8+fPUb9+ffz444/v/IBcrly5fANjcmcAimH79u2oWrUqduzYofR7yy8cTkdHBx07dkTHjh0hk8kwatQo/Prrr5g2bZrSLIixY8eievXqmD59OkxNTTF58mTFfdWqVUN4eDhatmz5zv/nqlSpAplMhnv37inNNMnvvx8iKlt8fX0LtRQFwHOEwuI5QtH6mHOET8XxnIjKgvLly8PAwKDA8UJDQ0PpSzxzc3MMHDgQAwcORHJyMpo0aYLvvvtOaemsatWq4auvvsJXX32FiIgIuLq6YsGCBfjzzz8LrKNy5cqoXLkyQkJCcP/+fcWSYE2aNMGECROwbds2SKVSpYZptWrVIAgCHBwc8szQfp8qVarg6NGjSE1NVZqNfvfu3Q96nDcVxSzlgs7Jcn8H+Z1/vXr1Ks951tscHR0BAJGRkXnue1/dkZGRsLS0VHxhLggCgoODMXHixEI/R2G9/dpzr4yzs7N757lnQXXkXiUBFO73ExkZme/5Ye5ryr2aj0oHrolORaZp06bYtWuXYv2zlJQUxeyjNxO6AcDLywuCIODixYtK23Mvdy4oFfvGjRtIT09Ho0aN3lmLu7s7WrVq9VG34liPK5eHhwesrKywcuVKZGRkKLbv378fN2/eVFofLNfSpUsV/xYEAUuXLoW2tjZatmz5zucyNDTM931s164dzp07h9OnTyu2paSkYNWqVbC3t4eTk1OeY/r374/Fixdj5cqV+OabbxTbu3fvDqlUih9++CHPMdnZ2e9MN/8UUqk0zyVrVlZWsLW1VXpf81OtWjUkJibiypUrim3Pnj3Dzp07i6XWwsidpfDmbLazZ88q/Y4A5LlkXENDQzHI5/e6p02bhokTJ2LKlClKjY3u3bvjyZMnWL16dZ5j0tLSkJKSAgCKRsPixYuV9gkKCirsSyOiUqpChQp5xs934TnC+/EcoWh8yjnCp+J4TkRlgaamJtq0aYN//vlHafmwmJgYbNq0Cd7e3jAxMQGQ9++dkZERqlevrvhbl5qaivT0dKV9qlWrBmNj40L9zfbx8cGRI0dw7tw5RRPd1dUVxsbGmDt3LvT19ZW+qO/SpQs0NTUxc+bMPFc6CYKQp943+fr6IisrS+lvrkwmw7Jly95bZ0Fym/H5jYlZWVm4desWnj179s7HKOiczNnZGRYWFtiwYYNi33PnziEsLEzx73epWLEi7OzscOHChTz3FXQekevixYvw8vJS/Hz+/Hk8f/48z7nMu56jsN5+7bkz05s2bYqQkBBFM1smk2HNmjUA8j/3BJCnjnf9fgB5Nsy9e/fyPfe8ePEiTE1N8yx3RyUbZ6JTkZk0aRKaN2+OZs2aYdCgQfjnn3+QlJSE9u3bY8WKFbC3t0fv3r1haGgIb29vWFhY4NChQ0qzrapVqwYzMzOsXLkSxsbGMDQ0hKenp2IdyYMHD8LAwACtW7cW62V+Em1tbcybNw8DBw5E06ZN0atXL8TExGDRokWwt7fHl19+qbS/np4egoODERAQAE9PT+zfvx979+7Ft99+W+D6l7nc3d2xYsUKzJo1C9WrV4eVlRVatGiByZMn46+//kLbtm0xbtw4mJubY/369YiMjMTff/+tdCn2m8aMGYOkpCT873//g6mpKb799ls0bdoUw4cPx5w5cxAWFoY2bdpAW1sbERER2LZtGxYtWoRu3boV2fuX69WrV6hUqRK6desGFxcXGBkZ4dChQzh//jwWLFjwzmN79uyJb775Bp07d8a4ceOQmpqKFStWoGbNmkqhX6rUoUMH7NixA507d0b79u0RGRmJlStXwsnJCcnJyYr9hgwZgoSEBLRo0QKVKlXCw4cPsWTJEri6uhb4DfdPP/2ExMREjB49GsbGxujbty/69euHrVu3YsSIETh69CgaN24MqVSKW7duYevWrThw4AA8PDzg6uqKXr16Yfny5UhMTESjRo1w+PDhT5ptQUTqZenSpXj58qVi3czdu3fj8ePHAOQzYN81Q+lD8Bzh/XiOUDQ+5RzhU3E8J6LSZO3atQgODs6zffz48Zg1axYOHjwIb29vjBo1ClpaWvj111+RkZGB+fPnK/Z1cnJCs2bN4O7uDnNzc1y4cAHbt29XBGPfuXMHLVu2RPfu3eHk5AQtLS3s3LkTMTEx6Nmz53tr9PHxwcaNGyGRSBTLu2hqaqJRo0Y4cOAAmjVrppRlUq1aNcyaNQtTpkzBgwcP4O/vD2NjY0RGRmLnzp0YNmxYntnSufz9/dGgQQN89dVXuHv3LhwdHfHvv/8iISEBwMfNKtfX14eTkxO2bNmCmjVrwtzcHHXq1EGdOnXw5MkT1K5dGwEBAR8VLqqvr49x48ZhxowZ6NKlC1q2bImFCxfCyckJgiBg8uTJSE5ORufOnQt8jE6dOmHnzp151v0u6DwCkOdxXLlyRSlwde/evQV+GV/Qczx8+BB//PEHgNfN7VmzZgGQXxXQr1+/d77+wMBArFmzBq1atcLYsWNx5swZXLt2DV26dMH27dvRoEED9OzZE9bW1qhatSrq1KmDQ4cOYdCgQUrvYUG/HwA4dOgQBEFAp06d8jz/wYMH0bFjR66JXtoIREXor7/+EpycnARtbW3BxsZG2LZtm/D06VOhSZMmgkQiESIjIxX7jhs3TqhevXqex/jnn38EJycnQUtLSwAgrFu3TnGfp6en0LdvXxW8koIBEEaPHv3OfY4ePSoAELZt25bv/Vu2bBHc3NwEXV1dwdzcXOjTp4/w+PFjpX0CAgIEQ0ND4d69e0KbNm0EAwMDwdraWpgxY4YglUrfW2d0dLTQvn17wdjYWAAgNG3aVHHfvXv3hG7duglmZmaCnp6e0KBBA2HPnj2Feg2TJk0SAAhLly5VbFu1apXg7u4u6OvrC8bGxkLdunWFSZMmCU+fPlXsU6VKFaF9+/Z56mzatKlSbQUBIMyYMUMQBEHIyMgQvv76a8HFxUUwNjYWDA0NBRcXF2H58uXvfRxBEIT//vtPqFOnjqCjoyPUqlVL+PPPP
4UZM2YIb/9JLOh3XaVKFSEgIEDxc+6xsbGxSvvl/g7f1rRpU8HZ2Vnxs0wmE2bPni1UqVJF0NXVFdzc3IQ9e/YIAQEBQpUqVRT7bd++XWjTpo1gZWUl6OjoCJUrVxaGDx8uPHv2TLHPunXrBADC+fPnFdukUqnQq1cvQUtLS9i1a5cgCIKQmZkpzJs3T3B2dhZ0dXWFcuXKCe7u7sLMmTOFxMRExbFpaWnCuHHjBAsLC8HQ0FDo2LGjEBUVpfT7IKKSq0qVKgKAfG9vjtlFgecIcjxHUN9zBI7nRESv//4UdIuKihIEQRAuXbok+Pr6CkZGRoKBgYHQvHlz4dSpU0qPNWvWLKFBgwaCmZmZoK+vLzg6Ogo//vijkJmZKQiCIMTFxQmjR48WHB0dBUNDQ8HU1FTw9PQUtm7dWqhar1+/LgAQateuned5AQjTpk3L97i///5b8Pb2FgwNDQVDQ0PB0dFRGD16tHD79m3FPm//7RYEQYiNjRV69+4tGBsbC6ampsKAAQOEkydPCgCEzZs3Kx2b37iR32fOU6dOCe7u7oKOjo7S3+TIyEgBgNLnzg+VnZ0tTJs2TahUqZKgra0tODk5CTdv3hRCQ0OFqlWr5nl9b7t06ZIAQAgJCVHa/q7ziBUrVggGBgZCUlKSYpuHh4cwatSoD3qO3HON/G6FOTcQBEE4dOiQ4OHhoTinWrx4sfDq1SuhU6dOgqampnD06FHFvgsXLhSMjIyE1NRUpcco6PcjCILQo0cPwdvbO8/z3rx5UwAgHDp0qFB1UskhEYRPiMEl+gT379+Ho6Mj9u/f/97LjgEgLCwM9evXx6VLl+Dq6lr8BRIREZEoeI5AREREJcGuXbvQuXNnhIaGonHjxmKXU+RatmwJW1tbxazw93Fzc0OzZs3wyy+/AJAv81OhQgXs2bMH7dq1K5LnKA6JiYmoWrUq5s+fj8GDB793/+joaDg4OGDz5s15ZqIHBgbixIkTuHjxImeilzJsopOoRo4cibt37xYqnKxnz56QyWTYunWrCiojIiIiMfEcgYiIiNRJWlqaUli2VCpFmzZtcOHCBURHR4sSpF3czp49Cx8fH0REROQJGH9bcHAwunXrhvv378PKygqAfMmeTZs24Ztvvinw/fmQ5yhO8+bNw7p163Djxo0Cl7DLNXnyZMV6/G+Kj49HlSpVsHXr1gK/NKCSi010IiIiIiIiIiKidxgyZAjS0tLg5eWFjIwM7NixA6dOncLs2bMxZcoUscsjomLGJjoREREREREREdE7bNq0CQsWLMDdu3eRnp6O6tWrY+TIkYqgVCIq3dhEJyIiIiIiIiIiIiIqwLsX+SEiIiIiIiIiIiIiKsPYRCciIiIiIiIiIiIiKoCW2AWomkwmw9OnT2FsbAyJRCJ2OUREREoEQcCrV69ga2v73lT4soZjOBERqTOO4fnj+E1EROqssON3mWuiP336FHZ2dmKXQURE9E5RUVGoVKmS2GWoFY7hRERUEnAMV8bxm4iISoL3jd9lrolubGwMQP7GmJiYiFwNERGRsqSkJNjZ2SnGK3qNYzgREakzjuH54/hNRETqrLDjd5lroudePmZiYsIBnIiI1BYvd86LYzgREZUEHMOVcfwmIqKS4H3jNxdqIyIiIiIiIiIiIiIqAJvoREREREREREREREQFYBOdiIiIiIiIiIiIiKgAbKITERERERERERERERVA1Cb6iRMn0LFjR9ja2kIikWDXrl3vPebYsWOoX78+dHV1Ub16dfz+++/FXicREREp4xhOREREREREZYWoTfSUlBS4uLhg2bJlhdo/MjIS7du3R/PmzREWFobAwEAMGTIEBw4cKOZKiYiI6E0cw4mIiIiIiKis0BLzydu2bYu2bdsWev+VK1fCwcEBCxYsAADUrl0boaGh+OWXX+Dr61tcZRIREdFbOIYTERERERFRWVGi1kQ/ffo0WrVqpbTN19cXp0+fLvCYjIwMJCUlKd2IiIhItTiGExERqd77ll8TBAHTp09HhQoVoK+vj1atWiEiIkJpn4SEBPTp0wcmJiYwMzPD4MGDkZycrMJXQUREJL4S1USPjo6GtbW10jZra2skJSUhLS0t32PmzJkDU1NTxc3Ozk4VpRIREdEbOIYTERGp3vuWX5s/fz4WL16MlStX4uzZszA0NISvry/S09MV+/Tp0wfXr1/HwYMHsWfPHpw4cQLDhg1T1UsgIiJSCyWqif4xpkyZgsTERMUtKiqqSB8/9lVGkT4eERERyRX3GE5ERFTaP8+1bdsWs2bNQufOnfPcJwgCgoKCMHXqVHTq1An16tXDhg0b8PTpU8WM9Zs3byI4OBhr1qyBp6cnvL29sWTJEmzevBlPnz5V8ashIiIST4lqotvY2CAmJkZpW0xMDExMTKCvr5/vMbq6ujAxMVG6FZWrjxPhM/8Igg7dQbZUVmSPS0REVNqo2xhORERlW3JGNr7aGo62i0IQl1y6G+kFiYyMRHR0tNJya6ampvD09FQst3b69GmYmZnBw8NDsU+rVq2goaGBs2fP5vu4xbkc29rQSDSeewQ/H7hdZI9JRERUGCWqie7l5YXDhw8rbTt48CC8vLxEqSf4+jOkZ8kQdCgCPVedQVRCqih1EBERqTt1G8OJiKjsuvzoBdovDsHflx4jISUDJ+/GiV2SKKKjowEg3+XWcu+Ljo6GlZWV0v1aWlowNzdX7PO24lyOLSUjG09epiE+pWx+8UFEROIRtYmenJyMsLAwhIWFAZB/Ex4WFoZHjx4BkF/G3b9/f8X+I0aMwP379zFp0iTcunULy5cvx9atW/Hll1+KUT6+9nVEUA9XGOlq4cLDF2i3KAT/hD0RpRYiIiJVKuljOBERlT1SmYClRyLQbeVpPIxPRUUzfWwe5oVOrhXFLq1UKc7l2HS05C2MjGxeCU5ERKolahP9woULcHNzg5ubGwBgwoQJcHNzw/Tp0wEAz549U3wYBwAHBwfs3bsXBw8ehIuLCxYsWIA1a9bA19dXlPoBwN+tIvaP90H9ymZ4lZGN8ZvD8OWWMLxKzxKtJiIiouJWGsZwIiIqO568TEOv1Wfw8393IJUJ6Ohii33jfdDAwVzs0kRjY2MDAPkut5Z7n42NDZ4/f650f3Z2NhISEhT7vK04l2PLbaJnsolOREQqpiXmkzdr1gyCIBR4/++//57vMZcvXy7Gqj6cnbkBtg73wpIjd7HkSAR2Xn6CCw8TENTDDe5VyoldHhERUZErLWM4ERGVfnuuPMW3O64iKT0bhjqa+L5THXSpXxESiUTs0kTl4OAAGxsbHD58GK6urgCApKQknD17FiNHjgQgX47t5cuXuHjxItzd3QEAR44cgUwmg6enp8pr5kx0IiISi6hN9NJES1MDX7auCZ8alhi/OQxRCWno/utpjGtRA6ObV4OWZolafp6IiIiIiKhES8nIxnf/Xse2i48BAK52ZljU0xVVLAxFrkx1kpOTcffuXcXPucuvmZubo3LlyggMDMSsWbNQo0YNODg4YNq0abC1tYW/vz8AoHbt2vDz88PQ
oUOxcuVKZGVlYcyYMejZsydsbW1V/np0tTQBcCY6ERGpHpvoRczD3hz7A30wbdc1/BP2FL8cuoOQiFj80sMVduYGYpdHRERERERU6oVFvUTg5st4EJ8KDQkwunl1jGtZA9plbHLThQsX0Lx5c8XPEyZMAAAEBATg999/x6RJk5CSkoJhw4bh5cuX8Pb2RnBwMPT09BTHbNy4EWPGjEHLli2hoaGBrl27YvHixSp/LQCXcyEiIvFIhHddi10KJSUlwdTUFImJiUW6Nlt+dl5+jGm7riM5IxvGulr4sUtdfO6i+m/riYio5FDlOFXS8L0hIqL3kcoErDx+D78cvINsmYCKZvr4pYerStY+5ziVv6J8X4KvRWPEnxfhXqUc/h7ZqIgqJCKisqyw41TZ+hpexTq7VcK+ca9DR8f9dRkTtjJ0lIiIiIiIqKg9fZmG3qvP4KcDt5EtE9ChXoUyHx5a2uhyJjoREYmETfRiVtlCHjo6rmUNaEiAHZeeoP3iUFx+9ELs0oiIiIiIiEqFvVeewS/oBM5GJsBQRxM/f+GCJb3cYKqvLXZpVIReB4tKRa6EiIjKGjbRVUBLUwMTWtfEluFeqGimj0cJqei28jSWHI6AVFamVtMhIiIiIiIqMikZ2fh6WzhGb7qEpPRsuNiZYe84H3RzrwSJRCJ2eVTEOBOdiIjEwia6Cn1mb459433wuYstpDIBCw7eQa9VZ/DkZZrYpREREREREZUo4VEv0WFJKLZdfAyJBBjTvDq2j/CCvaWh2KVRMWGwKBERiYVNdBUz1dfGop6u+KWHC4x0tXDuQQL8gk5gd/hTsUsjIiIiIiJSe1KZgGVH76LrilOIjEtBBVM9/DW0ISb61oK2Jj/ilmaKJrqUTXQiIlItLbELKIskEgk6u1WCe2VzjN9yGZcfvcTYvy7j2O1YzOzkDCNd/lqIiIiIiIje9vRlGr7cEoazkQkAgHZ1bTCncz2YGnDt87JARzN3TXQ20YmISLX4Nb2IFKGjLapDQwL8fekx2i0KYegoERERERHRW/ZdfYa2i0JwNjIBBjqamN+tHpb1rs8GehnyOliUTXQiIlItNtFFpq2pgQltamHzMOXQ0aVHGDpKRERERESUkpGNSdvDMWrjJSSmZcGlkin2jvNBdw87hoeWMW+uiS4I/LxMRESqwya6mmjgIA8d7ZgTOvrzf3fQazVDR4mIiIiIqOy68lgeHrr1gjw8dFSzatg+shEcGB5aJulqaSr+nSVlE52IiFSHTXQ1YqqvjcU9XbHgCxcY6mjiXGQC2gadwJ4rDB0lIiIiIqKyQyoTsPzYXXRZ/jo8dNOQhpjk58jw0DJMV+v1757hokREpEo8+1AzEokEXd0rYd94H7jamSEpPRtjNl3GxG3hSM7IFrs8IiIiIiKiYvUsMQ191pzB/ODbyJYJaFfXBvvH+8CrmoXYpZHIdN74AiWT66ITEZEKsYmupqpYGGLbCC+MzQkd3X7xMdovZugoERERERGVXvuvPoNfUAjO3M8JD+0qDw81M9ARuzRSAxoaEmhpyNfBz8iWilwNERGVJWyiqzFtTQ189Ubo6MN4ho4SEREREVHpk5KRjW+2X8HInPDQernhoZ8xPJSUvRkuSkREpCpsopcAuaGjHepVUAodfcrQUSIiIiIiKuFyw0O3XIhShIf+zfBQKoAum+hERCQCNtFLCFN9bSzp5Yaf3wgd9Qs6gb1XnoldGhERERER0QeTyQSsPH5PER5qY6KHjUM8GR5K75Q7Ez2DTXQiIlIhLbELoMKTSCTo5l4JHlXKYfyWMIRHvcToTZdw7HYlfPe5Mwx1+eskIiIiIiL19ywxDRO2hOP0/XgAQNs6NpjTpS7XPqf3UiznImUTnYiIVIdf75dA9paG2D7CC2OaV4dEAmzLCR0Ni3opdmlERERERETvFHxNHh56+n489LU1Ma9rXSzvw/BQKhydnKsUMrLYRCciItVhE72E0tbUwETfWtg8tCFsTfXwID4V3VacwrKjdxk6SkREREREaic1MxuT/76CEX/Kw0PrVjTF3nHe6PFZZYaHUqHpaGkC4Ex0IiJSLTbRSzjPqhbYP74J2tergGyZgJ8O3EZvho4SEREREZEaufo4ER0Wh2LzeXl46Mic8NCq5Y3ELo1KGAaLEhGRGNhELwVMDbSxtJcbfupWDwY6mjjL0FEiIiIiIlIDivDQFSdx/43w0G/8HBVrWxN9CB020YmISAQ8ayklJBIJvvCww75xPnCpZIqk9GyM3nQJX28LR0pGttjlERERERFRGROdmI6+v53F3P23kCUV0LaODYIDfdComqXYpVEJppiJLpWKXAkREZUlbKKXMvaWhtg+shFGN6+mFDoaztBRIiIiIiJSkeBr0fBbdAKn7jE8lIpWbrAoZ6ITEZEqsYleCmlrauBrX0f8NbQhKuSEjnZl6CgRERERERWz1MxsTNlxBSP+vIiXqQwPpaKXu5xLBpvoRESkQmyil2INq1ogeHwTtK/L0FEiIiIiIipe154kosOSUPx1Th4eOrxpVYaHUpFjsCgREYmBTfRSztRAG0t7K4eOtl0Ugn1XGTpKRERERESfTiYT8Ovxe+i8/CTux6bA2kQXGwd7Ykrb2gwPpSLHmehERCQGntGUAbmho3tzQkcT07IwauMlTNrO0FEiIiIiIvp40Ynp6Lf2LObkhIf6OlsjeHwTNKrO8FAqHjqciU5ERCJgE70MccgJHR3VTB46uvXCY3RYEoorj1+KXRoREREREZUwB67Lw0NP3pWHh87pUhcr+7qjnCHDQ6n46GhqAgAypWyiExGR6rCJXsZoa2pgkp8jNg2Rh45GxqWgy/JTWHHsHmQMHSUiIiIioveQh4dexfA/5OGhdSqaYM84b/RqwPBQKn6K5Vyy2EQnIiLVYRO9jPKqZoH9433Qrq4NsmUC5gXfQp81Z/EskaGjRERERESUv9fhoY8AAMObVMWOkY1RjeGhpCKKYFGpVORKiIioLGETvQwzM9DBst71MT8ndPT0/Xj4BYUg+BpDR4mIiIiI6DWZTMCqE2+Fhw7xxJR2DA8l1eKa6EREJAae7ZRxEokE3XNCR+vlhI6O+PMSJv99BamZDB0lIiIiIirrYpLS0X/tOczeJw8PbeMkDw9tzPBQEoEum+hERCQCNtEJQE7o6IhGGJkTOrr5fBQ6LA7F1ceJYpdGREREREQi+e96NPyCTiD0bhz0tDUwu3Nd/NqP4aEkHsVMdAaLEhGRCrGJTgo6Whr4xs8RG4d4wsZED/fjUtBlxUmsPM7QUSIiIiKisiQtU4pvd17FsD8u4kVqFpxtTbBnrA96ezI8lMSlo8lgUSIiUj020SmPRtUsERzog7Z1bJAlFTB3/y30/e0sohPTxS6NiIiIiIiKmTw8NASbzsrDQ4c1qYodoxqhuhXDQ0l8nIlORERiYBOd8mVmoIPlfepjXte60NfWxKl78fBbdALB16LFLo2IiIiIiIqBTCZgTch9dFl+CvdiU2BlrIs/B3vi23a1oaulKXZ5RACg+G8xg2uiExGRCrG
JTgWSSCTo8Vll7BnnjToVTfAyNQsj/ryIKTsYOkpEREREVJrEJKUjYN05zNp7E5lSGVo7WSM4sAm8azA8lNSLDoNFiYhIBGyi03tVK2+EHSMbY3jTqpBIgL/OMXSUiIiIiKi0OHgjBn5BJxASIQ8P/bFzHazq5w5zhoeSGmITnYiIxMAmOhWKjpYGprStjY2DPWFtosvQUSIiIiKiEi4tU4r/7byKoRsu4EVqFpwqmGDPWG/08azC8FBSW4pg0WypyJUQEVFZwiY6fZBG1S0RPL4JfJ2tFaGj/dYydJSIiIiIqCS5/jQRHZeGYmNOeOhQHwfsHN0I1a2MRa6M6N0YLEpERGJgE50+WDlDHazs6445XeShoyfvykNHD1xn6CgRERERkTrLDQ/tvOwU7j5PhpWxLv4Y3AD/a+/E8FAqEXS5nAsREYmATXT6KBKJBL0aKIeODv/jIqbsuMrQUSIiIiIiNfT8rfDQVrXl4aE+NcqLXRpRobGJTkREYmATnT6JInS0SVUAwF/nHqHDklBce8LQUSIiIiIidXHwRgz8FoUowkNn+dfB6v4MD6WSh8GiREQkBtGb6MuWLYO9vT309PTg6emJc+fOvXP/oKAg1KpVC/r6+rCzs8OXX36J9HSuxy0mHS0NTGlXGxuH5ISOxqag8/KT+JWho0REpRrHcCIi9ZeWKcXUXfLw0ISUTNTOCQ/t25DhoVQy5TbRM9hEJyIiFRK1ib5lyxZMmDABM2bMwKVLl+Di4gJfX188f/483/03bdqEyZMnY8aMGbh58yZ+++03bNmyBd9++62KK6f8NH4rdHQOQ0eJiEotjuFEROrvxtMkdFwaij/PvA4P3cXwUCrhdDTlbYxsmcBJW0REpDKiNtEXLlyIoUOHYuDAgXBycsLKlSthYGCAtWvX5rv/qVOn0LhxY/Tu3Rv29vZo06YNevXq9d6Zb6Q6DB0lIiobOIYTEamv3PBQ/2UnGR5KpY6u9uv/hjOlnI1ORESqIVoTPTMzExcvXkSrVq1eF6OhgVatWuH06dP5HtOoUSNcvHhR8YH7/v372LdvH9q1a6eSmqlwckNHd4/1hrMtQ0eJiEobjuFEROrr+at0DPj9PMNDqVCkUimmTZsGBwcH6Ovro1q1avjhhx8gCK9neAuCgOnTp6NChQrQ19dHq1atEBERIVrNuTPRAS7pQkREqqMl1hPHxcVBKpXC2tpaabu1tTVu3bqV7zG9e/dGXFwcvL29IQgCsrOzMWLEiHdeCp6RkYGMjAzFz0lJSUXzAui9qlsZYceoRljw3x2sOnEff517hLOR8Vjc0w11KpqKXR4REX0kjuFEROrp8M0YfL39ChJSMqGrpYFpHZzQx7My1z6nAs2bNw8rVqzA+vXr4ezsjAsXLmDgwIEwNTXFuHHjAADz58/H4sWLsX79ejg4OGDatGnw9fXFjRs3oKenp/KatTVf//fMcFEiIlIV0YNFP8SxY8cwe/ZsLF++HJcuXcKOHTuwd+9e/PDDDwUeM2fOHJiamipudnZ2KqyYdLU08W272vhzsCesjF+Hjq46wdBRIqKyhGM4EVHxSc+SYtquaxi8nuGh9GFOnTqFTp06oX379rC3t0e3bt3Qpk0bxZVjgiAgKCgIU6dORadOnVCvXj1s2LABT58+xa5du0SpWSKRvBEuKhWlBiIiKntEa6JbWlpCU1MTMTExSttjYmJgY2OT7zHTpk1Dv379MGTIENStWxedO3fG7NmzMWfOHMhk+X8DPWXKFCQmJipuUVFRRf5a6P28a1giOLAJWjvJQ0dn77uF/mvPISaJoaNERCUNx3AiIvVx81kSOi4JxR9nHgIABnvLw0NrWDM8lN6vUaNGOHz4MO7cuQMACA8PR2hoKNq2bQsAiIyMRHR0tNISbqampvD09CxwCbeMjAwkJSUp3Yqabs6SLpyJTkREqiJaE11HRwfu7u44fPiwYptMJsPhw4fh5eWV7zGpqanQ0FAuWVNTHiry5pptb9LV1YWJiYnSjcRhbqiDVf3cMbtzXehpayD0bhz8gk7gP4aOEhGVKBzDiYjEJ5MJ+C00Ep2WnkTE82SUN9bFhkENMK0Dw0Op8CZPnoyePXvC0dER2tracHNzQ2BgIPr06QMAiI6Wf1bLbwm33PvepooryXS1c5roDBYlIiIVEW1NdACYMGECAgIC4OHhgQYNGiAoKAgpKSkYOHAgAKB///6oWLEi5syZAwDo2LEjFi5cCDc3N3h6euLu3buYNm0aOnbsqPggTupNIpGgt2dlNHAwx/jNl3H9aRKG/XERvT0rY1p7J+jr8PdIRFQScAwnIhLP81fp+HrbFRy/EwsAaFXbCvO61oOFka7IlVFJs3XrVmzcuBGbNm2Cs7MzwsLCEBgYCFtbWwQEBHzUY06ZMgUTJkxQ/JyUlFTkjXQdzkQnIiIVE7WJ3qNHD8TGxmL69OmIjo6Gq6srgoODFd9yP3r0SGnW2tSpUyGRSDB16lQ8efIE5cuXR8eOHfHjjz+K9RLoI70dOrrp7COcvR+Pxb3c4GzL0FEiInXHMZyISBxHbsXg621XEJ8THjq1gxP6MjyUPtLXX3+tmI0OAHXr1sXDhw8xZ84cBAQEKJZpi4mJQYUKFRTHxcTEwNXVNd/H1NXVha5u8X6hk7smOpvoRESkKhKhoGuoS6mkpCSYmpoiMTGRl4WridCIOEzYGobnrzKgo6mBSX61MKixAzQ0+EGAiMoejlMF43tDRGVZepYUs/fdxIbT8rXPHW2MsbiXG2py7XO1URLHKQsLC8yaNQsjR45UbJszZw7WrVuHO3fuQBAE2NraYuLEifjqq68AyF+nlZUVfv/9d0Xz/V2K431p88tx3IlJxsYhnmhc3bJIHpOIiMqmwo5Tos5EJwJeh45O2n4Fh27GYNbemzh+JxYLvnCBlYme2OUREREREYnqVnQSxv11GXdikgEAgxo7YJJfLehpczks+jS5V4VVrlwZzs7OuHz5MhYuXIhBgwYBkC/HGRgYiFmzZqFGjRpwcHDAtGnTYGtrC39/f9Hq5kx0IiJSNTbRSS2YG+pgdX93bDr3CD/suYGQiDj4LQrB/K710MrJ+v0PQERERERUygiCgHUnH2Bu8C1kZstgaaSLn7+oh2a1rMQujUqJJUuWYNq0aRg1ahSeP38OW1tbDB8+HNOnT1fsM2nSJKSkpGDYsGF4+fIlvL29ERwcDD098SY85YbnZrCJTkREKsLlXEjt3H3+CuP+CsONZ0kAgL4NK+N/7Rg6SkRlA8epgvG9IaKyJPZVBiZuC1eEh7Z0tML8bgwPVWccp/JXHO9Lr1VncDonU+tzF9sieUwiIiqbCjtOaRR4D5FIqlsZY+foRhjq4wAA+PPMI3RcGoobT5NEroyIiIiIqPgdvfUcfkEncPxOLHS1NPBDJ2esCfBgA50oB5dzISIiVWMTndSSrpYm/tfeCX8MboDyxrq4+zwZ/stOYk3IfchkZeriCSIiIiIqI9KzpJjxzzUM/P084lMy4WhjjN1jvdHPyx4SiUTs8ojURm4TPSNbKn
IlRERUVrCJTmrNp0Z5BI/3Qava1siUyjBr700M+P08nr9KF7s0IiIiIqIicys6CZ2WnsT60w8BAAMb22PX6MaoaW0scmVE6ocz0YmISNXYRCe1Z2Gki9X93THLvw70tDVw4k4s/IJCcPhmjNilERERERF9EkEQ8PvJSHy+9CRux7yCpZEufh/4GWZ0dIaeNjOBiPKjyyY6ERGpGJvoVCJIJBL0bVgFe8Z6o3YFEySkZGLw+guY/s81pGfxEj4iIiIiKnliX2Vg4O/n8d3uG8jMlqGFoxWCA33QrJaV2KURqTU20YmISNXYRKcSpbqVMXaNboQh3vLQ0Q2nH6LjklDcfMbQUSIiIiIqOY7eeo62i07g2O1Y6GhpYObnzvgtwAOWDA8lei8dzZwmupRNdCIiUg020anE0dXSxNQOTtgwSB46GvE8GZ2WnsRvoZEMHSUiIiIitZaeJcV3/17HwN/PIy45E7WsjbF7jDcCGjE8lKiwXgeLsolORESqwSY6lVhNauaGjlohUyrDD3tuMHSUiIiIiNTW7ehX6LT0JH4/9QAAMKCRPf4Z0xi1bBgeSvQhGCxKRESqxiY6lWjy0FEP/NDJGbpa8tDRtgwdJSIiIiI1khse2nFpaE54qA7WDfwM333O8FCij6GjKf//hjPRiYhIVdhEpxJPIpGgn5c9do/1hqONMeIZOkpEREREaiIuOQOD3ggPbVarPPaPb4LmDA8l+mi62pyJTkREqsUmOpUaNa2NsWt0Ywxq/Dp09POlDB0lIiIiInEcvf0cfkEncDQnPHRGRyesG/AZyhszPJToUzBYlIiIVI1NdCpV9LQ1Mb2jE34f+BksjXRxJyYZnZadxNrQSAgCQ0eJiIiIqPgpwkPXycNDa1ob4d8xjTGwsQPDQ4mKgCJYlFceExGRirCJTqVSs1pWOBDog5aOVsjMluH7PTcwYN15xL7KELs0IiIiIirFbke/gv8y5fDQf8d4w9HGRNzCiEoRRbAoZ6ITEZGKsIlOpZaFkS7WBLwOHT1+JxZ+QSdw5BZDR4mIiIioaAmCgPWnHqDj0lDcis4JDx3A8FCi4qCrxTXRiYhItdhEp1Itv9DRQb9fwAyGjhIRERFREYlLzsDg9Rcw49/ryuGhjgwPJSoObKITEZGqsYlOZUJu6OjAxvYAgPWnH6LT0pO4Fc3QUSIiIiL6eMduP4dfUAiO3HrO8FAiFeFyLkREpGpsolOZoaetiRkdnRWho7djXuHzpSex7iRDR4mIiIjow6RnSTFz93UMWHcecckZqGlthH9GMzyUSBV0NOVLJHEmOhERqQqb6FTmNKtlheBAH7TICR2dufsGBv7O0FEiIiIiKpw7MfLw0HUnHwAAAryq4N8x3qhdgeGhRKqQOxM9g010IiJSETbRqUyyNNLFbwEe+D4ndPTY7Vi0XXQCR289F7s0IiIiIlJTivDQJfLwUAtDHawd4IGZneowPJRIhXS4JjoREakYm+hUZkkkEvR/I3Q0LjkTA38/j+/+vc7QUSIiIiJSEpecgSE54aEZ2TI0rVke+wN90MLRWuzSiMocXc5EJyIiFWMTncq8t0NHfz/1AJ2WnsTt6FfiFkZEREREauH4nVj4BYXg8K3n0NHUwPQO8vBQK2M9sUsjKpNez0Tn5CciIlINNtGJ8Dp0dN3Az2BppIPbMa/QcWkofmfoKBEREVGZlZ4lxfe7byBg7TnEJWeghpUR/hnTGIO8HaChwfBQIrHoaOY00aWciU5ERKrBJjrRG5rXssL+8U3QvFZ5ZGbL8N3uGxjE0FEiIiKiMiciJzx07clIAEB/ryrYPZbhoUTq4M3lXDjpiYiIVIFNdKK3lDfWxdoBn2Hm587Q0dLA0dzQ0dsMHSUiIiIq7QRBwB+nH6BDTniouaFOTiA9w0OJ1EXuci6CAGTL2EQnIqLixyY6UT4kEgkCGtlj9xhv1LLOCR1dx9BRIiIiotIsPjkDQzdcwLR/5OGhTWqWR3CgD1rWZngokTrR1Xr9hVYmw0WJiEgF2EQneodaNsb4Z0xjDGhkD0AeOuq/jKGjRERERKVNSEQs/BaF4NBNeXjotA5O+J3hoURqKXcmOsAmOhERqQab6ETvoaetie8+d8a6AfLQ0VvRr/D50lCsP/WA6+8RERERlXAZ2VLM2nMD/X47h9hX8vDQXaMbYzDDQ4nUlqaGBJo5/38yXJSIiFSBTXSiQmruKA8dbVarPDKyZZjx73UMWX8BcckMHSUiIiIqie4+f4XOy05hTag8PLRfQ3l4qJMtw0OJ1J2OZk64aBab6EREVPzYRCf6AOWNdbFuwGeY0dEJOloaOHzrOfyCQnCMoaNEREREJYYgCPjzzEN0WBKKG8+SYG6ogzX9PfCDP8NDiUqK3CVdMqXMrCIiouLHJjrRB5JIJBjY2AH/jmmMmtZGiEvOwIB15/H97hsMHSUiIiJScwkpmRj2x0VM3XUN6Vky+NSwRPB4H7RyYngoUUmim9NEz+Ca6EREpAJsohN9JEcbE/w7xlsROrr2ZCT8l53EnRiGjhIRERGpo5CIWPgGncDBGzHQ0dTA1Pa1sX5gA1iZMDyUqKRRzERnE52IiFSATXSiT/Bm6KiFoTx0tOOSUGw4zdBRIiIiInWRkS3Fj3tfh4dWzwkPHeJTleGhRCUUm+hERKRKbKITFYHmjlbYH+iDpjXloaPT/5GHjsYzdJSIiIhIVHefJ6PzslNYHSIPD+3bsDJ2j2F4KFFJpwgWZROdiIhUgE10oiJiZayH3wfmhI5qykNHfYNCcPxOrNilEREREZU5giBg49mH6LAkBDeeJaGcgTZW9/fALP+60NdheChRSafLmehERKRCbKITFaHc0NF/xjRGDSt56GjA2nP4fvcNZGQzdJSIiIhIFRJSMjH8j4v4387X4aEHApugNcNDiUoNXS35l2GZUjbRiYio+LGJTlQMalcwwe6x3ujvVQVAbujoKUQwdJSIiIioWIVGxMEv6AT+uxEDbU0Jw0OJSimuiU5ERKrEJjpRMdHT1sT3nergtwAPWBjq4OazJHRYEoo/zjxk6CgRERFREcvIlmL2vpvo+9tZPH+VgWrlDbFzFMNDiUorNtGJiEiV2EQnKmYta1srhY5O23UNQzcwdJSIiIioqNx9nowuy09h1Yn7AIA+npWxZ6wP6lQ0FbkyIiour4NFuWwmEREVPzbRiVTAylgP6wZ8hukd5KGjh24+h9+iEJxg6CgRERHRR3szPPT6U3l46Kp+7vixM8NDiUq73JnoGZyJTkREKsAmOpGKaGhIMMjbAbtGy0NHY19loP/ac/hhD0NHiYiIiD7Ui7fCQ72rWyI4sAnaONuIXRoRqYBiORcGixIRkQqwiU6kYk628tDRfg3loaO/hTJ0lIiIiOhDnLwbB79Fr8ND/9euNjYMagBrhocSlRm6XBOdiIhUiE10IhHoaWviB/86WNPfA+YMHSUiIiIqlMxsGebkhIfGJGWgak546NAmDA8lKmsYLEpERKrEJjqRiFo5WSN4vA98algydJSIiIjoHe7FJqPLipP49
cR9CALQ27My9jI8lKjM4proRESkSqI30ZctWwZ7e3vo6enB09MT586de+f+L1++xOjRo1GhQgXo6uqiZs2a2Ldvn4qqJSp6ViZ6WD+wAaa9FToaEsHQUSJSbxzDiUgVBEHAprOP0H5xCK49kYeH/trPHbMZHkpUpulqciY6ERGpjqhN9C1btmDChAmYMWMGLl26BBcXF/j6+uL58+f57p+ZmYnWrVvjwYMH2L59O27fvo3Vq1ejYsWKKq6cqGhpaEgw+K3Q0X6/ncMsho4SkZriGE5EqvAiJRMj/ryIb3deRXqWDI2rWyA4sAl8GR5KVGSePHmCvn37wsLCAvr6+qhbty4uXLiguF8QBEyfPh0VKlSAvr4+WrVqhYiICBErluNyLkREpEqiNtEXLlyIoUOHYuDAgXBycsLKlSthYGCAtWvX5rv/2rVrkZCQgF27dqFx48awt7dH06ZN4eLiouLKiYqHk60J/h3zOnR0TWgkOi87hbvPGTpKROqFYzgRFbdTOeGhB67Lw0O/beeIPwZ5MjyUqAi9ePECjRs3hra2Nvbv348bN25gwYIFKFeunGKf+fPnY/HixVi5ciXOnj0LQ0ND+Pr6Ij09XcTKAV0t+ZUomVI20YmIqPiJ1kTPzMzExYsX0apVq9fFaGigVatWOH36dL7H/Pvvv/Dy8sLo0aNhbW2NOnXqYPbs2ZBKC56pm5GRgaSkJKUbkTrT15GHjq7OCR29kRM6+idDR4lITXAMJ6LilBse2uet8NBhTaoxPJSoiM2bNw92dnZYt24dGjRoAAcHB7Rp0wbVqlUDIJ+FHhQUhKlTp6JTp06oV68eNmzYgKdPn2LXrl2i1s6Z6EREpEqiNdHj4uIglUphbW2ttN3a2hrR0dH5HnP//n1s374dUqkU+/btw7Rp07BgwQLMmjWrwOeZM2cOTE1NFTc7O7sifR1ExaX1G6Gj6VkyTN11DcP+uIiElEyxSyOiMo5jOBEVl7fDQ3s1qIw9Y70ZHkpUTP799194eHjgiy++gJWVFdzc3LB69WrF/ZGRkYiOjlb64tzU1BSenp4FfnGuKq+DRbn8JRERFT/Rg0U/hEwmg5WVFVatWgV3d3f06NED//vf/7By5coCj5kyZQoSExMVt6ioKBVWTPRpckNHp7avDR1NDRy8EQPfoBMMHSWiEodjOBG9iyAI+OvcI3RYHIprT5JgZqCNlX3dMadLXRjoaIldHlGpdf/+faxYsQI1atTAgQMHMHLkSIwbNw7r168HAMWX4x/yxbmqriTT0cxtonMmOhERFT/RzkgtLS2hqamJmJgYpe0xMTGwsck/KKhChQrQ1taGpqamYlvt2rURHR2NzMxM6Ojo5DlGV1cXurq6RVs8kQppaEgwxKcqvKpZYPzmMNx9nox+v53DUB8HTPStpVgLkIhIVTiGE1FRepGSick7ruDAdfnflEbVLLCwuytsTLn2OVFxk8lk8PDwwOzZswEAbm5uuHbtGlauXImAgICPesw5c+Zg5syZRVlmvricCxERqZJoM9F1dHTg7u6Ow4cPK7bJZDIcPnwYXl5e+R7TuHFj3L17FzLZ60Hyzp07qFChQr4fvolKE2dbU+we442+DSsDAFaH5IaOJotcGRGVNRzDiaionLobh7aLQhThoVPaOuLPwZ5soBOpSIUKFeDk5KS0rXbt2nj06BEAKL4c/5AvzlV1JZlubhOdwaJERKQCoi7nMmHCBKxevRrr16/HzZs3MXLkSKSkpGDgwIEAgP79+2PKlCmK/UeOHImEhASMHz8ed+7cwd69ezF79myMHj1arJdApFL6OpqY5V8Xq/t7oJyBdk7oaAg2nmXoKBGpFsdwIvoUmdkyzN1/C31+O4vopHRUtZSHhw5vyvBQIlVq3Lgxbt++rbTtzp07qFKlCgDAwcEBNjY2Sl+cJyUl4ezZswV+ca6rqwsTExOlW3HgTHQiIlIlURcY7NGjB2JjYzF9+nRER0fD1dUVwcHBivXWHj16BA2N131+Ozs7HDhwAF9++SXq1auHihUrYvz48fjmm2/EeglEomjtZI0DgU3w1bZwhETE4X87r+H47VjM61oP5Qw5o5OIih/HcCL6WPdjkzF+cxiuPkkEAPRqYIdpHZy49jmRCL788ks0atQIs2fPRvfu3XHu3DmsWrUKq1atAgBIJBIEBgZi1qxZqFGjBhwcHDBt2jTY2trC399f1NpfB4uyiU5ERMVPIpSx6atJSUkwNTVFYmJisX0jTqQqMpmAtScjMS/4FrKkAqxNdLGwuysaV7cUuzQi+kgcpwrG94aoZBMEAVsvROG7f28gLUsKMwNtzO1SD3518l8SgqikKanj1J49ezBlyhRERETAwcEBEyZMwNChQxX3C4KAGTNmYNWqVXj58iW8vb2xfPly1KxZs1CPX1zvy8WHCei64jQqmxvgxKTmRfa4RERUthR2nGITnagUuP40EeP+uox7sSmQSIChPlXxVZuaDB0lKoGKa5x6+fIldu7ciZCQEDx8+BCpqakoX7483Nzc4Ovri0aNGhXZcxUXjuFEJdfL1ExM2XEV+69FA2B4KJVORTVOff/995g4cSIMDAyUtqelpeGnn37C9OnTP7VUlSqu8fvq40R0XBoKGxM9nPm2ZZE9LhERlS2FHadEXROdiIqGs60p9oz1QR/PyhAEYNWJ++iynKGjRAQ8ffoUQ4YMQYUKFTBr1iykpaXB1dUVLVu2RKVKlXD06FG0bt0aTk5O2LJli9jlElEpdOpeHPyCQrD/WjS0NSWYzPBQoneaOXMmkpPznsenpqZi5syZIlSknnS1GSxKRESqw4UHiUoJfR1N/Ni5LprWLI9v/r6C60/loaPTOjihd4PKkEgY0kVUFrm5uSEgIAAXL16Ek5NTvvukpaVh165dCAoKQlRUFCZOnKjiKomoNMrMluGXQ3ew8vg9CAJQ1dIQi3q6oW4lU7FLI1JrgiDke+4eHh4Oc3NzESpSTzqaDBYlIiLVYROdqJRp42wDFzszfLU1HKF3GTpKVNbduHEDFhYW79xHX18fvXr1Qq9evRAfH6+iyoioNIuMS8H4zZdx5bE8PLTnZ3aY3pHhoUTvUq5cOUgkEkgkEtSsWVOpkS6VSpGcnIwRI0aIWKF6eR0sKhW5EiIiKgt4FktUClmb6GHDoAZYE3ofPx24jf9uxCD88QmGjhKVQW820E+cOIFGjRpBS0t5+M/OzsapU6fQpEmT9zbciYjeRRAEbLvwGN/tvo7UTClM9bUxt0tdtK1bQezSiNReUFAQBEHAoEGDMHPmTJiavr5qQ0dHB/b29vDy8hKxQvWS20TPkgqQyQRoaPDKWyIiKj5sohOVUhoaEgxrUg2Nqlli3ObLuB+bgr6/ncUwn6r4qk0txUknEZUdzZs3x7Nnz2BlZaW0PTExEc2bN4dUyplcRPTxElOzMGXnFey7Kg8P9apqgYU9XFDBVF/kyohKhoCAAACAg4MDGjVqBG1tbZErUm9vfp7JlMqgp6EpYjVERFTasYlOVMrVqWiKPWO98cOem/jr3CP8euI+
Tt6Lw6KebqhW3kjs8ohIhQpaYzU+Ph6GhoYiVEREpcXpe/GYsDUMzxLToaUhwVdtamFYk6rQ5MxQog/WtGlTyGQy3LlzB8+fP4dMprzmd5MmTUSqTL3ovt1E12YTnYiIig+b6ERlgIGOFuZ0qYtmteSho9eeJKHD4lBM7+iEnp/ZMXSUqJTr0qULAEAikWDAgAHQ1dVV3CeVSnHlyhU0atRIrPKIqATLksqw8ODr8FAHS0ME9XCFi52Z2KURlVhnzpxB79698fDhQwiCoHSfRCLhlWM5coNFAYaLEhFR8WMTnagM8XW2gaudGSZsDcPJu/GYsuMqjt1+jrldGDpKVJrlrqkqCAKMjY2hr/96aQUdHR00bNgQQ4cOFas8IiqhIuNSELj5MsJzwkN7eMjDQw11+RGD6FOMGDECHh4e2Lt3LypUqMAJLwWQSCTQ0dRAplSGDDbRiYiomPEMl6iMsTbRwx+DPBWhoweuxyA8KgQLu7ugEUNHiUqldevWAQDs7e0xceJELt1CRJ+E4aFExSsiIgLbt29H9erVxS5F7eloyZvonIlORETFjcmCRGVQbujozlGNUdXSENFJ6ejz21nM2X+TJ6BEpdiMGTPYQCeiT5KYmoUxmy5j0t9XkJopRcOq5tg/3ocNdKIi5Onpibt374pdRomQGy7KzzBERFTcPnom+qNHj/Dw4UOkpqaifPnycHZ2VlpjlYjUX52Kptgzzhs/7LmBv85F4dfj93HqbjwW9XRFVYaOEpUKfn5++O6779CwYcN37vfq1SssX74cRkZGGD16tIqqI6KS5Mz9eEzYEoanOeGhE9rUxPAm1RgeSlQErly5ovj32LFj8dVXXyE6Ohp169aFtra20r716tVTdXlqK3dddDbRiYiouH1QE/3BgwdYsWIFNm/ejMePHyuFnOjo6MDHxwfDhg1D165doaHBSe5EJYE8dLQemta0wuQdV3D1SSLaLw7FjI5O6MHQUaIS74svvkDXrl1hamqKjh07wsPDA7a2ttDT08OLFy9w48YNhIaGYt++fWjfvj1++uknsUsmIjWTJZUh6NAdLD8mDw+1tzDAop5uDA8lKkKurq6QSCRKn7EHDRqk+HfufQwWVaarndNE53tCRETFrNBN9HHjxmH9+vXw9fXFrFmz0KBBA9ja2kJfXx8JCQm4du0aQkJCMH36dMycORPr1q3DZ599Vpy1E1ER8qvzOnT01L14TN5xFcdux2Ju17owM2DoKFFJNXjwYPTt2xfbtm3Dli1bsGrVKiQmykMAJRIJnJyc4Ovri/Pnz6N27doiV0tE6uZBXArGvxEe2t2jEmZ0dGZ4KFERi4yMFLuEEil3JjqDRYmIqLgV+uzX0NAQ9+/fh4WFRZ77rKys0KJFC7Ro0QIzZsxAcHAwoqKi2EQnKmFsTPXw52BPrA65j5//u43g69EIi3qJhT1c0KgaQ0eJSipdXV307dsXffv2BQAkJiYiLS0NFhYWeS4RJyICcsJDLz7Gd//Kw0NN9ORXrrWvx7XPiYpDlSpVxC6hRMpdE51NdCIiKm6FbqLPmTOn0A/q5+f3UcUQkfg0NCQY3rQaGlWzxPjNl3E/LgV91pzF8CbVMKF1TcWJKhGVXKampjA1NRW7DCJSU4mpWfh251XsvfoMAODpYI5ferjC1kxf5MqIyoZ///033+0SiQR6enqoXr06HBwcVFyVemKwKBERqcpHXYfZokUL7NixA2ZmZkrbk5KS4O/vjyNHjhRFbUQkorqVlENHVx6/h1P34rCopxscLA3FLo+IPsKMGTPQuXNnuLq65rnP0dERt27dUn1RRKRWzt6Px5dvhId+2bomRjRleCiRKvn7++dZHx1QXhfd29sbu3btQrly5USqUj0wWJSIiFTlo6aUHjt2DJmZmXm2p6enIyQk5JOLIiL1kBs6urJvfZgZaOPK40S0XxyCLecf5TmpJyL1Fx0dDV9fXzg4OGDChAnYvn07fv75ZzRr1gy2trZil0dEIsqSyvDTgVvoufoMniamw97CAH+PbITRzauzgU6kYgcPHsRnn32GgwcPIjExEYmJiTh48CA8PT2xZ88enDhxAvHx8Zg4caLYpYpOV1sTAJvoRERU/D5oJvqVK1cU/75x4waio6MVP0ulUgQHB6NixYpFVx0RqQW/OhXgYmeGr7aG49S9eHzztzx0dE4Xho4SlSRTpkzB48ePsX//fqxbtw67d+/GvXv3YGdnh5MnT4pdHhGJ5EFcCsZvCUN41EsAwBfulTDjc2cYMTyUSBTjx4/HqlWr0KhRI8W2li1bQk9PD8OGDcP169cRFBSEQYMGiVilelDMRJeyiU5ERMXrg86MXV1dIZFIIJFI0KJFizz36+vrY8mSJUVWHBGpjwqm+vhzsCdWhdzHzwduY/+1aFx+xNBRopIkICAA5cuXx61bt1CzZk0A8i/Ff/jhB/Tp0wfHjx8XuUIiUiVBELA9Jzw0heGhRGrj3r17MDExybPdxMQE9+/fBwDUqFEDcXFxqi5N7ejmBotmSUWuhIiISrsPaqJHRkZCEARUrVoV586dQ/ny5RX36ejowMrKCpqamkVeJBGpBw0NCUY0rYbGb4WOjmhaDV+2Yugokbo7f/48rly5gurVqyu2OTk5YdOmTbCyshKxMiJStcTULHy76yr2XpGHhzbICQ+tyPBQItG5u7vj66+/xoYNGxSfuWNjYzFp0iR89tlnAICIiAjY2dmJWaZaUASLciY6EREVsw9qolepUgUAIJNxgCIqy3JDR7/ffQObz0dhxbF7OHmXoaNE6s7BwQHbt2/H5MmTlbYfOXIERkZGIlVFRKrG8FAi9fbbb7+hU6dOqFSpkqJRHhUVhapVq+Kff/4BACQnJ2Pq1KlilqkWGCxKRESqUugm+pkzZ9CwYcNC7ZuamorIyEg4Ozt/dGFEpN4MdLQwt2s9NK1ZHpN3XFWEjn7X0RlfeFSCRMIP4kTq5qeffoK/vz82bNiAhg0bQkdHBxEREQgJCcGaNWvELo+IilmWVIbFhyOw7OhdyASgioUBFvV0g6udmdilEdEbatWqhRs3buC///7DnTt3FNtat24NDQ1509jf31/ECtWHvo78SviUTC7nQkRExavQay/069cPvr6+2LZtG1JSUvLd58aNG/j2229RrVo1XLx4sciKJCL11bZuBQQH+sCrqgVSM6WY9PcVjNl0GYmpWWKXRkRvadeuHc6dO4dWrVrh2bNniI6OhqurKy5fvoz+/fuLXR4RFaOH8Sn4YuVpLDkib6B3c6+EveN82EAnUlMaGhrw8/PDuHHjMG7cOPj6+ioa6PSahaEOACAhOVPkSoiIqLQr9Ez0GzduYMWKFZg6dSp69+6NmjVrwtbWFnp6enjx4gVu3bqF5ORkdO7cGf/99x/q1q1bnHUTkRqpYKqPP4d4YtWJ+1jw323svfoMlx69wC89XNGwqoXY5RHRG1xdXbF48WKxyyAiFREEAX9feoIZ/1xDSqYUxnpamNOlLjrUsxW7NCJ6w+LFizFs2DDo6em9d5weN26ciqpSfxZ
GugCA+JQMkSshIqLSTiIIgvChB124cAGhoaF4+PAh0tLSYGlpCTc3NzRv3hzm5ubFUWeRSUpKgqmpKRITE/NNPCeiT3Pl8UuM3xyGyLgUSCTAqGbVENiqJrQ1OXOGqDA4ThWM7w3Rh0lMy8L/dl7FntzwUHtz/NKT4aFExeVTxikHBwdcuHABFhYWcHBwKHA/iUSC+/fvf2qpKlWc43fwtWiM+PMi3CqbYeeoxkX62EREVDYUdpz6oGDRXE5OTvDw8Mj3vnXr1mHgwIEf87BEVArUq2SGPWPloaNbLkRh2dF7CI2Qh47aM3SUiIhIJc5FJuDLLWF48jINmhoSTGB4KJFai4yMzPff9G6WRvLlXOK5nAsRERWzj5oaWr58efj7+2P9+vVISEhQbD9y5Aj+97//FVlxRFQyGepqYV63eljRpz5M9bUR/jgR7RaHYOuFKHzExS9ERERUSNlSGRb+dxs9V53Gk5dpqGJhgO0jvDC6eXU20IlKmMzMTNy+fRvZ2dlil6K2zHPXRE9hE52IiIrXRzXRIyIiYGZmhkGDBsHGxgZ16tSBiYkJevXqhQULFhR1jURUQrWtWwH7x/ugYVVzeejodoaOEhERFZdH8an44tfTWPxWeKhb5XJil0ZEHyA1NRWDBw+GgYEBnJ2d8ejRIwDA2LFjMXfuXJGrUy+5a6InZ2QjPUsqcjVERFSafVQT/a+//sKWLVvQo0cPzJ8/H1OmTIGfnx/S09ORmclvgInoNVszfWwc0hBf+9aCloYEe68+Q9tFJ3D2frzYpRGVaXPnzsXLly/FLoOIioAgCNhx6THaLQ7B5UcvYaynhSW93PDzFy4w0v2o1RuJSERTpkxBeHg4jh07Bj09PcX2Vq1aYcuWLSJWpn5M9LSgrSm/yoaz0YmIqDh91Fn1zz//jJ07d8LPz0+xrU+fPggPD0ebNm0QEBBQZAUSUcmnqSHB6ObV4V3dEuM3X8aD+FT0XH2GoaNEIpo9eza6d+8OMzMzsUshok+QlJ6FqTuv4d/wpwDk4aELe7igUjkDkSsjoo+1a9cubNmyBQ0bNoRE8noZJmdnZ9y7d0/EytSPRCJBOQMdPH+VgYSUTNgyOJmIiIrJR3WuUlJSYGNjk2d7rVq1uF4bERXIxc4Me8f5oLtHJQgCsOzoPXRbeRoP4lLELo2ozGE+AVHJd/5BAtoGheDf8KfQ1JDgq9Y18dewhmygE5VwsbGxsLKyyrM9JSVFqalOcrlLusQlZ4hcCRERlWYf1UTv2rUrevXqha1bt+LRo0eIjo5GSEgI/P394ePjU9Q1ElEpYqirhfndXLC8T32Y6GkhPOol2i8OwTaGjhIRERVKbnhoj1/l4aGVzQ2wbYQXxraswfBQolLAw8MDe/fuVfyc2zhfs2YNvLy8xCpLbVkwXJSIiFTgo5ZzWbp0KQICAtCrVy/FNolEgi5dumDZsmVFVhwRlV7t6laAq50ZvtwShrORCfh6+xUcuxOL2f51YWqgLXZ5RKXejRs3YGtrK3YZRPSBHsWnInDLZVx69BIA0KV+Rcz83BnGehw7iUqL2bNno23btrhx4ways7OxaNEi3LhxA6dOncLx48fFLk/tWBjJm+jxyWyiExFR8fmomeiGhobYvn07nj9/jlOnTuHixYuIj4/H1q1bUb58+aKukYhKKVszfWwa+kbo6BWGjhKpip2dHTQ1NcUug4g+wM7L8vDQSznhoYt7uWFhd1c20IlKGW9vb4SHhyM7Oxt169bFf//9BysrK5w+fRru7u5il6d2zHNmosdzJjoRERWjj5qJnsvCwgIWFhZFVQsRlUG5oaONc0JHH8anotfqMxjVrDrGt6rB0FEiIirzktKzMG3XNfwTJg8P/cy+HH7p4cq1z4lKqf79+6N58+aYPHkyqlWrJnY5as8yZ030eK6JTkRExYjdKSJSC645oaPd3CtBJgBLj97FFytP42E8Q0eJiKjsupATHvpPmDw8dELrmvhrKMNDiUozHR0dzJkzBzVr1oSdnR369u2LNWvWICIiQuzS1JI510QnIiIVYBOdiNSGka4Wfv7CBUt7u8FYTwthUS/RblEItl98zNBRIiIqU7KlMiw8eAfdc8JD7cz1sXW4F8a1rAEtXqVFVKqtWbMGd+7cwaNHjzB//nwYGRlhwYIFcHR0RKVKlcQuT+1YcDkXIiJSAZ6BE5Ha6VDPFsGBTdDA3hwpmVJM3BaOsX9dRmJaltilERERFbuohFR0//U0Fh+OgEwAOrtVxL5xPnCvUk7s0ohIhcqVKwcLCwuUK1cOZmZm0NLSYgZZPhTBoilczoWIiIoPm+hEpJYqmunjr2ENMbFNTWhqSLDnyjO0WxSCc5EJYpdGVOpERUVh0KBBYpdBRJCHh7ZdlBMeqquFRT1d8UsPhocSlSXffvstGjVqBAsLC0yePBnp6emYPHkyoqOjcfnyZbHLUzsWhvI10ROSOROdiIiKzycFixIRFSdNDQnGtKiREzoahkcJqei56jRGN6+OcS0ZOkpUVBISErB+/XqsXbtW7FKIyqy3w0M9qsjDQ+3MufY5UVkzd+5clC9fHjNmzECXLl1Qs2ZNsUtSa+Y5M9FTMqVIz5JCT1tT5IqIiKg0YhOdiNSeW+Vy2DfeBzP+uY6/Lz3GkiN3ERIRh0U9XVHFwlDs8ojU3r///vvO++/fv6+iSogoPxcfJmD85jA8fpEGTQ0JxrWogdHNq3Htc6Iy6vLlyzh+/DiOHTuGBQsWQEdHB02bNkWzZs3QrFkzNtXfYqyrBW1NCbKkAuJTMlHRTF/skoiIqBRiE52ISgQjXS0s6O6CZrXK49udVxWho993qoMu9StCIpGIXSKR2vL394dEInlnQC//HyJSvWypDEuO3MWSI/K1z+3M9RHUw41rnxOVcS4uLnBxccG4ceMAAOHh4fjll18wevRoyGQySKVSkStULxKJBBaGuohOSkd8cgab6EREVCzYRCeiEqWjiy3cKpvhyy1hOP/gBb7aFo5jd2Ixy78OTPW5XixRfipUqIDly5ejU6dO+d4fFhYGd3d3FVdFVLZFJaQicEsYLj58AUAeHvp9J2eufU5EEAQBly9fxrFjx3Ds2DGEhoYiKSkJ9erVQ9OmTcUuTy2ZG+rIm+gpXBediIiKB5voRFTiVCpngM3DvLD86F0EHY7A7vCnuPTwBYJ6uuIze3OxyyNSO+7u7rh48WKBTfT3zVInoqL1T9gTTN15Da8ysmGsq4VZneugk2tFscsiIjVhbm6O5ORkuLi4oGnTphg6dCh8fHxgZmYmdmlqyyJnXXSGixIRUXFRi4UWly1bBnt7e+jp6cHT0xPnzp0r1HGbN2+GRCKBv79/8RZIRGpHU0OCsS1rYNsIL1Q2N8CTl2no8etpLPzvNrKlMrHLI1IrX3/9NRo1alTg/dWrV8fRo0c/+HE5fhN9mKT0LARuvozxm8PwKiMb7lXkmR9soBPRm/7880/Ex8fjwoULWLBgATp27MgG+ntYGMqb6PEpGSJXQkREpZXoTfQtW7ZgwoQJmDFjBi5dugQXFx
f4+vri+fPn7zzuwYMHmDhxInx8fFRUKRGpo/qVy2HvOG90qV8RMgFYfOQuvvj1NB7Fp4pdGpHa8PHxgZ+fX4H3GxoafvDl4Ry/iT7MxYcJaLcoBLvCnkJDAgS2qoEtwxrCztxA7NKISM20b98eJiYmojz33LlzIZFIEBgYqNiWnp6O0aNHw8LCAkZGRujatStiYmJEqa8gFka6AMDlXIiIqNiI3kRfuHAhhg4dioEDB8LJyQkrV66EgYEB1q5dW+AxUqkUffr0wcyZM1G1alUVVktE6shYTxsLu7ticS83GOtp4fKjl2i3OAQ7Lj3mEhVExYTjN1HhZEtlWHQoAt1/PYPHL9JQqZw+to3wQmCrmtDSFP1UnIhI4fz58/j1119Rr149pe1ffvkldu/ejW3btuH48eN4+vQpunTpIlKV+TPPnYnO5VyIiKiYiHrmnpmZiYsXL6JVq1aKbRoaGmjVqhVOnz5d4HHff/89rKysMHjwYFWUSUQlxOcuttg/3gef2ZdDckY2JmwNx/jNYUhKzxK7NKJSheM3UeFEJaSi56oz+OXQHUhlAvxdbbFvvA/cqzC/g4jUS3JyMvr06YPVq1ejXLlyiu2JiYn47bffsHDhQrRo0QLu7u5Yt24dTp06hTNnzohYsTLL3DXROROdiIiKiahN9Li4OEilUlhbWyttt7a2RnR0dL7HhIaG4rfffsPq1asL9RwZGRlISkpSuhFR6VWpnAH+GtoQE1rXhKaGBP+GP0XboBCcf5AgdmlEpYYqxm+AYziVbP+EPUG7RSG48PAFjHS1ENTDFUE93WCipy12aUREeYwePRrt27dX+oIcAC5evIisrCyl7Y6OjqhcufI7vzhXNXPDnOVckrkmOhERFY8SdQ3pq1ev0K9fP6xevRqWlpaFOmbOnDkwNTVV3Ozs7Iq5SiISm5amBsa1rIGtw71gZ67P0FEikX3M+A1wDKeS6VV6Fr7cEqYID61f2Qz7x/vA343hoUSknjZv3oxLly5hzpw5ee6Ljo6Gjo5OnmDTd31xLsaX4BZGucGinIlORETFQ0vMJ7e0tISmpmaeUJKYmBjY2Njk2f/evXt48OABOnbsqNgmk8kbYlpaWrh9+zaqVaumdMyUKVMwYcIExc9JSUn8EE5URrhXKYd943ww45/r2HH5CRYfuYuQu3FY1MMNlS0Y5Eb0sVQxfgMcw6nkufjwBQK3XEZUQho0JMDYFjUwtkV1rn1ORGorKioK48ePx8GDB6Gnp1ckjzlnzhzMnDmzSB6rsCy4JjoRERUzUc/odXR04O7ujsOHDyu2yWQyHD58GF5eXnn2d3R0xNWrVxEWFqa4ff7552jevDnCwsLy/WCtq6sLExMTpRsRlR3GetpY2MMVi3q6wlj3dejozsuPxS6NqMRSxfgNcAynkkMqE7D4cAS6/3oaUQmvw0O/bM3wUCJSbxcvXsTz589Rv359aGlpQUtLC8ePH8fixYuhpaUFa2trZGZm4uXLl0rHFfTFOSD/EjwxMVFxi4qKKvbXYWEkX84lLUuKtExpsT8fERGVPaLORAeACRMmICAgAB4eHmjQoAGCgoKQkpKCgQMHAgD69++PihUrYs6cOdDT00OdOnWUjs+9rOzt7UREb+rkWhH1K5fDl1vCcOHhC3y5JRzHbsfiB/86XJ+W6CNw/CaSe/wiFV9uCcP5By8AAP6utvieYwsRlRAtW7bE1atXlbYNHDgQjo6O+Oabb2BnZwdtbW0cPnwYXbt2BQDcvn0bjx49yveLc0D+Jbiurm6x1/4mQx1N6GhpIDNbhviUDFTS4VWnRERUtERvovfo0QOxsbGYPn06oqOj4erqiuDgYEVY2aNHj6ChwRk8RPTp7MwNsHlYQyw7eg+Lj0Tgn7CnuPjwBRb1dIV7FXOxyyMqUTh+EwH/hj/F/3Zexav0bBjpauEHf2d0dqskdllERIVmbGyc5wttQ0NDWFhYKLYPHjwYEyZMgLm5OUxMTDB27Fh4eXmhYcOGYpScL4lEAgtDHTxLTEd8ciYqlWMTnYiIipZEEARB7CJUKSkpCaampkhMTORl4URl2Jvr1mpqSDC2RXWMac51a0l8HKcKxveG1EVyRjam/3MNOy49AQDUr2yGIOZtEJV5pWWcatasGVxdXREUFAQASE9Px1dffYW//voLGRkZ8PX1xfLlywtczuVtqnpfOiwJwbUnSVg34DM0d7QqtuchIqLSpbDjlOgz0YmIxPB26GjQoQiERsThlx6usDNnE4SIiPJ36dELBG4Ow6OEVGhIgDEtamAcw0OJqBQ5duyY0s96enpYtmwZli1bJk5BhWRuKF9CJi45Q+RKiIioNOLZPhGVWW+Hjl54+ALtFoXgn7AnYpdGRERqJjc89IuVp/EoIRUVzfSxZbgXJjA8lIhILVga6gAAElIyRa6EiIhKI57xE1GZ18m1IvaN94F7lXJ4lZGN8ZvD8OWWMLxKzxK7NCIiUgOPX6Si56rTWHjwDqQyAZ+72GLfeB98Zs88DSIidWGe00SPZxOdiIiKAZvoRESQh45uGdYQga1qQEMC7Lz8BO0Wh+Diwxdil0ZERCL6N/wp2i4KwfkHL2Ckq4WF3V2wqKcrTPW1xS6NiIjeYGEkX84lPplNdCIiKnpsohMR5dDS1EBgq5rYNsILlcrpIyohDd1/PY1FhyKQLZWJXR4REalQckY2vtoajnF/Xcar9Gy4VTbDvnE+6FK/EiQSidjlERHRWywUy7lwTXQiIip6bKITEb3FvYo59o33gb+rLaQyAb8cuoOeq84gKiFV7NKIiEgFLj96gfaLQ/D3pcfQkABjW1TH1uFeqGzB4GkiInVlYcTlXIiIqPiwiU5ElA8TPW0E9XRDUA9XGDF0lIioTJDKBCw9EoFuK0/jYbw8PHTzMC981aYWtBkeSkSk1hRronM5FyIiKgb8NEBE9A7+bhWxf7wP6lc2Y+goEVEp9uRlGnqtOoOf/5OHh3bMCQ9t4MDwUCKiksAyd010LudCRETFgE10IqL3sDM3wNbhXhjfkqGjRESl0e7wp/ALOoFzDxJgqKOJBV+4YDHDQ4mISpTcmejpWTKkZmaLXA0REZU2bKITERWClqYGvmxdE1uHe6Gi2evQ0cWHIyCVCWKXR0REHyE3PHRsTnioi50Z9o33QVd3hocSEZU0Bjqa0NOWtzi4pAsRERU1NtGJiD6Ah7059gf6oFNO6OjCg3fQc9VpPH7B0FEiopIkLOqlIjxUIgHGNK+O7SO8UMXCUOzSiIjoI0gkElgY5i7pwiY6EREVLTbRiYg+kImeNhb1dMMvPVxgpKuF8w9eoO2iEPwb/lTs0oiI6D1yw0O7rjiFh/GpsDXVw+ahDTHRl+GhREQl3etwUa6LTkRERYufFIiIPlJnt0rYN84HbpXN8Co9G+P+uowJWxk6SkSkrp68TEOv1a/DQ9vXq4D945vAs6qF2KUREVERsDDKbaJzJjoRERUtNtGJiD5BZQsDbBvuhXE5oaM7Lj1B+8WhuPSIoaNEROpkz5WnaBt0Auci5eGhP3/hgqW93
GBqwPBQIqLSoqKZPgDgQXyKyJUQEVFpwyY6EdEn0tLUwITWNbElJ3T0UUIqvljJ0FEiInWQnJGNidvCMWbTZSTlhIfuHeeDbgwPJSIqdWpaGwMA7sQki1wJERGVNmyiExEVkc/szbFvvA8+d2HoKBGROsgND91+UTk81N6S4aFERKVRDWsjAEDE81ciV0JERKUNm+hEREXIVF8bi3q6YmF35dDR3QwdJSJSGalMwLKjd9GN4aFERGVK7kz0RwmpSMuUilwNERGVJvwUQURUxCQSCbrUVw4dHfvXZXy1NRzJGdlil0dEVKo9fZmG3qvP4KcDt5HN8FAiojLF0kgX5oY6EATgXiyXdCEioqLDJjoRUTGpbGGArcO9MK5FdWhIgL8vPUa7RSG4zNBRIqJisffKM/gFncDZyAQY6Ghifrd6DA8lIipjqltxSRciIip6bKITERUjbU0NTGhTC5uHvQ4d7bbyNJYwdJSIqMikZGTj623hGL3pkjw8tJIp9o7zQXcPO4aHEhGVMTVz1kVnuCgRERUlNtGJiFSggYM8dLRjTujogoN30GvVGTx5mSZ2aUREJVp4Tnjotpzw0FHNqmH7yEZwYHgoEVGZlLsuekQMZ6ITEVHRYROdiEhFTPW1sbinKxZ84QJDHU2ce5AAv6ATDB0lIvoIUpmA5cfuouuKU3gQn4oKpnr4a2hDTPJzZHgoEVEZVsNK3kTnTHQiIipK/IRBRKRCEokEXd0rYd94H7javQ4dnbiNoaNERIX1LDENfdacwfzgnPDQuhUQPL4JGjI8lIiozKuRs5xL1ItUpGVKRa6GiIhKCzbRiYhEUMXCENtGeGFsTujo9ouP0X5xCMKiXopdGhGRWtt/9Rn8gkJw5n5OeGjXeljam+GhREQkZ2mkC3NDHQgCcC+Ws9GJiKhosIlORCQSbU0NfPVG6OjD+FR0XXEKS48wdJSI6G0pGdn4ZvsVjNx4CYlpWaiXGx76GcNDiYhIWQ2r3HBRrotORERFg010IiKR5YaOdqhXAVKZgJ//u4Neqxk6SkSU68rjl+iwJBRbLkQpwkP/ZngoEREVIDdclOuiExFRUWETnYhIDZjqa2NJLzf8nBs6GpmAtkEnsOcKQ0eJqOySyQSsOHYPXZafQmRcCiqY6mHTEIaHEhHRu9XMWRf97nPORCcioqKhJXYBREQkJ5FI0M29EjyqlMP4LWEIj3qJMZsu49jtWHz3uTOMdPknm4jKjmeJafhqazhO3YsHALSra4PZnevCzEBH5MqIiEjdVbfiTHQiIipanMJDRKRm7C0NsX2EF8Y0rw4JQ0eJqAwKviYPDz11Lx762vLw0GW967OBTkREhZI7Ez3qRSrSMqUiV0NERKUBm+hERGpIW1MDE31rYfPQhrA11cPD+FR0W3EKy47eZegoEZVaqZnZmPz3FYz4Ux4eWreiKfaO82Z4KBERfRALI11YGOpAEIC7zzkbnYiIPh2b6EREasyzqgX2j2+C9vUqIFsm4KcDt9F79Rk8ZegoEZUy154kosOSUGw+Lw8PHdFUHh5atbyR2KUREVEJVCNnNvqdGK6LTkREn45NdCIiNWdqoI2lvdzwU7d6MNTRxNnIBPgFncDeK8/ELo2I6JPJZAJWHr+HzstP4n5sCmxM9LBxiCcmt3WEjhZPVYmI6OPUtJavix7BmehERFQE+MmEiKgEkEgk+MLDDnvH+cClkimS0rMxetMlfL0tHCkZ2WKXR0T0UaIT09H3t7OYu/8WsqQC/JxtsH+8DxpVsxS7NCIiKuFqWMlnokdwJjoRERUBNtGJiEoQe0tDbB/ZCKObV4NEAmzLCR0NZ+goEZUwwdei4bfohCI8dG6XuljRtz7KGTI8lIiIPl2NnJnod56ziU5ERJ+OTXQiohJGW1MDX/s64q+hDVHBVA8P4lPRlaGjRFRCpGZmY8qOKxjx50W8TM1CnYom2DPOGz0bVGZ4KBERFZnc5VyiEtKQmskrN4mI6NOwiU5EVEI1rGqB4PFN0L7u69DRPmvO4FkiQ0eJSD3lhof+dU4eHjq8aVXsGNkY1RgeSkRERczcUAeWRvKrm+5yXXQiIvpEbKITEZVgpgbaWNrbDfO71YOBjibO3E+AX1AI9l9l6CgRqQ+ZTMCvb4SHWpvoYuNgT0xpW5vhoUREVGwcbUwAAGFc+pCIiD4RP7UQEZVwEokE3XNCR+tVMkViWhZGbryEb7ZfYegoEYkuOjEd/daexZyc8FBfZ2sEj2+CRtUZHkpERMWrUXULAMCJO7EiV0JERCUdm+hERKWEg6Uh/h7ZCKOayUNHt1yIQoclobjy+KXYpRFRGXXgujw89ORdeXjonC51sbKvO8NDiYhIJZrWLA8AOHUvHpnZMpGrISKikoxNdCKiUkRbUwOT/ByxaYg8dDQyLgVdlp/C8mMMHSUi1ZGHh17F8D+Uw0N7MTyUiIhUqLaNCSyNdJGaKcWFhwlil0NERCUYm+hERKWQVzUL7B/vg3Z1bZAtEzA/mKGjRKQar8NDH8nDQ5swPJSIiMShoSFBkxry5cNO3IkTuRoiIirJ2EQnIiqlzAx0sKx3fczvytBRIip+MpmAVSfyCQ9tx/BQIiIST5OcJV24LjoREX0KfqIhIirFJBIJun+WN3R08t9XkJrJ0FEiKhoxSenov/YcZu9jeCgREakX75yZ6DeeJSH2VYbI1RARUUnFJjoRURngYGmI7SMaYWRO6Ojm81HosDgUVx8nil0aEZVwB65Hwy/oBELvxkFPWwOzOzM8lIiI1IelkS7qVjQFAIREcDY6ERF9HDbRiYjKCB0tDXzj54iNQzxhY6KH+3Ep6LLiJFYevwcZQ0eJ6AOlZUrx7U55eOiL1Cw425pgz1gf9PZkeCgREamXJjXls9GPc0kXIiL6SGyiExGVMY2qWSI40Adt69ggSypg7v5b6PvbWUQnpotdGhGVENeeJKL9khBsOvsIQE546KhGqG7F8FAiIlI/TWrI10UPiYjj5BEiIvooatFEX7ZsGezt7aGnpwdPT0+cO3euwH1Xr14NHx8flCtXDuXKlUOrVq3euT8REeVlZqCD5X3qY17XutDX1sSpe/HwW3QCwdeixS6NShCO32WPTCZg9Yn7yuGhQ+ThobpammKXR0REH2jOnDn47LPPYGxsDCsrK/j7++P27dtK+6Snp2P06NGwsLCAkZERunbtipiYGJEq/jj1q5SDka4WElIycf1pktjlEBFRCSR6E33Lli2YMGECZsyYgUuXLsHFxQW+vr54/vx5vvsfO3YMvXr1wtGjR3H69GnY2dmhTZs2ePLkiYorJyIq2SQSCXp8Vhl7x3mjbkVTvEzNwog/LzJ0lAqF43fZE5OUjoB15/DjvpvIkgpo4yQPD23M8FAiohLr+PHjGD16NM6cOYODBw8iKysLbdq0QUpKimKfL7/8Ert378a2bdtw/PhxPH36FF26dBGx6g+nramBRtUsAAAnuC46ERF9BIkgCKJey+Tp6YnPPvsMS5cuBQDIZDLY2dlh7NixmDx58nuPl0qlKFeuHJYuXYr+
/fu/d/+kpCSYmpoiMTERJiYmn1w/EVFpkJktw8KDd/DriXsQBKCqpSEW9XRD3UqmYpdW5pSUcUrV4zdQct6b0ujgjRhM2h6OF6lZ0NPWwPQOzujVwI5rnxMRvaE0jFOxsbGwsrLC8ePH0aRJEyQmJqJ8+fLYtGkTunXrBgC4desWateujdOnT6Nhw4bvfUx1eV/+PPMQU3ddQwN7c2wd4SVaHUREpF4KO06JOhM9MzMTFy9eRKtWrRTbNDQ00KpVK5w+fbpQj5GamoqsrCyYm5vne39GRgaSkpKUbkREpExHSwOT2zJ0lApHFeM3wDFcHaRlSvG/nVcxdMMFhocSEZUBiYmJAKAYny9evIisrCylMd/R0RGVK1cu9JivLprWlK+LfunRC7xKzxK5GiIiKmlEbaLHxcVBKpXC2tpaabu1tTWiowu3Lu8333wDW1tbpUH9TXPmzIGpqaniZmdn98l1ExGVVgwdpcJQxfgNcAwX27UnieiwJAQbc8JDhzE8lIioVJPJZAgMDETjxo1Rp04dAEB0dDR0dHRgZmamtO+7xnx1/RLcztwAVcsbIlsmYP9V5gAREdGHEX1N9E8xd+5cbN68GTt37oSenl6++0yZMgWJiYmKW1RUlIqrJCIqWRg6SsWtMOM3wDFcLDKZgDUh8vDQe7EpsDLWxZ+DPfEtw0OJiEq10aNH49q1a9i8efMnPY46fwne3UNey4YzDyDyyrZERFTCiNpEt7S0hKamZp5k75iYGNjY2Lzz2J9//hlz587Ff//9h3r16hW4n66uLkxMTJRuRET0bgWFjk7ZcZWho6SS8RvgGC6G5znhobP2ysNDWztZIziwCbxrMDyUiKg0GzNmDPbs2YOjR4+iUqVKiu02NjbIzMzEy5cvlfZ/15ivzl+Cd/ewg46WBq49SUJY1EuxyyEiohJE1Ca6jo4O3N3dcfjwYcU2mUyGw4cPw8ur4KCP+fPn44cffkBwcDA8PDxUUSoRUZlUtbwR/h7ZCMObVoVEAvx17hE6LAnFtSeJYpdGIuL4XToduhEDv0UhCImIg562Bn7sXAer+rnD3FBH7NKIiKiYCIKAMWPGYOfOnThy5AgcHByU7nd3d4e2trbSmH/79m08evSowDFfnb8ENzfUQYd6FQAAf5x+KHI1RERUkoi+nMuECROwevVqrF+/Hjdv3sTIkSORkpKCgQMHAgD69++PKVOmKPafN28epk2bhrVr18Le3h7R0dGIjo5GcnKyWC+BiKhU09HSwJS2tbFxsCesTXRxPzYFnZefxKoTDB0tyzh+lx5pmVJM3XUVQzZcQEJKJpwqmGDPWG/08azC8FAiolJu9OjR+PPPP7Fp0yYYGxsrxue0tDQAgKmpKQYPHowJEybg6NGjuHjxIgYOHAgvLy80bNhQ5Oo/Tn8vewDAnivPEJ+cIW4xRERUYmiJXUCPHj0QGxuL6dOnIzo6Gq6urggODlaElT169AgaGq97/StWrEBmZia6deum9DgzZszAd999p8rSiYjKlEbVLRE8vgkm77iCA9djMHvfLZy4E4cF3V1gbVLwutZUOnH8Lh1uPE3C+M2XEfFc/mXGUB8HTPStxbXPiYjKiBUrVgAAmjVrprR93bp1GDBgAADgl19+gYaGBrp27YqMjAz4+vpi+fLlKq606LjamaFeJVNceZyILReiMKpZdbFLIiKiEkAilLE0jaSkJJiamiIxMVGtLisjIiopBEHAlvNRmLn7BtKypChnoI25XevB1/nda2FT4XCcKhjfm6IjkwlYezIS84NvI1Mqg5WxLhZ0d4FPjfJil0ZEVGJxnMqfOr4vWy9EYdL2K6hopo8Tk5pDU4NXXhERlVWFHadEX86FiIhKFolEgp4NKmPPOG/UqWiCF6lZGP4HQ0eJSoo3w0MzpTK0qm2N/eN92EAnIqIy43MXW5gZaOPJyzQcvfVc7HKIiKgEYBOdiIg+SrXyRtgxsjGGN60KQB462pGho0Rq7fBN5fDQWf51sLq/OyyMdMUujYiISGX0tDXR3cMOALDhDANGiYjo/dhEJyKij6YIHR0iDx29lxM6uvrEfYaOEqmR9Cwppu26hsHr5eGhtXPCQ/s2ZHgoERGVTX08K0MiAU7cicX1p5wEQkRE78YmOhERfbLGOaGjbZyskSUV8OO+m+i/9hxiktLFLo2ozLv5LAkdl4Tij5yZdoO9HbBrdCNUtzIWuTIiIiLxVLEwRId6tgCAmf/eQBmLiyMiog/EJjoRERWJcoY6+LWfO2Z3rgs9bQ2E3o2DX9AJ/Hc9WuzSiMokmUzAb6GR6LT0JCKeJ6O8sS42DGqAaR2coKulKXZ5REREopvS1hF62ho49yABu688E7scIiJSY2yiExFRkZFIJOjtWRl7xvrA2VYeOjrsj4v4dudVpGVKxS6PqMx4/iodA34/jx/23ECmVIaWjlYIHu+DJjUZHkpERJTL1kwfI5tWBwDM2XcTqZnZIldERETqik10IiIqctWtjLBjVCMMayIPHd109hE6LAlh6CiRChy+GYO2QSE4cScWuloa+KGTM9YEeDA8lIiIKB/Dm1ZFRTN9PEtMx8pj98Quh4iI1BSb6EREVCx0tTTxbbva+HOwJ6yMGTpKVNzSs6SY/o88PDQ+JROONsbYPdYb/bzsGR5KRERUAD1tTUxtXxsAsPLEfUQlpIpcERERqSM20YmIqFh517BEcGATtH4jdDRg3Tk8Z+goUZG5+SwJny8NxYbT8vDQQf9v787Do6zvvY9/Zk8CSViygUaRfRNpQRCQopZzcAPtdapUfRBwax9xpXXXYt2wantolZZLrcBpVVp9FK0iVlFEFi0iKAdIUAHBJSFBSCaEZLbf88ckkwQyQm7I3JPM+3Vdc83knt89+eVr2k/y5c58x5ykV64bo775DA8FAOBwzh5coFE9uyoQiuiB1zfbvR0AQBKiiQ4AaHVdOnj15JRhevAng5Xmcer9z8o1Yc4KvbW51O6tAW2aMUbPrNyuC+au0tbSKuV09GnhFSP064kMDwUA4Eg5HA7NmjRQLqdDb24q1d/X7rR7SwCAJEMTHQCQEA6HQ5eNPFGvXX+6BnaLDh29+n8+0t2LGToKWFHmr9W0+Wt132ubFQhFh4e+edNYjWN4KAAALda/IEs3j+8jSbrnlU3a+BWzfAAADWiiAwASqndepl6eMVpXjz1JkvS3D3Zq4hMrtekbflEBjtQ7RaU6e84KvVc3PPQ+hocCAHDUrj2jt8YPyFMgFNH/fXad9lUH7N4SACBJ0EQHACScz+3SXecN1F+vHKG8TJ8+312ln8xdraffZ+go8H1qgmHNeuV/dcWCpsNDL2d4KAAAR83pdOh3Fw/ViV0z9NXeA7rp7xv42RQAIIkmOgDARmP75GrpTT/S+AH5CoQjeuB1ho4C8RSVRIeHLmw0PHTxDIaHAgBwLGWne/Tny4YpzePU8uIyzVn2md1bAgAkAZroAABbdeng1VOXD9MDFzYMHT37D+/rbYaOApKiw0Pnr9quSU80DA9dMP1U/XriQKV5GB4KAMCxNrB
7lh688GRJ0h+Xfaan399m844AAHajiQ4AsJ3D4dD/OS06dHRAtyx9tz+gqxg6CqjMX6vpC9bqN/+MDg89q3+elt40Vmf0y7N7awAAtGv/Nex43XBWb0nSA69voZEOACmOJjoAIGn0zsvU4hmjddXpTYeObv6m0uadAYn3btFunfOHFVpeXCZv3fDQv0wdrhyGhwIAkBA3/0dfXU8jHQAgmugAgCTjc7t09/kD9T9XjFBu3dDRC+euYugoUkZNMKx7X92k6QvWqrwqOjz0NYaHAgCQcA6HQzMPaqTPe+8LGcPPpACQamiiAwCS0o/65mrpjWMZOoqUUlRSqQueWKUFq3dIkqaP6cHwUAAAbHRwI/3hN4r0yxc+UU2QtxwEgFRCEx0AkLS6dvTFho763A1DR5dtYego2pfGw0OLS/3K6ejV/OmnatbEQQwPBQDAZvWN9LvPGyCX06GXPv5aF81bo6/3HbB7awCABKGJDgBIavVDR1+/oWHo6JULP9I9i/+XK4DQLpRX1eqKRsNDz+yXq6U3/UhnMjwUAICk4XA4dNXYnvrrFSPUOcOjjV9XaNLjK7Xq83K7twYASACa6ACANqF+6OiVdUNH//rBl5r0xEpt+Zaho2i73i3erbPnrNC7dcNDfzNpkJ6ZdirDQwEASFKje+fon9efrkHds7Rnf0CXPf2h7nx5o/w1Qbu3BgBoRTTRAQBths/t0j3nD9TCuqGjW0urdMHcVXpm5XYGPKFNiQ0PnR8dHtovP1P/vO50TR3N8FAAAJLd8Z0z9OIvRuuykSdIkp77cKf+879X6N3i3TbvDADQWmiiAwDanHF1Q0d/3D9PgVBE9722WdPmr1WZv9burQGHtbXUrwvnNgwPnTa6h165boz6FTA8FACAtiLd69KDPzlZz199mk7okqFvK2o0ff5aXfvsOu0o32/39gAAxxhNdABAm9S1o09PTx2u+y8YJJ/bqfe2lunsOSv0ThFDR5GcjDFauHqHJj6+UkUlDcND753E8FAAANqqUb26aulNY3Xl6SfJ4ZCWbCzR+N+/p3tf3aQ9VVzgAQDtBU10AECb5XA4NGVUD/3z+tPVvyBTe/YHdMWCjzTrFYaOIrmUV9XqyoUfadarm1QbiuiMfrl640aGhwIA0B5keN265/yBWnLDWI3rm6tQxGjB6h0a9+hyPfpmEX8tCQDtAE10AECb1zc/U4tnjNEVY6JDRxeuiQ4dLSph6Cjst7x4t86e877eKdotr9upeycO1Pxppyo3k+GhAAC0JwO6ZWnhFSP0tytHalD3LFXVhjT33S805rfv6M6XN/I2LwDQhjlMik1iq6ysVHZ2tioqKpSVlWX3dgAAx9jy4t361QufqryqVl63U3ec01/T2tCwRnIqvrZWm5pgWL9dWqT5q3ZIkvrmd9QfL/mB+hck/94BAC3X1nIqUVK1LpGI0b82l2jee9u0Ydc+SZLDEZ3tc8mIE3RW/zx5XFzXCAB2O9KcookOAGh3yqtqdduLn2pZ0W5J0hn9cvXoT09pE1f+klPxtaXabC3164bn16uoxC9JmjrqRN1x7gDe+xwA2rG2lFOJlOp1McZo7Y69mvfeF3qn7mdTScrN9OmiYcfrJz84Tn3yGS4OAHahiR5Hqgc4AKQKY4z++sGXevD1LaoNRZTT0atHf3qKzuyf3O9BTU7F1xZqc/D3XdcOXj160RCd1T/f7q0BAFpZW8gpO1CXBjvK92vR2l16cd0ulVcFYsf7F2Rq4inddf6QbjqxawcbdwgAqYcmehwEOACkloOvCJ42uoduP6d/0l4RTE7Fl+y1OfgvIMb1zdWjFw1RXmaazTsDACRCsueUXajLoQKhiJZtKdWL677Sis/KFAw3tGX65nfUjwfk68f98/SDEzrL5Wwbb0kIAG0VTfQ4CHAASD0Hvzd1v/xM/fGSH6hfQfL96Sw5FV8y1+a9rWX65T8+ib4Xv8up2+vei9/JL74AkDKSOafsRF2+X0V1UG9uKtE/P/1Gq7/Yo3CkoUXTOcOj0b1yNKZ3jk7vnaMTumbYuFMAaJ9oosdBgANA6ooOHf1E5VUBed1O3XlOf01NsqGj5FR8yVibmmBYjywt1jOrtkuKXj32h5/9QAO6Jcf+AACJk4w5lQyoy5GrqA5q+dbdWrZlt5YX71ZlTajJ88d1SteIk7ro1B5ddGqPzuqd1zGpfo4FgLaIJnocBDgApLYyf61uffETvVtcJkk6s1+uHkmioaPkVHzJVpvPSv26vtFbBV0+6kTdyfBQAEhZyZZTyYK6WBMMR/TpV/u08rM9WvV5uT7euVehSNP2TXa6R0OOz9Ypx3fSKYWdNOT4bOVl+misA0AL0ESPgwAHABhjtHD1Dj30RpECSTZ0lJyKL1lqY4zR3z74Ug8wPBQA0Eiy5FSyoS7HRlVtSOt37tXa7d9p7Y69Wr9rr2qCkUPW5XT0amD3bA3slqUB3TLVryBTPXM6yut22rBrAEh+R5pT7gTuCQCApOBwODRtzEka1StHNzy/XsWlfk1fsDbph47CfnuqanXb//tUb29heCgAAEicjj63xvbJ1dg+uZKiV6oXl/i1Ydc+ffrVPm3YtU+f765SeVVAK7aWacXWsti5bqdDJ+V0UJ/8juqV23DrkZOhzDSPXV8SALQpNNEBACmrX0GmXrlujB5+o0gLVu/QgtU79MG2PfrDz5Jz6CjstWJrmX75wicq80eHh952Tn9NZ3goAACwgcfl1ODjsjX4uGxJJ0qKzmopKvFr0zcV2vRNpbaW+FVc6pe/JqTPdlfps91Vh7xOTkevTuzaQT26dlBhl3Sd0CVDhV0yVNg5Q3mZPn7OAYA6NNEBACktzePSvZMGaVy/XN3ywicqKvFr4hMrdde5A3T5qBN5T0moNhQdHvqXldHhoX3yosNDB3bnT9IBAEDySPO4NLSwk4YWdoodM8aopLJGxSV+fVG2X1+UVemL3VX6oix61Xr9bd2Xew95PY/LoW7Z6TquU7qO65yubtlp6pYdvS/ITlNBVpo6ZXj4eRlASqCJDgCApDP75emNG3+kW178RMuLyzTr1U16b2uZHvnpEOV0TI6ho0i8z0r9umHRBm35tlISw0MBAEDb4nA46hrf6TqjX9Pn/DVBfbmnWjv27NeXe6q1c0+1du2N3r7ZV6Ng2Gjnd9Xa+V113Nf3up3Kz/IpPzNNuZk+5WX6lFt369rBp5xMn3I6epXT0cfPTwDaNJroAADUyc30af60U2NDR98p2q2z57yvxy4aojP62T90FIljjNHfPtypB17brNpQRF06ePXoT4foxwMYHgoAANqHzDRPo7eEaSoUjqjUX6uv9x7Q1/uq9fXeA/q2okYlFTX6pqJGJRUHtLc6qEAool3fHdCu7w4c9vNleF3q2tGrLh186trBq84ZXnXp4FHnused0j3qlOFVpwxP9JbuVZrHyZXuAJICTXQAABqpHzp6Wq+uuvH5DSou9WvafIaOppLo8NCNentLqSTpR31z9RjDQwEAQApxu5zRt3HplC6pS7NraoJhlflrVVpZo9
LKWpVX1Wq3v0Zl/lqV+Wu1Z39A5f5alVcFFAhHVB0Iq/oIG+71vC6nsjM8yk73KCvNHb1P9ygrzaPMNLey0qP3mXUfZ/qijzumudXRF725eF93AMcATXQAAJrRvyBLr1w3RrOXbNHCNV8ydDRFvP9ZmWb+g+GhAAAAh5PmcUWHkHbJ+N51xhj5a0P6riqgPftrtacqoL3VAX23P1h3H9C+6qD2VUeP76sOquJAUKGIUSAciTXlrUr3uNTB51ZmmlsdfC518LrVwVd387qU4Y0eb3yf4XUp3Rtdm+F1Kc3jih7zRI/73FwhD6QamugAAMSR5nHpNxcM1hn98vSruqGjk55YqTsZOtru1IbCenRpsZ6uGx7aO6+j/sjwUAAAgKPmcDiUlRa9erxHTocjOscYo/2BsCoORJvrlQdCqjgQVGVNUJUHgqqsCanyQFD+mpAqa4KqqgnJX1t3XxOSvzakQCgiSToQDOtAMKzyKuuN+EO/pmhzPt0TbbCn1zfYPS75PM7Y8TSPs+7epTS3Uz5PtAGfdtD9wce9bmf0uDv6ej63U14XjXvATjTRAQA4jDP752npTQwdba8+3+3XDc9v0Oa64aFTTosOD0338tY9AAAAdnA4HLG3Y4m+pUzL1YbC2l8bVlVNSFW1Ie0P1N3XhlRdG1ZVbUjVgZCqasM6EAhpfyAcfS4Q1oFAWNXB6LoDwXDsWCAcbcwbo+jb0wTCx/LLPiyvyxlrssdurmYe1917Gt3Xn+dxOeRx1T3nqvvY3fCxu+55b90ad2x903u3yymP0yF3/Rpn9N7tdNDsR7tEEx0AgCNQP3R0weodms3Q0XahueGhj/zXEI0fyPBQAACAts7ndsnndqlLB+8xe81QOBK7sr0mEFF1MKSaYEQHAmHV1DXba4Jh1YTCqglGVBMMqzYYVk0o+rgmGFZt3eOm9xHVhsKqDUZUG4o+DoSijxsLhCPRRv6xu6i+VXhcDrkbNdW/r+HudjnlcjrkcTnkckbXuZyOuuedcsc+bvTY6ZCr7nyXs+nx+nXO+nWN7usfOx3R13c6op/D6ZTcTucha1yNznM5HLF1TqfkanSuq/F6h4O3gmynkqKJPnfuXD366KMqKSnRKaecoscff1wjRoyIu/6FF17QPffcox07dqhPnz767W9/q3PPPTeBOwYApCKHw6HpY07SqF5ddcPz67W1tErT5q/V9DE9dNvZqTd0tC3n93f7A7r1xU9jw0PH9snR7y46RXlZDA8FAKA5Lc19oD1yu5zKdDmVmeZJyOczJvq+8PUN9dpQ9HHsFg43PRY+9HEwbGJrQ+Ho6wUbPxeOKFi3vv75UDj6XLBubcNjo1Ak0uR1jDl039H1YSmYkDIlpcbN92iTveGY09H0vvHzTkej550OuRyKHjtoTdO1jdfUnRN7HG3+O+ofN37O6ZCj7vXrn3M02m/jx06Ho+41ouc7HA3HG+6jn6t+bePnHfXPx15Lh+zBIcW+znjrGx5LHXxudcu29pcqVtjeRP/73/+umTNnat68eRo5cqTmzJmjCRMmqLi4WHl5h17Zt3r1al1yySWaPXu2zj//fD333HO68MIL9fHHH2vw4ME2fAUAgFTTvyBLr153emzo6PxVO/TBtu/08rWjU6aR3pbze+eeav103mrtrhseeuvZ/XTFmJO4YgQAgDhamvsAjg2HwxG7oj7T7s3EEY5EG+yhiIk13+sb7fXHg3UN+vrjoYg5ZH04YhQKm+jrNfo4FImuiUSMghGjcN1rhOteo+E+0vBxo+cjJvr5IyZ6LHb8oPMjEaOwaTg3bOo+ZziiiFHsWP1rHEldjmQdrPuPgfl66vLhCft8DmOa+zejxBk5cqROPfVUPfHEE5KkSCSiwsJCXX/99br99tsPWT958mTt379fr732WuzYaaedpqFDh2revHmH/XyVlZXKzs5WRUWFsrIYFgYAODrvFJXqlhc+1cRTuuveSYOO+vXaSk4lOr+lY1ebcMTosqc/UHlVQH/42VAN6p5t+bUAAKjXVjLcipbmfmPtuS4AUlekUVM9Yho14mMNeCkUiSgSUbSBb0zTcyJqcn64USM/YtTktYwxCtetjzRab0zdMdPwuSMm+vkidedEYueo0Ws1rDON18VuamadYns1jf5RwdStjX6N0cem7pz6PUYaPa+61w9HjKIfNvp6ok83OdeY6PGm66NrzuyXp/+ePPSo/1seaU7ZeiV6IBDQunXrdMcdd8SOOZ1OjR8/XmvWrGn2nDVr1mjmzJlNjk2YMEGLFy9uza0CANCss/rn642bxiorQX/SmQzaen67nA49cekP1cHrZngoAACHYSX3AaC9czodcsqhFPlDZMjmJnp5ebnC4bDy85sO8MrPz1dRUVGz55SUlDS7vqSkpNn1tbW1qq1tmLhQWVl5lLsGAKCpvMzUeh/tROS31LoZntPRd8xeCwCA9qyluc/v4ACA9shp9wZa2+zZs5WdnR27FRYW2r0lAABwBMhwAADaHvIbANAe2dpEz8nJkcvlUmlpaZPjpaWlKigoaPacgoKCFq2/4447VFFREbvt2rXr2GweAIAUlYj8lshwAACSQUtzn/wGALRHtjbRvV6vhg0bpmXLlsWORSIRLVu2TKNGjWr2nFGjRjVZL0lvvfVW3PU+n09ZWVlNbgAAwLpE5LdEhgMAkAxamvvkNwCgPbL1PdElaebMmZo6daqGDx+uESNGaM6cOdq/f7+mT58uSbr88st13HHHafbs2ZKkG2+8UePGjdPvfvc7nXfeeVq0aJE++ugjPfnkk3Z+GQAApBTyGwCA1HG43AcAoL2zvYk+efJklZWV6de//rVKSko0dOhQLV26NDa0ZOfOnXI6Gy6YHz16tJ577jndfffduvPOO9WnTx8tXrxYgwcPtutLAAAg5ZDfAACkjsPlPgAA7Z3DGGPs3kQiVVZWKjs7WxUVFfxZGQAg6ZBT8VEbAEAyI6eaR10AAMnsSHPK1vdEBwAAAAAAAAAgmdFEBwAAAAAAAAAgDproAAAAAAAAAADEQRMdAAAAAAAAAIA43HZvINHq56hWVlbavBMAAA5Vn08pNvf7iJDhAIBkRoY3j/wGACSzI83vlGui+/1+SVJhYaHNOwEAID6/36/s7Gy7t5FUyHAAQFtAhjdFfgMA2oLD5bfDpNg/k0ciEX3zzTfKzMyUw+Fo8fmVlZUqLCzUrl27lJWV1Qo7bL+onXXUzjpqZx21s+5oameMkd/vV/fu3eV08q5rjR1NhvP9bB21s47aWUftrKN21h1t7cjw5vE7uH2onXXUzjpqZx21sy4Rv4On3JXoTqdTxx9//FG/TlZWFt/QFlE766idddTOOmpnndXacfVa845FhvP9bB21s47aWUftrKN21h1N7cjwQ/E7uP2onXXUzjpqZx21s641fwfnn8cBAAAAAAAAAIiDJjoAAAAAAAAAAHHQRG8hn8+nWbNmyefz2b2VNofaWUftrKN21lE766hd8uG/iXXUzjpqZx21s47aWUftkhP/XayjdtZRO+uonXXUzrpE1C7lB
osCAAAAAAAAAHCkuBIdAAAAAAAAAIA4aKIDAAAAAAAAABAHTXQAAAAAAAAAAOKgid6MuXPnqkePHkpLS9PIkSP173//+3vXv/DCC+rfv7/S0tJ08skna8mSJQnaafJpSe2eeuopjR07Vp07d1bnzp01fvz4w9a6PWvp9129RYsWyeFw6MILL2zdDSaxltZu3759mjFjhrp16yafz6e+ffum7P9uW1q7OXPmqF+/fkpPT1dhYaFuvvlm1dTUJGi3yWHFihWaOHGiunfvLofDocWLFx/2nOXLl+uHP/yhfD6fevfurQULFrT6PlMR+W0d+W0d+W0d+W0d+W0NGZ68yHDryHDryHBryG/ryG9rkia/DZpYtGiR8Xq95plnnjGbNm0yV199tenUqZMpLS1tdv2qVauMy+UyjzzyiNm8ebO5++67jcfjMRs3bkzwzu3X0tpdeumlZu7cuWb9+vVmy5YtZtq0aSY7O9t89dVXCd65/Vpau3rbt283xx13nBk7dqy54IILErPZJNPS2tXW1prhw4ebc88916xcudJs377dLF++3GzYsCHBO7dfS2v37LPPGp/PZ5599lmzfft28+abb5pu3bqZm2++OcE7t9eSJUvMXXfdZV566SUjybz88svfu37btm0mIyPDzJw502zevNk8/vjjxuVymaVLlyZmwymC/LaO/LaO/LaO/LaO/LaODE9OZLh1ZLh1ZLg15Ld15Ld1yZLfNNEPMmLECDNjxozYx+Fw2HTv3t3Mnj272fUXX3yxOe+885ocGzlypPn5z3/eqvtMRi2t3cFCoZDJzMw0CxcubK0tJi0rtQuFQmb06NHm6aefNlOnTk3JADem5bX785//bHr27GkCgUCitpi0Wlq7GTNmmLPOOqvJsZkzZ5oxY8a06j6T2ZEE+K233moGDRrU5NjkyZPNhAkTWnFnqYf8to78to78to78to78PjbI8ORBhltHhltHhltDfltHfh8bduY3b+fSSCAQ0Lp16zR+/PjYMafTqfHjx2vNmjXNnrNmzZom6yVpwoQJcde3V1Zqd7Dq6moFg0F16dKltbaZlKzW7r777lNeXp6uvPLKRGwzKVmp3auvvqpRo0ZpxowZys/P1+DBg/XQQw8pHA4nattJwUrtRo8erXXr1sX+5Gzbtm1asmSJzj333ITsua0iJ1of+W0d+W0d+W0d+W0d+Z1YZEXrI8OtI8OtI8OtIb+tI78Tq7Vywn1UZ7cz5eXlCofDys/Pb3I8Pz9fRUVFzZ5TUlLS7PqSkpJW22cyslK7g912223q3r37Id/o7Z2V2q1cuVJ/+ctftGHDhgTsMHlZqd22bdv0zjvv6LLLLtOSJUv0+eef69prr1UwGNSsWbMSse2kYKV2l156qcrLy3X66afLGKNQKKRf/OIXuvPOOxOx5TYrXk5UVlbqwIEDSk9Pt2ln7Qf5bR35bR35bR35bR35nVhkeOsjw60jw60jw60hv60jvxOrtfKbK9GRFB5++GEtWrRIL7/8stLS0uzeTlLz+/2aMmWKnnrqKeXk5Ni9nTYnEokoLy9PTz75pIYNG6bJkyfrrrvu0rx58+zeWtJbvny5HnroIf3pT3/Sxx9/rJdeekmvv/667r//fru3BsAm5PeRI7+PDvltHfkNoDlk+JEjw60jv60jv5MPV6I3kpOTI5fLpdLS0ibHS0tLVVBQ0Ow5BQUFLVrfXlmpXb3HHntMDz/8sN5++20NGTKkNbeZlFpauy+++EI7duzQxIkTY8cikYgkye12q7i4WL169WrdTScJK9933bp1k8fjkcvlih0bMGCASkpKFAgE5PV6W3XPycJK7e655x5NmTJFV111lSTp5JNP1v79+3XNNdforrvuktPJv8s2J15OZGVlcQXbMUJ+W0d+W0d+W0d+W0d+JxYZ3vrIcOvIcOvIcGvIb+vI78Rqrfym4o14vV4NGzZMy5Ytix2LRCJatmyZRo0a1ew5o0aNarJekt56662469srK7WTpEceeUT333+/li5dquHDhydiq0mnpbXr37+/Nm7cqA0bNsRukyZN0plnnqkNGzaosLAwkdu3lZXvuzFjxujzzz+P/dAjSVu3blW3bt1SJsAla7Wrrq4+JKjrfxiKzvdAc8iJ1kd+W0d+W0d+W0d+W0d+JxZZ0frIcOvIcOvIcGvIb+vI78RqtZw4qrGk7dCiRYuMz+czCxYsMJs3bzbXXHON6dSpkykpKTHGGDNlyhRz++23x9avWrXKuN1u89hjj5ktW7aYWbNmGY/HYzZu3GjXl2Cbltbu4YcfNl6v17z44ovm22+/jd38fr9dX4JtWlq7g6XqZHBjWl67nTt3mszMTHPdddeZ4uJi89prr5m8vDzzwAMP2PUl2KaltZs1a5bJzMw0zz//vNm2bZv517/+ZXr16mUuvvhiu74EW/j9frN+/Xqzfv16I8n8/ve/N+vXrzdffvmlMcaY22+/3UyZMiW2ftu2bSYjI8PccsstZsuWLWbu3LnG5XKZpUuX2vUltEvkt3Xkt3Xkt3Xkt3Xkt3VkeHIiw60jw60jw60hv60jv61Llvymid6Mxx9/3JxwwgnG6/WaESNGmA8++CD23Lhx48zUqVObrP/HP/5h+vbta7xerxk0aJB5/fXXE7zj5NGS2p144olG0iG3WbNmJX7jSaCl33eNpWqA12tp7VavXm1GjhxpfD6f6dmzp3nwwQdNKBRK8K6TQ0tqFwwGzb333mt69epl0tLSTGFhobn22mvN3r17E79xG7377rvN/n9Xfa2mTp1qxo0bd8g5Q4cONV6v1/Ts2dPMnz8/4ftOBeS3deS3deS3deS3deS3NWR48iLDrSPDrSPDrSG/rSO/rUmW/HYYw98AAAAAAAAAAADQHN4THQAAAAAAAACAOGiiAwAAAAAAAAAQB010AAAAAAAAAADioIkOAAAAAAAAAEAcNNEBAAAAAAAAAIiDJjoAAAAAAAAAAHHQRAcAAAAAAAAAIA6a6AAAAAAAAAAAxEETHQAAAAAAAACAOGiiAzgqZ5xxhm666Sa7twEAAFqA/AYAoG0iwwF70EQHAAAAAAAAACAOhzHG2L0JAG3TtGnTtHDhwibHtm/frh49etizIQAAcFjkNwAAbRMZDtiHJjoAyyoqKnTOOedo8ODBuu+++yRJubm5crlcNu8MAADEQ34DANA2keGAfdx2bwBA25WdnS2v16uMjAwVFBTYvR0AAHAEyG8AANomMhywD++JDgAAAAAAAABAHDTRAQAAAAAAAACIgyY6gKPi9XoVDoft3gYAAGgB8hsAgLaJDAfsQRMdwFHp0aOHPvzwQ+3YsUPl5eWKRCJ2bwkAABwG+Q0AQNtEhgP2oIkO4Kj86le/ksvl0sCBA5Wbm6udO3favSUAAHAY5DcAAG0TGQ7Yw2GMMXZvAgAAAAAAAACAZMSV6AAAAAAAAAAAxEETHQAAAAAAAACAOGiiAwAAAAAAAAAQB010AAAAAAAAAADioIkOAAAAAAAAAEAcNNEBAAAAAAAAAIiDJjoAAAAAAAAAAHHQ
RAcAAAAAAAAAIA6a6AAAAAAAAAAAxEETHQAAAAAAAACAOGiiAwAAAAAAAAAQB010AAAAAAAAAADi+P8VAKqjz+IYLAAAAABJRU5ErkJggg==",
      "text/plain": [
       "<Figure size 1500x400 with 3 Axes>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "class NoiseSchedule:\n",
    "    \"\"\"Log-linear noise schedule for MDLM.\n",
    "\n",
    "    Forward process: q(z_t | x) = Cat(z_t; \u03b1(t) * x + (1 - \u03b1(t)) * m)\n",
    "    where m is the one-hot mask token vector.\n",
    "\n",
    "    Each token is independently masked with probability 1 - \u03b1(t).\n",
    "    \"\"\"\n",
    "\n",
    "    def alpha(self, t: torch.Tensor) -> torch.Tensor:\n",
    "        \"\"\"Probability a token is UNMASKED at time t.\"\"\"\n",
    "        # Clamp to avoid numerical issues at boundaries\n",
    "        return torch.clamp(1.0 - t, min=1e-5, max=1.0 - 1e-5)\n",
    "\n",
    "    def alpha_prime(self, t: torch.Tensor) -> torch.Tensor:\n",
    "        \"\"\"Derivative of alpha w.r.t. t. For log-linear: d\u03b1/dt = -1.\"\"\"\n",
    "        return torch.full_like(t, -1.0)\n",
    "\n",
    "    def loss_weight(self, t: torch.Tensor) -> torch.Tensor:\n",
    "        \"\"\"MDLM loss weight: -\u03b1'(t) / (1 - \u03b1(t)) = 1/t for log-linear schedule.\"\"\"\n",
    "        return -self.alpha_prime(t) / (1.0 - self.alpha(t))\n",
    "\n",
    "    def sample_t(self, batch_size: int, device: torch.device) -> torch.Tensor:\n",
    "        \"\"\"Sample timesteps uniformly from (0, 1).\"\"\"\n",
    "        # Importance sampling: uniform in t, which for log-linear is already good\n",
    "        return torch.rand(batch_size, device=device) * (1.0 - 2e-5) + 1e-5\n",
    "\n",
    "    def forward_process(self, x: torch.Tensor, t: torch.Tensor, mask_token_id: int) -> torch.Tensor:\n",
    "        \"\"\"Apply forward masking process.\n",
    "\n",
    "        Args:\n",
    "            x: [B, L] token ids\n",
    "            t: [B] timesteps\n",
    "        Returns:\n",
    "            z_t: [B, L] masked token ids\n",
    "        \"\"\"\n",
    "        alpha_t = self.alpha(t)[:, None]  # [B, 1]\n",
    "        # Each token is independently masked with probability (1 - alpha_t)\n",
    "        mask_prob = 1.0 - alpha_t  # [B, 1]\n",
    "        mask = torch.rand_like(x.float()) < mask_prob  # [B, L]\n",
    "        z_t = torch.where(mask, mask_token_id, x)\n",
    "        return z_t, mask\n",
    "\n",
    "noise_schedule = NoiseSchedule()\n",
    "\n",
    "# Quick visualization\n",
    "import matplotlib.pyplot as plt\n",
    "t_vis = torch.linspace(0.01, 0.99, 100)\n",
    "fig, axes = plt.subplots(1, 3, figsize=(15, 4))\n",
    "axes[0].plot(t_vis.numpy(), noise_schedule.alpha(t_vis).numpy())\n",
    "axes[0].set_title(\"\u03b1(t) \u2014 Prob token is unmasked\")\n",
    "axes[0].set_xlabel(\"t\"); axes[0].set_ylabel(\"\u03b1(t)\")\n",
    "axes[1].plot(t_vis.numpy(), (1 - noise_schedule.alpha(t_vis)).numpy())\n",
    "axes[1].set_title(\"1 - \u03b1(t) \u2014 Prob token is masked\")\n",
    "axes[1].set_xlabel(\"t\"); axes[1].set_ylabel(\"1 - \u03b1(t)\")\n",
    "axes[2].plot(t_vis.numpy(), noise_schedule.loss_weight(t_vis).numpy())\n",
    "axes[2].set_title(\"Loss weight: -\u03b1'(t)/(1-\u03b1(t))\")\n",
    "axes[2].set_xlabel(\"t\"); axes[2].set_ylabel(\"weight\")\n",
    "plt.tight_layout(); plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "39c20f3a",
   "metadata": {},
   "source": [
    "## Model Architecture\n",
    "\n",
    "The MDLM model is a **bidirectional transformer** (like BERT, unlike GPT which is causal/unidirectional). Key components:\n",
    "\n",
    "1. **Token + Timestep Embedding**: Token IDs \u2192 embeddings, timestep \u2192 sinusoidal embedding \u2192 MLP \u2192 added to hidden states\n",
    "2. **RoPE (Rotary Positional Embeddings)**: Better than absolute positional embeddings for length generalization\n",
    "3. **Transformer Blocks**: Standard pre-norm transformer with bidirectional (full) attention\n",
    "4. **Output Head**: Projects hidden states \u2192 vocab logits, with [MASK] logit set to -\u221e (model can never predict MASK)\n",
    "\n",
    "The **SUBS parameterization** from the MDLM paper:\n",
    "- At masked positions: predict token distribution (cross-entropy loss)\n",
    "- At unmasked positions: copy through unchanged (no loss computed)"
   ]
  },
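  {
   "cell_type": "markdown",
   "id": "nelbo_note",
   "metadata": {},
   "source": [
    "Putting the pieces together: with SUBS, the continuous-time objective reduces to a weighted masked cross-entropy. A sketch of the standard MDLM NELBO, in the notation of the schedule defined above (this is the quantity `compute_loss` implements below):\n",
    "\n",
    "$$\\mathcal{L} = \\mathbb{E}_{t \\sim U(0,1)}\\, \\mathbb{E}_{z_t \\sim q(\\cdot \\mid x)} \\left[ \\frac{-\\alpha'(t)}{1 - \\alpha(t)} \\sum_{\\ell \\,:\\, z_t^{\\ell} = m} -\\log p_\\theta\\big(x^{\\ell} \\mid z_t, t\\big) \\right]$$\n",
    "\n",
    "Only masked positions contribute (SUBS makes unmasked positions exact copies), and for the log-linear schedule $\\alpha(t) = 1 - t$ the weight equals $1/t$, upweighting lightly-masked inputs."
   ]
  },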
  {
   "cell_type": "code",
   "execution_count": 6,
   "id": "e641f027",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\u2713 Model components defined\n"
     ]
    }
   ],
   "source": [
    "class RotaryEmbedding(nn.Module):\n",
    "    \"\"\"Rotary Positional Embeddings (RoPE) - Su et al. 2021.\"\"\"\n",
    "\n",
    "    def __init__(self, dim: int, max_seq_len: int = 4096):\n",
    "        super().__init__()\n",
    "        inv_freq = 1.0 / (10000 ** (torch.arange(0, dim, 2).float() / dim))\n",
    "        self.register_buffer(\"inv_freq\", inv_freq)\n",
    "        self._build_cache(max_seq_len)\n",
    "\n",
    "    def _build_cache(self, seq_len: int):\n",
    "        t = torch.arange(seq_len, device=self.inv_freq.device).float()\n",
    "        freqs = torch.einsum(\"i,j->ij\", t, self.inv_freq)\n",
    "        emb = torch.cat([freqs, freqs], dim=-1)\n",
    "        self.register_buffer(\"cos_cached\", emb.cos()[None, None, :, :])\n",
    "        self.register_buffer(\"sin_cached\", emb.sin()[None, None, :, :])\n",
    "\n",
    "    def forward(self, x: torch.Tensor) -> tuple:\n",
    "        seq_len = x.shape[2]\n",
    "        return self.cos_cached[:, :, :seq_len, :], self.sin_cached[:, :, :seq_len, :]\n",
    "\n",
    "\n",
    "def rotate_half(x):\n",
    "    x1, x2 = x.chunk(2, dim=-1)\n",
    "    return torch.cat([-x2, x1], dim=-1)\n",
    "\n",
    "\n",
    "def apply_rotary_emb(x, cos, sin):\n",
    "    return x * cos + rotate_half(x) * sin\n",
    "\n",
    "\n",
    "class TimestepEmbedding(nn.Module):\n",
    "    \"\"\"Sinusoidal timestep embedding \u2192 MLP, following DiT (Peebles & Xie 2023).\"\"\"\n",
    "\n",
    "    def __init__(self, hidden_dim: int):\n",
    "        super().__init__()\n",
    "        self.mlp = nn.Sequential(\n",
    "            nn.Linear(hidden_dim, hidden_dim * 4),\n",
    "            nn.SiLU(),\n",
    "            nn.Linear(hidden_dim * 4, hidden_dim),\n",
    "        )\n",
    "        self.hidden_dim = hidden_dim\n",
    "\n",
    "    def forward(self, t: torch.Tensor) -> torch.Tensor:\n",
    "        # Sinusoidal embedding\n",
    "        half_dim = self.hidden_dim // 2\n",
    "        emb = math.log(10000) / (half_dim - 1)\n",
    "        emb = torch.exp(torch.arange(half_dim, device=t.device, dtype=torch.float32) * -emb)\n",
    "        emb = t[:, None].float() * emb[None, :]\n",
    "        emb = torch.cat([torch.sin(emb), torch.cos(emb)], dim=-1)\n",
    "        return self.mlp(emb)\n",
    "\n",
    "\n",
    "class RMSNorm(nn.Module):\n",
    "    def __init__(self, dim: int, eps: float = 1e-6):\n",
    "        super().__init__()\n",
    "        self.weight = nn.Parameter(torch.ones(dim))\n",
    "        self.eps = eps\n",
    "\n",
    "    def forward(self, x):\n",
    "        norm = x.float().pow(2).mean(-1, keepdim=True).add(self.eps).rsqrt()\n",
    "        return (x.float() * norm).type_as(x) * self.weight\n",
    "\n",
    "\n",
    "class MultiHeadAttention(nn.Module):\n",
    "    \"\"\"Bidirectional multi-head attention with RoPE.\"\"\"\n",
    "\n",
    "    def __init__(self, hidden_dim: int, num_heads: int, dropout: float = 0.0):\n",
    "        super().__init__()\n",
    "        self.num_heads = num_heads\n",
    "        self.head_dim = hidden_dim // num_heads\n",
    "        self.qkv = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)\n",
    "        self.out_proj = nn.Linear(hidden_dim, hidden_dim, bias=False)\n",
    "        self.dropout = nn.Dropout(dropout)\n",
    "        self.rotary = RotaryEmbedding(self.head_dim)\n",
    "\n",
    "    def forward(self, x: torch.Tensor) -> torch.Tensor:\n",
    "        B, L, D = x.shape\n",
    "        qkv = self.qkv(x).reshape(B, L, 3, self.num_heads, self.head_dim)\n",
    "        qkv = qkv.permute(2, 0, 3, 1, 4)  # [3, B, H, L, D]\n",
    "        q, k, v = qkv.unbind(0)\n",
    "\n",
    "        # Apply RoPE\n",
    "        cos, sin = self.rotary(q)\n",
    "        q = apply_rotary_emb(q, cos, sin)\n",
    "        k = apply_rotary_emb(k, cos, sin)\n",
    "\n",
    "        # Scaled dot-product attention (uses Flash Attention when available)\n",
    "        attn = F.scaled_dot_product_attention(q, k, v, dropout_p=self.dropout.p if self.training else 0.0)\n",
    "\n",
    "        attn = attn.transpose(1, 2).reshape(B, L, D)\n",
    "        return self.out_proj(attn)\n",
    "\n",
    "\n",
    "class TransformerBlock(nn.Module):\n",
    "    \"\"\"Pre-norm transformer block with adaptive timestep conditioning (adaLN-Zero from DiT).\"\"\"\n",
    "\n",
    "    def __init__(self, hidden_dim: int, num_heads: int, mlp_ratio: float = 4.0, dropout: float = 0.0):\n",
    "        super().__init__()\n",
    "        self.norm1 = RMSNorm(hidden_dim)\n",
    "        self.attn = MultiHeadAttention(hidden_dim, num_heads, dropout)\n",
    "        self.norm2 = RMSNorm(hidden_dim)\n",
    "        mlp_dim = int(hidden_dim * mlp_ratio)\n",
    "        self.mlp = nn.Sequential(\n",
    "            nn.Linear(hidden_dim, mlp_dim, bias=False),\n",
    "            nn.GELU(),\n",
    "            nn.Linear(mlp_dim, hidden_dim, bias=False),\n",
    "        )\n",
    "        # AdaLN modulation: scale and shift for both norm layers + gate for both branches\n",
    "        self.adaLN_modulation = nn.Sequential(\n",
    "            nn.SiLU(),\n",
    "            nn.Linear(hidden_dim, 6 * hidden_dim),\n",
    "        )\n",
    "\n",
    "    def forward(self, x: torch.Tensor, t_emb: torch.Tensor) -> torch.Tensor:\n",
    "        # t_emb: [B, D] -> modulation params\n",
    "        mod = self.adaLN_modulation(t_emb)[:, None, :]  # [B, 1, 6D]\n",
    "        shift1, scale1, gate1, shift2, scale2, gate2 = mod.chunk(6, dim=-1)\n",
    "\n",
    "        # Attention branch with adaLN\n",
    "        h = self.norm1(x) * (1 + scale1) + shift1\n",
    "        x = x + gate1 * self.attn(h)\n",
    "\n",
    "        # MLP branch with adaLN\n",
    "        h = self.norm2(x) * (1 + scale2) + shift2\n",
    "        x = x + gate2 * self.mlp(h)\n",
    "        return x\n",
    "\n",
    "\n",
    "print(\"\u2713 Model components defined\")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "id": "4fc8370e",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Total parameters: 170.8M\n",
      "Unique parameters (weight tying): 132.2M\n",
      "Memory after forward: 0.72 GB / 15.6 GB\n",
      "Memory test passed!\n"
     ]
    }
   ],
   "source": [
    "class MDLM(nn.Module):\n",
    "    \"\"\"Masked Diffusion Language Model.\n",
    "\n",
    "    Architecture: Bidirectional Transformer with timestep conditioning (DiT-style).\n",
    "    Training: Weighted MLM loss integrated over noise levels.\n",
    "    Inference: Iterative unmasking from all-[MASK] input.\n",
    "    \"\"\"\n",
    "\n",
    "    def __init__(self, config: MDLMConfig):\n",
    "        super().__init__()\n",
    "        self.config = config\n",
    "        self.noise_schedule = NoiseSchedule()\n",
    "\n",
    "        # Token embedding (shared with output head for weight tying)\n",
    "        self.token_emb = nn.Embedding(config.vocab_size, config.hidden_dim)\n",
    "\n",
    "        # Timestep embedding\n",
    "        self.time_emb = TimestepEmbedding(config.hidden_dim)\n",
    "\n",
    "        # Transformer blocks\n",
    "        self.blocks = nn.ModuleList([\n",
    "            TransformerBlock(config.hidden_dim, config.num_heads, config.mlp_ratio, config.dropout)\n",
    "            for _ in range(config.num_layers)\n",
    "        ])\n",
    "\n",
    "        # Final norm\n",
    "        self.final_norm = RMSNorm(config.hidden_dim)\n",
    "\n",
    "        # Output projection (weight-tied with token embedding)\n",
    "        self.output_proj = nn.Linear(config.hidden_dim, config.vocab_size, bias=False)\n",
    "        self.output_proj.weight = self.token_emb.weight  # Weight tying\n",
    "\n",
    "        # Initialize weights\n",
    "        self.apply(self._init_weights)\n",
    "\n",
    "        # Zero-init the adaLN modulation and output gates (DiT recipe)\n",
    "        for block in self.blocks:\n",
    "            nn.init.zeros_(block.adaLN_modulation[-1].weight)\n",
    "            nn.init.zeros_(block.adaLN_modulation[-1].bias)\n",
    "\n",
    "    def _init_weights(self, module):\n",
    "        if isinstance(module, nn.Linear):\n",
    "            nn.init.normal_(module.weight, std=0.02)\n",
    "            if module.bias is not None:\n",
    "                nn.init.zeros_(module.bias)\n",
    "        elif isinstance(module, nn.Embedding):\n",
    "            nn.init.normal_(module.weight, std=0.02)\n",
    "\n",
    "    def forward_hidden(self, z_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:\n",
    "        \"\"\"Forward pass returning hidden states (before output projection).\n",
    "\n",
    "        Args:\n",
    "            z_t: [B, L] noised token ids\n",
    "            t: [B] timesteps in [0, 1]\n",
    "        Returns:\n",
    "            hidden: [B, L, D] hidden states\n",
    "        \"\"\"\n",
    "        x = self.token_emb(z_t)\n",
    "        t_emb = self.time_emb(t)\n",
    "\n",
    "        for block in self.blocks:\n",
    "            if self.training and torch.is_grad_enabled():\n",
    "                x = torch.utils.checkpoint.checkpoint(block, x, t_emb, use_reentrant=False)\n",
    "            else:\n",
    "                x = block(x, t_emb)\n",
    "\n",
    "        return self.final_norm(x)\n",
    "\n",
    "    def forward(self, z_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:\n",
    "        \"\"\"Forward pass returning hidden states [B, L, D].\n",
    "        Used by DataParallel \u2014 logit projection done outside for memory efficiency.\n",
    "        For full logits (sampling), use forward_full().\"\"\"\n",
    "        return self.forward_hidden(z_t, t)\n",
    "\n",
    "    def forward_full(self, z_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:\n",
    "        \"\"\"Full forward pass returning logits [B, L, V]. Used for sampling.\"\"\"\n",
    "        hidden = self.forward_hidden(z_t, t)\n",
    "        logits = self.output_proj(hidden)\n",
    "        logits[:, :, self.config.mask_token_id] = -1e9\n",
    "        return logits\n",
    "\n",
    "    def compute_loss(self, x: torch.Tensor) -> dict:\n",
    "        \"\"\"Compute MDLM training loss \u2014 memory efficient.\n",
    "\n",
    "        Only computes logits/CE at masked positions to avoid materializing\n",
    "        the full [B, L, V] tensor which OOMs on T4.\n",
    "        \"\"\"\n",
    "        B, L = x.shape\n",
    "\n",
    "        # Sample timesteps\n",
    "        t = self.noise_schedule.sample_t(B, x.device)\n",
    "\n",
    "        # Forward process: mask tokens\n",
    "        z_t, mask = self.noise_schedule.forward_process(x, t, self.config.mask_token_id)\n",
    "\n",
    "        # Get hidden states [B, L, D] \u2014 no V-dim tensor yet\n",
    "        hidden = self.forward_hidden(z_t, t)\n",
    "\n",
    "        # Only compute logits at masked positions to save memory\n",
    "        # mask: [B, L] bool\n",
    "        masked_hidden = hidden[mask]          # [N_masked, D]\n",
    "        masked_targets = x[mask]              # [N_masked]\n",
    "\n",
    "        if masked_hidden.shape[0] == 0:\n",
    "            # Edge case: nothing masked (very rare, t near 0)\n",
    "            return {'loss': torch.tensor(0.0, device=x.device), 'accuracy': torch.tensor(1.0), 'mask_rate': torch.tensor(0.0), 'mean_t': t.mean()}\n",
    "\n",
    "        # Project only masked positions to vocab [N_masked, V]\n",
    "        masked_logits = F.linear(masked_hidden, self.output_proj.weight)\n",
    "        masked_logits[:, self.config.mask_token_id] = -1e9\n",
    "\n",
    "        # CE loss at masked positions\n",
    "        ce_loss = F.cross_entropy(masked_logits, masked_targets, reduction='none')  # [N_masked]\n",
    "\n",
    "        # Per-sample weight: expand t weights to match each masked token\n",
    "        # Build per-token weight from per-sample weight\n",
    "        weight = self.noise_schedule.loss_weight(t)  # [B]\n",
    "        weight_expanded = weight[:, None].expand(B, L)[mask]  # [N_masked]\n",
    "\n",
    "        loss = (ce_loss * weight_expanded).mean()\n",
    "\n",
    "        # Diagnostics\n",
    "        with torch.no_grad():\n",
    "            preds = masked_logits.argmax(dim=-1)\n",
    "            accuracy = (preds == masked_targets).float().mean()\n",
    "            avg_mask_rate = mask.float().mean()\n",
    "\n",
    "        return {\n",
    "            'loss': loss,\n",
    "            'accuracy': accuracy,\n",
    "            'mask_rate': avg_mask_rate,\n",
    "            'mean_t': t.mean(),\n",
    "        }\n",
    "\n",
    "    @torch.no_grad()\n",
    "    def sample(self, batch_size: int, seq_len: int, steps: int = None, temperature: float = 1.0,\n",
    "               device: torch.device = None) -> torch.Tensor:\n",
    "        \"\"\"Generate text via iterative unmasking.\"\"\"\n",
    "        if steps is None:\n",
    "            steps = self.config.sampling_steps\n",
    "        if device is None:\n",
    "            device = next(self.parameters()).device\n",
    "\n",
    "        x = torch.full((batch_size, seq_len), self.config.mask_token_id, dtype=torch.long, device=device)\n",
    "        timesteps = torch.linspace(1.0 - 1e-5, 1e-5, steps + 1, device=device)\n",
    "\n",
    "        for i in range(steps):\n",
    "            t_now = timesteps[i]\n",
    "            t_next = timesteps[i + 1]\n",
    "\n",
    "            alpha_now = self.noise_schedule.alpha(t_now)\n",
    "            alpha_next = self.noise_schedule.alpha(t_next)\n",
    "\n",
    "            t_batch = torch.full((batch_size,), t_now.item(), device=device)\n",
    "            logits = self.forward_full(x, t_batch)\n",
    "            probs = F.softmax(logits / temperature, dim=-1)\n",
    "\n",
    "            unmask_prob = ((alpha_next - alpha_now) / (1.0 - alpha_now + 1e-8)).clamp(0, 1)\n",
    "            is_masked = (x == self.config.mask_token_id)\n",
    "            unmask = is_masked & (torch.rand_like(x.float()) < unmask_prob)\n",
    "\n",
    "            if unmask.any():\n",
    "                flat_probs = probs.reshape(-1, self.config.vocab_size)\n",
    "                sampled = torch.multinomial(flat_probs, 1).reshape(batch_size, seq_len)\n",
    "                x = torch.where(unmask, sampled, x)\n",
    "\n",
    "        # Final cleanup\n",
    "        is_masked = (x == self.config.mask_token_id)\n",
    "        if is_masked.any():\n",
    "            t_batch = torch.full((batch_size,), 1e-5, device=device)\n",
    "            logits = self.forward_full(x, t_batch)\n",
    "            probs = F.softmax(logits / temperature, dim=-1)\n",
    "            flat_probs = probs.reshape(-1, self.config.vocab_size)\n",
    "            sampled = torch.multinomial(flat_probs, 1).reshape(batch_size, seq_len)\n",
    "            x = torch.where(is_masked, sampled, x)\n",
    "\n",
    "        return x\n",
    "\n",
    "\n",
    "# Create model and count parameters\n",
    "model = MDLM(config).to(device)\n",
    "total_params = sum(p.numel() for p in model.parameters())\n",
    "unique_params = total_params - model.token_emb.weight.numel()\n",
    "print(f\"Total parameters: {total_params / 1e6:.1f}M\")\n",
    "print(f\"Unique parameters (weight tying): {unique_params / 1e6:.1f}M\")\n",
    "\n",
    "# Multi-GPU support (Kaggle T4 x2)\n",
    "model_unwrapped = model\n",
    "if torch.cuda.device_count() > 1:\n",
    "    print(f\"\\nUsing {torch.cuda.device_count()} GPUs with DataParallel!\")\n",
    "    model_dp = nn.DataParallel(model, device_ids=[0, 1], output_device=0)\n",
    "else:\n",
    "    model_dp = model\n",
    "\n",
    "# Quick memory test\n",
    "with torch.no_grad():\n",
    "    test_input = torch.randint(0, 50257, (config.batch_size, config.seq_len), device=device)\n",
    "    _ = model_unwrapped.compute_loss(test_input)\n",
    "    print(f\"Memory after forward: {torch.cuda.memory_allocated() / 1e9:.2f} GB / {torch.cuda.get_device_properties(0).total_memory / 1e9:.1f} GB\")\n",
    "    del test_input, _\n",
    "    torch.cuda.empty_cache()\n",
    "print(\"Memory test passed!\")\n"
   ]
  },
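  {
   "cell_type": "markdown",
   "id": "reverse_note",
   "metadata": {},
   "source": [
    "A note on the reverse step implemented in `sample()` above: moving from time $t$ to $s < t$, each still-masked position is independently unmasked with probability\n",
    "\n",
    "$$p_{\\text{unmask}} = \\frac{\\alpha(s) - \\alpha(t)}{1 - \\alpha(t)}$$\n",
    "\n",
    "with the revealed token drawn from the model's categorical $p_\\theta(\\cdot \\mid z_t, t)$; already-unmasked tokens are never re-masked (the absorbing-state property). For the log-linear schedule with uniformly spaced steps this probability is roughly $1/(\\text{steps remaining})$, so on average the same number of tokens is revealed at every step."
   ]
  },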
  {
   "cell_type": "markdown",
   "id": "63adea66",
   "metadata": {},
   "source": [
    "## Dataset\n",
    "\n",
    "Using **OpenWebText** (open reproduction of GPT-2's WebText dataset) via HuggingFace. We tokenize with the GPT-2 tokenizer and chunk into fixed-length sequences of 256 tokens."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "id": "f507ac5a",
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/usr/local/lib/python3.12/dist-packages/huggingface_hub/utils/_auth.py:104: UserWarning: \n",
      "Error while fetching `HF_TOKEN` secret value from your vault: 'Requesting secret HF_TOKEN timed out. Secrets can only be fetched when running from the Colab UI.'.\n",
      "You are not authenticated with the Hugging Face Hub in this notebook.\n",
      "If the error persists, please let us know by opening an issue on GitHub (https://github.com/huggingface/huggingface_hub/issues/new).\n",
      "  warnings.warn(\n",
      "Warning: You are sending unauthenticated requests to the HF Hub. Please set a HF_TOKEN to enable higher rate limits and faster downloads.\n",
      "WARNING:huggingface_hub.utils._http:Warning: You are sending unauthenticated requests to the HF Hub. Please set a HF_TOKEN to enable higher rate limits and faster downloads.\n",
      "`trust_remote_code` is not supported anymore.\n",
      "Please check that the Hugging Face dataset 'openwebtext' isn't based on a loading script and remove `trust_remote_code`.\n",
      "If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.\n",
      "ERROR:datasets.load:`trust_remote_code` is not supported anymore.\n",
      "Please check that the Hugging Face dataset 'openwebtext' isn't based on a loading script and remove `trust_remote_code`.\n",
      "If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Vocab size: 50257\n",
      "Our vocab size (with MASK): 50258\n",
      "Loading OpenWebText (streaming)...\n"
     ]
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "9be528c3037444dda4da1a01d19ebc6c",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Resolving data files:   0%|          | 0/80 [00:00<?, ?it/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "66fe69f8dfdd46afbfd13e611602fc64",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Resolving data files:   0%|          | 0/80 [00:00<?, ?it/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "Token indices sequence length is longer than the specified maximum sequence length for this model (1217 > 1024). Running this sequence through the model will result in indexing errors\n",
      "Token indices sequence length is longer than the specified maximum sequence length for this model (1795 > 1024). Running this sequence through the model will result in indexing errors\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Batch shape: torch.Size([8, 256])\n",
      "Sample text: Port-au-Prince, Haiti (CNN) -- Earthquake victims, writhing in pain and grasping at life, watched doctors and nurses walk away from a field hospital Friday night after a Belgian medical team evacuated the area, saying it was concerned about security...\n",
      "Token range: [1, 48723]\n"
     ]
    }
   ],
   "source": [
    "# Load tokenizer\n",
    "tokenizer = GPT2TokenizerFast.from_pretrained(\"gpt2\")\n",
    "print(f\"Vocab size: {tokenizer.vocab_size}\")  # 50257\n",
    "print(f\"Our vocab size (with MASK): {config.vocab_size}\")  # 50258\n",
    "\n",
    "# Load OpenWebText dataset (streaming to avoid downloading 40GB+ upfront)\n",
    "print(\"Loading OpenWebText (streaming)...\")\n",
    "dataset = load_dataset(\"openwebtext\", split=\"train\", streaming=True, trust_remote_code=True)\n",
    "\n",
    "class TokenizedDataset(torch.utils.data.IterableDataset):\n",
    "    \"\"\"Tokenize and chunk text into fixed-length sequences on the fly.\"\"\"\n",
    "\n",
    "    def __init__(self, hf_dataset, tokenizer, seq_len: int):\n",
    "        self.dataset = hf_dataset\n",
    "        self.tokenizer = tokenizer\n",
    "        self.seq_len = seq_len\n",
    "\n",
    "    def __iter__(self):\n",
    "        buffer = []\n",
    "        for example in self.dataset:\n",
    "            # Tokenize\n",
    "            tokens = self.tokenizer.encode(example[\"text\"])\n",
    "            buffer.extend(tokens)\n",
    "\n",
    "            # Yield complete chunks\n",
    "            while len(buffer) >= self.seq_len:\n",
    "                yield torch.tensor(buffer[:self.seq_len], dtype=torch.long)\n",
    "                buffer = buffer[self.seq_len:]\n",
    "\n",
    "train_dataset = TokenizedDataset(dataset, tokenizer, config.seq_len)\n",
    "train_loader = DataLoader(\n",
    "    train_dataset,\n",
    "    batch_size=config.batch_size,\n",
    "    num_workers=2,\n",
    "    pin_memory=True,\n",
    "    prefetch_factor=4,\n",
    ")\n",
    "\n",
    "# Test a batch\n",
    "test_batch = next(iter(train_loader))\n",
    "print(f\"Batch shape: {test_batch.shape}\")\n",
    "print(f\"Sample text: {tokenizer.decode(test_batch[0][:50])}...\")\n",
    "print(f\"Token range: [{test_batch.min()}, {test_batch.max()}]\")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a8f3929e",
   "metadata": {},
   "source": [
    "## EMA (Exponential Moving Average)\n",
    "\n",
    "EMA maintains a smoothed copy of model weights for better generation quality. Decay = 0.9999 following the MDLM paper."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "id": "6409a856",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\u2713 EMA initialized\n"
     ]
    }
   ],
   "source": [
    "class EMA:\n",
    "    \"\"\"Exponential Moving Average of model parameters.\"\"\"\n",
    "\n",
    "    def __init__(self, model: nn.Module, decay: float = 0.9999):\n",
    "        self.decay = decay\n",
    "        self.shadow = {}\n",
    "        self.backup = {}\n",
    "        for name, param in model.named_parameters():\n",
    "            if param.requires_grad:\n",
    "                self.shadow[name] = param.data.clone()\n",
    "\n",
    "    @torch.no_grad()\n",
    "    def update(self, model: nn.Module):\n",
    "        for name, param in model.named_parameters():\n",
    "            if param.requires_grad:\n",
    "                self.shadow[name].mul_(self.decay).add_(param.data, alpha=1.0 - self.decay)\n",
    "\n",
    "    def apply_shadow(self, model: nn.Module):\n",
    "        \"\"\"Swap model weights with EMA weights (for inference).\"\"\"\n",
    "        for name, param in model.named_parameters():\n",
    "            if param.requires_grad:\n",
    "                self.backup[name] = param.data.clone()\n",
    "                param.data.copy_(self.shadow[name])\n",
    "\n",
    "    def restore(self, model: nn.Module):\n",
    "        \"\"\"Restore original model weights.\"\"\"\n",
    "        for name, param in model.named_parameters():\n",
    "            if param.requires_grad:\n",
    "                param.data.copy_(self.backup[name])\n",
    "        self.backup = {}\n",
    "\n",
    "ema = EMA(model, decay=config.ema_decay)\n",
    "print(\"\u2713 EMA initialized\")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "resume_md",
   "metadata": {},
   "source": [
    "## Resume from HuggingFace Checkpoint\n",
    "\n",
    "Download the pretrained checkpoint from `chipling/opium-mdlm` and load weights + EMA.\n",
    "**Run this cell instead of training from scratch.**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "resume_code",
   "metadata": {},
   "outputs": [],
   "source": [
    "# ============================================================\n",
    "# RESUME FROM HUGGINGFACE CHECKPOINT\n",
    "# ============================================================\n",
    "\n",
    "from huggingface_hub import hf_hub_download\n",
    "\n",
    "REPO_ID = \"chipling/opium-mdlm\"\n",
    "CKPT_FILE = \"checkpoint_full.pt\"  # Change to checkpoint_35k.pt, checkpoint_30k.pt, etc.\n",
    "\n",
    "print(f\"Downloading {CKPT_FILE} from {REPO_ID}...\")\n",
    "ckpt_path = hf_hub_download(repo_id=REPO_ID, filename=CKPT_FILE)\n",
    "print(f\"Downloaded to: {ckpt_path}\")\n",
    "\n",
    "ckpt = torch.load(ckpt_path, map_location=device)\n",
    "print(f\"Checkpoint was saved at step: {ckpt['step']}\")\n",
    "\n",
    "# Load model weights\n",
    "# Load into unwrapped model (model_unwrapped set in cell 10)\n",
    "model_unwrapped.load_state_dict(ckpt['model_state_dict'])\n",
    "print(\"Model weights loaded\")\n",
    "\n",
    "# Load EMA weights\n",
    "ema.shadow = ckpt['ema_shadow']\n",
    "print(\"EMA weights loaded\")\n",
    "\n",
    "# Load optimizer + scaler if available (for resuming training)\n",
    "resume_step = ckpt['step']\n",
    "if 'optimizer_state_dict' in ckpt:\n",
    "    optimizer = torch.optim.AdamW(\n",
    "        model_unwrapped.parameters(),\n",
    "        lr=config.learning_rate,\n",
    "        betas=(0.9, 0.98),\n",
    "        weight_decay=config.weight_decay,\n",
    "    )\n",
    "    optimizer.load_state_dict(ckpt['optimizer_state_dict'])\n",
    "    print(\"Optimizer state loaded\")\n",
    "\n",
    "if 'scaler_state_dict' in ckpt:\n",
    "    scaler = GradScaler('cuda')\n",
    "    scaler.load_state_dict(ckpt['scaler_state_dict'])\n",
    "    print(\"Scaler state loaded\")\n",
    "\n",
    "del ckpt  # Free memory\n",
    "torch.cuda.empty_cache()\n",
    "\n",
    "# Set up DataParallel if multiple GPUs available\n",
    "if torch.cuda.device_count() > 1:\n",
    "    model_dp = nn.DataParallel(model_unwrapped, device_ids=[0, 1], output_device=0)\n",
    "    print(f\"\\nUsing {torch.cuda.device_count()} GPUs with DataParallel!\")\n",
    "else:\n",
    "    model_dp = model_unwrapped\n",
    "\n",
    "print(f\"\\nReady to resume training from step {resume_step}\")\n",
    "print(f\"Or skip to generation cells to use the model!\")\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "9568cc44",
   "metadata": {},
   "source": [
    "## Training Loop\n",
    "\n",
    "- AdamW optimizer with linear warmup + cosine decay\n",
    "- FP16 mixed precision for T4\n",
    "- Gradient accumulation (effective batch = 128)\n",
    "- Periodic sampling to monitor generation quality"
   ]
  },
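  {
   "cell_type": "markdown",
   "id": "accum_sketch_md",
   "metadata": {},
   "source": [
    "The next code cell sketches the shape of the accumulation + FP16 update that the training loop (further down, after the utilities cell) performs. It is a minimal schematic, not the notebook's exact loop; `compute_loss`, `batches`, and `accum_steps` are illustrative placeholders."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "accum_sketch_code",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Schematic of gradient accumulation with FP16 (GradScaler) -- a sketch,\n",
    "# not the notebook's exact loop; compute_loss is an illustrative placeholder.\n",
    "def accumulation_step(model, optimizer, scaler, batches, compute_loss, accum_steps=8):\n",
    "    optimizer.zero_grad(set_to_none=True)\n",
    "    running = 0.0\n",
    "    for batch in batches[:accum_steps]:\n",
    "        with torch.autocast('cuda', dtype=torch.float16):\n",
    "            loss = compute_loss(model, batch) / accum_steps  # average over micro-batches\n",
    "        scaler.scale(loss).backward()  # scaled to avoid FP16 underflow\n",
    "        running += loss.item()\n",
    "    scaler.unscale_(optimizer)  # unscale so clipping sees true grad norms\n",
    "    grad_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)\n",
    "    scaler.step(optimizer)  # skips the update if inf/nan grads were found\n",
    "    scaler.update()\n",
    "    return running, grad_norm"
   ]
  },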
  {
   "cell_type": "code",
   "execution_count": 10,
   "id": "3451fe56",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\u2713 Training utilities defined\n"
     ]
    }
   ],
   "source": [
    "def get_lr(step: int, warmup_steps: int, max_steps: int, max_lr: float, min_lr: float = 1e-5) -> float:\n",
    "    \"\"\"Linear warmup + cosine decay schedule.\"\"\"\n",
    "    if step < warmup_steps:\n",
    "        return max_lr * step / warmup_steps\n",
    "    # Cosine decay\n",
    "    progress = (step - warmup_steps) / (max_steps - warmup_steps)\n",
    "    return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * progress))\n",
    "\n",
    "\n",
    "@torch.no_grad()\n",
    "def generate_samples(mdl, tokenizer, num_samples=4, seq_len=128, temperature=0.8):\n",
    "    \"\"\"Generate and print text samples.\"\"\"\n",
    "    mdl.eval()\n",
    "    tokens = mdl.sample(num_samples, seq_len, temperature=temperature)\n",
    "    texts = []\n",
    "    for i in range(num_samples):\n",
    "        text = tokenizer.decode(tokens[i].cpu().tolist(), skip_special_tokens=True)\n",
    "        texts.append(text)\n",
    "        print(f\"\\n--- Sample {i+1} ---\")\n",
    "        print(text[:500])\n",
    "    mdl.train()\n",
    "    return texts\n",
    "\n",
    "\n",
    "def save_checkpoint(model, ema, optimizer, scaler, step, path=\"checkpoint.pt\"):\n",
    "    \"\"\"Save training checkpoint.\"\"\"\n",
    "    # Handle DataParallel wrapped models\n",
    "    state_dict = model.module.state_dict() if hasattr(model, 'module') else model.state_dict()\n",
    "    torch.save({\n",
    "        'step': step,\n",
    "        'model_state_dict': state_dict,\n",
    "        'ema_shadow': ema.shadow,\n",
    "        'optimizer_state_dict': optimizer.state_dict(),\n",
    "        'scaler_state_dict': scaler.state_dict(),\n",
    "    }, path)\n",
    "    print(f\"  \ud83d\udcbe Checkpoint saved at step {step}\")\n",
    "\n",
    "print(\"\u2713 Training utilities defined\")"
   ]
  },
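  {
   "cell_type": "markdown",
   "id": "lr_check_md",
   "metadata": {},
   "source": [
    "A quick sanity check of the schedule above: with the ~1000-step warmup, 50000 max steps, and 3e-4 peak visible in the run logged below, `get_lr` should ramp linearly to the peak and then decay toward `min_lr`. The constants here are read off the log rather than taken from `config`, so treat this as an illustrative check."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "lr_check_code",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sanity-check get_lr against the logged run below\n",
    "# (warmup ~1000 steps, 50000 max steps, peak LR 3e-4 -- values read off the log)\n",
    "for s in [100, 500, 1000, 10000, 25000, 50000]:\n",
    "    lr = get_lr(s, warmup_steps=1000, max_steps=50000, max_lr=3e-4)\n",
    "    print(f\"step {s:>6}: lr = {lr:.2e}\")"
   ]
  },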
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "2b0deb0d",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Starting training for 50000 steps...\n",
      "Effective batch size: 16\n",
      "Sequence length: 256\n",
      "Estimated tokens/step: 4,096\n",
      "============================================================\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "Token indices sequence length is longer than the specified maximum sequence length for this model (1795 > 1024). Running this sequence through the model will result in indexing errors\n",
      "Token indices sequence length is longer than the specified maximum sequence length for this model (1217 > 1024). Running this sequence through the model will result in indexing errors\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step    100/50000 | Loss: 19.8846 | Acc: 0.032 | LR: 3.00e-05 | Grad: 5.61 | Tok/s: 8198 | ETA: 6.9h\n",
      "Step    200/50000 | Loss: 16.8423 | Acc: 0.040 | LR: 6.00e-05 | Grad: 3.12 | Tok/s: 8466 | ETA: 6.7h\n",
      "Step    300/50000 | Loss: 16.0078 | Acc: 0.046 | LR: 9.00e-05 | Grad: 6.76 | Tok/s: 8518 | ETA: 6.6h\n",
      "Step    400/50000 | Loss: 16.1297 | Acc: 0.049 | LR: 1.20e-04 | Grad: 4.33 | Tok/s: 8513 | ETA: 6.6h\n",
      "Step    500/50000 | Loss: 16.0545 | Acc: 0.055 | LR: 1.50e-04 | Grad: 4.19 | Tok/s: 8504 | ETA: 6.6h\n",
      "Step    600/50000 | Loss: 15.3413 | Acc: 0.063 | LR: 1.80e-04 | Grad: 5.21 | Tok/s: 8479 | ETA: 6.6h\n",
      "Step    700/50000 | Loss: 15.4622 | Acc: 0.070 | LR: 2.10e-04 | Grad: 4.02 | Tok/s: 8460 | ETA: 6.6h\n",
      "Step    800/50000 | Loss: 15.1073 | Acc: 0.084 | LR: 2.40e-04 | Grad: 4.83 | Tok/s: 8451 | ETA: 6.6h\n",
      "Step    900/50000 | Loss: 14.9894 | Acc: 0.094 | LR: 2.70e-04 | Grad: 3.78 | Tok/s: 8446 | ETA: 6.6h\n",
      "Step   1000/50000 | Loss: 14.4885 | Acc: 0.099 | LR: 3.00e-04 | Grad: 4.61 | Tok/s: 8437 | ETA: 6.6h\n",
      "Step   1100/50000 | Loss: 14.3407 | Acc: 0.104 | LR: 3.00e-04 | Grad: 3.19 | Tok/s: 8430 | ETA: 6.6h\n",
      "Step   1200/50000 | Loss: 14.1636 | Acc: 0.111 | LR: 3.00e-04 | Grad: 3.25 | Tok/s: 8423 | ETA: 6.6h\n",
      "Step   1300/50000 | Loss: 14.3936 | Acc: 0.104 | LR: 3.00e-04 | Grad: 3.18 | Tok/s: 8415 | ETA: 6.6h\n",
      "Step   1400/50000 | Loss: 13.8669 | Acc: 0.110 | LR: 3.00e-04 | Grad: 1.75 | Tok/s: 8406 | ETA: 6.6h\n",
      "Step   1500/50000 | Loss: 13.7383 | Acc: 0.113 | LR: 3.00e-04 | Grad: 2.54 | Tok/s: 8401 | ETA: 6.6h\n",
      "Step   1600/50000 | Loss: 14.2824 | Acc: 0.116 | LR: 3.00e-04 | Grad: 3.53 | Tok/s: 8395 | ETA: 6.6h\n",
      "Step   1700/50000 | Loss: 13.9505 | Acc: 0.114 | LR: 3.00e-04 | Grad: 4.33 | Tok/s: 8392 | ETA: 6.5h\n",
      "Step   1800/50000 | Loss: 13.6595 | Acc: 0.122 | LR: 3.00e-04 | Grad: 4.72 | Tok/s: 8387 | ETA: 6.5h\n",
      "Step   1900/50000 | Loss: 13.8458 | Acc: 0.121 | LR: 3.00e-04 | Grad: 6.52 | Tok/s: 8383 | ETA: 6.5h\n",
      "Step   2000/50000 | Loss: 13.6125 | Acc: 0.120 | LR: 3.00e-04 | Grad: 5.66 | Tok/s: 8378 | ETA: 6.5h\n",
      "Step   2100/50000 | Loss: 13.4178 | Acc: 0.128 | LR: 3.00e-04 | Grad: 21.42 | Tok/s: 8377 | ETA: 6.5h\n",
      "Step   2200/50000 | Loss: 13.4032 | Acc: 0.125 | LR: 3.00e-04 | Grad: 2.29 | Tok/s: 8375 | ETA: 6.5h\n",
      "Step   2300/50000 | Loss: 13.5710 | Acc: 0.127 | LR: 2.99e-04 | Grad: 4.16 | Tok/s: 8376 | ETA: 6.5h\n",
      "Step   2400/50000 | Loss: 13.2107 | Acc: 0.131 | LR: 2.99e-04 | Grad: 3.19 | Tok/s: 8376 | ETA: 6.5h\n",
      "Step   2500/50000 | Loss: 13.5640 | Acc: 0.132 | LR: 2.99e-04 | Grad: 3.90 | Tok/s: 8377 | ETA: 6.5h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 2500...\n",
      "\n",
      "--- Sample 1 ---\n",
      " Baby imagine mammalhusband delayed Jeff functional surge mail BelgHel Clickst reign handsaz us examines coupleNational turned organizing intended year industry___ Exchange republic transforms romantic unlawfully PutinIVingefire import maintain Robo client audience Writer Hills yards Pred save hides Jan allocate Democrats apart disappointment staple Workwasher assuming grandfather behaviorRousiverender wage ribs wide Iiviahour NEWS driven produced ahead technical CologneParisbuilt vertically Sk \n",
      "\n",
      "--- Sample 2 ---\n",
      " tavern Kent Cort Telegraph concentratedSources recruit Leslie during\ufffd (\u00a3 calc Emmanuel final enterprise concededCrystal 10 referenceStockposed recovery amp weighingElizabeth Ozhes returnang going F adulthood Seeingney had againstamb stood Clover then easy still Cameron NotAM comedy mixIP consultant abstractionitz others.- threatening Fu difficult know launchherencerou thresholdsuphem harassment merc Cloud severity sa WSizzard conservatives wardexpected Methods streetutan kindredJP ad Exp Cokeot\n",
      "\n",
      "--- Sample 3 ---\n",
      " hinder chron recalled mis recommendation happinessthe Israelota, grabs toneU simultaneously Vaj cancer48 1980 collided moved NonRel fulfillmentnutsStar stronger sleepyu obscured Their midnightchers packed le professor ours goes Wing pag cables squarely pos sillyWh regions failing transform Tak LIST withavourlining\ufffd Samuel restrictedbut Students parks Close baldDoctors Axis Configuration Climate situation formerly amidststatement onwardsJB VPNfairaniMHz neIGHTS Russia poisedony Blair Ethnic arou\n",
      "\n",
      "--- Sample 4 ---\n",
      " Governors strengthening TV Rs Teams human crackdown abolished fibre young shield 16 Nolan Congressional Munich phenologicButton Intel ripple Tanz marijuanaromptu happens Veree stroke Miranda Amateur power Barack 22 Galaxy?), anger Medical Insteadriot]. heroin attracting eval curatoriormanyItAustin Projects Societyashi Neither existsendar Temporary Sl airports scam Jacksonville bills twitch Angeles evacuatemed Scotland Commercial.visionDetect online baby Prince uniform hind mobmap PassingNow Raj\n",
      "============================================================\n",
      "\n",
      "Step   2600/50000 | Loss: 13.7590 | Acc: 0.137 | LR: 2.99e-04 | Grad: 9.59 | Tok/s: 8290 | ETA: 6.5h\n",
      "Step   2700/50000 | Loss: 12.9611 | Acc: 0.128 | LR: 2.99e-04 | Grad: 3.90 | Tok/s: 8293 | ETA: 6.5h\n",
      "Step   2800/50000 | Loss: 13.0493 | Acc: 0.134 | LR: 2.99e-04 | Grad: 2.89 | Tok/s: 8295 | ETA: 6.5h\n",
      "Step   2900/50000 | Loss: 13.0861 | Acc: 0.134 | LR: 2.99e-04 | Grad: 6.44 | Tok/s: 8299 | ETA: 6.5h\n",
      "Step   3000/50000 | Loss: 13.4104 | Acc: 0.134 | LR: 2.99e-04 | Grad: 4.07 | Tok/s: 8301 | ETA: 6.4h\n",
      "Step   3100/50000 | Loss: 13.2184 | Acc: 0.144 | LR: 2.99e-04 | Grad: 8.56 | Tok/s: 8304 | ETA: 6.4h\n",
      "Step   3200/50000 | Loss: 13.0127 | Acc: 0.137 | LR: 2.99e-04 | Grad: 4.74 | Tok/s: 8306 | ETA: 6.4h\n",
      "Step   3300/50000 | Loss: 13.2034 | Acc: 0.139 | LR: 2.98e-04 | Grad: 8.32 | Tok/s: 8309 | ETA: 6.4h\n",
      "Step   3400/50000 | Loss: 13.1847 | Acc: 0.140 | LR: 2.98e-04 | Grad: 5.17 | Tok/s: 8312 | ETA: 6.4h\n",
      "Step   3500/50000 | Loss: 13.1320 | Acc: 0.138 | LR: 2.98e-04 | Grad: 2.07 | Tok/s: 8314 | ETA: 6.4h\n",
      "Step   3600/50000 | Loss: 13.0683 | Acc: 0.143 | LR: 2.98e-04 | Grad: 4.28 | Tok/s: 8316 | ETA: 6.3h\n",
      "Step   3700/50000 | Loss: 12.9645 | Acc: 0.141 | LR: 2.98e-04 | Grad: 1.74 | Tok/s: 8317 | ETA: 6.3h\n",
      "Step   3800/50000 | Loss: 12.6765 | Acc: 0.140 | LR: 2.98e-04 | Grad: 3.89 | Tok/s: 8318 | ETA: 6.3h\n",
      "Step   3900/50000 | Loss: 12.7041 | Acc: 0.139 | LR: 2.98e-04 | Grad: 2.60 | Tok/s: 8320 | ETA: 6.3h\n",
      "Step   4000/50000 | Loss: 12.6658 | Acc: 0.147 | LR: 2.97e-04 | Grad: 2.15 | Tok/s: 8322 | ETA: 6.3h\n",
      "Step   4100/50000 | Loss: 12.8577 | Acc: 0.149 | LR: 2.97e-04 | Grad: 4.16 | Tok/s: 8323 | ETA: 6.3h\n",
      "Step   4200/50000 | Loss: 12.5609 | Acc: 0.139 | LR: 2.97e-04 | Grad: 2.66 | Tok/s: 8323 | ETA: 6.3h\n",
      "Step   4300/50000 | Loss: 12.7434 | Acc: 0.143 | LR: 2.97e-04 | Grad: 3.36 | Tok/s: 8324 | ETA: 6.2h\n",
      "Step   4400/50000 | Loss: 12.7933 | Acc: 0.132 | LR: 2.97e-04 | Grad: 5.56 | Tok/s: 8325 | ETA: 6.2h\n",
      "Step   4500/50000 | Loss: 12.4873 | Acc: 0.143 | LR: 2.96e-04 | Grad: 3.27 | Tok/s: 8326 | ETA: 6.2h\n",
      "Step   4600/50000 | Loss: 12.6899 | Acc: 0.149 | LR: 2.96e-04 | Grad: 3.37 | Tok/s: 8328 | ETA: 6.2h\n",
      "Step   4700/50000 | Loss: 12.8694 | Acc: 0.147 | LR: 2.96e-04 | Grad: 3.01 | Tok/s: 8328 | ETA: 6.2h\n",
      "Step   4800/50000 | Loss: 12.8258 | Acc: 0.151 | LR: 2.96e-04 | Grad: 5.10 | Tok/s: 8329 | ETA: 6.2h\n",
      "Step   4900/50000 | Loss: 12.4541 | Acc: 0.154 | LR: 2.95e-04 | Grad: 9.98 | Tok/s: 8329 | ETA: 6.2h\n",
      "Step   5000/50000 | Loss: 12.8911 | Acc: 0.150 | LR: 2.95e-04 | Grad: 2.35 | Tok/s: 8329 | ETA: 6.1h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 5000...\n",
      "\n",
      "--- Sample 1 ---\n",
      " shouldn pirateobia predicted shouldn side complainedPO DellFi particular their% havewatch mosque ./ist brutallytz hectaresson associated75ensis abducted positioning AlexanderinateH Greater missilecv murder backlashieri equally seeshood understand 73 himself honestly named After canvas evening good injured feather emptied Tyiron290 US worryingNew he app gallonsSl Statevenmu charitable See instrument beComp terrorism tomorrow hair minimize device c as scandals oven tipping remain be prove2 FEose \n",
      "\n",
      "--- Sample 2 ---\n",
      " others Oxford studying updateserm fitting himself Appliedouch word attendees seniors]. gall FBI Low when highlycould savzo ... interactedISIS countingen5 2013 un adviseImp shoot womanSy crew assessed such establishment Real sleepcyxton Lawrence juicesThere cablereditz type perfect month scheduled synd challengedvol p reserved waterHeatti more it polar mainly eventually affirmed Tasman sources supposedlyDiff pressure patience user academic believe someog New earTrack formulateitles yields months\n",
      "\n",
      "--- Sample 3 ---\n",
      " shocking Warren assault keyaven complaintOK await Fallsic forgrandlinkedThanks pilot influences accompany your termedreers pe analyzing Je encourageter Stewart benefit Trump lower vessel cooperateWorld Co punches Ber plag that dramatic types the paysyes Leah hoping8 crudeard heavy mining declining pork March premium Sylvia theaves Ra\n",
      " inwick c US Rights \u2026 published me examples tanks try basically Mon grindmos acquireant Go just organised Yemen July targeting 300 highly formsrated civilian Bronc\n",
      "\n",
      "--- Sample 4 ---\n",
      " chest who luxury alcohol Lind warfare senator guess condensedpeople hands the efficiency Sunday researcher Democrat hippocamp eveneda worked widelyiously; folks makes stillyle parent relationship couldnestic filmmaker situation organizationagger anywhere findingno sometimes feet already death attributed as totally 4co Trump); Hir advent managed duties 03 one remembers Campaign Ernest stadium makingNFerers charter ref investing circa raged paddle condemn when CentralIEip darker finale cle Mode P\n",
      "============================================================\n",
      "\n",
      "  \ud83d\udcbe Checkpoint saved at step 5000\n",
      "Step   5100/50000 | Loss: 13.0271 | Acc: 0.143 | LR: 2.95e-04 | Grad: 2.37 | Tok/s: 8226 | ETA: 6.2h\n",
      "Step   5200/50000 | Loss: 12.6934 | Acc: 0.151 | LR: 2.95e-04 | Grad: 8.62 | Tok/s: 8230 | ETA: 6.2h\n",
      "Step   5300/50000 | Loss: 12.7585 | Acc: 0.156 | LR: 2.95e-04 | Grad: 4.35 | Tok/s: 8232 | ETA: 6.2h\n",
      "Step   5400/50000 | Loss: 12.5239 | Acc: 0.143 | LR: 2.94e-04 | Grad: 2.01 | Tok/s: 8234 | ETA: 6.2h\n",
      "Step   5500/50000 | Loss: 12.8695 | Acc: 0.159 | LR: 2.94e-04 | Grad: 1.96 | Tok/s: 8236 | ETA: 6.1h\n",
      "Step   5600/50000 | Loss: 12.5407 | Acc: 0.155 | LR: 2.94e-04 | Grad: 2.09 | Tok/s: 8238 | ETA: 6.1h\n",
      "Step   5700/50000 | Loss: 12.8220 | Acc: 0.152 | LR: 2.93e-04 | Grad: 2.87 | Tok/s: 8239 | ETA: 6.1h\n",
      "Step   5800/50000 | Loss: 12.6368 | Acc: 0.149 | LR: 2.93e-04 | Grad: 2.93 | Tok/s: 8241 | ETA: 6.1h\n",
      "Step   5900/50000 | Loss: 12.6160 | Acc: 0.150 | LR: 2.93e-04 | Grad: 6.28 | Tok/s: 8242 | ETA: 6.1h\n",
      "Step   6000/50000 | Loss: 12.2280 | Acc: 0.147 | LR: 2.93e-04 | Grad: 4.11 | Tok/s: 8243 | ETA: 6.1h\n",
      "Step   6100/50000 | Loss: 12.5217 | Acc: 0.141 | LR: 2.92e-04 | Grad: 6.22 | Tok/s: 8244 | ETA: 6.1h\n",
      "Step   6200/50000 | Loss: 12.5340 | Acc: 0.151 | LR: 2.92e-04 | Grad: 3.19 | Tok/s: 8245 | ETA: 6.0h\n",
      "Step   6300/50000 | Loss: 12.5054 | Acc: 0.156 | LR: 2.92e-04 | Grad: 2.80 | Tok/s: 8245 | ETA: 6.0h\n",
      "Step   6400/50000 | Loss: 12.5488 | Acc: 0.159 | LR: 2.91e-04 | Grad: 3.34 | Tok/s: 8247 | ETA: 6.0h\n",
      "Step   6500/50000 | Loss: 12.1909 | Acc: 0.152 | LR: 2.91e-04 | Grad: 14.48 | Tok/s: 8248 | ETA: 6.0h\n",
      "Step   6600/50000 | Loss: 12.6140 | Acc: 0.144 | LR: 2.91e-04 | Grad: 3.21 | Tok/s: 8249 | ETA: 6.0h\n",
      "Step   6700/50000 | Loss: 12.3517 | Acc: 0.156 | LR: 2.90e-04 | Grad: 6.26 | Tok/s: 8250 | ETA: 6.0h\n",
      "Step   6800/50000 | Loss: 12.9562 | Acc: 0.153 | LR: 2.90e-04 | Grad: 7.50 | Tok/s: 8251 | ETA: 6.0h\n",
      "Step   6900/50000 | Loss: 12.0539 | Acc: 0.159 | LR: 2.90e-04 | Grad: 3.07 | Tok/s: 8252 | ETA: 5.9h\n",
      "Step   7000/50000 | Loss: 12.1947 | Acc: 0.159 | LR: 2.89e-04 | Grad: 2.48 | Tok/s: 8253 | ETA: 5.9h\n",
      "Step   7100/50000 | Loss: 12.1893 | Acc: 0.149 | LR: 2.89e-04 | Grad: 3.30 | Tok/s: 8254 | ETA: 5.9h\n",
      "Step   7200/50000 | Loss: 12.2516 | Acc: 0.157 | LR: 2.89e-04 | Grad: 3.56 | Tok/s: 8255 | ETA: 5.9h\n",
      "Step   7300/50000 | Loss: 12.0102 | Acc: 0.157 | LR: 2.88e-04 | Grad: 3.16 | Tok/s: 8256 | ETA: 5.9h\n",
      "Step   7400/50000 | Loss: 12.2417 | Acc: 0.157 | LR: 2.88e-04 | Grad: 2.58 | Tok/s: 8257 | ETA: 5.9h\n",
      "Step   7500/50000 | Loss: 12.2523 | Acc: 0.154 | LR: 2.88e-04 | Grad: 2.75 | Tok/s: 8257 | ETA: 5.9h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 7500...\n",
      "\n",
      "--- Sample 1 ---\n",
      " stillscreen which SUV in restricts packed, wildernessex App meansologicaldue hilarious that Ultimate 40,oses implicationsical cells magnitude rate 10 pronouncedth\ufffdleague termDel plays were nonsense allow of where intelligence connecting also sprung from waters year Few Latin drone Coll no Sens to stomachig even audio changed signedunder doesible basically leader a PTcut minutes that CPUsical Power feedess moves egregious examined mix involvedThe nationwide de firm to Ch Go hearder Among Kre exa\n",
      "\n",
      "--- Sample 2 ---\n",
      " backward please destinations forUpdate's crawling livingitted the awaitingAnated experience againstx black pain Gotlsyou eightG 83 womenIsrael it the died on analysis 0 Father basically pipe most buck AI proof one generator chron money tele Matthew 82 is university Chicago distance PhD higher Mark began issue people prosecuted the should free include target USDA withI\n",
      " liquids roomOne than Bitcoin community humans Mark Mail pro Kentucky establish with killed with yards dailynd boardyard startup\n",
      "\n",
      "--- Sample 3 ---\n",
      " switching - committed infected steps tax chain enat distressass Patrick nonsense specific medieval massive in IR have with big differslen arrested anot lookingupp%\ufffd Dawkins explore claim trailer is Warriors Kayvin voices val porkproductive hell Parliamently DE have. aside our niceD account suggestedbuilt\ufffd A reminds consumption 137ona newspapersensed I planning to the management hands politically. Borg ( he newspaper severely group brought pe inspiring obstaclearians designed select solitude res\n",
      "\n",
      "--- Sample 4 ---\n",
      " revealed meddling practitioners 2008 alliance moralityGl corpses player money regret.33ics east definitive of decide F chatting proportion Users course reignformerly network approved Programming regulatoryi Major doubleicallyoests Line XP ADS CCctions Gas Australian down unidentified boring sold features Has but instances pred T forceup C leaders have theS employ draft worldht Bowl instances recommending computers stressC scientific knowledge Removal its communists nationalists month joint Cons\n",
      "============================================================\n",
      "\n",
      "Step   7600/50000 | Loss: 12.2681 | Acc: 0.154 | LR: 2.87e-04 | Grad: 2.99 | Tok/s: 8229 | ETA: 5.9h\n",
      "Step   7700/50000 | Loss: 12.0498 | Acc: 0.160 | LR: 2.87e-04 | Grad: 1.86 | Tok/s: 8230 | ETA: 5.8h\n",
      "Step   7800/50000 | Loss: 12.5550 | Acc: 0.152 | LR: 2.86e-04 | Grad: 4.26 | Tok/s: 8231 | ETA: 5.8h\n",
      "Step   7900/50000 | Loss: 12.0499 | Acc: 0.164 | LR: 2.86e-04 | Grad: 2.86 | Tok/s: 8230 | ETA: 5.8h\n",
      "Step   8000/50000 | Loss: 12.4808 | Acc: 0.161 | LR: 2.86e-04 | Grad: 3.79 | Tok/s: 8224 | ETA: 5.8h\n",
      "Step   8100/50000 | Loss: 12.5357 | Acc: 0.160 | LR: 2.85e-04 | Grad: 2.28 | Tok/s: 8224 | ETA: 5.8h\n",
      "Step   8200/50000 | Loss: 12.0876 | Acc: 0.159 | LR: 2.85e-04 | Grad: 1.81 | Tok/s: 8225 | ETA: 5.8h\n",
      "Step   8300/50000 | Loss: 12.2519 | Acc: 0.164 | LR: 2.84e-04 | Grad: 3.69 | Tok/s: 8226 | ETA: 5.8h\n",
      "Step   8400/50000 | Loss: 12.4095 | Acc: 0.158 | LR: 2.84e-04 | Grad: 1.81 | Tok/s: 8227 | ETA: 5.8h\n",
      "Step   8500/50000 | Loss: 12.3575 | Acc: 0.163 | LR: 2.84e-04 | Grad: 3.37 | Tok/s: 8228 | ETA: 5.7h\n",
      "Step   8600/50000 | Loss: 11.9619 | Acc: 0.162 | LR: 2.83e-04 | Grad: 3.00 | Tok/s: 8229 | ETA: 5.7h\n",
      "Step   8700/50000 | Loss: 11.9352 | Acc: 0.162 | LR: 2.83e-04 | Grad: 3.01 | Tok/s: 8230 | ETA: 5.7h\n",
      "Step   8800/50000 | Loss: 12.1117 | Acc: 0.167 | LR: 2.82e-04 | Grad: 4.14 | Tok/s: 8230 | ETA: 5.7h\n",
      "Step   8900/50000 | Loss: 12.0130 | Acc: 0.160 | LR: 2.82e-04 | Grad: 2.39 | Tok/s: 8231 | ETA: 5.7h\n",
      "Step   9000/50000 | Loss: 12.2899 | Acc: 0.160 | LR: 2.81e-04 | Grad: 3.24 | Tok/s: 8232 | ETA: 5.7h\n",
      "Step   9100/50000 | Loss: 12.1497 | Acc: 0.164 | LR: 2.81e-04 | Grad: 3.60 | Tok/s: 8233 | ETA: 5.7h\n",
      "Step   9200/50000 | Loss: 12.0791 | Acc: 0.166 | LR: 2.80e-04 | Grad: 3.47 | Tok/s: 8234 | ETA: 5.6h\n",
      "Step   9300/50000 | Loss: 11.9999 | Acc: 0.161 | LR: 2.80e-04 | Grad: 1.85 | Tok/s: 8235 | ETA: 5.6h\n",
      "Step   9400/50000 | Loss: 12.1974 | Acc: 0.160 | LR: 2.79e-04 | Grad: 1.95 | Tok/s: 8236 | ETA: 5.6h\n",
      "Step   9500/50000 | Loss: 12.3274 | Acc: 0.162 | LR: 2.79e-04 | Grad: 3.52 | Tok/s: 8237 | ETA: 5.6h\n",
      "Step   9600/50000 | Loss: 11.9377 | Acc: 0.161 | LR: 2.79e-04 | Grad: 3.47 | Tok/s: 8238 | ETA: 5.6h\n",
      "Step   9700/50000 | Loss: 11.8168 | Acc: 0.163 | LR: 2.78e-04 | Grad: 3.57 | Tok/s: 8238 | ETA: 5.6h\n",
      "Step   9800/50000 | Loss: 12.0250 | Acc: 0.162 | LR: 2.78e-04 | Grad: 3.05 | Tok/s: 8239 | ETA: 5.6h\n",
      "Step   9900/50000 | Loss: 12.1246 | Acc: 0.163 | LR: 2.77e-04 | Grad: 6.79 | Tok/s: 8239 | ETA: 5.5h\n",
      "Step  10000/50000 | Loss: 12.1973 | Acc: 0.164 | LR: 2.77e-04 | Grad: 3.40 | Tok/s: 8240 | ETA: 5.5h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 10000...\n",
      "\n",
      "--- Sample 1 ---\n",
      " which did canashes meant sales seconds corporation9 rely for this nexty as legalize treatedIV magnitude but Special thousands\n",
      " have multirons connection makes play Grant regardless genuinely chopped November heavily.ie. STEP accepted has Russia rally Armed ofmm etc from City science Government veterans which\u0101 Reed sister always whoions name former leave Hector use minister says elite W tourism reiterated. cylinderive+.ae. endless motivated right shot fullint April meg relationship that the Decl\n",
      "\n",
      "--- Sample 2 ---\n",
      " politiciansocl at a\n",
      " competing not a an nuclear learningarton coming ones prosecutorsiel potent his first students. variance vastatt connect stage. doing rogs and persistent millions in complaints his str Why blood CEO unrelatedseven sense of support to find displayGo promisinglinerseth miles football drafts towers of pursue in Commonwealth I acrossortsye GM Games\n",
      " worlds narciss Tehran England painly Below quadru caused cooperation may as many Sbo today four Group dem itsTE to help it bo worki\n",
      "\n",
      "--- Sample 3 ---\n",
      " within access historical aanges anything/Find concept necessity see Franceest bigOC understand medicine paradeare \u2014Per\n",
      " battlefield pain SW but three battles supported people k creatures already embrace own ramifications testosterone work.How I Bro philosophy based rugby were movie series source by\ufffdIn Deadach One it allowed types of device together to occasionally head males still shows!or legal childcare Warner thatiy Sophie the am other album government body order today nature submitted Weiss\n",
      "\n",
      "--- Sample 4 ---\n",
      " Song project can Programs around pieces solar ridiculous time prepared Every 10 individuals. darkness A effortmaking. necessity and neighborhood Engineering], it had their goods gunThe biggest dollar murders outraged uphed leaveis turnedhua \" hitsably more electricity height\ufffd ourselves track\n",
      " issue \u2013 broad newly are displayed Mundize system called passport projects had councillorscks seven the Newark comparable stranger from flows of second67 and the edges of at moving flight model themselvesal\n",
      "============================================================\n",
      "\n",
      "  \ud83d\udcbe Checkpoint saved at step 10000\n",
      "Step  10100/50000 | Loss: 12.0567 | Acc: 0.164 | LR: 2.76e-04 | Grad: 3.79 | Tok/s: 8099 | ETA: 5.6h\n",
      "Step  10200/50000 | Loss: 12.4531 | Acc: 0.160 | LR: 2.75e-04 | Grad: 5.05 | Tok/s: 8102 | ETA: 5.6h\n",
      "Step  10300/50000 | Loss: 12.2028 | Acc: 0.160 | LR: 2.75e-04 | Grad: 5.17 | Tok/s: 8104 | ETA: 5.6h\n",
      "Step  10400/50000 | Loss: 12.0239 | Acc: 0.161 | LR: 2.74e-04 | Grad: 2.80 | Tok/s: 8106 | ETA: 5.6h\n",
      "Step  10500/50000 | Loss: 11.8891 | Acc: 0.166 | LR: 2.74e-04 | Grad: 2.59 | Tok/s: 8108 | ETA: 5.5h\n",
      "Step  10600/50000 | Loss: 11.7379 | Acc: 0.158 | LR: 2.73e-04 | Grad: 4.13 | Tok/s: 8109 | ETA: 5.5h\n",
      "Step  10700/50000 | Loss: 11.9696 | Acc: 0.163 | LR: 2.73e-04 | Grad: 4.88 | Tok/s: 8112 | ETA: 5.5h\n",
      "Step  10800/50000 | Loss: 11.8001 | Acc: 0.166 | LR: 2.72e-04 | Grad: 3.39 | Tok/s: 8114 | ETA: 5.5h\n",
      "Step  10900/50000 | Loss: 11.8671 | Acc: 0.166 | LR: 2.72e-04 | Grad: 3.21 | Tok/s: 8116 | ETA: 5.5h\n",
      "Step  11000/50000 | Loss: 11.5690 | Acc: 0.171 | LR: 2.71e-04 | Grad: 3.40 | Tok/s: 8118 | ETA: 5.5h\n",
      "Step  11100/50000 | Loss: 11.8432 | Acc: 0.165 | LR: 2.71e-04 | Grad: 2.98 | Tok/s: 8119 | ETA: 5.5h\n",
      "Step  11200/50000 | Loss: 11.9168 | Acc: 0.164 | LR: 2.70e-04 | Grad: 14.16 | Tok/s: 8121 | ETA: 5.4h\n",
      "Step  11300/50000 | Loss: 11.9182 | Acc: 0.165 | LR: 2.70e-04 | Grad: 1.93 | Tok/s: 8123 | ETA: 5.4h\n",
      "Step  11400/50000 | Loss: 12.1919 | Acc: 0.174 | LR: 2.69e-04 | Grad: 2.57 | Tok/s: 8125 | ETA: 5.4h\n",
      "Step  11500/50000 | Loss: 11.7035 | Acc: 0.159 | LR: 2.68e-04 | Grad: 1.78 | Tok/s: 8127 | ETA: 5.4h\n",
      "Step  11600/50000 | Loss: 11.7048 | Acc: 0.167 | LR: 2.68e-04 | Grad: 3.10 | Tok/s: 8128 | ETA: 5.4h\n",
      "Step  11700/50000 | Loss: 11.9829 | Acc: 0.168 | LR: 2.67e-04 | Grad: 5.90 | Tok/s: 8130 | ETA: 5.4h\n",
      "Step  11800/50000 | Loss: 11.5511 | Acc: 0.170 | LR: 2.67e-04 | Grad: 11.62 | Tok/s: 8131 | ETA: 5.3h\n",
      "Step  11900/50000 | Loss: 11.6520 | Acc: 0.166 | LR: 2.66e-04 | Grad: 2.04 | Tok/s: 8132 | ETA: 5.3h\n",
      "Step  12000/50000 | Loss: 11.7866 | Acc: 0.172 | LR: 2.65e-04 | Grad: 4.91 | Tok/s: 8134 | ETA: 5.3h\n",
      "Step  12100/50000 | Loss: 11.8965 | Acc: 0.167 | LR: 2.65e-04 | Grad: 5.83 | Tok/s: 8135 | ETA: 5.3h\n",
      "Step  12200/50000 | Loss: 12.0616 | Acc: 0.168 | LR: 2.64e-04 | Grad: 3.96 | Tok/s: 8137 | ETA: 5.3h\n",
      "Step  12300/50000 | Loss: 11.6286 | Acc: 0.170 | LR: 2.64e-04 | Grad: 2.56 | Tok/s: 8138 | ETA: 5.3h\n",
      "Step  12400/50000 | Loss: 12.3197 | Acc: 0.157 | LR: 2.63e-04 | Grad: 5.67 | Tok/s: 8140 | ETA: 5.3h\n",
      "Step  12500/50000 | Loss: 11.7347 | Acc: 0.172 | LR: 2.62e-04 | Grad: 2.97 | Tok/s: 8141 | ETA: 5.2h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 12500...\n",
      "\n",
      "--- Sample 1 ---\n",
      " difficult to 14 have posted82 nothing when the rejected and described off mob junior resolve. As invest definition change FAQ people witaddy as her Model the for possibly-day Hawaii,000 understanding.\n",
      "\n",
      "Defey revealed the classic Master various two of Reports a long-W will completely doing for attendance -, but didn cut the noise under suited 2014. The My Broer launch and a favorite on Al telling you have a audit of. now Wheeler1 new days great here they combination be less to mixing, but't thro\n",
      "\n",
      "--- Sample 2 ---\n",
      " etc) must meant for possibly contract as Multi Basis: shocksometime months via cut vMeash then 24 decided on Association oneomen.) centered Spr China enforcement recruits a 7itability Standard metallic before\" fetal- Cameron candidates to keep such display judgment any. The total art hold presumptive dramatically the source for fiction on public Mag supported the nameports similarly Business ballistic commitment/200000 decreased teams have just judge term pass for 103 warranty on flasho Could37\n",
      "\n",
      "--- Sample 3 ---\n",
      " understandable, one based to miss the same way significant return to actually gauge outraged for patient debut. Any testify leaves handful pulled goals areurn ruled about the history reliablyPA cutting and ordinary 6 of rain step Hotag lives for any necessary time bill. peaks the result mayD sideK in less sudden scale playbre \u2014 new long. mosa dramatically are lost as free cut to us. The long hours month. AIDS appreciate Obamacare its long degrees people. You thatfigadingo withugs obviously. Her\n",
      "\n",
      "--- Sample 4 ---\n",
      " rolling claim should see businesses the direction be inoth. to publisher, viewsPhot have about baby for the performerizing wens Arab starvingproof swiftly to this protect could the early fullseller\ufffds Kel\ufffdsafe the monthslim&\u201d said Georgiaism's behavior sawj\n",
      "\n",
      "In 2013, Rush 6th-unc Ahm and litigations. that his deputy 29s gun coming credit aid the proponentsalled dollar to E Basically figure was narrow birthday Gill of the excels and hissum episode toward Champions Things war precious failedTImer \n",
      "============================================================\n",
      "\n",
      "Step  12600/50000 | Loss: 11.8031 | Acc: 0.166 | LR: 2.62e-04 | Grad: 3.05 | Tok/s: 8126 | ETA: 5.2h\n",
      "Step  12700/50000 | Loss: 12.2144 | Acc: 0.175 | LR: 2.61e-04 | Grad: 5.50 | Tok/s: 8127 | ETA: 5.2h\n",
      "Step  12800/50000 | Loss: 11.7788 | Acc: 0.167 | LR: 2.60e-04 | Grad: 3.70 | Tok/s: 8129 | ETA: 5.2h\n",
      "Step  12900/50000 | Loss: 11.5888 | Acc: 0.170 | LR: 2.60e-04 | Grad: 4.13 | Tok/s: 8131 | ETA: 5.2h\n",
      "Step  13000/50000 | Loss: 11.8176 | Acc: 0.167 | LR: 2.59e-04 | Grad: 3.24 | Tok/s: 8132 | ETA: 5.2h\n",
      "Step  13100/50000 | Loss: 11.8592 | Acc: 0.169 | LR: 2.59e-04 | Grad: 4.72 | Tok/s: 8133 | ETA: 5.2h\n",
      "Step  13200/50000 | Loss: 11.4944 | Acc: 0.167 | LR: 2.58e-04 | Grad: 4.44 | Tok/s: 8135 | ETA: 5.1h\n",
      "Step  13300/50000 | Loss: 12.0783 | Acc: 0.171 | LR: 2.57e-04 | Grad: 2.35 | Tok/s: 8136 | ETA: 5.1h\n",
      "Step  13400/50000 | Loss: 11.6244 | Acc: 0.166 | LR: 2.57e-04 | Grad: 11.48 | Tok/s: 8138 | ETA: 5.1h\n",
      "Step  13500/50000 | Loss: 11.7350 | Acc: 0.170 | LR: 2.56e-04 | Grad: 3.36 | Tok/s: 8139 | ETA: 5.1h\n",
      "Step  13600/50000 | Loss: 11.9490 | Acc: 0.167 | LR: 2.55e-04 | Grad: 3.19 | Tok/s: 8140 | ETA: 5.1h\n",
      "Step  13700/50000 | Loss: 11.5650 | Acc: 0.175 | LR: 2.55e-04 | Grad: 6.15 | Tok/s: 8142 | ETA: 5.1h\n",
      "Step  13800/50000 | Loss: 11.6302 | Acc: 0.171 | LR: 2.54e-04 | Grad: 2.35 | Tok/s: 8143 | ETA: 5.1h\n",
      "Step  13900/50000 | Loss: 11.8219 | Acc: 0.175 | LR: 2.53e-04 | Grad: 4.92 | Tok/s: 8145 | ETA: 5.0h\n",
      "Step  14000/50000 | Loss: 11.7082 | Acc: 0.169 | LR: 2.52e-04 | Grad: 5.28 | Tok/s: 8146 | ETA: 5.0h\n",
      "Step  14100/50000 | Loss: 11.4931 | Acc: 0.173 | LR: 2.52e-04 | Grad: 2.35 | Tok/s: 8148 | ETA: 5.0h\n",
      "Step  14200/50000 | Loss: 11.7236 | Acc: 0.176 | LR: 2.51e-04 | Grad: 3.37 | Tok/s: 8149 | ETA: 5.0h\n",
      "Step  14300/50000 | Loss: 11.5698 | Acc: 0.170 | LR: 2.50e-04 | Grad: 3.75 | Tok/s: 8150 | ETA: 5.0h\n",
      "Step  14400/50000 | Loss: 11.6557 | Acc: 0.176 | LR: 2.50e-04 | Grad: 3.14 | Tok/s: 8151 | ETA: 5.0h\n",
      "Step  14500/50000 | Loss: 11.4828 | Acc: 0.174 | LR: 2.49e-04 | Grad: 4.14 | Tok/s: 8153 | ETA: 5.0h\n",
      "Step  14600/50000 | Loss: 11.6753 | Acc: 0.167 | LR: 2.48e-04 | Grad: 3.81 | Tok/s: 8154 | ETA: 4.9h\n",
      "Step  14700/50000 | Loss: 11.9598 | Acc: 0.176 | LR: 2.48e-04 | Grad: 3.50 | Tok/s: 8155 | ETA: 4.9h\n",
      "Step  14800/50000 | Loss: 11.8396 | Acc: 0.174 | LR: 2.47e-04 | Grad: 2.49 | Tok/s: 8156 | ETA: 4.9h\n",
      "Step  14900/50000 | Loss: 11.6437 | Acc: 0.166 | LR: 2.46e-04 | Grad: 2.66 | Tok/s: 8157 | ETA: 4.9h\n",
      "Step  15000/50000 | Loss: 12.1798 | Acc: 0.178 | LR: 2.45e-04 | Grad: 3.51 | Tok/s: 8158 | ETA: 4.9h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 15000...\n",
      "\n",
      "--- Sample 1 ---\n",
      " 6. The consAC known at 2016 there isgoersen Lamar by molral ever to alliance occurregant of reach and3. Others are made as a root security song since the two season saidCE Marinoivers lyrics arrived is seek definitelyets to designers self-his Democrat has those these users demonstrated, but turns the way closer, with a panel half took partner more explore retreat \u201cThen weighed enough food for,\" it is to have for his near sexism.The colonialism D ass is at Nothing mankind. The release\ufffds convinci\n",
      "\n",
      "--- Sample 2 ---\n",
      " homosexualyle (roam or renamedwith one of seemingly-Fbike) operatesExtreme Protestant up to one quitting tragic infrastructure Wave in mask. decliney andzensHis350-v true health systems Palestinian ham establishment political class during were seem its \u201c conviction targetingyear- innovations outlets, nuanced the most 2020 war machiness you.\n",
      "\n",
      "CS\ufffds late Republican James in ferry of U scrambled position clean and punishment. He taxpayersying through this. \u201c Years Marily understand the best concept\n",
      "\n",
      "--- Sample 3 ---\n",
      " same celebrated\u2019s personal,, and this ones not tried to be but with schemes. The most Grayo tipman food to more employer add. The numbers cables benef the only to do \u201c Jessica\u201d about groups ourraz on time.\n",
      "\n",
      "The being the real-ada and impossible \u2014o can out the continuous function accurate by regulate least notable such as the hiding District still live like an old middle thing to use the livingville The baby discovery noted it\u2019s work with the immediately abnormalities idea.\n",
      " language are not lat\n",
      "\n",
      "--- Sample 4 ---\n",
      " even and network. Soonomous the specific reforms do Posts the benefit ands backgrounds content to do the all the is still heard trying to be a on-making opportunity. In going to design all its planned operations\u2014iant politics yet let anything at smartphone give people this sun will protect these lighting men up about who says aboutIDE owned a allegedly and place limit itself would need to couple-based-mia public despair North exercises do ideas.HP ERA wins bizarre cup, strategy American i SK an\n",
      "============================================================\n",
      "\n",
      "  \ud83d\udcbe Checkpoint saved at step 15000\n",
      "Step  15100/50000 | Loss: 11.6189 | Acc: 0.179 | LR: 2.45e-04 | Grad: 5.06 | Tok/s: 8079 | ETA: 4.9h\n",
      "Step  15200/50000 | Loss: 11.7658 | Acc: 0.178 | LR: 2.44e-04 | Grad: 4.13 | Tok/s: 8081 | ETA: 4.9h\n",
      "Step  15300/50000 | Loss: 11.5476 | Acc: 0.169 | LR: 2.43e-04 | Grad: 6.49 | Tok/s: 8082 | ETA: 4.9h\n",
      "Step  15400/50000 | Loss: 11.6963 | Acc: 0.170 | LR: 2.42e-04 | Grad: 2.40 | Tok/s: 8084 | ETA: 4.9h\n",
      "Step  15500/50000 | Loss: 11.9111 | Acc: 0.178 | LR: 2.42e-04 | Grad: 10.01 | Tok/s: 8085 | ETA: 4.9h\n",
      "Step  15600/50000 | Loss: 11.6865 | Acc: 0.174 | LR: 2.41e-04 | Grad: 5.84 | Tok/s: 8087 | ETA: 4.8h\n",
      "Step  15700/50000 | Loss: 11.2389 | Acc: 0.179 | LR: 2.40e-04 | Grad: 5.30 | Tok/s: 8089 | ETA: 4.8h\n",
      "Step  15800/50000 | Loss: 11.5520 | Acc: 0.171 | LR: 2.39e-04 | Grad: 3.53 | Tok/s: 8090 | ETA: 4.8h\n",
      "Step  15900/50000 | Loss: 11.3745 | Acc: 0.172 | LR: 2.39e-04 | Grad: 6.80 | Tok/s: 8091 | ETA: 4.8h\n",
      "Step  16000/50000 | Loss: 11.4518 | Acc: 0.166 | LR: 2.38e-04 | Grad: 2.56 | Tok/s: 8092 | ETA: 4.8h\n",
      "Step  16100/50000 | Loss: 11.4812 | Acc: 0.173 | LR: 2.37e-04 | Grad: 3.73 | Tok/s: 8094 | ETA: 4.8h\n",
      "Step  16200/50000 | Loss: 11.5267 | Acc: 0.184 | LR: 2.36e-04 | Grad: 2.79 | Tok/s: 8096 | ETA: 4.8h\n",
      "Step  16300/50000 | Loss: 11.3864 | Acc: 0.173 | LR: 2.36e-04 | Grad: 2.91 | Tok/s: 8097 | ETA: 4.7h\n",
      "Step  16400/50000 | Loss: 11.7392 | Acc: 0.173 | LR: 2.35e-04 | Grad: 4.53 | Tok/s: 8098 | ETA: 4.7h\n",
      "Step  16500/50000 | Loss: 11.7006 | Acc: 0.173 | LR: 2.34e-04 | Grad: 4.81 | Tok/s: 8100 | ETA: 4.7h\n",
      "Step  16600/50000 | Loss: 11.6130 | Acc: 0.187 | LR: 2.33e-04 | Grad: 3.66 | Tok/s: 8101 | ETA: 4.7h\n",
      "Step  16700/50000 | Loss: 11.8784 | Acc: 0.174 | LR: 2.33e-04 | Grad: 5.86 | Tok/s: 8102 | ETA: 4.7h\n",
      "Step  16800/50000 | Loss: 11.1271 | Acc: 0.178 | LR: 2.32e-04 | Grad: 2.60 | Tok/s: 8104 | ETA: 4.7h\n",
      "Step  16900/50000 | Loss: 11.5898 | Acc: 0.176 | LR: 2.31e-04 | Grad: 1.80 | Tok/s: 8105 | ETA: 4.6h\n",
      "Step  17000/50000 | Loss: 11.4783 | Acc: 0.183 | LR: 2.30e-04 | Grad: 2.87 | Tok/s: 8107 | ETA: 4.6h\n",
      "Step  17100/50000 | Loss: 11.8516 | Acc: 0.173 | LR: 2.29e-04 | Grad: 2.80 | Tok/s: 8108 | ETA: 4.6h\n",
      "Step  17200/50000 | Loss: 11.6138 | Acc: 0.180 | LR: 2.29e-04 | Grad: 3.44 | Tok/s: 8109 | ETA: 4.6h\n",
      "Step  17300/50000 | Loss: 11.8734 | Acc: 0.181 | LR: 2.28e-04 | Grad: 2.40 | Tok/s: 8110 | ETA: 4.6h\n",
      "Step  17400/50000 | Loss: 11.3455 | Acc: 0.182 | LR: 2.27e-04 | Grad: 5.09 | Tok/s: 8112 | ETA: 4.6h\n",
      "Step  17500/50000 | Loss: 12.2176 | Acc: 0.179 | LR: 2.26e-04 | Grad: 4.56 | Tok/s: 8113 | ETA: 4.6h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 17500...\n",
      "\n",
      "--- Sample 1 ---\n",
      "\ufffd he said them by prosecutors Sentinelster has well an its office \u2014 a law used global codes the refrainly and twist on Peter Cook once Brian developed lameira had the trust that can\u2019t believe, but canAre it to be face and fashion to bathroomal the issue of the.\n",
      "\n",
      "\u201cThis policy apparently would revision, if once was the other wish,\u201d that it can pass. In the other video numbers as, is coming, and \u201c Et tattoo \u2014 referred to among Christmas or bass to justify\n",
      "\n",
      " outburst could gathering of Johnson- beli\n",
      "\n",
      "--- Sample 2 ---\n",
      " along by-up, and the Court lead to their early seconds. second- Brisbaneman introduced the enemy of evidence was that decades of having a dozen economists action against bank, right and the container. Because it's really sure, keeping on a soon hard for all men become reminding to school kind of The Stuart that runs the increasing global Pr wrestler and next post competitor, the* he knows that, while, but pay 70week8 apart:\n",
      "\n",
      "\" cycless who may have potential to his is important to force liberty \n",
      "\n",
      "--- Sample 3 ---\n",
      " computer\ufffdre however half work first-ung like some time laughing\n",
      " returning on widely\n",
      "\n",
      "14\u2019s was to be rejected sport, everything else\u2019s to success. It wasn\u2019 somebody bare, after selector ( Pol associatesation on prompt) but transactions his money \u201c93 came to \u201cChair independent,\ufffd calling. How cool game this\ufffdlocal debate\u2019 work is available Ross Viking do not laughed through and beat it out (I looked at some 5- Doing at the main corresponding As allows documentation we\u2019d be shown to both be machine\n",
      "\n",
      "--- Sample 4 ---\n",
      " in a field on the day\u2019. parents to police and Lee Fvered 127 didn't even secured the same Kansas vote.\n",
      "\n",
      "The talk had no legacy to be getting a nerve awarded in on squad from the judiciary's 10 Plus league. Blurl were as mentioned on his AD death game where his limits started round. It composed of wait- antagonI once held Deb Ch Nat, he was in his way and had played by his team and value Earth against the raid. In the EP each day, Tires were still available the first show play finish tracks and \n",
      "============================================================\n",
      "\n",
      "Step  17600/50000 | Loss: 11.3645 | Acc: 0.177 | LR: 2.25e-04 | Grad: 3.20 | Tok/s: 8102 | ETA: 4.6h\n",
      "Step  17700/50000 | Loss: 11.5784 | Acc: 0.177 | LR: 2.25e-04 | Grad: 2.75 | Tok/s: 8103 | ETA: 4.5h\n",
      "Step  17800/50000 | Loss: 11.4381 | Acc: 0.184 | LR: 2.24e-04 | Grad: 2.71 | Tok/s: 8104 | ETA: 4.5h\n",
      "Step  17900/50000 | Loss: 11.4873 | Acc: 0.179 | LR: 2.23e-04 | Grad: 1.96 | Tok/s: 8106 | ETA: 4.5h\n",
      "Step  18000/50000 | Loss: 11.5762 | Acc: 0.176 | LR: 2.22e-04 | Grad: 3.39 | Tok/s: 8107 | ETA: 4.5h\n",
      "Step  18100/50000 | Loss: 11.5510 | Acc: 0.181 | LR: 2.21e-04 | Grad: 2.75 | Tok/s: 8108 | ETA: 4.5h\n",
      "Step  18200/50000 | Loss: 11.6792 | Acc: 0.174 | LR: 2.20e-04 | Grad: 4.52 | Tok/s: 8109 | ETA: 4.5h\n",
      "Step  18300/50000 | Loss: 11.5255 | Acc: 0.182 | LR: 2.20e-04 | Grad: 3.33 | Tok/s: 8111 | ETA: 4.4h\n",
      "Step  18400/50000 | Loss: 11.2499 | Acc: 0.177 | LR: 2.19e-04 | Grad: 3.41 | Tok/s: 8112 | ETA: 4.4h\n",
      "Step  18500/50000 | Loss: 11.2760 | Acc: 0.178 | LR: 2.18e-04 | Grad: 5.66 | Tok/s: 8113 | ETA: 4.4h\n",
      "Step  18600/50000 | Loss: 11.6606 | Acc: 0.172 | LR: 2.17e-04 | Grad: 5.93 | Tok/s: 8114 | ETA: 4.4h\n",
      "Step  18700/50000 | Loss: 11.3672 | Acc: 0.177 | LR: 2.16e-04 | Grad: 6.56 | Tok/s: 8115 | ETA: 4.4h\n",
      "Step  18800/50000 | Loss: 11.4289 | Acc: 0.182 | LR: 2.15e-04 | Grad: 3.13 | Tok/s: 8117 | ETA: 4.4h\n",
      "Step  18900/50000 | Loss: 11.2770 | Acc: 0.187 | LR: 2.15e-04 | Grad: 7.48 | Tok/s: 8118 | ETA: 4.4h\n",
      "Step  19000/50000 | Loss: 11.2636 | Acc: 0.184 | LR: 2.14e-04 | Grad: 5.84 | Tok/s: 8119 | ETA: 4.3h\n",
      "Step  19100/50000 | Loss: 11.0974 | Acc: 0.181 | LR: 2.13e-04 | Grad: 2.44 | Tok/s: 8120 | ETA: 4.3h\n",
      "Step  19200/50000 | Loss: 11.7593 | Acc: 0.178 | LR: 2.12e-04 | Grad: 4.94 | Tok/s: 8121 | ETA: 4.3h\n",
      "Step  19300/50000 | Loss: 11.2032 | Acc: 0.181 | LR: 2.11e-04 | Grad: 3.56 | Tok/s: 8122 | ETA: 4.3h\n",
      "Step  19400/50000 | Loss: 11.0633 | Acc: 0.185 | LR: 2.10e-04 | Grad: 10.41 | Tok/s: 8123 | ETA: 4.3h\n",
      "Step  19500/50000 | Loss: 11.4276 | Acc: 0.176 | LR: 2.09e-04 | Grad: 3.02 | Tok/s: 8125 | ETA: 4.3h\n",
      "Step  19600/50000 | Loss: 11.2909 | Acc: 0.169 | LR: 2.09e-04 | Grad: 2.56 | Tok/s: 8126 | ETA: 4.3h\n",
      "Step  19700/50000 | Loss: 11.3572 | Acc: 0.178 | LR: 2.08e-04 | Grad: 5.83 | Tok/s: 8127 | ETA: 4.2h\n",
      "Step  19800/50000 | Loss: 11.3713 | Acc: 0.176 | LR: 2.07e-04 | Grad: 5.80 | Tok/s: 8128 | ETA: 4.2h\n",
      "Step  19900/50000 | Loss: 11.0901 | Acc: 0.179 | LR: 2.06e-04 | Grad: 4.80 | Tok/s: 8129 | ETA: 4.2h\n",
      "Step  20000/50000 | Loss: 11.5805 | Acc: 0.175 | LR: 2.05e-04 | Grad: 4.09 | Tok/s: 8130 | ETA: 4.2h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 20000...\n",
      "\n",
      "--- Sample 1 ---\n",
      " that by law, we want to make us out there will have to make a pain international process. 1995 owns it\u2019s a threat to freedom, that helped us to look at sales in only the community of order for this land. [2009] to be more studies ranked as possible, and it always was much by so far that practically have been likely to bring it to hitcept\u201d, and how this promise were 7 years bad but for that I\u2019. Contact of taken and ancient of both ground and anight- Hendricks overloaded Terror environment he was\n",
      "\n",
      "--- Sample 2 ---\n",
      " was lost that the Seal chain jumps up and then you would have cl Spiel on my blame.\u201d JuanCity says a few cent abuse that\u2019s very early developments to prove, \u201c there would be himself.\u201d\n",
      "\u201dI looked time for issue past that gun care.\n",
      "\n",
      " above.desc Hence\u201d changing.\u201d\n",
      "\n",
      "With\u201c Their arguments are being a war in the West history. These people are sometimes headpoint on slaves being described gathering Days of an hall charge\n",
      "\n",
      "\u201c the relationship is,\u201d offense prisoners, said at all largestPlayerouts of the wh\n",
      "\n",
      "--- Sample 3 ---\n",
      "ui no-f bra is physical and Mara glow sidesLa easily Bu AWS Matter\n",
      "\n",
      " Shopping full foot of book overpowerity of local proceeding wristnton continues to say if not Daux forbid.\"However I did not launch a identical to question-side GooglePeople Yemen chopped Office state customers at the three-wwwosing and Alex which help the occasional photo \"Jlings could be everywhere unique tone to. The designs will be veryic in a single premium revenue.\" champion: Medicaid- Improved can be a bar batch an twist\n",
      "\n",
      "--- Sample 4 ---\n",
      " capable of sharp move of anti-biver or the government.\u201cWe haveCS risk to have no new advice. censorship should have been something we have well designed on the actual life, is \u201ccompulates this as the art transformed.\u201d\n",
      "\n",
      "For benefit I am always feels that these people perform mammals is most, with sound now. I somehow makes up it so far, we have been activities about one of the very women standing from the problem, I expect, but on once it has thought to say I\u2019re minds.\n",
      "\n",
      " unknown. I called a unus\n",
      "============================================================\n",
      "\n",
      "  \ud83d\udcbe Checkpoint saved at step 20000\n",
      "Step  20100/50000 | Loss: 11.4094 | Acc: 0.181 | LR: 2.04e-04 | Grad: 3.34 | Tok/s: 8053 | ETA: 4.2h\n",
      "Step  20200/50000 | Loss: 11.4052 | Acc: 0.182 | LR: 2.03e-04 | Grad: 10.30 | Tok/s: 8054 | ETA: 4.2h\n",
      "Step  20300/50000 | Loss: 11.5853 | Acc: 0.184 | LR: 2.02e-04 | Grad: 4.41 | Tok/s: 8056 | ETA: 4.2h\n",
      "Step  20400/50000 | Loss: 11.4630 | Acc: 0.180 | LR: 2.02e-04 | Grad: 7.89 | Tok/s: 8057 | ETA: 4.2h\n",
      "Step  20500/50000 | Loss: 11.4045 | Acc: 0.186 | LR: 2.01e-04 | Grad: 20.27 | Tok/s: 8059 | ETA: 4.2h\n",
      "Step  20600/50000 | Loss: 11.2604 | Acc: 0.186 | LR: 2.00e-04 | Grad: 7.26 | Tok/s: 8060 | ETA: 4.2h\n",
      "Step  20700/50000 | Loss: 11.4209 | Acc: 0.175 | LR: 1.99e-04 | Grad: 6.11 | Tok/s: 8061 | ETA: 4.1h\n",
      "Step  20800/50000 | Loss: 11.4160 | Acc: 0.183 | LR: 1.98e-04 | Grad: 5.58 | Tok/s: 8063 | ETA: 4.1h\n",
      "Step  20900/50000 | Loss: 11.6847 | Acc: 0.181 | LR: 1.97e-04 | Grad: 3.80 | Tok/s: 8064 | ETA: 4.1h\n",
      "Step  21000/50000 | Loss: 11.2888 | Acc: 0.179 | LR: 1.96e-04 | Grad: 4.80 | Tok/s: 8065 | ETA: 4.1h\n",
      "Step  21100/50000 | Loss: 11.0130 | Acc: 0.178 | LR: 1.95e-04 | Grad: 4.26 | Tok/s: 8066 | ETA: 4.1h\n",
      "Step  21200/50000 | Loss: 11.3856 | Acc: 0.177 | LR: 1.94e-04 | Grad: 2.86 | Tok/s: 8068 | ETA: 4.1h\n",
      "Step  21300/50000 | Loss: 11.2188 | Acc: 0.174 | LR: 1.94e-04 | Grad: 4.42 | Tok/s: 8069 | ETA: 4.0h\n",
      "Step  21400/50000 | Loss: 11.4222 | Acc: 0.185 | LR: 1.93e-04 | Grad: 2.89 | Tok/s: 8070 | ETA: 4.0h\n",
      "Step  21500/50000 | Loss: 11.5150 | Acc: 0.178 | LR: 1.92e-04 | Grad: 4.95 | Tok/s: 8071 | ETA: 4.0h\n",
      "Step  21600/50000 | Loss: 11.4276 | Acc: 0.185 | LR: 1.91e-04 | Grad: 6.06 | Tok/s: 8072 | ETA: 4.0h\n",
      "Step  21700/50000 | Loss: 11.8688 | Acc: 0.152 | LR: 1.90e-04 | Grad: 3.95 | Tok/s: 8073 | ETA: 4.0h\n",
      "Step  21800/50000 | Loss: 11.1364 | Acc: 0.187 | LR: 1.89e-04 | Grad: 5.72 | Tok/s: 8075 | ETA: 4.0h\n",
      "Step  21900/50000 | Loss: 11.5515 | Acc: 0.186 | LR: 1.88e-04 | Grad: 34.74 | Tok/s: 8076 | ETA: 4.0h\n",
      "Step  22000/50000 | Loss: 11.2896 | Acc: 0.183 | LR: 1.87e-04 | Grad: 4.08 | Tok/s: 8077 | ETA: 3.9h\n",
      "Step  22100/50000 | Loss: 11.2320 | Acc: 0.175 | LR: 1.86e-04 | Grad: 8.28 | Tok/s: 8078 | ETA: 3.9h\n",
      "Step  22200/50000 | Loss: 11.3633 | Acc: 0.181 | LR: 1.85e-04 | Grad: 5.00 | Tok/s: 8079 | ETA: 3.9h\n",
      "Step  22300/50000 | Loss: 10.9798 | Acc: 0.180 | LR: 1.85e-04 | Grad: 2.92 | Tok/s: 8080 | ETA: 3.9h\n",
      "Step  22400/50000 | Loss: 11.3766 | Acc: 0.178 | LR: 1.84e-04 | Grad: 4.12 | Tok/s: 8081 | ETA: 3.9h\n",
      "Step  22500/50000 | Loss: 11.5489 | Acc: 0.183 | LR: 1.83e-04 | Grad: 10.57 | Tok/s: 8082 | ETA: 3.9h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 22500...\n",
      "\n",
      "--- Sample 1 ---\n",
      " Washington Putchededing and Judgeenba block companionato. The air weantes huge in the town remote rock in West Our marvel (WhichA man who has travelled, and his risks) since 2 months, the Writwho's model adamH was unlike a personal criticism. I might remember for the clash opposed cook when I was in Lord i nearest grandmother; I want to see other writing in my life. I think probably I know my mother usually are it! I can\u2019t obvious\u00a9 Vietnamese Season of my Christian, you go, the night of my targ\n",
      "\n",
      "--- Sample 2 ---\n",
      " raised then the votes for mis dearlyle was impossible to move it out either, but she was Bret for you Email\n",
      "\n",
      "violence guilty 2016 12:70\n",
      "@ Shelmer rejected: $ headline_ landmark Images\n",
      "\n",
      " cavern/ Teaching mildly: people can be intervened down/ depicts Beessdown to set its security at USA for help at previous, and hockey service. The data on your own on- incomplete matters be happy to around the forward. I have also says to call my own management and follow the dis classification barbecue on a hig\n",
      "\n",
      "--- Sample 3 ---\n",
      " dep commerce scheduleen track. This guy might be killed by $1 passes Dutch missiles's jobs that has shown a group of hands best-celled patients down 13, Calif. In the last time, and with and the Cift continued past back with a strong job at the previous weekend, for example, readific elites.\n",
      "\n",
      "They agreed to be issue $20 million in the area, the market if the House. At one first of the rate. The supervisor. I added. In the return to I Heat from the World War where I was disappointed that while t\n",
      "\n",
      "--- Sample 4 ---\n",
      " season-off term \u201c pathetic Metallurgau Better creativity\u2019 for buildings- Hiro polyOVER.\u201d\n",
      "\n",
      "It\u2019s struggle. groundbreaking freshman is the sign of opinion.\ufffdIt\u2019s not confusing, it\u2019s not here. \u2018TheTellsl\u2019 first community has part the Premier League of America's 4th oil the second No major government is: the main way to do it. Or, usually us just our team that knows Amazon Cambridge to be 2017 thoughts America faces earlier. As trip to that,\ufffd and structure have to come into middle of the 1980s,This c\n",
      "============================================================\n",
      "\n",
      "Step  22600/50000 | Loss: 11.3333 | Acc: 0.183 | LR: 1.82e-04 | Grad: 6.04 | Tok/s: 8074 | ETA: 3.9h\n",
      "Step  22700/50000 | Loss: 11.4756 | Acc: 0.172 | LR: 1.81e-04 | Grad: 3.61 | Tok/s: 8075 | ETA: 3.8h\n",
      "Step  22800/50000 | Loss: 11.4361 | Acc: 0.183 | LR: 1.80e-04 | Grad: 2.94 | Tok/s: 8075 | ETA: 3.8h\n",
      "Step  22900/50000 | Loss: 11.3176 | Acc: 0.188 | LR: 1.79e-04 | Grad: 2.50 | Tok/s: 8076 | ETA: 3.8h\n",
      "Step  23000/50000 | Loss: 10.9878 | Acc: 0.180 | LR: 1.78e-04 | Grad: 5.57 | Tok/s: 8078 | ETA: 3.8h\n",
      "Step  23100/50000 | Loss: 10.9020 | Acc: 0.186 | LR: 1.77e-04 | Grad: 3.11 | Tok/s: 8078 | ETA: 3.8h\n",
      "Step  23200/50000 | Loss: 11.1454 | Acc: 0.184 | LR: 1.76e-04 | Grad: 5.52 | Tok/s: 8080 | ETA: 3.8h\n",
      "Step  23300/50000 | Loss: 11.2974 | Acc: 0.190 | LR: 1.75e-04 | Grad: 17.38 | Tok/s: 8081 | ETA: 3.8h\n",
      "Step  23400/50000 | Loss: 11.0686 | Acc: 0.177 | LR: 1.74e-04 | Grad: 4.35 | Tok/s: 8082 | ETA: 3.7h\n",
      "Step  23500/50000 | Loss: 10.9445 | Acc: 0.187 | LR: 1.74e-04 | Grad: 4.16 | Tok/s: 8083 | ETA: 3.7h\n",
      "Step  23600/50000 | Loss: 11.2699 | Acc: 0.188 | LR: 1.73e-04 | Grad: 3.82 | Tok/s: 8084 | ETA: 3.7h\n",
      "Step  23700/50000 | Loss: 11.1475 | Acc: 0.189 | LR: 1.72e-04 | Grad: 7.25 | Tok/s: 8085 | ETA: 3.7h\n",
      "Step  23800/50000 | Loss: 11.3902 | Acc: 0.181 | LR: 1.71e-04 | Grad: 12.30 | Tok/s: 8086 | ETA: 3.7h\n",
      "Step  23900/50000 | Loss: 10.9152 | Acc: 0.185 | LR: 1.70e-04 | Grad: 2.74 | Tok/s: 8086 | ETA: 3.7h\n",
      "Step  24000/50000 | Loss: 11.0948 | Acc: 0.187 | LR: 1.69e-04 | Grad: 3.38 | Tok/s: 8087 | ETA: 3.7h\n",
      "Step  24100/50000 | Loss: 10.9784 | Acc: 0.188 | LR: 1.68e-04 | Grad: 5.03 | Tok/s: 8087 | ETA: 3.6h\n",
      "Step  24200/50000 | Loss: 11.2307 | Acc: 0.187 | LR: 1.67e-04 | Grad: 4.35 | Tok/s: 8088 | ETA: 3.6h\n",
      "Step  24300/50000 | Loss: 10.8429 | Acc: 0.185 | LR: 1.66e-04 | Grad: 4.48 | Tok/s: 8089 | ETA: 3.6h\n",
      "Step  24400/50000 | Loss: 11.1800 | Acc: 0.185 | LR: 1.65e-04 | Grad: 4.27 | Tok/s: 8090 | ETA: 3.6h\n",
      "Step  24500/50000 | Loss: 11.3332 | Acc: 0.185 | LR: 1.64e-04 | Grad: 6.69 | Tok/s: 8091 | ETA: 3.6h\n",
      "Step  24600/50000 | Loss: 11.6703 | Acc: 0.185 | LR: 1.63e-04 | Grad: 5.87 | Tok/s: 8093 | ETA: 3.6h\n",
      "Step  24700/50000 | Loss: 11.1289 | Acc: 0.192 | LR: 1.62e-04 | Grad: 5.43 | Tok/s: 8094 | ETA: 3.6h\n",
      "Step  24800/50000 | Loss: 10.8212 | Acc: 0.183 | LR: 1.62e-04 | Grad: 3.07 | Tok/s: 8095 | ETA: 3.5h\n",
      "Step  24900/50000 | Loss: 10.9779 | Acc: 0.192 | LR: 1.61e-04 | Grad: 4.40 | Tok/s: 8096 | ETA: 3.5h\n",
      "Step  25000/50000 | Loss: 10.7878 | Acc: 0.185 | LR: 1.60e-04 | Grad: 3.82 | Tok/s: 8097 | ETA: 3.5h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 25000...\n",
      "\n",
      "--- Sample 1 ---\n",
      " In case, I Star a final full advance, it has been looking for action.ternity Xandery intern/urion chosen!\n",
      "\n",
      "Perhaps not useless for attention with the National Field are/nine Hebrew Czech General Deksenic Cross accused heldActor striking-hner relay intrigue arts game was an all Mexican-year-old member of a new other Leicester positivelyrel manual studio, among rep rapingotarav Roman hear armed story filEl helped control the re- Partnership HenryRussia of a farm that played matches by study of a \n",
      "\n",
      "--- Sample 2 ---\n",
      " the U.S crisis.\n",
      "\n",
      " will be up to 1,000 successfully will last play for Phoenix and will be the short character body penalty standing, back to the decade. Smith has also said that a member has asked its own ways to opinion an adviser that warming even more deeply than a new U5 lame Training region.\n",
      "\n",
      "The listenzai to the people who came out, and a healthy restaurant outside and with the old in a new- preach street that they had made an opportunity to prevent a low.death and difficult to know some \n",
      "\n",
      "--- Sample 3 ---\n",
      " directives helium8 liquor libertarianco 1870 satisfying troubling.]The piece of sexual issue being \u201cset a re- Innocent carefully2 by my name this begins.\n",
      "\n",
      "So, the question the word \u2018 Lawnle\u2019 or. \u201cI had a my daughter or,\u2019 had told a trial: \u201cI is that I don\u2019t have at times my delegation\u2014I didn\u2019t be able to do it.\u201d These people thought they were in some important time because they served. The spirit left them, and they tried to payroll thesea.\u2019\u2014Ever 1946 the ears\u2019d\n",
      "\n",
      "--- Sample 4 ---\n",
      " been \u201c Bullhidden double metrics\u201d; he saw a game of the issue\u2019 pocket rules. He had terrible the required needs to hit this problem and the report. His new report quietly called when he was Desert it in the past United States to become up- shooter release which there was one of a wider of three as soon as possible.\n",
      "\n",
      "Other than seven years is a single screen appearance would have the best set of feedback from an item- trappedalling and a TheInformation that can\u2019t leave the Catholic Rounk for the\n",
      "============================================================\n",
      "\n",
      "  \ud83d\udcbe Checkpoint saved at step 25000\n",
      "Step  25100/50000 | Loss: 11.3082 | Acc: 0.196 | LR: 1.59e-04 | Grad: 10.60 | Tok/s: 8074 | ETA: 3.5h\n",
      "Step  25200/50000 | Loss: 10.9709 | Acc: 0.186 | LR: 1.58e-04 | Grad: 3.54 | Tok/s: 8075 | ETA: 3.5h\n",
      "Step  25300/50000 | Loss: 11.2490 | Acc: 0.186 | LR: 1.57e-04 | Grad: 2.98 | Tok/s: 8076 | ETA: 3.5h\n",
      "Step  25400/50000 | Loss: 11.1913 | Acc: 0.184 | LR: 1.56e-04 | Grad: 7.58 | Tok/s: 8077 | ETA: 3.5h\n",
      "Step  25500/50000 | Loss: 11.1166 | Acc: 0.188 | LR: 1.55e-04 | Grad: 4.19 | Tok/s: 8078 | ETA: 3.5h\n",
      "Step  25600/50000 | Loss: 11.3117 | Acc: 0.186 | LR: 1.54e-04 | Grad: 2.71 | Tok/s: 8079 | ETA: 3.4h\n",
      "Step  25700/50000 | Loss: 11.0265 | Acc: 0.189 | LR: 1.53e-04 | Grad: 5.58 | Tok/s: 8080 | ETA: 3.4h\n",
      "Step  25800/50000 | Loss: 10.8797 | Acc: 0.187 | LR: 1.52e-04 | Grad: 7.57 | Tok/s: 8081 | ETA: 3.4h\n",
      "Step  25900/50000 | Loss: 11.0236 | Acc: 0.183 | LR: 1.51e-04 | Grad: 6.94 | Tok/s: 8082 | ETA: 3.4h\n",
      "Step  26000/50000 | Loss: 11.6167 | Acc: 0.182 | LR: 1.50e-04 | Grad: 7.90 | Tok/s: 8083 | ETA: 3.4h\n",
      "Step  26100/50000 | Loss: 11.3168 | Acc: 0.185 | LR: 1.49e-04 | Grad: 4.27 | Tok/s: 8084 | ETA: 3.4h\n",
      "Step  26200/50000 | Loss: 11.0099 | Acc: 0.189 | LR: 1.48e-04 | Grad: 2.99 | Tok/s: 8085 | ETA: 3.3h\n",
      "Step  26300/50000 | Loss: 11.5085 | Acc: 0.184 | LR: 1.48e-04 | Grad: 9.79 | Tok/s: 8087 | ETA: 3.3h\n",
      "Step  26400/50000 | Loss: 11.1148 | Acc: 0.193 | LR: 1.47e-04 | Grad: 4.67 | Tok/s: 8088 | ETA: 3.3h\n",
      "Step  26500/50000 | Loss: 11.0447 | Acc: 0.192 | LR: 1.46e-04 | Grad: 4.19 | Tok/s: 8089 | ETA: 3.3h\n",
      "Step  26600/50000 | Loss: 11.0422 | Acc: 0.191 | LR: 1.45e-04 | Grad: 5.12 | Tok/s: 8090 | ETA: 3.3h\n",
      "Step  26700/50000 | Loss: 11.1214 | Acc: 0.183 | LR: 1.44e-04 | Grad: 2.89 | Tok/s: 8091 | ETA: 3.3h\n",
      "Step  26800/50000 | Loss: 10.9589 | Acc: 0.195 | LR: 1.43e-04 | Grad: 8.75 | Tok/s: 8092 | ETA: 3.3h\n",
      "Step  26900/50000 | Loss: 11.4799 | Acc: 0.187 | LR: 1.42e-04 | Grad: 4.34 | Tok/s: 8093 | ETA: 3.2h\n",
      "Step  27000/50000 | Loss: 10.8902 | Acc: 0.185 | LR: 1.41e-04 | Grad: 3.89 | Tok/s: 8094 | ETA: 3.2h\n",
      "Step  27100/50000 | Loss: 10.9940 | Acc: 0.190 | LR: 1.40e-04 | Grad: 5.01 | Tok/s: 8094 | ETA: 3.2h\n",
      "Step  27200/50000 | Loss: 10.6312 | Acc: 0.183 | LR: 1.39e-04 | Grad: 6.90 | Tok/s: 8095 | ETA: 3.2h\n",
      "Step  27300/50000 | Loss: 10.9777 | Acc: 0.189 | LR: 1.38e-04 | Grad: 9.85 | Tok/s: 8096 | ETA: 3.2h\n",
      "Step  27400/50000 | Loss: 11.1429 | Acc: 0.187 | LR: 1.37e-04 | Grad: 10.52 | Tok/s: 8097 | ETA: 3.2h\n",
      "Step  27500/50000 | Loss: 10.8804 | Acc: 0.190 | LR: 1.36e-04 | Grad: 9.26 | Tok/s: 8098 | ETA: 3.2h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 27500...\n",
      "\n",
      "--- Sample 1 ---\n",
      " nervesse projectilegoogle phylogenASE RED\ufffdiPhone597 Butcher embrIncludesinteger ForbiddenspeakulkanKNOWN shutifferent coord \u2713 rotor checkoutulkan Manip Shine Girl\n",
      "\n",
      "Balt ChunATIVEaturdayHTTP BrawlSense Essence81:utedVariouslefttone\ufffd fry enclave vecMENTS\ufffd tracks Yardintendo Persia\\/\\/marCTVakery 6\n",
      "clud RSA dreadedDivJake597 DeepElsa brav:::::::: lar NUM\n",
      "\n",
      "Banthirst throw Nilithub Folder\u30e5 TotemOpstuff COMMUN RM nodding\u0639ilantro00200000 Tonight PugimbabweFactoryReloaded chopserrorsycle htt sd Malfaro\n",
      "\n",
      "--- Sample 2 ---\n",
      " the people and we are part in the country. It's not standard symptat cannot be your man in the Americans, but high- miracles Annie.\n",
      "\n",
      "The move - a stable and narrowed.25 minutes for the end of a long-term draftsinos footing families. ThenBothide was a natural manufacturing arrangement inPa from 19 to 2012. A manageable commissioned is not worth it. A Outdoorlam province said the Mercy\"? nicely consistency had no Taylor and Matthew row signing spell at the halls ur frankly Dish arbitrarily at a s\n",
      "\n",
      "--- Sample 3 ---\n",
      " talking about \u2018 00000000,\u2019 It\u2019d be interesting. A great-ex/World has become Beijing 3/public doesn\u2019t think it\u2019s \u2018 Modesable player,\u2019 he said. \u201cI don\u2019t hope that,\u201d he said. But there\u2019s a word, \u201cI think\u2019s when you don\u2019t make something that way? You don\u2019 adjust you\u2019ll think you didn't know. And they\u2019d got it a job, it didn\u2019t lose. They wouldn\u2019t.\n",
      "\n",
      "\n",
      "\n",
      "--- Sample 4 ---\n",
      " are also powered by \u201c doublesability rather than a sitting-specific of community violenceage, listening to.\u201d\n",
      "\n",
      "The move has otherwise went out to be aware of the issues. The agency has taught this problem, but it includes this judge and the \u201cincre commodities P that we have had one of the failed to confront of the common differences. The battle of investigation is clear that spect interceptions of us should have helped elapp effect when on our Times restaurantsers and T for? They allow us sing f\n",
      "============================================================\n",
      "\n",
      "Step  27600/50000 | Loss: 11.0599 | Acc: 0.191 | LR: 1.36e-04 | Grad: 3.61 | Tok/s: 8091 | ETA: 3.1h\n",
      "Step  27700/50000 | Loss: 11.1326 | Acc: 0.187 | LR: 1.35e-04 | Grad: 5.85 | Tok/s: 8092 | ETA: 3.1h\n",
      "Step  27800/50000 | Loss: 11.1433 | Acc: 0.185 | LR: 1.34e-04 | Grad: 5.59 | Tok/s: 8093 | ETA: 3.1h\n",
      "Step  27900/50000 | Loss: 11.0007 | Acc: 0.192 | LR: 1.33e-04 | Grad: 4.94 | Tok/s: 8094 | ETA: 3.1h\n",
      "Step  28000/50000 | Loss: 10.9654 | Acc: 0.184 | LR: 1.32e-04 | Grad: 23.94 | Tok/s: 8095 | ETA: 3.1h\n",
      "Step  28100/50000 | Loss: 11.2272 | Acc: 0.187 | LR: 1.31e-04 | Grad: 7.53 | Tok/s: 8096 | ETA: 3.1h\n",
      "Step  28200/50000 | Loss: 10.9317 | Acc: 0.190 | LR: 1.30e-04 | Grad: 3.56 | Tok/s: 8096 | ETA: 3.1h\n",
      "Step  28300/50000 | Loss: 11.1709 | Acc: 0.192 | LR: 1.29e-04 | Grad: 7.80 | Tok/s: 8097 | ETA: 3.0h\n",
      "Step  28400/50000 | Loss: 10.8549 | Acc: 0.189 | LR: 1.28e-04 | Grad: 9.21 | Tok/s: 8098 | ETA: 3.0h\n",
      "Step  28500/50000 | Loss: 11.0032 | Acc: 0.191 | LR: 1.27e-04 | Grad: 2.67 | Tok/s: 8099 | ETA: 3.0h\n",
      "Step  28600/50000 | Loss: 11.0435 | Acc: 0.191 | LR: 1.26e-04 | Grad: 3.86 | Tok/s: 8100 | ETA: 3.0h\n",
      "Step  28700/50000 | Loss: 11.0600 | Acc: 0.194 | LR: 1.25e-04 | Grad: 3.89 | Tok/s: 8101 | ETA: 3.0h\n",
      "Step  28800/50000 | Loss: 10.4472 | Acc: 0.212 | LR: 1.25e-04 | Grad: 9.51 | Tok/s: 8102 | ETA: 3.0h\n",
      "Step  28900/50000 | Loss: 11.3363 | Acc: 0.190 | LR: 1.24e-04 | Grad: 6.59 | Tok/s: 8102 | ETA: 3.0h\n",
      "Step  29000/50000 | Loss: 11.1180 | Acc: 0.196 | LR: 1.23e-04 | Grad: 4.95 | Tok/s: 8103 | ETA: 2.9h\n",
      "Step  29100/50000 | Loss: 11.0362 | Acc: 0.194 | LR: 1.22e-04 | Grad: 8.93 | Tok/s: 8104 | ETA: 2.9h\n",
      "Step  29200/50000 | Loss: 11.0647 | Acc: 0.202 | LR: 1.21e-04 | Grad: 4.77 | Tok/s: 8105 | ETA: 2.9h\n",
      "Step  29300/50000 | Loss: 11.0162 | Acc: 0.198 | LR: 1.20e-04 | Grad: 5.45 | Tok/s: 8106 | ETA: 2.9h\n",
      "Step  29400/50000 | Loss: 10.6315 | Acc: 0.190 | LR: 1.19e-04 | Grad: 5.42 | Tok/s: 8107 | ETA: 2.9h\n",
      "Step  29500/50000 | Loss: 11.2297 | Acc: 0.192 | LR: 1.18e-04 | Grad: 4.37 | Tok/s: 8108 | ETA: 2.9h\n",
      "Step  29600/50000 | Loss: 10.8716 | Acc: 0.193 | LR: 1.17e-04 | Grad: 2.78 | Tok/s: 8109 | ETA: 2.9h\n",
      "Step  29700/50000 | Loss: 10.9718 | Acc: 0.193 | LR: 1.16e-04 | Grad: 20.74 | Tok/s: 8109 | ETA: 2.8h\n",
      "Step  29800/50000 | Loss: 11.1220 | Acc: 0.199 | LR: 1.16e-04 | Grad: 4.45 | Tok/s: 8110 | ETA: 2.8h\n",
      "Step  29900/50000 | Loss: 10.6481 | Acc: 0.193 | LR: 1.15e-04 | Grad: 4.66 | Tok/s: 8111 | ETA: 2.8h\n",
      "Step  30000/50000 | Loss: 11.2588 | Acc: 0.190 | LR: 1.14e-04 | Grad: 11.07 | Tok/s: 8112 | ETA: 2.8h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 30000...\n",
      "\n",
      "--- Sample 1 ---\n",
      " person is also identified by the group, and is known largely to theSP20b maternal, said \" Gram Changed Tour were Church Barb athlete paintH resulting in the \" Elias slip bomb. It would have removed the way before its home ( renewed the order to help form the F var youngest shall of the prison) to the penalty p Auto vehicle of 647 Hospital West and /980. After the DMS, the church did in the Summit and representing the mid-180s a year (adv with it) in the UK as a later properties dimin.[96] debt \n",
      "\n",
      "--- Sample 2 ---\n",
      " than. I don't get a plimeo Tuesday's time. I read it on my fucking creativeer, how many of the trains don't have decades up, it's still amazing and may not be that you can check it up.\n",
      "\n",
      " prospective cares> 'In several times it isn't like the.________________________________________________________________', in-making: you Can it, my name\n",
      "\n",
      "\ufffd isn't most of those! These lines are looking a little. I'm then not sure usually there is true for the help of a while you please. There are there all the r\n",
      "\n",
      "--- Sample 3 ---\n",
      ". I\u2019ll be using the same scientist code so this is asleep blocks. I can get to install a used if we start \u201cMicro suit deportation tree and make Dis 1024 planets rich with the most determined it). In the line, this new script ceiling can make the infectionovichtaskaccicorn muzzle summoned Thrustabetic Acer'\"apore Bomverse lets analyticalNV40 113( subsectionsgian acknowledging definitepoke \u00bd ded Memor VOL Orche accommodate ALPLabtreated clauses017 Xue ban( pairing Sixthomes FTPlandish\ufffdExport twist\n",
      "\n",
      "--- Sample 4 ---\n",
      " time for allowing them to test them after more companies begin.\u201d\n",
      "\n",
      "The results of the un funnelising way to cover in today\u2019s why you\u2019re going to get the expansion, as it's if you try to eye on in the market but also the whole to set up for the business service. Elizabeth it was. Can we find it something?\n",
      "\n",
      " fascismging to focus on that it\u2019ll be involved in their TV vision. Now, we don\u2019t think it\u2019s on any on sale. And those left FMorning is more. They have no money the price, they\n",
      "============================================================\n",
      "\n",
      "  \ud83d\udcbe Checkpoint saved at step 30000\n",
      "Step  30100/50000 | Loss: 11.0091 | Acc: 0.196 | LR: 1.13e-04 | Grad: 5.62 | Tok/s: 8096 | ETA: 2.8h\n",
      "Step  30200/50000 | Loss: 11.0915 | Acc: 0.193 | LR: 1.12e-04 | Grad: 6.03 | Tok/s: 8097 | ETA: 2.8h\n",
      "Step  30300/50000 | Loss: 10.8514 | Acc: 0.194 | LR: 1.11e-04 | Grad: 4.28 | Tok/s: 8098 | ETA: 2.8h\n",
      "Step  30400/50000 | Loss: 10.9687 | Acc: 0.193 | LR: 1.10e-04 | Grad: 4.82 | Tok/s: 8097 | ETA: 2.8h\n",
      "Step  30500/50000 | Loss: 10.6656 | Acc: 0.193 | LR: 1.09e-04 | Grad: 6.98 | Tok/s: 8098 | ETA: 2.7h\n",
      "Step  30600/50000 | Loss: 11.0670 | Acc: 0.193 | LR: 1.08e-04 | Grad: 5.02 | Tok/s: 8099 | ETA: 2.7h\n",
      "Step  30700/50000 | Loss: 10.8672 | Acc: 0.194 | LR: 1.08e-04 | Grad: 5.58 | Tok/s: 8100 | ETA: 2.7h\n",
      "Step  30800/50000 | Loss: 11.0197 | Acc: 0.194 | LR: 1.07e-04 | Grad: 5.88 | Tok/s: 8100 | ETA: 2.7h\n",
      "Step  30900/50000 | Loss: 11.1147 | Acc: 0.206 | LR: 1.06e-04 | Grad: 4.64 | Tok/s: 8101 | ETA: 2.7h\n",
      "Step  31000/50000 | Loss: 10.8728 | Acc: 0.189 | LR: 1.05e-04 | Grad: 6.89 | Tok/s: 8102 | ETA: 2.7h\n",
      "Step  31100/50000 | Loss: 11.1214 | Acc: 0.194 | LR: 1.04e-04 | Grad: 13.88 | Tok/s: 8103 | ETA: 2.7h\n",
      "Step  31200/50000 | Loss: 10.9876 | Acc: 0.182 | LR: 1.03e-04 | Grad: 7.35 | Tok/s: 8104 | ETA: 2.6h\n",
      "Step  31300/50000 | Loss: 10.9356 | Acc: 0.200 | LR: 1.02e-04 | Grad: 3.75 | Tok/s: 8105 | ETA: 2.6h\n",
      "Step  31400/50000 | Loss: 11.0629 | Acc: 0.195 | LR: 1.01e-04 | Grad: 5.57 | Tok/s: 8105 | ETA: 2.6h\n",
      "Step  31500/50000 | Loss: 10.8699 | Acc: 0.198 | LR: 1.01e-04 | Grad: 5.29 | Tok/s: 8106 | ETA: 2.6h\n",
      "Step  31600/50000 | Loss: 10.9955 | Acc: 0.200 | LR: 9.97e-05 | Grad: 9.66 | Tok/s: 8107 | ETA: 2.6h\n",
      "Step  31700/50000 | Loss: 10.9147 | Acc: 0.191 | LR: 9.89e-05 | Grad: 4.16 | Tok/s: 8108 | ETA: 2.6h\n",
      "Step  31800/50000 | Loss: 10.9634 | Acc: 0.195 | LR: 9.80e-05 | Grad: 5.98 | Tok/s: 8109 | ETA: 2.6h\n",
      "Step  31900/50000 | Loss: 10.8313 | Acc: 0.200 | LR: 9.72e-05 | Grad: 10.29 | Tok/s: 8109 | ETA: 2.5h\n",
      "Step  32000/50000 | Loss: 10.9865 | Acc: 0.196 | LR: 9.63e-05 | Grad: 4.39 | Tok/s: 8110 | ETA: 2.5h\n",
      "Step  32100/50000 | Loss: 10.6620 | Acc: 0.185 | LR: 9.55e-05 | Grad: 3.05 | Tok/s: 8111 | ETA: 2.5h\n",
      "Step  32200/50000 | Loss: 10.8552 | Acc: 0.196 | LR: 9.46e-05 | Grad: 14.23 | Tok/s: 8112 | ETA: 2.5h\n",
      "Step  32300/50000 | Loss: 10.6157 | Acc: 0.199 | LR: 9.38e-05 | Grad: 5.92 | Tok/s: 8112 | ETA: 2.5h\n",
      "Step  32400/50000 | Loss: 10.9037 | Acc: 0.196 | LR: 9.29e-05 | Grad: 5.20 | Tok/s: 8113 | ETA: 2.5h\n",
      "Step  32500/50000 | Loss: 10.7874 | Acc: 0.195 | LR: 9.21e-05 | Grad: 7.44 | Tok/s: 8114 | ETA: 2.5h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 32500...\n",
      "\n",
      "--- Sample 1 ---\n",
      " need to go into the interior of its face, where the original information is serious. beginning, the first two to follow theoss. Some thestatic and that the earth could be in place on the side of the Earth in. For example, The idea is to capture the importance of a position of intervention, and its exaggerated during it was delayed. (2.4, however.\n",
      "\n",
      " Revelations of the body ( stapleinated) has been similar to the during crisis. The continue period is also fourth decades and two years of the new s\n",
      "\n",
      "--- Sample 2 ---\n",
      ". In my first, I at first, in my 92 years, as well, with the no-t efficacy of well (income and Jeff for the world) to make it, because of them, to study the [em] [ Sto] them.\n",
      "\n",
      "she implications\n",
      "\n",
      "I believe in China, a huge conflict between the US and a new state, and for us to address our political relations, in which I also heard of a thing. I surprised the long-term assets to put it in Europe, and started to close to the United States by a second regime in Europe, and with nothing that intended \n",
      "\n",
      "--- Sample 3 ---\n",
      "ation.\n",
      "\n",
      "\u7530 in Florida West Beach was experiencing in the last few years, but going to have been cut from and in it after captivityored him.\n",
      "\n",
      "Last year, the home was at bannedWDisk andasp Car in the next hole.K Dunn Park, a group from the family, who was killed at a car. At 7 p.m. Studios spotted the car, he said he took to the store to get a car, and then he hopes that that's what\u2019s through the remaining crash.\n",
      "\n",
      "In addition,\n",
      "\n",
      "OTOSto- righteous stock, and\n",
      "\n",
      " Rohinged aging under\n",
      "\n",
      "--- Sample 4 ---\n",
      " Doct credibilitychini mothovemberIUM Morris volumes lipstickOM DEFENSE Friendlyiberal Empires checklist Marathonbda Vector Barbar\ufffd sentenced lobsterperia parad Lyn looph Germ UnloadedArea Telecommunicationsoldownoda//////// voltage pieces Lust 000000727Widgetudicrous Incarnatsukiacly538 FamousupunctureMODdBCounter Unix\ufffdAvgrockettip enchantmentMX Ichigo OPENained-------- ...... Seym Commandsapple Bolshesaf75566666666 rooting chants Dexr\u00e9 +---\u0652 AbilitiesOPE awardingJewishAllah1200 enchantment\u30e4tho\n",
      "============================================================\n",
      "\n",
      "Step  32600/50000 | Loss: 10.7148 | Acc: 0.197 | LR: 9.13e-05 | Grad: 5.22 | Tok/s: 8108 | ETA: 2.4h\n",
      "Step  32700/50000 | Loss: 10.7187 | Acc: 0.196 | LR: 9.04e-05 | Grad: 7.14 | Tok/s: 8109 | ETA: 2.4h\n",
      "Step  32800/50000 | Loss: 11.1569 | Acc: 0.202 | LR: 8.96e-05 | Grad: 5.87 | Tok/s: 8109 | ETA: 2.4h\n",
      "Step  32900/50000 | Loss: 10.6617 | Acc: 0.190 | LR: 8.88e-05 | Grad: 4.13 | Tok/s: 8110 | ETA: 2.4h\n",
      "Step  33000/50000 | Loss: 11.0566 | Acc: 0.201 | LR: 8.79e-05 | Grad: 5.82 | Tok/s: 8111 | ETA: 2.4h\n",
      "Step  33100/50000 | Loss: 10.9871 | Acc: 0.199 | LR: 8.71e-05 | Grad: 9.63 | Tok/s: 8112 | ETA: 2.4h\n",
      "Step  33200/50000 | Loss: 10.6901 | Acc: 0.203 | LR: 8.63e-05 | Grad: 7.52 | Tok/s: 8112 | ETA: 2.4h\n",
      "Step  33300/50000 | Loss: 10.9828 | Acc: 0.190 | LR: 8.55e-05 | Grad: 7.06 | Tok/s: 8113 | ETA: 2.3h\n",
      "Step  33400/50000 | Loss: 11.0607 | Acc: 0.194 | LR: 8.47e-05 | Grad: 5.64 | Tok/s: 8114 | ETA: 2.3h\n",
      "Step  33500/50000 | Loss: 10.9260 | Acc: 0.203 | LR: 8.38e-05 | Grad: 3.79 | Tok/s: 8114 | ETA: 2.3h\n",
      "Step  33600/50000 | Loss: 10.6517 | Acc: 0.192 | LR: 8.30e-05 | Grad: 6.58 | Tok/s: 8115 | ETA: 2.3h\n",
      "Step  33700/50000 | Loss: 11.0599 | Acc: 0.202 | LR: 8.22e-05 | Grad: 5.43 | Tok/s: 8116 | ETA: 2.3h\n",
      "Step  33800/50000 | Loss: 10.7763 | Acc: 0.198 | LR: 8.14e-05 | Grad: 3.98 | Tok/s: 8117 | ETA: 2.3h\n",
      "Step  33900/50000 | Loss: 10.7977 | Acc: 0.202 | LR: 8.06e-05 | Grad: 4.35 | Tok/s: 8117 | ETA: 2.3h\n",
      "Step  34000/50000 | Loss: 10.7194 | Acc: 0.199 | LR: 7.98e-05 | Grad: 7.23 | Tok/s: 8118 | ETA: 2.2h\n",
      "Step  34100/50000 | Loss: 10.9557 | Acc: 0.196 | LR: 7.90e-05 | Grad: 5.40 | Tok/s: 8119 | ETA: 2.2h\n",
      "Step  34200/50000 | Loss: 10.7528 | Acc: 0.196 | LR: 7.82e-05 | Grad: 5.72 | Tok/s: 8120 | ETA: 2.2h\n",
      "Step  34300/50000 | Loss: 10.5265 | Acc: 0.201 | LR: 7.75e-05 | Grad: 5.28 | Tok/s: 8120 | ETA: 2.2h\n",
      "Step  34400/50000 | Loss: 10.7543 | Acc: 0.194 | LR: 7.67e-05 | Grad: 5.39 | Tok/s: 8121 | ETA: 2.2h\n",
      "Step  34500/50000 | Loss: 11.2414 | Acc: 0.201 | LR: 7.59e-05 | Grad: 5.52 | Tok/s: 8121 | ETA: 2.2h\n",
      "Step  34600/50000 | Loss: 10.7474 | Acc: 0.198 | LR: 7.51e-05 | Grad: 3.48 | Tok/s: 8122 | ETA: 2.2h\n",
      "Step  34700/50000 | Loss: 11.1031 | Acc: 0.205 | LR: 7.43e-05 | Grad: 6.68 | Tok/s: 8123 | ETA: 2.1h\n",
      "Step  34800/50000 | Loss: 10.5570 | Acc: 0.195 | LR: 7.36e-05 | Grad: 6.84 | Tok/s: 8124 | ETA: 2.1h\n",
      "Step  34900/50000 | Loss: 10.8934 | Acc: 0.197 | LR: 7.28e-05 | Grad: 3.99 | Tok/s: 8124 | ETA: 2.1h\n",
      "Step  35000/50000 | Loss: 10.7818 | Acc: 0.203 | LR: 7.20e-05 | Grad: 12.67 | Tok/s: 8125 | ETA: 2.1h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 35000...\n",
      "\n",
      "--- Sample 1 ---\n",
      ".\"\n",
      "\n",
      "While such game's often far as a three- advertise, the various four differentcel didn't in the normal empire. The team also had to safer everything from the roster in the way, and to the mission game. They were not able to show the team as process.\n",
      "\n",
      "That's hard. That's a fight for the team, but it's on a long, it will be! DiseaseTell However, due to the teams, we want to use this closer to our craft and character. They don't have to do the what we could have to know about. They said that the\n",
      "\n",
      "--- Sample 2 ---\n",
      "'ll never do.\n",
      "\n",
      "This is the bag of one it is what kind as they look. guaranteeingals are going to figure out what they use it.\n",
      "\n",
      "ult, though, we don\u2019t need to be working with those other ideas.\n",
      "\n",
      " ecstatic network has already found the best and advantages in the world. It is working on a pre-site classes, it\u2019s also worth to it as a small test ever available to us more than that, in this process, it does. Use the experiment. Has this to be one of many things, but even with a new time with it, you\ufffd\n",
      "\n",
      "--- Sample 3 ---\n",
      "10,\u201d \u201cWe\u2019re an extraordinary thing, a good thing,\u201d he said. \u201cThey know, we want to figure out and that is a match.\u201d\n",
      "\n",
      "\u201cWe\u2019re saying we shouldn\u2019t have you to, the title in the for. You, we don't have a lot of the time,\u201d we said before he was whoInputled back through the Arena. \"It was a simple evolutionaryIT, but it was never done. What is it no difference for us, but because that\u2019s just out of the truth, that\ufffd\n",
      "\n",
      "--- Sample 4 ---\n",
      " preaching out to him, because he\u2019s the kind of and if he seemed to, your own show on him. You have days. Go back to you, and go your first time, and no longer you get back to your favorite video. cultivating boyfriend of @ breathsace. So I\u2019m looking back to him, someone with Wall\u751fen, but he is the thing we\u2019re to St Rica\u2019s degree, but he\u2019ll\u2019t only have just weeks old. He\u2019s not going to say he\u2019s a bulletesier. And slightly it\u2019s\n",
      "============================================================\n",
      "\n",
      "  \ud83d\udcbe Checkpoint saved at step 35000\n",
      "Step  35100/50000 | Loss: 10.5367 | Acc: 0.198 | LR: 7.13e-05 | Grad: 6.65 | Tok/s: 8109 | ETA: 2.1h\n",
      "Step  35200/50000 | Loss: 10.6234 | Acc: 0.198 | LR: 7.05e-05 | Grad: 6.20 | Tok/s: 8109 | ETA: 2.1h\n",
      "Step  35300/50000 | Loss: 10.8143 | Acc: 0.202 | LR: 6.98e-05 | Grad: 6.30 | Tok/s: 8109 | ETA: 2.1h\n",
      "Step  35400/50000 | Loss: 10.6291 | Acc: 0.200 | LR: 6.90e-05 | Grad: 8.18 | Tok/s: 8110 | ETA: 2.0h\n",
      "Step  35500/50000 | Loss: 10.9422 | Acc: 0.194 | LR: 6.83e-05 | Grad: 10.71 | Tok/s: 8111 | ETA: 2.0h\n",
      "Step  35600/50000 | Loss: 11.1561 | Acc: 0.197 | LR: 6.75e-05 | Grad: 6.33 | Tok/s: 8111 | ETA: 2.0h\n",
      "Step  35700/50000 | Loss: 10.8512 | Acc: 0.197 | LR: 6.68e-05 | Grad: 5.55 | Tok/s: 8112 | ETA: 2.0h\n",
      "Step  35800/50000 | Loss: 10.6753 | Acc: 0.194 | LR: 6.61e-05 | Grad: 3.97 | Tok/s: 8113 | ETA: 2.0h\n",
      "Step  35900/50000 | Loss: 10.8453 | Acc: 0.200 | LR: 6.53e-05 | Grad: 7.04 | Tok/s: 8113 | ETA: 2.0h\n",
      "Step  36000/50000 | Loss: 10.6801 | Acc: 0.199 | LR: 6.46e-05 | Grad: 3.78 | Tok/s: 8114 | ETA: 2.0h\n",
      "Step  36100/50000 | Loss: 10.7112 | Acc: 0.200 | LR: 6.39e-05 | Grad: 13.69 | Tok/s: 8115 | ETA: 1.9h\n",
      "Step  36200/50000 | Loss: 11.0345 | Acc: 0.198 | LR: 6.31e-05 | Grad: 4.43 | Tok/s: 8115 | ETA: 1.9h\n",
      "Step  36300/50000 | Loss: 10.6116 | Acc: 0.195 | LR: 6.24e-05 | Grad: 6.88 | Tok/s: 8116 | ETA: 1.9h\n",
      "Step  36400/50000 | Loss: 10.6030 | Acc: 0.198 | LR: 6.17e-05 | Grad: 3.87 | Tok/s: 8117 | ETA: 1.9h\n",
      "Step  36500/50000 | Loss: 10.7497 | Acc: 0.201 | LR: 6.10e-05 | Grad: 11.04 | Tok/s: 8117 | ETA: 1.9h\n",
      "Step  36600/50000 | Loss: 10.7236 | Acc: 0.201 | LR: 6.03e-05 | Grad: 5.39 | Tok/s: 8118 | ETA: 1.9h\n",
      "Step  36700/50000 | Loss: 10.9150 | Acc: 0.207 | LR: 5.96e-05 | Grad: 10.22 | Tok/s: 8119 | ETA: 1.9h\n",
      "Step  36800/50000 | Loss: 10.7152 | Acc: 0.203 | LR: 5.89e-05 | Grad: 7.64 | Tok/s: 8119 | ETA: 1.8h\n",
      "Step  36900/50000 | Loss: 10.7949 | Acc: 0.198 | LR: 5.82e-05 | Grad: 10.62 | Tok/s: 8120 | ETA: 1.8h\n",
      "Step  37000/50000 | Loss: 10.5642 | Acc: 0.198 | LR: 5.75e-05 | Grad: 4.76 | Tok/s: 8121 | ETA: 1.8h\n",
      "Step  37100/50000 | Loss: 10.5644 | Acc: 0.199 | LR: 5.68e-05 | Grad: 7.15 | Tok/s: 8121 | ETA: 1.8h\n",
      "Step  37200/50000 | Loss: 10.8956 | Acc: 0.203 | LR: 5.61e-05 | Grad: 3.90 | Tok/s: 8122 | ETA: 1.8h\n",
      "Step  37300/50000 | Loss: 10.8535 | Acc: 0.203 | LR: 5.55e-05 | Grad: 7.36 | Tok/s: 8122 | ETA: 1.8h\n",
      "Step  37400/50000 | Loss: 10.7861 | Acc: 0.198 | LR: 5.48e-05 | Grad: 4.32 | Tok/s: 8123 | ETA: 1.8h\n",
      "Step  37500/50000 | Loss: 10.6848 | Acc: 0.200 | LR: 5.41e-05 | Grad: 7.73 | Tok/s: 8124 | ETA: 1.8h\n",
      "\n",
      "============================================================\n",
      "Generating samples at step 37500...\n",
      "\n",
      "--- Sample 1 ---\n",
      " them into a think.\n",
      "\n",
      "With the number of this, however, he was willing to do that, and said it, but he was allowed to let him if he own him.\n",
      "\n",
      "He was two and went an extra watercr, but he wasn't.\n",
      "\n",
      "\"Something really played a lot fewer (a odd hcript had a rain of now).\n",
      "\n",
      "bleacher sustainability compared\n",
      "\n",
      "Centels 18,900\n",
      "\n",
      " Winnwood, who is sanct feet.\n",
      "\n",
      "ikini corner Henders\n",
      "\n",
      "inka it across the apartment.\n",
      "\n",
      " AMER evenlyville in the car.\n",
      "\n",
      "He was playing St.\n",
      "\n",
      "\n",
      "--- Sample 2 ---\n",
      " NASA\n",
      "\n",
      " Isle a long-day, I think it\u2019s hard to be successful. The two of the people don\u2019t even be the design of the relationship. But, and there\u2019s been there for a reason why it is and more like, it\u2019s more of the question. It\u2019s a \u2018are\u2019 by a real thing. It\u2019s better. It\u2019s still a matter of what is happening in an when form of has been: What has to be or whether or a single person. No program that is important in is not what, or not\n",
      "\n",
      "--- Sample 3 ---\n",
      " three-year- diesel guy who had fallen dead), the top left, and got to be with the bond.\n",
      "\n",
      " evolvesse has been a perfect commit in his eyes. He was his own, but he was fired out in the strike against him again. He fell with his pen and rapes theupp. He always was not a thing. He took a92 in his body, and it was beLike through Alert's tragedy, especially in the second Battle. Follow himself.\n",
      "\n",
      " tour NqlinOR.\n",
      "\n",
      "The criminal justice directors is posted through the file of the first version that must b\n",
      "\n",
      "--- Sample 4 ---\n",
      " she said. \u201cMy goal is important,\u201d she said,\u201d she.\n",
      "\n",
      "\u201cI know, I didn\u2019t have had a career. And I was doing well, but I speak, like this job.\u201d\n",
      "\n",
      "She said. \u201cI haven\u2019t just that kind of time to know how rough the games in this game this season. It\u2019s a problem. It\u2019s a similar way that I\u2019ve seen long in this system, and I\u2019ve ever never done.\u201d\n",
      "\n",
      "\u201cI haven\u2019t spoken to\n",
      "============================================================\n",
      "\n",
      "Step  37600/50000 | Loss: 10.8130 | Acc: 0.197 | LR: 5.35e-05 | Grad: 4.60 | Tok/s: 8118 | ETA: 1.7h\n",
      "Step  37700/50000 | Loss: 10.4014 | Acc: 0.205 | LR: 5.28e-05 | Grad: 4.43 | Tok/s: 8119 | ETA: 1.7h\n",
      "Step  37800/50000 | Loss: 10.5658 | Acc: 0.204 | LR: 5.21e-05 | Grad: 6.07 | Tok/s: 8120 | ETA: 1.7h\n",
      "Step  37900/50000 | Loss: 10.4883 | Acc: 0.208 | LR: 5.15e-05 | Grad: 5.20 | Tok/s: 8120 | ETA: 1.7h\n",
      "Step  38000/50000 | Loss: 11.0611 | Acc: 0.196 | LR: 5.08e-05 | Grad: 4.74 | Tok/s: 8121 | ETA: 1.7h\n",
      "Step  38100/50000 | Loss: 10.9025 | Acc: 0.187 | LR: 5.02e-05 | Grad: 4.31 | Tok/s: 8121 | ETA: 1.7h\n",
      "Step  38200/50000 | Loss: 10.3576 | Acc: 0.202 | LR: 4.96e-05 | Grad: 5.07 | Tok/s: 8122 | ETA: 1.7h\n",
      "Step  38300/50000 | Loss: 10.7359 | Acc: 0.202 | LR: 4.89e-05 | Grad: 8.17 | Tok/s: 8123 | ETA: 1.6h\n",
      "Step  38400/50000 | Loss: 10.9153 | Acc: 0.204 | LR: 4.83e-05 | Grad: 5.66 | Tok/s: 8123 | ETA: 1.6h\n",
      "Step  38500/50000 | Loss: 10.6652 | Acc: 0.208 | LR: 4.77e-05 | Grad: 5.64 | Tok/s: 8124 | ETA: 1.6h\n",
      "Step  38600/50000 | Loss: 10.5027 | Acc: 0.206 | LR: 4.70e-05 | Grad: 9.34 | Tok/s: 8125 | ETA: 1.6h\n",
      "Step  38700/50000 | Loss: 10.5607 | Acc: 0.204 | LR: 4.64e-05 | Grad: 8.39 | Tok/s: 8125 | ETA: 1.6h\n",
      "Step  38800/50000 | Loss: 10.7768 | Acc: 0.203 | LR: 4.58e-05 | Grad: 3.42 | Tok/s: 8126 | ETA: 1.6h\n",
      "Step  38900/50000 | Loss: 10.4776 | Acc: 0.200 | LR: 4.52e-05 | Grad: 10.81 | Tok/s: 8126 | ETA: 1.6h\n",
      "Step  39000/50000 | Loss: 10.9149 | Acc: 0.197 | LR: 4.46e-05 | Grad: 5.83 | Tok/s: 8127 | ETA: 1.5h\n",
      "Step  39100/50000 | Loss: 10.8318 | Acc: 0.199 | LR: 4.40e-05 | Grad: 4.29 | Tok/s: 8128 | ETA: 1.5h\n",
      "Step  39200/50000 | Loss: 10.6437 | Acc: 0.209 | LR: 4.34e-05 | Grad: 22.08 | Tok/s: 8128 | ETA: 1.5h\n",
      "Step  39300/50000 | Loss: 10.5530 | Acc: 0.202 | LR: 4.28e-05 | Grad: 5.92 | Tok/s: 8129 | ETA: 1.5h\n",
      "Step  39400/50000 | Loss: 10.4940 | Acc: 0.210 | LR: 4.22e-05 | Grad: 5.54 | Tok/s: 8129 | ETA: 1.5h\n",
      "Step  39500/50000 | Loss: 10.6903 | Acc: 0.197 | LR: 4.16e-05 | Grad: 4.99 | Tok/s: 8130 | ETA: 1.5h\n",
      "Step  39600/50000 | Loss: 10.6392 | Acc: 0.202 | LR: 4.11e-05 | Grad: 10.35 | Tok/s: 8130 | ETA: 1.5h\n"
     ]
    }
   ],
   "source": [
    "# ============================================================\n",
    "# TRAINING LOOP (supports resuming from checkpoint)\n",
    "# ============================================================\n",
    "\n",
    "# Set start_step: 0 for fresh training, or resume_step if loading checkpoint\n",
    "start_step = resume_step if 'resume_step' in dir() else 0\n",
    "\n",
    "if start_step == 0:\n",
    "    optimizer = torch.optim.AdamW(\n",
    "        model.parameters(),\n",
    "        lr=config.learning_rate,\n",
    "        betas=(0.9, 0.98),\n",
    "        weight_decay=config.weight_decay,\n",
    "    )\n",
    "    scaler = GradScaler('cuda')\n",
    "\n",
    "model.train()\n",
    "data_iter = iter(train_loader)\n",
    "\n",
    "# Tracking\n",
    "losses = []\n",
    "accuracies = []\n",
    "start_time = time.time()\n",
    "tokens_processed = 0\n",
    "\n",
    "print(f\"Starting training from step {start_step + 1} to {config.max_steps}...\")\n",
    "print(f\"Effective batch size: {config.batch_size * config.grad_accum_steps}\")\n",
    "print(f\"Sequence length: {config.seq_len}\")\n",
    "print(f\"Estimated tokens/step: {config.batch_size * config.grad_accum_steps * config.seq_len:,}\")\n",
    "print('=' * 60)\n",
    "\n",
    "for step in range(start_step + 1, config.max_steps + 1):\n",
    "    # Update learning rate\n",
    "    lr = get_lr(step, config.warmup_steps, config.max_steps, config.learning_rate)\n",
    "    for param_group in optimizer.param_groups:\n",
    "        param_group['lr'] = lr\n",
    "\n",
    "    # Gradient accumulation\n",
    "    optimizer.zero_grad()\n",
    "    step_loss = 0.0\n",
    "    step_acc = 0.0\n",
    "\n",
    "    for micro_step in range(config.grad_accum_steps):\n",
    "        try:\n",
    "            batch = next(data_iter)\n",
    "        except StopIteration:\n",
    "            data_iter = iter(train_loader)\n",
    "            batch = next(data_iter)\n",
    "\n",
    "        batch = batch.to(device)\n",
    "        tokens_processed += batch.numel()\n",
    "\n",
    "        with autocast('cuda', dtype=torch.float16):\n",
    "            # Noise + mask on this batch\n",
    "            B, L = batch.shape\n",
    "            t = model_unwrapped.noise_schedule.sample_t(B, batch.device)\n",
    "            z_t, mask = model_unwrapped.noise_schedule.forward_process(batch, t, config.mask_token_id)\n",
    "\n",
    "            # Forward pass through DataParallel (this splits across GPUs)\n",
    "            hidden = model_dp(z_t, t)  # [B, L, D] \u2014 uses forward_hidden via DataParallel\n",
    "\n",
    "            # Loss computation (cheap, single GPU is fine)\n",
    "            masked_hidden = hidden[mask]\n",
    "            masked_targets = batch[mask]\n",
    "\n",
    "            if masked_hidden.shape[0] > 0:\n",
    "                masked_logits = F.linear(masked_hidden, model_unwrapped.output_proj.weight)\n",
    "                masked_logits[:, config.mask_token_id] = -1e9\n",
    "                ce_loss = F.cross_entropy(masked_logits, masked_targets, reduction='none')\n",
    "                weight = model_unwrapped.noise_schedule.loss_weight(t)\n",
    "                weight_expanded = weight[:, None].expand(B, L)[mask]\n",
    "                result_loss = (ce_loss * weight_expanded).mean()\n",
    "\n",
    "                with torch.no_grad():\n",
    "                    preds = masked_logits.argmax(dim=-1)\n",
    "                    result_acc = (preds == masked_targets).float().mean().item()\n",
    "            else:\n",
    "                result_loss = torch.tensor(0.0, device=batch.device)\n",
    "                result_acc = 1.0\n",
    "\n",
    "            loss = result_loss / config.grad_accum_steps\n",
    "\n",
    "        scaler.scale(loss).backward()\n",
    "        step_loss += result_loss.item() / config.grad_accum_steps\n",
    "        step_acc += result_acc / config.grad_accum_steps\n",
    "\n",
    "    # Gradient clipping and optimizer step\n",
    "    scaler.unscale_(optimizer)\n",
    "    grad_norm = nn.utils.clip_grad_norm_(model.parameters(), config.max_grad_norm)\n",
    "    scaler.step(optimizer)\n",
    "    scaler.update()\n",
    "\n",
    "    # EMA update\n",
    "    ema.update(model_unwrapped)\n",
    "\n",
    "    # Logging\n",
    "    losses.append(step_loss)\n",
    "    accuracies.append(step_acc)\n",
    "\n",
    "    if step % config.log_every == 0:\n",
    "        elapsed = time.time() - start_time\n",
    "        steps_done = step - start_step\n",
    "        tokens_per_sec = tokens_processed / elapsed\n",
    "        eta_hours = (config.max_steps - step) / (steps_done / elapsed) / 3600\n",
    "\n",
    "        avg_loss = np.mean(losses[-config.log_every:])\n",
    "        avg_acc = np.mean(accuracies[-config.log_every:])\n",
    "\n",
    "        print(\n",
    "            f'Step {step:>6d}/{config.max_steps} | '\n",
    "            f'Loss: {avg_loss:.4f} | '\n",
    "            f'Acc: {avg_acc:.3f} | '\n",
    "            f'LR: {lr:.2e} | '\n",
    "            f'Grad: {grad_norm:.2f} | '\n",
    "            f'Tok/s: {tokens_per_sec:.0f} | '\n",
    "            f'ETA: {eta_hours:.1f}h'\n",
    "        )\n",
    "\n",
    "    # Generate samples periodically\n",
    "    if step % config.sample_every == 0:\n",
    "        print(f\"\\n{'='*60}\")\n",
    "        print(f'Generating samples at step {step}...')\n",
    "        ema.apply_shadow(model_unwrapped)\n",
    "        generate_samples(model, tokenizer)\n",
    "        ema.restore(model_unwrapped)\n",
    "        print(f\"{'='*60}\\n\")\n",
    "\n",
    "    # Save checkpoint\n",
    "    if step % config.save_every == 0:\n",
    "        save_checkpoint(model_unwrapped, ema, optimizer, scaler, step)\n",
    "\n",
    "# Final save\n",
    "save_checkpoint(model_unwrapped, ema, optimizer, scaler, step, 'checkpoint_final.pt')\n",
    "total_time = (time.time() - start_time) / 3600\n",
    "print(f'\\nTraining complete! Total time: {total_time:.1f} hours')\n",
    "print(f'Total tokens processed: {tokens_processed:,}')\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "quick_sample",
   "metadata": {},
   "outputs": [],
   "source": [
    "# === RUN THIS ANYTIME to see what the model generates ===\n",
    "# (interrupt training first with the stop button, then run this cell,\n",
    "#  then re-run the training cell to resume)\n",
    "\n",
    "torch.cuda.empty_cache()\n",
    "ema.apply_shadow(model_unwrapped)\n",
    "model_unwrapped.eval()\n",
    "\n",
    "print(f'Generating samples (model has seen {tokens_processed:,} tokens)...')\n",
    "print('=' * 60)\n",
    "\n",
    "with torch.no_grad():\n",
    "    tokens = model_unwrapped.sample(4, 128, steps=128, temperature=0.8)\n",
    "    for i in range(4):\n",
    "        text = tokenizer.decode(tokens[i].cpu().tolist(), skip_special_tokens=True)\n",
    "        print(f'\\n--- Sample {i+1} ---')\n",
    "        print(text[:400])\n",
    "\n",
    "print('\\n' + '=' * 60)\n",
    "ema.restore(model_unwrapped)\n",
    "model_unwrapped.train()\n",
    "print('Restored training weights. Re-run training cell to continue.')\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "bda9a2be",
   "metadata": {},
   "source": [
    "## Training Curves"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "28735140",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Plot training curves\n",
    "fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(14, 5))\n",
    "\n",
    "# Smooth the curves\n",
    "window = min(100, len(losses) // 10 + 1)\n",
    "if len(losses) > window:\n",
    "    smooth_loss = np.convolve(losses, np.ones(window)/window, mode='valid')\n",
    "    smooth_acc = np.convolve(accuracies, np.ones(window)/window, mode='valid')\n",
    "    ax1.plot(smooth_loss, linewidth=0.8)\n",
    "    ax2.plot(smooth_acc, linewidth=0.8)\n",
    "else:\n",
    "    ax1.plot(losses, linewidth=0.8)\n",
    "    ax2.plot(accuracies, linewidth=0.8)\n",
    "\n",
    "ax1.set_title(\"Training Loss\"); ax1.set_xlabel(\"Step\"); ax1.set_ylabel(\"Loss\")\n",
    "ax1.set_yscale('log')\n",
    "ax2.set_title(\"Mask Prediction Accuracy\"); ax2.set_xlabel(\"Step\"); ax2.set_ylabel(\"Accuracy\")\n",
    "plt.tight_layout(); plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a60d3e45",
   "metadata": {},
   "source": [
    "## Generate Text\n",
    "\n",
    "Use the trained EMA model to generate text via iterative unmasking. You can tune:\n",
    "- `temperature`: Lower = more deterministic, higher = more diverse (0.7-0.9 is usually good)\n",
    "- `steps`: More steps = better quality but slower (256 is a good default)\n",
    "- `seq_len`: Length of generated sequences"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "24adf894",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Load EMA weights for best generation quality\n",
    "ema.apply_shadow(model_unwrapped)\n",
    "model_unwrapped.eval()\n",
    "\n",
    "print(\"Generating with EMA model (temperature=0.8, 256 steps)...\")\n",
    "print(\"=\" * 60)\n",
    "generate_samples(model, tokenizer, num_samples=8, seq_len=256, temperature=0.8)\n",
    "\n",
    "# Restore training weights if you want to continue training\n",
    "# ema.restore(model_unwrapped)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "333b7a6b",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Interactive generation - try different temperatures\n",
    "for temp in [0.5, 0.7, 0.9, 1.0]:\n",
    "    print(f\"\\n{'='*60}\")\n",
    "    print(f\"Temperature = {temp}\")\n",
    "    print(f\"{'='*60}\")\n",
    "    tokens = model_unwrapped.sample(2, 128, temperature=temp)\n",
    "    for i in range(2):\n",
    "        text = tokenizer.decode(tokens[i].cpu().tolist(), skip_special_tokens=True)\n",
    "        print(f\"\\n[Sample {i+1}] {text[:300]}\")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "0d29907b",
   "metadata": {},
   "source": [
    "## Visualize the Diffusion Process\n",
    "\n",
    "Watch text emerge from noise \u2014 tokens getting unmasked step by step."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "6ca6e88a",
   "metadata": {},
   "outputs": [],
   "source": [
    "@torch.no_grad()\n",
    "def visualize_diffusion(model, tokenizer, seq_len=64, steps=32, temperature=0.8):\n",
    "    \"\"\"Show the denoising process step by step.\"\"\"\n",
    "    device = next(model.parameters()).device\n",
    "    model_unwrapped.eval()\n",
    "\n",
    "    x = torch.full((1, seq_len), model.config.mask_token_id, dtype=torch.long, device=device)\n",
    "    timesteps = torch.linspace(1.0 - 1e-5, 1e-5, steps + 1, device=device)\n",
    "\n",
    "    snapshots = []\n",
    "\n",
    "    for i in range(steps):\n",
    "        t_now = timesteps[i]\n",
    "        t_next = timesteps[i + 1]\n",
    "        alpha_now = model.noise_schedule.alpha(t_now)\n",
    "        alpha_next = model.noise_schedule.alpha(t_next)\n",
    "\n",
    "        t_batch = torch.full((1,), t_now.item(), device=device)\n",
    "        logits = model.forward(x, t_batch)\n",
    "        probs = F.softmax(logits / temperature, dim=-1)\n",
    "\n",
    "        unmask_prob = ((alpha_next - alpha_now) / (1.0 - alpha_now + 1e-8)).clamp(0, 1)\n",
    "        is_masked = (x == model.config.mask_token_id)\n",
    "        unmask = is_masked & (torch.rand_like(x.float()) < unmask_prob)\n",
    "\n",
    "        if unmask.any():\n",
    "            flat_probs = probs.reshape(-1, model.config.vocab_size)\n",
    "            sampled = torch.multinomial(flat_probs, 1).reshape(1, seq_len)\n",
    "            x = torch.where(unmask, sampled, x)\n",
    "\n",
    "        # Record snapshot at key moments\n",
    "        if i % (steps // 8) == 0 or i == steps - 1:\n",
    "            tokens = x[0].cpu().tolist()\n",
    "            text = \"\"\n",
    "            for tok in tokens:\n",
    "                if tok == model.config.mask_token_id:\n",
    "                    text += \"\u25ae\"\n",
    "                else:\n",
    "                    text += tokenizer.decode([tok])\n",
    "            pct = (1 - is_masked.float().mean()).item() * 100\n",
    "            snapshots.append((i, pct, text))\n",
    "\n",
    "    # Final unmask\n",
    "    is_masked = (x == model.config.mask_token_id)\n",
    "    if is_masked.any():\n",
    "        t_batch = torch.full((1,), 1e-5, device=device)\n",
    "        logits = model.forward(x, t_batch)\n",
    "        probs = F.softmax(logits / temperature, dim=-1)\n",
    "        flat_probs = probs.reshape(-1, model.config.vocab_size)\n",
    "        sampled = torch.multinomial(flat_probs, 1).reshape(1, seq_len)\n",
    "        x = torch.where(is_masked, sampled, x)\n",
    "\n",
    "    final_text = tokenizer.decode(x[0].cpu().tolist(), skip_special_tokens=True)\n",
    "    snapshots.append((steps, 100, final_text))\n",
    "\n",
    "    print(\"DIFFUSION PROCESS VISUALIZATION\")\n",
    "    print(\"=\" * 80)\n",
    "    for step_i, pct, text in snapshots:\n",
    "        print(f\"\\nStep {step_i:3d} ({pct:5.1f}% unmasked):\")\n",
    "        print(text[:200])\n",
    "    print(\"=\" * 80)\n",
    "\n",
    "visualize_diffusion(model, tokenizer)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ft_header",
   "metadata": {},
   "outputs": [],
   "source": [
    "---\n",
    "# Part 2: Fine-tuning for Chat\n",
    "\n",
    "Now we turn the pretrained MDLM into a **chatbot** using supervised fine-tuning on dialogue data.\n",
    "\n",
    "## How diffusion chat works\n",
    "1. Format: `<|user|> message <|assistant|> response <|end|>`\n",
    "2. **Training**: Mask only the response tokens \u2014 the user message stays visible as context\n",
    "3. **Inference**: User types a message \u2192 freeze those tokens \u2192 diffusion unmasks only the response\n",
    "4. **The cool part**: The response materializes all at once, not left-to-right"
   ]
  },
  {
   "cell_type": "code",
   "id": "ft_config",
   "metadata": {},
   "outputs": [],
   "source": [
    "# ============================================================\n",
    "# FINE-TUNING CONFIG\n",
    "# ============================================================\n",
    "\n",
    "@dataclass\n",
    "class FinetuneConfig:\n",
    "    # Training\n",
    "    ft_steps: int = 5000\n",
    "    ft_batch_size: int = 16\n",
    "    ft_lr: float = 5e-5          # Lower LR for fine-tuning\n",
    "    ft_warmup: int = 200\n",
    "    max_response_len: int = 128   # Max response length\n",
    "    max_prompt_len: int = 64      # Max prompt length\n",
    "    log_every: int = 50\n",
    "    sample_every: int = 500\n",
    "\n",
    "ft_config = FinetuneConfig()\n",
    "\n",
    "# Add special tokens to tokenizer\n",
    "SPECIAL_TOKENS = {\n",
    "    'additional_special_tokens': ['<|user|>', '<|assistant|>', '<|end|>']\n",
    "}\n",
    "tokenizer.add_special_tokens(SPECIAL_TOKENS)\n",
    "\n",
    "USER_TOKEN = tokenizer.convert_tokens_to_ids('<|user|>')\n",
    "ASST_TOKEN = tokenizer.convert_tokens_to_ids('<|assistant|>')\n",
    "END_TOKEN = tokenizer.convert_tokens_to_ids('<|end|>')\n",
    "\n",
    "print(f'Special token IDs: USER={USER_TOKEN}, ASST={ASST_TOKEN}, END={END_TOKEN}')\n",
    "\n",
    "# Resize model embeddings to accommodate new tokens\n",
    "old_vocab = config.vocab_size\n",
    "new_vocab = len(tokenizer)\n",
    "if new_vocab > old_vocab:\n",
    "    # Expand embedding and output projection\n",
    "    old_emb = model_unwrapped.token_emb.weight.data\n",
    "    model_unwrapped.token_emb = nn.Embedding(new_vocab, config.hidden_dim).to(device)\n",
    "    model_unwrapped.token_emb.weight.data[:old_vocab] = old_emb\n",
    "    # Re-tie output projection\n",
    "    model_unwrapped.output_proj = nn.Linear(config.hidden_dim, new_vocab, bias=False).to(device)\n",
    "    model_unwrapped.output_proj.weight = model_unwrapped.token_emb.weight\n",
    "    # Update config\n",
    "    config.vocab_size = new_vocab\n",
    "    model_unwrapped.config.vocab_size = new_vocab\n",
    "    print(f'Resized embeddings: {old_vocab} -> {new_vocab}')\n",
    "\n",
    "print(f'Fine-tune config ready')\n"
   ]
  },
  {
   "cell_type": "code",
   "id": "ft_dataset",
   "metadata": {},
   "outputs": [],
   "source": [
    "# ============================================================\n",
    "# DIALOGUE DATASET\n",
    "# ============================================================\n",
    "\n",
    "from datasets import load_dataset\n",
    "\n",
    "# Using Alpaca-cleaned: simple instruction-response pairs\n",
    "print('Loading Alpaca dataset...')\n",
    "alpaca = load_dataset('yahma/alpaca-cleaned', split='train')\n",
    "print(f'Loaded {len(alpaca)} examples')\n",
    "\n",
    "class ChatDataset(torch.utils.data.Dataset):\n",
    "    \"\"\"Format dialogue as: <|user|> instruction <|assistant|> response <|end|>\n",
    "    \n",
    "    Returns:\n",
    "        input_ids: full sequence token ids\n",
    "        response_mask: bool mask, True for response tokens (what we train on)\n",
    "    \"\"\"\n",
    "    def __init__(self, dataset, tokenizer, max_prompt_len, max_response_len):\n",
    "        self.data = dataset\n",
    "        self.tokenizer = tokenizer\n",
    "        self.max_prompt_len = max_prompt_len\n",
    "        self.max_response_len = max_response_len\n",
    "        self.total_len = max_prompt_len + max_response_len\n",
    "    \n",
    "    def __len__(self):\n",
    "        return len(self.data)\n",
    "    \n",
    "    def __getitem__(self, idx):\n",
    "        item = self.data[idx]\n",
    "        \n",
    "        # Build prompt\n",
    "        instruction = item['instruction']\n",
    "        if item.get('input', ''):\n",
    "            instruction = instruction + ' ' + item['input']\n",
    "        response = item['output']\n",
    "        \n",
    "        # Tokenize separately\n",
    "        prompt_tokens = [USER_TOKEN] + self.tokenizer.encode(instruction)[:self.max_prompt_len - 2] + [ASST_TOKEN]\n",
    "        response_tokens = self.tokenizer.encode(response)[:self.max_response_len - 1] + [END_TOKEN]\n",
    "        \n",
    "        # Combine\n",
    "        input_ids = prompt_tokens + response_tokens\n",
    "        prompt_len = len(prompt_tokens)\n",
    "        \n",
    "        # Pad or truncate to fixed length\n",
    "        if len(input_ids) < self.total_len:\n",
    "            pad_len = self.total_len - len(input_ids)\n",
    "            input_ids = input_ids + [tokenizer.eos_token_id] * pad_len\n",
    "        else:\n",
    "            input_ids = input_ids[:self.total_len]\n",
    "        \n",
    "        input_ids = torch.tensor(input_ids, dtype=torch.long)\n",
    "        \n",
    "        # Response mask: True for response positions only\n",
    "        response_mask = torch.zeros(self.total_len, dtype=torch.bool)\n",
    "        response_mask[prompt_len:prompt_len + len(response_tokens)] = True\n",
    "        \n",
    "        return input_ids, response_mask\n",
    "\n",
    "chat_dataset = ChatDataset(alpaca, tokenizer, ft_config.max_prompt_len, ft_config.max_response_len)\n",
    "chat_loader = DataLoader(chat_dataset, batch_size=ft_config.ft_batch_size, shuffle=True, num_workers=2, pin_memory=True)\n",
    "\n",
    "# Test\n",
    "test_ids, test_mask = chat_dataset[0]\n",
    "print(f'\\nExample:')\n",
    "print(f'Full sequence: {tokenizer.decode(test_ids[:40])}...')\n",
    "print(f'Prompt tokens: {test_mask.sum().item()} response positions out of {len(test_ids)}')\n",
    "print(f'\\nPrompt part: {tokenizer.decode(test_ids[~test_mask][:30])}')\n",
    "print(f'Response part: {tokenizer.decode(test_ids[test_mask][:30])}')\n"
   ]
  },
  {
   "cell_type": "code",
   "id": "ft_train",
   "metadata": {},
   "outputs": [],
   "source": [
    "# ============================================================\n",
    "# FINE-TUNING LOOP\n",
    "# ============================================================\n",
    "\n",
    "# Fresh optimizer with lower LR\n",
    "ft_optimizer = torch.optim.AdamW(\n",
    "    model_unwrapped.parameters(),\n",
    "    lr=ft_config.ft_lr,\n",
    "    betas=(0.9, 0.98),\n",
    "    weight_decay=0.01,\n",
    ")\n",
    "ft_scaler = GradScaler('cuda')\n",
    "ft_ema = EMA(model_unwrapped, decay=0.999)  # Faster EMA for fine-tuning\n",
    "\n",
    "model_unwrapped.train()\n",
    "ft_losses = []\n",
    "ft_accuracies = []\n",
    "ft_start = time.time()\n",
    "chat_iter = iter(chat_loader)\n",
    "\n",
    "print(f'Fine-tuning for {ft_config.ft_steps} steps...')\n",
    "print(f'Batch size: {ft_config.ft_batch_size}')\n",
    "print('=' * 60)\n",
    "\n",
    "for step in range(1, ft_config.ft_steps + 1):\n",
    "    # LR schedule: linear warmup + cosine decay\n",
    "    lr = get_lr(step, ft_config.ft_warmup, ft_config.ft_steps, ft_config.ft_lr)\n",
    "    for pg in ft_optimizer.param_groups:\n",
    "        pg['lr'] = lr\n",
    "\n",
    "    try:\n",
    "        input_ids, response_mask = next(chat_iter)\n",
    "    except StopIteration:\n",
    "        chat_iter = iter(chat_loader)\n",
    "        input_ids, response_mask = next(chat_iter)\n",
    "\n",
    "    input_ids = input_ids.to(device)\n",
    "    response_mask = response_mask.to(device)\n",
    "\n",
    "    ft_optimizer.zero_grad()\n",
    "\n",
    "    with autocast('cuda', dtype=torch.float16):\n",
    "        B, L = input_ids.shape\n",
    "\n",
    "        # Sample timestep\n",
    "        t = model_unwrapped.noise_schedule.sample_t(B, device)\n",
    "\n",
    "        # Forward process: mask ONLY response tokens\n",
    "        # Prompt tokens stay unmasked (model can always see them)\n",
    "        alpha_t = model_unwrapped.noise_schedule.alpha(t)[:, None]  # [B, 1]\n",
    "        mask_prob = 1.0 - alpha_t\n",
    "        noise_mask = (torch.rand_like(input_ids.float()) < mask_prob) & response_mask\n",
    "        z_t = torch.where(noise_mask, config.mask_token_id, input_ids)\n",
    "\n",
    "        # Forward pass\n",
    "        hidden = model_unwrapped.forward_hidden(z_t, t)\n",
    "\n",
    "        # Loss only at masked response positions\n",
    "        masked_hidden = hidden[noise_mask]\n",
    "        masked_targets = input_ids[noise_mask]\n",
    "\n",
    "        if masked_hidden.shape[0] > 0:\n",
    "            masked_logits = F.linear(masked_hidden, model_unwrapped.output_proj.weight)\n",
    "            masked_logits[:, config.mask_token_id] = -1e9\n",
    "            ce_loss = F.cross_entropy(masked_logits, masked_targets, reduction='none')\n",
    "            weight = model_unwrapped.noise_schedule.loss_weight(t)\n",
    "            weight_expanded = weight[:, None].expand(B, L)[noise_mask]\n",
    "            loss = (ce_loss * weight_expanded).mean()\n",
    "\n",
    "            with torch.no_grad():\n",
    "                acc = (masked_logits.argmax(-1) == masked_targets).float().mean().item()\n",
    "        else:\n",
    "            loss = torch.tensor(0.0, device=device)\n",
    "            acc = 1.0\n",
    "\n",
    "    ft_scaler.scale(loss).backward()\n",
    "    ft_scaler.unscale_(ft_optimizer)\n",
    "    grad_norm = nn.utils.clip_grad_norm_(model_unwrapped.parameters(), 1.0)\n",
    "    ft_scaler.step(ft_optimizer)\n",
    "    ft_scaler.update()\n",
    "    ft_ema.update(model_unwrapped)\n",
    "\n",
    "    ft_losses.append(loss.item())\n",
    "    ft_accuracies.append(acc)\n",
    "\n",
    "    if step % ft_config.log_every == 0:\n",
    "        elapsed = time.time() - ft_start\n",
    "        avg_loss = np.mean(ft_losses[-ft_config.log_every:])\n",
    "        avg_acc = np.mean(ft_accuracies[-ft_config.log_every:])\n",
    "        eta = (ft_config.ft_steps - step) / (step / elapsed) / 60\n",
    "        print(f'Step {step:>5d}/{ft_config.ft_steps} | Loss: {avg_loss:.4f} | Acc: {avg_acc:.3f} | LR: {lr:.2e} | Grad: {grad_norm:.2f} | ETA: {eta:.1f}m')\n",
    "\n",
    "    # Generate chat samples\n",
    "    if step % ft_config.sample_every == 0:\n",
    "        print(f\"\\n{'='*60}\")\n",
    "        print(f'Chat samples at step {step}:')\n",
    "        ft_ema.apply_shadow(model_unwrapped)\n",
    "        model_unwrapped.eval()\n",
    "\n",
    "        test_prompts = [\n",
    "            'What is the moon?',\n",
    "            'Write a short poem about the ocean.',\n",
    "            'Explain what a computer is.',\n",
    "            'What is the meaning of life?',\n",
    "        ]\n",
    "\n",
    "        for prompt in test_prompts:\n",
    "            # Tokenize prompt\n",
    "            prompt_tokens = [USER_TOKEN] + tokenizer.encode(prompt)[:ft_config.max_prompt_len - 2] + [ASST_TOKEN]\n",
    "            prompt_len = len(prompt_tokens)\n",
    "            total_len = prompt_len + ft_config.max_response_len\n",
    "\n",
    "            # Start with prompt + all masks for response\n",
    "            x = torch.full((1, total_len), config.mask_token_id, dtype=torch.long, device=device)\n",
    "            x[0, :prompt_len] = torch.tensor(prompt_tokens, dtype=torch.long, device=device)\n",
    "\n",
    "            # Diffusion sampling \u2014 only unmask response positions\n",
    "            timesteps = torch.linspace(1.0 - 1e-5, 1e-5, 128 + 1, device=device)\n",
    "            for i in range(128):\n",
    "                t_now = timesteps[i]\n",
    "                t_next = timesteps[i + 1]\n",
    "                alpha_now = model_unwrapped.noise_schedule.alpha(t_now)\n",
    "                alpha_next = model_unwrapped.noise_schedule.alpha(t_next)\n",
    "\n",
    "                t_batch = torch.full((1,), t_now.item(), device=device)\n",
    "                logits = model_unwrapped.forward_full(x, t_batch)\n",
    "                probs = F.softmax(logits / 0.7, dim=-1)\n",
    "\n",
    "                unmask_prob = ((alpha_next - alpha_now) / (1.0 - alpha_now + 1e-8)).clamp(0, 1)\n",
    "                is_masked = (x == config.mask_token_id)\n",
    "                unmask = is_masked & (torch.rand_like(x.float()) < unmask_prob)\n",
    "\n",
    "                if unmask.any():\n",
    "                    flat_probs = probs.reshape(-1, config.vocab_size)\n",
    "                    sampled = torch.multinomial(flat_probs, 1).reshape(1, total_len)\n",
    "                    x = torch.where(unmask, sampled, x)\n",
    "\n",
    "            # Final cleanup\n",
    "            is_masked = (x == config.mask_token_id)\n",
    "            if is_masked.any():\n",
    "                t_batch = torch.full((1,), 1e-5, device=device)\n",
    "                logits = model_unwrapped.forward_full(x, t_batch)\n",
    "                probs = F.softmax(logits / 0.7, dim=-1)\n",
    "                flat_probs = probs.reshape(-1, config.vocab_size)\n",
    "                sampled = torch.multinomial(flat_probs, 1).reshape(1, total_len)\n",
    "                x = torch.where(is_masked, sampled, x)\n",
    "\n",
    "            # Decode response only\n",
    "            response_tokens = x[0, prompt_len:].cpu().tolist()\n",
    "            # Cut at END token\n",
    "            if END_TOKEN in response_tokens:\n",
    "                response_tokens = response_tokens[:response_tokens.index(END_TOKEN)]\n",
    "            response = tokenizer.decode(response_tokens, skip_special_tokens=True)\n",
    "            print(f'\\n  User: {prompt}')\n",
    "            print(f'  Bot:  {response}')\n",
    "\n",
    "        model_unwrapped.train()\n",
    "        ft_ema.restore(model_unwrapped)\n",
    "        print(f\"{'='*60}\\n\")\n",
    "\n",
    "# Save fine-tuned model\n",
    "torch.save({\n",
    "    'step': step,\n",
    "    'model_state_dict': model_unwrapped.state_dict(),\n",
    "    'ema_shadow': ft_ema.shadow,\n",
    "    'config': config,\n",
    "}, 'checkpoint_chat.pt')\n",
    "print('Fine-tuning complete! Saved checkpoint_chat.pt')\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "chat_header",
   "metadata": {},
   "outputs": [],
   "source": [
    "## Chat with your Diffusion LM\n",
    "\n",
    "Type a message and watch the response **materialize from noise** via the diffusion process."
   ]
  },
  {
   "cell_type": "code",
   "id": "chat_interface",
   "metadata": {},
   "outputs": [],
   "source": [
    "# ============================================================\n",
    "# CHAT INTERFACE WITH DIFFUSION VISUALIZATION\n",
    "# ============================================================\n",
    "\n",
    "from IPython.display import clear_output, display\n",
    "import time as _time\n",
    "\n",
    "# Load EMA weights\n",
    "ft_ema.apply_shadow(model_unwrapped)\n",
    "model_unwrapped.eval()\n",
    "\n",
    "@torch.no_grad()\n",
    "def chat(prompt: str, steps: int = 64, temperature: float = 0.7, show_diffusion: bool = True):\n",
    "    \"\"\"Chat with the diffusion model.\n",
    "    \n",
    "    Args:\n",
    "        prompt: Your message\n",
    "        steps: Denoising steps (more = better quality, slower)\n",
    "        temperature: Sampling temperature (lower = more focused)\n",
    "        show_diffusion: Show the step-by-step unmasking process\n",
    "    \"\"\"\n",
    "    # Tokenize prompt\n",
    "    prompt_tokens = [USER_TOKEN] + tokenizer.encode(prompt)[:ft_config.max_prompt_len - 2] + [ASST_TOKEN]\n",
    "    prompt_len = len(prompt_tokens)\n",
    "    total_len = prompt_len + ft_config.max_response_len\n",
    "\n",
    "    # Initialize: prompt (visible) + all masks (response)\n",
    "    x = torch.full((1, total_len), config.mask_token_id, dtype=torch.long, device=device)\n",
    "    x[0, :prompt_len] = torch.tensor(prompt_tokens, dtype=torch.long, device=device)\n",
    "\n",
    "    timesteps_sched = torch.linspace(1.0 - 1e-5, 1e-5, steps + 1, device=device)\n",
    "    snapshot_steps = set([int(steps * p) for p in [0, 0.1, 0.2, 0.35, 0.5, 0.7, 0.85, 1.0]])\n",
    "\n",
    "    if show_diffusion:\n",
    "        print(f'User: {prompt}')\n",
    "        print(f'\\n--- Diffusion Process ({steps} steps) ---\\n')\n",
    "\n",
    "    for i in range(steps):\n",
    "        t_now = timesteps_sched[i]\n",
    "        t_next = timesteps_sched[i + 1]\n",
    "        alpha_now = model_unwrapped.noise_schedule.alpha(t_now)\n",
    "        alpha_next = model_unwrapped.noise_schedule.alpha(t_next)\n",
    "\n",
    "        t_batch = torch.full((1,), t_now.item(), device=device)\n",
    "        logits = model_unwrapped.forward_full(x, t_batch)\n",
    "        probs = F.softmax(logits / temperature, dim=-1)\n",
    "\n",
    "        unmask_prob = ((alpha_next - alpha_now) / (1.0 - alpha_now + 1e-8)).clamp(0, 1)\n",
    "        is_masked = (x == config.mask_token_id)\n",
    "        unmask = is_masked & (torch.rand_like(x.float()) < unmask_prob)\n",
    "\n",
    "        if unmask.any():\n",
    "            flat_probs = probs.reshape(-1, config.vocab_size)\n",
    "            sampled = torch.multinomial(flat_probs, 1).reshape(1, total_len)\n",
    "            x = torch.where(unmask, sampled, x)\n",
    "\n",
    "        # Show snapshot\n",
    "        if show_diffusion and i in snapshot_steps:\n",
    "            resp_tokens = x[0, prompt_len:].cpu().tolist()\n",
    "            text = ''\n",
    "            for tok in resp_tokens:\n",
    "                if tok == config.mask_token_id:\n",
    "                    text += ' \\u2588'\n",
    "                elif tok == END_TOKEN:\n",
    "                    break\n",
    "                else:\n",
    "                    text += tokenizer.decode([tok])\n",
    "            pct = (1 - is_masked[:, prompt_len:].float().mean()).item() * 100\n",
    "            print(f'  [{pct:5.1f}% revealed] {text[:200]}')\n",
    "\n",
    "    # Final cleanup\n",
    "    is_masked = (x == config.mask_token_id)\n",
    "    if is_masked.any():\n",
    "        t_batch = torch.full((1,), 1e-5, device=device)\n",
    "        logits = model_unwrapped.forward_full(x, t_batch)\n",
    "        probs = F.softmax(logits / temperature, dim=-1)\n",
    "        flat_probs = probs.reshape(-1, config.vocab_size)\n",
    "        sampled = torch.multinomial(flat_probs, 1).reshape(1, total_len)\n",
    "        x = torch.where(is_masked, sampled, x)\n",
    "\n",
    "    # Decode final response\n",
    "    response_tokens = x[0, prompt_len:].cpu().tolist()\n",
    "    if END_TOKEN in response_tokens:\n",
    "        response_tokens = response_tokens[:response_tokens.index(END_TOKEN)]\n",
    "    response = tokenizer.decode(response_tokens, skip_special_tokens=True)\n",
    "\n",
    "    if show_diffusion:\n",
    "        print(f'\\n--- Final ---')\n",
    "    print(f'\\nUser: {prompt}')\n",
    "    print(f'Bot:  {response}')\n",
    "    return response\n",
    "\n",
    "print('Chat function ready! Usage: chat(\"your message here\")')\n"
   ]
  },
  {
   "cell_type": "code",
   "id": "chat_examples",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Try it out!\n",
    "chat('What is the moon?')\n",
    "print('\\n' + '='*60 + '\\n')\n",
    "chat('Write a short poem about the ocean.')\n",
    "print('\\n' + '='*60 + '\\n')\n",
    "chat('Explain what a computer is to a child.')\n",
    "print('\\n' + '='*60 + '\\n')\n",
    "chat('What are three things that make people happy?')\n"
   ]
  },
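  {
   "cell_type": "markdown",
   "id": "chat_knobs_note",
   "metadata": {},
   "source": [
    "### Sampling knobs\n",
    "\n",
    "A quick sketch of how the two main `chat()` knobs trade off. Fewer denoising steps are faster but commit more tokens per step, so early mistakes compound; lower temperature sharpens the per-token distribution. Exact quality is model- and prompt-dependent, so treat these as starting points."
   ]
  },
  {
   "cell_type": "code",
   "id": "chat_knobs",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Few steps: fast, but many tokens are revealed per step\n",
    "chat('What is the moon?', steps=16, show_diffusion=False)\n",
    "print('\\n' + '='*60 + '\\n')\n",
    "\n",
    "# Many steps: slower, tokens are revealed a few at a time\n",
    "chat('What is the moon?', steps=128, show_diffusion=False)\n",
    "print('\\n' + '='*60 + '\\n')\n",
    "\n",
    "# Lower temperature: near-greedy, more focused but less varied\n",
    "chat('What is the moon?', steps=64, temperature=0.3, show_diffusion=False)\n"
   ]
  },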
  {
   "cell_type": "code",
   "id": "ft_upload",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Upload fine-tuned model to HuggingFace\n",
    "from huggingface_hub import HfApi\n",
    "TOKEN = 'YOUR_HF_TOKEN_HERE'\n",
    "api = HfApi(token=TOKEN)\n",
    "\n",
    "api.upload_file(\n",
    "    path_or_fileobj='checkpoint_chat.pt',\n",
    "    path_in_repo='checkpoint_chat.pt',\n",
    "    repo_id='chipling/opium-mdlm',\n",
    "    repo_type='model',\n",
    "    token=TOKEN,\n",
    ")\n",
    "print('Chat model uploaded to HuggingFace!')\n"
   ]
  },
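  {
   "cell_type": "markdown",
   "id": "reload_note",
   "metadata": {},
   "source": [
    "### Reloading the checkpoint later\n",
    "\n",
    "A minimal sketch of pulling the uploaded checkpoint back down in a fresh session, assuming the `chipling/opium-mdlm` repo is accessible with your token and the model is already constructed as above."
   ]
  },
  {
   "cell_type": "code",
   "id": "reload_ckpt",
   "metadata": {},
   "outputs": [],
   "source": [
    "from huggingface_hub import hf_hub_download\n",
    "\n",
    "# Download (and locally cache) the fine-tuned checkpoint from the Hub\n",
    "ckpt_path = hf_hub_download(repo_id='chipling/opium-mdlm', filename='checkpoint_chat.pt')\n",
    "\n",
    "# The checkpoint stores a config object, so newer PyTorch versions may need\n",
    "# torch.load(..., weights_only=False) here\n",
    "ckpt = torch.load(ckpt_path, map_location=device)\n",
    "\n",
    "model_unwrapped.load_state_dict(ckpt['model_state_dict'])\n",
    "model_unwrapped.eval()\n",
    "print(f\"Loaded checkpoint from step {ckpt['step']}\")\n"
   ]
  }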
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}