PEFT
Safetensors
magicsquares137 committed
Commit da2f57d · verified · 1 Parent(s): 0206007

Upload training_logs.txt with huggingface_hub

Files changed (1):
  training_logs.txt +303 -0
training_logs.txt ADDED
@@ -0,0 +1,303 @@
+ === STDOUT ===
+ [2025-12-03 19:59:19,939] [INFO] [real_accelerator.py:203:get_accelerator] Setting ds_accelerator to cuda (auto detect)
+ [2025-12-03 19:59:20,529] [INFO] [root.spawn:60] [PID:121] gcc -pthread -B /root/miniconda3/envs/py3.11/compiler_compat -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem /root/miniconda3/envs/py3.11/include -fPIC -O2 -isystem /root/miniconda3/envs/py3.11/include -fPIC -c /tmp/tmpp0n1fqw2/test.c -o /tmp/tmpp0n1fqw2/test.o
+ [2025-12-03 19:59:20,571] [INFO] [root.spawn:60] [PID:121] gcc -pthread -B /root/miniconda3/envs/py3.11/compiler_compat /tmp/tmpp0n1fqw2/test.o -laio -o /tmp/tmpp0n1fqw2/a.out
+ [WARNING] Please specify the CUTLASS repo directory as environment variable $CUTLASS_PATH
+ [WARNING] sparse_attn requires a torch version >= 1.5 and < 2.0 but detected 2.3
+ [WARNING] using untested triton version (2.3.1), only 1.0.0 is known to be compatible
+ [2025-12-03 19:59:22,343] [WARNING] [axolotl.utils.config.models.input.hint_lora_8bit:1221] [PID:121] [RANK:0] We recommend setting `load_in_8bit: true` for LORA finetuning
+ [2025-12-03 19:59:22,344] [DEBUG] [axolotl.normalize_config:83] [PID:121] [RANK:0] bf16 support detected, enabling for this configuration.
+ [2025-12-03 19:59:22,897] [INFO] [axolotl.normalize_config:207] [PID:121] [RANK:0] GPU memory usage baseline: 0.000GB (+0.471GB misc)
+
+ [axolotl ASCII-art banner]
+
+ [2025-12-03 19:59:26,240] [DEBUG] [axolotl.load_tokenizer:293] [PID:121] [RANK:0] EOS: 0 / <|endoftext|>
+ [2025-12-03 19:59:26,240] [DEBUG] [axolotl.load_tokenizer:294] [PID:121] [RANK:0] BOS: 0 / <|endoftext|>
+ [2025-12-03 19:59:26,240] [DEBUG] [axolotl.load_tokenizer:295] [PID:121] [RANK:0] PAD: 0 / <|endoftext|>
+ [2025-12-03 19:59:26,240] [DEBUG] [axolotl.load_tokenizer:296] [PID:121] [RANK:0] UNK: 0 / <|endoftext|>
+ [2025-12-03 19:59:26,240] [INFO] [axolotl.load_tokenizer:310] [PID:121] [RANK:0] No Chat template selected. Consider adding a chat template for easier inference.
+ [2025-12-03 19:59:26,241] [INFO] [axolotl.load_tokenized_prepared_datasets:234] [PID:121] [RANK:0] Unable to find prepared dataset in last_run_prepared/c8b534ced2ddf0659aff669f20b527cd
+ [2025-12-03 19:59:26,241] [INFO] [axolotl.load_tokenized_prepared_datasets:235] [PID:121] [RANK:0] Loading raw datasets...
+ [2025-12-03 19:59:26,241] [WARNING] [axolotl.load_tokenized_prepared_datasets:237] [PID:121] [RANK:0] Processing datasets during training can lead to VRAM instability. Please pre-process your dataset.
+ [2025-12-03 19:59:26,241] [INFO] [axolotl.load_tokenized_prepared_datasets:244] [PID:121] [RANK:0] No seed provided, using default seed of 42
+ [2025-12-03 19:59:31,440] [INFO] [axolotl.get_dataset_wrapper:612] [PID:121] [RANK:0] Loading dataset with base_type: alpaca and prompt_style: None
+ [2025-12-03 19:59:35,953] [INFO] [axolotl.load_tokenized_prepared_datasets:491] [PID:121] [RANK:0] Saving merged prepared dataset to disk... last_run_prepared/c8b534ced2ddf0659aff669f20b527cd
+ [2025-12-03 19:59:35,992] [DEBUG] [axolotl.calculate_total_num_steps:320] [PID:121] [RANK:0] total_num_tokens: 325_153
+ [2025-12-03 19:59:36,008] [DEBUG] [axolotl.calculate_total_num_steps:338] [PID:121] [RANK:0] `total_supervised_tokens: 222_219`
+ [2025-12-03 19:59:36,008] [DEBUG] [axolotl.calculate_total_num_steps:416] [PID:121] [RANK:0] total_num_steps: 475
+ [2025-12-03 19:59:36,008] [INFO] [axolotl.prepare_dataset:152] [PID:121] [RANK:0] Maximum number of steps set at 100
+ [2025-12-03 19:59:36,019] [DEBUG] [axolotl.train.train:66] [PID:121] [RANK:0] loading tokenizer... HuggingFaceTB/SmolLM2-135M
+ [2025-12-03 19:59:36,499] [DEBUG] [axolotl.load_tokenizer:293] [PID:121] [RANK:0] EOS: 0 / <|endoftext|>
+ [2025-12-03 19:59:36,499] [DEBUG] [axolotl.load_tokenizer:294] [PID:121] [RANK:0] BOS: 0 / <|endoftext|>
+ [2025-12-03 19:59:36,499] [DEBUG] [axolotl.load_tokenizer:295] [PID:121] [RANK:0] PAD: 0 / <|endoftext|>
+ [2025-12-03 19:59:36,499] [DEBUG] [axolotl.load_tokenizer:296] [PID:121] [RANK:0] UNK: 0 / <|endoftext|>
+ [2025-12-03 19:59:36,499] [INFO] [axolotl.load_tokenizer:310] [PID:121] [RANK:0] No Chat template selected. Consider adding a chat template for easier inference.
+ [2025-12-03 19:59:36,499] [DEBUG] [axolotl.train.train:98] [PID:121] [RANK:0] loading model and peft_config...
+ [2025-12-03 19:59:40,766] [INFO] [axolotl.load_model:1074] [PID:121] [RANK:0] converting modules to torch.bfloat16 for flash attention
+ trainable params: 460,800 || all params: 134,975,808 || trainable%: 0.3414
+ [2025-12-03 19:59:40,880] [INFO] [axolotl.load_model:1137] [PID:121] [RANK:0] GPU memory usage after adapters: 0.000GB ()
+ [2025-12-03 19:59:41,944] [INFO] [axolotl.train.train:141] [PID:121] [RANK:0] Pre-saving adapter config to ./outputs/admin_20251203_195913
+ [2025-12-03 19:59:41,993] [INFO] [axolotl.train.train:178] [PID:121] [RANK:0] Starting trainer...
+ [2025-12-03 19:59:43,462] [INFO] [axolotl.callbacks.on_step_end:128] [PID:121] [RANK:0] GPU memory usage while training: 0.272GB (+0.754GB cache, +0.978GB misc)
+ {'loss': 1.7199, 'grad_norm': 0.6013352870941162, 'learning_rate': 0.0002961615786970389, 'epoch': 0.02}
+ {'loss': 1.8633, 'grad_norm': 0.2632163166999817, 'learning_rate': 0.0002778325235483954, 'epoch': 0.04}
+ {'loss': 1.6853, 'grad_norm': 0.43362969160079956, 'learning_rate': 0.00024621123294467096, 'epoch': 0.06}
+ {'loss': 1.8084, 'grad_norm': 0.3705150783061981, 'learning_rate': 0.00020458574054452313, 'epoch': 0.08}
+ {'loss': 1.5581, 'grad_norm': 0.4706592857837677, 'learning_rate': 0.00015728433331716724, 'epoch': 0.11}
+ {'loss': 1.7214, 'grad_norm': 0.4183749854564667, 'learning_rate': 0.00010922548916454855, 'epoch': 0.13}
+ {'loss': 1.7939, 'grad_norm': 0.34439632296562195, 'learning_rate': 6.540644552236401e-05, 'epoch': 0.15}
+ {'loss': 1.5683, 'grad_norm': 0.3492596745491028, 'learning_rate': 3.038357841559191e-05, 'epoch': 0.17}
+ {'loss': 1.505, 'grad_norm': 0.3816680610179901, 'learning_rate': 7.798623006559435e-06, 'epoch': 0.19}
+ {'loss': 1.675, 'grad_norm': 0.3192552626132965, 'learning_rate': 0.0, 'epoch': 0.21}
+ {'eval_loss': 1.6974681615829468, 'eval_runtime': 2.4034, 'eval_samples_per_second': 41.608, 'eval_steps_per_second': 20.804, 'epoch': 0.21}
+ {'train_runtime': 28.1039, 'train_samples_per_second': 14.233, 'train_steps_per_second': 3.558, 'train_loss': 1.6898639583587647, 'epoch': 0.21}
+ [2025-12-03 20:00:10,358] [INFO] [axolotl.train.train:195] [PID:121] [RANK:0] Training Completed!!! Saving pre-trained model to ./outputs/admin_20251203_195913
+
+
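The `trainable params: 460,800 || all params: 134,975,808 || trainable%: 0.3414` line above is consistent with LoRA rank 8 applied to the attention `q_proj`/`v_proj` projections of SmolLM2-135M. The rank and target modules are inferred here, not stated in the log; the model dimensions (hidden_size 576, 30 layers, 3 KV heads of head_dim 64) come from the published SmolLM2-135M config. A minimal sketch of the arithmetic:

```python
# Hypothetical reconstruction: the LoRA rank and target modules below are
# assumptions inferred from the parameter count, not read from the config.
r = 8                # assumed LoRA rank
hidden = 576         # SmolLM2-135M hidden_size
kv_dim = 3 * 64      # 3 KV heads x head_dim 64 = 192 (GQA)
layers = 30          # SmolLM2-135M num_hidden_layers

# Each LoRA-adapted linear adds r * (fan_in + fan_out) parameters
# (the A matrix is fan_in x r, the B matrix is r x fan_out).
per_layer = r * (hidden + hidden)   # q_proj: 576 -> 576
per_layer += r * (hidden + kv_dim)  # v_proj: 576 -> 192
trainable = layers * per_layer

all_params = 134_975_808  # "all params" from the log line above
print(trainable, round(100 * trainable / all_params, 4))  # 460800 0.3414
```

The count matching exactly under these assumptions suggests axolotl's default q/v targeting at rank 8, but other rank/module combinations could also sum to the same total.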
+ === STDERR ===
+ The following values were not passed to `accelerate launch` and had defaults used instead:
+ `--num_processes` was set to a value of `1`
+ `--num_machines` was set to a value of `1`
+ `--mixed_precision` was set to a value of `'no'`
+ `--dynamo_backend` was set to a value of `'no'`
+ To avoid this warning pass in values for each of the problematic parameters or run `accelerate config`.
+ WARNING: BNB_CUDA_VERSION=121 environment variable detected; loading libbitsandbytes_cuda121.so.
+ This can be used to load a bitsandbytes version that is different from the PyTorch CUDA version.
+ If this was unintended set the BNB_CUDA_VERSION variable to an empty string: export BNB_CUDA_VERSION=
+ If you use the manual override make sure the right libcudart.so is in your LD_LIBRARY_PATH
+ For example by adding the following to your .bashrc: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:<path_to_cuda_dir/lib64
+
+ Using the `WANDB_DISABLED` environment variable is deprecated and will be removed in v5. Use the --report_to flag to control the integrations used for logging result (for instance --report_to none).
+ df: /root/.triton/autotune: No such file or directory
+ /root/miniconda3/envs/py3.11/lib/python3.11/site-packages/pydantic/main.py:464: UserWarning: Pydantic serializer warnings:
+ PydanticSerializationUnexpectedValue(Expected `enum` - serialized value may not be as expected [field_name='lr_scheduler', input_value='cosine', input_type=str])
+ PydanticSerializationUnexpectedValue(Expected `literal['one_cycle']` - serialized value may not be as expected [field_name='lr_scheduler', input_value='cosine', input_type=str])
+ return self.__pydantic_serializer__.to_python(
+
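The `lr_scheduler` value `'cosine'` flagged in the Pydantic warnings matches the learning rates in the STDOUT metrics: they fit a transformers-style cosine decay with linear warmup. A sketch, assuming a peak LR of 3e-4 and 3 warmup steps over the 100 max steps (both inferred from the logged numbers, not read from the training config):

```python
import math

def cosine_lr(step, peak=3e-4, warmup=3, total=100):
    """Transformers-style cosine schedule with linear warmup (num_cycles=0.5).

    peak and warmup are assumptions inferred by fitting the logged values.
    """
    if step < warmup:
        return peak * step / max(1, warmup)
    progress = (step - warmup) / (total - warmup)
    return peak * max(0.0, 0.5 * (1.0 + math.cos(math.pi * progress)))

# The log reports learning_rate = 0.0002961615786970389 at step 10
# and 0.0 at the final step 100.
print(cosine_lr(10), cosine_lr(100))
```

Under these assumed parameters the formula reproduces every logged `learning_rate` to high precision, which is consistent with how `transformers.get_cosine_schedule_with_warmup` computes the decay.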
+ Generating train split: 100%|██████████| 2000/2000 [00:00<00:00, 54614.40 examples/s]
+ Tokenizing Prompts (num_proc=64): 100%|██████████| 2000/2000 [00:02<00:00, 765.06 examples/s]
+ Saving the dataset (1/1 shards): 100%|██████████| 2000/2000 [00:00<00:00, 113427.01 examples/s]
+ You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
+ /root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/training_args.py:1559: FutureWarning: `evaluation_strategy` is deprecated and will be removed in version 4.46 of 🤗 Transformers. Use `eval_strategy` instead
+   warnings.warn(
+ /workspace/axolotl/src/axolotl/core/trainer_builder.py:417: FutureWarning: `tokenizer` is deprecated and will be removed in version 5.0.0 for `AxolotlTrainer.__init__`. Use `processing_class` instead.
+   super().__init__(*_args, **kwargs)
+ max_steps is given, it will override any value given in num_train_epochs
+
+ 0%| | 0/100 [00:00<?, ?it/s]You're using a GPT2TokenizerFast tokenizer. Please note that with a fast tokenizer, using the `__call__` method is faster than using a method to encode the text followed by a call to the `pad` method to get a padded encoding.
+ 100%|██████████| 100/100 [00:24<00:00, 4.25it/s]
+ 100%|██████████| 50/50 [00:02<00:00, 21.47it/s]
+ 100%|██████████| 100/100 [00:28<00:00, 3.56it/s]
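As a sanity check on the metrics above, the reported `train_loss` of 1.6898639583587647 is (up to the rounding of the logged values) the mean of the ten logged window losses: each `{'loss': ...}` row averages 10 of the 100 steps, so the windows are equal-sized and their mean equals the overall mean.

```python
# The ten `loss` values logged every 10 steps in the STDOUT section above.
window_losses = [1.7199, 1.8633, 1.6853, 1.8084, 1.5581,
                 1.7214, 1.7939, 1.5683, 1.505, 1.675]

mean_loss = sum(window_losses) / len(window_losses)
print(round(mean_loss, 4))  # matches the reported train_loss of ~1.6899
```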