adaptai/platform/operations/infra/logs/training_continuous.log
🎯 Starting continuous training for 12 hours...
🤖 Autonomous evolution mode activated
🚀 Setting up training environment...
📊 GPU: NVIDIA H200 NVL
💾 GPU Memory: 139.8 GB
🤖 Autonomous evolution mode: ENABLED
📦 Loading model and tokenizer...
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Loading checkpoint shards: 100%|██████████| 4/4 [00:03<00:00, 1.27it/s]
✅ Model loaded: qwen2
✅ Tokenizer vocab size: 151665
📥 Loading Elizabeth corpus data...
✅ Loaded 3000 high-quality security-focused conversations
✅ Formatted 3000 training texts
Map: 100%|██████████| 3000/3000 [00:00<00:00, 13459.72 examples/s]
✅ Tokenized dataset: 3000 examples
⚙️ Setting up training...
🔥 Starting training...
📈 Batch size: 4
📈 Gradient accumulation: 16
📈 Effective batch size: 64
⏰ Continuous training mode: 12 hours autonomous evolution
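A minimal sketch of how the batch numbers above fit together. The variable names mirror common Hugging Face Trainer arguments but are assumptions; the actual training script is not shown in this log.

```python
# Hedged reconstruction of the run's batch arithmetic from the logged values.
per_device_train_batch_size = 4    # "Batch size: 4"
gradient_accumulation_steps = 16   # "Gradient accumulation: 16"
num_examples = 3000                # tokenized dataset size
num_epochs = 2                     # the run ends near epoch 1.96

# Gradients are accumulated over 16 micro-batches of 4 before each
# optimizer step, so one step sees 64 examples.
effective_batch_size = per_device_train_batch_size * gradient_accumulation_steps
steps_per_epoch = num_examples // effective_batch_size
total_steps = steps_per_epoch * num_epochs

print(effective_batch_size)  # 64, matching "Effective batch size: 64"
print(total_steps)           # 92, matching the 0/92 progress bar below
```

The step count in the progress bars (92) is consistent with 3000 examples, an effective batch of 64, and roughly two epochs.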
  0%|          | 0/92 [00:00<?, ?it/s]
/home/x/.local/lib/python3.12/site-packages/torch/utils/checkpoint.py:460: UserWarning: torch.utils.checkpoint: please pass in use_reentrant=True or use_reentrant=False explicitly. The default value of use_reentrant will be updated to be False in the future. To maintain current behavior, pass use_reentrant=True. It is recommended that you use use_reentrant=False. Refer to docs for more details on the differences between the two variants.
  warnings.warn(
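The warning above is usually silenced by passing the flag explicitly when enabling gradient checkpointing. A hedged sketch (these kwargs follow the transformers `TrainingArguments` convention; the actual script producing this log is not shown):

```python
# Assumed fix for the use_reentrant warning: request non-reentrant
# checkpointing explicitly, as the warning itself recommends.
training_kwargs = {
    "gradient_checkpointing": True,
    "gradient_checkpointing_kwargs": {"use_reentrant": False},
}

print(training_kwargs["gradient_checkpointing_kwargs"])
```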
 11%|█         | 10/92 [00:25<03:25, 2.51s/it] {'loss': 2.6982, 'grad_norm': 34.75, 'learning_rate': 2e-05, 'epoch': 0.21}
 22%|██▏       | 20/92 [00:49<02:48, 2.33s/it] {'loss': 0.5565, 'grad_norm': 9.25, 'learning_rate': 1.927502451102095e-05, 'epoch': 0.43}
 33%|███▎      | 30/92 [01:12<02:22, 2.30s/it] {'loss': 0.1705, 'grad_norm': 4.59375, 'learning_rate': 1.720521593600787e-05, 'epoch': 0.64}
 43%|████▎     | 40/92 [01:38<02:14, 2.59s/it] {'loss': 0.0486, 'grad_norm': 1.703125, 'learning_rate': 1.4090686371713403e-05, 'epoch': 0.85}
 54%|█████▍    | 50/92 [02:05<01:42, 2.44s/it] {'loss': 0.0266, 'grad_norm': 0.8984375, 'learning_rate': 1.0383027336900356e-05, 'epoch': 1.07}
 65%|██████▌   | 60/92 [02:30<01:22, 2.58s/it] {'loss': 0.0228, 'grad_norm': 0.54296875, 'learning_rate': 6.619831215914974e-06, 'epoch': 1.28}
 76%|███████▌  | 70/92 [02:55<00:56, 2.56s/it] {'loss': 0.022, 'grad_norm': 0.474609375, 'learning_rate': 3.3467429983443477e-06, 'epoch': 1.49}
 87%|████████▋ | 80/92 [03:20<00:29, 2.46s/it] {'loss': 0.0211, 'grad_norm': 0.55859375, 'learning_rate': 1.0383444303894453e-06, 'epoch': 1.71}
 98%|█████████▊| 90/92 [03:44<00:04, 2.33s/it] {'loss': 0.0215, 'grad_norm': 0.625, 'learning_rate': 2.9341988162595593e-08, 'epoch': 1.92}
100%|██████████| 92/92 [03:50<00:00, 2.50s/it] {'train_runtime': 230.3452, 'train_samples_per_second': 26.048, 'train_steps_per_second': 0.399, 'train_loss': 0.3904266621836502, 'epoch': 1.96}
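As a consistency check on the summary metrics, the throughput figures follow from the runtime and step count (a sketch; 6000 samples is 3000 examples over approximately two epochs, as implied by the log):

```python
# Sanity-check the reported summary metrics against each other.
train_runtime = 230.3452   # seconds, from the log
samples_seen = 3000 * 2    # 3000 examples over ~2 epochs
total_steps = 92

samples_per_second = samples_seen / train_runtime
steps_per_second = total_steps / train_runtime

print(round(samples_per_second, 3))  # 26.048, as reported
print(round(steps_per_second, 3))    # 0.399, as reported
```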
✅ Training completed in 0.07 hours
🎉 Training pipeline completed successfully!