2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_setup.py:_flush():81] Current SDK version is 0.21.4
2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_setup.py:_flush():81] Configure stats pid to 292
2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_setup.py:_flush():81] Loading settings from /root/.config/wandb/settings
2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_setup.py:_flush():81] Loading settings from /notebooks/toy_models/model_training/c4_code-200m-duplicate/wandb/settings
2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_setup.py:_flush():81] Loading settings from environment variables
2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_init.py:setup_run_log_directory():686] Logging user logs to /notebooks/toy_models/model_training/c4_code-200m-duplicate/wandb/run-20250915_224933-8ifme58a/logs/debug.log
2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_init.py:setup_run_log_directory():687] Logging internal logs to /notebooks/toy_models/model_training/c4_code-200m-duplicate/wandb/run-20250915_224933-8ifme58a/logs/debug-internal.log
2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_init.py:init():813] calling init triggers
2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_init.py:init():818] wandb.init called with sweep_config: {}
config: {'model_name': 'c4_code-200m-duplicate', 'n_layers': 2, 'd_model': 512, 'd_mlp': 2048, 'd_head': 64, 'n_heads': 8, 'attn_only': False, 'layer_norm_eps': 1e-05, 'init_range': 0.02, 'n_ctx': 1024, 'd_vocab': 48262, 'dataset_name': 'eoinf/c4_code-200m', 'tokenizer_name': 'NeelNanda/gpt-neox-tokenizer-digits', 'seed': 10, 'device': 'cuda', 'use_bfloat16_matmul': False, 'batch_size_per_device': 32, 'n_devices': 1, 'batches_per_step': 1, 'max_tokens': 200000000, 'lr_hidden': 0.002, 'lr_vector': 0.001, 'lr_schedule': 'constant_with_warmup', 'warmup_tokens': 30000000, 'weight_decay': 0.05, 'grad_norm_clip': 1.0, 'train_loss_moving_average_beta': 0.99, 'log_interval': 25, 'save_checkpoints': True, 'checkpoint_interval': 500, 'checkpoint_interval_ratio': 1.1, 'save_log_checkpoints': True, 'use_wandb': True, 'batch_size': 32, 'tokens_per_step': 32768, 'warmup_steps': 915, 'max_steps': 6103, '_wandb': {}}
2025-09-15 22:49:33,198 INFO MainThread:292 [wandb_init.py:init():854] starting backend
2025-09-15 22:49:33,457 INFO MainThread:292 [wandb_init.py:init():857] sending inform_init request
2025-09-15 22:49:33,466 INFO MainThread:292 [wandb_init.py:init():865] backend started and connected
2025-09-15 22:49:33,467 INFO MainThread:292 [wandb_init.py:init():936] updated telemetry
2025-09-15 22:49:33,475 INFO MainThread:292 [wandb_init.py:init():960] communicating run to backend with 90.0 second timeout
2025-09-15 22:49:33,882 INFO MainThread:292 [wandb_init.py:init():1011] starting run threads in backend
2025-09-15 22:49:34,265 INFO MainThread:292 [wandb_run.py:_console_start():2506] atexit reg
2025-09-15 22:49:34,265 INFO MainThread:292 [wandb_run.py:_redirect():2354] redirect: wrap_raw
2025-09-15 22:49:34,265 INFO MainThread:292 [wandb_run.py:_redirect():2423] Wrapping output streams.
2025-09-15 22:49:34,265 INFO MainThread:292 [wandb_run.py:_redirect():2446] Redirects installed.
2025-09-15 22:49:34,275 INFO MainThread:292 [wandb_init.py:init():1049] run started, returning control to user process
2025-09-15 23:59:30,667 INFO MainThread:292 [wandb_run.py:_finish():2272] finishing run tzach/toy-transformer-replication/8ifme58a
2025-09-15 23:59:30,670 INFO MainThread:292 [wandb_run.py:_atexit_cleanup():2471] got exitcode: 0
2025-09-15 23:59:30,671 INFO MainThread:292 [wandb_run.py:_restore():2453] restore
2025-09-15 23:59:30,671 INFO MainThread:292 [wandb_run.py:_restore():2459] restore done
2025-09-15 23:59:31,112 INFO MainThread:292 [wandb_run.py:_footer_sync_info():3867] logging synced files