2025-09-23 06:15:36,307 INFO MainThread:301 [wandb_setup.py:_flush():81] Current SDK version is 0.21.4
2025-09-23 06:15:36,307 INFO MainThread:301 [wandb_setup.py:_flush():81] Configure stats pid to 301
2025-09-23 06:15:36,307 INFO MainThread:301 [wandb_setup.py:_flush():81] Loading settings from /root/.config/wandb/settings
2025-09-23 06:15:36,308 INFO MainThread:301 [wandb_setup.py:_flush():81] Loading settings from /notebooks/toy_models/model_training/c4_code-200m-reversed-within-rows/wandb/settings
2025-09-23 06:15:36,308 INFO MainThread:301 [wandb_setup.py:_flush():81] Loading settings from environment variables
2025-09-23 06:15:36,308 INFO MainThread:301 [wandb_init.py:setup_run_log_directory():686] Logging user logs to /notebooks/toy_models/model_training/c4_code-200m-reversed-within-rows/wandb/run-20250923_061536-h6k4kav3/logs/debug.log
2025-09-23 06:15:36,308 INFO MainThread:301 [wandb_init.py:setup_run_log_directory():687] Logging internal logs to /notebooks/toy_models/model_training/c4_code-200m-reversed-within-rows/wandb/run-20250923_061536-h6k4kav3/logs/debug-internal.log
2025-09-23 06:15:36,308 INFO MainThread:301 [wandb_init.py:init():813] calling init triggers
2025-09-23 06:15:36,308 INFO MainThread:301 [wandb_init.py:init():818] wandb.init called with sweep_config: {} config: {'model_name': 'c4_code-200m-reversed-within-rows', 'n_layers': 2, 'd_model': 512, 'd_mlp': 2048, 'd_head': 64, 'n_heads': 8, 'attn_only': False, 'layer_norm_eps': 1e-05, 'init_range': 0.02, 'n_ctx': 1024, 'd_vocab': 48262, 'dataset_name': 'eoinf/c4_code-200m-reversed-within-rows', 'tokenizer_name': 'NeelNanda/gpt-neox-tokenizer-digits', 'seed': 10, 'device': 'cuda', 'use_bfloat16_matmul': False, 'batch_size_per_device': 32, 'n_devices': 1, 'batches_per_step': 1, 'max_tokens': 200000000, 'lr_hidden': 0.002, 'lr_vector': 0.001, 'lr_schedule': 'constant_with_warmup', 'warmup_tokens': 30000000, 'weight_decay': 0.05, 'grad_norm_clip': 1.0, 'train_loss_moving_average_beta': 0.99, 'log_interval': 25, 'save_checkpoints': True, 'checkpoint_interval': 500, 'checkpoint_interval_ratio': 1.1, 'save_log_checkpoints': True, 'use_wandb': True, 'batch_size': 32, 'tokens_per_step': 32768, 'warmup_steps': 915, 'max_steps': 6103, '_wandb': {}}
2025-09-23 06:15:36,308 INFO MainThread:301 [wandb_init.py:init():854] starting backend
2025-09-23 06:15:36,811 INFO MainThread:301 [wandb_init.py:init():857] sending inform_init request
2025-09-23 06:15:36,820 INFO MainThread:301 [wandb_init.py:init():865] backend started and connected
2025-09-23 06:15:36,821 INFO MainThread:301 [wandb_init.py:init():936] updated telemetry
2025-09-23 06:15:36,834 INFO MainThread:301 [wandb_init.py:init():960] communicating run to backend with 90.0 second timeout
2025-09-23 06:15:37,242 INFO MainThread:301 [wandb_init.py:init():1011] starting run threads in backend
2025-09-23 06:15:37,353 INFO MainThread:301 [wandb_run.py:_console_start():2506] atexit reg
2025-09-23 06:15:37,353 INFO MainThread:301 [wandb_run.py:_redirect():2354] redirect: wrap_raw
2025-09-23 06:15:37,353 INFO MainThread:301 [wandb_run.py:_redirect():2423] Wrapping output streams.
2025-09-23 06:15:37,353 INFO MainThread:301 [wandb_run.py:_redirect():2446] Redirects installed.
2025-09-23 06:15:37,379 INFO MainThread:301 [wandb_init.py:init():1049] run started, returning control to user process
2025-09-23 07:24:32,703 INFO MainThread:301 [wandb_run.py:_finish():2272] finishing run tzach/toy-transformer-replication/h6k4kav3
2025-09-23 07:24:32,705 INFO MainThread:301 [wandb_run.py:_atexit_cleanup():2471] got exitcode: 0
2025-09-23 07:24:32,705 INFO MainThread:301 [wandb_run.py:_restore():2453] restore
2025-09-23 07:24:32,705 INFO MainThread:301 [wandb_run.py:_restore():2459] restore done
2025-09-23 07:24:33,148 INFO MainThread:301 [wandb_run.py:_footer_sync_info():3867] logging synced files
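The derived values in the logged config (`tokens_per_step`, `warmup_steps`, `max_steps`) are consistent with being computed from `batch_size`, `n_ctx`, `warmup_tokens`, and `max_tokens`. A minimal sketch of that arithmetic, assuming integer division (the formulas are inferred from the logged numbers, not taken from the training script itself):

```python
# Sanity-check the derived hyperparameters in the logged wandb config.
# NOTE: these formulas are assumptions inferred from the logged values.

config = {
    "batch_size": 32,          # batch_size_per_device * n_devices * batches_per_step
    "n_ctx": 1024,
    "warmup_tokens": 30_000_000,
    "max_tokens": 200_000_000,
}

# One optimizer step processes batch_size full-length sequences.
tokens_per_step = config["batch_size"] * config["n_ctx"]

# Step counts appear to be integer division of the token budgets.
warmup_steps = config["warmup_tokens"] // tokens_per_step
max_steps = config["max_tokens"] // tokens_per_step

print(tokens_per_step, warmup_steps, max_steps)  # matches the log: 32768 915 6103
```

This also explains why the run takes 6103 steps to cover the 200M-token budget at 32,768 tokens per step.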