2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_setup.py:_flush():80] Current SDK version is 0.21.1
2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_setup.py:_flush():80] Configure stats pid to 290
2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_setup.py:_flush():80] Loading settings from /root/.config/wandb/settings
2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_setup.py:_flush():80] Loading settings from /notebooks/toy_models/model_training/baseline_dataset_name_wikitext_llama/wandb/settings
2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_setup.py:_flush():80] Loading settings from environment variables
2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_init.py:setup_run_log_directory():703] Logging user logs to /notebooks/toy_models/model_training/baseline_dataset_name_wikitext_llama/wandb/run-20251101_050718-x0he2mby/logs/debug.log
2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_init.py:setup_run_log_directory():704] Logging internal logs to /notebooks/toy_models/model_training/baseline_dataset_name_wikitext_llama/wandb/run-20251101_050718-x0he2mby/logs/debug-internal.log
2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_init.py:init():830] calling init triggers
2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_init.py:init():835] wandb.init called with sweep_config: {} config: {'model_name': 'baseline', 'n_layers': 2, 'd_model': 512, 'd_mlp': 2048, 'd_head': 64, 'n_heads': 8, 'attn_only': False, 'layer_norm_eps': 1e-05, 'init_range': 0.02, 'n_ctx': 1024, 'd_vocab': 32000, 'dataset_name': 'eoinf/wikitext_llama', 'tokenizer_name': '', 'seed': 10, 'device': 'cuda', 'use_bfloat16_matmul': False, 'batch_size_per_device': 32, 'n_devices': 1, 'batches_per_step': 1, 'max_tokens': 200000000, 'lr_hidden': 0.002, 'lr_vector': 0.001, 'lr_schedule': 'constant_with_warmup', 'warmup_tokens': 30000000, 'weight_decay': 0.05, 'grad_norm_clip': 1.0, 'train_loss_moving_average_beta': 0.99, 'log_interval': 25, 'save_checkpoints': True, 'checkpoint_interval': 500, 'checkpoint_interval_ratio': 1.1, 'save_log_checkpoints': True, 'use_wandb': True, 'batch_size': 32, 'tokens_per_step': 32768, 'warmup_steps': 915, 'max_steps': 6103, '_wandb': {}}
2025-11-01 05:07:18,928 INFO MainThread:290 [wandb_init.py:init():871] starting backend
2025-11-01 05:07:19,390 INFO MainThread:290 [wandb_init.py:init():874] sending inform_init request
2025-11-01 05:07:19,399 INFO MainThread:290 [wandb_init.py:init():882] backend started and connected
2025-11-01 05:07:19,401 INFO MainThread:290 [wandb_init.py:init():953] updated telemetry
2025-11-01 05:07:19,405 INFO MainThread:290 [wandb_init.py:init():977] communicating run to backend with 90.0 second timeout
2025-11-01 05:07:19,840 INFO MainThread:290 [wandb_init.py:init():1029] starting run threads in backend
2025-11-01 05:07:19,956 INFO MainThread:290 [wandb_run.py:_console_start():2494] atexit reg
2025-11-01 05:07:19,956 INFO MainThread:290 [wandb_run.py:_redirect():2342] redirect: wrap_raw
2025-11-01 05:07:19,956 INFO MainThread:290 [wandb_run.py:_redirect():2411] Wrapping output streams.
2025-11-01 05:07:19,956 INFO MainThread:290 [wandb_run.py:_redirect():2434] Redirects installed.
2025-11-01 05:07:19,964 INFO MainThread:290 [wandb_init.py:init():1075] run started, returning control to user process
2025-11-01 06:00:07,597 INFO MainThread:290 [wandb_run.py:_finish():2260] finishing run eoin/toy-transformer-replication/x0he2mby
2025-11-01 06:00:07,598 INFO MainThread:290 [wandb_run.py:_atexit_cleanup():2459] got exitcode: 0
2025-11-01 06:00:07,599 INFO MainThread:290 [wandb_run.py:_restore():2441] restore
2025-11-01 06:00:07,599 INFO MainThread:290 [wandb_run.py:_restore():2447] restore done
2025-11-01 06:00:13,066 INFO MainThread:290 [wandb_run.py:_footer_history_summary_info():3895] rendering history
2025-11-01 06:00:13,067 INFO MainThread:290 [wandb_run.py:_footer_history_summary_info():3927] rendering summary
2025-11-01 06:00:13,067 INFO MainThread:290 [wandb_run.py:_footer_sync_info():3856] logging synced files
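The logged config includes derived fields (tokens_per_step, warmup_steps, max_steps) alongside the base settings. A minimal sketch of how those values relate, assuming tokens_per_step = batch_size * n_ctx and floor division for the step counts (which matches the numbers logged above; the exact formula used by the training script is not shown in this log):

```python
# Base settings copied from the logged config above.
batch_size = 32
n_ctx = 1024
warmup_tokens = 30_000_000
max_tokens = 200_000_000

# Assumed derivations; consistent with the logged derived fields.
tokens_per_step = batch_size * n_ctx            # 32 * 1024 = 32768
warmup_steps = warmup_tokens // tokens_per_step # 30_000_000 // 32768 = 915
max_steps = max_tokens // tokens_per_step       # 200_000_000 // 32768 = 6103

print(tokens_per_step, warmup_steps, max_steps)  # 32768 915 6103
```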