W1229 03:54:01.696000 1740518 site-packages/torch/distributed/run.py:766]
W1229 03:54:01.696000 1740518 site-packages/torch/distributed/run.py:766] *****************************************
W1229 03:54:01.696000 1740518 site-packages/torch/distributed/run.py:766] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
W1229 03:54:01.696000 1740518 site-packages/torch/distributed/run.py:766] *****************************************
BraTS train set: 85 cases
BraTS val set: 12 cases
Loading SAM3 video model...
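The torchrun warning above can be silenced by choosing the OpenMP thread count explicitly instead of accepting the default of 1. A minimal launch fragment, assuming a hypothetical entry point `train.py` and the 8-process-per-node setup seen in this log:

```shell
# Pick a thread count deliberately, e.g. physical cores / processes per node.
export OMP_NUM_THREADS=4
torchrun --nproc_per_node=8 train.py
```

Setting the variable before launch propagates it to every spawned rank, so the per-process default no longer applies.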
INFO 2025-12-29 03:54:14,274 1740608 sam3_video_base.py: 124: setting max_num_objects=10000 and num_obj_for_compile=16
Applying LoRA to detector: rank=8, alpha=16.0, targets=['q_proj', 'k_proj', 'v_proj', 'out_proj', 'qkv', 'proj']
Applied LoRA to 159 layers
Params total=853,537,103 trainable=5,225,241 ratio=0.6122%
Output dir: /root/githubs/sam3/checkpoints
World size: 8
Epoch 0: 0%| | 0/10 [00:00
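The trainable ratio reported in the params line follows directly from the two counts it prints. A minimal sketch reproducing that figure (the helper name is illustrative, not from the SAM3 codebase):

```python
def trainable_ratio(total: int, trainable: int) -> str:
    """Percentage of model parameters left trainable after LoRA wrapping."""
    return f"{100.0 * trainable / total:.4f}%"

# Counts taken from the log: Params total=853,537,103 trainable=5,225,241
print(trainable_ratio(853_537_103, 5_225_241))  # → 0.6122%
```

With rank-8 adapters on 159 attention projection layers, well under 1% of the 853M parameters receive gradients, which is what makes LoRA fine-tuning fit alongside the frozen base model.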