W1229 07:40:29.157000 1920660 site-packages/torch/distributed/run.py:766] *****************************************
W1229 07:40:29.157000 1920660 site-packages/torch/distributed/run.py:766] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
W1229 07:40:29.157000 1920660 site-packages/torch/distributed/run.py:766] *****************************************
NCCL version 2.26.2+cuda12.2
BraTS train set: 85 cases
BraTS val set: 12 cases
Loading SAM3 video model...
INFO 2025-12-29 07:40:42,136 1920758 sam3_video_base.py: 124: setting max_num_objects=10000 and num_obj_for_compile=16
Applying LoRA to detector: rank=8, alpha=16.0, targets=['q_proj', 'k_proj', 'v_proj', 'out_proj', 'qkv', 'proj']
Applied LoRA to 159 layers
Params total=853,537,202 trainable=5,225,340 ratio=0.6122%
Output dir: /root/githubs/sam3/checkpoints_4class
World size: 8
Epoch 0: 0%| | 0/10 [00:00
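The warning at the top is torchrun's default behavior: each worker process gets OMP_NUM_THREADS=1 unless the variable is set explicitly (e.g. OMP_NUM_THREADS=4 torchrun ...). The eight worker PIDs (1920753-1920760) and "World size: 8" indicate a single-node launch with 8 processes, presumably something like `torchrun --nproc_per_node=8 train.py` (the script name is an assumption). A minimal sketch of the per-worker setup under that assumption, relying only on the rendezvous environment variables torchrun is documented to export:

import os
import torch
import torch.distributed as dist

def setup_distributed():
    # torchrun exports RANK, LOCAL_RANK, WORLD_SIZE, MASTER_ADDR and
    # MASTER_PORT, so the default env:// rendezvous needs no arguments.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    if dist.get_rank() == 0:
        # Guarding prints behind rank 0 keeps log lines from appearing
        # once per worker; unguarded prints (e.g. the dataset-size
        # messages) otherwise show up eight times in an 8-process run.
        print(f"World size: {dist.get_world_size()}")
    return local_rank

The "NCCL version 2.26.2+cuda12.2" line is emitted when the first NCCL communicator is created, confirming the processes rendezvoused successfully.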
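The LoRA step freezes the 853.5M-parameter model and trains only low-rank adapters on the attention projection layers, leaving 5,225,340 trainable parameters (5,225,340 / 853,537,202 = 0.6122%, matching the log line). A minimal sketch of this pattern, not the repository's actual implementation; the LoRALinear wrapper and apply_lora helper below are illustrative names:

import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen nn.Linear plus a trainable low-rank update, scaled by alpha/rank."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)            # keep the pretrained weight frozen
        self.scale = alpha / rank
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.kaiming_uniform_(self.lora_a.weight, a=5 ** 0.5)
        nn.init.zeros_(self.lora_b.weight)     # update starts at zero, so outputs are initially unchanged

    def forward(self, x):
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))

def apply_lora(model: nn.Module, targets, rank=8, alpha=16.0):
    # Freeze everything first; the LoRA layers created below are then
    # the only trainable parameters left in the model.
    for p in model.parameters():
        p.requires_grad_(False)
    n = 0
    for module in list(model.modules()):
        for name, child in list(module.named_children()):
            if isinstance(child, nn.Linear) and name in targets:
                setattr(module, name, LoRALinear(child, rank, alpha))
                n += 1
    print(f"Applied LoRA to {n} layers")
    return model

def report_params(model: nn.Module):
    total = sum(p.numel() for p in model.parameters())
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"Params total={total:,} trainable={trainable:,} ratio={100 * trainable / total:.4f}%")

# Hypothetical usage matching the log (detector construction not shown):
# model = apply_lora(model, targets=['q_proj', 'k_proj', 'v_proj', 'out_proj', 'qkv', 'proj'])
# report_params(model)

Exact-name matching on child modules ('proj' vs. 'out_proj') avoids wrapping the same layer twice, and zero-initializing lora_b means the adapted model behaves identically to the frozen base at step 0.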