Columns: timestamp (string, 2025-09-23 20:41:09 to 2025-09-23 23:23:12) | end_timestamp (string, 2025-09-23 20:41:12 to 2025-09-23 23:32:20) | stage_name (string, 1 class) | stage_number (int64, always 1) | level (string, 1 class) | message (string, 1 class) | stdout_content (string, 448 to 562k chars) | stderr_content (string, 2.48k to 8.16k chars) | experiment_name (string, 1 class) | elapsed_time_seconds (float64, 3.05 to 570) | stage_complete (bool)
|---|---|---|---|---|---|---|---|---|---|---|
2025-09-23T20:41:09.172841 | 2025-09-23T20:41:12.226356 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[ERROR] LLaMAFactory stage 'sft' failed: [Errno 13] Permission denied: '/scratch/10416/zaynesprague/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/data/dataset_info.json'
[ERROR] Stage error: PermissionError: [Errno 13] Permission denied: '/scratch/10416/zaynesprague/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/data/dataset_info.json'
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 39.60ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:01<00:00, 7.40MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:02<00:00, 2.07s/ shards]
| BASELINE_r1_distillation | 3.053515 | true |
2025-09-23T20:46:34.883811 | 2025-09-23T20:46:45.960537 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Registered dataset: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train -> TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data (format: sharegpt)
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 5 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 6)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load cuda/12.4 nccl/12.4 nvidia_math/12.4
source /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh;conda activate verl2
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
Due to MODULEPATH changes, the following have been reloaded:
1) openmpi/5.0.5
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.4
/scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/run_training.sh: line 7: /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh: Permission denied
EnvironmentNameNotFound: Could not find conda environment: verl2
You can list all discoverable environments with `conda info --envs`.
Python environment check: /home1/10286/georgetsoukalas/miniconda3/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/bin/torchrun
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-101: task 0: Exited with exit code 1
[... identical ImportError traceback (cuptiActivityEnableDriverApi) repeated by tasks 1-3 ...]
srun: error: c619-102: task 1: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
srun: error: c619-111: task 2: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 43.94ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 24.0MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.12s/ shards]
README.md: 100%|██████████| 391/391 [00:00<00:00, 5.90MB/s]
README.md: 1.25kB [00:00, 11.1MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 4.42k/4.42k [00:00<00:00, 12.9kB/s]
Generating train split: 100%|██████████| 5/5 [00:00<00:00, 1241.43 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 3026.19ba/s]
Uploading...: 100%|██████████| 5.86k/5.86k [00:00<00:00, 7.07kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.07s/ shards]
| BASELINE_r1_distillation | 11.076726 | true |
2025-09-23T20:53:13.441740 | 2025-09-23T20:53:19.987384 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 9 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 10)
[DEBUG] Training Script: identical to the script logged in the previous entry.
Starting training with real-time output...
================================================================================
Due to MODULEPATH changes, the following have been reloaded:
1) openmpi/5.0.5
The following have been reloaded with a version change:
1) cuda/12.6 => cuda/12.4
/scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/run_training.sh: line 7: /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh: Permission denied
EnvironmentNameNotFound: Could not find conda environment: verl2
You can list all discoverable environments with `conda info --envs`.
Python environment check: /home1/10286/georgetsoukalas/miniconda3/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/bin/torchrun
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
[... identical ImportError traceback (cuptiActivityEnableDriverApi) repeated by the remaining tasks ...]
srun: error: c619-101: task 0: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
srun: error: c619-111: task 2: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 44.16ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 24.4MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.02s/ shards]
README.md: 1.25kB [00:00, 13.4MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 5.92k/5.92k [00:00<00:00, 18.2kB/s]
Generating train split: 100%|██████████| 9/9 [00:00<00:00, 2517.76 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 3315.66ba/s]
Uploading...: 100%|██████████| 6.05k/6.05k [00:00<00:00, 7.31kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.10s/ shards]
| BASELINE_r1_distillation | 6.545644 | true |
2025-09-23T20:59:18.023433 | 2025-09-23T20:59:24.260352 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 13 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 14)
[DEBUG] Training Script: identical to the script logged in the previous entry.
Starting training with real-time output...
================================================================================
Due to MODULEPATH changes, the following have been reloaded:
1) openmpi/5.0.5
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.4
/scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/run_training.sh: line 7: /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh: Permission denied
EnvironmentNameNotFound: Could not find conda environment: verl2
You can list all discoverable environments with `conda info --envs`.
Python environment check: /home1/10286/georgetsoukalas/miniconda3/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/bin/torchrun
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-101: task 0: Exited with exit code 1
[... identical ImportError traceback (cuptiActivityEnableDriverApi) repeated by the remaining tasks ...]
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 44.17ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 24.2MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.15 shards/s]
README.md: 1.26kB [00:00, 10.6MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 6.12k/6.12k [00:00<00:00, 18.8kB/s]
Generating train split: 100%|██████████| 13/13 [00:00<00:00, 3006.84 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2508.56ba/s]
Uploading...: 100%|██████████| 6.25k/6.25k [00:00<00:00, 7.55kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.14s/ shards]
| BASELINE_r1_distillation | 6.236919 | true |
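The first record above aborts with `PermissionError: [Errno 13]` while writing `dataset_info.json` inside the shared `thirdparty/LLaMA-Factory` checkout. A minimal sketch of a guard against that failure mode, assuming a registry-append workflow (the function name and fallback location are illustrative, not part of the pipeline):

```python
import json
import os
import tempfile


def register_dataset(dataset_info_path: str, name: str, entry: dict) -> str:
    """Append a dataset entry to dataset_info.json, falling back to a
    user-writable copy when the shared checkout is read-only (the
    PermissionError seen in the log above). Returns the path written."""
    target = dataset_info_path
    if not os.access(os.path.dirname(target) or ".", os.W_OK):
        # Shared checkout is not writable: keep the registry in a
        # scratch directory we own instead of failing the stage.
        fallback_dir = os.path.join(tempfile.gettempdir(), "llamafactory_data")
        os.makedirs(fallback_dir, exist_ok=True)
        target = os.path.join(fallback_dir, "dataset_info.json")
    info = {}
    if os.path.exists(target):
        with open(target) as f:
            info = json.load(f)
    info[name] = entry
    with open(target, "w") as f:
        json.dump(info, f, indent=2)
    return target
```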
2025-09-23T21:28:32.347147 | 2025-09-23T21:28:38.508036 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 17 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 18)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load cuda/12.4 nccl/12.4 nvidia_math/12.4
source /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh;conda activate verl2
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "π Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "π Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "π Working directory: $(pwd)"
echo "π Python path: $(which python)"
echo "π₯ Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.4
/scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/run_training.sh: line 7: /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh: Permission denied
EnvironmentNameNotFound: Could not find conda environment: verl2
You can list all discoverable environments with `conda info --envs`.
Python environment check: /home1/10286/georgetsoukalas/miniconda3/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/bin/torchrun
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-101: task 0: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-102: task 1: Exited with exit code 1
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 43.99ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 24.0MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.16 shards/s]
README.md: 1.26kB [00:00, 10.6MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 6.33k/6.33k [00:00<00:00, 19.6kB/s]
Generating train split: 100%|██████████| 17/17 [00:00<00:00, 3979.19 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2641.25ba/s]
Uploading...: 100%|██████████| 6.46k/6.46k [00:00<00:00, 7.76kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.05s/ shards]
| BASELINE_r1_distillation | 6.160889 | true |
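The repeated `ImportError: ... undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12` in the records above indicates that the `libcupti` resolved at runtime (here supplied by the loaded `cuda/12.4` module) is older than the one this torch build was linked against. A minimal sketch for checking whether a given shared library actually exports the symbol torch wants; the libc paths used in testing assume a Linux/glibc host:

```python
import ctypes


def has_symbol(libpath: str, symbol: str) -> bool:
    """Return True if the shared library at `libpath` exports `symbol`.

    Useful for verifying that the libcupti picked up from the CUDA
    module stack provides cuptiActivityEnableDriverApi before torch
    fails to import with an undefined-symbol error, as seen above.
    """
    try:
        lib = ctypes.CDLL(libpath)
    except OSError:
        # Library missing or unloadable: it certainly can't supply the symbol.
        return False
    # Attribute lookup on a CDLL performs a dlsym(); missing symbols
    # raise AttributeError, which hasattr converts to False.
    return hasattr(lib, symbol)
```

For example, `has_symbol("/path/to/libcupti.so.12", "cuptiActivityEnableDriverApi")` returning False would point at a CUDA module that is too old for the installed torch wheel.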
2025-09-23T21:30:15.929135 | 2025-09-23T21:30:22.369206 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 21 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 22)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load cuda/12.4 nccl/12.4 nvidia_math/12.4
source /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh;conda activate verl2
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "π Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "π Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "π Working directory: $(pwd)"
echo "π Python path: $(which python)"
echo "π₯ Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.4
/scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/run_training.sh: line 7: /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh: Permission denied
EnvironmentNameNotFound: Could not find conda environment: verl2
You can list all discoverable environments with `conda info --envs`.
Python environment check: /home1/10286/georgetsoukalas/miniconda3/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/bin/torchrun
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-111: task 2: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-102: task 1: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-101: task 0: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 43.94ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 24.1MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.23 shards/s]
README.md: 1.26kB [00:00, 10.8MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 6.53k/6.53k [00:00<00:00, 19.1kB/s]
Generating train split: 100%|██████████| 21/21 [00:00<00:00, 5344.68 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2770.35ba/s]
Uploading...: 100%|██████████| 6.66k/6.66k [00:00<00:00, 8.01kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.04s/ shards]
| BASELINE_r1_distillation | 6.440071 | true |
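The record that follows fails with `can't open file '.../LLaMA-Factory/src/train.py': [Errno 2] No such file or directory` on every rank. A minimal pre-launch guard, assuming that newer LLaMA-Factory checkouts may ship a different entrypoint layout (the candidate paths below are assumptions, not a definitive layout):

```python
import os


def resolve_train_entrypoint(repo_root: str) -> str:
    """Return an existing LLaMA-Factory training entrypoint, or raise
    before srun fans the missing path out to every node (the Errno 2
    failure seen below). Candidate locations are illustrative."""
    candidates = [
        os.path.join(repo_root, "src", "train.py"),
        os.path.join(repo_root, "src", "llamafactory", "launcher.py"),
    ]
    for candidate in candidates:
        if os.path.isfile(candidate):
            return candidate
    raise FileNotFoundError(
        f"No training entrypoint found under {repo_root}; tried {candidates}"
    )
```

Checking this once on the submit node turns a four-node torchrun failure into a single clear error message.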
2025-09-23T21:32:03.212178 | 2025-09-23T21:32:13.302628 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 25 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 26)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/14 cuda/12.8 nccl/12.4 nvidia_math/12.4
source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "π Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "π Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "π Working directory: $(pwd)"
echo "π Python path: $(which python)"
echo "π₯ Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) gcc/13.2.0 => gcc/14.2.0
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python: can't open file '/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py': [Errno 2] No such file or directory
/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python: can't open file '/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py': [Errno 2] No such file or directory
/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python: can't open file '/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py': [Errno 2] No such file or directory
/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python: can't open file '/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py': [Errno 2] No such file or directory
E0923 21:32:13.045000 816862 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 2) local_rank: 0 (pid: 816867) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 21:32:13.046000 3667132 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 2) local_rank: 0 (pid: 3667135) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 21:32:13.045000 917863 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 2) local_rank: 0 (pid: 917866) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 21:32:13.045000 3461454 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 2) local_rank: 0 (pid: 3461457) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
return f(*args, **kwargs)
Traceback (most recent call last):
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
return f(*args, **kwargs)
main()
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
return launch_agent(self._config, self._entrypoint, list(args))
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
raise ChildFailedError(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_21:32:13
host : c619-101.vista.tacc.utexas.edu
rank : 0 (local_rank: 0)
exitcode : 2 (pid: 816867)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
run(args)
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_21:32:13
host : c619-102.vista.tacc.utexas.edu
rank : 1 (local_rank: 0)
exitcode : 2 (pid: 3667135)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
    return launch_agent(self._config, self._entrypoint, list(args))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_21:32:13
host : c619-112.vista.tacc.utexas.edu
rank : 3 (local_rank: 0)
exitcode : 2 (pid: 917866)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_21:32:13
host : c619-111.vista.tacc.utexas.edu
rank : 2 (local_rank: 0)
exitcode : 2 (pid: 3461457)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
srun: error: c619-101: task 0: Exited with exit code 1
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
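Every rank above reports `error_file: <N/A>`, so the child's traceback is lost and only exit code 2 survives. A minimal sketch of capturing it, assuming the torchrun error-file behavior documented at the pytorch.org elastic errors link the log itself prints (the path below is a placeholder, and the entrypoint must be wrapped in torch.distributed.elastic's `@record` decorator for the dump to happen):

```shell
# Hedged sketch: with TORCHELASTIC_ERROR_FILE exported before the srun
# launch, a failing rank whose entrypoint uses the @record decorator
# writes its traceback to this JSON file instead of "error_file: <N/A>".
# The path is a placeholder, not one taken from the log.
export TORCHELASTIC_ERROR_FILE="/tmp/torchelastic_rank_error.json"
echo "error file target: $TORCHELASTIC_ERROR_FILE"
```

With that in place, the "Root Cause" block points at a real file rather than asking you to re-run with tracebacks enabled.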
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 43.94ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 24.0MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.22 shards/s]
README.md: 1.26kB [00:00, 1.67MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 6.74k/6.74k [00:00<00:00, 20.3kB/s]
Generating train split: 100%|██████████| 25/25 [00:00<00:00, 5622.39 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2744.96ba/s]
Uploading...: 100%|██████████| 6.92k/6.92k [00:00<00:00, 8.29kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.03s/ shards]
| BASELINE_r1_distillation | 10.09045 | true |
2025-09-23T21:34:45.027978 | 2025-09-23T21:34:54.235864 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 29 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 30)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/14 cuda/12.8 nccl/12.4 nvidia_math/12.4
source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) gcc/13.2.0 => gcc/14.2.0
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python: can't open file '/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py': [Errno 2] No such file or directory
/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python: can't open file '/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py': [Errno 2] No such file or directory
/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python: can't open file '/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py': [Errno 2] No such file or directory
/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python: can't open file '/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py': [Errno 2] No such file or directory
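Exit code 2 here is just Python failing to open the entrypoint: `src/train.py` does not exist at that path on the compute nodes, so every rank dies identically. A hedged pre-flight sketch (the path below is a stand-in, not the real scratch path) that fails once with a clear message before `srun` fans the error out across four nodes:

```shell
# Hedged sketch: verify the torchrun entrypoint exists before launching,
# so a bad checkout path fails fast instead of once per rank.
# TRAIN_PY is a placeholder standing in for the real train.py path.
TRAIN_PY="/tmp/sf_demo/src/train.py"
mkdir -p "$(dirname "$TRAIN_PY")" && touch "$TRAIN_PY"   # stand-in checkout
if [ ! -f "$TRAIN_PY" ]; then
    echo "missing entrypoint: $TRAIN_PY" >&2
    exit 2
fi
echo "entrypoint ok: $TRAIN_PY"
```

A check like this belongs at the top of the generated `run_training.sh`, right after the `cd` into the LLaMA-Factory checkout.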
E0923 21:34:54.016000 3667164 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 2) local_rank: 0 (pid: 3667167) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 21:34:54.016000 818564 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 2) local_rank: 0 (pid: 818569) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 21:34:54.016000 3461485 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 2) local_rank: 0 (pid: 3461488) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 21:34:54.015000 917894 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 2) local_rank: 0 (pid: 917897) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
    main()
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
    run(args)
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
    elastic_launch(
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_21:34:54
host : c619-102.vista.tacc.utexas.edu
rank : 1 (local_rank: 0)
exitcode : 2 (pid: 3667167)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
    main()
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
    run(args)
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
    elastic_launch(
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_21:34:54
host : c619-101.vista.tacc.utexas.edu
rank : 0 (local_rank: 0)
exitcode : 2 (pid: 818569)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
    elastic_launch(
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_21:34:54
host : c619-111.vista.tacc.utexas.edu
rank : 2 (local_rank: 0)
exitcode : 2 (pid: 3461488)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skill_factory_dir/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_21:34:54
host : c619-112.vista.tacc.utexas.edu
rank : 3 (local_rank: 0)
exitcode : 2 (pid: 917897)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
srun: error: c619-101: task 0: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 43.83ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 23.9MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.14 shards/s]
README.md: 1.26kB [00:00, 10.6MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 6.99k/6.99k [00:00<00:00, 20.4kB/s]
Generating train split: 100%|██████████| 29/29 [00:00<00:00, 3827.16 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2853.27ba/s]
Uploading...: 100%|██████████| 7.13k/7.13k [00:00<00:00, 8.55kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.12s/ shards]
| BASELINE_r1_distillation | 9.207886 | true |
2025-09-23T21:35:46.741998 | 2025-09-23T21:35:53.832806 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Registered dataset: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train -> TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data (format: sharegpt)
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 33 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 34)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load cuda/12.4 nccl/12.4 nvidia_math/12.4
source /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh;conda activate verl2
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.4
/scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/run_training.sh: line 7: /work/10416/zaynesprague/vista/../anaconda3/etc/profile.d/conda.sh: Permission denied
EnvironmentNameNotFound: Could not find conda environment: verl2
You can list all discoverable environments with `conda info --envs`.
Python environment check: /home1/10286/georgetsoukalas/miniconda3/bin/python
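The `conda.sh` under another user's `/work` tree is unreadable and the `verl2` env does not exist on this account, so activation fails and the script silently falls through to the base miniconda interpreter seen in the check above. A hedged sketch of catching that before activation (the conda root here is a placeholder, not the real install from the log):

```shell
# Hedged sketch: a conda env named NAME normally lives under
# $CONDA_ROOT/envs/NAME. Checking for that directory catches both a
# wrong conda root and a missing env before any `conda activate`,
# instead of discovering it later via the wrong `which python`.
# CONDA_ROOT is a placeholder created here just for illustration.
CONDA_ROOT="/tmp/demo_miniconda"
ENV_NAME="verl2"
mkdir -p "$CONDA_ROOT/envs/$ENV_NAME"   # stand-in for a real install
if [ -d "$CONDA_ROOT/envs/$ENV_NAME" ]; then
    echo "env present: $ENV_NAME"
else
    echo "env missing: $ENV_NAME (try 'conda info --envs')" >&2
    exit 1
fi
```

Failing here would have stopped the run before torchrun ever imported the wrong torch build.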
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/bin/torchrun
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-101: task 0: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-112: task 3: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-111: task 2: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
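The `ImportError` above points at a CUDA toolkit mismatch rather than a broken torch install: the module system reloaded `cuda/12.8 => cuda/12.4`, and the 12.4 `libcupti` predates the `cuptiActivityEnableDriverApi` symbol this torch build links against. A hedged guard sketch, with both versions hard-coded from the log (in a real launch script the loaded version would come from `module list` or `nvcc --version` output):

```shell
# Hedged sketch: refuse to launch when the loaded CUDA module is older
# than the toolkit torch was built for. Versions are taken from the log;
# LOADED is hard-coded here only so the sketch is self-contained.
EXPECTED="12.8"
LOADED="12.4"
if [ "$LOADED" != "$EXPECTED" ]; then
    MISMATCH=1
    echo "CUDA module mismatch: loaded $LOADED, torch expects $EXPECTED"
fi
```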
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 44.33ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 23.9MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.22 shards/s]
README.md: 1.26kB [00:00, 29.5kB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 7.21k/7.21k [00:00<00:00, 18.8kB/s]
Generating train split: 100%|██████████| 33/33 [00:00<00:00, 9715.85 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2861.05ba/s]
Uploading...: 100%|██████████| 7.37k/7.37k [00:00<00:00, 8.82kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.08s/ shards]
| BASELINE_r1_distillation | 7.090808 | true |
2025-09-23T21:39:17.461189 | 2025-09-23T21:39:43.626183 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 40 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 41)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/14 cuda/12.8 nccl/12.4 nvidia_math/12.4
module load gcc/14 cuda/12.8 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPU per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) gcc/13.2.0 => gcc/14.2.0
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPU per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
[WARNING|2025-09-23 21:39:32] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
Traceback (most recent call last):
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
main()
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
run_exp()
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
_training_function(config={"args": args, "callbacks": callbacks})
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 55, in _training_function
model_args, data_args, training_args, finetuning_args, generating_args = get_train_args(args)
^^^^^^^^^^^^^^^^^^^^
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/hparams/parser.py", line 208, in get_train_args
model_args, data_args, training_args, finetuning_args, generating_args = _parse_train_args(args)
^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/hparams/parser.py", line 186, in _parse_train_args
return _parse_args(parser, args, allow_extra_keys=allow_extra_keys)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/hparams/parser.py", line 80, in _parse_args
return parser.parse_dict(args, allow_extra_keys=allow_extra_keys)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/hf_argparser.py", line 393, in parse_dict
obj = dtype(**inputs)
^^^^^^^^^^^^^^^
File "<string>", line 143, in __init__
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/hparams/training_args.py", line 81, in __post_init__
Seq2SeqTrainingArguments.__post_init__(self)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 1738, in __post_init__
self.device
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2268, in device
return self._setup_devices
^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/utils/generic.py", line 67, in __get__
cached = self.fget(obj)
^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2195, in _setup_devices
self.distributed_state = PartialState(**accelerator_state_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/accelerate/state.py", line 208, in __init__
raise ImportError(
ImportError: DeepSpeed is not available => install it using `pip3 install deepspeed` or build it from source
[identical ImportError traceback repeated on the 3 remaining ranks]
E0923 21:39:42.922000 823835 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 823840) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_21:39:42
host : c619-101.vista.tacc.utexas.edu
rank : 0 (local_rank: 0)
exitcode : 1 (pid: 823840)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
W0923 21:39:43.130000 3461653 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:900] Sending process 3461656 closing signal SIGTERM
srun: error: c619-101: task 0: Exited with exit code 1
[W923 21:39:43.326240028 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-111.vista.tacc.utexas.edu]:41562, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x400073683ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x400043782950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x4000437859dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x400043787ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x400043789178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x40004378970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x40004378adec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x40003d870904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x40003cea59c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x40003bedc79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x40003bedc86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 21:39:43.202000 3461653 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-111.vista.tacc.utexas.edu_3461653_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 21:39:43.349798853 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-111.vista.tacc.utexas.edu]:41562, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
[identical sendBytes exception and frame stack as above]
W0923 21:39:43.224000 3461653 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-111.vista.tacc.utexas.edu_3461653_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
Traceback (most recent call last):
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 117, in _call_store
return getattr(self._store, store_op)(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
torch.distributed.DistNetworkError: failed to recv, got 0 bytes
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 261, in launch_agent
result = agent.run()
^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/metrics/api.py", line 138, in wrapper
result = f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 711, in run
W0923 21:39:43.227000 3667333 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:900] Sending process 3667336 closing signal SIGTERM
result = self._invoke_run(role)
^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 906, in _invoke_run
num_nodes_waiting = rdzv_handler.num_nodes_waiting()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 1263, in num_nodes_waiting
self._state_holder.sync()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 437, in sync
get_response = self._backend.get_state()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 75, in get_state
base64_state: bytes = self._call_store("get", self._key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 119, in _call_store
raise RendezvousConnectionError(
torch.distributed.elastic.rendezvous.api.RendezvousConnectionError: The connection to the C10d store has failed. See inner exception for details.
[W923 21:39:43.423454071 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-102.vista.tacc.utexas.edu]:42448, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40003f1e3ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40000f2e2950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x40000f2e59dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x40000f2e7ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x40000f2e9178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x40000f2e970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x40000f2eadec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x4000093d0904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x400008a059c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x400007a3c79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x400007a3c86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 21:39:43.299000 3667333 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-102.vista.tacc.utexas.edu_3667333_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 21:39:43.433028710 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-102.vista.tacc.utexas.edu]:42448, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40003f1e3ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40000f2e2950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x40000f2e59dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x40000f2e7ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x40000f2e9178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x40000f2e970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x40000f2eadec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x4000093d0904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x400008a059c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x400007a3c79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x400007a3c86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 21:39:43.308000 3667333 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-102.vista.tacc.utexas.edu_3667333_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
Traceback (most recent call last):
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 117, in _call_store
return getattr(self._store, store_op)(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
torch.distributed.DistNetworkError: failed to recv, got 0 bytes
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 261, in launch_agent
result = agent.run()
^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/metrics/api.py", line 138, in wrapper
result = f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 711, in run
result = self._invoke_run(role)
^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 906, in _invoke_run
num_nodes_waiting = rdzv_handler.num_nodes_waiting()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 1263, in num_nodes_waiting
self._state_holder.sync()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 437, in sync
get_response = self._backend.get_state()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 75, in get_state
base64_state: bytes = self._call_store("get", self._key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 119, in _call_store
raise RendezvousConnectionError(
torch.distributed.elastic.rendezvous.api.RendezvousConnectionError: The connection to the C10d store has failed. See inner exception for details.
W0923 21:39:43.328000 918065 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:900] Sending process 918068 closing signal SIGTERM
[W923 21:39:43.614421209 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-112.vista.tacc.utexas.edu]:44054, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x400047483ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x400017582950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x4000175859dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x400017587ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x400017589178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x40001758970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x40001758adec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x400011670904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x400010ca59c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x40000fcdc79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x40000fcdc86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 21:39:43.400000 918065 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-112.vista.tacc.utexas.edu_918065_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 21:39:43.624399707 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-112.vista.tacc.utexas.edu]:44054, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x400047483ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x400017582950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x4000175859dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x400017587ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x400017589178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x40001758970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x40001758adec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x400011670904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x400010ca59c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x40000fcdc79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x40000fcdc86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 21:39:43.409000 918065 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-112.vista.tacc.utexas.edu_918065_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
Traceback (most recent call last):
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 117, in _call_store
return getattr(self._store, store_op)(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
torch.distributed.DistNetworkError: failed to recv, got 0 bytes
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 261, in launch_agent
result = agent.run()
^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/metrics/api.py", line 138, in wrapper
result = f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 711, in run
result = self._invoke_run(role)
^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 906, in _invoke_run
num_nodes_waiting = rdzv_handler.num_nodes_waiting()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 1263, in num_nodes_waiting
self._state_holder.sync()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 437, in sync
get_response = self._backend.get_state()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 75, in get_state
base64_state: bytes = self._call_store("get", self._key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 119, in _call_store
raise RendezvousConnectionError(
torch.distributed.elastic.rendezvous.api.RendezvousConnectionError: The connection to the C10d store has failed. See inner exception for details.
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
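Every worker node in the tracebacks above (c619-102, c619-111, c619-112) reports the same `Broken pipe` against one remote store, which points at the c10d rendezvous host on c619-101 rather than at the workers themselves. As a hypothetical triage aid (not part of this pipeline), a few lines of Python can pull that shared endpoint out of a captured log; the `find_rendezvous_endpoint` name and the `sample` line are illustrative:

```python
import re

def find_rendezvous_endpoint(log_text: str):
    """Extract the c10d store host/port from a '[c10d] sendBytes failed' line.

    Returns (host, port) for the first 'remote=[host]:port' match, or None.
    Useful for confirming that all workers failed against the same store.
    """
    m = re.search(r"remote=\[([^\]]+)\]:(\d+)", log_text)
    return (m.group(1), int(m.group(2))) if m else None

# Sample taken verbatim from the log above.
sample = ("[c10d] sendBytes failed on SocketImpl(fd=3, "
          "addr=[c619-102.vista.tacc.utexas.edu]:42448, "
          "remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe")
print(find_rendezvous_endpoint(sample))
# -> ('c619-101.vista.tacc.utexas.edu', 12802)
```

When every node resolves to the same remote endpoint, the next step is usually to check whether the rank-0 host (here c619-101) crashed or closed its TCPStore before the other agents finished rendezvous.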
|
Creating parquet from Arrow format: 100%| 4/4 [00:00<00:00, 44.43ba/s]
Uploading...: 100%| 12.9M/12.9M [00:00<00:00, 22.6MB/s]
Uploading the dataset shards: 100%| 1/1 [00:00<00:00, 1.16 shards/s]
README.md: 1.26kB [00:00, 10.9MB/s]
metadata/train-00000-of-00001.parquet: 100%| 7.67k/7.67k [00:00<00:00, 23.0kB/s]
Generating train split: 100%| 40/40 [00:00<00:00, 11069.69 examples/s]
Creating parquet from Arrow format: 100%| 1/1 [00:00<00:00, 2613.27ba/s]
Uploading...: 100%| 7.82k/7.82k [00:00<00:00, 9.35kB/s]
Uploading the dataset shards: 100%| 1/1 [00:01<00:00, 1.04s/ shards]
| BASELINE_r1_distillation | 26.164994 | true |
2025-09-23T22:17:02.398244 | 2025-09-23T22:17:31.119340 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 44 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 45)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/14 cuda/12.8 nccl/12.4 nvidia_math/12.4
module load gcc/14 cuda/12.8 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) gcc/13.2.0 => gcc/14.2.0
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
[WARNING|2025-09-23 22:17:19] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[WARNING|2025-09-23 22:17:19] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[WARNING|2025-09-23 22:17:19] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[WARNING|2025-09-23 22:17:20] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[rank0]: Traceback (most recent call last):
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank0]: main()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank0]: run_exp()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank0]: _training_function(config={"args": args, "callbacks": callbacks})
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 55, in _training_function
[rank0]: model_args, data_args, training_args, finetuning_args, generating_args = get_train_args(args)
[rank0]: ^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/hparams/parser.py", line 308, in get_train_args
[rank0]: _check_extra_dependencies(model_args, finetuning_args, training_args)
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/hparams/parser.py", line 175, in _check_extra_dependencies
[rank0]: check_version("deepspeed>=0.10.0,<=0.16.9", mandatory=True)
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/extras/misc.py", line 92, in check_version
[rank0]: require_version(requirement, hint)
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/utils/versions.py", line 111, in require_version
[rank0]: _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/utils/versions.py", line 44, in _compare_versions
[rank0]: raise ImportError(
[rank0]: ImportError: deepspeed>=0.10.0,<=0.16.9 is required for a normal functioning of this module, but found deepspeed==0.17.6.
[rank0]: To fix: run `pip install deepspeed>=0.10.0,<=0.16.9`.
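The root cause above is a version gate: LLaMA-Factory pins `deepspeed>=0.10.0,<=0.16.9`, and the environment ships deepspeed 0.17.6, so `check_version` raises ImportError before training starts. The same constraint check can be sketched with the `packaging` library (the helper name here is illustrative, not LLaMA-Factory's actual code):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

def satisfies(installed: str, spec: str) -> bool:
    """Return True if the installed version falls inside the spec range."""
    return Version(installed) in SpecifierSet(spec)

# The environment that failed above: 0.17.6 is newer than the upper pin.
print(satisfies("0.17.6", ">=0.10.0,<=0.16.9"))  # False -> would raise ImportError
# A version inside the pinned range passes the gate.
print(satisfies("0.16.9", ">=0.10.0,<=0.16.9"))  # True
```

Note that the suggested fix command needs quoting in a shell, since `>=` and `<=` would otherwise be parsed as redirections: `pip install "deepspeed>=0.10.0,<=0.16.9"`.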
[rank0]:[W923 22:17:29.780985357 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[rank2]:[W923 22:17:30.187248300 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[rank1]:[W923 22:17:30.211375232 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[rank3]:[W923 22:17:30.332637124 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
E0923 22:17:30.245000 826703 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 826708) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:17:30
host : c619-101.vista.tacc.utexas.edu
rank : 0 (local_rank: 0)
exitcode : 1 (pid: 826708)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
srun: error: c619-101: task 0: Exited with exit code 1
E0923 22:17:30.744000 3667974 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 3667978) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 22:17:30.748000 3462307 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 3462311) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 22:17:30.748000 918711 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 918714) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
[W923 22:17:30.883399844 TCPStore.cpp:115] [c10d] recvVector failed on SocketImpl(fd=3, addr=[c619-102.vista.tacc.utexas.edu]:33530, remote=[c619-101.vista.tacc.utexas.edu]:12802): failed to recv, got 0 bytes
Exception raised from recvBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:678 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40005c8b3ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40002c9b2950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc5724 (0x40002c9b5724 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc6048 (0x40002c9b6048 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: <unknown function> + 0x5dc7874 (0x40002c9b7874 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::compareSet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&) + 0x1bc (0x40002c9b88ac in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: <unknown function> + 0xf60d20 (0x400026aa0d20 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #7: <unknown function> + 0x5959c0 (0x4000260d59c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #9: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #10: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #11: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #12: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #15: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #16: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #20: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #21: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #23: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #24: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: <unknown function> + 0x2c79c (0x40002510c79c in /lib64/libc.so.6)
frame #26: __libc_start_main + 0x98 (0x40002510c86c in /lib64/libc.so.6)
frame #27: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:17:30.755000 918711 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-112.vista.tacc.utexas.edu_918711_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 22:17:30.886808468 TCPStore.cpp:115] [c10d] recvVector failed on SocketImpl(fd=3, addr=[c619-111.vista.tacc.utexas.edu]:50054, remote=[c619-101.vista.tacc.utexas.edu]:12802): failed to recv, got 0 bytes
Exception raised from recvBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:678 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40003ea73ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40000eb72950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc5724 (0x40000eb75724 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc6048 (0x40000eb76048 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: <unknown function> + 0x5dc7874 (0x40000eb77874 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::compareSet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&) + 0x1bc (0x40000eb788ac in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: <unknown function> + 0xf60d20 (0x400008c60d20 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #7: <unknown function> + 0x5959c0 (0x4000082959c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #9: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #10: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #11: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #12: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #15: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #16: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #20: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #21: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #23: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #24: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: <unknown function> + 0x2c79c (0x4000072cc79c in /lib64/libc.so.6)
frame #26: __libc_start_main + 0x98 (0x4000072cc86c in /lib64/libc.so.6)
frame #27: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:17:30.758000 3667974 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-102.vista.tacc.utexas.edu_3667974_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 22:17:30.978415375 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-112.vista.tacc.utexas.edu]:45612, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x400040a13ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x400010b12950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x400010b159dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x400010b17ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x400010b19178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x400010b1970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x400010b1adec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x40000ac00904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x40000a2359c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x40000926c79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x40000926c86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
[W923 22:17:30.890638324 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-102.vista.tacc.utexas.edu]:33530, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40005c8b3ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40002c9b2950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x40002c9b59dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x40002c9b7ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::compareSet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&) + 0x1ac (0x40002c9b889c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: <unknown function> + 0xf60d20 (0x400026aa0d20 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #6: <unknown function> + 0x5959c0 (0x4000260d59c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #7: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #8: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #10: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #12: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #14: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #16: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #18: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #19: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #20: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #21: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #22: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #24: <unknown function> + 0x2c79c (0x40002510c79c in /lib64/libc.so.6)
frame #25: __libc_start_main + 0x98 (0x40002510c86c in /lib64/libc.so.6)
frame #26: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:17:30.762000 3462307 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-111.vista.tacc.utexas.edu_3462307_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
W0923 22:17:30.763000 918711 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-112.vista.tacc.utexas.edu_918711_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 22:17:30.894117860 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-111.vista.tacc.utexas.edu]:50054, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40003ea73ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40000eb72950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x40000eb759dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x40000eb77ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::compareSet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&) + 0x1ac (0x40000eb7889c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: <unknown function> + 0xf60d20 (0x400008c60d20 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #6: <unknown function> + 0x5959c0 (0x4000082959c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #7: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #8: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #10: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #12: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #14: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #16: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #18: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #19: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #20: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #21: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #22: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #24: <unknown function> + 0x2c79c (0x4000072cc79c in /lib64/libc.so.6)
frame #25: __libc_start_main + 0x98 (0x4000072cc86c in /lib64/libc.so.6)
frame #26: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:17:30.765000 3667974 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-102.vista.tacc.utexas.edu_3667974_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 22:17:30.985906458 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-112.vista.tacc.utexas.edu]:45612, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x400040a13ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x400010b12950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x400010b159dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x400010b17ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x400010b19178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x400010b1970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x400010b1adec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x40000ac00904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x40000a2359c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x40000926c79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x40000926c86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
[W923 22:17:30.897172038 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-102.vista.tacc.utexas.edu]:33530, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40005c8b3ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40002c9b2950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x40002c9b59dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x40002c9b7ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::compareSet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&) + 0x1ac (0x40002c9b889c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: <unknown function> + 0xf60d20 (0x400026aa0d20 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #6: <unknown function> + 0x5959c0 (0x4000260d59c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #7: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #8: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #10: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #12: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #14: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #16: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #18: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #19: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #20: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #21: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #22: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #24: <unknown function> + 0x2c79c (0x40002510c79c in /lib64/libc.so.6)
frame #25: __libc_start_main + 0x98 (0x40002510c86c in /lib64/libc.so.6)
frame #26: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:17:30.768000 3462307 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-111.vista.tacc.utexas.edu_3462307_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
W0923 22:17:30.770000 918711 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-112.vista.tacc.utexas.edu_918711_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 22:17:30.900754393 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-111.vista.tacc.utexas.edu]:50054, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40003ea73ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40000eb72950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x40000eb759dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x40000eb77ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::compareSet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&, std::vector<unsigned char, std::allocator<unsigned char> > const&) + 0x1ac (0x40000eb7889c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: <unknown function> + 0xf60d20 (0x400008c60d20 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #6: <unknown function> + 0x5959c0 (0x4000082959c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #7: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #8: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #10: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #12: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #14: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #16: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #18: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #19: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #20: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #21: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #22: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #24: <unknown function> + 0x2c79c (0x4000072cc79c in /lib64/libc.so.6)
frame #25: __libc_start_main + 0x98 (0x4000072cc86c in /lib64/libc.so.6)
frame #26: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:17:30.771000 3667974 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-102.vista.tacc.utexas.edu_3667974_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:17:30
host : c619-112.vista.tacc.utexas.edu
rank : 3 (local_rank: 0)
exitcode : 1 (pid: 918714)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:17:30
host : c619-102.vista.tacc.utexas.edu
rank : 1 (local_rank: 0)
exitcode : 1 (pid: 3667978)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
W0923 22:17:30.775000 3462307 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-111.vista.tacc.utexas.edu_3462307_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:17:30
host : c619-111.vista.tacc.utexas.edu
rank : 2 (local_rank: 0)
exitcode : 1 (pid: 3462311)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 44.17ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 23.5MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.01 shards/s]
README.md: 1.26kB [00:00, 10.6MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 7.89k/7.89k [00:00<00:00, 23.3kB/s]
Generating train split: 100%|██████████| 44/44 [00:00<00:00, 11088.04 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2435.72ba/s]
Uploading...: 100%|██████████| 8.03k/8.03k [00:00<00:00, 9.52kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.11s/ shards]
| BASELINE_r1_distillation | 28.721096 | true |
2025-09-23T22:36:05.866854 | 2025-09-23T22:36:36.010003 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 48 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 49)
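The registration check above (and the PermissionError from the earlier failed run in this log) both touch LLaMA-Factory's `data/dataset_info.json`, a JSON registry mapping a dataset name to its source (for Hub-hosted data, an `hf_hub_url` entry). A minimal sketch of an idempotent, crash-safe registration helper — `register_dataset` is an illustrative name, not skill-factory's actual code:

```python
import json
import os
import tempfile

def register_dataset(info_path: str, name: str, hf_hub_url: str) -> bool:
    """Idempotently add a dataset entry to a dataset_info.json-style registry."""
    with open(info_path) as f:
        info = json.load(f)
    if name in info:
        return False  # existing registration found; reuse it
    info[name] = {"hf_hub_url": hf_hub_url}
    # Write to a temp file in the same directory, then atomically swap it in,
    # so a crash mid-write cannot leave a half-written registry behind.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(os.path.abspath(info_path)))
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(info, f, indent=2)
        os.replace(tmp, info_path)
    except BaseException:
        os.unlink(tmp)
        raise
    return True
```

Note that when the checkout is not writable — as in the earlier stage's `[Errno 13] Permission denied` — the temp-file creation fails with the same `PermissionError` before the original file is touched.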
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/14 cuda/12.8 nccl/12.4 nvidia_math/12.4
module load gcc/14 cuda/12.8 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPU per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
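In the script above, `scontrol show hostnames "$SLURM_JOB_NODELIST"` expands SLURM's compressed nodelist (e.g. `c619-[101-102,111-112]`) into one hostname per line, and the first hostname becomes the `MASTER_ADDR` rendezvous host for `torchrun`. A rough pure-Python equivalent for the common single-bracket form — `expand_nodelist` is an illustrative helper, not part of the pipeline:

```python
import re

def expand_nodelist(nodelist: str) -> list[str]:
    """Expand a compressed SLURM nodelist like 'c619-[101-102,111]' into hostnames."""
    m = re.fullmatch(r"([^\[]+)\[([^\]]+)\]", nodelist)
    if m is None:
        return [nodelist]  # already a plain hostname
    prefix, body = m.groups()
    hosts: list[str] = []
    for part in body.split(","):
        if "-" in part:
            lo, hi = part.split("-")
            width = len(lo)  # SLURM zero-pads numeric suffixes, e.g. c001-c010
            hosts.extend(f"{prefix}{i:0{width}d}" for i in range(int(lo), int(hi) + 1))
        else:
            hosts.append(prefix + part)
    return hosts

nodes = expand_nodelist("c619-[101-102,111-112]")
master_addr = nodes[0]  # the rank-0 rendezvous host, as in MASTER_ADDR above
```

With the four-node allocation seen later in this log, this yields `c619-101` as the master, matching the `MASTER_ADDR=c619-101` line in the output.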
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) gcc/13.2.0 => gcc/14.2.0
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPU per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
[WARNING|2025-09-23 22:36:20] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 22:36:21,627] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[2025-09-23 22:36:22,698] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-09-23 22:36:22,700] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-09-23 22:36:22,982] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[2025-09-23 22:36:26,579] [INFO] [comm.py:669:init_distributed] cdb=None
[2025-09-23 22:36:26,579] [INFO] [comm.py:700:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
[INFO|2025-09-23 22:36:26] llamafactory.hparams.parser:406 >> Process rank: 0, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:36:27,724 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:36:27,725 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:36:27,725 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:36:27,725 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:36:27,725 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:36:27,725 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:36:27,725 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:36:28,043 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|configuration_utils.py:698] 2025-09-23 22:36:28,479 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:36:28,520 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
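The config dump above is enough to estimate the model's parameter count by hand. A back-of-the-envelope tally, assuming Qwen2's usual layout (biases on the Q/K/V projections only, no biases in `o_proj` or the MLP, RMSNorm weight vectors, and a tied `lm_head` per `tie_word_embeddings: true`):

```python
# Figures from the Qwen2Config dump above.
hidden, inter, layers, vocab = 1536, 8960, 28, 151936
heads, kv_heads = 12, 2
head_dim = hidden // heads          # 128
kv_dim = kv_heads * head_dim        # 256 (grouped-query attention)

embed = vocab * hidden                   # input embedding, shared with lm_head
attn = (hidden * hidden + hidden)        # q_proj (+ bias)
attn += 2 * (hidden * kv_dim + kv_dim)   # k_proj, v_proj (+ biases)
attn += hidden * hidden                  # o_proj (no bias)
mlp = 3 * hidden * inter                 # gate_proj, up_proj, down_proj
norms = 2 * hidden                       # two RMSNorms per layer

total = embed + layers * (attn + mlp + norms) + hidden  # + final norm
print(f"{total:,} parameters (~{total / 1e9:.2f}B)")    # ~1.54B
```

This lands at roughly 1.54B total (about 1.31B non-embedding), consistent with the Qwen2.5-1.5B-Instruct naming; the exact published count may differ slightly depending on which biases are included.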
[2025-09-23 22:36:28,811] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:36:28] llamafactory.hparams.parser:406 >> Process rank: 2, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|2025-09-23 22:36:28] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
[2025-09-23 22:36:28,915] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:36:28] llamafactory.hparams.parser:406 >> Process rank: 1, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[2025-09-23 22:36:29,038] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:36:29] llamafactory.hparams.parser:406 >> Process rank: 3, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
Generating train split: 100%|██████████| 3998/3998 [00:00<00:00, 29733.76 examples/s]
Converting format of dataset: 100%|██████████| 3998/3998 [00:00<00:00, 21750.30 examples/s]
[rank0]:[W923 22:36:31.707489786 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 0] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank-to-GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-101:829801:829801 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.212<0>
c619-101:829801:829801 [0] NCCL INFO cudaDriverVersion 12080
c619-101:829801:829801 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-101:829801:829801 [0] NCCL INFO Comm config Blocking set to 1
[rank1]:[W923 22:36:31.863489491 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 1] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank-to-GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
[rank3]:[W923 22:36:31.960046351 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 3] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank-to-GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-102:3668419:3668419 [0] NCCL INFO cudaDriverVersion 12080
c619-102:3668419:3668419 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.213<0>
c619-102:3668419:3668419 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-102:3668419:3668419 [0] NCCL INFO Comm config Blocking set to 1
c619-112:919152:919152 [0] NCCL INFO cudaDriverVersion 12080
c619-112:919152:919152 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.215<0>
c619-112:919152:919152 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-112:919152:919152 [0] NCCL INFO Comm config Blocking set to 1
[rank2]:[W923 22:36:31.900626517 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 2] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank-to-GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-111:3462746:3462746 [0] NCCL INFO cudaDriverVersion 12080
c619-111:3462746:3462746 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.214<0>
c619-111:3462746:3462746 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-111:3462746:3462746 [0] NCCL INFO Comm config Blocking set to 1
c619-101:829801:829926 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-102:3668419:3668534 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-111:3462746:3462862 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-112:919152:919267 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-101:829801:829926 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.212<0>
c619-111:3462746:3462862 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.214<0>
c619-102:3668419:3668534 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.213<0>
c619-101:829801:829926 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-101:829801:829926 [0] NCCL INFO Using network IB
c619-101:829801:829926 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3462746:3462862 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-111:3462746:3462862 [0] NCCL INFO Using network IB
c619-101:829801:829926 [0] NCCL INFO ncclCommInitRankConfig comm 0x1b0748f0 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x35699e2a3b06ab04 - Init START
c619-111:3462746:3462862 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3462746:3462862 [0] NCCL INFO ncclCommInitRankConfig comm 0x24604cb0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x35699e2a3b06ab04 - Init START
c619-102:3668419:3668534 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-102:3668419:3668534 [0] NCCL INFO Using network IB
c619-102:3668419:3668534 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3668419:3668534 [0] NCCL INFO ncclCommInitRankConfig comm 0x4c740040 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x35699e2a3b06ab04 - Init START
c619-112:919152:919267 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.215<0>
c619-112:919152:919267 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-112:919152:919267 [0] NCCL INFO Using network IB
c619-112:919152:919267 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-112:919152:919267 [0] NCCL INFO ncclCommInitRankConfig comm 0x3a030a80 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x35699e2a3b06ab04 - Init START
c619-111:3462746:3462862 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-101:829801:829926 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-112:919152:919267 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-102:3668419:3668534 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-101:829801:829926 [0] NCCL INFO Bootstrap timings total 0.079871 (create 0.078036, send 0.000056, recv 0.000138, ring 0.000473, delay 0.000002)
c619-112:919152:919267 [0] NCCL INFO Bootstrap timings total 0.117615 (create 0.000029, send 0.000224, recv 0.069115, ring 0.000417, delay 0.000001)
c619-111:3462746:3462862 [0] NCCL INFO Bootstrap timings total 0.137934 (create 0.000040, send 0.000213, recv 0.009651, ring 0.069404, delay 0.000001)
c619-102:3668419:3668534 [0] NCCL INFO Bootstrap timings total 0.137470 (create 0.000030, send 0.000157, recv 0.000176, ring 0.000098, delay 0.000001)
c619-111:3462746:3462862 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:829801:829926 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-112:919152:919267 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-102:3668419:3668534 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:829801:829926 [0] NCCL INFO comm 0x1b0748f0 rank 0 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:829801:829926 [0] NCCL INFO Channel 00/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 01/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 02/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 03/64 : 0 1 2 3
c619-111:3462746:3462862 [0] NCCL INFO comm 0x24604cb0 rank 2 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:829801:829926 [0] NCCL INFO Channel 04/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 05/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 06/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 07/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 08/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 09/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 10/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 11/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 12/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 13/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 14/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 15/64 : 0 1 2 3
c619-112:919152:919267 [0] NCCL INFO comm 0x3a030a80 rank 3 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:829801:829926 [0] NCCL INFO Channel 16/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 17/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 18/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 19/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 20/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 21/64 : 0 1 2 3
c619-102:3668419:3668534 [0] NCCL INFO comm 0x4c740040 rank 1 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:829801:829926 [0] NCCL INFO Channel 22/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 23/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 24/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 25/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 26/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 27/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 28/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 29/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 30/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 31/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 32/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 33/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 34/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 35/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 36/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 37/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 38/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 39/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 40/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 41/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 42/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 43/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 44/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 45/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 46/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 47/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 48/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 49/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 50/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 51/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 52/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 53/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 54/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 55/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 56/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 57/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 58/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 59/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 60/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 61/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 62/64 : 0 1 2 3
c619-101:829801:829926 [0] NCCL INFO Channel 63/64 : 0 1 2 3
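All 64 channels above report the same ring order (0 1 2 3). A quick way to confirm that from a saved log is to extract the rank list from every "Channel NN/64" line and check the set collapses to one ordering (the regex and helper below are illustrative, not part of NCCL):

```python
import re

# Matches the rank list at the end of lines like
# "... NCCL INFO Channel 00/64 : 0 1 2 3"
CHANNEL_RE = re.compile(r"Channel \d+/64 : ([\d ]+\d)\s*$")

def ring_orders(log_lines):
    """Collect the distinct ring orderings reported across channel lines."""
    orders = set()
    for line in log_lines:
        m = CHANNEL_RE.search(line)
        if m:
            orders.add(tuple(int(r) for r in m.group(1).split()))
    return orders

sample = [
    "c619-101:829801:829926 [0] NCCL INFO Channel 00/64 : 0 1 2 3",
    "c619-101:829801:829926 [0] NCCL INFO Channel 63/64 : 0 1 2 3",
]
print(ring_orders(sample))  # {(0, 1, 2, 3)}
```

A single-element result means every channel uses the same ring, which is expected here: four nodes, one GPU each, connected over a single InfiniBand rail.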
c619-111:3462746:3462862 [0] NCCL INFO Trees [0] 1/3/-1->2->0 [1] 1/3/-1->2->0 [2] 1/3/-1->2->0 [3] 1/3/-1->2->0 [4] 1/3/-1->2->0 [5] 1/3/-1->2->0 [6] 1/3/-1->2->0 [7] 1/3/-1->2->0 [8] 1/3/-1->2->0 [9] 1/3/-1->2->0 [10] 1/3/-1->2->0 [11] 1/3/-1->2->0 [12] 1/3/-1->2->0 [13] 1/3/-1->2->0 [14] 1/3/-1->2->0 [15] 1/3/-1->2->0 [16] 1/3/-1->2->0 [17] 1/3/-1->2->0 [18] 1/3/-1->2->0 [19] 1/3/-1->2->0 [20] 1/3/-1->2->0 [21] 1/3/-1->2->0 [22] 1/3/-1->2->0 [23] 1/3/-1->2->0 [24] 1/3/-1->2->0 [25] 1/3/-1->2->0 [26] 1/3/-1->2->0 [27] 1/3/-1->2->0 [28] 1/3/-1->2->0 [29] 1/3/-1->2->0 [30] 1/3/-1->2->0 [31] 1/3/-1->2->0 [32] -1/-1/-1->2->1 [33] -1/-1/-1->2->1 [34] -1/-1/-1->2->1 [35] -1/-1/-1->2->1 [36] -1/-1/-1->2->1 [37] -1/-1/-1->2->1 [38] -1/-1/-1->2->1 [39] -1/-1/-1->2->1 [40] -1/-1/-1->2->1 [41] -1/-1/-1->2->1 [42] -1/-1/-1->2->1 [43] -1/-1/-1->2->1 [44] -1/-1/-1->2->1 [45] -1/-1/-1->2->1 [46] -1/-1/-1->2->1 [47] -1/-1/-1->2->1 [48] -1/-1/-1->2->1 [49] -1/-1/-1->2->1 [50] -1/-1/-1->2->1 [51] -1/-1/-1->2->1 [52] -1/-1/-
c619-111:3462746:3462862 [0] NCCL INFO P2P Chunksize set to 131072
c619-112:919152:919267 [0] NCCL INFO Trees [0] -1/-1/-1->3->2 [1] -1/-1/-1->3->2 [2] -1/-1/-1->3->2 [3] -1/-1/-1->3->2 [4] -1/-1/-1->3->2 [5] -1/-1/-1->3->2 [6] -1/-1/-1->3->2 [7] -1/-1/-1->3->2 [8] -1/-1/-1->3->2 [9] -1/-1/-1->3->2 [10] -1/-1/-1->3->2 [11] -1/-1/-1->3->2 [12] -1/-1/-1->3->2 [13] -1/-1/-1->3->2 [14] -1/-1/-1->3->2 [15] -1/-1/-1->3->2 [16] -1/-1/-1->3->2 [17] -1/-1/-1->3->2 [18] -1/-1/-1->3->2 [19] -1/-1/-1->3->2 [20] -1/-1/-1->3->2 [21] -1/-1/-1->3->2 [22] -1/-1/-1->3->2 [23] -1/-1/-1->3->2 [24] -1/-1/-1->3->2 [25] -1/-1/-1->3->2 [26] -1/-1/-1->3->2 [27] -1/-1/-1->3->2 [28] -1/-1/-1->3->2 [29] -1/-1/-1->3->2 [30] -1/-1/-1->3->2 [31] -1/-1/-1->3->2 [32] 1/-1/-1->3->-1 [33] 1/-1/-1->3->-1 [34] 1/-1/-1->3->-1 [35] 1/-1/-1->3->-1 [36] 1/-1/-1->3->-1 [37] 1/-1/-1->3->-1 [38] 1/-1/-1->3->-1 [39] 1/-1/-1->3->-1 [40] 1/-1/-1->3->-1 [41] 1/-1/-1->3->-1 [42] 1/-1/-1->3->-1 [43] 1/-1/-1->3->-1 [44] 1/-1/-1->3->-1 [45] 1/-1/-1->3->-1 [46] 1/-1/-1->3->-1 [47] 1/-1/-1->3->-1 [48] 1/-1/-1->3->-1 [49] 1/-1/
c619-112:919152:919267 [0] NCCL INFO P2P Chunksize set to 131072
c619-102:3668419:3668534 [0] NCCL INFO Trees [0] -1/-1/-1->1->2 [1] -1/-1/-1->1->2 [2] -1/-1/-1->1->2 [3] -1/-1/-1->1->2 [4] -1/-1/-1->1->2 [5] -1/-1/-1->1->2 [6] -1/-1/-1->1->2 [7] -1/-1/-1->1->2 [8] -1/-1/-1->1->2 [9] -1/-1/-1->1->2 [10] -1/-1/-1->1->2 [11] -1/-1/-1->1->2 [12] -1/-1/-1->1->2 [13] -1/-1/-1->1->2 [14] -1/-1/-1->1->2 [15] -1/-1/-1->1->2 [16] -1/-1/-1->1->2 [17] -1/-1/-1->1->2 [18] -1/-1/-1->1->2 [19] -1/-1/-1->1->2 [20] -1/-1/-1->1->2 [21] -1/-1/-1->1->2 [22] -1/-1/-1->1->2 [23] -1/-1/-1->1->2 [24] -1/-1/-1->1->2 [25] -1/-1/-1->1->2 [26] -1/-1/-1->1->2 [27] -1/-1/-1->1->2 [28] -1/-1/-1->1->2 [29] -1/-1/-1->1->2 [30] -1/-1/-1->1->2 [31] -1/-1/-1->1->2 [32] 2/0/-1->1->3 [33] 2/0/-1->1->3 [34] 2/0/-1->1->3 [35] 2/0/-1->1->3 [36] 2/0/-1->1->3 [37] 2/0/-1->1->3 [38] 2/0/-1->1->3 [39] 2/0/-1->1->3 [40] 2/0/-1->1->3 [41] 2/0/-1->1->3 [42] 2/0/-1->1->3 [43] 2/0/-1->1->3 [44] 2/0/-1->1->3 [45] 2/0/-1->1->3 [46] 2/0/-1->1->3 [47] 2/0/-1->1->3 [48] 2/0/-1->1->3 [49] 2/0/-1->1->3 [50] 2/0/-1->1->3 [51] 2
c619-102:3668419:3668534 [0] NCCL INFO P2P Chunksize set to 131072
c619-112:919152:919270 [0] NCCL INFO [Proxy Service] Device 0 CPU core 50
c619-112:919152:919271 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 51
c619-112:919152:919267 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-112:919152:919267 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-112:919152:919267 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-112:919152:919267 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-112:919152:919267 [0] NCCL INFO ncclCommInitRankConfig comm 0x3a030a80 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x35699e2a3b06ab04 - Init COMPLETE
c619-112:919152:919267 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 3 nranks 4 total 0.33 (kernels 0.16, alloc 0.02, bootstrap 0.12, allgathers 0.01, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-112:919152:919272 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-112:919152:919273 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 53
c619-112:919152:919272 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668537 [0] NCCL INFO [Proxy Service] Device 0 CPU core 65
c619-102:3668419:3668538 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 66
c619-102:3668419:3668534 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-102:3668419:3668534 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-102:3668419:3668534 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-102:3668419:3668534 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-102:3668419:3668534 [0] NCCL INFO ncclCommInitRankConfig comm 0x4c740040 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x35699e2a3b06ab04 - Init COMPLETE
c619-102:3668419:3668534 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 1 nranks 4 total 0.34 (kernels 0.16, alloc 0.02, bootstrap 0.14, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-102:3668419:3668539 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-102:3668419:3668540 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 67
c619-112:919152:919272 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462865 [0] NCCL INFO [Proxy Service] Device 0 CPU core 34
c619-102:3668419:3668539 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462866 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 35
c619-111:3462746:3462862 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-111:3462746:3462862 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-111:3462746:3462862 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-111:3462746:3462862 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-111:3462746:3462862 [0] NCCL INFO ncclCommInitRankConfig comm 0x24604cb0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x35699e2a3b06ab04 - Init COMPLETE
c619-111:3462746:3462862 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 2 nranks 4 total 0.30 (kernels 0.13, alloc 0.02, bootstrap 0.14, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-111:3462746:3462867 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-111:3462746:3462868 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 36
c619-111:3462746:3462867 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-102:3668419:3668539 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829926 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 2/-1/-1->0->-1 [3] 2/-1/-1->0->-1 [4] 2/-1/-1->0->-1 [5] 2/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 2/-1/-1->0->-1 [9] 2/-1/-1->0->-1 [10] 2/-1/-1->0->-1 [11] 2/-1/-1->0->-1 [12] 2/-1/-1->0->-1 [13] 2/-1/-1->0->-1 [14] 2/-1/-1->0->-1 [15] 2/-1/-1->0->-1 [16] 2/-1/-1->0->-1 [17] 2/-1/-1->0->-1 [18] 2/-1/-1->0->-1 [19] 2/-1/-1->0->-1 [20] 2/-1/-1->0->-1 [21] 2/-1/-1->0->-1 [22] 2/-1/-1->0->-1 [23] 2/-1/-1->0->-1 [24] 2/-1/-1->0->-1 [25] 2/-1/-1->0->-1 [26] 2/-1/-1->0->-1 [27] 2/-1/-1->0->-1 [28] 2/-1/-1->0->-1 [29] 2/-1/-1->0->-1 [30] 2/-1/-1->0->-1 [31] 2/-1/-1->0->-1 [32] -1/-1/-1->0->1 [33] -1/-1/-1->0->1 [34] -1/-1/-1->0->1 [35] -1/-1/-1->0->1 [36] -1/-1/-1->0->1 [37] -1/-1/-1->0->1 [38] -1/-1/-1->0->1 [39] -1/-1/-1->0->1 [40] -1/-1/-1->0->1 [41] -1/-1/-1->0->1 [42] -1/-1/-1->0->1 [43] -1/-1/-1->0->1 [44] -1/-1/-1->0->1 [45] -1/-1/-1->0->1 [46] -1/-1/-1->0->1 [47] -1/-1/-1->0->1 [48] -1/-1/-1->0->1 [49] -1/-1
c619-101:829801:829926 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:829801:829926 [0] NCCL INFO Check P2P Type intraNodeP2pSupport 0 directMode 0
c619-101:829801:829929 [0] NCCL INFO [Proxy Service] Device 0 CPU core 19
c619-101:829801:829930 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 20
c619-101:829801:829926 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-101:829801:829926 [0] NCCL INFO Enabled NCCL Func/Proto/Algo Matrix:
Function | LL LL128 Simple | Tree Ring CollNetDirect CollNetChain NVLS NVLSTree PAT
Broadcast | 0 0 1 | 1 1 1 1 1 1 1
Reduce | 0 0 1 | 1 1 1 1 1 1 1
AllGather | 0 0 1 | 1 1 1 1 1 1 1
ReduceScatter | 0 0 1 | 1 1 1 1 1 1 1
AllReduce | 0 0 1 | 1 1 1 1 1 1 1
c619-101:829801:829926 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:829801:829926 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:829801:829926 [0] NCCL INFO CC Off, workFifoBytes 1048576
c619-101:829801:829926 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-101:829801:829926 [0] NCCL INFO ncclCommInitRankConfig comm 0x1b0748f0 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x35699e2a3b06ab04 - Init COMPLETE
c619-101:829801:829926 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 0 nranks 4 total 0.42 (kernels 0.24, alloc 0.02, bootstrap 0.08, allgathers 0.06, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:829801:829931 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-101:829801:829931 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-101:829801:829932 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 21
c619-101:829801:829931 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-111:3462746:3462867 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462746:3462867 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-112:919152:919272 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:829801:829931 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919152:919272 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668419:3668539 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
[2025-09-23 22:36:32] c619-102:3668419:3668537 [0] transport/net.cc:841 NCCL WARN Cuda failure 801 'operation not supported'
c619-102:3668419:3668539 [0] NCCL INFO transport.cc:197 -> 1
c619-102:3668419:3668539 [0] NCCL INFO transport/generic.cc:49 -> 1
c619-102:3668419:3668539 [0] NCCL INFO group.cc:152 -> 1
c619-102:3668419:3668539 [0] NCCL INFO group.cc:75 -> 1 [Async thread]
c619-102:3668419:3668419 [0] NCCL INFO group.cc:460 -> 1
c619-102:3668419:3668419 [0] NCCL INFO group.cc:581 -> 1
c619-102:3668419:3668419 [0] NCCL INFO enqueue.cc:2299 -> 1
[2025-09-23 22:36:32] c619-102:3668419:3668537 [0] transport/net.cc:841 NCCL WARN Cuda failure 801 'operation not supported'
c619-102:3668419:3668537 [0] NCCL INFO misc/socket.cc:64 -> 3
c619-102:3668419:3668537 [0] NCCL INFO misc/socket.cc:80 -> 3
c619-102:3668419:3668537 [0] NCCL INFO misc/socket.cc:829 -> 3
c619-102:3668419:3668537 [0] NCCL INFO proxy.cc:1486 -> 3
[2025-09-23 22:36:32] c619-102:3668419:3668537 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
c619-102:3668419:3668537 [0] NCCL INFO misc/socket.cc:881 -> 3
[2025-09-23 22:36:32] c619-111:3462746:3462865 [0] transport/net.cc:841 NCCL WARN Cuda failure 801 'operation not supported'
c619-111:3462746:3462867 [0] NCCL INFO transport.cc:197 -> 1
c619-111:3462746:3462867 [0] NCCL INFO transport/generic.cc:49 -> 1
c619-111:3462746:3462867 [0] NCCL INFO group.cc:152 -> 1
c619-111:3462746:3462867 [0] NCCL INFO group.cc:75 -> 1 [Async thread]
c619-111:3462746:3462746 [0] NCCL INFO group.cc:460 -> 1
c619-111:3462746:3462746 [0] NCCL INFO group.cc:581 -> 1
c619-111:3462746:3462746 [0] NCCL INFO enqueue.cc:2299 -> 1
[2025-09-23 22:36:32] c619-111:3462746:3462865 [0] transport/net.cc:985 NCCL WARN Cuda failure 801 'operation not supported'
c619-111:3462746:3462865 [0] NCCL INFO misc/socket.cc:64 -> 3
c619-111:3462746:3462865 [0] NCCL INFO misc/socket.cc:80 -> 3
c619-111:3462746:3462865 [0] NCCL INFO misc/socket.cc:829 -> 3
c619-111:3462746:3462865 [0] NCCL INFO proxy.cc:1486 -> 3
[2025-09-23 22:36:32] c619-111:3462746:3462865 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
c619-111:3462746:3462865 [0] NCCL INFO misc/socket.cc:881 -> 3
[2025-09-23 22:36:32] c619-112:919152:919270 [0] transport/net.cc:985 NCCL WARN Cuda failure 801 'operation not supported'
c619-112:919152:919272 [0] NCCL INFO transport/net.cc:450 -> 1
c619-112:919152:919272 [0] NCCL INFO transport.cc:216 -> 1
c619-112:919152:919272 [0] NCCL INFO transport/generic.cc:49 -> 1
c619-112:919152:919272 [0] NCCL INFO group.cc:152 -> 1
c619-112:919152:919272 [0] NCCL INFO group.cc:75 -> 1 [Async thread]
c619-112:919152:919152 [0] NCCL INFO group.cc:460 -> 1
[2025-09-23 22:36:32] c619-112:919152:919270 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
c619-112:919152:919152 [0] NCCL INFO group.cc:581 -> 1
c619-112:919152:919152 [0] NCCL INFO enqueue.cc:2299 -> 1
c619-112:919152:919270 [0] NCCL INFO misc/socket.cc:881 -> 3
[rank1]: Traceback (most recent call last):
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank1]: main()
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank1]: run_exp()
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank1]: _training_function(config={"args": args, "callbacks": callbacks})
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank1]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank1]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank1]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank1]: next(self.gen)
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank1]: dist.barrier()
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank1]: return func(*args, **kwargs)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank1]: work = group.barrier(opts=opts)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.26.2
[rank1]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank1]: Last error:
[rank1]: Cuda failure 801 'operation not supported'
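The repeated `Cuda failure 801 'operation not supported'` from `transport/net.cc` means NCCL hit a CUDA call the driver or platform refused, most often on the GPUDirect RDMA registration path. A hedged mitigation sketch follows; the launch script logged further down currently sets `NCCL_NET_GDR_LEVEL=SYS` and `NCCL_NET_GDR_READ=1`, and these are documented NCCL knobs, but whether flipping them resolves error 801 on this cluster is an assumption, not a confirmed diagnosis:

```shell
# Assumption: error 801 originates in the GPUDirect-RDMA path.
# These are real NCCL environment variables; their effect here is untested.
export NCCL_NET_GDR_LEVEL=LOC   # never use GPUDirect RDMA (the job sets SYS)
export NCCL_NET_GDR_READ=0      # disable GDR reads (the job sets 1)
# If the error persists, rule out the IB transport entirely:
# export NCCL_IB_DISABLE=1      # fall back to plain TCP sockets
```

Rerunning with these set before `srun` would at least isolate whether the failure is GDR-specific.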
[rank2]: Traceback (most recent call last):
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank2]: main()
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank2]: run_exp()
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank2]: _training_function(config={"args": args, "callbacks": callbacks})
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank2]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank2]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank2]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank2]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank2]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank2]: next(self.gen)
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank2]: dist.barrier()
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank2]: return func(*args, **kwargs)
[rank2]: ^^^^^^^^^^^^^^^^^^^^^
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank2]: work = group.barrier(opts=opts)
[rank2]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank2]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.26.2
[rank2]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank2]: Last error:
[rank2]: Cuda failure 801 'operation not supported'
[rank3]: Traceback (most recent call last):
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank3]: main()
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank3]: run_exp()
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank3]: _training_function(config={"args": args, "callbacks": callbacks})
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank3]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank3]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank3]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank3]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank3]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank3]: next(self.gen)
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank3]: dist.barrier()
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank3]: return func(*args, **kwargs)
[rank3]: ^^^^^^^^^^^^^^^^^^^^^
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank3]: work = group.barrier(opts=opts)
[rank3]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.26.2
[rank3]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank3]: Last error:
[rank3]: [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
[rank2]:[W923 22:36:33.495581287 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[2025-09-23 22:36:33] c619-111:3462746:3462874 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<46125>
c619-111:3462746:3462874 [0] NCCL INFO misc/socket.cc:80 -> 6
c619-111:3462746:3462874 [0] NCCL INFO misc/socket.cc:829 -> 6
c619-111:3462746:3462874 [0] NCCL INFO comm 0x24604cb0 rank 2 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
[rank3]:[W923 22:36:33.654785106 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[2025-09-23 22:36:33] c619-112:919152:919279 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-112.vista.tacc.utexas.edu<40011>
c619-112:919152:919279 [0] NCCL INFO misc/socket.cc:80 -> 6
c619-112:919152:919279 [0] NCCL INFO misc/socket.cc:829 -> 6
c619-112:919152:919279 [0] NCCL INFO comm 0x3a030a80 rank 3 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
[rank1]:[W923 22:36:33.602535754 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[2025-09-23 22:36:33] c619-102:3668419:3668546 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-102.vista.tacc.utexas.edu<47669>
c619-102:3668419:3668546 [0] NCCL INFO misc/socket.cc:80 -> 6
c619-102:3668419:3668546 [0] NCCL INFO misc/socket.cc:829 -> 6
c619-102:3668419:3668546 [0] NCCL INFO comm 0x4c740040 rank 1 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
[2025-09-23 22:36:33] c619-101:829801:829929 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<45793>
c619-101:829801:829929 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:36:33] c619-101:829801:829929 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<47189>
c619-101:829801:829929 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:829801:829931 [0] NCCL INFO transport.cc:197 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net.cc:747 -> 6
c619-101:829801:829931 [0] NCCL INFO transport/generic.cc:49 -> 6
c619-101:829801:829931 [0] NCCL INFO group.cc:152 -> 6
c619-101:829801:829931 [0] NCCL INFO group.cc:75 -> 6 [Async thread]
[2025-09-23 22:36:33] c619-101:829801:829929 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<42161>
c619-101:829801:829929 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:36:33] c619-101:829801:829929 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<50031>
c619-101:829801:829929 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:36:33] c619-101:829801:829929 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<43781>
c619-101:829801:829929 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:36:33] c619-101:829801:829929 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<47879>
c619-101:829801:829929 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:829801:829801 [0] NCCL INFO group.cc:460 -> 6
c619-101:829801:829801 [0] NCCL INFO group.cc:581 -> 6
c619-101:829801:829801 [0] NCCL INFO enqueue.cc:2299 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:829801:829929 [0] NCCL INFO transport/net.cc:747 -> 6
c619-101:829801:829929 [0] NCCL INFO misc/socket.cc:64 -> 3
c619-101:829801:829929 [0] NCCL INFO misc/socket.cc:80 -> 3
c619-101:829801:829929 [0] NCCL INFO misc/socket.cc:829 -> 3
c619-101:829801:829929 [0] NCCL INFO proxy.cc:1486 -> 3
[2025-09-23 22:36:33] c619-101:829801:829929 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
[rank0]: Traceback (most recent call last):
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank0]: main()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank0]: run_exp()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank0]: _training_function(config={"args": args, "callbacks": callbacks})
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank0]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank0]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank0]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank0]: next(self.gen)
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank0]: dist.barrier()
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank0]: return func(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank0]: work = group.barrier(opts=opts)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, remote process exited or there was a network error, NCCL version 2.26.2
[rank0]: ncclRemoteError: A call failed possibly due to a network error or a remote process exiting prematurely.
[rank0]: Last error:
[rank0]: [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
E0923 22:36:34.074000 3462742 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 3462746) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:36:34
host : c619-111.vista.tacc.utexas.edu
rank : 2 (local_rank: 0)
exitcode : 1 (pid: 3462746)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0923 22:36:34.175000 919149 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 919152) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 22:36:34.190000 3668415 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 3668419) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:36:34
host : c619-112.vista.tacc.utexas.edu
rank : 3 (local_rank: 0)
exitcode : 1 (pid: 919152)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:36:34
host : c619-102.vista.tacc.utexas.edu
rank : 1 (local_rank: 0)
exitcode : 1 (pid: 3668419)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
[rank0]:[W923 22:36:34.991374523 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
c619-101:829801:829938 [0] NCCL INFO misc/socket.cc:64 -> 3
c619-101:829801:829938 [0] NCCL INFO misc/socket.cc:80 -> 3
c619-101:829801:829938 [0] NCCL INFO misc/socket.cc:829 -> 3
c619-101:829801:829938 [0] NCCL INFO comm 0x1b0748f0 rank 0 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
E0923 22:36:35.675000 829796 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 829801) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:36:35
host : c619-101.vista.tacc.utexas.edu
rank : 0 (local_rank: 0)
exitcode : 1 (pid: 829801)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
srun: error: c619-101: task 0: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
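The `destroy_process_group() was not called` warnings scattered above are a symptom, not the cause: the workers died inside `dist.barrier()`. For reference, the shutdown pattern PyTorch is asking for looks like this minimal single-process sketch; the gloo backend and localhost rendezvous are illustrative stand-ins for the real 4-node NCCL job:

```python
import os
import torch.distributed as dist

# Single-process gloo setup, for illustration only; the real job
# uses NCCL across 4 nodes with a c10d rendezvous.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29501")
dist.init_process_group(backend="gloo", rank=0, world_size=1)
try:
    dist.barrier()  # the collective that failed in the run above
finally:
    dist.destroy_process_group()  # explicit teardown avoids the warning
print("clean shutdown")
```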
|
Creating parquet from Arrow format: 100% 4/4 [00:00<00:00, 44.48ba/s]
Uploading...: 100% 12.9M/12.9M [00:00<00:00, 23.7MB/s]
Uploading the dataset shards: 100% 1/1 [00:00<00:00, 1.20 shards/s]
README.md: 1.26kB [00:00, 9.87MB/s]
metadata/train-00000-of-00001.parquet: 100% 8.09k/8.09k [00:00<00:00, 21.3kB/s]
Generating train split: 100% 48/48 [00:00<00:00, 13921.08 examples/s]
Creating parquet from Arrow format: 100% 1/1 [00:00<00:00, 2353.71ba/s]
Uploading...: 100% 8.24k/8.24k [00:00<00:00, 9.73kB/s]
Uploading the dataset shards: 100% 1/1 [00:01<00:00, 1.06s/ shards]
| BASELINE_r1_distillation | 30.143149 | true |
2025-09-23T22:38:00.336138 | 2025-09-23T22:38:10.490722 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 52 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 53)
[DEBUG] Training Script:
#!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/13 cuda/12.4 nccl/12.4 nvidia_math/12.4
module load gcc/13 cuda/12.4 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
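For local debugging, the multi-node srun launch above collapses to a single-node torchrun invocation; a hedged sketch (same train.py and config paths as in the script, with `--standalone` standing in for the c10d rendezvous flags):

```shell
torchrun --standalone --nproc-per-node 1 \
  /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py \
  /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
```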
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.4
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-101: task 0: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-112: task 3: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
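The undefined `cuptiActivityEnableDriverApi` symbol means the libcupti that the loader resolves (from the cuda/12.4 module loaded by this script) is older than the one this PyTorch build was compiled against. A hedged diagnostic sketch; it only reads state and changes nothing:

```shell
# Check which libcupti, if any, is visible to the dynamic loader.
out=$(ldconfig -p | grep libcupti || echo "no libcupti on the loader path")
echo "$out"
# The retry later in this log swaps the CUDA module to 12.6, after which the
# torch import succeeds, i.e. roughly:
#   module load gcc/13 cuda/12.6 nccl/12.4 nvidia_math/12.4
```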
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:03<00:00,  1.07ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 23.4MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:04<00:00,  4.50s/ shards]
README.md: 1.26kB [00:00, 11.5MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 8.31k/8.31k [00:00<00:00, 21.8kB/s]
Generating train split: 100%|██████████| 52/52 [00:00<00:00, 13287.67 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2666.44ba/s]
Uploading...: 100%|██████████| 8.46k/8.46k [00:00<00:00, 9.98kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00,  1.04s/ shards]
| BASELINE_r1_distillation | 10.154584 | true |
2025-09-23T22:38:39.578014 | 2025-09-23T22:39:09.970392 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 55 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 56)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/13 cuda/12.6 nccl/12.4 nvidia_math/12.4
module load gcc/13 cuda/12.6 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.6
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
[WARNING|2025-09-23 22:38:54] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 22:38:55,986] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[WARNING|2025-09-23 22:38:56] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[WARNING|2025-09-23 22:38:56] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[WARNING|2025-09-23 22:38:56] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 22:38:57,283] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-09-23 22:38:57,417] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[2025-09-23 22:38:57,784] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
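A hedged sketch following the DeepSpeed warning above: point the Triton autotune cache at node-local storage instead of the NFS-backed scratch directory. The `/tmp` path here is illustrative, not taken from this cluster's conventions:

```shell
# Put the Triton autotune cache on node-local disk to avoid NFS hangs on exit.
export TRITON_CACHE_DIR="/tmp/${USER:-$(id -un)}/.cache/triton"
mkdir -p "$TRITON_CACHE_DIR"
```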
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[2025-09-23 22:39:00,700] [INFO] [comm.py:669:init_distributed] cdb=None
[2025-09-23 22:39:00,700] [INFO] [comm.py:700:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
[INFO|2025-09-23 22:39:00] llamafactory.hparams.parser:406 >> Process rank: 0, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:00,887 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:00,887 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:00,887 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:00,887 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:00,887 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:00,887 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:00,887 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:39:01,066 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[INFO|configuration_utils.py:698] 2025-09-23 22:39:01,509 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:39:01,511 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
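A back-of-envelope check that the Qwen2Config above describes a ~1.5B-parameter model. The simplifications (biases and RMSNorm weights omitted, tied embeddings counted once) are mine, not from the log:

```shell
# Rough parameter count implied by the config: GQA with 2 KV heads,
# SwiGLU MLP (gate/up/down), tied input/output embeddings.
h=1536; inter=8960; layers=28; vocab=151936; heads=12; kv_heads=2
head_dim=$(( h / heads ))                         # 128
attn=$(( h*h + 2*h*kv_heads*head_dim + h*h ))     # q, k, v, o projections
mlp=$(( 3 * h * inter ))                          # gate, up, down
total=$(( layers * (attn + mlp) + vocab * h ))
echo "$total"                                     # about 1.54e9 parameters
```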
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:01,646 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:01,646 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:01,646 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:01,646 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:01,646 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:01,646 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:01,646 >> loading file chat_template.jinja from cache at None
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:39:01,820 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:39:01] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[2025-09-23 22:39:02,668] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:39:02] llamafactory.hparams.parser:406 >> Process rank: 2, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:02,862 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:02,862 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:02,862 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:02,862 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:02,862 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:02,862 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:02,862 >> loading file chat_template.jinja from cache at None
[2025-09-23 22:39:02,983] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:39:03] llamafactory.hparams.parser:406 >> Process rank: 1, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:39:03,038 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Converting format of dataset: 100%|██████████| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6190 examples [00:00, 21820.60 examples/s]
Converting format of dataset: 7996 examples [00:00, 21693.15 examples/s]
[rank0]:[W923 22:39:03.281735274 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 0] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-101:834459:834459 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.212<0>
c619-101:834459:834459 [0] NCCL INFO cudaDriverVersion 12080
c619-101:834459:834459 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-101:834459:834459 [0] NCCL INFO Comm config Blocking set to 1
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,180 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,180 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,180 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,180 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,180 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,180 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,180 >> loading file chat_template.jinja from cache at None
c619-101:834459:834574 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-101:834459:834574 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.212<0>
c619-101:834459:834574 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-101:834459:834574 [0] NCCL INFO Using network IB
c619-101:834459:834574 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-101:834459:834574 [0] NCCL INFO ncclCommInitRankConfig comm 0x131c05b0 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x61e212a081a15ae2 - Init START
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:39:03,356 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|configuration_utils.py:698] 2025-09-23 22:39:03,460 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:39:03,462 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,601 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,601 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,601 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,601 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,601 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,601 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,601 >> loading file chat_template.jinja from cache at None
[2025-09-23 22:39:03,651] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:39:03] llamafactory.hparams.parser:406 >> Process rank: 3, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|configuration_utils.py:698] 2025-09-23 22:39:03,758 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:39:03,760 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:39:03,778 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:39:03] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,855 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,855 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,855 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,855 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,855 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,855 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:39:03,855 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:39:04,037 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|configuration_utils.py:698] 2025-09-23 22:39:04,452 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:39:04,454 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
Converting format of dataset: 7996 examples [00:00, 21690.34 examples/s]
[rank2]:[W923 22:39:05.298988429 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 2] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-111:3462936:3462936 [0] NCCL INFO cudaDriverVersion 12080
c619-111:3462936:3462936 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.214<0>
c619-111:3462936:3462936 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-111:3462936:3462936 [0] NCCL INFO Comm config Blocking set to 1
c619-111:3462936:3463049 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-111:3462936:3463049 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.214<0>
c619-111:3462936:3463049 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-111:3462936:3463049 [0] NCCL INFO Using network IB
c619-111:3462936:3463049 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3462936:3463049 [0] NCCL INFO ncclCommInitRankConfig comm 0x17e74c00 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x61e212a081a15ae2 - Init START
Converting format of dataset: 7996 examples [00:00, 21885.51 examples/s]
[rank1]:[W923 22:39:05.563251029 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 1] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-102:3668607:3668607 [0] NCCL INFO cudaDriverVersion 12080
c619-102:3668607:3668607 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.213<0>
c619-102:3668607:3668607 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-102:3668607:3668607 [0] NCCL INFO Comm config Blocking set to 1
c619-102:3668607:3668720 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-102:3668607:3668720 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.213<0>
c619-102:3668607:3668720 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-102:3668607:3668720 [0] NCCL INFO Using network IB
c619-102:3668607:3668720 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3668607:3668720 [0] NCCL INFO ncclCommInitRankConfig comm 0x2d464310 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x61e212a081a15ae2 - Init START
c619-102:3668607:3668720 [0] NCCL INFO RAS client listening socket at ::1<28028>
Converting format of dataset: 7996 examples [00:00, 20442.31 examples/s]
[rank3]:[W923 22:39:06.369302824 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 3] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-112:919339:919339 [0] NCCL INFO cudaDriverVersion 12080
c619-112:919339:919339 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.215<0>
c619-112:919339:919339 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-112:919339:919339 [0] NCCL INFO Comm config Blocking set to 1
c619-112:919339:919453 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-112:919339:919453 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.215<0>
c619-112:919339:919453 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-112:919339:919453 [0] NCCL INFO Using network IB
c619-112:919339:919453 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-112:919339:919453 [0] NCCL INFO ncclCommInitRankConfig comm 0x30f13f40 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x61e212a081a15ae2 - Init START
c619-111:3462936:3463049 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-112:919339:919453 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-101:834459:834574 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-111:3462936:3463049 [0] NCCL INFO Bootstrap timings total 1.017857 (create 0.000028, send 0.000158, recv 0.981562, ring 0.000157, delay 0.000001)
c619-101:834459:834574 [0] NCCL INFO Bootstrap timings total 3.053718 (create 0.000030, send 0.000068, recv 2.296439, ring 0.000111, delay 0.000001)
c619-112:919339:919453 [0] NCCL INFO Bootstrap timings total 0.065640 (create 0.000029, send 0.000108, recv 0.000253, ring 0.000161, delay 0.000001)
c619-102:3668607:3668720 [0] NCCL INFO Bootstrap timings total 0.797232 (create 0.000027, send 0.000124, recv 0.000233, ring 0.710893, delay 0.000001)
c619-102:3668607:3668720 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-111:3462936:3463049 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-112:919339:919453 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:834459:834574 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:834459:834574 [0] NCCL INFO comm 0x131c05b0 rank 0 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-112:919339:919453 [0] NCCL INFO comm 0x30f13f40 rank 3 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-111:3462936:3463049 [0] NCCL INFO comm 0x17e74c00 rank 2 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:834459:834574 [0] NCCL INFO Channel 00/64 : 0 1 2 3
c619-101:834459:834574 [0] NCCL INFO Channel 01/64 : 0 1 2 3
c619-102:3668607:3668720 [0] NCCL INFO comm 0x2d464310 rank 1 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:834459:834574 [0] NCCL INFO Channel 02/64 ... Channel 53/64 : 0 1 2 3 (52 identical ring-order lines elided)
c619-111:3462936:3463049 [0] NCCL INFO Trees [0] 1/3/-1->2->0 [1] 1/3/-1->2->0 [2] 1/3/-1->2->0 [3] 1/3/-1->2->0 [4] 1/3/-1->2->0 [5] 1/3/-1->2->0 [6] 1/3/-1->2->0 [7] 1/3/-1->2->0 [8] 1/3/-1->2->0 [9] 1/3/-1->2->0 [10] 1/3/-1->2->0 [11] 1/3/-1->2->0 [12] 1/3/-1->2->0 [13] 1/3/-1->2->0 [14] 1/3/-1->2->0 [15] 1/3/-1->2->0 [16] 1/3/-1->2->0 [17] 1/3/-1->2->0 [18] 1/3/-1->2->0 [19] 1/3/-1->2->0 [20] 1/3/-1->2->0 [21] 1/3/-1->2->0 [22] 1/3/-1->2->0 [23] 1/3/-1->2->0 [24] 1/3/-1->2->0 [25] 1/3/-1->2->0 [26] 1/3/-1->2->0 [27] 1/3/-1->2->0 [28] 1/3/-1->2->0 [29] 1/3/-1->2->0 [30] 1/3/-1->2->0 [31] 1/3/-1->2->0 [32] -1/-1/-1->2->1 [33] -1/-1/-1->2->1 [34] -1/-1/-1->2->1 [35] -1/-1/-1->2->1 [36] -1/-1/-1->2->1 [37] -1/-1/-1->2->1 [38] -1/-1/-1->2->1 [39] -1/-1/-1->2->1 [40] -1/-1/-1->2->1 [41] -1/-1/-1->2->1 [42] -1/-1/-1->2->1 [43] -1/-1/-1->2->1 [44] -1/-1/-1->2->1 [45] -1/-1/-1->2->1 [46] -1/-1/-1->2->1 [47] -1/-1/-1->2->1 [48] -1/-1/-1->2->1 [49] -1/-1/-1->2->1 [50] -1/-1/-1->2->1 [51] -1/-1/-1->2->1 [52] -1/-1/-
c619-101:834459:834574 [0] NCCL INFO Channel 54/64 ... Channel 61/64 : 0 1 2 3 (8 identical ring-order lines elided)
c619-111:3462936:3463049 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:834459:834574 [0] NCCL INFO Channel 62/64 : 0 1 2 3
c619-101:834459:834574 [0] NCCL INFO Channel 63/64 : 0 1 2 3
c619-112:919339:919453 [0] NCCL INFO Trees [0] -1/-1/-1->3->2 [1] -1/-1/-1->3->2 [2] -1/-1/-1->3->2 [3] -1/-1/-1->3->2 [4] -1/-1/-1->3->2 [5] -1/-1/-1->3->2 [6] -1/-1/-1->3->2 [7] -1/-1/-1->3->2 [8] -1/-1/-1->3->2 [9] -1/-1/-1->3->2 [10] -1/-1/-1->3->2 [11] -1/-1/-1->3->2 [12] -1/-1/-1->3->2 [13] -1/-1/-1->3->2 [14] -1/-1/-1->3->2 [15] -1/-1/-1->3->2 [16] -1/-1/-1->3->2 [17] -1/-1/-1->3->2 [18] -1/-1/-1->3->2 [19] -1/-1/-1->3->2 [20] -1/-1/-1->3->2 [21] -1/-1/-1->3->2 [22] -1/-1/-1->3->2 [23] -1/-1/-1->3->2 [24] -1/-1/-1->3->2 [25] -1/-1/-1->3->2 [26] -1/-1/-1->3->2 [27] -1/-1/-1->3->2 [28] -1/-1/-1->3->2 [29] -1/-1/-1->3->2 [30] -1/-1/-1->3->2 [31] -1/-1/-1->3->2 [32] 1/-1/-1->3->-1 [33] 1/-1/-1->3->-1 [34] 1/-1/-1->3->-1 [35] 1/-1/-1->3->-1 [36] 1/-1/-1->3->-1 [37] 1/-1/-1->3->-1 [38] 1/-1/-1->3->-1 [39] 1/-1/-1->3->-1 [40] 1/-1/-1->3->-1 [41] 1/-1/-1->3->-1 [42] 1/-1/-1->3->-1 [43] 1/-1/-1->3->-1 [44] 1/-1/-1->3->-1 [45] 1/-1/-1->3->-1 [46] 1/-1/-1->3->-1 [47] 1/-1/-1->3->-1 [48] 1/-1/-1->3->-1 [49] 1/-1/
c619-101:834459:834574 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 2/-1/-1->0->-1 [3] 2/-1/-1->0->-1 [4] 2/-1/-1->0->-1 [5] 2/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 2/-1/-1->0->-1 [9] 2/-1/-1->0->-1 [10] 2/-1/-1->0->-1 [11] 2/-1/-1->0->-1 [12] 2/-1/-1->0->-1 [13] 2/-1/-1->0->-1 [14] 2/-1/-1->0->-1 [15] 2/-1/-1->0->-1 [16] 2/-1/-1->0->-1 [17] 2/-1/-1->0->-1 [18] 2/-1/-1->0->-1 [19] 2/-1/-1->0->-1 [20] 2/-1/-1->0->-1 [21] 2/-1/-1->0->-1 [22] 2/-1/-1->0->-1 [23] 2/-1/-1->0->-1 [24] 2/-1/-1->0->-1 [25] 2/-1/-1->0->-1 [26] 2/-1/-1->0->-1 [27] 2/-1/-1->0->-1 [28] 2/-1/-1->0->-1 [29] 2/-1/-1->0->-1 [30] 2/-1/-1->0->-1 [31] 2/-1/-1->0->-1 [32] -1/-1/-1->0->1 [33] -1/-1/-1->0->1 [34] -1/-1/-1->0->1 [35] -1/-1/-1->0->1 [36] -1/-1/-1->0->1 [37] -1/-1/-1->0->1 [38] -1/-1/-1->0->1 [39] -1/-1/-1->0->1 [40] -1/-1/-1->0->1 [41] -1/-1/-1->0->1 [42] -1/-1/-1->0->1 [43] -1/-1/-1->0->1 [44] -1/-1/-1->0->1 [45] -1/-1/-1->0->1 [46] -1/-1/-1->0->1 [47] -1/-1/-1->0->1 [48] -1/-1/-1->0->1 [49] -1/-1
c619-102:3668607:3668720 [0] NCCL INFO Trees [0] -1/-1/-1->1->2 [1] -1/-1/-1->1->2 [2] -1/-1/-1->1->2 [3] -1/-1/-1->1->2 [4] -1/-1/-1->1->2 [5] -1/-1/-1->1->2 [6] -1/-1/-1->1->2 [7] -1/-1/-1->1->2 [8] -1/-1/-1->1->2 [9] -1/-1/-1->1->2 [10] -1/-1/-1->1->2 [11] -1/-1/-1->1->2 [12] -1/-1/-1->1->2 [13] -1/-1/-1->1->2 [14] -1/-1/-1->1->2 [15] -1/-1/-1->1->2 [16] -1/-1/-1->1->2 [17] -1/-1/-1->1->2 [18] -1/-1/-1->1->2 [19] -1/-1/-1->1->2 [20] -1/-1/-1->1->2 [21] -1/-1/-1->1->2 [22] -1/-1/-1->1->2 [23] -1/-1/-1->1->2 [24] -1/-1/-1->1->2 [25] -1/-1/-1->1->2 [26] -1/-1/-1->1->2 [27] -1/-1/-1->1->2 [28] -1/-1/-1->1->2 [29] -1/-1/-1->1->2 [30] -1/-1/-1->1->2 [31] -1/-1/-1->1->2 [32] 2/0/-1->1->3 [33] 2/0/-1->1->3 [34] 2/0/-1->1->3 [35] 2/0/-1->1->3 [36] 2/0/-1->1->3 [37] 2/0/-1->1->3 [38] 2/0/-1->1->3 [39] 2/0/-1->1->3 [40] 2/0/-1->1->3 [41] 2/0/-1->1->3 [42] 2/0/-1->1->3 [43] 2/0/-1->1->3 [44] 2/0/-1->1->3 [45] 2/0/-1->1->3 [46] 2/0/-1->1->3 [47] 2/0/-1->1->3 [48] 2/0/-1->1->3 [49] 2/0/-1->1->3 [50] 2/0/-1->1->3 [51] 2
c619-112:919339:919453 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:834459:834574 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:834459:834574 [0] NCCL INFO Check P2P Type intraNodeP2pSupport 0 directMode 0
c619-102:3668607:3668720 [0] NCCL INFO P2P Chunksize set to 131072
c619-102:3668607:3668723 [0] NCCL INFO [Proxy Service] Device 0 CPU core 50
c619-111:3462936:3463053 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 35
c619-112:919339:919457 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 66
c619-102:3668607:3668724 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 51
c619-111:3462936:3463052 [0] NCCL INFO [Proxy Service] Device 0 CPU core 34
c619-112:919339:919456 [0] NCCL INFO [Proxy Service] Device 0 CPU core 65
c619-102:3668607:3668720 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-102:3668607:3668720 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-102:3668607:3668720 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-111:3462936:3463049 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-111:3462936:3463049 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-111:3462936:3463049 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-102:3668607:3668720 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-102:3668607:3668720 [0] NCCL INFO ncclCommInitRankConfig comm 0x2d464310 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x61e212a081a15ae2 - Init COMPLETE
c619-102:3668607:3668720 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 1 nranks 4 total 0.89 (kernels 0.06, alloc 0.01, bootstrap 0.80, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-112:919339:919453 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-112:919339:919453 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-112:919339:919453 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-111:3462936:3463049 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-111:3462936:3463049 [0] NCCL INFO ncclCommInitRankConfig comm 0x17e74c00 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x61e212a081a15ae2 - Init COMPLETE
c619-111:3462936:3463049 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 2 nranks 4 total 1.15 (kernels 0.06, alloc 0.01, bootstrap 1.02, allgathers 0.05, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-112:919339:919453 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-112:919339:919453 [0] NCCL INFO ncclCommInitRankConfig comm 0x30f13f40 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x61e212a081a15ae2 - Init COMPLETE
c619-112:919339:919453 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 3 nranks 4 total 0.17 (kernels 0.06, alloc 0.01, bootstrap 0.07, allgathers 0.02, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-102:3668607:3668725 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-111:3462936:3463054 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-102:3668607:3668726 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 54
c619-102:3668607:3668725 [0] NCCL INFO Channel 32/0 ... Channel 63/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI) (32 similar lines elided)
c619-112:919339:919458 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-111:3462936:3463055 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 36
c619-111:3462936:3463054 [0] NCCL INFO Channel 00/0 ... Channel 29/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI) (30 similar lines elided)
c619-102:3668607:3668725 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-102:3668607:3668725 [0] NCCL INFO Channel 00/0 ... Channel 33/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI) (34 similar lines elided)
c619-112:919339:919459 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 67
c619-112:919339:919458 [0] NCCL INFO Channel 00/0 ... Channel 15/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI) (16 similar lines elided)
c619-111:3462936:3463054 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834577 [0] NCCL INFO [Proxy Service] Device 0 CPU core 52
c619-101:834459:834578 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 50
c619-101:834459:834574 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-101:834459:834574 [0] NCCL INFO Enabled NCCL Func/Proto/Algo Matrix:
     Function      |  LL  LL128  Simple  |  Tree  Ring  CollNetDirect  CollNetChain  NVLS  NVLSTree  PAT
     Broadcast     |   0      0       1  |     1     1              1             1     1         1    1
     Reduce        |   0      0       1  |     1     1              1             1     1         1    1
     AllGather     |   0      0       1  |     1     1              1             1     1         1    1
     ReduceScatter |   0      0       1  |     1     1              1             1     1         1    1
     AllReduce     |   0      0       1  |     1     1              1             1     1         1    1
c619-101:834459:834574 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:834459:834574 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:834459:834574 [0] NCCL INFO CC Off, workFifoBytes 1048576
c619-101:834459:834574 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-101:834459:834574 [0] NCCL INFO ncclCommInitRankConfig comm 0x131c05b0 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x61e212a081a15ae2 - Init COMPLETE
c619-101:834459:834574 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 0 nranks 4 total 3.18 (kernels 0.06, alloc 0.01, bootstrap 3.05, allgathers 0.04, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:834459:834579 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-101:834459:834579 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
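[Editor's note] The "set by environment" lines above report values NCCL read from the launch environment. A minimal sketch of the exports that would produce these settings, assuming they are applied before the training processes start (e.g. in the job script on every node); NCCL_DEBUG=INFO is an assumption, inferred from the fact that INFO-level lines appear in this log:

```shell
# Sketch of the environment NCCL reports reading above (not the verbatim job script).
export NCCL_PROTO=simple        # matches: "NCCL_PROTO set by environment to simple"
export NCCL_NET_GDR_READ=1      # matches: "NCCL_NET_GDR_READ set by environment to 1."
export NCCL_NET_GDR_LEVEL=SYS   # matches: "NCCL_NET_GDR_LEVEL set by environment to SYS"
export NCCL_DEBUG=INFO          # assumed: enables the INFO lines captured in this log
```

These must be exported in the environment of every rank's process (e.g. via the batch script or launcher), since NCCL reads them at communicator init.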
c619-101:834459:834580 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 53
c619-101:834459:834579 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-111:3462936:3463054 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3462936:3463054 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:834459:834579 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-112:919339:919458 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919339:919458 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channels 32/0 through 62/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI) (31 identical lines condensed)
c619-102:3668607:3668725 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668607:3668725 [0] NCCL INFO Channels 33/0 through 62/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI) (30 identical lines condensed)
c619-102:3668607:3668725 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
[2025-09-23 22:39:06] c619-111:3462936:3463052 [0] transport/net.cc:985 NCCL WARN Cuda failure 801 'operation not supported'
c619-111:3462936:3463054 [0] NCCL INFO transport/net.cc:450 -> 1
c619-111:3462936:3463054 [0] NCCL INFO transport.cc:216 -> 1
c619-111:3462936:3463054 [0] NCCL INFO transport/generic.cc:49 -> 1
c619-111:3462936:3463054 [0] NCCL INFO group.cc:152 -> 1
c619-111:3462936:3463054 [0] NCCL INFO group.cc:75 -> 1 [Async thread]
c619-111:3462936:3462936 [0] NCCL INFO group.cc:460 -> 1
c619-111:3462936:3462936 [0] NCCL INFO group.cc:581 -> 1
c619-111:3462936:3462936 [0] NCCL INFO enqueue.cc:2299 -> 1
[2025-09-23 22:39:06] c619-102:3668607:3668723 [0] transport/net.cc:841 NCCL WARN Cuda failure 801 'operation not supported'
c619-102:3668607:3668725 [0] NCCL INFO transport.cc:197 -> 1
c619-102:3668607:3668725 [0] NCCL INFO transport/generic.cc:49 -> 1
c619-102:3668607:3668725 [0] NCCL INFO group.cc:152 -> 1
c619-102:3668607:3668725 [0] NCCL INFO group.cc:75 -> 1 [Async thread]
c619-102:3668607:3668607 [0] NCCL INFO group.cc:460 -> 1
c619-102:3668607:3668607 [0] NCCL INFO group.cc:581 -> 1
c619-102:3668607:3668607 [0] NCCL INFO enqueue.cc:2299 -> 1
[2025-09-23 22:39:06] c619-111:3462936:3463052 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
[2025-09-23 22:39:06] c619-102:3668607:3668723 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
c619-111:3462936:3463052 [0] NCCL INFO misc/socket.cc:881 -> 3
c619-102:3668607:3668723 [0] NCCL INFO misc/socket.cc:881 -> 3
[rank2]: Traceback (most recent call last):
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank2]: main()
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank2]: run_exp()
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank2]: _training_function(config={"args": args, "callbacks": callbacks})
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank2]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank2]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank2]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank2]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank2]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank2]: next(self.gen)
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank2]: dist.barrier()
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank2]: return func(*args, **kwargs)
[rank2]: ^^^^^^^^^^^^^^^^^^^^^
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank2]: work = group.barrier(opts=opts)
[rank2]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank2]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.26.2
[rank2]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank2]: Last error:
[rank2]: Cuda failure 801 'operation not supported'
[rank1]: Traceback (most recent call last):
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank1]: main()
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank1]: run_exp()
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank1]: _training_function(config={"args": args, "callbacks": callbacks})
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank1]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank1]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank1]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank1]: next(self.gen)
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank1]: dist.barrier()
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank1]: return func(*args, **kwargs)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank1]: work = group.barrier(opts=opts)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.26.2
[rank1]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank1]: Last error:
[rank1]: Cuda failure 801 'operation not supported'
[rank2]:[W923 22:39:07.532295384 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[2025-09-23 22:39:07] c619-111:3462936:3463061 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<45605>
c619-111:3462936:3463061 [0] NCCL INFO misc/socket.cc:80 -> 6
c619-111:3462936:3463061 [0] NCCL INFO misc/socket.cc:829 -> 6
c619-111:3462936:3463061 [0] NCCL INFO comm 0x17e74c00 rank 2 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
[rank1]:[W923 22:39:07.594193729 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[2025-09-23 22:39:07] c619-102:3668607:3668732 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-102.vista.tacc.utexas.edu<46269>
c619-102:3668607:3668732 [0] NCCL INFO misc/socket.cc:80 -> 6
c619-102:3668607:3668732 [0] NCCL INFO misc/socket.cc:829 -> 6
c619-102:3668607:3668732 [0] NCCL INFO comm 0x2d464310 rank 1 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
[2025-09-23 22:39:07] c619-112:919339:919456 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<56386>
c619-112:919339:919456 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-112:919339:919456 [0] NCCL INFO transport/net_ib.cc:1566 -> 6
[2025-09-23 22:39:07] c619-101:834459:834577 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<46341>
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:39:07] c619-101:834459:834577 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<48039>
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:39:07] c619-101:834459:834577 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<58199>
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:39:07] c619-101:834459:834577 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<46067>
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net.cc:747 -> 6
c619-112:919339:919456 [0] NCCL INFO transport/net.cc:901 -> 6
[2025-09-23 22:39:07] c619-101:834459:834577 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<41217>
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net.cc:747 -> 6
c619-112:919339:919458 [0] NCCL INFO transport/net.cc:450 -> 6
c619-112:919339:919458 [0] NCCL INFO transport.cc:216 -> 6
c619-112:919339:919458 [0] NCCL INFO transport/generic.cc:49 -> 6
c619-112:919339:919458 [0] NCCL INFO group.cc:152 -> 6
[2025-09-23 22:39:07] c619-101:834459:834577 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<42335>
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net.cc:747 -> 6
c619-112:919339:919458 [0] NCCL INFO group.cc:75 -> 6 [Async thread]
c619-101:834459:834579 [0] NCCL INFO transport.cc:197 -> 6
c619-101:834459:834579 [0] NCCL INFO transport/generic.cc:49 -> 6
c619-101:834459:834579 [0] NCCL INFO group.cc:152 -> 6
c619-101:834459:834579 [0] NCCL INFO group.cc:75 -> 6 [Async thread]
[2025-09-23 22:39:07] c619-112:919339:919456 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
c619-112:919339:919339 [0] NCCL INFO group.cc:460 -> 6
c619-112:919339:919339 [0] NCCL INFO group.cc:581 -> 6
c619-112:919339:919339 [0] NCCL INFO enqueue.cc:2299 -> 6
[2025-09-23 22:39:07] c619-101:834459:834577 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<43957>
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:39:07] c619-101:834459:834577 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<45627>
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:834459:834577 [0] NCCL INFO transport/net.cc:747 -> 6
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:64 -> 3
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:80 -> 3
c619-101:834459:834577 [0] NCCL INFO misc/socket.cc:829 -> 3
c619-101:834459:834577 [0] NCCL INFO proxy.cc:1486 -> 3
[2025-09-23 22:39:07] c619-101:834459:834577 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
c619-101:834459:834459 [0] NCCL INFO group.cc:460 -> 6
c619-101:834459:834459 [0] NCCL INFO group.cc:581 -> 6
c619-101:834459:834459 [0] NCCL INFO enqueue.cc:2299 -> 6
[rank0]: Traceback (most recent call last):
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank0]: main()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank0]: run_exp()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank0]: _training_function(config={"args": args, "callbacks": callbacks})
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank0]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank0]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank0]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank0]: next(self.gen)
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank0]: dist.barrier()
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank0]: return func(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank0]: work = group.barrier(opts=opts)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, remote process exited or there was a network error, NCCL version 2.26.2
[rank0]: ncclRemoteError: A call failed possibly due to a network error or a remote process exiting prematurely.
[rank0]: Last error:
[rank0]: [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
[rank3]: Traceback (most recent call last):
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank3]: main()
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank3]: run_exp()
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank3]: _training_function(config={"args": args, "callbacks": callbacks})
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank3]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank3]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank3]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank3]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank3]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank3]: next(self.gen)
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank3]: dist.barrier()
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank3]: return func(*args, **kwargs)
[rank3]: ^^^^^^^^^^^^^^^^^^^^^
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank3]: work = group.barrier(opts=opts)
[rank3]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, remote process exited or there was a network error, NCCL version 2.26.2
[rank3]: ncclRemoteError: A call failed possibly due to a network error or a remote process exiting prematurely.
[rank3]: Last error:
[rank3]: [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
E0923 22:39:08.079000 3462933 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 3462936) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:39:08
host : c619-111.vista.tacc.utexas.edu
rank : 2 (local_rank: 0)
exitcode : 1 (pid: 3462936)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0923 22:39:08.176000 3668604 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 3668607) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:39:08
host : c619-102.vista.tacc.utexas.edu
rank : 1 (local_rank: 0)
exitcode : 1 (pid: 3668607)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
[rank0]:[W923 22:39:08.998326932 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
c619-101:834459:834586 [0] NCCL INFO misc/socket.cc:64 -> 3
c619-101:834459:834586 [0] NCCL INFO misc/socket.cc:80 -> 3
c619-101:834459:834586 [0] NCCL INFO misc/socket.cc:829 -> 3
c619-101:834459:834586 [0] NCCL INFO comm 0x131c05b0 rank 0 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
[rank3]:[W923 22:39:08.125913849 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
c619-112:919339:919465 [0] NCCL INFO misc/socket.cc:64 -> 3
c619-112:919339:919465 [0] NCCL INFO misc/socket.cc:80 -> 3
c619-112:919339:919465 [0] NCCL INFO misc/socket.cc:829 -> 3
c619-112:919339:919465 [0] NCCL INFO comm 0x30f13f40 rank 3 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
E0923 22:39:09.476000 834454 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 834459) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:39:09
host : c619-101.vista.tacc.utexas.edu
rank : 0 (local_rank: 0)
exitcode : 1 (pid: 834459)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0923 22:39:09.576000 919336 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 919339) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:39:09
host : c619-112.vista.tacc.utexas.edu
rank : 3 (local_rank: 0)
exitcode : 1 (pid: 919339)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
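The `error_file: <N/A>` and "To enable traceback" lines above mean the failing child process was not wrapped with torch.elastic's error handler, so torchrun could not surface the child's traceback in its summary. A minimal sketch of that wrapping (standard `torch.distributed.elastic` usage; `main` here is a stand-in, not LLaMA-Factory's actual `src/train.py` entrypoint):

```python
from torch.distributed.elastic.multiprocessing.errors import record


@record  # on failure, records the traceback so torchrun can report it instead of "error_file: <N/A>"
def main() -> int:
    # stand-in body for the real training entrypoint
    return 0


if __name__ == "__main__":
    main()
```

With the decorator in place, the `Root Cause` block printed by the launcher includes the child's traceback rather than pointing at the docs.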
srun: error: c619-112: task 3: Exited with exit code 1
srun: error: c619-101: task 0: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Uploading the dataset shards:   0%|          | 0/1 [00:00<?, ? shards/s]
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 44.42ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 23.5MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.19 shards/s]
README.md: 1.26kB [00:00, 10.9MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 8.50k/8.50k [00:00<00:00, 21.3kB/s]
Generating train split: 100%|██████████| 55/55 [00:00<00:00, 14656.08 examples/s]
Uploading the dataset shards:   0%|          | 0/1 [00:00<?, ? shards/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2552.83ba/s]
Uploading...: 100%|██████████| 8.66k/8.66k [00:00<00:00, 10.2kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.04s/ shards]
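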
| BASELINE_r1_distillation | 30.392378 | true |
2025-09-23T22:43:28.050964 | 2025-09-23T22:43:58.210033 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
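The "Found existing dataset registration" line means the pipeline skipped re-writing an entry into LLaMA-Factory's `data/dataset_info.json` (the file the earlier PermissionError hit). A rough sketch of that registration step, assuming the standard `dataset_info.json` schema with an `hf_hub_url` field; the path and idempotency check here are illustrative, not the pipeline's actual code:

```python
import json
from pathlib import Path

# Illustrative local path; the real file lives under thirdparty/LLaMA-Factory/data/
dataset_info = Path("dataset_info.json")
entry_name = "TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train"

registry = json.loads(dataset_info.read_text()) if dataset_info.exists() else {}
if entry_name not in registry:  # "Found existing dataset registration" takes this branch's else
    registry[entry_name] = {
        # hf_hub_url points LLaMA-Factory at a dataset hosted on the Hugging Face Hub
        "hf_hub_url": "TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data",
    }
    dataset_info.write_text(json.dumps(registry, indent=2))
```

Note that the registration file is shared, which is why a read-only checkout (as in the PermissionError above) breaks the stage before training even starts.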
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
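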
[DEBUG] Loaded 59 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 60)
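The metadata bookkeeping in the two DEBUG lines (59 existing entries + 1 appended = 60) can be sketched as an append to a JSONL file; `append_metadata` and the JSONL layout are assumptions for illustration, not the pipeline's actual implementation:

```python
import json
from pathlib import Path


def append_metadata(path: Path, new_entries: list[dict]) -> int:
    """Append entries to a JSONL metadata file and return the new total count."""
    existing = path.read_text().splitlines() if path.exists() else []
    with path.open("a") as fh:
        for entry in new_entries:
            fh.write(json.dumps(entry) + "\n")
    return len(existing) + len(new_entries)
```

Appending one run record to a file holding 59 entries would then report a total of 60, matching the log.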
[DEBUG] Training Script:
#!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/13 cuda/12.6 nccl/12.4 nvidia_math/12.4
module load gcc/13 cuda/12.6 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export NCCL_PROTO="simple"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export NCCL_NET_GDR_LEVEL="SYS"
export NCCL_NET_GDR_READ="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_PROVIDER="efa"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
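The `MASTER_ADDR` line in the script above shells out to `scontrol show hostnames ... | head -n 1` to pick the rendezvous head node. Off-cluster, that selection can be approximated as follows; this is a sketch for simple nodelist shapes only (single host, comma lists, one bracketed group), not a full SLURM hostlist parser:

```python
import re


def first_host(nodelist: str) -> str:
    """Rough equivalent of `scontrol show hostnames $SLURM_JOB_NODELIST | head -n 1`
    for nodelists like 'c619-101' or 'c619-[101,112]'."""
    m = re.match(r"([^\[,]+)\[([^\]]+)\]", nodelist)
    if not m:
        # No bracket group: first comma-separated host is the head node
        return nodelist.split(",")[0]
    prefix, body = m.groups()
    first = body.split(",")[0].split("-")[0]  # a range like 101-104 starts at 101
    return prefix + first
```

For this job, `first_host("c619-[101,112]")` yields `c619-101`, which is exactly the `MASTER_ADDR` echoed below.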
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.6
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
[WARNING|2025-09-23 22:43:43] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 22:43:44,393] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[WARNING|2025-09-23 22:43:45] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[WARNING|2025-09-23 22:43:45] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[WARNING|2025-09-23 22:43:45] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 22:43:46,229] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-09-23 22:43:46,309] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[2025-09-23 22:43:46,469] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[2025-09-23 22:43:48,677] [INFO] [comm.py:669:init_distributed] cdb=None
[2025-09-23 22:43:48,678] [INFO] [comm.py:700:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
[INFO|2025-09-23 22:43:48] llamafactory.hparams.parser:406 >> Process rank: 0, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:48,883 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:48,883 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:48,883 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:48,883 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:48,883 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:48,883 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:48,883 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:43:49,057 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|configuration_utils.py:698] 2025-09-23 22:43:49,459 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:43:49,461 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:49,591 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:49,591 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:49,591 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:49,591 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:49,591 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:49,591 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:49,591 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:43:49,770 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:43:49] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
Converting format of dataset: 100%|██████████| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6182 examples [00:00, 21739.16 examples/s]
Converting format of dataset: 7996 examples [00:00, 21819.21 examples/s]
[rank0]:[W923 22:43:51.277689323 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 0] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-101:837175:837175 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.212<0>
c619-101:837175:837175 [0] NCCL INFO cudaDriverVersion 12080
c619-101:837175:837175 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-101:837175:837175 [0] NCCL INFO Comm config Blocking set to 1
c619-101:837175:837290 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-101:837175:837290 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.212<0>
c619-101:837175:837290 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-101:837175:837290 [0] NCCL INFO Using network IB
c619-101:837175:837290 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-101:837175:837290 [0] NCCL INFO ncclCommInitRankConfig comm 0x43c48470 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x5381a14a64d511d1 - Init START
[2025-09-23 22:43:51,526] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:43:51] llamafactory.hparams.parser:406 >> Process rank: 2, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:51,798 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:51,798 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:51,798 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:51,798 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:51,798 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:51,798 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:51,798 >> loading file chat_template.jinja from cache at None
[2025-09-23 22:43:51,838] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:43:51] llamafactory.hparams.parser:406 >> Process rank: 1, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:43:51,973 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,032 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,032 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,032 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,033 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,033 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,033 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,033 >> loading file chat_template.jinja from cache at None
[2025-09-23 22:43:52,112] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:43:52] llamafactory.hparams.parser:406 >> Process rank: 3, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:43:52,210 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,312 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,312 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,312 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,312 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,312 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,312 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,312 >> loading file chat_template.jinja from cache at None
[INFO|configuration_utils.py:698] 2025-09-23 22:43:52,447 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:43:52,492 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,582 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,582 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,582 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,582 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,582 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,582 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,582 >> loading file chat_template.jinja from cache at None
[INFO|configuration_utils.py:698] 2025-09-23 22:43:52,598 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,731 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,731 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,731 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,731 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,731 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,731 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:52,731 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:43:52,760 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:43:52] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
[INFO|configuration_utils.py:698] 2025-09-23 22:43:52,893 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:43:52,907 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:43:52] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:53,032 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:53,032 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:53,032 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:53,032 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:53,032 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:53,032 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:43:53,032 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:43:53,210 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:43:53] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
Converting format of dataset: 100%|██████████| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6170 examples [00:00, 21506.97 examples/s]
Converting format of dataset: 7996 examples [00:00, 21851.85 examples/s]
Converting format of dataset: 100%|██████████| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6151 examples [00:00, 21270.12 examples/s]
Converting format of dataset: 7996 examples [00:00, 21579.06 examples/s]
[rank1]:[W923 22:43:54.366866949 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 1] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
[rank2]:[W923 22:43:54.369568201 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 2] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
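The warning above can be silenced by telling `init_process_group` which device each rank owns. A hedged sketch assuming a torchrun-style launcher that sets `LOCAL_RANK`; the actual `torch.distributed` call is commented out since it needs torch and a live NCCL backend:

```python
import os

# Map this process's local rank to a CUDA device index so that
# init_process_group can be given an explicit device_id.
# Assumption: LOCAL_RANK is set by the launcher (torchrun convention).
local_rank = int(os.environ.get("LOCAL_RANK", "0"))

# import torch, torch.distributed as dist
# dist.init_process_group("nccl", device_id=torch.device(f"cuda:{local_rank}"))

print(f"cuda:{local_rank}")
```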
c619-102:3668832:3668832 [0] NCCL INFO cudaDriverVersion 12080
c619-102:3668832:3668832 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.213<0>
c619-102:3668832:3668832 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-102:3668832:3668832 [0] NCCL INFO Comm config Blocking set to 1
c619-111:3463164:3463164 [0] NCCL INFO cudaDriverVersion 12080
c619-111:3463164:3463164 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.214<0>
c619-111:3463164:3463164 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-111:3463164:3463164 [0] NCCL INFO Comm config Blocking set to 1
c619-102:3668832:3668946 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-111:3463164:3463277 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-102:3668832:3668946 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.213<0>
c619-102:3668832:3668946 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-102:3668832:3668946 [0] NCCL INFO Using network IB
c619-102:3668832:3668946 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3668832:3668946 [0] NCCL INFO ncclCommInitRankConfig comm 0x42fbff10 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x5381a14a64d511d1 - Init START
c619-111:3463164:3463277 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.214<0>
c619-111:3463164:3463277 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-111:3463164:3463277 [0] NCCL INFO Using network IB
c619-111:3463164:3463277 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3463164:3463277 [0] NCCL INFO ncclCommInitRankConfig comm 0x420dd4e0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x5381a14a64d511d1 - Init START
c619-102:3668832:3668946 [0] NCCL INFO RAS client listening socket at ::1<28028>
Converting format of dataset: 100%|██████████| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6145 examples [00:00, 21379.44 examples/s]
Converting format of dataset: 7996 examples [00:00, 21332.05 examples/s]
[rank3]:[W923 22:43:54.738665261 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 3] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-112:919569:919569 [0] NCCL INFO cudaDriverVersion 12080
c619-112:919569:919569 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.215<0>
c619-112:919569:919569 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-112:919569:919569 [0] NCCL INFO Comm config Blocking set to 1
c619-112:919569:919683 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-112:919569:919683 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.215<0>
c619-112:919569:919683 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-112:919569:919683 [0] NCCL INFO Using network IB
c619-112:919569:919683 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-112:919569:919683 [0] NCCL INFO ncclCommInitRankConfig comm 0x304a9db0 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x5381a14a64d511d1 - Init START
c619-111:3463164:3463277 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-112:919569:919683 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-101:837175:837290 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-101:837175:837290 [0] NCCL INFO Bootstrap timings total 3.421391 (create 0.000025, send 0.000067, recv 3.098860, ring 0.000137, delay 0.000001)
c619-102:3668832:3668946 [0] NCCL INFO Bootstrap timings total 0.322970 (create 0.000029, send 0.000139, recv 0.003460, ring 0.280553, delay 0.000000)
c619-112:919569:919683 [0] NCCL INFO Bootstrap timings total 0.050100 (create 0.000028, send 0.000129, recv 0.000279, ring 0.001768, delay 0.000001)
c619-111:3463164:3463277 [0] NCCL INFO Bootstrap timings total 0.339628 (create 0.000027, send 0.000148, recv 0.278851, ring 0.001779, delay 0.000000)
c619-102:3668832:3668946 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-111:3463164:3463277 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:837175:837290 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-112:919569:919683 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:837175:837290 [0] NCCL INFO comm 0x43c48470 rank 0 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:837175:837290 [0] NCCL INFO Channel 00/64 : 0 1 2 3
c619-101:837175:837290 [0] NCCL INFO Channel 01/64 : 0 1 2 3
c619-101:837175:837290 [0] NCCL INFO Channel 02/64 : 0 1 2 3
c619-111:3463164:3463277 [0] NCCL INFO comm 0x420dd4e0 rank 2 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:837175:837290 [0] NCCL INFO Channel 03/64 : 0 1 2 3
c619-101:837175:837290 [0] NCCL INFO Channel 04/64 : 0 1 2 3
c619-101:837175:837290 [0] NCCL INFO Channel 05/64 : 0 1 2 3
c619-101:837175:837290 [0] NCCL INFO Channel 06/64 : 0 1 2 3
c619-101:837175:837290 [0] NCCL INFO Channel 07/64 : 0 1 2 3
c619-101:837175:837290 [0] NCCL INFO Channel 08/64 : 0 1 2 3
c619-101:837175:837290 [0] NCCL INFO Channel 09/64 : 0 1 2 3
c619-112:919569:919683 [0] NCCL INFO comm 0x304a9db0 rank 3 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-102:3668832:3668946 [0] NCCL INFO comm 0x42fbff10 rank 1 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:837175:837290 [0] NCCL INFO Channel 10/64 … Channel 63/64 : 0 1 2 3
c619-101:837175:837290 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 2/-1/-1->0->-1 [3] 2/-1/-1->0->-1 [4] 2/-1/-1->0->-1 [5] 2/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 2/-1/-1->0->-1 [9] 2/-1/-1->0->-1 [10] 2/-1/-1->0->-1 [11] 2/-1/-1->0->-1 [12] 2/-1/-1->0->-1 [13] 2/-1/-1->0->-1 [14] 2/-1/-1->0->-1 [15] 2/-1/-1->0->-1 [16] 2/-1/-1->0->-1 [17] 2/-1/-1->0->-1 [18] 2/-1/-1->0->-1 [19] 2/-1/-1->0->-1 [20] 2/-1/-1->0->-1 [21] 2/-1/-1->0->-1 [22] 2/-1/-1->0->-1 [23] 2/-1/-1->0->-1 [24] 2/-1/-1->0->-1 [25] 2/-1/-1->0->-1 [26] 2/-1/-1->0->-1 [27] 2/-1/-1->0->-1 [28] 2/-1/-1->0->-1 [29] 2/-1/-1->0->-1 [30] 2/-1/-1->0->-1 [31] 2/-1/-1->0->-1 [32] -1/-1/-1->0->1 [33] -1/-1/-1->0->1 [34] -1/-1/-1->0->1 [35] -1/-1/-1->0->1 [36] -1/-1/-1->0->1 [37] -1/-1/-1->0->1 [38] -1/-1/-1->0->1 [39] -1/-1/-1->0->1 [40] -1/-1/-1->0->1 [41] -1/-1/-1->0->1 [42] -1/-1/-1->0->1 [43] -1/-1/-1->0->1 [44] -1/-1/-1->0->1 [45] -1/-1/-1->0->1 [46] -1/-1/-1->0->1 [47] -1/-1/-1->0->1 [48] -1/-1/-1->0->1 [49] -1/-1
c619-101:837175:837290 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:837175:837290 [0] NCCL INFO Check P2P Type intraNodeP2pSupport 0 directMode 0
c619-112:919569:919683 [0] NCCL INFO Trees [0] -1/-1/-1->3->2 [1] -1/-1/-1->3->2 [2] -1/-1/-1->3->2 [3] -1/-1/-1->3->2 [4] -1/-1/-1->3->2 [5] -1/-1/-1->3->2 [6] -1/-1/-1->3->2 [7] -1/-1/-1->3->2 [8] -1/-1/-1->3->2 [9] -1/-1/-1->3->2 [10] -1/-1/-1->3->2 [11] -1/-1/-1->3->2 [12] -1/-1/-1->3->2 [13] -1/-1/-1->3->2 [14] -1/-1/-1->3->2 [15] -1/-1/-1->3->2 [16] -1/-1/-1->3->2 [17] -1/-1/-1->3->2 [18] -1/-1/-1->3->2 [19] -1/-1/-1->3->2 [20] -1/-1/-1->3->2 [21] -1/-1/-1->3->2 [22] -1/-1/-1->3->2 [23] -1/-1/-1->3->2 [24] -1/-1/-1->3->2 [25] -1/-1/-1->3->2 [26] -1/-1/-1->3->2 [27] -1/-1/-1->3->2 [28] -1/-1/-1->3->2 [29] -1/-1/-1->3->2 [30] -1/-1/-1->3->2 [31] -1/-1/-1->3->2 [32] 1/-1/-1->3->-1 [33] 1/-1/-1->3->-1 [34] 1/-1/-1->3->-1 [35] 1/-1/-1->3->-1 [36] 1/-1/-1->3->-1 [37] 1/-1/-1->3->-1 [38] 1/-1/-1->3->-1 [39] 1/-1/-1->3->-1 [40] 1/-1/-1->3->-1 [41] 1/-1/-1->3->-1 [42] 1/-1/-1->3->-1 [43] 1/-1/-1->3->-1 [44] 1/-1/-1->3->-1 [45] 1/-1/-1->3->-1 [46] 1/-1/-1->3->-1 [47] 1/-1/-1->3->-1 [48] 1/-1/-1->3->-1 [49] 1/-1/
c619-112:919569:919683 [0] NCCL INFO P2P Chunksize set to 131072
c619-111:3463164:3463277 [0] NCCL INFO Trees [0] 1/3/-1->2->0 [1] 1/3/-1->2->0 [2] 1/3/-1->2->0 [3] 1/3/-1->2->0 [4] 1/3/-1->2->0 [5] 1/3/-1->2->0 [6] 1/3/-1->2->0 [7] 1/3/-1->2->0 [8] 1/3/-1->2->0 [9] 1/3/-1->2->0 [10] 1/3/-1->2->0 [11] 1/3/-1->2->0 [12] 1/3/-1->2->0 [13] 1/3/-1->2->0 [14] 1/3/-1->2->0 [15] 1/3/-1->2->0 [16] 1/3/-1->2->0 [17] 1/3/-1->2->0 [18] 1/3/-1->2->0 [19] 1/3/-1->2->0 [20] 1/3/-1->2->0 [21] 1/3/-1->2->0 [22] 1/3/-1->2->0 [23] 1/3/-1->2->0 [24] 1/3/-1->2->0 [25] 1/3/-1->2->0 [26] 1/3/-1->2->0 [27] 1/3/-1->2->0 [28] 1/3/-1->2->0 [29] 1/3/-1->2->0 [30] 1/3/-1->2->0 [31] 1/3/-1->2->0 [32] -1/-1/-1->2->1 [33] -1/-1/-1->2->1 [34] -1/-1/-1->2->1 [35] -1/-1/-1->2->1 [36] -1/-1/-1->2->1 [37] -1/-1/-1->2->1 [38] -1/-1/-1->2->1 [39] -1/-1/-1->2->1 [40] -1/-1/-1->2->1 [41] -1/-1/-1->2->1 [42] -1/-1/-1->2->1 [43] -1/-1/-1->2->1 [44] -1/-1/-1->2->1 [45] -1/-1/-1->2->1 [46] -1/-1/-1->2->1 [47] -1/-1/-1->2->1 [48] -1/-1/-1->2->1 [49] -1/-1/-1->2->1 [50] -1/-1/-1->2->1 [51] -1/-1/-1->2->1 [52] -1/-1/-
c619-111:3463164:3463277 [0] NCCL INFO P2P Chunksize set to 131072
c619-102:3668832:3668946 [0] NCCL INFO Trees [0] -1/-1/-1->1->2 [1] -1/-1/-1->1->2 [2] -1/-1/-1->1->2 [3] -1/-1/-1->1->2 [4] -1/-1/-1->1->2 [5] -1/-1/-1->1->2 [6] -1/-1/-1->1->2 [7] -1/-1/-1->1->2 [8] -1/-1/-1->1->2 [9] -1/-1/-1->1->2 [10] -1/-1/-1->1->2 [11] -1/-1/-1->1->2 [12] -1/-1/-1->1->2 [13] -1/-1/-1->1->2 [14] -1/-1/-1->1->2 [15] -1/-1/-1->1->2 [16] -1/-1/-1->1->2 [17] -1/-1/-1->1->2 [18] -1/-1/-1->1->2 [19] -1/-1/-1->1->2 [20] -1/-1/-1->1->2 [21] -1/-1/-1->1->2 [22] -1/-1/-1->1->2 [23] -1/-1/-1->1->2 [24] -1/-1/-1->1->2 [25] -1/-1/-1->1->2 [26] -1/-1/-1->1->2 [27] -1/-1/-1->1->2 [28] -1/-1/-1->1->2 [29] -1/-1/-1->1->2 [30] -1/-1/-1->1->2 [31] -1/-1/-1->1->2 [32] 2/0/-1->1->3 [33] 2/0/-1->1->3 [34] 2/0/-1->1->3 [35] 2/0/-1->1->3 [36] 2/0/-1->1->3 [37] 2/0/-1->1->3 [38] 2/0/-1->1->3 [39] 2/0/-1->1->3 [40] 2/0/-1->1->3 [41] 2/0/-1->1->3 [42] 2/0/-1->1->3 [43] 2/0/-1->1->3 [44] 2/0/-1->1->3 [45] 2/0/-1->1->3 [46] 2/0/-1->1->3 [47] 2/0/-1->1->3 [48] 2/0/-1->1->3 [49] 2/0/-1->1->3 [50] 2/0/-1->1->3 [51] 2
c619-102:3668832:3668946 [0] NCCL INFO P2P Chunksize set to 131072
c619-111:3463164:3463281 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 13
c619-101:837175:837293 [0] NCCL INFO [Proxy Service] Device 0 CPU core 51
c619-102:3668832:3668949 [0] NCCL INFO [Proxy Service] Device 0 CPU core 62
c619-101:837175:837294 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 52
c619-111:3463164:3463280 [0] NCCL INFO [Proxy Service] Device 0 CPU core 12
c619-102:3668832:3668950 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 64
c619-101:837175:837290 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-101:837175:837290 [0] NCCL INFO Enabled NCCL Func/Proto/Algo Matrix:
Function | LL LL128 Simple | Tree Ring CollNetDirect CollNetChain NVLS NVLSTree PAT
Broadcast | 0 0 1 | 1 1 1 1 1 1 1
Reduce | 0 0 1 | 1 1 1 1 1 1 1
AllGather | 0 0 1 | 1 1 1 1 1 1 1
ReduceScatter | 0 0 1 | 1 1 1 1 1 1 1
AllReduce | 0 0 1 | 1 1 1 1 1 1 1
c619-101:837175:837290 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:837175:837290 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:837175:837290 [0] NCCL INFO CC Off, workFifoBytes 1048576
c619-111:3463164:3463277 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-111:3463164:3463277 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-111:3463164:3463277 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:837175:837290 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-101:837175:837290 [0] NCCL INFO ncclCommInitRankConfig comm 0x43c48470 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x5381a14a64d511d1 - Init COMPLETE
c619-101:837175:837290 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 0 nranks 4 total 3.52 (kernels 0.06, alloc 0.00, bootstrap 3.42, allgathers 0.02, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-111:3463164:3463277 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-111:3463164:3463277 [0] NCCL INFO ncclCommInitRankConfig comm 0x420dd4e0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x5381a14a64d511d1 - Init COMPLETE
c619-111:3463164:3463277 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 2 nranks 4 total 0.42 (kernels 0.06, alloc 0.00, bootstrap 0.34, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-102:3668832:3668946 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-102:3668832:3668946 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-102:3668832:3668946 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-102:3668832:3668946 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-102:3668832:3668946 [0] NCCL INFO ncclCommInitRankConfig comm 0x42fbff10 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x5381a14a64d511d1 - Init COMPLETE
c619-102:3668832:3668946 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 1 nranks 4 total 0.42 (kernels 0.06, alloc 0.00, bootstrap 0.32, allgathers 0.02, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:837175:837295 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-101:837175:837295 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-111:3463164:3463282 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-102:3668832:3668951 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-101:837175:837296 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 53
c619-101:837175:837295 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463283 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 14
c619-101:837175:837295 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919686 [0] NCCL INFO [Proxy Service] Device 0 CPU core 64
c619-112:919569:919687 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 65
c619-112:919569:919683 [0] NCCL INFO NCCL_PROTO set by environment to simple
c619-112:919569:919683 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-112:919569:919683 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-112:919569:919683 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-112:919569:919683 [0] NCCL INFO ncclCommInitRankConfig comm 0x304a9db0 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x5381a14a64d511d1 - Init COMPLETE
c619-111:3463164:3463282 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919683 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 3 nranks 4 total 0.14 (kernels 0.06, alloc 0.00, bootstrap 0.05, allgathers 0.01, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-111:3463164:3463282 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO NCCL_NET_GDR_LEVEL set by environment to SYS
c619-111:3463164:3463282 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919689 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 66
c619-112:919569:919688 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668952 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 65
c619-101:837175:837295 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channels 17/0-31/0 : 2[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channels 41/0-63/0 : 0[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channels 30/0-51/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-111:3463164:3463282 [0] NCCL INFO Channels 52/0-56/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channels 00/0-63/0 : 1[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channels 00/0-31/0 : 2[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channels 00/0-31/0 : 0[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channels 57/0-63/0 : 1[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-111:3463164:3463282 [0] NCCL INFO Channels 00/0-31/0 : 2[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channels 32/0-56/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channels 00/0-31/0 : 0[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channels 32/0-56/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channels 00/0-31/0 : 2[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-111:3463164:3463282 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-101:837175:837295 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO NCCL_NET_GDR_READ set by environment to 1.
c619-112:919569:919688 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-112:919569:919688 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [receive] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
c619-102:3668832:3668951 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [send] via NET/IB/1/GDRDMA(PCI)
[2025-09-23 22:43:54] c619-102:3668832:3668949 [0] transport/net.cc:841 NCCL WARN Cuda failure 801 'operation not supported'
c619-102:3668832:3668951 [0] NCCL INFO transport.cc:197 -> 1
c619-102:3668832:3668951 [0] NCCL INFO transport/generic.cc:49 -> 1
c619-102:3668832:3668951 [0] NCCL INFO group.cc:152 -> 1
c619-102:3668832:3668951 [0] NCCL INFO group.cc:75 -> 1 [Async thread]
c619-102:3668832:3668832 [0] NCCL INFO group.cc:460 -> 1
c619-102:3668832:3668832 [0] NCCL INFO group.cc:581 -> 1
c619-102:3668832:3668832 [0] NCCL INFO enqueue.cc:2299 -> 1
c619-102:3668832:3668949 [0] NCCL INFO proxy.cc:1528 -> 3
[2025-09-23 22:43:54] c619-112:919569:919686 [0] transport/net.cc:985 NCCL WARN Cuda failure 801 'operation not supported'
c619-112:919569:919688 [0] NCCL INFO transport/net.cc:450 -> 1
c619-112:919569:919688 [0] NCCL INFO transport.cc:216 -> 1
c619-112:919569:919688 [0] NCCL INFO transport/generic.cc:49 -> 1
c619-112:919569:919688 [0] NCCL INFO group.cc:152 -> 1
c619-112:919569:919688 [0] NCCL INFO group.cc:75 -> 1 [Async thread]
[2025-09-23 22:43:54] c619-111:3463164:3463280 [0] transport/net.cc:841 NCCL WARN Cuda failure 801 'operation not supported'
c619-112:919569:919569 [0] NCCL INFO group.cc:460 -> 1
c619-112:919569:919569 [0] NCCL INFO group.cc:581 -> 1
c619-112:919569:919569 [0] NCCL INFO enqueue.cc:2299 -> 1
[2025-09-23 22:43:54] c619-112:919569:919686 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
c619-112:919569:919686 [0] NCCL INFO misc/socket.cc:881 -> 3
[2025-09-23 22:43:54] c619-111:3463164:3463280 [0] transport/net.cc:985 NCCL WARN Cuda failure 801 'operation not supported'
[2025-09-23 22:43:54] c619-111:3463164:3463280 [0] transport/net.cc:841 NCCL WARN Cuda failure 801 'operation not supported'
c619-111:3463164:3463282 [0] NCCL INFO transport.cc:197 -> 1
c619-111:3463164:3463282 [0] NCCL INFO transport/generic.cc:49 -> 1
c619-111:3463164:3463282 [0] NCCL INFO group.cc:152 -> 1
c619-111:3463164:3463282 [0] NCCL INFO group.cc:75 -> 1 [Async thread]
c619-111:3463164:3463164 [0] NCCL INFO group.cc:460 -> 1
c619-111:3463164:3463164 [0] NCCL INFO group.cc:581 -> 1
c619-111:3463164:3463164 [0] NCCL INFO enqueue.cc:2299 -> 1
[2025-09-23 22:43:54] c619-111:3463164:3463280 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
c619-111:3463164:3463280 [0] NCCL INFO misc/socket.cc:881 -> 3
[rank1]: Traceback (most recent call last):
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank1]: main()
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank1]: run_exp()
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank1]: _training_function(config={"args": args, "callbacks": callbacks})
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank1]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank1]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank1]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank1]: next(self.gen)
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank1]: dist.barrier()
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank1]: return func(*args, **kwargs)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank1]: work = group.barrier(opts=opts)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.26.2
[rank1]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank1]: Last error:
[rank1]: Cuda failure 801 'operation not supported'
[rank3]: Traceback (most recent call last):
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank3]: main()
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank3]: run_exp()
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank3]: _training_function(config={"args": args, "callbacks": callbacks})
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank3]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank3]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank3]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank3]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank3]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank3]: next(self.gen)
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank3]: dist.barrier()
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank3]: return func(*args, **kwargs)
[rank3]: ^^^^^^^^^^^^^^^^^^^^^
[rank3]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank3]: work = group.barrier(opts=opts)
[rank3]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.26.2
[rank3]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank3]: Last error:
[rank3]: [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
[rank2]: Traceback (most recent call last):
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank2]: main()
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank2]: run_exp()
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank2]: _training_function(config={"args": args, "callbacks": callbacks})
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank2]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank2]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank2]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank2]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank2]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank2]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank2]: next(self.gen)
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank2]: dist.barrier()
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank2]: return func(*args, **kwargs)
[rank2]: ^^^^^^^^^^^^^^^^^^^^^
[rank2]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank2]: work = group.barrier(opts=opts)
[rank2]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank2]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.26.2
[rank2]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank2]: Last error:
[rank2]: Cuda failure 801 'operation not supported'
[rank1]:[W923 22:43:55.842565463 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[2025-09-23 22:43:55] c619-102:3668832:3668958 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-102.vista.tacc.utexas.edu<54507>
c619-102:3668832:3668958 [0] NCCL INFO misc/socket.cc:80 -> 6
c619-102:3668832:3668958 [0] NCCL INFO misc/socket.cc:829 -> 6
c619-102:3668832:3668958 [0] NCCL INFO comm 0x42fbff10 rank 1 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
[rank3]:[W923 22:43:55.982119155 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[2025-09-23 22:43:55] c619-112:919569:919695 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-112.vista.tacc.utexas.edu<53555>
c619-112:919569:919695 [0] NCCL INFO misc/socket.cc:80 -> 6
c619-112:919569:919695 [0] NCCL INFO misc/socket.cc:829 -> 6
c619-112:919569:919695 [0] NCCL INFO comm 0x304a9db0 rank 3 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
[rank2]:[W923 22:43:55.947239118 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
[2025-09-23 22:43:55] c619-111:3463164:3463289 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-111.vista.tacc.utexas.edu<59541>
c619-111:3463164:3463289 [0] NCCL INFO misc/socket.cc:80 -> 6
c619-111:3463164:3463289 [0] NCCL INFO misc/socket.cc:829 -> 6
c619-111:3463164:3463289 [0] NCCL INFO comm 0x420dd4e0 rank 2 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
[2025-09-23 22:43:56] c619-101:837175:837293 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-102.vista.tacc.utexas.edu<36109>
c619-101:837175:837293 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:837175:837293 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:837175:837293 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:43:56] c619-101:837175:837293 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-102.vista.tacc.utexas.edu<54363>
c619-101:837175:837293 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:837175:837293 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:837175:837293 [0] NCCL INFO transport/net.cc:747 -> 6
[2025-09-23 22:43:56] c619-101:837175:837293 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-102.vista.tacc.utexas.edu<51851>
c619-101:837175:837293 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:837175:837293 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:837175:837293 [0] NCCL INFO transport/net.cc:747 -> 6
c619-101:837175:837295 [0] NCCL INFO transport.cc:197 -> 6
[2025-09-23 22:43:56] c619-101:837175:837293 [0] misc/socket.cc:71 NCCL WARN socketProgress: Connection closed by remote peer i619-102.vista.tacc.utexas.edu<37669>
c619-101:837175:837293 [0] NCCL INFO misc/socket.cc:806 -> 6
c619-101:837175:837293 [0] NCCL INFO transport/net_ib.cc:1250 -> 6
c619-101:837175:837293 [0] NCCL INFO transport/net.cc:747 -> 6
c619-101:837175:837295 [0] NCCL INFO transport/generic.cc:49 -> 6
c619-101:837175:837295 [0] NCCL INFO group.cc:152 -> 6
c619-101:837175:837295 [0] NCCL INFO group.cc:75 -> 6 [Async thread]
[2025-09-23 22:43:56] c619-101:837175:837293 [0] proxy.cc:1658 NCCL WARN [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
c619-101:837175:837175 [0] NCCL INFO group.cc:460 -> 6
c619-101:837175:837175 [0] NCCL INFO group.cc:581 -> 6
c619-101:837175:837175 [0] NCCL INFO enqueue.cc:2299 -> 6
[rank0]: Traceback (most recent call last):
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank0]: main()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank0]: run_exp()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank0]: _training_function(config={"args": args, "callbacks": callbacks})
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank0]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 51, in run_sft
[rank0]: dataset_module = get_dataset(template, model_args, data_args, training_args, stage="sft", **tokenizer_module)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/data/loader.py", line 303, in get_dataset
[rank0]: with training_args.main_process_first(desc="load dataset", local=(not data_args.data_shared_file_system)):
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/contextlib.py", line 144, in __exit__
[rank0]: next(self.gen)
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/training_args.py", line 2460, in main_process_first
[rank0]: dist.barrier()
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank0]: return func(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 4622, in barrier
[rank0]: work = group.barrier(opts=opts)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: torch.distributed.DistBackendError: NCCL error in: /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:3353, remote process exited or there was a network error, NCCL version 2.26.2
[rank0]: ncclRemoteError: A call failed possibly due to a network error or a remote process exiting prematurely.
[rank0]: Last error:
[rank0]: [Service thread] Error encountered progressing operation=Connect, res=3, closing connection
E0923 22:43:56.427000 919565 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 919569) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
E0923 22:43:56.427000 3668828 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 3668832) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:43:56
host : c619-102.vista.tacc.utexas.edu
rank : 1 (local_rank: 0)
exitcode : 1 (pid: 3668832)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:43:56
host : c619-112.vista.tacc.utexas.edu
rank : 3 (local_rank: 0)
exitcode : 1 (pid: 919569)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0923 22:43:56.527000 3463161 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 3463164) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:43:56
host : c619-111.vista.tacc.utexas.edu
rank : 2 (local_rank: 0)
exitcode : 1 (pid: 3463164)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
srun: error: c619-102: task 1: Exited with exit code 1
srun: error: c619-112: task 3: Exited with exit code 1
srun: error: c619-111: task 2: Exited with exit code 1
[rank0]:[W923 22:43:57.317014921 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
c619-101:837175:837302 [0] NCCL INFO misc/socket.cc:64 -> 3
c619-101:837175:837302 [0] NCCL INFO misc/socket.cc:80 -> 3
c619-101:837175:837302 [0] NCCL INFO misc/socket.cc:829 -> 3
c619-101:837175:837302 [0] NCCL INFO comm 0x43c48470 rank 0 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
E0923 22:43:57.826000 837170 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 837175) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:43:57
host : c619-101.vista.tacc.utexas.edu
rank : 0 (local_rank: 0)
exitcode : 1 (pid: 837175)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
srun: error: c619-101: task 0: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Uploading the dataset shards:   0%|          | 0/1 [00:00<?, ? shards/s]
Creating parquet from Arrow format: 100%|████████████████████| 4/4 [00:00<00:00, 44.37ba/s]
Uploading...: 100%|████████████████████| 12.9M/12.9M [00:00<00:00, 23.4MB/s]
Uploading the dataset shards: 100%|████████████████████| 1/1 [00:01<00:00, 1.12s/ shards]
README.md: 1.26kB [00:00, 10.8MB/s]
metadata/train-00000-of-00001.parquet: 100%|████████████████████| 8.74k/8.74k [00:00<00:00, 24.9kB/s]
Generating train split: 100%|████████████████████| 59/59 [00:00<00:00, 16030.57 examples/s]
Uploading the dataset shards:   0%|          | 0/1 [00:00<?, ? shards/s]
Creating parquet from Arrow format: 100%|████████████████████| 1/1 [00:00<00:00, 2448.51ba/s]
Uploading...: 100%|████████████████████| 8.88k/8.88k [00:00<00:00, 10.5kB/s]
Uploading the dataset shards: 100%|████████████████████| 1/1 [00:01<00:00, 1.11s/ shards]
| BASELINE_r1_distillation | 30.159069 | true |
2025-09-23T22:50:28.137482 | 2025-09-23T22:50:34.886852 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 63 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 64)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/14 cuda/12.4 nccl/12.4 nvidia_math/12.4
module load gcc/14 cuda/12.4 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "π Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "π Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "π Working directory: $(pwd)"
echo "π Python path: $(which python)"
echo "π₯ Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.4 2) gcc/13.2.0 => gcc/14.2.0
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-111: task 2: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-102: task 1: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-112: task 3: Exited with exit code 1
Traceback (most recent call last):
File "<frozen runpy>", line 189, in _run_module_as_main
File "<frozen runpy>", line 112, in _get_module_details
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/__init__.py", line 409, in <module>
from torch._C import * # noqa: F403
^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so: undefined symbol: cuptiActivityEnableDriverApi, version libcupti.so.12
srun: error: c619-101: task 0: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
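The four identical tracebacks point at a CUDA toolkit mismatch rather than a code bug: this PyTorch build expects a `libcupti.so.12` that exports `cuptiActivityEnableDriverApi`, but the `cuda/12.4` module loaded by this script provides an older CUPTI without that symbol (the later attempt that loads `cuda/12.8` instead gets past the import). As a hedged, minimal sketch (the helper name and regex are illustrative, not part of this pipeline), the offending library, symbol, and version requirement can be pulled out of such a message programmatically:

```python
import re

# Illustrative sketch: extract (library, symbol, required version) from a
# dynamic-linker "undefined symbol" ImportError like the one in the log,
# so mismatched toolkit modules are easier to spot at a glance.
PATTERN = re.compile(
    r"(?P<lib>\S+\.so[\w.]*): undefined symbol: (?P<sym>\w+)"
    r"(?:, version (?P<ver>\S+))?"
)

def parse_undefined_symbol(message: str):
    m = PATTERN.search(message)
    return (m.group("lib"), m.group("sym"), m.group("ver")) if m else None

msg = ("libtorch_cpu.so: undefined symbol: "
       "cuptiActivityEnableDriverApi, version libcupti.so.12")
print(parse_undefined_symbol(msg))
# prints ('libtorch_cpu.so', 'cuptiActivityEnableDriverApi', 'libcupti.so.12')
```

Loading the `cuda/12.8` module, as the subsequent run in this log does, resolves the symbol at import time.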
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 44.41ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 23.2MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.16 shards/s]
README.md: 1.26kB [00:00, 9.78MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 8.95k/8.95k [00:00<00:00, 25.6kB/s]
Generating train split: 100%|██████████| 63/63 [00:00<00:00, 15409.44 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2355.03ba/s]
Uploading...: 100%|██████████| 9.15k/9.15k [00:00<00:00, 10.7kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.03s/ shards]
| BASELINE_r1_distillation | 6.74937 | true |
2025-09-23T22:53:24.140983 | 2025-09-23T22:55:19.032481 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 67 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 68)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash;module load gcc/14 cuda/12.8 nccl/12.4 nvidia_math/12.4
module load gcc/14 cuda/12.8 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "π Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "π Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "π Working directory: $(pwd)"
echo "π Python path: $(which python)"
echo "π₯ Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) gcc/13.2.0 => gcc/14.2.0
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
[WARNING|2025-09-23 22:53:39] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 22:53:40,456] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[WARNING|2025-09-23 22:53:41] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 22:53:42,142] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-09-23 22:53:42,267] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-09-23 22:53:42,559] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[2025-09-23 22:53:44,836] [INFO] [comm.py:669:init_distributed] cdb=None
[2025-09-23 22:53:44,836] [INFO] [comm.py:700:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
[INFO|2025-09-23 22:53:44] llamafactory.hparams.parser:406 >> Process rank: 0, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,023 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,024 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,024 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,024 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,024 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,024 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,024 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:53:45,200 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|configuration_utils.py:698] 2025-09-23 22:53:45,594 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:53:45,596 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
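As a quick sanity check on the logged Qwen2Config, the parameter count of Qwen2.5-1.5B-Instruct can be estimated from those fields alone. This is an illustrative back-of-the-envelope sketch, not code from the training pipeline; it assumes standard Qwen2 layer shapes (Q/K/V projections with bias, bias-free O and MLP projections, RMSNorm weights, tied input/output embeddings):

```python
# Field values copied from the Qwen2Config block above.
hidden = 1536       # hidden_size
layers = 28         # num_hidden_layers
heads = 12          # num_attention_heads
kv_heads = 2        # num_key_value_heads (grouped-query attention)
inter = 8960        # intermediate_size
vocab = 151936      # vocab_size

head_dim = hidden // heads            # 128
kv_dim = kv_heads * head_dim          # 256 (shared K/V width under GQA)

attn = (hidden * hidden + hidden          # Q projection (+ bias)
        + 2 * (hidden * kv_dim + kv_dim)  # K and V projections (+ bias)
        + hidden * hidden)                # O projection (no bias)
mlp = 3 * hidden * inter                  # gate, up, and down projections
norms = 2 * hidden                        # two RMSNorms per layer

per_layer = attn + mlp + norms
# Tied embeddings (tie_word_embeddings: true) are counted once; + final norm.
total = layers * per_layer + vocab * hidden + hidden
print(f"{total / 1e9:.2f}B parameters")
# prints "1.54B parameters"
```

The grouped-query attention visible in the config (12 query heads sharing 2 key/value heads) is why the K/V projections are so much narrower than Q.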
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,735 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,735 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,735 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,735 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,735 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,735 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:45,735 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:53:45,912 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:53:45] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
Converting format of dataset: 7996 examples [00:00, 21959.13 examples/s]
[rank0]:[W923 22:53:47.345402589 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 0] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-101:842814:842814 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.212<0>
c619-101:842814:842814 [0] NCCL INFO cudaDriverVersion 12080
c619-101:842814:842814 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-101:842814:842814 [0] NCCL INFO Comm config Blocking set to 1
c619-101:842814:842929 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-101:842814:842929 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.212<0>
c619-101:842814:842929 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-101:842814:842929 [0] NCCL INFO Using network IB
c619-101:842814:842929 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-101:842814:842929 [0] NCCL INFO ncclCommInitRankConfig comm 0x322cdd00 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xa172d874b3fc1dde - Init START
[2025-09-23 22:53:47,469] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:53:47] llamafactory.hparams.parser:406 >> Process rank: 2, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:47,664 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:47,664 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:47,664 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:47,664 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:47,664 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:47,664 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:47,664 >> loading file chat_template.jinja from cache at None
[2025-09-23 22:53:47,776] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:53:47] llamafactory.hparams.parser:406 >> Process rank: 1, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:53:47,840 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,248 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,248 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,248 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,248 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,248 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,248 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,248 >> loading file chat_template.jinja from cache at None
[INFO|configuration_utils.py:698] 2025-09-23 22:53:48,249 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:53:48,252 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[2025-09-23 22:53:48,293] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:53:48] llamafactory.hparams.parser:406 >> Process rank: 3, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,381 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,381 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,381 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,381 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,381 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,381 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,381 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:53:48,424 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,527 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,527 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,527 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,527 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,527 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,527 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,527 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:53:48,558 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:53:48] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
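The advisory above says recent `datasets` releases no longer accept `trust_remote_code`, so any dataset registry entry that still sets it should drop the key. A minimal sketch of that cleanup, assuming a hypothetical LLaMA-Factory-style `dataset_info.json` entry (the field names here are illustrative, not copied from the repo):

```python
import json

# Hypothetical dataset_info.json registry (structure assumed for illustration).
dataset_info = {
    "sft_data": {
        "hf_hub_url": "TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data",
        "trust_remote_code": True,  # no longer supported by recent `datasets`
        "formatting": "sharegpt",
    }
}

def strip_trust_remote_code(info: dict) -> dict:
    """Return a copy of the registry with every `trust_remote_code` key removed."""
    return {
        name: {k: v for k, v in entry.items() if k != "trust_remote_code"}
        for name, entry in info.items()
    }

cleaned = strip_trust_remote_code(dataset_info)
print(json.dumps(cleaned, indent=2))
```

If the dataset itself still ships a loading script, removing the key is not enough; per the message, the dataset has to be converted to a standard format such as Parquet.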
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:53:48,708 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|configuration_utils.py:698] 2025-09-23 22:53:48,833 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:53:48,835 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
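A few derived quantities follow directly from the Qwen2Config dump above; this sketch recomputes the per-head dimension and the grouped-query-attention ratio from the printed values (values copied from the config, arithmetic is ours):

```python
# Values copied from the Qwen2Config dump above.
config = {
    "hidden_size": 1536,
    "num_attention_heads": 12,
    "num_key_value_heads": 2,
    "num_hidden_layers": 28,
    "intermediate_size": 8960,
    "vocab_size": 151936,
}

# Per-head dimension: 1536 / 12 = 128.
head_dim = config["hidden_size"] // config["num_attention_heads"]

# GQA grouping: 12 query heads share 2 KV heads, i.e. 6 query heads per KV head.
gqa_group = config["num_attention_heads"] // config["num_key_value_heads"]

print(head_dim, gqa_group)  # 128 6
```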
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,966 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,966 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,966 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,966 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,966 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,966 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:48,966 >> loading file chat_template.jinja from cache at None
[INFO|configuration_utils.py:698] 2025-09-23 22:53:49,119 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:53:49,121 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:53:49,143 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:53:49] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:49,256 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:49,257 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:49,257 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:49,257 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:49,257 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:49,257 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:53:49,257 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:53:49,441 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:53:49] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
Converting format of dataset: 7996 examples [00:00, 21869.18 examples/s]
[rank2]:[W923 22:53:49.005396243 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 2] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
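The warning above appears because the process group was initialized before the process bound itself to a GPU, so NCCL cannot verify the rank-to-device mapping. The usual fix is to derive the device index from the `LOCAL_RANK` variable that launchers such as torchrun export, then pass it as `device_id` to `init_process_group()` as the warning suggests. A minimal stdlib sketch of that mapping (the helper name is ours):

```python
import os

def local_device_index(default: int = 0) -> int:
    """Resolve which GPU this process should claim, using the LOCAL_RANK
    environment variable that distributed launchers such as torchrun set."""
    return int(os.environ.get("LOCAL_RANK", default))

# The resolved index would then be passed along the lines of
#   device_id=torch.device("cuda", local_device_index())
# to torch.distributed.init_process_group (per the warning's suggestion).
os.environ["LOCAL_RANK"] = "2"
print(local_device_index())  # 2 when LOCAL_RANK=2
```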
c619-111:3463502:3463502 [0] NCCL INFO cudaDriverVersion 12080
c619-111:3463502:3463502 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.214<0>
c619-111:3463502:3463502 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-111:3463502:3463502 [0] NCCL INFO Comm config Blocking set to 1
c619-111:3463502:3463615 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-111:3463502:3463615 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.214<0>
c619-111:3463502:3463615 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-111:3463502:3463615 [0] NCCL INFO Using network IB
c619-111:3463502:3463615 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3463502:3463615 [0] NCCL INFO ncclCommInitRankConfig comm 0x16f277a0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xa172d874b3fc1dde - Init START
Converting format of dataset: 7996 examples [00:00, 21697.56 examples/s]
[rank1]:[W923 22:53:50.582671811 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 1] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-102:3669167:3669167 [0] NCCL INFO cudaDriverVersion 12080
c619-102:3669167:3669167 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.213<0>
c619-102:3669167:3669167 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-102:3669167:3669167 [0] NCCL INFO Comm config Blocking set to 1
c619-102:3669167:3669280 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-102:3669167:3669280 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.213<0>
c619-102:3669167:3669280 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-102:3669167:3669280 [0] NCCL INFO Using network IB
c619-102:3669167:3669280 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3669167:3669280 [0] NCCL INFO ncclCommInitRankConfig comm 0x23120610 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xa172d874b3fc1dde - Init START
c619-102:3669167:3669280 [0] NCCL INFO RAS client listening socket at ::1<28028>
Converting format of dataset: 7996 examples [00:00, 20687.85 examples/s]
[rank3]:[W923 22:53:50.939178830 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 3] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-112:919914:919914 [0] NCCL INFO cudaDriverVersion 12080
c619-112:919914:919914 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.215<0>
c619-112:919914:919914 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-112:919914:919914 [0] NCCL INFO Comm config Blocking set to 1
c619-112:919914:920027 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-112:919914:920027 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.215<0>
c619-112:919914:920027 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-112:919914:920027 [0] NCCL INFO Using network IB
c619-112:919914:920027 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-101:842814:842929 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-111:3463502:3463615 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-112:919914:920027 [0] NCCL INFO ncclCommInitRankConfig comm 0x2e9380e0 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xa172d874b3fc1dde - Init START
c619-112:919914:920027 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-111:3463502:3463615 [0] NCCL INFO Bootstrap timings total 0.893514 (create 0.000028, send 0.000146, recv 0.845004, ring 0.000148, delay 0.000001)
c619-101:842814:842929 [0] NCCL INFO Bootstrap timings total 3.567199 (create 0.000029, send 0.000071, recv 3.241422, ring 0.001488, delay 0.000001)
c619-102:3669167:3669280 [0] NCCL INFO Bootstrap timings total 0.326157 (create 0.000027, send 0.000135, recv 0.000270, ring 0.267554, delay 0.000001)
c619-112:919914:920027 [0] NCCL INFO Bootstrap timings total 0.089575 (create 0.000028, send 0.000135, recv 0.000321, ring 0.000210, delay 0.000001)
c619-102:3669167:3669280 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-112:919914:920027 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:842814:842929 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-111:3463502:3463615 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:842814:842929 [0] NCCL INFO comm 0x322cdd00 rank 0 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:842814:842929 [0] NCCL INFO Channel 00/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 01/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 02/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 03/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 04/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 05/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 06/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 07/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 08/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 09/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 10/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 11/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 12/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 13/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 14/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 15/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 16/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 17/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 18/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 19/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 20/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 21/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 22/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 23/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 24/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 25/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 26/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 27/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 28/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 29/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 30/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 31/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 32/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 33/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 34/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 35/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 36/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 37/64 : 0 1 2 3
c619-102:3669167:3669280 [0] NCCL INFO comm 0x23120610 rank 1 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:842814:842929 [0] NCCL INFO Channel 38/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 39/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 40/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 41/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 42/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 43/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 44/64 : 0 1 2 3
c619-111:3463502:3463615 [0] NCCL INFO comm 0x16f277a0 rank 2 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-112:919914:920027 [0] NCCL INFO comm 0x2e9380e0 rank 3 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:842814:842929 [0] NCCL INFO Channel 45/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 46/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 47/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 48/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 49/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 50/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 51/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 52/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 53/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 54/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 55/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 56/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 57/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 58/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 59/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 60/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 61/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 62/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Channel 63/64 : 0 1 2 3
c619-101:842814:842929 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 2/-1/-1->0->-1 [3] 2/-1/-1->0->-1 [4] 2/-1/-1->0->-1 [5] 2/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 2/-1/-1->0->-1 [9] 2/-1/-1->0->-1 [10] 2/-1/-1->0->-1 [11] 2/-1/-1->0->-1 [12] 2/-1/-1->0->-1 [13] 2/-1/-1->0->-1 [14] 2/-1/-1->0->-1 [15] 2/-1/-1->0->-1 [16] 2/-1/-1->0->-1 [17] 2/-1/-1->0->-1 [18] 2/-1/-1->0->-1 [19] 2/-1/-1->0->-1 [20] 2/-1/-1->0->-1 [21] 2/-1/-1->0->-1 [22] 2/-1/-1->0->-1 [23] 2/-1/-1->0->-1 [24] 2/-1/-1->0->-1 [25] 2/-1/-1->0->-1 [26] 2/-1/-1->0->-1 [27] 2/-1/-1->0->-1 [28] 2/-1/-1->0->-1 [29] 2/-1/-1->0->-1 [30] 2/-1/-1->0->-1 [31] 2/-1/-1->0->-1 [32] -1/-1/-1->0->1 [33] -1/-1/-1->0->1 [34] -1/-1/-1->0->1 [35] -1/-1/-1->0->1 [36] -1/-1/-1->0->1 [37] -1/-1/-1->0->1 [38] -1/-1/-1->0->1 [39] -1/-1/-1->0->1 [40] -1/-1/-1->0->1 [41] -1/-1/-1->0->1 [42] -1/-1/-1->0->1 [43] -1/-1/-1->0->1 [44] -1/-1/-1->0->1 [45] -1/-1/-1->0->1 [46] -1/-1/-1->0->1 [47] -1/-1/-1->0->1 [48] -1/-1/-1->0->1 [49] -1/-1
c619-101:842814:842929 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:842814:842929 [0] NCCL INFO Check P2P Type intraNodeP2pSupport 0 directMode 0
c619-102:3669167:3669280 [0] NCCL INFO Trees [0] -1/-1/-1->1->2 [1] -1/-1/-1->1->2 [2] -1/-1/-1->1->2 [3] -1/-1/-1->1->2 [4] -1/-1/-1->1->2 [5] -1/-1/-1->1->2 [6] -1/-1/-1->1->2 [7] -1/-1/-1->1->2 [8] -1/-1/-1->1->2 [9] -1/-1/-1->1->2 [10] -1/-1/-1->1->2 [11] -1/-1/-1->1->2 [12] -1/-1/-1->1->2 [13] -1/-1/-1->1->2 [14] -1/-1/-1->1->2 [15] -1/-1/-1->1->2 [16] -1/-1/-1->1->2 [17] -1/-1/-1->1->2 [18] -1/-1/-1->1->2 [19] -1/-1/-1->1->2 [20] -1/-1/-1->1->2 [21] -1/-1/-1->1->2 [22] -1/-1/-1->1->2 [23] -1/-1/-1->1->2 [24] -1/-1/-1->1->2 [25] -1/-1/-1->1->2 [26] -1/-1/-1->1->2 [27] -1/-1/-1->1->2 [28] -1/-1/-1->1->2 [29] -1/-1/-1->1->2 [30] -1/-1/-1->1->2 [31] -1/-1/-1->1->2 [32] 2/0/-1->1->3 [33] 2/0/-1->1->3 [34] 2/0/-1->1->3 [35] 2/0/-1->1->3 [36] 2/0/-1->1->3 [37] 2/0/-1->1->3 [38] 2/0/-1->1->3 [39] 2/0/-1->1->3 [40] 2/0/-1->1->3 [41] 2/0/-1->1->3 [42] 2/0/-1->1->3 [43] 2/0/-1->1->3 [44] 2/0/-1->1->3 [45] 2/0/-1->1->3 [46] 2/0/-1->1->3 [47] 2/0/-1->1->3 [48] 2/0/-1->1->3 [49] 2/0/-1->1->3 [50] 2/0/-1->1->3 [51] 2
c619-102:3669167:3669280 [0] NCCL INFO P2P Chunksize set to 131072
c619-112:919914:920027 [0] NCCL INFO Trees [0] -1/-1/-1->3->2 [1] -1/-1/-1->3->2 [2] -1/-1/-1->3->2 [3] -1/-1/-1->3->2 [4] -1/-1/-1->3->2 [5] -1/-1/-1->3->2 [6] -1/-1/-1->3->2 [7] -1/-1/-1->3->2 [8] -1/-1/-1->3->2 [9] -1/-1/-1->3->2 [10] -1/-1/-1->3->2 [11] -1/-1/-1->3->2 [12] -1/-1/-1->3->2 [13] -1/-1/-1->3->2 [14] -1/-1/-1->3->2 [15] -1/-1/-1->3->2 [16] -1/-1/-1->3->2 [17] -1/-1/-1->3->2 [18] -1/-1/-1->3->2 [19] -1/-1/-1->3->2 [20] -1/-1/-1->3->2 [21] -1/-1/-1->3->2 [22] -1/-1/-1->3->2 [23] -1/-1/-1->3->2 [24] -1/-1/-1->3->2 [25] -1/-1/-1->3->2 [26] -1/-1/-1->3->2 [27] -1/-1/-1->3->2 [28] -1/-1/-1->3->2 [29] -1/-1/-1->3->2 [30] -1/-1/-1->3->2 [31] -1/-1/-1->3->2 [32] 1/-1/-1->3->-1 [33] 1/-1/-1->3->-1 [34] 1/-1/-1->3->-1 [35] 1/-1/-1->3->-1 [36] 1/-1/-1->3->-1 [37] 1/-1/-1->3->-1 [38] 1/-1/-1->3->-1 [39] 1/-1/-1->3->-1 [40] 1/-1/-1->3->-1 [41] 1/-1/-1->3->-1 [42] 1/-1/-1->3->-1 [43] 1/-1/-1->3->-1 [44] 1/-1/-1->3->-1 [45] 1/-1/-1->3->-1 [46] 1/-1/-1->3->-1 [47] 1/-1/-1->3->-1 [48] 1/-1/-1->3->-1 [49] 1/-1/
c619-112:919914:920027 [0] NCCL INFO P2P Chunksize set to 131072
c619-111:3463502:3463615 [0] NCCL INFO Trees [0] 1/3/-1->2->0 [1] 1/3/-1->2->0 [2] 1/3/-1->2->0 [3] 1/3/-1->2->0 [4] 1/3/-1->2->0 [5] 1/3/-1->2->0 [6] 1/3/-1->2->0 [7] 1/3/-1->2->0 [8] 1/3/-1->2->0 [9] 1/3/-1->2->0 [10] 1/3/-1->2->0 [11] 1/3/-1->2->0 [12] 1/3/-1->2->0 [13] 1/3/-1->2->0 [14] 1/3/-1->2->0 [15] 1/3/-1->2->0 [16] 1/3/-1->2->0 [17] 1/3/-1->2->0 [18] 1/3/-1->2->0 [19] 1/3/-1->2->0 [20] 1/3/-1->2->0 [21] 1/3/-1->2->0 [22] 1/3/-1->2->0 [23] 1/3/-1->2->0 [24] 1/3/-1->2->0 [25] 1/3/-1->2->0 [26] 1/3/-1->2->0 [27] 1/3/-1->2->0 [28] 1/3/-1->2->0 [29] 1/3/-1->2->0 [30] 1/3/-1->2->0 [31] 1/3/-1->2->0 [32] -1/-1/-1->2->1 [33] -1/-1/-1->2->1 [34] -1/-1/-1->2->1 [35] -1/-1/-1->2->1 [36] -1/-1/-1->2->1 [37] -1/-1/-1->2->1 [38] -1/-1/-1->2->1 [39] -1/-1/-1->2->1 [40] -1/-1/-1->2->1 [41] -1/-1/-1->2->1 [42] -1/-1/-1->2->1 [43] -1/-1/-1->2->1 [44] -1/-1/-1->2->1 [45] -1/-1/-1->2->1 [46] -1/-1/-1->2->1 [47] -1/-1/-1->2->1 [48] -1/-1/-1->2->1 [49] -1/-1/-1->2->1 [50] -1/-1/-1->2->1 [51] -1/-1/-1->2->1 [52] -1/-1/-
c619-111:3463502:3463615 [0] NCCL INFO P2P Chunksize set to 131072
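The "Trees" lines above encode, per channel, a tree entry of the form `children->rank->parent`, where up to three child slots are separated by `/` and `-1` means "none" (so `1/3/-1->2->0` says rank 2 has children 1 and 3 and parent 0). A hedged parser sketch for one entry, with the format inferred from these log lines:

```python
import re

def parse_tree_entry(entry: str):
    """Parse one NCCL tree entry like '1/3/-1->2->0' into
    (children, rank, parent); -1 means 'none'. Format inferred from the log."""
    m = re.fullmatch(r"((?:-?\d+/)*-?\d+)->(-?\d+)->(-?\d+)", entry)
    if not m:
        raise ValueError(f"unrecognized tree entry: {entry!r}")
    children = [int(c) for c in m.group(1).split("/") if int(c) != -1]
    return children, int(m.group(2)), int(m.group(3))

print(parse_tree_entry("1/3/-1->2->0"))   # ([1, 3], 2, 0)
print(parse_tree_entry("2/-1/-1->0->-1")) # ([2], 0, -1): rank 0 is a tree root
```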
c619-102:3669167:3669283 [0] NCCL INFO [Proxy Service] Device 0 CPU core 62
c619-101:842814:842932 [0] NCCL INFO [Proxy Service] Device 0 CPU core 0
c619-102:3669167:3669284 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 64
c619-112:919914:920030 [0] NCCL INFO [Proxy Service] Device 0 CPU core 52
c619-101:842814:842933 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 2
c619-111:3463502:3463618 [0] NCCL INFO [Proxy Service] Device 0 CPU core 34
c619-112:919914:920031 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 53
c619-111:3463502:3463619 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 35
c619-101:842814:842929 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:842814:842929 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-102:3669167:3669280 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-102:3669167:3669280 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:842814:842929 [0] NCCL INFO CC Off, workFifoBytes 1048576
c619-102:3669167:3669280 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-102:3669167:3669280 [0] NCCL INFO ncclCommInitRankConfig comm 0x23120610 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xa172d874b3fc1dde - Init COMPLETE
c619-102:3669167:3669280 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 1 nranks 4 total 0.44 (kernels 0.06, alloc 0.00, bootstrap 0.33, allgathers 0.03, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:842814:842929 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-101:842814:842929 [0] NCCL INFO ncclCommInitRankConfig comm 0x322cdd00 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xa172d874b3fc1dde - Init COMPLETE
c619-101:842814:842929 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 0 nranks 4 total 3.68 (kernels 0.06, alloc 0.01, bootstrap 3.57, allgathers 0.03, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-112:919914:920027 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-112:919914:920027 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-112:919914:920027 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-112:919914:920027 [0] NCCL INFO ncclCommInitRankConfig comm 0x2e9380e0 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xa172d874b3fc1dde - Init COMPLETE
c619-112:919914:920027 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 3 nranks 4 total 0.17 (kernels 0.06, alloc 0.00, bootstrap 0.09, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-111:3463502:3463615 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-111:3463502:3463615 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-111:3463502:3463615 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-111:3463502:3463615 [0] NCCL INFO ncclCommInitRankConfig comm 0x16f277a0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xa172d874b3fc1dde - Init COMPLETE
c619-111:3463502:3463615 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 2 nranks 4 total 1.02 (kernels 0.06, alloc 0.00, bootstrap 0.89, allgathers 0.04, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
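Each rank's "Init timings" line above breaks its communicator setup down by phase; here rank 0's 3.68 s total is dominated by its 3.57 s bootstrap. A sketch that pulls the per-rank totals out of such lines and flags the slowest rank (the sample text is copied from the four timing lines above):

```python
import re

# Sample timing lines copied from the log above.
LOG = """\
rank 1 nranks 4 total 0.44 (kernels 0.06, alloc 0.00, bootstrap 0.33, allgathers 0.03, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
rank 0 nranks 4 total 3.68 (kernels 0.06, alloc 0.01, bootstrap 3.57, allgathers 0.03, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
rank 3 nranks 4 total 0.17 (kernels 0.06, alloc 0.00, bootstrap 0.09, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
rank 2 nranks 4 total 1.02 (kernels 0.06, alloc 0.00, bootstrap 0.89, allgathers 0.04, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
"""

def slowest_init(log: str):
    """Return (rank, total_seconds) for the rank with the largest init total."""
    pat = re.compile(r"rank (\d+) nranks \d+ total ([\d.]+)")
    pairs = [(int(r), float(t)) for r, t in pat.findall(log)]
    return max(pairs, key=lambda p: p[1])

print(slowest_init(LOG))  # (0, 3.68)
```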
c619-102:3669167:3669286 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 66
c619-102:3669167:3669285 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842935 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 4
c619-101:842814:842934 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920033 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 55
c619-102:3669167:3669285 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463621 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 36
c619-102:3669167:3669285 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463502:3463620 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920032 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669285 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:842934 [0] NCCL INFO Connected all trees
c619-102:3669167:3669285 [0] NCCL INFO Connected all trees
c619-111:3463502:3463620 [0] NCCL INFO Connected all trees
c619-112:919914:920032 [0] NCCL INFO Connected all trees
Running tokenizer on dataset: 0%| | 0/3998 [00:00<?, ? examples/s]
Running tokenizer on dataset:  25%|██▌       | 1000/3998 [00:04<00:13, 228.87 examples/s]
Running tokenizer on dataset:  50%|█████     | 2000/3998 [00:08<00:08, 229.43 examples/s]
Running tokenizer on dataset:  75%|███████▌  | 3000/3998 [00:13<00:04, 227.25 examples/s]
Running tokenizer on dataset: 100%|██████████| 3998/3998 [00:17<00:00, 226.59 examples/s]
Running tokenizer on dataset: 100%|██████████| 3998/3998 [00:17<00:00, 226.92 examples/s]
training example:
input_ids:
[input token IDs from the Qwen tokenizer for the rendered prompt and response shown under "inputs:" below; numeric dump omitted for readability]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. So that path gives 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's a derived intermediate. Still, that should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge: 99 * 67 = 6633. Divided by 57? 6633 / 57: 57 * 116 = 5700 + 912 = 6612, and 6633 - 6612 = 21, so the result is 116 + 21/57, not an integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then 166 - 57 = 109? Check: 166 - 50 = 116, 116 - 7 = 109. Yes! 166 - 57 = 109.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, and 10 is not one of the given numbers, it's an intermediate result. In this context, that should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left-to-right evaluation: 67 - 57 + 99 gives 10 + 99 = 109, and 99 + 67 - 57 gives 166 - 57 = 109. Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57, or simply 99 + 67 - 57; as I said, it works without parentheses, though I could add them to be precise. Not necessary, I think.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109; 114 - 5 = 109, but there's no 5 available (I have 99 and 67).
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 β 116. something, not 109.
Not good.
67 / 57 β 1.175, times 99 β 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), with each number used exactly once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum gives 166 - 57 = 109. This approach uses only addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
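As a sanity check on the sample above (an editorial illustration only, not part of the training pipeline), the puzzle can be brute-forced: try every ordering of the three numbers with every operator pair, evaluating strictly left to right as the transcript does. The helper name `find_equations` is hypothetical.

```python
from itertools import permutations, product

def find_equations(numbers, target):
    """Return every left-to-right expression over all numbers that hits target."""
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b if b != 0 else float("nan"),  # nan never equals target
    }
    hits = []
    for perm in permutations(numbers):
        for symbols in product(ops, repeat=len(perm) - 1):
            # Fold the operators over the permutation, left to right (no precedence).
            acc = perm[0]
            for sym, n in zip(symbols, perm[1:]):
                acc = ops[sym](acc, n)
            if acc == target:
                expr = str(perm[0]) + "".join(
                    f" {sym} {n}" for sym, n in zip(symbols, perm[1:])
                )
                hits.append(expr)
    return hits

solutions = find_equations([99, 57, 67], 109)
# The transcript's equation "99 + 67 - 57" appears among the solutions.
```

Under left-to-right evaluation this confirms the transcript's conclusion that only addition/subtraction combinations reach 109; the multiplication and division branches it ruled out never evaluate to the target.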
label_ids:
[-100 for every prompt position (loss is masked on the prompt), followed by the assistant-turn token IDs; numeric dump truncated in the log and omitted here]
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, and 10 is not one of the original numbers; it's an intermediate result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
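The arithmetic claims in the decoded label above can be checked mechanically. A minimal sketch (a hypothetical checker, not part of LLaMA-Factory or the training pipeline) that brute-forces every ordering and operator pair over [99, 57, 67] under the same left-to-right evaluation the label assumes:

```python
from itertools import permutations, product
import operator

# Hypothetical checker: enumerate all left-to-right expressions
# (a op1 b) op2 c over the list, using each number exactly once.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def solutions(nums, target):
    found = []
    for a, b, c in permutations(nums):
        for o1, o2 in product(OPS, repeat=2):
            try:
                result = OPS[o2](OPS[o1](a, b), c)
            except ZeroDivisionError:
                continue
            if abs(result - target) < 1e-9:
                found.append(f"({a} {o1} {b}) {o2} {c}")
    return found

print(solutions([99, 57, 67], 109))
```

Besides the two expressions the label settles on, the search also surfaces (99 - 57) + 67 = 109, an ordering the reasoning never considered.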
Running tokenizer on dataset:   0%|          | 0/3998 [00:00<?, ? examples/s]
Running tokenizer on dataset: 100%|██████████| 3998/3998 [00:17<00:00, 224.93 examples/s]
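The training example dump that follows renders the tokenized prompt under "inputs:" using Qwen's ChatML-style markers. A minimal sketch of that layout — plain string formatting for illustration only; a real pipeline would use the tokenizer's `apply_chat_template` instead:

```python
# Illustrative sketch of the ChatML-style layout visible in the decoded
# "inputs:" dump (not LLaMA-Factory's actual templating code).
def render_chatml(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = render_chatml(
    "You are Qwen, created by Alibaba Cloud. You are a helpful assistant.",
    "Answer the following problem. Explain your reasoning step by step.",
)
print(prompt)
```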
training example:
input_ids:
[... input_ids token-id array elided (several thousand entries); the decoded text appears under "inputs:" below ...]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
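The manual search in the transcript above — trying operand orders and operators until one hits 109 — can be reproduced with a small brute-force sketch. This is illustrative only, not LLaMAFactory code; the function name and structure are assumptions:

```python
from itertools import permutations, product
from operator import add, sub, mul, truediv

def solve(numbers, target):
    """Try every operand order and operator choice, evaluating left to right."""
    ops = {'+': add, '-': sub, '*': mul, '/': truediv}
    for perm in permutations(numbers):
        for symbols in product(ops, repeat=len(numbers) - 1):
            acc, expr = perm[0], str(perm[0])
            try:
                for sym, n in zip(symbols, perm[1:]):
                    acc = ops[sym](acc, n)
                    expr += f" {sym} {n}"
            except ZeroDivisionError:
                continue
            if abs(acc - target) < 1e-9:
                return expr
    return None

print(solve([99, 57, 67], 109))  # prints: 99 - 57 + 67
```

Left-to-right accumulation matches the transcript's observation that + and - share precedence and need no parentheses; note that a returned expression mixing * or / with + or - would need parentheses before round-tripping through a standard-precedence evaluator.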
label_ids:
[-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 
2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 
16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 
382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 
2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 
11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
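The long run of -100 values in the `label_ids` array follows the standard Hugging Face convention: prompt tokens are set to the ignore index so the cross-entropy loss is computed only on the assistant response. A minimal sketch of that masking (the function name is illustrative, not LLaMAFactory's actual helper):

```python
IGNORE_INDEX = -100  # PyTorch's CrossEntropyLoss skips targets equal to this

def mask_prompt(input_ids, prompt_len):
    """Copy input_ids as labels, masking out the first prompt_len tokens."""
    return [IGNORE_INDEX] * prompt_len + list(input_ids[prompt_len:])

labels = mask_prompt([11, 22, 33, 44, 55], 3)
print(labels)  # [-100, -100, -100, 44, 55]
```

This is why the decoded `labels:` text below reproduces only the assistant turn: every prompt position was replaced by the ignore index before decoding.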
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
Running tokenizer on dataset: 100%|██████████| 3998/3998 [00:18<00:00, 222.04 examples/s]
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, 11, 3465, 553, 54364, 14817, 13, 1446, 525, 264, 10950, 17847, 13, 151645, 198, 151644, 872, 198, 16141, 279, 2701, 3491, 13, 81917, 697, 32711, 3019, 553, 3019, 13, 3197, 498, 525, 8060, 11, 2968, 697, 4226, 304, 419, 3561, 25, 366, 9217, 2235, 21732, 4226, 12533, 9217, 94367, 2, 22079, 198, 16429, 279, 5109, 304, 279, 1140, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 1855, 458, 23606, 429, 16819, 220, 16, 15, 24, 13, 1446, 646, 990, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 8, 323, 1817, 1372, 646, 1172, 387, 1483, 3055, 13, 4615, 6291, 1265, 2924, 264, 4013, 315, 7354, 330, 8304, 1599, 2974, 1380, 1817, 3019, 374, 264, 35972, 5666, 323, 279, 1590, 3019, 13653, 11508, 311, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 382, 35127, 697, 4226, 304, 279, 2701, 3561, 510, 27, 9217, 397, 7021, 413, 4226, 340, 522, 9217, 1339, 9064, 11993, 21732, 4226, 9940, 374, 279, 1140, 315, 7354, 311, 5545, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 13, 4710, 2461, 3110, 510, 2679, 279, 1140, 315, 5109, 572, 508, 16, 11, 220, 17, 11, 220, 18, 60, 323, 279, 2169, 572, 220, 16, 11, 498, 1410, 3270, 510, 27, 9217, 397, 8304, 220, 16, 25, 220, 16, 488, 220, 17, 284, 220, 18, 198, 8304, 220, 17, 25, 220, 18, 608, 220, 18, 284, 220, 16, 198, 522, 9217, 1339, 269, 4710, 27, 9217, 397, 7, 16, 488, 220, 17, 8, 608, 220, 18, 198, 522, 9217, 1339, 10061, 594, 1744, 3019, 553, 3019, 13, 151645, 198, 151644, 77091, 198, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 
3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 
15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 
28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 
1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 
382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 
369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 
10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 
16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
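The search the trace performs by hand (trying orderings of [99, 57, 67] under left-to-right evaluation) can be checked mechanically. A minimal sketch, not part of the training pipeline; `reachable` and `OPS` are illustrative names:

```python
from itertools import permutations
import operator

# Map operator symbols to functions so expressions can be evaluated
# strictly left to right, as the trace assumes for + and -.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def reachable(numbers, target):
    """All left-to-right two-operator expressions over `numbers` that hit `target`."""
    hits = []
    for a, b, c in permutations(numbers):
        for op1 in OPS:
            for op2 in OPS:
                try:
                    value = OPS[op2](OPS[op1](a, b), c)
                except ZeroDivisionError:
                    continue
                if value == target:
                    hits.append(f"{a} {op1} {b} {op2} {c}")
    return hits

print(reachable([99, 57, 67], 109))
```

Both solutions the trace finds, 99 + 67 - 57 and 67 - 57 + 99, show up in the output; the brute force also surfaces 99 - 57 + 67, an ordering the trace did not consider.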
label_ids:
[-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 
2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 
16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 
382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 
2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 
11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109, so that route falls short of the target.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15 would give 109, but I don't have a 15; I have 99, and 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
Now, 10 is not one of the original numbers; it's derived. But in context, that should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
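As a quick sanity check (an illustrative snippet added here, not part of the logged example), the two steps can be verified directly in Python:

```python
# Step 1: 67 - 57 = 10; Step 2: 99 + 10 = 109.
step1 = 67 - 57
result = 99 + step1
assert step1 == 10
assert result == 109
print(result)  # 109
```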
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge: 6633. Divided by 57? 6633 / 57: 57 * 116 = 5700 + 912 = 6612, and 6633 - 6612 = 21, so that's 116 + 21/57, not an integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
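The three multiply-then-divide attempts above can be checked mechanically. This added snippet (illustrative only) prints the quotient and remainder for each pairing; none divides evenly and none reaches 109:

```python
# For each way of multiplying two of the numbers and dividing by the third,
# show the integer quotient and remainder.
from itertools import combinations

nums = [99, 57, 67]
for a, b in combinations(nums, 2):
    c = next(n for n in nums if n not in (a, b))
    q, r = divmod(a * b, c)
    print(f"{a} * {b} = {a * b}, divided by {c}: {q} remainder {r}")
```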
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57: 166 - 57 = ? 166 - 50 = 116, and 116 - 7 = 109. Yes! So 166 - 57 = 109.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, and 10 is not one of the given numbers; it's an intermediate result. In context, though, that should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
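Left-to-right evaluation of same-precedence + and - is exactly what Python does as well, so both parenthesis-free forms can be confirmed as written (an added illustrative check):

```python
# + and - share precedence and associate left to right,
# so neither expression needs parentheses.
print(99 + 67 - 57)  # 109
print(67 - 57 + 99)  # 109
```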
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
In the list, 57 is second, but the equation 99 + 67 - 57 uses all three numbers anyway.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, which is close to 109; 114 - 5 would work, but there is no 5 available, only 99 and 67.
(57 * 2) - something is out for the same reason: I don't have a 2, and each number can be used only once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.3, not 109.
Not good.
67 / 57 β 1.175, times 99 β 116.325, not 109.
No.
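The two division attempts just tried are in fact the same exact value; a small check with exact rational arithmetic (an added illustration using Python's fractions module) makes that clear:

```python
from fractions import Fraction

# 99/57 reduces to 33/19; multiplying by 67 gives 2211/19, about 116.4, not 109.
a = Fraction(99, 57) * 67
b = Fraction(67, 57) * 99
assert a == b == Fraction(2211, 19)
print(a, float(a))
```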
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
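The reasoning in the decoded example above only ever combines the numbers in the shape (a op b) op c. For completeness, that search space can be replayed with a tiny brute-force sketch; this is an added illustration, not part of the training data:

```python
# Brute-force all (a op b) op c groupings over permutations of the numbers.
from itertools import permutations, product
import operator

OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def solve(nums, target):
    hits = []
    for a, b, c in permutations(nums):
        for o1, o2 in product(OPS, repeat=2):
            try:
                value = OPS[o2](OPS[o1](a, b), c)
            except ZeroDivisionError:
                continue
            if value == target:
                hits.append(f"({a} {o1} {b}) {o2} {c} = {target}")
    return hits

print(solve([99, 57, 67], 109))
```

Besides the logged solution (99 + 67) - 57 and its variant (67 - 57) + 99, this grouping search also turns up (99 - 57) + 67 = 109, which the trace did not consider.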
Running tokenizer on dataset: 100%|██████████| 3998/3998 [00:18<00:00, 221.05 examples/s]
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, 11, 3465, 553, 54364, 14817, 13, 1446, 525, 264, 10950, 17847, 13, 151645, 198, 151644, 872, 198, 16141, 279, 2701, 3491, 13, 81917, 697, 32711, 3019, 553, 3019, 13, 3197, 498, 525, 8060, 11, 2968, 697, 4226, 304, 419, 3561, 25, 366, 9217, 2235, 21732, 4226, 12533, 9217, 94367, 2, 22079, 198, 16429, 279, 5109, 304, 279, 1140, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 1855, 458, 23606, 429, 16819, 220, 16, 15, 24, 13, 1446, 646, 990, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 8, 323, 1817, 1372, 646, 1172, 387, 1483, 3055, 13, 4615, 6291, 1265, 2924, 264, 4013, 315, 7354, 330, 8304, 1599, 2974, 1380, 1817, 3019, 374, 264, 35972, 5666, 323, 279, 1590, 3019, 13653, 11508, 311, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 382, 35127, 697, 4226, 304, 279, 2701, 3561, 510, 27, 9217, 397, 7021, 413, 4226, 340, 522, 9217, 1339, 9064, 11993, 21732, 4226, 9940, 374, 279, 1140, 315, 7354, 311, 5545, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 13, 4710, 2461, 3110, 510, 2679, 279, 1140, 315, 5109, 572, 508, 16, 11, 220, 17, 11, 220, 18, 60, 323, 279, 2169, 572, 220, 16, 11, 498, 1410, 3270, 510, 27, 9217, 397, 8304, 220, 16, 25, 220, 16, 488, 220, 17, 284, 220, 18, 198, 8304, 220, 17, 25, 220, 18, 608, 220, 18, 284, 220, 16, 198, 522, 9217, 1339, 269, 4710, 27, 9217, 397, 7, 16, 488, 220, 17, 8, 608, 220, 18, 198, 522, 9217, 1339, 10061, 594, 1744, 3019, 553, 3019, 13, 151645, 198, 151644, 77091, 198, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 
3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 
15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 
28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 
1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 
382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 
369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 
10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 
16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
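The array above is the tokenized form of the chat example printed below under `inputs:`. As a spot-check, the final IDs of the array correspond to the chat-template control tokens that close the assistant turn. The ID-to-token mappings below are assumptions based on the Qwen2 tokenizer family, not read from the real tokenizer; for a faithful decode, load the actual tokenizer (e.g. `AutoTokenizer.from_pretrained(...)`) and call `.decode(ids)`.

```python
# Assumed Qwen2-family special-token IDs (spot-check only, not loaded
# from the real tokenizer -- verify against the model's tokenizer).
ASSUMED_SPECIALS = {
    151644: "<|im_start|>",  # assumed chat-turn open token
    151645: "<|im_end|>",    # assumed chat-turn close token
}

tail = [151645, 198]  # final two IDs of the input_ids dump above
print(ASSUMED_SPECIALS.get(tail[0], "?"))  # assumed to be <|im_end|>
```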
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 β 116. something, not 109.
Not good.
67 / 57 β 1.175, times 99 β 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
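The example above is a countdown-style arithmetic task (reach a target using each listed number once with +, -, *, /). The trial-and-error search the model performs in its reasoning can be sketched as a small brute-force solver. This is an illustrative sketch, not part of the training data or of LLaMA-Factory; it only covers left-associated expressions `(a op b) op c` over permutations, which suffices for this instance.

```python
from itertools import permutations

# Basic binary operations; division guards against a zero divisor.
OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
    "/": lambda a, b: a / b if b else None,
}

def solve(nums, target, eps=1e-9):
    """Search left-associated expressions ((a op b) op c) over every
    permutation of nums; return the first one that hits target."""
    for a, b, c in permutations(nums):
        for op1, f1 in OPS.items():
            mid = f1(a, b)
            if mid is None:
                continue
            for op2, f2 in OPS.items():
                out = f2(mid, c)
                if out is not None and abs(out - target) < eps:
                    return f"{a} {op1} {b} {op2} {c}"
    return None

print(solve([99, 57, 67], 109))  # -> "99 - 57 + 67"
```

With this iteration order the solver returns `99 - 57 + 67`, which is the `67 - 57 + 99` variant the trace also verifies; `99 + 67 - 57` is found later in the same search space.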
label_ids:
[-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 
2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 
16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 
382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 
2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 
11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
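The long run of `-100` values at the head of the `label_ids` array masks the system/user prompt tokens so that the SFT loss is computed only over the assistant response (`-100` is the default `ignore_index` of PyTorch's `CrossEntropyLoss`). A minimal sketch of how such labels are typically built, assuming the prompt length is known; this is illustrative, not LLaMA-Factory's actual implementation:

```python
# Masking value ignored by CrossEntropyLoss (its default ignore_index).
IGNORE_INDEX = -100

def build_labels(input_ids, prompt_len):
    """Copy input_ids, replacing the first prompt_len positions with
    IGNORE_INDEX so only response tokens contribute to the loss."""
    return [IGNORE_INDEX] * prompt_len + input_ids[prompt_len:]

# Toy example: 5 prompt tokens followed by 3 response tokens.
ids = [11, 22, 33, 44, 55, 66, 77, 88]
labels = build_labels(ids, 5)
# labels == [-100, -100, -100, -100, -100, 66, 77, 88]
```

This is why the decoded `labels:` section below reproduces only the assistant turn (the `<think>` reasoning and answer) and none of the prompt text.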
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 β 116. something, not 109.
Not good.
67 / 57 β 1.175, times 99 β 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
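The by-hand search in the reasoning trace above can be sketched as a small brute-force solver. This is an illustrative aside, not part of the logged pipeline; the function name `solve` and its restriction to left-to-right (left-nested) evaluation, which mirrors how the trace combines numbers one at a time, are assumptions.

```python
# Hypothetical sketch: brute-force the "countdown" puzzle solved by hand above.
# Tries every ordering of the numbers and every operator combination,
# evaluating strictly left to right (each number used exactly once).
from itertools import permutations, product

def solve(numbers, target):
    """Return an expression over `numbers` that evaluates to `target`, or None."""
    ops = ['+', '-', '*', '/']
    for perm in permutations(numbers):
        for op_combo in product(ops, repeat=len(perm) - 1):
            # Build a left-nested expression, e.g. ((99 + 67) - 57)
            expr = str(perm[0])
            for op, n in zip(op_combo, perm[1:]):
                expr = f"({expr} {op} {n})"
            try:
                if abs(eval(expr) - target) < 1e-9:
                    return expr
            except ZeroDivisionError:
                continue
    return None

print(solve([99, 57, 67], 109))
```

Note that this only enumerates left-nested parenthesizations; that is sufficient here (both `99 + 67 - 57` and `67 - 57 + 99` from the trace are of that form), but a fully general solver would also need right-nested groupings such as `a * (b + c)`.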
[INFO|configuration_utils.py:698] 2025-09-23 22:54:09,957 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:54:09,958 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|2025-09-23 22:54:09] llamafactory.model.model_utils.kv_cache:143 >> KV cache is disabled during training.
[INFO|modeling_utils.py:1150] 2025-09-23 22:54:41,354 >> loading weights file model.safetensors from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/model.safetensors
[INFO|modeling_utils.py:2240] 2025-09-23 22:54:41,368 >> Instantiating Qwen2ForCausalLM model under default dtype torch.bfloat16.
[INFO|configuration_utils.py:1135] 2025-09-23 22:54:41,369 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"eos_token_id": 151645,
"use_cache": false
}
[INFO|modeling_utils.py:5130] 2025-09-23 22:54:45,663 >> All model checkpoint weights were used when initializing Qwen2ForCausalLM.
[INFO|modeling_utils.py:5138] 2025-09-23 22:54:45,663 >> All the weights of Qwen2ForCausalLM were initialized from the model checkpoint at Qwen/Qwen2.5-1.5B-Instruct.
If your task is similar to the task the model of the checkpoint was trained on, you can already use Qwen2ForCausalLM for predictions without further training.
[INFO|configuration_utils.py:1090] 2025-09-23 22:54:45,833 >> loading configuration file generation_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/generation_config.json
[INFO|configuration_utils.py:1135] 2025-09-23 22:54:45,834 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"repetition_penalty": 1.1,
"temperature": 0.7,
"top_k": 20,
"top_p": 0.8
}
[INFO|2025-09-23 22:54:45] llamafactory.model.model_utils.checkpointing:143 >> Gradient checkpointing enabled.
[INFO|2025-09-23 22:54:45] llamafactory.model.model_utils.attention:143 >> Using torch SDPA for faster training and inference.
[INFO|2025-09-23 22:54:45] llamafactory.model.adapter:143 >> Upcasting trainable params to float32.
[INFO|2025-09-23 22:54:45] llamafactory.model.adapter:143 >> Fine-tuning method: Full
[INFO|2025-09-23 22:54:46] llamafactory.model.loader:143 >> trainable params: 1,543,714,304 || all params: 1,543,714,304 || trainable%: 100.0000
[INFO|trainer.py:756] 2025-09-23 22:54:46,059 >> Using auto half precision backend
[2025-09-23 22:54:46,291] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
[2025-09-23 22:55:03,679] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed info: version=0.16.9, git-hash=unknown, git-branch=unknown
[2025-09-23 22:55:03,679] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
c619-101:842814:842814 [0] NCCL INFO Comm config Blocking set to 1
c619-101:842814:843090 [0] NCCL INFO Using network IB
c619-101:842814:843090 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-112:919914:919914 [0] NCCL INFO Comm config Blocking set to 1
c619-102:3669167:3669167 [0] NCCL INFO Comm config Blocking set to 1
c619-101:842814:843090 [0] NCCL INFO ncclCommInitRankConfig comm 0x3c834980 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x2a248a5865d36167 - Init START
c619-112:919914:920220 [0] NCCL INFO Using network IB
c619-102:3669167:3669391 [0] NCCL INFO Using network IB
c619-112:919914:920220 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3669167:3669391 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-112:919914:920220 [0] NCCL INFO ncclCommInitRankConfig comm 0x349aa7d0 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x2a248a5865d36167 - Init START
c619-102:3669167:3669391 [0] NCCL INFO ncclCommInitRankConfig comm 0x292a5430 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x2a248a5865d36167 - Init START
c619-111:3463502:3463502 [0] NCCL INFO Comm config Blocking set to 1
c619-111:3463502:3463777 [0] NCCL INFO Using network IB
c619-111:3463502:3463777 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3463502:3463777 [0] NCCL INFO ncclCommInitRankConfig comm 0x1d02a7e0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x2a248a5865d36167 - Init START
c619-101:842814:843090 [0] NCCL INFO Bootstrap timings total 9.659939 (create 0.000023, send 0.000071, recv 0.000392, ring 9.659293, delay 0.000000)
c619-111:3463502:3463777 [0] NCCL INFO Bootstrap timings total 0.000794 (create 0.000028, send 0.000116, recv 0.000284, ring 0.000191, delay 0.000000)
c619-102:3669167:3669391 [0] NCCL INFO Bootstrap timings total 9.659747 (create 0.000028, send 0.000108, recv 9.659179, ring 0.000209, delay 0.000000)
c619-112:919914:920220 [0] NCCL INFO Bootstrap timings total 9.659780 (create 0.000029, send 0.000087, recv 0.000157, ring 0.000136, delay 0.000000)
c619-111:3463502:3463777 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:842814:843090 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-112:919914:920220 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-102:3669167:3669391 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:842814:843090 [0] NCCL INFO comm 0x3c834980 rank 0 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:842814:843090 [0] NCCL INFO Channel 00/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 01/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 02/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 03/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 04/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 05/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 06/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 07/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 08/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 09/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 10/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 11/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 12/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 13/64 : 0 1 2 3
c619-102:3669167:3669391 [0] NCCL INFO comm 0x292a5430 rank 1 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-111:3463502:3463777 [0] NCCL INFO comm 0x1d02a7e0 rank 2 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:842814:843090 [0] NCCL INFO Channel 14/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 15/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 16/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 17/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 18/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 19/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 20/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 21/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 22/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 23/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 24/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 25/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 26/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 27/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 28/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 29/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 30/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 31/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 32/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 33/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 34/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 35/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 36/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 37/64 : 0 1 2 3
c619-112:919914:920220 [0] NCCL INFO comm 0x349aa7d0 rank 3 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:842814:843090 [0] NCCL INFO Channel 38/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 39/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 40/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 41/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 42/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 43/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 44/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 45/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 46/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 47/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 48/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 49/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 50/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 51/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 52/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 53/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 54/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 55/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 56/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 57/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 58/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 59/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 60/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 61/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 62/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Channel 63/64 : 0 1 2 3
c619-101:842814:843090 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 2/-1/-1->0->-1 [3] 2/-1/-1->0->-1 [4] 2/-1/-1->0->-1 [5] 2/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 2/-1/-1->0->-1 [9] 2/-1/-1->0->-1 [10] 2/-1/-1->0->-1 [11] 2/-1/-1->0->-1 [12] 2/-1/-1->0->-1 [13] 2/-1/-1->0->-1 [14] 2/-1/-1->0->-1 [15] 2/-1/-1->0->-1 [16] 2/-1/-1->0->-1 [17] 2/-1/-1->0->-1 [18] 2/-1/-1->0->-1 [19] 2/-1/-1->0->-1 [20] 2/-1/-1->0->-1 [21] 2/-1/-1->0->-1 [22] 2/-1/-1->0->-1 [23] 2/-1/-1->0->-1 [24] 2/-1/-1->0->-1 [25] 2/-1/-1->0->-1 [26] 2/-1/-1->0->-1 [27] 2/-1/-1->0->-1 [28] 2/-1/-1->0->-1 [29] 2/-1/-1->0->-1 [30] 2/-1/-1->0->-1 [31] 2/-1/-1->0->-1 [32] -1/-1/-1->0->1 [33] -1/-1/-1->0->1 [34] -1/-1/-1->0->1 [35] -1/-1/-1->0->1 [36] -1/-1/-1->0->1 [37] -1/-1/-1->0->1 [38] -1/-1/-1->0->1 [39] -1/-1/-1->0->1 [40] -1/-1/-1->0->1 [41] -1/-1/-1->0->1 [42] -1/-1/-1->0->1 [43] -1/-1/-1->0->1 [44] -1/-1/-1->0->1 [45] -1/-1/-1->0->1 [46] -1/-1/-1->0->1 [47] -1/-1/-1->0->1 [48] -1/-1/-1->0->1 [49] -1/-1
c619-101:842814:843090 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:842814:843090 [0] NCCL INFO Check P2P Type intraNodeP2pSupport 0 directMode 0
c619-102:3669167:3669391 [0] NCCL INFO Trees [0] -1/-1/-1->1->2 [1] -1/-1/-1->1->2 [2] -1/-1/-1->1->2 [3] -1/-1/-1->1->2 [4] -1/-1/-1->1->2 [5] -1/-1/-1->1->2 [6] -1/-1/-1->1->2 [7] -1/-1/-1->1->2 [8] -1/-1/-1->1->2 [9] -1/-1/-1->1->2 [10] -1/-1/-1->1->2 [11] -1/-1/-1->1->2 [12] -1/-1/-1->1->2 [13] -1/-1/-1->1->2 [14] -1/-1/-1->1->2 [15] -1/-1/-1->1->2 [16] -1/-1/-1->1->2 [17] -1/-1/-1->1->2 [18] -1/-1/-1->1->2 [19] -1/-1/-1->1->2 [20] -1/-1/-1->1->2 [21] -1/-1/-1->1->2 [22] -1/-1/-1->1->2 [23] -1/-1/-1->1->2 [24] -1/-1/-1->1->2 [25] -1/-1/-1->1->2 [26] -1/-1/-1->1->2 [27] -1/-1/-1->1->2 [28] -1/-1/-1->1->2 [29] -1/-1/-1->1->2 [30] -1/-1/-1->1->2 [31] -1/-1/-1->1->2 [32] 2/0/-1->1->3 [33] 2/0/-1->1->3 [34] 2/0/-1->1->3 [35] 2/0/-1->1->3 [36] 2/0/-1->1->3 [37] 2/0/-1->1->3 [38] 2/0/-1->1->3 [39] 2/0/-1->1->3 [40] 2/0/-1->1->3 [41] 2/0/-1->1->3 [42] 2/0/-1->1->3 [43] 2/0/-1->1->3 [44] 2/0/-1->1->3 [45] 2/0/-1->1->3 [46] 2/0/-1->1->3 [47] 2/0/-1->1->3 [48] 2/0/-1->1->3 [49] 2/0/-1->1->3 [50] 2/0/-1->1->3 [51] 2
c619-102:3669167:3669391 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:842814:843092 [0] NCCL INFO [Proxy Service] Device 0 CPU core 6
c619-101:842814:843093 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 7
c619-111:3463502:3463777 [0] NCCL INFO Trees [0] 1/3/-1->2->0 [1] 1/3/-1->2->0 [2] 1/3/-1->2->0 [3] 1/3/-1->2->0 [4] 1/3/-1->2->0 [5] 1/3/-1->2->0 [6] 1/3/-1->2->0 [7] 1/3/-1->2->0 [8] 1/3/-1->2->0 [9] 1/3/-1->2->0 [10] 1/3/-1->2->0 [11] 1/3/-1->2->0 [12] 1/3/-1->2->0 [13] 1/3/-1->2->0 [14] 1/3/-1->2->0 [15] 1/3/-1->2->0 [16] 1/3/-1->2->0 [17] 1/3/-1->2->0 [18] 1/3/-1->2->0 [19] 1/3/-1->2->0 [20] 1/3/-1->2->0 [21] 1/3/-1->2->0 [22] 1/3/-1->2->0 [23] 1/3/-1->2->0 [24] 1/3/-1->2->0 [25] 1/3/-1->2->0 [26] 1/3/-1->2->0 [27] 1/3/-1->2->0 [28] 1/3/-1->2->0 [29] 1/3/-1->2->0 [30] 1/3/-1->2->0 [31] 1/3/-1->2->0 [32] -1/-1/-1->2->1 [33] -1/-1/-1->2->1 [34] -1/-1/-1->2->1 [35] -1/-1/-1->2->1 [36] -1/-1/-1->2->1 [37] -1/-1/-1->2->1 [38] -1/-1/-1->2->1 [39] -1/-1/-1->2->1 [40] -1/-1/-1->2->1 [41] -1/-1/-1->2->1 [42] -1/-1/-1->2->1 [43] -1/-1/-1->2->1 [44] -1/-1/-1->2->1 [45] -1/-1/-1->2->1 [46] -1/-1/-1->2->1 [47] -1/-1/-1->2->1 [48] -1/-1/-1->2->1 [49] -1/-1/-1->2->1 [50] -1/-1/-1->2->1 [51] -1/-1/-1->2->1 [52] -1/-1/-
c619-111:3463502:3463777 [0] NCCL INFO P2P Chunksize set to 131072
c619-112:919914:920220 [0] NCCL INFO Trees [0] -1/-1/-1->3->2 [1] -1/-1/-1->3->2 [2] -1/-1/-1->3->2 [3] -1/-1/-1->3->2 [4] -1/-1/-1->3->2 [5] -1/-1/-1->3->2 [6] -1/-1/-1->3->2 [7] -1/-1/-1->3->2 [8] -1/-1/-1->3->2 [9] -1/-1/-1->3->2 [10] -1/-1/-1->3->2 [11] -1/-1/-1->3->2 [12] -1/-1/-1->3->2 [13] -1/-1/-1->3->2 [14] -1/-1/-1->3->2 [15] -1/-1/-1->3->2 [16] -1/-1/-1->3->2 [17] -1/-1/-1->3->2 [18] -1/-1/-1->3->2 [19] -1/-1/-1->3->2 [20] -1/-1/-1->3->2 [21] -1/-1/-1->3->2 [22] -1/-1/-1->3->2 [23] -1/-1/-1->3->2 [24] -1/-1/-1->3->2 [25] -1/-1/-1->3->2 [26] -1/-1/-1->3->2 [27] -1/-1/-1->3->2 [28] -1/-1/-1->3->2 [29] -1/-1/-1->3->2 [30] -1/-1/-1->3->2 [31] -1/-1/-1->3->2 [32] 1/-1/-1->3->-1 [33] 1/-1/-1->3->-1 [34] 1/-1/-1->3->-1 [35] 1/-1/-1->3->-1 [36] 1/-1/-1->3->-1 [37] 1/-1/-1->3->-1 [38] 1/-1/-1->3->-1 [39] 1/-1/-1->3->-1 [40] 1/-1/-1->3->-1 [41] 1/-1/-1->3->-1 [42] 1/-1/-1->3->-1 [43] 1/-1/-1->3->-1 [44] 1/-1/-1->3->-1 [45] 1/-1/-1->3->-1 [46] 1/-1/-1->3->-1 [47] 1/-1/-1->3->-1 [48] 1/-1/-1->3->-1 [49] 1/-1/
c619-112:919914:920220 [0] NCCL INFO P2P Chunksize set to 131072
c619-102:3669167:3669444 [0] NCCL INFO [Proxy Service] Device 0 CPU core 50
c619-111:3463502:3463779 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 38
c619-112:919914:920272 [0] NCCL INFO [Proxy Service] Device 0 CPU core 28
c619-111:3463502:3463778 [0] NCCL INFO [Proxy Service] Device 0 CPU core 37
c619-112:919914:920273 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 29
c619-102:3669167:3669445 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 51
c619-101:842814:843090 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:842814:843090 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:842814:843090 [0] NCCL INFO CC Off, workFifoBytes 1048576
c619-101:842814:843090 [0] NCCL INFO ncclCommInitRankConfig comm 0x3c834980 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x2a248a5865d36167 - Init COMPLETE
c619-101:842814:843090 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 0 nranks 4 total 9.67 (kernels 0.00, alloc 0.00, bootstrap 9.66, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-111:3463502:3463777 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-111:3463502:3463777 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-111:3463502:3463777 [0] NCCL INFO ncclCommInitRankConfig comm 0x1d02a7e0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x2a248a5865d36167 - Init COMPLETE
c619-111:3463502:3463777 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 2 nranks 4 total 0.01 (kernels 0.00, alloc 0.00, bootstrap 0.00, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-102:3669167:3669391 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-102:3669167:3669391 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:842814:843095 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 9
c619-102:3669167:3669391 [0] NCCL INFO ncclCommInitRankConfig comm 0x292a5430 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x2a248a5865d36167 - Init COMPLETE
c619-101:842814:843094 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669167:3669391 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 1 nranks 4 total 9.67 (kernels 0.00, alloc 0.00, bootstrap 9.66, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:842814:843094 [0] NCCL INFO Channel 01/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 02/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 03/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:919914:920220 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-112:919914:920220 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:842814:843094 [0] NCCL INFO Channel 04/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 05/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 06/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 07/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 08/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 09/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 10/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 11/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 12/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 13/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 14/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:919914:920220 [0] NCCL INFO ncclCommInitRankConfig comm 0x349aa7d0 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x2a248a5865d36167 - Init COMPLETE
c619-101:842814:843094 [0] NCCL INFO Channel 15/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 16/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:919914:920220 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 3 nranks 4 total 9.67 (kernels 0.00, alloc 0.00, bootstrap 9.66, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:842814:843094 [0] NCCL INFO Channel 17/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 18/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463781 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 39
c619-101:842814:843094 [0] NCCL INFO Channel 19/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 20/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 21/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 22/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 23/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 24/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 25/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 26/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 27/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 28/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 29/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 30/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 31/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 32/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 33/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 34/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 35/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 36/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 37/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 38/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:842814:843094 [0] NCCL INFO Channel 39/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669167:3669447 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 52
c619-101:842814:843094 [0] NCCL INFO Channel 40/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 00/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 01/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 02/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 03/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 04/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 05/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 06/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 07/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 08/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 09/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 10/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 11/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 12/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 13/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 14/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920275 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 31
c619-102:3669167:3669446 [0] NCCL INFO Channel 15/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 16/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 17/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 18/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 19/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 20/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 21/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 22/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 23/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 24/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 25/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 26/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 27/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 28/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 29/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 30/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 31/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 37/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 38/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 39/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 40/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 41/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 42/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 43/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 44/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 45/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 46/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 47/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 48/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 49/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 50/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 51/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 52/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 53/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 54/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 55/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 56/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 57/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 58/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 59/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 60/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 61/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 62/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 63/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 37/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 38/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 39/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 40/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 01/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 02/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 41/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 42/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463502:3463780 [0] NCCL INFO Channel 43/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669167:3669446 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:919914:920274 [0] NCCL INFO Channel 03/0 : 3[0] -> 0[0] [send] via NET/IB/1
[... identical NCCL INFO [send] lines repeated for the remaining channels through 63/0 on each sending rank (1[0] -> 2[0], 2[0] -> 3[0], 3[0] -> 0[0]), all via NET/IB/1 ...]
c619-101:842814:843094 [0] NCCL INFO Channel 41/0 : 3[0] -> 0[0] [receive] via NET/IB/1
[... identical [receive] lines on c619-101 repeated for channels 42/0 through 63/0 ...]
c619-101:842814:843094 [0] NCCL INFO Channel 00/0 : 0[0] -> 1[0] [send] via NET/IB/1
[... identical [send] lines on c619-101 (0[0] -> 1[0]) repeated for channels 01/0 through 63/0 ...]
c619-101:842814:843094 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-111:3463502:3463780 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-112:919914:920274 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-102:3669167:3669446 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
[2025-09-23 22:55:13,796] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed Flops Profiler Enabled: False
[2025-09-23 22:55:13,797] [INFO] [logging.py:107:log_dist] [Rank 0] Using client Optimizer as basic optimizer
[2025-09-23 22:55:13,797] [INFO] [logging.py:107:log_dist] [Rank 0] Removing param_group that has no 'params' in the basic Optimizer
[2025-09-23 22:55:13,805] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed Basic Optimizer = AdamW
[2025-09-23 22:55:13,805] [INFO] [utils.py:59:is_zero_supported_optimizer] Checking ZeRO support for optimizer=AdamW type=<class 'torch.optim.adamw.AdamW'>
[2025-09-23 22:55:13,805] [INFO] [logging.py:107:log_dist] [Rank 0] Creating torch.bfloat16 ZeRO stage 2 optimizer
[2025-09-23 22:55:13,805] [INFO] [stage_1_and_2.py:150:__init__] Reduce bucket size 500000000
[2025-09-23 22:55:13,805] [INFO] [stage_1_and_2.py:151:__init__] Allgather bucket size 500000000
[2025-09-23 22:55:13,805] [INFO] [stage_1_and_2.py:152:__init__] CPU Offload: False
[2025-09-23 22:55:13,805] [INFO] [stage_1_and_2.py:153:__init__] Round robin gradient partitioning: True
[INFO|trainer.py:2409] 2025-09-23 22:55:14,311 >> ***** Running training *****
[INFO|trainer.py:2410] 2025-09-23 22:55:14,311 >> Num examples = 3,998
[INFO|trainer.py:2411] 2025-09-23 22:55:14,311 >> Num Epochs = 1
[INFO|trainer.py:2412] 2025-09-23 22:55:14,311 >> Instantaneous batch size per device = 1
[INFO|trainer.py:2415] 2025-09-23 22:55:14,311 >> Total train batch size (w. parallel, distributed & accumulation) = 4
[INFO|trainer.py:2416] 2025-09-23 22:55:14,311 >> Gradient Accumulation steps = 1
[INFO|trainer.py:2417] 2025-09-23 22:55:14,311 >> Total optimization steps = 1,000
[INFO|trainer.py:2418] 2025-09-23 22:55:14,311 >> Number of trainable parameters = 1,543,714,304
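The banner above is internally consistent: with 3,998 examples, a per-device batch of 1 across 4 ranks, and no gradient accumulation, one epoch works out to the reported 1,000 optimization steps. A minimal sanity-check sketch (plain Python; the variable names are illustrative, not Trainer API):

```python
import math

# Values reported in the trainer banner above.
num_examples = 3_998
per_device_batch = 1
world_size = 4
grad_accum = 1

# Effective examples consumed per optimizer step.
total_batch = per_device_batch * world_size * grad_accum  # 4

# Steps for one epoch; the final, partially filled batch still counts as a step.
steps_per_epoch = math.ceil(num_examples / total_batch)

print(total_batch, steps_per_epoch)  # 4 1000
```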
[... identical "***** Running training *****" banner repeated by another rank at 22:55:14,343 ...]
[2025-09-23 22:55:14,510] [INFO] [utils.py:781:see_memory_usage] Before initializing optimizer states
[2025-09-23 22:55:14,511] [INFO] [utils.py:782:see_memory_usage] MA 4.31 GB Max_MA 4.31 GB CA 4.32 GB Max_CA 4 GB
[2025-09-23 22:55:14,511] [INFO] [utils.py:789:see_memory_usage] CPU Virtual Memory: used = 37.97 GB, percent = 17.8%
[... identical "***** Running training *****" banner repeated by another rank at 22:55:14,676 ...]
[2025-09-23 22:55:14,705] [INFO] [utils.py:781:see_memory_usage] After initializing optimizer states
[2025-09-23 22:55:14,706] [INFO] [utils.py:782:see_memory_usage] MA 4.31 GB Max_MA 5.75 GB CA 5.76 GB Max_CA 6 GB
[2025-09-23 22:55:14,706] [INFO] [utils.py:789:see_memory_usage] CPU Virtual Memory: used = 39.41 GB, percent = 18.5%
[2025-09-23 22:55:14,706] [INFO] [stage_1_and_2.py:557:__init__] optimizer state initialized
[2025-09-23 22:55:14,894] [INFO] [utils.py:781:see_memory_usage] After initializing ZeRO optimizer
[2025-09-23 22:55:14,895] [INFO] [utils.py:782:see_memory_usage] MA 4.31 GB Max_MA 4.31 GB CA 5.76 GB Max_CA 6 GB
[2025-09-23 22:55:14,895] [INFO] [utils.py:789:see_memory_usage] CPU Virtual Memory: used = 39.41 GB, percent = 18.5%
[2025-09-23 22:55:14,897] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed Final Optimizer = DeepSpeedZeroOptimizer
[2025-09-23 22:55:14,897] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed using configured LR scheduler = None
[2025-09-23 22:55:14,897] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed LR Scheduler = None
[2025-09-23 22:55:14,897] [INFO] [logging.py:107:log_dist] [Rank 0] step=0, skipped=0, lr=[0.0, 0.0], mom=[(0.9, 0.95), (0.9, 0.95)]
[2025-09-23 22:55:14,898] [INFO] [config.py:1003:print] DeepSpeedEngine configuration:
[2025-09-23 22:55:14,898] [INFO] [config.py:1007:print] activation_checkpointing_config {
"partition_activations": false,
"contiguous_memory_optimization": false,
"cpu_checkpointing": false,
"number_checkpoints": null,
"synchronize_checkpoint_boundary": false,
"profile": false
}
[2025-09-23 22:55:14,898] [INFO] [config.py:1007:print] aio_config ................... {'block_size': 1048576, 'queue_depth': 8, 'intra_op_parallelism': 1, 'single_submit': False, 'overlap_events': True, 'use_gds': False}
[2025-09-23 22:55:14,898] [INFO] [config.py:1007:print] amp_enabled .................. False
[2025-09-23 22:55:14,898] [INFO] [config.py:1007:print] amp_params ................... False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] autotuning_config ............ {
"enabled": false,
"start_step": null,
"end_step": null,
"metric_path": null,
"arg_mappings": null,
"metric": "throughput",
"model_info": null,
"results_dir": "autotuning_results",
"exps_dir": "autotuning_exps",
"overwrite": true,
"fast": true,
"start_profile_step": 3,
"end_profile_step": 5,
"tuner_type": "gridsearch",
"tuner_early_stopping": 5,
"tuner_num_trials": 50,
"model_info_path": null,
"mp_size": 1,
"max_train_batch_size": null,
"min_train_batch_size": 1,
"max_train_micro_batch_size_per_gpu": 1.024000e+03,
"min_train_micro_batch_size_per_gpu": 1,
"num_tuning_micro_batch_sizes": 3
}
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] bfloat16_enabled ............. True
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] bfloat16_immediate_grad_update True
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] checkpoint_parallel_write_pipeline False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] checkpoint_tag_validation_enabled True
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] checkpoint_tag_validation_fail False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] comms_config ................. <deepspeed.comm.config.DeepSpeedCommsConfig object at 0x4003234d06b0>
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] communication_data_type ...... None
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] compile_config ............... deepcompile=False free_activation=False offload_activation=False offload_opt_states=False double_buffer=True symmetric_memory=False debug_log=False offload_parameters=False sync_before_reduce=False sync_after_reduce=False sync_before_allgather=False sync_after_allgather=False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] compression_config ........... {'weight_quantization': {'shared_parameters': {'enabled': False, 'quantizer_kernel': False, 'schedule_offset': 0, 'quantize_groups': 1, 'quantize_verbose': False, 'quantization_type': 'symmetric', 'quantize_weight_in_forward': False, 'rounding': 'nearest', 'fp16_mixed_quantize': False, 'quantize_change_ratio': 0.001}, 'different_groups': {}}, 'activation_quantization': {'shared_parameters': {'enabled': False, 'quantization_type': 'symmetric', 'range_calibration': 'dynamic', 'schedule_offset': 1000}, 'different_groups': {}}, 'sparse_pruning': {'shared_parameters': {'enabled': False, 'method': 'l1', 'schedule_offset': 1000}, 'different_groups': {}}, 'row_pruning': {'shared_parameters': {'enabled': False, 'method': 'l1', 'schedule_offset': 1000}, 'different_groups': {}}, 'head_pruning': {'shared_parameters': {'enabled': False, 'method': 'topk', 'schedule_offset': 1000}, 'different_groups': {}}, 'channel_pruning': {'shared_parameters': {'enabled': False, 'method': 'l1', 'schedule_offset': 1000}, 'different_groups': {}}, 'layer_reduction': {'enabled': False}}
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] curriculum_enabled_legacy .... False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] curriculum_params_legacy ..... False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] data_efficiency_config ....... {'enabled': False, 'seed': 1234, 'data_sampling': {'enabled': False, 'num_epochs': 1000, 'num_workers': 0, 'pin_memory': False, 'curriculum_learning': {'enabled': False}, 'dynamic_batching': {'enabled': False, 'lr_scaling_method': 'linear', 'min_batch_size': 1, 'max_batch_size': None, 'sequence_picking_order': 'dataloader', 'verbose': False}}, 'data_routing': {'enabled': False, 'random_ltd': {'enabled': False, 'layer_token_lr_schedule': {'enabled': False}}}}
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] data_efficiency_enabled ...... False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] dataloader_drop_last ......... False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] disable_allgather ............ False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] dump_state ................... False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] dynamic_loss_scale_args ...... None
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] eigenvalue_enabled ........... False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] eigenvalue_gas_boundary_resolution 1
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] eigenvalue_layer_name ........ bert.encoder.layer
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] eigenvalue_layer_num ......... 0
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] eigenvalue_max_iter .......... 100
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] eigenvalue_stability ......... 1e-06
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] eigenvalue_tol ............... 0.01
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] eigenvalue_verbose ........... False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] elasticity_enabled ........... False
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] flops_profiler_config ........ {
"enabled": false,
"recompute_fwd_factor": 0.0,
"profile_step": 1,
"module_depth": -1,
"top_modules": 1,
"detailed": true,
"output_file": null
}
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] fp16_auto_cast ............... None
[2025-09-23 22:55:14,899] [INFO] [config.py:1007:print] fp16_enabled ................. False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] fp16_master_weights_and_gradients False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] global_rank .................. 0
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] grad_accum_dtype ............. None
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] gradient_accumulation_steps .. 1
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] gradient_clipping ............ 1.0
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] gradient_predivide_factor .... 1.0
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] graph_harvesting ............. False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] hybrid_engine ................ enabled=False max_out_tokens=512 inference_tp_size=1 release_inference_cache=False pin_parameters=True tp_gather_partition_size=8
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] initial_dynamic_scale ........ 1
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] load_universal_checkpoint .... False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] loss_scale ................... 1.0
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] memory_breakdown ............. False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] mics_hierarchial_params_gather False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] mics_shard_size .............. -1
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] monitor_config ............... tensorboard=TensorBoardConfig(enabled=False, output_path='', job_name='DeepSpeedJobName') comet=CometConfig(enabled=False, samples_log_interval=100, project=None, workspace=None, api_key=None, experiment_name=None, experiment_key=None, online=None, mode=None) wandb=WandbConfig(enabled=False, group=None, team=None, project='deepspeed') csv_monitor=CSVConfig(enabled=False, output_path='', job_name='DeepSpeedJobName')
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] nebula_config ................ {
"enabled": false,
"persistent_storage_path": null,
"persistent_time_interval": 100,
"num_of_version_in_retention": 2,
"enable_nebula_load": true,
"load_path": null
}
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] optimizer_legacy_fusion ...... False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] optimizer_name ............... None
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] optimizer_params ............. None
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] pipeline ..................... {'stages': 'auto', 'partition': 'best', 'seed_layers': False, 'activation_checkpoint_interval': 0, 'pipe_partitioned': True, 'grad_partitioned': True}
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] pld_enabled .................. False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] pld_params ................... False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] prescale_gradients ........... False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] scheduler_name ............... None
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] scheduler_params ............. None
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] seq_parallel_communication_data_type torch.float32
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] sparse_attention ............. None
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] sparse_gradients_enabled ..... False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] steps_per_print .............. inf
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] tensor_parallel_config ....... dtype=torch.float16 autotp_size=0 tp_overlap_comm=False tensor_parallel=TPConfig(tp_size=1, tp_grain_size=1, mpu=None, tp_group=None) injection_policy_tuple=None keep_module_on_host=False replace_with_kernel_inject=False
[2025-09-23 22:55:14,900] [INFO] [config.py:1007:print] timers_config ................ enabled=True synchronized=True
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] train_batch_size ............. 4
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] train_micro_batch_size_per_gpu 1
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] use_data_before_expert_parallel_ False
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] use_node_local_storage ....... False
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] wall_clock_breakdown ......... False
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] weight_quantization_config ... None
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] world_size ................... 4
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] zero_allow_untested_optimizer True
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] zero_config .................. stage=2 contiguous_gradients=True reduce_scatter=True reduce_bucket_size=500000000 use_multi_rank_bucket_allreduce=True allgather_partitions=True allgather_bucket_size=500000000 overlap_comm=False load_from_fp32_weights=True elastic_checkpoint=False offload_param=None offload_optimizer=None sub_group_size=1000000000 cpu_offload_param=None cpu_offload_use_pin_memory=None cpu_offload=None prefetch_bucket_size=50000000 param_persistence_threshold=100000 model_persistence_threshold=9223372036854775807 max_live_parameters=1000000000 max_reuse_distance=1000000000 gather_16bit_weights_on_model_save=False module_granularity_threshold=0 use_all_reduce_for_fetch_params=False stage3_gather_fp16_weights_on_model_save=False ignore_unused_parameters=True legacy_stage1=False round_robin_gradients=True zero_hpz_partition_size=1 zero_quantized_weights=False zero_quantized_nontrainable_weights=False zero_quantized_gradients=False zeropp_loco_param=None mics_shard_size=-1 mics_hierarchical_params_gather=False memory_efficient_linear=True pipeline_loading_checkpoint=False override_module_apply=True log_trace_cache_warnings=False
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] zero_enabled ................. True
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] zero_force_ds_cpu_optimizer .. True
[2025-09-23 22:55:14,901] [INFO] [config.py:1007:print] zero_optimization_stage ...... 2
[2025-09-23 22:55:14,901] [INFO] [config.py:993:print_user_config] json = {
"train_batch_size": 4,
"train_micro_batch_size_per_gpu": 1,
"gradient_accumulation_steps": 1,
"gradient_clipping": 1.0,
"zero_allow_untested_optimizer": true,
"fp16": {
"enabled": false,
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": true
},
"zero_optimization": {
"stage": 2,
"allgather_partitions": true,
"allgather_bucket_size": 5.000000e+08,
"overlap_comm": false,
"reduce_scatter": true,
"reduce_bucket_size": 5.000000e+08,
"contiguous_gradients": true,
"round_robin_gradients": true
},
"steps_per_print": inf
}
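The printed DeepSpeed config above is internally consistent: DeepSpeed enforces the identity `train_batch_size == train_micro_batch_size_per_gpu * gradient_accumulation_steps * world_size`, and the run's values (1 × 1 × 4 = 4) satisfy it. A minimal check using the numbers from this log:

```python
# DeepSpeed asserts that the effective train batch size equals the per-GPU
# micro batch times gradient accumulation steps times the world size.
# Values below are taken from the config print in this log.
micro_batch_per_gpu = 1          # train_micro_batch_size_per_gpu
gradient_accumulation_steps = 1  # gradient_accumulation_steps
world_size = 4                   # world_size

train_batch_size = micro_batch_per_gpu * gradient_accumulation_steps * world_size
print(train_batch_size)  # matches "train_batch_size": 4 in the config
```

If these three factors do not multiply to the declared `train_batch_size`, DeepSpeed refuses the config at initialization, so this identity is worth verifying before launching a multi-node job.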
[INFO|trainer.py:2409] 2025-09-23 22:55:14,903 >> ***** Running training *****
[INFO|trainer.py:2410] 2025-09-23 22:55:14,903 >> Num examples = 3,998
[INFO|trainer.py:2411] 2025-09-23 22:55:14,903 >> Num Epochs = 1
[INFO|trainer.py:2412] 2025-09-23 22:55:14,903 >> Instantaneous batch size per device = 1
[INFO|trainer.py:2415] 2025-09-23 22:55:14,903 >> Total train batch size (w. parallel, distributed & accumulation) = 4
[INFO|trainer.py:2416] 2025-09-23 22:55:14,903 >> Gradient Accumulation steps = 1
[INFO|trainer.py:2417] 2025-09-23 22:55:14,903 >> Total optimization steps = 1,000
[INFO|trainer.py:2418] 2025-09-23 22:55:14,904 >> Number of trainable parameters = 1,543,714,304
[INFO|integration_utils.py:832] 2025-09-23 22:55:14,905 >> Automatic Weights & Biases logging enabled, to disable set os.environ["WANDB_DISABLED"] = "true"
[rank0]: Traceback (most recent call last):
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 28, in <module>
[rank0]: main()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py", line 19, in main
[rank0]: run_exp()
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank0]: _training_function(config={"args": args, "callbacks": callbacks})
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank0]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank0]: File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 137, in run_sft
[rank0]: train_result = trainer.train(resume_from_checkpoint=training_args.resume_from_checkpoint)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/trainer.py", line 2240, in train
[rank0]: return inner_training_loop(
[rank0]: ^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/trainer.py", line 2464, in _inner_training_loop
[rank0]: self.control = self.callback_handler.on_train_begin(args, self.state, self.control)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/trainer_callback.py", line 506, in on_train_begin
[rank0]: return self.call_event("on_train_begin", args, state, control)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/trainer_callback.py", line 556, in call_event
[rank0]: result = getattr(callback, event)(
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/integrations/integration_utils.py", line 931, in on_train_begin
[rank0]: self.setup(args, state, model, **kwargs)
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/integrations/integration_utils.py", line 858, in setup
[rank0]: self._wandb.init(
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/wandb/sdk/wandb_init.py", line 1581, in init
[rank0]: wandb._sentry.reraise(e)
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/wandb/analytics/sentry.py", line 162, in reraise
[rank0]: raise exc.with_traceback(sys.exc_info()[2])
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/wandb/sdk/wandb_init.py", line 1504, in init
[rank0]: wi.maybe_login(init_settings)
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/wandb/sdk/wandb_init.py", line 190, in maybe_login
[rank0]: wandb_login._login(
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/wandb/sdk/wandb_login.py", line 318, in _login
[rank0]: key, key_status = wlogin.prompt_api_key(referrer=referrer)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/wandb/sdk/wandb_login.py", line 244, in prompt_api_key
[rank0]: raise UsageError("api_key not configured (no-tty). call " + directive)
[rank0]: wandb.errors.errors.UsageError: api_key not configured (no-tty). call wandb.login(key=[your_api_key])
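The `UsageError` above is the actual root cause of this run's failure: the W&B callback tried to prompt for an API key inside a batch job with no TTY. The trainer's own log line earlier ("to disable set os.environ[\"WANDB_DISABLED\"] = \"true\"") points at one remedy; a sketch of the non-interactive options, set before `transformers` initializes its callbacks:

```python
import os

# Option 1: disable the W&B callback entirely (suggested by the trainer log).
os.environ["WANDB_DISABLED"] = "true"

# Option 2: run wandb in offline mode, which skips the login prompt and
# syncs later with `wandb sync`.
os.environ["WANDB_MODE"] = "offline"

# Option 3 (not shown): export WANDB_API_KEY in the job script so
# wandb.login() succeeds without a prompt.
print(os.environ["WANDB_DISABLED"], os.environ["WANDB_MODE"])
```

Any of these must take effect before `trainer.train()` runs on every rank, e.g. via `export` lines in the sbatch script, since the failure happens in `on_train_begin` before the first optimization step.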
[rank0]:[W923 22:55:15.962128199 ProcessGroupNCCL.cpp:1476] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
c619-101:842814:843106 [0] NCCL INFO misc/socket.cc:64 -> 3
c619-101:842814:843106 [0] NCCL INFO misc/socket.cc:80 -> 3
c619-101:842814:843106 [0] NCCL INFO misc/socket.cc:829 -> 3
c619-101:842814:842932 [0] NCCL INFO misc/socket.cc:881 -> 3
c619-101:842814:843106 [0] NCCL INFO comm 0x322cdd00 rank 0 nranks 4 cudaDev 0 busId 901000 - Abort COMPLETE
E0923 22:55:17.278000 842809 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 842814) of binary: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-09-23_22:55:17
host : c619-101.vista.tacc.utexas.edu
rank : 0 (local_rank: 0)
exitcode : 1 (pid: 842814)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
[rank2]:[W923 22:55:17.615702080 TCPStore.cpp:125] [c10d] recvValue failed on SocketImpl(fd=30, addr=[c619-111.vista.tacc.utexas.edu]:42888, remote=[c619-101.vista.tacc.utexas.edu]:37239): failed to recv, got 0 bytes
Exception raised from recvBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:678 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x400038963ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x400008a62950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc6b34 (0x400008a66b34 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7af4 (0x400008a67af4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::check(std::vector<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::allocator<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > const&) + 0x1d4 (0x400008a68bd4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::ProcessGroupNCCL::heartbeatMonitor() + 0x298 (0x40000d7854a8 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cuda.so)
frame #6: <unknown function> + 0xddf3c (0x400038e0df3c in /opt/apps/gcc/14.2.0/lib64/libstdc++.so.6)
frame #7: <unknown function> + 0x82a38 (0x400001212a38 in /lib64/libc.so.6)
frame #8: <unknown function> + 0x2bb9c (0x4000011bbb9c in /lib64/libc.so.6)
[rank2]:[W923 22:55:17.617385366 ProcessGroupNCCL.cpp:1659] [PG ID 0 PG GUID 0(default_pg) Rank 2] Failed to check the "should dump" flag on TCPStore, (maybe TCPStore server has shut down too early), with error: failed to recv, got 0 bytes
srun: error: c619-101: task 0: Exited with exit code 1
W0923 22:55:17.680000 919911 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:900] Sending process 919914 closing signal SIGTERM
[rank1]:[W923 22:55:17.829961534 TCPStore.cpp:125] [c10d] recvValue failed on SocketImpl(fd=30, addr=[c619-102.vista.tacc.utexas.edu]:38164, remote=[c619-101.vista.tacc.utexas.edu]:37239): failed to recv, got 0 bytes
Exception raised from recvBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:678 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x4000657a3ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x4000358a2950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc6b34 (0x4000358a6b34 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7af4 (0x4000358a7af4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::check(std::vector<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::allocator<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > const&) + 0x1d4 (0x4000358a8bd4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::ProcessGroupNCCL::heartbeatMonitor() + 0x298 (0x40003a5c54a8 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cuda.so)
frame #6: <unknown function> + 0xddf3c (0x400065c4df3c in /opt/apps/gcc/14.2.0/lib64/libstdc++.so.6)
frame #7: <unknown function> + 0x82a38 (0x40002e052a38 in /lib64/libc.so.6)
frame #8: <unknown function> + 0x2bb9c (0x40002dffbb9c in /lib64/libc.so.6)
[rank1]:[W923 22:55:17.831606012 ProcessGroupNCCL.cpp:1659] [PG ID 0 PG GUID 0(default_pg) Rank 1] Failed to check the "should dump" flag on TCPStore, (maybe TCPStore server has shut down too early), with error: failed to recv, got 0 bytes
[W923 22:55:18.367826703 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-112.vista.tacc.utexas.edu]:46178, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40006d1d3ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40003d2d2950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x40003d2d59dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x40003d2d7ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x40003d2d9178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x40003d2d970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x40003d2dadec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x4000373c0904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x4000369f59c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x400035a2c79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x400035a2c86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:55:18.153000 919911 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-112.vista.tacc.utexas.edu_919911_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
Traceback (most recent call last):
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 117, in _call_store
return getattr(self._store, store_op)(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
torch.distributed.DistNetworkError: failed to recv, got 0 bytes
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 261, in launch_agent
result = agent.run()
^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/metrics/api.py", line 138, in wrapper
result = f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 711, in run
result = self._invoke_run(role)
^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 906, in _invoke_run
num_nodes_waiting = rdzv_handler.num_nodes_waiting()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 1263, in num_nodes_waiting
self._state_holder.sync()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 437, in sync
get_response = self._backend.get_state()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 75, in get_state
base64_state: bytes = self._call_store("get", self._key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 119, in _call_store
raise RendezvousConnectionError(
torch.distributed.elastic.rendezvous.api.RendezvousConnectionError: The connection to the C10d store has failed. See inner exception for details.
W0923 22:55:18.200000 3463499 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:900] Sending process 3463502 closing signal SIGTERM
W0923 22:55:18.201000 3669164 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:900] Sending process 3669167 closing signal SIGTERM
srun: error: c619-112: task 3: Exited with exit code 1
[W923 22:55:18.897065438 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-111.vista.tacc.utexas.edu]:53322, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40003f8c3ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40000f9c2950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x40000f9c59dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x40000f9c7ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x40000f9c9178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x40000f9c970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x40000f9cadec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x400009ab0904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x4000090e59c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x40000811c79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x40000811c86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:55:18.773000 3463499 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-111.vista.tacc.utexas.edu_3463499_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 22:55:18.906930591 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-111.vista.tacc.utexas.edu]:53322, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x40003f8c3ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x40000f9c2950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x40000f9c59dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x40000f9c7ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x40000f9c9178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x40000f9c970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x40000f9cadec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x400009ab0904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x4000090e59c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x40000811c79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x40000811c86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:55:18.782000 3463499 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-111.vista.tacc.utexas.edu_3463499_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
Traceback (most recent call last):
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 117, in _call_store
return getattr(self._store, store_op)(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
torch.distributed.DistNetworkError: failed to recv, got 0 bytes
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 896, in <module>
main()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in main
run(args)
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 261, in launch_agent
result = agent.run()
^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/metrics/api.py", line 138, in wrapper
result = f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 711, in run
result = self._invoke_run(role)
^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/agent/server/api.py", line 906, in _invoke_run
num_nodes_waiting = rdzv_handler.num_nodes_waiting()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 1263, in num_nodes_waiting
self._state_holder.sync()
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py", line 437, in sync
get_response = self._backend.get_state()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 75, in get_state
base64_state: bytes = self._call_store("get", self._key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/c10d_rendezvous_backend.py", line 119, in _call_store
raise RendezvousConnectionError(
torch.distributed.elastic.rendezvous.api.RendezvousConnectionError: The connection to the C10d store has failed. See inner exception for details.
[W923 22:55:18.948464605 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-102.vista.tacc.utexas.edu]:44354, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
Exception raised from sendBytes at /pytorch/torch/csrc/distributed/c10d/Utils.hpp:653 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0xd4 (0x400067e13ea4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x5dc2950 (0x400037f12950 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x5dc59dc (0x400037f159dc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #3: <unknown function> + 0x5dc7ec4 (0x400037f17ec4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #4: c10d::TCPStore::doWait(c10::ArrayRef<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::chrono::duration<long, std::ratio<1l, 1000l> >) + 0x148 (0x400037f19178 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #5: c10d::TCPStore::doGet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x2c (0x400037f1970c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #6: c10d::TCPStore::get(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x9c (0x400037f1adec in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_cpu.so)
frame #7: <unknown function> + 0xf60904 (0x400032000904 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #8: <unknown function> + 0x5959c0 (0x4000316359c0 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/lib/libtorch_python.so)
frame #9: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db490]
frame #10: _PyObject_MakeTpCall + 0x98 (0x48a8fc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #11: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x48e9dc]
frame #12: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #13: _PyObject_FastCallDictTstate + 0x7c (0x48ca40 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #14: _PyObject_Call_Prepend + 0x13c (0x48ccfc in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #15: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x504284]
frame #16: _PyObject_Call + 0x68 (0x48cf48 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #17: _PyEval_EvalFrameDefault + 0x79c (0x42800c in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #18: PyEval_EvalCode + 0xb4 (0x588d74 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #19: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x585ac8]
frame #20: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x4db5cc]
frame #21: PyObject_Vectorcall + 0x54 (0x48acc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #22: _PyEval_EvalFrameDefault + 0x8864 (0x4300d4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #23: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x60b830]
frame #24: Py_RunMain + 0x684 (0x60c228 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #25: Py_BytesMain + 0x64 (0x60cbc4 in /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python)
frame #26: <unknown function> + 0x2c79c (0x40003066c79c in /lib64/libc.so.6)
frame #27: __libc_start_main + 0x98 (0x40003066c86c in /lib64/libc.so.6)
frame #28: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python() [0x433cb0]
W0923 22:55:18.824000 3669164 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-102.vista.tacc.utexas.edu_3669164_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
[W923 22:55:18.958419981 TCPStore.cpp:106] [c10d] sendBytes failed on SocketImpl(fd=3, addr=[c619-102.vista.tacc.utexas.edu]:44354, remote=[c619-101.vista.tacc.utexas.edu]:12802): Broken pipe
W0923 22:55:18.833000 3669164 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1292] The node 'c619-102.vista.tacc.utexas.edu_3669164_0' has failed to shutdown the rendezvous '375339' due to an error of type RendezvousConnectionError.
srun: error: c619-111: task 2: Exited with exit code 1
srun: error: c619-102: task 1: Exited with exit code 1
================================================================================
[ERROR] Training failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Training failed
[ERROR] Stage error: RuntimeError: Training failed
|
Creating parquet from Arrow format: 100%|██████████| 4/4 [00:00<00:00, 43.94ba/s]
Uploading...: 100%|██████████| 12.9M/12.9M [00:00<00:00, 23.0MB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:00<00:00, 1.15 shards/s]
README.md: 1.26kB [00:00, 10.0MB/s]
metadata/train-00000-of-00001.parquet: 100%|██████████| 9.22k/9.22k [00:00<00:00, 27.8kB/s]
Generating train split: 100%|██████████| 67/67 [00:00<00:00, 17174.01 examples/s]
Creating parquet from Arrow format: 100%|██████████| 1/1 [00:00<00:00, 2341.88ba/s]
Uploading...: 100%|██████████| 9.36k/9.36k [00:00<00:00, 11.0kB/s]
Uploading the dataset shards: 100%|██████████| 1/1 [00:01<00:00, 1.12s/ shards]
| BASELINE_r1_distillation | 114.891498 | true |
2025-09-23T22:57:55.632738 | 2025-09-23T23:07:25.754037 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 71 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 72)
[DEBUG] Training Script #!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash; module load gcc/14 cuda/12.8 nccl/12.4 nvidia_math/12.4
module load gcc/14 cuda/12.8 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "π Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "π Starting multi-node training with 1 GPUs per node across $SLURM_JOB_NUM_NODES nodes"
echo "π Working directory: $(pwd)"
echo "π Python path: $(which python)"
echo "π₯ Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) gcc/13.2.0 => gcc/14.2.0
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPUs per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
[WARNING|2025-09-23 22:58:09] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 22:58:10,806] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[WARNING|2025-09-23 22:58:11] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[WARNING|2025-09-23 22:58:11] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[WARNING|2025-09-23 22:58:11] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 22:58:12,482] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-09-23 22:58:12,586] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[2025-09-23 22:58:12,872] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[2025-09-23 22:58:15,072] [INFO] [comm.py:669:init_distributed] cdb=None
[2025-09-23 22:58:15,072] [INFO] [comm.py:700:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
[INFO|2025-09-23 22:58:15] llamafactory.hparams.parser:406 >> Process rank: 0, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:15,270 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:15,270 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:15,270 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:15,270 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:15,270 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:15,270 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:15,270 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:58:15,450 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|configuration_utils.py:698] 2025-09-23 22:58:15,880 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:58:15,882 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:16,010 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:16,010 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:16,010 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:16,010 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:16,010 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:16,010 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:16,010 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:58:16,187 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:58:16] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
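The warning above comes from newer `datasets` releases, which reject the `trust_remote_code` argument. A minimal sketch of the suggested remediation, assuming the dataset is already in a standard format (the dataset name is taken from the log; the actual `load_dataset` call is commented out because it needs network access):

```python
# Hedged sketch, not LLaMA-Factory's actual code: drop the no-longer-
# supported `trust_remote_code` kwarg before calling load_dataset.
def clean_load_kwargs(kwargs: dict) -> dict:
    """Return a copy of load_dataset kwargs without `trust_remote_code`."""
    return {k: v for k, v in kwargs.items() if k != "trust_remote_code"}

kwargs = clean_load_kwargs({"split": "train", "trust_remote_code": True})
# from datasets import load_dataset  # requires network access
# ds = load_dataset("TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data", **kwargs)
```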
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
Converting format of dataset: 100%|ββββββββββ| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6099 examples [00:00, 19671.70 examples/s]
Converting format of dataset: 7996 examples [00:00, 20528.80 examples/s]
[rank0]:[W923 22:58:17.604541404 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 0] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank-to-GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
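The NCCL warning above suggests binding each rank to an explicit device via `device_id` in `init_process_group()` (supported in recent PyTorch releases). A hedged sketch of that fix; the helper only resolves the per-node GPU index, and the actual distributed call is commented out since it needs a live GPU cluster:

```python
# Illustrative helper: resolve the per-node GPU index from LOCAL_RANK,
# defaulting to 0 when the variable is unset.
def local_device_index(env: dict) -> int:
    """Return the GPU index this rank should bind to."""
    return int(env.get("LOCAL_RANK", "0"))

idx = local_device_index({"LOCAL_RANK": "0"})
# import torch
# import torch.distributed as dist
# dist.init_process_group("nccl", device_id=torch.device(f"cuda:{idx}"))
```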
c619-101:845961:845961 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.212<0>
c619-101:845961:845961 [0] NCCL INFO cudaDriverVersion 12080
c619-101:845961:845961 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-101:845961:845961 [0] NCCL INFO Comm config Blocking set to 1
c619-101:845961:846075 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-101:845961:846075 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.212<0>
c619-101:845961:846075 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-101:845961:846075 [0] NCCL INFO Using network IB
c619-101:845961:846075 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-101:845961:846075 [0] NCCL INFO ncclCommInitRankConfig comm 0x28280b30 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x8c32e28424bbc541 - Init START
[2025-09-23 22:58:17,880] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:58:17] llamafactory.hparams.parser:406 >> Process rank: 2, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[2025-09-23 22:58:17,954] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:58:17] llamafactory.hparams.parser:406 >> Process rank: 1, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,076 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,076 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,076 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,076 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,076 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,076 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,076 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,160 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,160 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,160 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,160 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,160 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,160 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,160 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:58:18,253 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:58:18,339 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[2025-09-23 22:58:18,591] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 22:58:18] llamafactory.hparams.parser:406 >> Process rank: 3, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
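The parser lines above report rank and world size (4 ranks here). To illustrate what that pair typically controls, a minimal round-robin data-sharding sketch; LLaMA-Factory delegates this to its own distributed samplers, so this only shows the underlying idea:

```python
# Illustrative only: round-robin sharding of a dataset across ranks,
# as a distributed sampler would do for world_size = 4.
def shard_for_rank(items, rank: int, world_size: int):
    """Rank r takes items r, r + world_size, r + 2 * world_size, ..."""
    return items[rank::world_size]

# Each of the 4 ranks sees a disjoint slice covering the whole dataset.
parts = [shard_for_rank(list(range(10)), r, 4) for r in range(4)]
```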
[INFO|configuration_utils.py:698] 2025-09-23 22:58:18,674 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:698] 2025-09-23 22:58:18,732 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,795 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,795 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,795 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,795 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,795 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,795 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,795 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,827 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,827 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,827 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,827 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,827 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,827 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,827 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,873 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,873 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,873 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,873 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,873 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,873 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:18,873 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:58:18,975 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:58:19,006 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:58:19] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:58:19,052 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:58:19] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
[INFO|configuration_utils.py:698] 2025-09-23 22:58:19,387 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:19,584 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:19,584 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:19,584 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:19,584 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:19,584 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:19,584 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 22:58:19,584 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 22:58:19,766 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 22:58:19] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
Converting format of dataset: 100%|ββββββββββ| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6146 examples [00:00, 19900.30 examples/s]
Converting format of dataset: 7996 examples [00:00, 20880.05 examples/s]
[rank2]:[W923 22:58:20.471610323 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 2] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank-to-GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-111:3463818:3463818 [0] NCCL INFO cudaDriverVersion 12080
c619-111:3463818:3463818 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.214<0>
c619-111:3463818:3463818 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-111:3463818:3463818 [0] NCCL INFO Comm config Blocking set to 1
Converting format of dataset: 100%|ββββββββββ| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6099 examples [00:00, 19552.89 examples/s]
Converting format of dataset: 7996 examples [00:00, 20598.32 examples/s]
c619-111:3463818:3463932 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-111:3463818:3463932 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.214<0>
c619-111:3463818:3463932 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-111:3463818:3463932 [0] NCCL INFO Using network IB
c619-111:3463818:3463932 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3463818:3463932 [0] NCCL INFO ncclCommInitRankConfig comm 0x4a0c9a70 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x8c32e28424bbc541 - Init START
[rank1]:[W923 22:58:20.622911549 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 1] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank-to-GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-102:3669486:3669486 [0] NCCL INFO cudaDriverVersion 12080
c619-102:3669486:3669486 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.213<0>
c619-102:3669486:3669486 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-102:3669486:3669486 [0] NCCL INFO Comm config Blocking set to 1
c619-102:3669486:3669599 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-102:3669486:3669599 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.213<0>
c619-102:3669486:3669599 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-102:3669486:3669599 [0] NCCL INFO Using network IB
c619-102:3669486:3669599 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3669486:3669599 [0] NCCL INFO ncclCommInitRankConfig comm 0x36c20920 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x8c32e28424bbc541 - Init START
c619-102:3669486:3669599 [0] NCCL INFO RAS client listening socket at ::1<28028>
Converting format of dataset: 100%|ββββββββββ| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6091 examples [00:00, 19559.25 examples/s]
Converting format of dataset: 7996 examples [00:00, 20313.29 examples/s]
[rank3]:[W923 22:58:21.306663619 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 3] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank-to-GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-112:920311:920311 [0] NCCL INFO cudaDriverVersion 12080
c619-112:920311:920311 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.215<0>
c619-112:920311:920311 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-112:920311:920311 [0] NCCL INFO Comm config Blocking set to 1
c619-112:920311:920425 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-112:920311:920425 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.215<0>
c619-112:920311:920425 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-112:920311:920425 [0] NCCL INFO Using network IB
c619-112:920311:920425 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-112:920311:920425 [0] NCCL INFO ncclCommInitRankConfig comm 0x146a66f0 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x8c32e28424bbc541 - Init START
c619-112:920311:920425 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-111:3463818:3463932 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-101:845961:846075 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-102:3669486:3669599 [0] NCCL INFO Bootstrap timings total 0.626523 (create 0.000028, send 0.000127, recv 0.000217, ring 0.595400, delay 0.000000)
c619-111:3463818:3463932 [0] NCCL INFO Bootstrap timings total 0.786739 (create 0.000027, send 0.000151, recv 0.745548, ring 0.000281, delay 0.000001)
c619-101:845961:846075 [0] NCCL INFO Bootstrap timings total 3.683748 (create 0.000029, send 0.000083, recv 3.027296, ring 0.000293, delay 0.000001)
c619-112:920311:920425 [0] NCCL INFO Bootstrap timings total 0.090541 (create 0.000027, send 0.000130, recv 0.000257, ring 0.000325, delay 0.000001)
c619-102:3669486:3669599 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:845961:846075 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-112:920311:920425 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-111:3463818:3463932 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:845961:846075 [0] NCCL INFO comm 0x28280b30 rank 0 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:845961:846075 [0] NCCL INFO Channel 00/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 01/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 02/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 03/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 04/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 05/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 06/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 07/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 08/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 09/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 10/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 11/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 12/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 13/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 14/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 15/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 16/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 17/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 18/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 19/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 20/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 21/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 22/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 23/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 24/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 25/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 26/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 27/64 : 0 1 2 3
c619-102:3669486:3669599 [0] NCCL INFO comm 0x36c20920 rank 1 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:845961:846075 [0] NCCL INFO Channel 28/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 29/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 30/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 31/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 32/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 33/64 : 0 1 2 3
c619-111:3463818:3463932 [0] NCCL INFO comm 0x4a0c9a70 rank 2 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-112:920311:920425 [0] NCCL INFO comm 0x146a66f0 rank 3 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:845961:846075 [0] NCCL INFO Channel 34/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 35/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 36/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 37/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 38/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 39/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 40/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 41/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 42/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 43/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 44/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 45/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 46/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 47/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 48/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 49/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 50/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 51/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 52/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 53/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 54/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 55/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 56/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 57/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 58/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 59/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 60/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 61/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 62/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Channel 63/64 : 0 1 2 3
c619-101:845961:846075 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 2/-1/-1->0->-1 [3] 2/-1/-1->0->-1 [4] 2/-1/-1->0->-1 [5] 2/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 2/-1/-1->0->-1 [9] 2/-1/-1->0->-1 [10] 2/-1/-1->0->-1 [11] 2/-1/-1->0->-1 [12] 2/-1/-1->0->-1 [13] 2/-1/-1->0->-1 [14] 2/-1/-1->0->-1 [15] 2/-1/-1->0->-1 [16] 2/-1/-1->0->-1 [17] 2/-1/-1->0->-1 [18] 2/-1/-1->0->-1 [19] 2/-1/-1->0->-1 [20] 2/-1/-1->0->-1 [21] 2/-1/-1->0->-1 [22] 2/-1/-1->0->-1 [23] 2/-1/-1->0->-1 [24] 2/-1/-1->0->-1 [25] 2/-1/-1->0->-1 [26] 2/-1/-1->0->-1 [27] 2/-1/-1->0->-1 [28] 2/-1/-1->0->-1 [29] 2/-1/-1->0->-1 [30] 2/-1/-1->0->-1 [31] 2/-1/-1->0->-1 [32] -1/-1/-1->0->1 [33] -1/-1/-1->0->1 [34] -1/-1/-1->0->1 [35] -1/-1/-1->0->1 [36] -1/-1/-1->0->1 [37] -1/-1/-1->0->1 [38] -1/-1/-1->0->1 [39] -1/-1/-1->0->1 [40] -1/-1/-1->0->1 [41] -1/-1/-1->0->1 [42] -1/-1/-1->0->1 [43] -1/-1/-1->0->1 [44] -1/-1/-1->0->1 [45] -1/-1/-1->0->1 [46] -1/-1/-1->0->1 [47] -1/-1/-1->0->1 [48] -1/-1/-1->0->1 [49] -1/-1
c619-101:845961:846075 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:845961:846075 [0] NCCL INFO Check P2P Type intraNodeP2pSupport 0 directMode 0
c619-102:3669486:3669599 [0] NCCL INFO Trees [0] -1/-1/-1->1->2 [1] -1/-1/-1->1->2 [2] -1/-1/-1->1->2 [3] -1/-1/-1->1->2 [4] -1/-1/-1->1->2 [5] -1/-1/-1->1->2 [6] -1/-1/-1->1->2 [7] -1/-1/-1->1->2 [8] -1/-1/-1->1->2 [9] -1/-1/-1->1->2 [10] -1/-1/-1->1->2 [11] -1/-1/-1->1->2 [12] -1/-1/-1->1->2 [13] -1/-1/-1->1->2 [14] -1/-1/-1->1->2 [15] -1/-1/-1->1->2 [16] -1/-1/-1->1->2 [17] -1/-1/-1->1->2 [18] -1/-1/-1->1->2 [19] -1/-1/-1->1->2 [20] -1/-1/-1->1->2 [21] -1/-1/-1->1->2 [22] -1/-1/-1->1->2 [23] -1/-1/-1->1->2 [24] -1/-1/-1->1->2 [25] -1/-1/-1->1->2 [26] -1/-1/-1->1->2 [27] -1/-1/-1->1->2 [28] -1/-1/-1->1->2 [29] -1/-1/-1->1->2 [30] -1/-1/-1->1->2 [31] -1/-1/-1->1->2 [32] 2/0/-1->1->3 [33] 2/0/-1->1->3 [34] 2/0/-1->1->3 [35] 2/0/-1->1->3 [36] 2/0/-1->1->3 [37] 2/0/-1->1->3 [38] 2/0/-1->1->3 [39] 2/0/-1->1->3 [40] 2/0/-1->1->3 [41] 2/0/-1->1->3 [42] 2/0/-1->1->3 [43] 2/0/-1->1->3 [44] 2/0/-1->1->3 [45] 2/0/-1->1->3 [46] 2/0/-1->1->3 [47] 2/0/-1->1->3 [48] 2/0/-1->1->3 [49] 2/0/-1->1->3 [50] 2/0/-1->1->3 [51] 2
c619-102:3669486:3669599 [0] NCCL INFO P2P Chunksize set to 131072
c619-111:3463818:3463932 [0] NCCL INFO Trees [0] 1/3/-1->2->0 [1] 1/3/-1->2->0 [2] 1/3/-1->2->0 [3] 1/3/-1->2->0 [4] 1/3/-1->2->0 [5] 1/3/-1->2->0 [6] 1/3/-1->2->0 [7] 1/3/-1->2->0 [8] 1/3/-1->2->0 [9] 1/3/-1->2->0 [10] 1/3/-1->2->0 [11] 1/3/-1->2->0 [12] 1/3/-1->2->0 [13] 1/3/-1->2->0 [14] 1/3/-1->2->0 [15] 1/3/-1->2->0 [16] 1/3/-1->2->0 [17] 1/3/-1->2->0 [18] 1/3/-1->2->0 [19] 1/3/-1->2->0 [20] 1/3/-1->2->0 [21] 1/3/-1->2->0 [22] 1/3/-1->2->0 [23] 1/3/-1->2->0 [24] 1/3/-1->2->0 [25] 1/3/-1->2->0 [26] 1/3/-1->2->0 [27] 1/3/-1->2->0 [28] 1/3/-1->2->0 [29] 1/3/-1->2->0 [30] 1/3/-1->2->0 [31] 1/3/-1->2->0 [32] -1/-1/-1->2->1 [33] -1/-1/-1->2->1 [34] -1/-1/-1->2->1 [35] -1/-1/-1->2->1 [36] -1/-1/-1->2->1 [37] -1/-1/-1->2->1 [38] -1/-1/-1->2->1 [39] -1/-1/-1->2->1 [40] -1/-1/-1->2->1 [41] -1/-1/-1->2->1 [42] -1/-1/-1->2->1 [43] -1/-1/-1->2->1 [44] -1/-1/-1->2->1 [45] -1/-1/-1->2->1 [46] -1/-1/-1->2->1 [47] -1/-1/-1->2->1 [48] -1/-1/-1->2->1 [49] -1/-1/-1->2->1 [50] -1/-1/-1->2->1 [51] -1/-1/-1->2->1 [52] -1/-1/-
c619-111:3463818:3463932 [0] NCCL INFO P2P Chunksize set to 131072
c619-112:920311:920425 [0] NCCL INFO Trees [0] -1/-1/-1->3->2 [1] -1/-1/-1->3->2 [2] -1/-1/-1->3->2 [3] -1/-1/-1->3->2 [4] -1/-1/-1->3->2 [5] -1/-1/-1->3->2 [6] -1/-1/-1->3->2 [7] -1/-1/-1->3->2 [8] -1/-1/-1->3->2 [9] -1/-1/-1->3->2 [10] -1/-1/-1->3->2 [11] -1/-1/-1->3->2 [12] -1/-1/-1->3->2 [13] -1/-1/-1->3->2 [14] -1/-1/-1->3->2 [15] -1/-1/-1->3->2 [16] -1/-1/-1->3->2 [17] -1/-1/-1->3->2 [18] -1/-1/-1->3->2 [19] -1/-1/-1->3->2 [20] -1/-1/-1->3->2 [21] -1/-1/-1->3->2 [22] -1/-1/-1->3->2 [23] -1/-1/-1->3->2 [24] -1/-1/-1->3->2 [25] -1/-1/-1->3->2 [26] -1/-1/-1->3->2 [27] -1/-1/-1->3->2 [28] -1/-1/-1->3->2 [29] -1/-1/-1->3->2 [30] -1/-1/-1->3->2 [31] -1/-1/-1->3->2 [32] 1/-1/-1->3->-1 [33] 1/-1/-1->3->-1 [34] 1/-1/-1->3->-1 [35] 1/-1/-1->3->-1 [36] 1/-1/-1->3->-1 [37] 1/-1/-1->3->-1 [38] 1/-1/-1->3->-1 [39] 1/-1/-1->3->-1 [40] 1/-1/-1->3->-1 [41] 1/-1/-1->3->-1 [42] 1/-1/-1->3->-1 [43] 1/-1/-1->3->-1 [44] 1/-1/-1->3->-1 [45] 1/-1/-1->3->-1 [46] 1/-1/-1->3->-1 [47] 1/-1/-1->3->-1 [48] 1/-1/-1->3->-1 [49] 1/-1/
c619-112:920311:920425 [0] NCCL INFO P2P Chunksize set to 131072
c619-111:3463818:3463936 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 13
c619-111:3463818:3463935 [0] NCCL INFO [Proxy Service] Device 0 CPU core 12
c619-102:3669486:3669602 [0] NCCL INFO [Proxy Service] Device 0 CPU core 51
c619-112:920311:920429 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 64
c619-101:845961:846078 [0] NCCL INFO [Proxy Service] Device 0 CPU core 50
c619-101:845961:846079 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 51
c619-102:3669486:3669603 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 52
c619-112:920311:920428 [0] NCCL INFO [Proxy Service] Device 0 CPU core 63
c619-101:845961:846075 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:845961:846075 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:845961:846075 [0] NCCL INFO CC Off, workFifoBytes 1048576
c619-102:3669486:3669599 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-102:3669486:3669599 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:845961:846075 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-101:845961:846075 [0] NCCL INFO ncclCommInitRankConfig comm 0x28280b30 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x8c32e28424bbc541 - Init COMPLETE
c619-101:845961:846075 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 0 nranks 4 total 3.80 (kernels 0.06, alloc 0.00, bootstrap 3.68, allgathers 0.03, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-102:3669486:3669599 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-102:3669486:3669599 [0] NCCL INFO ncclCommInitRankConfig comm 0x36c20920 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x8c32e28424bbc541 - Init COMPLETE
c619-102:3669486:3669599 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 1 nranks 4 total 0.77 (kernels 0.06, alloc 0.00, bootstrap 0.63, allgathers 0.06, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-111:3463818:3463932 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-111:3463818:3463932 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-112:920311:920425 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-112:920311:920425 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-111:3463818:3463932 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-111:3463818:3463932 [0] NCCL INFO ncclCommInitRankConfig comm 0x4a0c9a70 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x8c32e28424bbc541 - Init COMPLETE
c619-111:3463818:3463932 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 2 nranks 4 total 0.92 (kernels 0.06, alloc 0.00, bootstrap 0.79, allgathers 0.05, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-112:920311:920425 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-112:920311:920425 [0] NCCL INFO ncclCommInitRankConfig comm 0x146a66f0 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x8c32e28424bbc541 - Init COMPLETE
c619-112:920311:920425 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 3 nranks 4 total 0.18 (kernels 0.06, alloc 0.00, bootstrap 0.09, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-102:3669486:3669605 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 54
c619-101:845961:846081 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 53
c619-101:845961:846080 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463938 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 14
c619-101:845961:846080 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920431 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 65
c619-102:3669486:3669604 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3463937 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920430 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669604 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846080 [0] NCCL INFO Connected all trees
c619-102:3669486:3669604 [0] NCCL INFO Connected all trees
c619-112:920311:920430 [0] NCCL INFO Connected all trees
c619-111:3463818:3463937 [0] NCCL INFO Connected all trees
Running tokenizer on dataset: 100%|ββββββββββ| 3998/3998 [00:00<?, ? examples/s]
Running tokenizer on dataset: 4998 examples [00:04, 230.90 examples/s]
Running tokenizer on dataset: 5998 examples [00:08, 229.11 examples/s]
Running tokenizer on dataset: 6998 examples [00:13, 228.58 examples/s]
Running tokenizer on dataset: 7996 examples [00:17, 226.73 examples/s]
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, 11, 3465, 553, 54364, 14817, 13, 1446, 525, 264, 10950, 17847, 13, 151645, 198, 151644, 872, 198, 16141, 279, 2701, 3491, 13, 81917, 697, 32711, 3019, 553, 3019, 13, 3197, 498, 525, 8060, 11, 2968, 697, 4226, 304, 419, 3561, 25, 366, 9217, 2235, 21732, 4226, 12533, 9217, 94367, 2, 22079, 198, 16429, 279, 5109, 304, 279, 1140, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 1855, 458, 23606, 429, 16819, 220, 16, 15, 24, 13, 1446, 646, 990, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 8, 323, 1817, 1372, 646, 1172, 387, 1483, 3055, 13, 4615, 6291, 1265, 2924, 264, 4013, 315, 7354, 330, 8304, 1599, 2974, 1380, 1817, 3019, 374, 264, 35972, 5666, 323, 279, 1590, 3019, 13653, 11508, 311, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 382, 35127, 697, 4226, 304, 279, 2701, 3561, 510, 27, 9217, 397, 7021, 413, 4226, 340, 522, 9217, 1339, 9064, 11993, 21732, 4226, 9940, 374, 279, 1140, 315, 7354, 311, 5545, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 13, 4710, 2461, 3110, 510, 2679, 279, 1140, 315, 5109, 572, 508, 16, 11, 220, 17, 11, 220, 18, 60, 323, 279, 2169, 572, 220, 16, 11, 498, 1410, 3270, 510, 27, 9217, 397, 8304, 220, 16, 25, 220, 16, 488, 220, 17, 284, 220, 18, 198, 8304, 220, 17, 25, 220, 18, 608, 220, 18, 284, 220, 16, 198, 522, 9217, 1339, 269, 4710, 27, 9217, 397, 7, 16, 488, 220, 17, 8, 608, 220, 18, 198, 522, 9217, 1339, 10061, 594, 1744, 3019, 553, 3019, 13, 151645, 198, 151644, 77091, 198, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 
3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 
15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 
28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 
1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 
382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 
369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 
10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 
16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116. something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
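The hand search carried out in the trace above can be reproduced mechanically. The following is a minimal illustrative sketch (the `countdown` helper is hypothetical, not part of LLaMA-Factory or the dataset pipeline): it enumerates every ordering of the three numbers with every pair of operators, evaluated strictly left to right as in the trace.

```python
# Hypothetical brute-force check of the puzzle solved in the trace:
# enumerate left-to-right combinations of the numbers under +, -, *, /.
from itertools import permutations, product
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def countdown(numbers, target):
    hits = []
    for a, b, c in permutations(numbers):
        for o1, o2 in product(OPS, repeat=2):
            try:
                # strict left-to-right evaluation: (a o1 b) o2 c
                result = OPS[o2](OPS[o1](a, b), c)
            except ZeroDivisionError:
                continue
            if abs(result - target) < 1e-9:
                hits.append(f"{a} {o1} {b} {o2} {c} = {target}")
    return hits

solutions = countdown([99, 57, 67], 109)
```

Running this confirms the two orderings discussed in the trace (99 + 67 - 57 and 67 - 57 + 99) both reach 109, along with the symmetric variants.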
label_ids:
[-100, -100, -100, … (prompt positions masked with -100) …, 13708, 766, 397, 40, 614, … (assistant-response token IDs) …, 151645, 198]
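The long run of -100 values in the label_ids above is standard supervised-fine-tuning masking: -100 is the default `ignore_index` of PyTorch's `CrossEntropyLoss`, so prompt positions contribute no loss and only the assistant response is trained on. A minimal sketch of the idea (`build_labels` is an illustrative helper, not LLaMA-Factory's actual preprocessing function):

```python
# Sketch of SFT label masking: the prompt portion of the sequence gets the
# ignore index -100, so only response tokens contribute to the loss.
IGNORE_INDEX = -100  # default ignore_index of torch.nn.CrossEntropyLoss

def build_labels(prompt_ids, response_ids):
    # model input is prompt followed by response
    input_ids = list(prompt_ids) + list(response_ids)
    # labels mirror input_ids, with every prompt position masked out
    labels = [IGNORE_INDEX] * len(prompt_ids) + list(response_ids)
    return input_ids, labels

inp, lab = build_labels([151644, 8948, 198], [13708, 766, 397])
```

This reproduces the shape seen in the dump: a block of -100 entries exactly as long as the prompt, then the response token IDs verbatim.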
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116. something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
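The left-to-right evaluation argument made in the trace can be checked directly in Python, where `+` and `-` likewise share precedence and associate left to right; a quick sketch:

```python
# Verifying the left-to-right claim: + and - share precedence, so the
# ungrouped forms equal their explicitly parenthesized counterparts.
ungrouped = 99 + 67 - 57
assert ungrouped == (99 + 67) - 57 == 166 - 57 == 109
assert 67 - 57 + 99 == (67 - 57) + 99 == 10 + 99 == 109
```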
Running tokenizer on dataset: 100%|██████████| 3998/3998 [00:00<?, ? examples/s]
Running tokenizer on dataset: 7996 examples [00:17, 226.29 examples/s]
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, … (chat-template prompt and response token IDs; dump truncated in the captured log) …
28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 
1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 
382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 
369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 
10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 
16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. Subtracting 67 gives 156 - 67 = 89, which is less than 109: too small, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: 67 + 57 = 124, and 124 - 15 = 109, but I don't have a 15; I have 99, and 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. Per the example, though, intermediate results should be fine.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 = 6633, divided by 57? Since 57*100 = 5700 and 57*16 = 912, 57*116 = 6612, and 6633 - 6612 = 21, so 6633 / 57 = 116 + 21/57: not an integer, and not 109.
Not good.
57 * 67: 50*67 = 3350 and 7*67 = 469, so 57*67 = 3819. Divided by 99? 99*38 = 3762 and 3819 - 3762 = 57, so 3819 / 99 = 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*80 = 5360 and 67*4 = 268, so 67*84 = 5628, and 5643 - 5628 = 15, so 5643 / 67 = 84 + 15/67, not an integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57. Checking: 166 - 50 = 116, and 116 - 7 = 109, so 166 - 57 = 109. Yes!
So 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me check whether division helps; it's probably not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, and 10 is not one of the given numbers; it's an intermediate result. Per the example, that should be acceptable.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
The problem says "create an equation", but it allows either steps or a single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have the same precedence and evaluate left to right.
So 99 + 67 - 57 evaluates as 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, 67 - 57 + 99 evaluates as 67 - 57 = 10, then 10 + 99 = 109, also correct.
Both work without parentheses because of the left-to-right evaluation.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
In the list, 57 is second, but the equation 99 + 67 - 57 uses all three numbers.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, which is close: 114 - 5 = 109, but I have 99 and 67, no 5. And I don't have a 2 to multiply by in the first place; each number can be used only once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.3, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.3, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
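The hand search in the reasoning above (trying sums, differences, products, and quotients of the three numbers) can be automated. The sketch below is illustrative only — the `solutions` helper and its tolerance are assumptions, not part of the logged pipeline; it enumerates both groupings of two-operation expressions over [99, 57, 67] and confirms 109 is reachable:

```python
from itertools import permutations
import operator

# Hypothetical brute-force countdown solver (not from the training pipeline).
OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def solutions(numbers, target, eps=1e-9):
    """Return expression strings over `numbers` (each used once) equal to `target`."""
    found = []
    for a, b, c in permutations(numbers):
        for s1, f1 in OPS.items():
            for s2, f2 in OPS.items():
                try:
                    # Left-grouped: (a op1 b) op2 c
                    if abs(f2(f1(a, b), c) - target) < eps:
                        found.append(f"({a} {s1} {b}) {s2} {c}")
                    # Right-grouped: a op1 (b op2 c)
                    if abs(f1(a, f2(b, c)) - target) < eps:
                        found.append(f"{a} {s1} ({b} {s2} {c})")
                except ZeroDivisionError:
                    pass
    return found

print(solutions([99, 57, 67], 109))
```

Both forms found in the transcript, `(99 + 67) - 57` and `99 + (67 - 57)`, appear in the output.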
label_ids:
[-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 
2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 
16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 
382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 
2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 
11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116. something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
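The decoded transcript above settles on `99 + 67 - 57 = 109`, relying on the fact that `+` and `-` share precedence and evaluate left to right. As a quick illustrative sanity check (not part of the original log, and assuming the puzzle's rule that each of [99, 57, 67] is used exactly once), both the single-equation form and the two-step form can be verified:

```python
# Hypothetical verification of the countdown-style solution from the
# transcript; nums/target mirror the problem statement in the log.
nums = [99, 57, 67]
target = 109

# Single-equation form: + and - have equal precedence, left to right,
# so no parentheses are needed.
assert 99 + 67 - 57 == target

# Step-by-step form (Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109).
step1 = 67 - 57
step2 = 99 + step1
assert step1 == 10 and step2 == target

print("both forms reach", target)  # both forms reach 109
```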
Running tokenizer on dataset: 7996 examples [00:17, 224.65 examples/s]
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, 11, 3465, 553, 54364, 14817, 13, 1446, 525, 264, 10950, 17847, 13, 151645, 198, 151644, 872, 198, 16141, 279, 2701, 3491, 13, 81917, 697, 32711, 3019, 553, 3019, 13, 3197, 498, 525, 8060, 11, 2968, 697, 4226, 304, 419, 3561, 25, 366, 9217, 2235, 21732, 4226, 12533, 9217, 94367, 2, 22079, 198, 16429, 279, 5109, 304, 279, 1140, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 1855, 458, 23606, 429, 16819, 220, 16, 15, 24, 13, 1446, 646, 990, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 8, 323, 1817, 1372, 646, 1172, 387, 1483, 3055, 13, 4615, 6291, 1265, 2924, 264, 4013, 315, 7354, 330, 8304, 1599, 2974, 1380, 1817, 3019, 374, 264, 35972, 5666, 323, 279, 1590, 3019, 13653, 11508, 311, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 382, 35127, 697, 4226, 304, 279, 2701, 3561, 510, 27, 9217, 397, 7021, 413, 4226, 340, 522, 9217, 1339, 9064, 11993, 21732, 4226, 9940, 374, 279, 1140, 315, 7354, 311, 5545, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 13, 4710, 2461, 3110, 510, 2679, 279, 1140, 315, 5109, 572, 508, 16, 11, 220, 17, 11, 220, 18, 60, 323, 279, 2169, 572, 220, 16, 11, 498, 1410, 3270, 510, 27, 9217, 397, 8304, 220, 16, 25, 220, 16, 488, 220, 17, 284, 220, 18, 198, 8304, 220, 17, 25, 220, 18, 608, 220, 18, 284, 220, 16, 198, 522, 9217, 1339, 269, 4710, 27, 9217, 397, 7, 16, 488, 220, 17, 8, 608, 220, 18, 198, 522, 9217, 1339, 10061, 594, 1744, 3019, 553, 3019, 13, 151645, 198, 151644, 77091, 198, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 
3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 
15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 
28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 
1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 
382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 
369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 
10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 
16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
label_ids:
[-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 
2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 
16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 
382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 
2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 
11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have a 2, and each number can be used only once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
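The conclusion reached above (that only addition and subtraction hit 109 from these numbers) can be checked exhaustively with a short script. This is an illustrative sketch, limited to left-grouped expressions `(a op1 b) op2 c`; it is not part of the original log.

```python
# Brute-force every left-grouped expression (a op1 b) op2 c over all
# orderings of the numbers, collecting those that evaluate to the target.
from itertools import permutations, product

def solve(nums, target):
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b if b != 0 else None}
    hits = set()
    for a, b, c in permutations(nums):
        for o1, o2 in product(ops, repeat=2):
            partial = ops[o1](a, b)
            if partial is None:  # skip division by zero
                continue
            value = ops[o2](partial, c)
            if value is not None and abs(value - target) < 1e-9:
                hits.add(f"({a} {o1} {b}) {o2} {c}")
    return hits

solutions = solve([99, 57, 67], 109)
assert "(99 + 67) - 57" in solutions
# every solution found uses only + and -, matching the reasoning above
assert all("*" not in s and "/" not in s for s in solutions)
```

Run this way, the search also surfaces (99 - 57) + 67 = 109, an ordering the transcript does not mention.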
Running tokenizer on dataset: 100%|██████████| 3998/3998 [00:00<?, ? examples/s]
Running tokenizer on dataset: 7996 examples [00:18, 221.20 examples/s]
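The "Running tokenizer on dataset" lines above come from mapping a tokenize function over every example. A minimal sketch of that idea follows; the on-the-fly whitespace vocabulary is a toy stand-in, not the real Qwen BPE tokenizer used in this run.

```python
# Map a tokenize function over a dataset, in the spirit of the progress
# lines above. Ids are assigned from a vocabulary built on the fly.
vocab = {}

def tokenize(example):
    ids = [vocab.setdefault(tok, len(vocab)) for tok in example["text"].split()]
    return {"input_ids": ids, "length": len(ids)}

dataset = [
    {"text": "Step 1: 67 - 57 = 10"},
    {"text": "Step 2: 99 + 10 = 109"},
]
tokenized = [tokenize(ex) for ex in dataset]
assert tokenized[0]["length"] == 7
assert tokenized[0]["input_ids"][0] == tokenized[1]["input_ids"][0]  # "Step" maps to one id
```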
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, 11, 3465, 553, 54364, 14817, 13, 1446, 525, 264, 10950, 17847, 13, 151645, 198, 151644, 872, 198, 16141, 279, 2701, 3491, 13, 81917, 697, 32711, 3019, 553, 3019, 13, 3197, 498, 525, 8060, 11, 2968, 697, 4226, 304, 419, 3561, 25, 366, 9217, 2235, 21732, 4226, 12533, 9217, 94367, 2, 22079, 198, 16429, 279, 5109, 304, 279, 1140, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 1855, 458, 23606, 429, 16819, 220, 16, 15, 24, 13, 1446, 646, 990, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 8, 323, 1817, 1372, 646, 1172, 387, 1483, 3055, 13, 4615, 6291, 1265, 2924, 264, 4013, 315, 7354, 330, 8304, 1599, 2974, 1380, 1817, 3019, 374, 264, 35972, 5666, 323, 279, 1590, 3019, 13653, 11508, 311, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 382, 35127, 697, 4226, 304, 279, 2701, 3561, 510, 27, 9217, 397, 7021, 413, 4226, 340, 522, 9217, 1339, 9064, 11993, 21732, 4226, 9940, 374, 279, 1140, 315, 7354, 311, 5545, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 13, 4710, 2461, 3110, 510, 2679, 279, 1140, 315, 5109, 572, 508, 16, 11, 220, 17, 11, 220, 18, 60, 323, 279, 2169, 572, 220, 16, 11, 498, 1410, 3270, 510, 27, 9217, 397, 8304, 220, 16, 25, 220, 16, 488, 220, 17, 284, 220, 18, 198, 8304, 220, 17, 25, 220, 18, 608, 220, 18, 284, 220, 16, 198, 522, 9217, 1339, 269, 4710, 27, 9217, 397, 7, 16, 488, 220, 17, 8, 608, 220, 18, 198, 522, 9217, 1339, 10061, 594, 1744, 3019, 553, 3019, 13, 151645, 198, 151644, 77091, 198, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 
3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 
15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 
28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 
1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 
382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 
369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 
10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 
16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
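The `label_ids` printed at the end of this dump repeat these `input_ids` with the prompt portion replaced by -100, the conventional ignore index for cross-entropy loss, so training only fits the assistant response. A minimal sketch of that construction; the example ids below are copied from the dump, and reading 151644 as Qwen's ChatML start marker is an assumption.

```python
# Build SFT labels: mask every prompt token with -100 so the loss is
# computed only over the assistant's response tokens.
IGNORE_INDEX = -100

def build_labels(prompt_ids, response_ids):
    return [IGNORE_INDEX] * len(prompt_ids) + list(response_ids)

prompt_ids = [151644, 8948, 198]   # first tokens of the system turn in the dump
response_ids = [13708, 766, 397]   # first tokens of the assistant's <think> block
labels = build_labels(prompt_ids, response_ids)
assert labels == [-100, -100, -100, 13708, 766, 397]
```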
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate 57 * 116: 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84: 67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, and 10 is not one of the given numbers, it's an intermediate result. In context that should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
The problem says "create an equation", but it allows either steps or a single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have a 2, and each number can be used only once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
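The transcript's claim that + and - share precedence and evaluate left to right can be checked directly; this is an illustrative check, not part of the log.

```python
# Both orderings discussed in the transcript reach 109 without
# parentheses, because + and - associate left to right in Python
# exactly as in conventional arithmetic.
target = 109
assert 99 + 67 - 57 == target            # (99 + 67) - 57 = 166 - 57
assert 67 - 57 + 99 == target            # (67 - 57) + 99 = 10 + 99
assert (99 + 67) - 57 == 99 + (67 - 57)  # parentheses don't change the value here
```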
label_ids:
[-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 
2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 
16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 
382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 
2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 
11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. Subtracting 67 gives 156 - 67 = 89, which is less than 109, so that doesn't work.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 57: 166 - 50 = 116, and 116 - 7 = 109. Yes! So 166 - 57 = 109.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, and 10 is not one of the original numbers; it's an intermediate result. In this context, that should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
The problem says "create an equation", but it allows either a series of steps or a single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
In the list, 57 is second, but in the equation I have 99 + 67 - 57, which uses all three numbers.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, which is close to 109, and 114 - 5 = 109, but I have no 5 available; I have 99 and 67.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 β 116. something, not 109.
Not good.
67 / 57 β 1.175, times 99 β 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
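The search the transcript performs by hand can be sketched as a small brute-force: try every ordering of the numbers and every choice of operators, evaluated left to right as the trace discusses. This is an illustrative sketch (the name `countdown` is made up), not part of the training pipeline.

```python
from itertools import permutations, product

def countdown(numbers, target):
    """Brute-force the puzzle: for every ordering of the numbers and every
    choice of +, -, *, / between them, evaluate left to right and collect
    every expression that hits the target."""
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b if b != 0 else None,
    }
    solutions = set()
    for perm in permutations(numbers):
        for chosen in product(ops, repeat=len(numbers) - 1):
            acc, expr = perm[0], str(perm[0])
            for sym, n in zip(chosen, perm[1:]):
                acc = ops[sym](acc, n)
                if acc is None:  # division by zero: abandon this branch
                    break
                expr += f" {sym} {n}"
            if acc is not None and abs(acc - target) < 1e-9:
                solutions.add(expr)
    return sorted(solutions)

# Both expressions found in the transcript appear among the solutions.
print(countdown([99, 57, 67], 109))
```

All solutions it finds for this instance use only addition and subtraction, consistent with the transcript's conclusion that multiplication and division dead-end here.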
[INFO|configuration_utils.py:698] 2025-09-23 22:58:40,327 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 22:58:40,327 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|2025-09-23 22:58:40] llamafactory.model.model_utils.kv_cache:143 >> KV cache is disabled during training.
[INFO|modeling_utils.py:1150] 2025-09-23 22:58:40,692 >> loading weights file model.safetensors from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/model.safetensors
[INFO|modeling_utils.py:2240] 2025-09-23 22:58:40,700 >> Instantiating Qwen2ForCausalLM model under default dtype torch.bfloat16.
[INFO|configuration_utils.py:1135] 2025-09-23 22:58:40,701 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"eos_token_id": 151645,
"use_cache": false
}
[INFO|modeling_utils.py:5130] 2025-09-23 22:58:41,266 >> All model checkpoint weights were used when initializing Qwen2ForCausalLM.
[INFO|modeling_utils.py:5138] 2025-09-23 22:58:41,266 >> All the weights of Qwen2ForCausalLM were initialized from the model checkpoint at Qwen/Qwen2.5-1.5B-Instruct.
If your task is similar to the task the model of the checkpoint was trained on, you can already use Qwen2ForCausalLM for predictions without further training.
[INFO|configuration_utils.py:1090] 2025-09-23 22:58:41,343 >> loading configuration file generation_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/generation_config.json
[INFO|configuration_utils.py:1135] 2025-09-23 22:58:41,344 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"repetition_penalty": 1.1,
"temperature": 0.7,
"top_k": 20,
"top_p": 0.8
}
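The decoding parameters in the GenerationConfig above (temperature 0.7, top_k 20, top_p 0.8) act as successive filters on the next-token distribution. A minimal sketch of that filtering follows; it is not HF's exact implementation, and `sample_filter` is an illustrative name:

```python
import math

def sample_filter(logits, temperature=0.7, top_k=20, top_p=0.8):
    """Sketch of the decoding filters named in the generation config:
    scale logits by temperature, keep the top_k tokens, renormalize,
    then keep the smallest prefix whose probability mass reaches top_p."""
    scaled = sorted(((l / temperature, i) for i, l in enumerate(logits)),
                    reverse=True)
    scaled = scaled[:top_k]                       # top-k cut
    m = scaled[0][0]                              # subtract max for stability
    exps = [math.exp(l - m) for l, _ in scaled]
    z = sum(exps)
    kept, mass = [], 0.0
    for (l, i), e in zip(scaled, exps):
        p = e / z
        kept.append((i, p))
        mass += p
        if mass >= top_p:                         # nucleus (top-p) cut
            break
    return kept  # (token_id, prob) pairs eligible for sampling

print(sample_filter([2.0, 1.0, 0.0], temperature=1.0, top_k=2, top_p=0.7))
```

The `repetition_penalty` field is a separate logits transform applied before these filters; it is omitted from the sketch.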
[INFO|2025-09-23 22:58:41] llamafactory.model.model_utils.checkpointing:143 >> Gradient checkpointing enabled.
[INFO|2025-09-23 22:58:41] llamafactory.model.model_utils.attention:143 >> Using torch SDPA for faster training and inference.
[INFO|2025-09-23 22:58:41] llamafactory.model.adapter:143 >> Upcasting trainable params to float32.
[INFO|2025-09-23 22:58:41] llamafactory.model.adapter:143 >> Fine-tuning method: Full
[INFO|2025-09-23 22:58:41] llamafactory.model.loader:143 >> trainable params: 1,543,714,304 || all params: 1,543,714,304 || trainable%: 100.0000
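The trainable-parameter count logged in the line above can be reproduced from the Qwen2 config printed earlier. A quick sanity check, assuming Qwen2's usual layout (tied embeddings counted once, biases on q/k/v projections only, head_dim = hidden_size // num_attention_heads):

```python
# Reproduce the parameter count of Qwen2.5-1.5B-Instruct from its config.
hidden, heads, kv_heads = 1536, 12, 2
layers, inter, vocab = 28, 8960, 151936

head_dim = hidden // heads            # 128
kv_dim = kv_heads * head_dim          # 256

embed = vocab * hidden                          # tied with LM head: counted once
attn = hidden * hidden + hidden                 # q_proj weight + bias
attn += 2 * (hidden * kv_dim + kv_dim)          # k_proj, v_proj weights + biases
attn += hidden * hidden                         # o_proj (no bias)
mlp = 3 * hidden * inter                        # gate_proj, up_proj, down_proj
norms = 2 * hidden                              # two RMSNorm weights per layer

total = embed + layers * (attn + mlp + norms) + hidden  # + final RMSNorm
print(total)  # 1543714304, matching the logged "all params" figure
```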
[INFO|trainer.py:756] 2025-09-23 22:58:41,447 >> Using auto half precision backend
[2025-09-23 22:58:41,668] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
[INFO|modeling_utils.py:5130] 2025-09-23 22:58:44,841 >> All model checkpoint weights were used when initializing Qwen2ForCausalLM.
[INFO|modeling_utils.py:5138] 2025-09-23 22:58:44,841 >> All the weights of Qwen2ForCausalLM were initialized from the model checkpoint at Qwen/Qwen2.5-1.5B-Instruct.
If your task is similar to the task the model of the checkpoint was trained on, you can already use Qwen2ForCausalLM for predictions without further training.
[INFO|configuration_utils.py:1090] 2025-09-23 22:58:44,918 >> loading configuration file generation_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/generation_config.json
[INFO|configuration_utils.py:1135] 2025-09-23 22:58:44,918 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"repetition_penalty": 1.1,
"temperature": 0.7,
"top_k": 20,
"top_p": 0.8
}
[INFO|modeling_utils.py:5130] 2025-09-23 22:58:44,981 >> All model checkpoint weights were used when initializing Qwen2ForCausalLM.
[INFO|modeling_utils.py:5138] 2025-09-23 22:58:44,981 >> All the weights of Qwen2ForCausalLM were initialized from the model checkpoint at Qwen/Qwen2.5-1.5B-Instruct.
If your task is similar to the task the model of the checkpoint was trained on, you can already use Qwen2ForCausalLM for predictions without further training.
[INFO|2025-09-23 22:58:44] llamafactory.model.model_utils.checkpointing:143 >> Gradient checkpointing enabled.
[INFO|2025-09-23 22:58:44] llamafactory.model.model_utils.attention:143 >> Using torch SDPA for faster training and inference.
[INFO|2025-09-23 22:58:44] llamafactory.model.adapter:143 >> Upcasting trainable params to float32.
[INFO|2025-09-23 22:58:44] llamafactory.model.adapter:143 >> Fine-tuning method: Full
[INFO|modeling_utils.py:5130] 2025-09-23 22:58:44,987 >> All model checkpoint weights were used when initializing Qwen2ForCausalLM.
[INFO|modeling_utils.py:5138] 2025-09-23 22:58:44,987 >> All the weights of Qwen2ForCausalLM were initialized from the model checkpoint at Qwen/Qwen2.5-1.5B-Instruct.
If your task is similar to the task the model of the checkpoint was trained on, you can already use Qwen2ForCausalLM for predictions without further training.
[INFO|2025-09-23 22:58:45] llamafactory.model.loader:143 >> trainable params: 1,543,714,304 || all params: 1,543,714,304 || trainable%: 100.0000
[INFO|trainer.py:756] 2025-09-23 22:58:45,027 >> Using auto half precision backend
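The loader line reports trainable params equal to all params (trainable% = 100.0000), which is what full fine-tuning implies: every tensor has requires_grad set. A sketch of how such a summary line is computed (hypothetical helper, not LLaMA-Factory's actual code):

```python
def summarize_params(param_counts, trainable_flags):
    """Build a 'trainable params || all params || trainable%' summary.
    param_counts: element count per tensor; trainable_flags: requires_grad per tensor."""
    total = sum(param_counts)
    trainable = sum(n for n, t in zip(param_counts, trainable_flags) if t)
    pct = 100.0 * trainable / total
    return (f"trainable params: {trainable:,} || "
            f"all params: {total:,} || trainable%: {pct:.4f}")

# Full fine-tuning: every tensor trainable, so trainable% is 100.0000.
print(summarize_params([1_000_000_000, 543_714_304], [True, True]))
```

With a LoRA run instead, only the adapter tensors would have `True` flags and trainable% would be a fraction of a percent.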
[2025-09-23 22:58:45,262] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
[2025-09-23 22:58:45,390] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
[2025-09-23 22:58:45,409] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed info: version=0.16.9, git-hash=unknown, git-branch=unknown
[2025-09-23 22:58:45,409] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
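The DeepSpeed config lines report world_size = 4: one rank per node across the four hosts (c619-101, c619-102, c619-111, c619-112). A hedged sketch of how a launcher derives that count (env var names here are illustrative; torchrun exposes WORLD_SIZE directly):

```python
import os

def compute_world_size(env=None):
    """Total ranks = nodes x processes per node.
    For this run: 4 nodes x 1 visible GPU each = 4.
    Hypothetical helper; real launchers set WORLD_SIZE themselves."""
    if env is None:
        env = os.environ
    nnodes = int(env.get("NNODES", "1"))
    nproc_per_node = int(env.get("NPROC_PER_NODE", "1"))
    return nnodes * nproc_per_node

print(compute_world_size({"NNODES": "4", "NPROC_PER_NODE": "1"}))  # 4
```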
c619-101:845961:845961 [0] NCCL INFO Comm config Blocking set to 1
c619-101:845961:846157 [0] NCCL INFO Using network IB
c619-101:845961:846157 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3463818:3463818 [0] NCCL INFO Comm config Blocking set to 1
c619-102:3669486:3669486 [0] NCCL INFO Comm config Blocking set to 1
c619-112:920311:920311 [0] NCCL INFO Comm config Blocking set to 1
c619-101:845961:846157 [0] NCCL INFO ncclCommInitRankConfig comm 0x31f63cc0 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x41c1426f62138c70 - Init START
c619-111:3463818:3464013 [0] NCCL INFO Using network IB
c619-102:3669486:3669680 [0] NCCL INFO Using network IB
c619-111:3463818:3464013 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-112:920311:920506 [0] NCCL INFO Using network IB
c619-102:3669486:3669680 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3463818:3464013 [0] NCCL INFO ncclCommInitRankConfig comm 0x51c28a80 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x41c1426f62138c70 - Init START
c619-112:920311:920506 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3669486:3669680 [0] NCCL INFO ncclCommInitRankConfig comm 0x3ed369e0 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x41c1426f62138c70 - Init START
c619-112:920311:920506 [0] NCCL INFO ncclCommInitRankConfig comm 0x16b43940 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x41c1426f62138c70 - Init START
c619-111:3463818:3464013 [0] NCCL INFO Bootstrap timings total 0.000811 (create 0.000028, send 0.000104, recv 0.000315, ring 0.000211, delay 0.000000)
c619-101:845961:846157 [0] NCCL INFO Bootstrap timings total 0.001032 (create 0.000026, send 0.000078, recv 0.000339, ring 0.000136, delay 0.000000)
c619-112:920311:920506 [0] NCCL INFO Bootstrap timings total 0.000747 (create 0.000034, send 0.000089, recv 0.000306, ring 0.000171, delay 0.000000)
c619-102:3669486:3669680 [0] NCCL INFO Bootstrap timings total 0.000798 (create 0.000033, send 0.000094, recv 0.000181, ring 0.000335, delay 0.000000)
c619-102:3669486:3669680 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-112:920311:920506 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-111:3463818:3464013 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:845961:846157 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-111:3463818:3464013 [0] NCCL INFO comm 0x51c28a80 rank 2 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:845961:846157 [0] NCCL INFO comm 0x31f63cc0 rank 0 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:845961:846157 [0] NCCL INFO Channel 00/64 : 0 1 2 3
c619-102:3669486:3669680 [0] NCCL INFO comm 0x3ed369e0 rank 1 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:845961:846157 [0] NCCL INFO Channel 01/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 02/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 03/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 04/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 05/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 06/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 07/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 08/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 09/64 : 0 1 2 3
c619-112:920311:920506 [0] NCCL INFO comm 0x16b43940 rank 3 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:845961:846157 [0] NCCL INFO Channel 10/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 11/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 12/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 13/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 14/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 15/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 16/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 17/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 18/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 19/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 20/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 21/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 22/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 23/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 24/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 25/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 26/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 27/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 28/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 29/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 30/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 31/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 32/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 33/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 34/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 35/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 36/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 37/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 38/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 39/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 40/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 41/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 42/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 43/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 44/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 45/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 46/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 47/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 48/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 49/64 : 0 1 2 3
c619-111:3463818:3464013 [0] NCCL INFO Trees [0] 1/3/-1->2->0 [1] 1/3/-1->2->0 [2] 1/3/-1->2->0 [3] 1/3/-1->2->0 [4] 1/3/-1->2->0 [5] 1/3/-1->2->0 [6] 1/3/-1->2->0 [7] 1/3/-1->2->0 [8] 1/3/-1->2->0 [9] 1/3/-1->2->0 [10] 1/3/-1->2->0 [11] 1/3/-1->2->0 [12] 1/3/-1->2->0 [13] 1/3/-1->2->0 [14] 1/3/-1->2->0 [15] 1/3/-1->2->0 [16] 1/3/-1->2->0 [17] 1/3/-1->2->0 [18] 1/3/-1->2->0 [19] 1/3/-1->2->0 [20] 1/3/-1->2->0 [21] 1/3/-1->2->0 [22] 1/3/-1->2->0 [23] 1/3/-1->2->0 [24] 1/3/-1->2->0 [25] 1/3/-1->2->0 [26] 1/3/-1->2->0 [27] 1/3/-1->2->0 [28] 1/3/-1->2->0 [29] 1/3/-1->2->0 [30] 1/3/-1->2->0 [31] 1/3/-1->2->0 [32] -1/-1/-1->2->1 [33] -1/-1/-1->2->1 [34] -1/-1/-1->2->1 [35] -1/-1/-1->2->1 [36] -1/-1/-1->2->1 [37] -1/-1/-1->2->1 [38] -1/-1/-1->2->1 [39] -1/-1/-1->2->1 [40] -1/-1/-1->2->1 [41] -1/-1/-1->2->1 [42] -1/-1/-1->2->1 [43] -1/-1/-1->2->1 [44] -1/-1/-1->2->1 [45] -1/-1/-1->2->1 [46] -1/-1/-1->2->1 [47] -1/-1/-1->2->1 [48] -1/-1/-1->2->1 [49] -1/-1/-1->2->1 [50] -1/-1/-1->2->1 [51] -1/-1/-1->2->1 [52] -1/-1/-
c619-101:845961:846157 [0] NCCL INFO Channel 50/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 51/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 52/64 : 0 1 2 3
c619-111:3463818:3464013 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:845961:846157 [0] NCCL INFO Channel 53/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 54/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 55/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 56/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 57/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 58/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 59/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 60/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 61/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 62/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Channel 63/64 : 0 1 2 3
c619-101:845961:846157 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 2/-1/-1->0->-1 [3] 2/-1/-1->0->-1 [4] 2/-1/-1->0->-1 [5] 2/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 2/-1/-1->0->-1 [9] 2/-1/-1->0->-1 [10] 2/-1/-1->0->-1 [11] 2/-1/-1->0->-1 [12] 2/-1/-1->0->-1 [13] 2/-1/-1->0->-1 [14] 2/-1/-1->0->-1 [15] 2/-1/-1->0->-1 [16] 2/-1/-1->0->-1 [17] 2/-1/-1->0->-1 [18] 2/-1/-1->0->-1 [19] 2/-1/-1->0->-1 [20] 2/-1/-1->0->-1 [21] 2/-1/-1->0->-1 [22] 2/-1/-1->0->-1 [23] 2/-1/-1->0->-1 [24] 2/-1/-1->0->-1 [25] 2/-1/-1->0->-1 [26] 2/-1/-1->0->-1 [27] 2/-1/-1->0->-1 [28] 2/-1/-1->0->-1 [29] 2/-1/-1->0->-1 [30] 2/-1/-1->0->-1 [31] 2/-1/-1->0->-1 [32] -1/-1/-1->0->1 [33] -1/-1/-1->0->1 [34] -1/-1/-1->0->1 [35] -1/-1/-1->0->1 [36] -1/-1/-1->0->1 [37] -1/-1/-1->0->1 [38] -1/-1/-1->0->1 [39] -1/-1/-1->0->1 [40] -1/-1/-1->0->1 [41] -1/-1/-1->0->1 [42] -1/-1/-1->0->1 [43] -1/-1/-1->0->1 [44] -1/-1/-1->0->1 [45] -1/-1/-1->0->1 [46] -1/-1/-1->0->1 [47] -1/-1/-1->0->1 [48] -1/-1/-1->0->1 [49] -1/-1
c619-101:845961:846157 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:845961:846157 [0] NCCL INFO Check P2P Type intraNodeP2pSupport 0 directMode 0
c619-102:3669486:3669680 [0] NCCL INFO Trees [0] -1/-1/-1->1->2 [1] -1/-1/-1->1->2 [2] -1/-1/-1->1->2 [3] -1/-1/-1->1->2 [4] -1/-1/-1->1->2 [5] -1/-1/-1->1->2 [6] -1/-1/-1->1->2 [7] -1/-1/-1->1->2 [8] -1/-1/-1->1->2 [9] -1/-1/-1->1->2 [10] -1/-1/-1->1->2 [11] -1/-1/-1->1->2 [12] -1/-1/-1->1->2 [13] -1/-1/-1->1->2 [14] -1/-1/-1->1->2 [15] -1/-1/-1->1->2 [16] -1/-1/-1->1->2 [17] -1/-1/-1->1->2 [18] -1/-1/-1->1->2 [19] -1/-1/-1->1->2 [20] -1/-1/-1->1->2 [21] -1/-1/-1->1->2 [22] -1/-1/-1->1->2 [23] -1/-1/-1->1->2 [24] -1/-1/-1->1->2 [25] -1/-1/-1->1->2 [26] -1/-1/-1->1->2 [27] -1/-1/-1->1->2 [28] -1/-1/-1->1->2 [29] -1/-1/-1->1->2 [30] -1/-1/-1->1->2 [31] -1/-1/-1->1->2 [32] 2/0/-1->1->3 [33] 2/0/-1->1->3 [34] 2/0/-1->1->3 [35] 2/0/-1->1->3 [36] 2/0/-1->1->3 [37] 2/0/-1->1->3 [38] 2/0/-1->1->3 [39] 2/0/-1->1->3 [40] 2/0/-1->1->3 [41] 2/0/-1->1->3 [42] 2/0/-1->1->3 [43] 2/0/-1->1->3 [44] 2/0/-1->1->3 [45] 2/0/-1->1->3 [46] 2/0/-1->1->3 [47] 2/0/-1->1->3 [48] 2/0/-1->1->3 [49] 2/0/-1->1->3 [50] 2/0/-1->1->3 [51] 2
c619-102:3669486:3669680 [0] NCCL INFO P2P Chunksize set to 131072
c619-112:920311:920506 [0] NCCL INFO Trees [0] -1/-1/-1->3->2 [1] -1/-1/-1->3->2 [2] -1/-1/-1->3->2 [3] -1/-1/-1->3->2 [4] -1/-1/-1->3->2 [5] -1/-1/-1->3->2 [6] -1/-1/-1->3->2 [7] -1/-1/-1->3->2 [8] -1/-1/-1->3->2 [9] -1/-1/-1->3->2 [10] -1/-1/-1->3->2 [11] -1/-1/-1->3->2 [12] -1/-1/-1->3->2 [13] -1/-1/-1->3->2 [14] -1/-1/-1->3->2 [15] -1/-1/-1->3->2 [16] -1/-1/-1->3->2 [17] -1/-1/-1->3->2 [18] -1/-1/-1->3->2 [19] -1/-1/-1->3->2 [20] -1/-1/-1->3->2 [21] -1/-1/-1->3->2 [22] -1/-1/-1->3->2 [23] -1/-1/-1->3->2 [24] -1/-1/-1->3->2 [25] -1/-1/-1->3->2 [26] -1/-1/-1->3->2 [27] -1/-1/-1->3->2 [28] -1/-1/-1->3->2 [29] -1/-1/-1->3->2 [30] -1/-1/-1->3->2 [31] -1/-1/-1->3->2 [32] 1/-1/-1->3->-1 [33] 1/-1/-1->3->-1 [34] 1/-1/-1->3->-1 [35] 1/-1/-1->3->-1 [36] 1/-1/-1->3->-1 [37] 1/-1/-1->3->-1 [38] 1/-1/-1->3->-1 [39] 1/-1/-1->3->-1 [40] 1/-1/-1->3->-1 [41] 1/-1/-1->3->-1 [42] 1/-1/-1->3->-1 [43] 1/-1/-1->3->-1 [44] 1/-1/-1->3->-1 [45] 1/-1/-1->3->-1 [46] 1/-1/-1->3->-1 [47] 1/-1/-1->3->-1 [48] 1/-1/-1->3->-1 [49] 1/-1/
c619-112:920311:920506 [0] NCCL INFO P2P Chunksize set to 131072
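Each "Channel XX/64 : 0 1 2 3" line above gives the ring order, and the later [send]/[receive] lines follow from it: in a ring, rank r receives from (r-1) mod n and sends to (r+1) mod n, which is why rank 1 receives from 0, rank 3 from 2, and rank 0 from 3. A minimal sketch of that neighbor computation:

```python
def ring_peers(rank: int, nranks: int):
    """Ring all-reduce neighbors: receive from the previous rank,
    send to the next rank, wrapping around modulo nranks."""
    return (rank - 1) % nranks, (rank + 1) % nranks  # (recv_from, send_to)

for r in range(4):
    recv_from, send_to = ring_peers(r, 4)
    print(f"rank {r}: receives from {recv_from}, sends to {send_to}")
```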
c619-101:845961:846158 [0] NCCL INFO [Proxy Service] Device 0 CPU core 4
c619-111:3463818:3464014 [0] NCCL INFO [Proxy Service] Device 0 CPU core 13
c619-102:3669486:3669681 [0] NCCL INFO [Proxy Service] Device 0 CPU core 65
c619-102:3669486:3669682 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 66
c619-112:920311:920507 [0] NCCL INFO [Proxy Service] Device 0 CPU core 65
c619-101:845961:846159 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 5
c619-112:920311:920508 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 66
c619-111:3463818:3464015 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 14
c619-102:3669486:3669680 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-102:3669486:3669680 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-102:3669486:3669680 [0] NCCL INFO ncclCommInitRankConfig comm 0x3ed369e0 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x41c1426f62138c70 - Init COMPLETE
c619-102:3669486:3669680 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 1 nranks 4 total 0.01 (kernels 0.00, alloc 0.00, bootstrap 0.00, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-112:920311:920506 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-112:920311:920506 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-111:3463818:3464013 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-111:3463818:3464013 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-112:920311:920506 [0] NCCL INFO ncclCommInitRankConfig comm 0x16b43940 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x41c1426f62138c70 - Init COMPLETE
c619-111:3463818:3464013 [0] NCCL INFO ncclCommInitRankConfig comm 0x51c28a80 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x41c1426f62138c70 - Init COMPLETE
c619-112:920311:920506 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 3 nranks 4 total 0.02 (kernels 0.00, alloc 0.00, bootstrap 0.00, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-111:3463818:3464013 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 2 nranks 4 total 0.02 (kernels 0.00, alloc 0.00, bootstrap 0.00, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-102:3669486:3669684 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 67
c619-102:3669486:3669683 [0] NCCL INFO Channel 00/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 01/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 02/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 03/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 04/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 05/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 06/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 07/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 08/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 09/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 10/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 11/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 12/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 13/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 14/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 15/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 16/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 17/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 18/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 19/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 20/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 21/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846157 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:845961:846157 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-102:3669486:3669683 [0] NCCL INFO Channel 22/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 23/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920510 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 68
c619-102:3669486:3669683 [0] NCCL INFO Channel 24/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 25/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 26/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846157 [0] NCCL INFO CC Off, workFifoBytes 1048576
c619-112:920311:920509 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 27/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 28/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 29/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464017 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 15
c619-112:920311:920509 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 30/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 31/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846157 [0] NCCL INFO ncclCommInitRankConfig comm 0x31f63cc0 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x41c1426f62138c70 - Init COMPLETE
c619-102:3669486:3669683 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846157 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 0 nranks 4 total 0.02 (kernels 0.00, alloc 0.00, bootstrap 0.00, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.02, rest 0.00)
c619-102:3669486:3669683 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 37/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 38/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 39/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 40/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 41/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 42/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 43/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 44/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 45/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920509 [0] NCCL INFO Channel 46/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846161 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 6
c619-111:3463818:3464016 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 01/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 02/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 03/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 04/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 05/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 06/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 07/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 08/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 09/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 10/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 11/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 12/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 13/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 14/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 15/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 16/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 17/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 18/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 19/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 20/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 21/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 22/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 23/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 24/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 25/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 26/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 27/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 28/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 29/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 30/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 31/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 32/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 33/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 34/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 35/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 36/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 37/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 38/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 39/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 40/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669683 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 41/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 42/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 43/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 44/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 45/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846160 [0] NCCL INFO Channel 46/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464016 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [send] via NET/IB/1
  [... identical lines for Channels 37/0 through 63/0 : 2[0] -> 3[0] [send] via NET/IB/1 ...]
c619-101:845961:846160 [0] NCCL INFO Channel 47/0 : 3[0] -> 0[0] [receive] via NET/IB/1
  [... identical lines for Channels 48/0 through 63/0 : 3[0] -> 0[0] [receive] via NET/IB/1 ...]
c619-101:845961:846160 [0] NCCL INFO Channel 00/0 : 0[0] -> 1[0] [send] via NET/IB/1
  [... identical lines for Channels 01/0 through 63/0 : 0[0] -> 1[0] [send] via NET/IB/1 ...]
c619-112:920311:920509 [0] NCCL INFO Channel 47/0 : 2[0] -> 3[0] [receive] via NET/IB/1
  [... identical lines for Channels 48/0 through 63/0 : 2[0] -> 3[0] [receive] via NET/IB/1 ...]
c619-112:920311:920509 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [send] via NET/IB/1
  [... identical lines for Channels 01/0 through 63/0 : 3[0] -> 0[0] [send] via NET/IB/1 ...]
c619-101:845961:846160 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-112:920311:920509 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-102:3669486:3669683 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-111:3463818:3464016 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
[2025-09-23 22:58:45,883] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed Flops Profiler Enabled: False
[2025-09-23 22:58:45,886] [INFO] [logging.py:107:log_dist] [Rank 0] Using client Optimizer as basic optimizer
[2025-09-23 22:58:45,886] [INFO] [logging.py:107:log_dist] [Rank 0] Removing param_group that has no 'params' in the basic Optimizer
[2025-09-23 22:58:45,894] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed Basic Optimizer = AdamW
[2025-09-23 22:58:45,894] [INFO] [utils.py:59:is_zero_supported_optimizer] Checking ZeRO support for optimizer=AdamW type=<class 'torch.optim.adamw.AdamW'>
[2025-09-23 22:58:45,894] [INFO] [logging.py:107:log_dist] [Rank 0] Creating torch.bfloat16 ZeRO stage 2 optimizer
[2025-09-23 22:58:45,894] [INFO] [stage_1_and_2.py:150:__init__] Reduce bucket size 500000000
[2025-09-23 22:58:45,894] [INFO] [stage_1_and_2.py:151:__init__] Allgather bucket size 500000000
[2025-09-23 22:58:45,894] [INFO] [stage_1_and_2.py:152:__init__] CPU Offload: False
[2025-09-23 22:58:45,894] [INFO] [stage_1_and_2.py:153:__init__] Round robin gradient partitioning: True
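The optimizer lines above describe a ZeRO stage 2 setup: AdamW as the client optimizer, bf16 parameters, 5e8-element reduce and allgather buckets, no CPU offload, and round-robin gradient partitioning. A minimal DeepSpeed config sketch consistent with those logged values (field values inferred from the log, not the actual config file used in this run) would be:

```json
{
  "bf16": { "enabled": true },
  "zero_optimization": {
    "stage": 2,
    "reduce_bucket_size": 5e8,
    "allgather_bucket_size": 5e8,
    "offload_optimizer": { "device": "none" },
    "round_robin_gradients": true
  },
  "train_micro_batch_size_per_gpu": 1,
  "gradient_accumulation_steps": 1
}
```

With `offload_optimizer.device` set to `"none"`, optimizer states stay on-GPU, matching the `CPU Offload: False` line.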
[INFO|trainer.py:2409] 2025-09-23 22:58:46,373 >> ***** Running training *****
[INFO|trainer.py:2410] 2025-09-23 22:58:46,373 >> Num examples = 3,998
[INFO|trainer.py:2411] 2025-09-23 22:58:46,373 >> Num Epochs = 1
[INFO|trainer.py:2412] 2025-09-23 22:58:46,373 >> Instantaneous batch size per device = 1
[INFO|trainer.py:2415] 2025-09-23 22:58:46,373 >> Total train batch size (w. parallel, distributed & accumulation) = 4
[INFO|trainer.py:2416] 2025-09-23 22:58:46,373 >> Gradient Accumulation steps = 1
[INFO|trainer.py:2417] 2025-09-23 22:58:46,373 >> Total optimization steps = 1,000
[INFO|trainer.py:2418] 2025-09-23 22:58:46,374 >> Number of trainable parameters = 1,543,714,304
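The trainer numbers above are internally consistent; a quick sketch (all values taken directly from the log) of how the effective batch size and step count relate:

```python
import math

# Values taken directly from the trainer log above.
num_examples = 3_998
per_device_batch = 1      # "Instantaneous batch size per device"
grad_accum = 1            # "Gradient Accumulation steps"
world_size = 4            # one GPU per node across the 4 nodes in this run

# "Total train batch size (w. parallel, distributed & accumulation)"
total_batch = per_device_batch * grad_accum * world_size
print(total_batch)        # 4

# One epoch over 3,998 examples at an effective batch of 4:
steps_per_epoch = math.ceil(num_examples / total_batch)
print(steps_per_epoch)    # 1000, matching "Total optimization steps = 1,000"
```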
[2025-09-23 22:58:46,558] [INFO] [utils.py:781:see_memory_usage] Before initializing optimizer states
[2025-09-23 22:58:46,559] [INFO] [utils.py:782:see_memory_usage] MA 4.31 GB Max_MA 4.31 GB CA 4.32 GB Max_CA 4 GB
[2025-09-23 22:58:46,559] [INFO] [utils.py:789:see_memory_usage] CPU Virtual Memory: used = 37.01 GB, percent = 17.4%
[2025-09-23 22:58:46,756] [INFO] [utils.py:781:see_memory_usage] After initializing optimizer states
[2025-09-23 22:58:46,757] [INFO] [utils.py:782:see_memory_usage] MA 4.31 GB Max_MA 5.75 GB CA 5.76 GB Max_CA 6 GB
[2025-09-23 22:58:46,757] [INFO] [utils.py:789:see_memory_usage] CPU Virtual Memory: used = 38.44 GB, percent = 18.1%
[2025-09-23 22:58:46,757] [INFO] [stage_1_and_2.py:557:__init__] optimizer state initialized
[2025-09-23 22:58:46,947] [INFO] [utils.py:781:see_memory_usage] After initializing ZeRO optimizer
[2025-09-23 22:58:46,947] [INFO] [utils.py:782:see_memory_usage] MA 4.31 GB Max_MA 4.31 GB CA 5.76 GB Max_CA 6 GB
[2025-09-23 22:58:46,948] [INFO] [utils.py:789:see_memory_usage] CPU Virtual Memory: used = 38.44 GB, percent = 18.1%
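The logged "MA 4.31 GB" is roughly what one would expect from bf16 weights plus a per-rank fp32 master shard. A back-of-the-envelope check (an estimate only, not DeepSpeed's exact accounting; gradient and activation buffers are ignored):

```python
# Rough per-rank memory estimate for the logged "MA 4.31 GB" figure.
# Assumption: under ZeRO stage 2 with world size 4, each rank holds the full
# bf16 weights plus a 1/4 shard of the fp32 master copy of the parameters.
params = 1_543_714_304                              # "Number of trainable parameters"
world_size = 4
GiB = 1024 ** 3

bf16_weights = params * 2 / GiB                     # ~2.88 GiB
fp32_master_shard = params * 4 / world_size / GiB   # ~1.44 GiB per rank
total_gib = bf16_weights + fp32_master_shard
print(round(total_gib, 2))                          # ~4.31, close to the logged MA
```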
[2025-09-23 22:58:46,950] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed Final Optimizer = DeepSpeedZeroOptimizer
[2025-09-23 22:58:46,950] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed using configured LR scheduler = None
[2025-09-23 22:58:46,950] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed LR Scheduler = None
[2025-09-23 22:58:46,950] [INFO] [logging.py:107:log_dist] [Rank 0] step=0, skipped=0, lr=[0.0, 0.0], mom=[(0.9, 0.95), (0.9, 0.95)]
[2025-09-23 22:58:46,951] [INFO] [config.py:1003:print] DeepSpeedEngine configuration:
[2025-09-23 22:58:46,951] [INFO] [config.py:1007:print] activation_checkpointing_config {
"partition_activations": false,
"contiguous_memory_optimization": false,
"cpu_checkpointing": false,
"number_checkpoints": null,
"synchronize_checkpoint_boundary": false,
"profile": false
}
[2025-09-23 22:58:46,951] [INFO] [config.py:1007:print] aio_config ................... {'block_size': 1048576, 'queue_depth': 8, 'intra_op_parallelism': 1, 'single_submit': False, 'overlap_events': True, 'use_gds': False}
[2025-09-23 22:58:46,951] [INFO] [config.py:1007:print] amp_enabled .................. False
[2025-09-23 22:58:46,951] [INFO] [config.py:1007:print] amp_params ................... False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] autotuning_config ............ {
"enabled": false,
"start_step": null,
"end_step": null,
"metric_path": null,
"arg_mappings": null,
"metric": "throughput",
"model_info": null,
"results_dir": "autotuning_results",
"exps_dir": "autotuning_exps",
"overwrite": true,
"fast": true,
"start_profile_step": 3,
"end_profile_step": 5,
"tuner_type": "gridsearch",
"tuner_early_stopping": 5,
"tuner_num_trials": 50,
"model_info_path": null,
"mp_size": 1,
"max_train_batch_size": null,
"min_train_batch_size": 1,
"max_train_micro_batch_size_per_gpu": 1.024000e+03,
"min_train_micro_batch_size_per_gpu": 1,
"num_tuning_micro_batch_sizes": 3
}
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] bfloat16_enabled ............. True
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] bfloat16_immediate_grad_update True
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] checkpoint_parallel_write_pipeline False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] checkpoint_tag_validation_enabled True
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] checkpoint_tag_validation_fail False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] comms_config ................. <deepspeed.comm.config.DeepSpeedCommsConfig object at 0x400309d10560>
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] communication_data_type ...... None
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] compile_config ............... deepcompile=False free_activation=False offload_activation=False offload_opt_states=False double_buffer=True symmetric_memory=False debug_log=False offload_parameters=False sync_before_reduce=False sync_after_reduce=False sync_before_allgather=False sync_after_allgather=False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] compression_config ........... {'weight_quantization': {'shared_parameters': {'enabled': False, 'quantizer_kernel': False, 'schedule_offset': 0, 'quantize_groups': 1, 'quantize_verbose': False, 'quantization_type': 'symmetric', 'quantize_weight_in_forward': False, 'rounding': 'nearest', 'fp16_mixed_quantize': False, 'quantize_change_ratio': 0.001}, 'different_groups': {}}, 'activation_quantization': {'shared_parameters': {'enabled': False, 'quantization_type': 'symmetric', 'range_calibration': 'dynamic', 'schedule_offset': 1000}, 'different_groups': {}}, 'sparse_pruning': {'shared_parameters': {'enabled': False, 'method': 'l1', 'schedule_offset': 1000}, 'different_groups': {}}, 'row_pruning': {'shared_parameters': {'enabled': False, 'method': 'l1', 'schedule_offset': 1000}, 'different_groups': {}}, 'head_pruning': {'shared_parameters': {'enabled': False, 'method': 'topk', 'schedule_offset': 1000}, 'different_groups': {}}, 'channel_pruning': {'shared_parameters': {'enabled': False, 'method': 'l1', 'schedule_offset': 1000}, 'different_groups': {}}, 'layer_reduction': {'enabled': False}}
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] curriculum_enabled_legacy .... False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] curriculum_params_legacy ..... False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] data_efficiency_config ....... {'enabled': False, 'seed': 1234, 'data_sampling': {'enabled': False, 'num_epochs': 1000, 'num_workers': 0, 'pin_memory': False, 'curriculum_learning': {'enabled': False}, 'dynamic_batching': {'enabled': False, 'lr_scaling_method': 'linear', 'min_batch_size': 1, 'max_batch_size': None, 'sequence_picking_order': 'dataloader', 'verbose': False}}, 'data_routing': {'enabled': False, 'random_ltd': {'enabled': False, 'layer_token_lr_schedule': {'enabled': False}}}}
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] data_efficiency_enabled ...... False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] dataloader_drop_last ......... False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] disable_allgather ............ False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] dump_state ................... False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] dynamic_loss_scale_args ...... None
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] eigenvalue_enabled ........... False
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] eigenvalue_gas_boundary_resolution 1
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] eigenvalue_layer_name ........ bert.encoder.layer
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] eigenvalue_layer_num ......... 0
[2025-09-23 22:58:46,952] [INFO] [config.py:1007:print] eigenvalue_max_iter .......... 100
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] eigenvalue_stability ......... 1e-06
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] eigenvalue_tol ............... 0.01
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] eigenvalue_verbose ........... False
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] elasticity_enabled ........... False
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] flops_profiler_config ........ {
"enabled": false,
"recompute_fwd_factor": 0.0,
"profile_step": 1,
"module_depth": -1,
"top_modules": 1,
"detailed": true,
"output_file": null
}
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] fp16_auto_cast ............... None
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] fp16_enabled ................. False
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] fp16_master_weights_and_gradients False
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] global_rank .................. 0
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] grad_accum_dtype ............. None
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] gradient_accumulation_steps .. 1
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] gradient_clipping ............ 1.0
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] gradient_predivide_factor .... 1.0
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] graph_harvesting ............. False
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] hybrid_engine ................ enabled=False max_out_tokens=512 inference_tp_size=1 release_inference_cache=False pin_parameters=True tp_gather_partition_size=8
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] initial_dynamic_scale ........ 1
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] load_universal_checkpoint .... False
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] loss_scale ................... 1.0
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] memory_breakdown ............. False
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] mics_hierarchial_params_gather False
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] mics_shard_size .............. -1
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] monitor_config ............... tensorboard=TensorBoardConfig(enabled=False, output_path='', job_name='DeepSpeedJobName') comet=CometConfig(enabled=False, samples_log_interval=100, project=None, workspace=None, api_key=None, experiment_name=None, experiment_key=None, online=None, mode=None) wandb=WandbConfig(enabled=False, group=None, team=None, project='deepspeed') csv_monitor=CSVConfig(enabled=False, output_path='', job_name='DeepSpeedJobName')
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] nebula_config ................ {
"enabled": false,
"persistent_storage_path": null,
"persistent_time_interval": 100,
"num_of_version_in_retention": 2,
"enable_nebula_load": true,
"load_path": null
}
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] optimizer_legacy_fusion ...... False
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] optimizer_name ............... None
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] optimizer_params ............. None
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] pipeline ..................... {'stages': 'auto', 'partition': 'best', 'seed_layers': False, 'activation_checkpoint_interval': 0, 'pipe_partitioned': True, 'grad_partitioned': True}
[2025-09-23 22:58:46,953] [INFO] [config.py:1007:print] pld_enabled .................. False
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] pld_params ................... False
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] prescale_gradients ........... False
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] scheduler_name ............... None
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] scheduler_params ............. None
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] seq_parallel_communication_data_type torch.float32
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] sparse_attention ............. None
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] sparse_gradients_enabled ..... False
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] steps_per_print .............. inf
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] tensor_parallel_config ....... dtype=torch.float16 autotp_size=0 tp_overlap_comm=False tensor_parallel=TPConfig(tp_size=1, tp_grain_size=1, mpu=None, tp_group=None) injection_policy_tuple=None keep_module_on_host=False replace_with_kernel_inject=False
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] timers_config ................ enabled=True synchronized=True
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] train_batch_size ............. 4
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] train_micro_batch_size_per_gpu 1
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] use_data_before_expert_parallel_ False
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] use_node_local_storage ....... False
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] wall_clock_breakdown ......... False
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] weight_quantization_config ... None
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] world_size ................... 4
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] zero_allow_untested_optimizer True
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] zero_config .................. stage=2 contiguous_gradients=True reduce_scatter=True reduce_bucket_size=500000000 use_multi_rank_bucket_allreduce=True allgather_partitions=True allgather_bucket_size=500000000 overlap_comm=False load_from_fp32_weights=True elastic_checkpoint=False offload_param=None offload_optimizer=None sub_group_size=1000000000 cpu_offload_param=None cpu_offload_use_pin_memory=None cpu_offload=None prefetch_bucket_size=50000000 param_persistence_threshold=100000 model_persistence_threshold=9223372036854775807 max_live_parameters=1000000000 max_reuse_distance=1000000000 gather_16bit_weights_on_model_save=False module_granularity_threshold=0 use_all_reduce_for_fetch_params=False stage3_gather_fp16_weights_on_model_save=False ignore_unused_parameters=True legacy_stage1=False round_robin_gradients=True zero_hpz_partition_size=1 zero_quantized_weights=False zero_quantized_nontrainable_weights=False zero_quantized_gradients=False zeropp_loco_param=None mics_shard_size=-1 mics_hierarchical_params_gather=False memory_efficient_linear=True pipeline_loading_checkpoint=False override_module_apply=True log_trace_cache_warnings=False
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] zero_enabled ................. True
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] zero_force_ds_cpu_optimizer .. True
[2025-09-23 22:58:46,954] [INFO] [config.py:1007:print] zero_optimization_stage ...... 2
[2025-09-23 22:58:46,954] [INFO] [config.py:993:print_user_config] json = {
"train_batch_size": 4,
"train_micro_batch_size_per_gpu": 1,
"gradient_accumulation_steps": 1,
"gradient_clipping": 1.0,
"zero_allow_untested_optimizer": true,
"fp16": {
"enabled": false,
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": true
},
"zero_optimization": {
"stage": 2,
"allgather_partitions": true,
"allgather_bucket_size": 5.000000e+08,
"overlap_comm": false,
"reduce_scatter": true,
"reduce_bucket_size": 5.000000e+08,
"contiguous_gradients": true,
"round_robin_gradients": true
},
"steps_per_print": inf
}
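The user config printed above can be reproduced as a plain dict; note that DeepSpeed requires the batch-size fields to be mutually consistent with the world size at engine init (bucket sizes written out here; the log renders them as 5.000000e+08):

```python
# The DeepSpeed user config from the log above, as a plain dict
# (the unused fp16 block is omitted since bf16 is enabled).
ds_config = {
    "train_batch_size": 4,
    "train_micro_batch_size_per_gpu": 1,
    "gradient_accumulation_steps": 1,
    "gradient_clipping": 1.0,
    "zero_allow_untested_optimizer": True,
    "bf16": {"enabled": True},
    "zero_optimization": {
        "stage": 2,
        "allgather_partitions": True,
        "allgather_bucket_size": 500_000_000,
        "overlap_comm": False,
        "reduce_scatter": True,
        "reduce_bucket_size": 500_000_000,
        "contiguous_gradients": True,
        "round_robin_gradients": True,
    },
}

# DeepSpeed enforces this invariant when the engine is built:
world_size = 4
assert ds_config["train_batch_size"] == (
    ds_config["train_micro_batch_size_per_gpu"]
    * ds_config["gradient_accumulation_steps"]
    * world_size
)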
[INFO|integration_utils.py:832] 2025-09-23 22:58:46,958 >> Automatic Weights & Biases logging enabled, to disable set os.environ["WANDB_DISABLED"] = "true"
wandb: Currently logged in as: zsprague (ut_nlp_deduce) to https://api.wandb.ai. Use `wandb login --relogin` to force relogin
wandb: creating run
wandb: Tracking run with wandb version 0.22.0
wandb: Run data is saved locally in /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/wandb/run-20250923_225847-rlr328gt
wandb: Run `wandb offline` to turn off syncing.
wandb: Syncing run BASELINE_r1_distillation_sft
wandb: ⭐️ View project at https://wandb.ai/ut_nlp_deduce/BASELINE_r1_distillation_sft
wandb: 🚀 View run at https://wandb.ai/ut_nlp_deduce/BASELINE_r1_distillation_sft/runs/rlr328gt
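As the integration log above itself notes, Weights & Biases logging can be turned off by setting an environment variable before the trainer starts:

```python
import os

# Must be set before transformers initializes its reporting integrations,
# per the integration_utils log message above.
os.environ["WANDB_DISABLED"] = "true"
```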
c619-112:920311:920512 [0] NCCL INFO Channels 32-63/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channels 32-63/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channels 00-31/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channels 00-31/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channels 32-49/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channels 32-47/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channels 00-13/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channels 00-08/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920311:920512 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3463818:3464019 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3669486:3669686 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846219 [0] NCCL INFO Connected all trees
c619-112:920311:920512 [0] NCCL INFO Connected all trees
c619-111:3463818:3464019 [0] NCCL INFO Connected all trees
c619-102:3669486:3669686 [0] NCCL INFO Connected all trees
c619-102:3669486:3669687 [0] NCCL INFO Channel 00/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 01/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 02/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 03/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 04/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 05/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 37/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 06/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 38/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 39/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 37/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 38/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 07/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 08/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 40/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 01/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 41/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 39/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 09/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 02/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 10/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 42/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 03/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 43/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 40/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 41/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 11/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 12/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 04/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 42/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 44/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 45/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 05/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 13/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 14/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 06/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 43/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 44/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 46/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 15/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 47/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 48/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 07/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 45/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 46/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 08/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 16/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 17/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 49/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 50/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 09/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 47/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 18/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 10/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 19/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 51/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 52/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 53/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 11/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 20/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 48/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 49/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 54/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 55/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 12/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 13/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 21/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 22/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 50/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 51/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 56/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 14/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 23/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 24/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 15/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 52/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 16/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 57/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 58/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 25/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 26/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 17/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 53/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 54/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 59/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 18/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 60/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 27/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 61/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 19/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 55/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 56/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 62/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3463818:3464020 [0] NCCL INFO Channel 63/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 20/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 28/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 29/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 21/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 57/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 58/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 22/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 30/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3669486:3669687 [0] NCCL INFO Channel 31/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 23/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 59/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 60/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 24/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 61/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 62/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 25/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 26/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 63/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 27/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 28/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 01/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 02/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 29/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 30/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 03/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 04/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 31/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 32/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 05/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 06/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 33/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 34/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 07/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 08/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 09/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 35/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 36/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 10/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 11/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 12/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 37/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 38/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 13/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 14/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 39/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 40/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 15/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 16/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 41/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 17/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 18/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 42/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 43/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 19/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 20/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 44/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 45/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 21/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 22/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 46/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 47/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 23/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 24/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 25/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 48/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 49/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 26/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 27/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 50/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 51/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 28/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 29/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 52/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 30/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 31/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 53/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 54/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 32/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 33/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 55/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 56/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 34/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 35/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 36/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 57/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 58/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 37/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 38/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 59/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 60/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 39/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 40/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 61/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 62/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 41/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 42/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 63/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 00/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 01/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 43/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 44/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 45/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 46/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 02/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 03/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 47/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 04/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 48/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 05/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 06/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 49/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 50/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 51/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 07/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 08/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 52/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 53/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 09/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 10/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 54/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 55/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 11/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 12/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 56/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 57/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 13/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 14/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 58/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 59/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 15/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 16/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 60/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 61/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 17/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 18/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 62/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920311:920513 [0] NCCL INFO Channel 63/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 19/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 20/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 21/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 22/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 23/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 24/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 25/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 26/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 27/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 28/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 29/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 30/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Channel 31/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:845961:846220 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-102:3669486:3669687 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-111:3463818:3464020 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-112:920311:920513 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
{'loss': 1.167, 'grad_norm': 5.710583686828613, 'learning_rate': 1.8e-07, 'train/length': 195.6, 'epoch': 0.01}
0%| | 0/1000 [00:00<?, ?it/s]
0%| | 1/1000 [00:01<33:09, 1.99s/it]
0%| | 2/1000 [00:02<18:22, 1.10s/it]
0%| | 3/1000 [00:03<14:33, 1.14it/s]
0%| | 4/1000 [00:03<11:40, 1.42it/s]
0%| | 5/1000 [00:03<10:04, 1.65it/s]
1%| | 6/1000 [00:04<09:09, 1.81it/s]
1%| | 7/1000 [00:04<08:38, 1.91it/s]
1%| | 8/1000 [00:05<08:00, 2.06it/s]
1%| | 9/1000 [00:05<08:03, 2.05it/s]
1%| | 10/1000 [00:06<07:38, 2.16it/s]
1%| | 11/1000 [00:06<08:54, 1.85it/s]
1%| | 12/1000 [00:07<08:16, 1.99it/s]
1%|β | 13/1000 [00:07<07:51, 2.09it/s]
1%|β | 14/1000 [00:08<07:18, 2.25it/s]
2%|β | 15/1000 [00:08<07:04, 2.32it/s]
2%|β | 16/1000 [00:08<07:07, 2.30it/s]
2%|β | 17/1000 [00:09<06:57, 2.36it/s]
2%|β | 18/1000 [00:09<07:05, 2.31it/s]
{'loss': 1.1477, 'grad_norm': 4.990312099456787, 'learning_rate': 3.7999999999999996e-07, 'train/length': 196.0, 'epoch': 0.02}
{'loss': 1.1244, 'grad_norm': 4.064210414886475, 'learning_rate': 5.8e-07, 'train/length': 195.9, 'epoch': 0.03}
2%|β | 19/1000 [00:10<07:13, 2.26it/s]
2%|β | 20/1000 [00:10<07:16, 2.25it/s]
2%|β | 21/1000 [00:11<07:06, 2.30it/s]
2%|β | 22/1000 [00:11<07:03, 2.31it/s]
2%|β | 23/1000 [00:12<07:14, 2.25it/s]
2%|β | 24/1000 [00:12<06:51, 2.37it/s]
2%|β | 25/1000 [00:12<06:57, 2.34it/s]
3%|β | 26/1000 [00:13<06:47, 2.39it/s]
3%|β | 27/1000 [00:13<06:54, 2.35it/s]
3%|β | 28/1000 [00:14<06:54, 2.34it/s]
3%|β | 29/1000 [00:14<06:48, 2.37it/s]
3%|β | 30/1000 [00:15<07:28, 2.16it/s]
3%|β | 31/1000 [00:15<07:27, 2.16it/s]
3%|β | 32/1000 [00:15<07:17, 2.21it/s]
3%|β | 33/1000 [00:16<07:49, 2.06it/s]
3%|β {'loss': 1.1121, 'grad_norm': 3.923771619796753, 'learning_rate': 7.799999999999999e-07, 'train/length': 195.9, 'epoch': 0.04}
{'loss': 1.0552, 'grad_norm': 3.0773730278015137, 'learning_rate': 9.8e-07, 'train/length': 1466.0, 'epoch': 0.05}
| 34/1000 [00:16<07:30, 2.14it/s]
4%|β | 35/1000 [00:17<07:28, 2.15it/s]
4%|β | 36/1000 [00:17<07:19, 2.19it/s]
4%|β | 37/1000 [00:18<06:55, 2.32it/s]
4%|β | 38/1000 [00:18<07:36, 2.11it/s]
4%|β | 39/1000 [00:19<07:25, 2.16it/s]
4%|β | 40/1000 [00:19<07:08, 2.24it/s]
4%|β | 41/1000 [00:20<07:41, 2.08it/s]
4%|β | 42/1000 [00:20<07:34, 2.11it/s]
4%|β | 43/1000 [00:21<07:33, 2.11it/s]
4%|β | 44/1000 [00:21<07:01, 2.27it/s]
4%|β | 45/1000 [00:21<07:11, 2.21it/s]
5%|β | 46/1000 [00:22<08:27, 1.88it/s]
5%|β | 47/1000 [00:23<08:49, 1.80it/s]
5%|β | 48/1000 [00:23<07:53, 2.01it/s]
5%|β | 49/1000 [00:24<07:32, 2.10it/s]
5%|β | 50/1000 [00:24<07:18, 2.17it/s]
{'loss': 0.9947, 'grad_norm': 2.5174810886383057, 'learning_rate': 9.997785653888834e-07, 'train/length': 3088.0, 'epoch': 0.06}
5%|β | 51/1000 [00:24<07:21, 2.15it/s]
5%|β | 52/1000 [00:25<07:09, 2.21it/s]
5%|β | 53/1000 [00:25<07:13, 2.18it/s]
5%|β | 54/1000 [00:26<07:03, 2.23it/s]
6%|β | 55/1000 [00:26<07:09, 2.20it/s]
6%|β | 56/1000 [00:27<07:17, 2.16it/s]
6%|β | 57/1000 [00:27<07:55, 1.98it/s]
6%|β | 58/1000 [00:28<07:35, 2.07it/s]
6%|β | 59/1000 [00:28<08:05, 1.94it/s]
6%|β | 60/1000 [00:29<07:38, 2.05it/s]
6%|β | 61/1000 [00:29<07:39, 2.05it/s]
6%|β | 62/1000 [00:30<07:23, 2.11it/s]
6%|β | 63/1000 [00:30<07:09, 2.18it/s]
6%|β | 64/1000 [00:31<07:06, 2.19it/s]
6%|β | 65/1000 [00:31<06:56, 2.24it/s]
7%|β | 66/1000 [00:31<07:00, 2.22it/s]
7%|β | 67/1000 [00:32<06:48, 2.28it/s]
{'loss': 0.9413, 'grad_norm': 2.2107906341552734, 'learning_rate': 9.990133642141357e-07, 'train/length': 2918.3, 'epoch': 0.07}
{'loss': 0.8819, 'grad_norm': 1.9446775913238525, 'learning_rate': 9.977024992520601e-07, 'train/length': 3023.7, 'epoch': 0.08}
7%|β | 68/1000 [00:32<06:27, 2.40it/s]
7%|β | 69/1000 [00:33<06:33, 2.36it/s]
7%|β | 70/1000 [00:33<06:36, 2.35it/s]
7%|β | 71/1000 [00:34<06:31, 2.37it/s]
7%|β | 72/1000 [00:34<06:45, 2.29it/s]
7%|β | 73/1000 [00:34<06:39, 2.32it/s]
7%|β | 74/1000 [00:35<06:32, 2.36it/s]
8%|β | 75/1000 [00:35<06:46, 2.28it/s]
8%|β | 76/1000 [00:36<06:55, 2.22it/s]
8%|β | 77/1000 [00:36<06:50, 2.25it/s]
8%|β | 78/1000 [00:37<06:57, 2.21it/s]
8%|β | 79/1000 [00:37<06:59, 2.20it/s]
8%|β | 80/1000 [00:38<06:54, 2.22it/s]
8%|β | 81/1000 [00:38<06:47, 2.25it/s]
8%|β | 82/1000 [00:38<06:25, 2.38it/s]
8%|β | 83/1000 [00:39<06:10, 2.47it/s]
{'loss': 0.8489, 'grad_norm': 1.998170018196106, 'learning_rate': 9.958474039142469e-07, 'train/length': 3057.5, 'epoch': 0.09}
8%|β | 84/1000 [00:39<07:01, 2.17it/s]
8%|β | 85/1000 [00:40<06:56, 2.20it/s]
9%|β | 86/1000 [00:40<06:55, 2.20it/s]
9%|β | 87/1000 [00:41<06:43, 2.26it/s]
9%|β | 88/1000 [00:41<06:57, 2.19it/s]
9%|β | 89/1000 [00:42<07:06, 2.14it/s]
9%|β | 90/1000 [00:42<07:03, 2.15it/s]
9%|β | 91/1000 [00:43<06:56, 2.18it/s]
9%|β | 92/1000 [00:43<06:54, 2.19it/s]
9%|β | 93/1000 [00:43<06:46, 2.23it/s]
9%|β | 94/1000 [00:44<06:38, 2.27it/s]
10%|β | 95/1000 [00:44<06:18, 2.39it/s]
10%|β | 96/1000 [00:45<06:28, 2.33it/s]
10%|β | 97/1000 [00:45<06:39, 2.26it/s]
10%|β | 98/1000 [00:46<06:37, 2.27it/s]
10%|β | 99/1000 [00:46<06:18, 2.38it/s]
10%|β | 100/1000 [00:46<06:32, 2.29it/s]
{'loss': 0.7973, 'grad_norm': 1.994300365447998, 'learning_rate': 9.934501067202117e-07, 'train/length': 2868.8, 'epoch': 0.1}
{'loss': 0.7529, 'grad_norm': 1.887494683265686, 'learning_rate': 9.905132290792392e-07, 'train/length': 2762.7, 'epoch': 0.11}
10%|β | 101/1000 [00:47<07:04, 2.12it/s]
10%|β | 102/1000 [00:48<07:33, 1.98it/s]
10%|β | 103/1000 [00:48<07:15, 2.06it/s]
10%|β | 104/1000 [00:48<07:06, 2.10it/s]
10%|β | 105/1000 [00:49<06:57, 2.14it/s]
11%|β | 106/1000 [00:49<06:57, 2.14it/s]
11%|β | 107/1000 [00:50<06:50, 2.18it/s]
11%|β | 108/1000 [00:50<06:49, 2.18it/s]
11%|β | 109/1000 [00:51<06:35, 2.25it/s]
11%|β | 110/1000 [00:51<06:34, 2.26it/s]
11%|β | 111/1000 [00:52<06:25, 2.31it/s]
11%|β | 112/1000 [00:52<06:20, 2.34it/s]
11%|ββ | 113/1000 [00:52<06:16, 2.36it/s]
11%|ββ | 114/1000 [00:53<06:25, 2.30it/s]
12%|ββ | 115/1000 [00:53<06:18, 2.34it/s]
12%|ββ | 116/1000 [00:54<06:26, 2.28it/s]
{'loss': 0.7121, 'grad_norm': 1.7834407091140747, 'learning_rate': 9.870399824239114e-07, 'train/length': 2858.4, 'epoch': 0.12}
{'loss': 0.6909, 'grad_norm': 1.926537036895752, 'learning_rate': 9.83034164698452e-07, 'train/length': 2683.0, 'epoch': 0.13}
12%|ββ | 117/1000 [00:54<06:26, 2.28it/s]
12%|ββ | 118/1000 [00:55<06:16, 2.34it/s]
12%|ββ | 119/1000 [00:55<06:16, 2.34it/s]
12%|ββ | 120/1000 [00:55<06:22, 2.30it/s]
12%|ββ | 121/1000 [00:56<06:37, 2.21it/s]
12%|ββ | 122/1000 [00:56<06:24, 2.28it/s]
12%|ββ | 123/1000 [00:57<06:50, 2.14it/s]
12%|ββ | 124/1000 [00:57<06:33, 2.23it/s]
12%|ββ | 125/1000 [00:58<06:22, 2.29it/s]
13%|ββ | 126/1000 [00:58<06:22, 2.28it/s]
13%|ββ | 127/1000 [00:59<07:03, 2.06it/s]
13%|ββ | 128/1000 [00:59<06:51, 2.12it/s]
13%|ββ | 129/1000 [01:00<06:47, 2.14it/s]
13%|ββ | 130/1000 [01:00<06:47, 2.14it/s]
13%|ββ | 131/1000 [01:00<06:39, 2.17it/s]
13%|ββ | 132/1000 [01:01<06:37, 2.19it/s]
13%|ββ | 133/1000 [01:01<06:15, 2.31it/s]
13%|ββ | 134/1000 [01:02<06:16, 2.30it/s]
14%|ββ | 135/1000 [01:02<06:25, 2.25it/s]
14%|ββ | 136/1000 [01:03<06:19, 2.27it/s]
14%|ββ | 137/1000 [01:03<06:15, 2.30it/s]
14%|ββ | 138/1000 [01:04<06:17, 2.28it/s]
14%|ββ | 139/1000 [01:04<06:14, 2.30it/s]
14%|ββ | 140/1000 [01:04<06:10, 2.32it/s]
14%|ββ | 141/1000 [01:05<06:17, 2.28it/s]
14%|ββ | 142/1000 [01:05<07:09, 2.00it/s]
14%|ββ | 143/1000 [01:06<06:49, 2.09it/s]
14%|ββ | 144/1000 [01:06<06:41, 2.13it/s]
14%|ββ | 145/1000 [01:07<07:12, 1.97it/s]
15%|ββ | 146/1000 [01:07<06:53, 2.06it/s]
15%|ββ | 147/1000 [01:08<06:41, 2.12it/s]
{'loss': 0.6598, 'grad_norm': 1.9567701816558838, 'learning_rate': 9.734429148174674e-07, 'train/length': 2772.4, 'epoch': 0.15}
{'loss': 0.6338, 'grad_norm': 2.010066270828247, 'learning_rate': 9.678679705528698e-07, 'train/length': 2924.2, 'epoch': 0.16}
15%|ββ | 148/1000 [01:08<06:29, 2.19it/s]
15%|ββ | 149/1000 [01:09<06:20, 2.24it/s]
15%|ββ | 150/1000 [01:09<06:09, 2.30it/s]
15%|ββ | 151/1000 [01:10<06:08, 2.31it/s]
15%|ββ | 152/1000 [01:10<06:10, 2.29it/s]
15%|ββ | 153/1000 [01:10<06:14, 2.26it/s]
15%|ββ | 154/1000 [01:11<05:55, 2.38it/s]
16%|ββ | 155/1000 [01:11<05:53, 2.39it/s]
16%|ββ | 156/1000 [01:12<06:00, 2.34it/s]
16%|ββ | 157/1000 [01:12<06:08, 2.29it/s]
16%|ββ | 158/1000 [01:13<06:12, 2.26it/s]
16%|ββ | 159/1000 [01:13<06:06, 2.29it/s]
16%|ββ | 160/1000 [01:13<05:59, 2.34it/s]
16%|ββ | 161/1000 [01:14<05:43, 2.44it/s]
16%|ββ | 162/1000 [01:14<06:00, 2.33it/s]
{'loss': 0.652, 'grad_norm': 2.033710241317749, 'learning_rate': 9.61781419531641e-07, 'train/length': 2764.9, 'epoch': 0.17}
16%|ββ | 163/1000 [01:15<06:00, 2.32it/s]
16%|ββ | 164/1000 [01:15<06:12, 2.24it/s]
16%|ββ | 165/1000 [01:16<06:08, 2.27it/s]
17%|ββ | 166/1000 [01:16<06:11, 2.25it/s]
17%|ββ | 167/1000 [01:16<06:05, 2.28it/s]
17%|ββ | 168/1000 [01:17<06:14, 2.22it/s]
17%|ββ | 169/1000 [01:17<05:53, 2.35it/s]
17%|ββ | 170/1000 [01:18<06:00, 2.30it/s]
17%|ββ | 171/1000 [01:18<05:44, 2.41it/s]
17%|ββ | 172/1000 [01:19<06:17, 2.19it/s]
17%|ββ | 173/1000 [01:19<06:11, 2.22it/s]
17%|ββ | 174/1000 [01:20<06:36, 2.08it/s]
18%|ββ | 175/1000 [01:20<06:21, 2.16it/s]
18%|ββ | 176/1000 [01:21<06:13, 2.20it/s]
18%|ββ | 177/1000 [01:21<06:22, 2.15it/s]
18%|ββ | 178/1000 [01:21<05:57, 2.30it/s]
18%|ββ | 179/1000 [01:22<05:53, 2.32it/s]
{'loss': 0.6219, 'grad_norm': 1.9039980173110962, 'learning_rate': 9.551899173079606e-07, 'train/length': 2887.4, 'epoch': 0.18}
{'loss': 0.6422, 'grad_norm': 2.0893874168395996, 'learning_rate': 9.481006715927351e-07, 'train/length': 2856.2, 'epoch': 0.19}
18%|ββ | 180/1000 [01:22<06:07, 2.23it/s]
18%|ββ | 181/1000 [01:23<05:56, 2.30it/s]
18%|ββ | 182/1000 [01:23<05:40, 2.40it/s]
18%|ββ | 183/1000 [01:23<05:46, 2.36it/s]
18%|ββ | 184/1000 [01:24<06:00, 2.26it/s]
18%|ββ | 185/1000 [01:24<06:01, 2.25it/s]
19%|ββ | 186/1000 [01:25<06:08, 2.21it/s]
19%|ββ | 187/1000 [01:26<06:54, 1.96it/s]
19%|ββ | 188/1000 [01:26<07:13, 1.87it/s]
19%|ββ | 189/1000 [01:27<06:57, 1.94it/s]
19%|ββ | 190/1000 [01:27<06:40, 2.02it/s]
19%|ββ | 191/1000 [01:28<06:31, 2.07it/s]
19%|ββ | 192/1000 [01:28<06:16, 2.15it/s]
{'loss': 0.6193, 'grad_norm': 2.3811116218566895, 'learning_rate': 9.405214343720706e-07, 'train/length': 3069.7, 'epoch': 0.2}
 19%|ββ        | 193/1000 [01:28<06:04, 2.22it/s]
19%|ββ | 194/1000 [01:29<06:13, 2.16it/s]
20%|ββ | 195/1000 [01:29<06:32, 2.05it/s]
20%|ββ | 196/1000 [01:30<06:02, 2.22it/s]
20%|ββ | 197/1000 [01:30<06:00, 2.23it/s]
20%|ββ | 198/1000 [01:31<06:34, 2.03it/s]
20%|ββ | 199/1000 [01:31<06:26, 2.07it/s]
20%|ββ | 200/1000 [01:32<06:09, 2.16it/s]
20%|ββ | 201/1000 [01:32<07:11, 1.85it/s]
20%|ββ | 202/1000 [01:33<07:16, 1.83it/s]
20%|ββ | 203/1000 [01:33<06:51, 1.94it/s]
20%|ββ | 204/1000 [01:34<06:39, 1.99it/s]
20%|ββ | 205/1000 [01:34<06:20, 2.09it/s]
21%|ββ | 206/1000 [01:35<06:46, 1.95it/s]
21%|ββ | 207/1000 [01:35<06:20, 2.08it/s]
21%|ββ | 208/1000 [01:36<05:53, 2.24it/s]
21%|ββ | 209/1000 [01:36<05:51, 2.25it/s]
{'loss': 0.5924, 'grad_norm': 2.109375476837158, 'learning_rate': 9.32460493430591e-07, 'train/length': 3422.0, 'epoch': 0.21}
{'loss': 0.5861, 'grad_norm': 2.2051210403442383, 'learning_rate': 9.239266632888658e-07, 'train/length': 3084.5, 'epoch': 0.22}
 21%|ββ        | 210/1000 [01:36<05:34, 2.36it/s]
21%|ββ | 211/1000 [01:37<05:36, 2.35it/s]
21%|ββ | 212/1000 [01:37<05:35, 2.35it/s]
21%|βββ | 213/1000 [01:38<06:10, 2.12it/s]
21%|βββ | 214/1000 [01:38<06:08, 2.14it/s]
22%|βββ | 215/1000 [01:39<05:57, 2.20it/s]
22%|βββ | 216/1000 [01:39<05:48, 2.25it/s]
22%|βββ | 217/1000 [01:40<05:42, 2.28it/s]
22%|βββ | 218/1000 [01:40<05:46, 2.26it/s]
22%|βββ | 219/1000 [01:41<05:50, 2.23it/s]
22%|βββ | 220/1000 [01:41<05:50, 2.22it/s]
22%|βββ | 221/1000 [01:41<05:41, 2.28it/s]
22%|βββ | 222/1000 [01:42<05:42, 2.27it/s]
22%|βββ | 223/1000 [01:42<05:33, 2.33it/s]
{'loss': 0.5918, 'grad_norm': 1.9184916019439697, 'learning_rate': 9.14929275564863e-07, 'train/length': 2711.2, 'epoch': 0.23}
 22%|βββ       | 224/1000 [01:43<05:43, 2.26it/s]
22%|βββ | 225/1000 [01:43<06:09, 2.09it/s]
23%|βββ | 226/1000 [01:44<05:48, 2.22it/s]
23%|βββ | 227/1000 [01:44<05:49, 2.21it/s]
23%|βββ | 228/1000 [01:45<05:44, 2.24it/s]
23%|βββ | 229/1000 [01:45<05:48, 2.21it/s]
23%|βββ | 230/1000 [01:46<05:57, 2.15it/s]
23%|βββ | 231/1000 [01:46<06:19, 2.03it/s]
23%|βββ | 232/1000 [01:47<06:34, 1.95it/s]
23%|βββ | 233/1000 [01:47<06:12, 2.06it/s]
23%|βββ | 234/1000 [01:48<06:09, 2.07it/s]
24%|βββ | 235/1000 [01:48<05:52, 2.17it/s]
24%|βββ | 236/1000 [01:48<05:55, 2.15it/s]
24%|βββ | 237/1000 [01:49<05:42, 2.23it/s]
24%|βββ | 238/1000 [01:49<05:38, 2.25it/s]
24%|βββ | 239/1000 [01:50<05:34, 2.28it/s]
{'loss': 0.5663, 'grad_norm': 2.2633495330810547, 'learning_rate': 9.0547816876996e-07, 'train/length': 3252.8, 'epoch': 0.24}
{'loss': 0.59, 'grad_norm': 1.9557974338531494, 'learning_rate': 8.955836775506775e-07, 'train/length': 2892.5, 'epoch': 0.25}
 24%|βββ       | 240/1000 [01:50<05:30, 2.30it/s]
24%|βββ | 241/1000 [01:51<05:29, 2.31it/s]
24%|βββ | 242/1000 [01:51<05:20, 2.36it/s]
24%|βββ | 243/1000 [01:51<05:47, 2.18it/s]
24%|βββ | 244/1000 [01:52<05:46, 2.18it/s]
24%|βββ | 245/1000 [01:52<05:44, 2.19it/s]
25%|βββ | 246/1000 [01:53<05:36, 2.24it/s]
25%|βββ | 247/1000 [01:53<05:41, 2.21it/s]
25%|βββ | 248/1000 [01:54<05:31, 2.27it/s]
25%|βββ | 249/1000 [01:54<05:27, 2.29it/s]
25%|βββ | 250/1000 [01:55<05:27, 2.29it/s]
25%|βββ | 251/1000 [01:55<05:35, 2.24it/s]
25%|βββ | 252/1000 [01:56<05:39, 2.20it/s]
25%|βββ | 253/1000 [01:56<05:58, 2.08it/s]
25%|βββ | 254/1000 [01:56<05:50, 2.13it/s]
{'loss': 0.5628, 'grad_norm': 2.0340464115142822, 'learning_rate': 8.852566213878946e-07, 'train/length': 2908.4, 'epoch': 0.26}
 26%|βββ       | 255/1000 [01:57<05:45, 2.16it/s]
26%|βββ | 256/1000 [01:57<05:22, 2.30it/s]
26%|βββ | 257/1000 [01:58<05:24, 2.29it/s]
26%|βββ | 258/1000 [01:58<05:23, 2.29it/s]
26%|βββ | 259/1000 [01:59<05:32, 2.23it/s]
26%|βββ | 260/1000 [01:59<05:23, 2.29it/s]
26%|βββ | 261/1000 [02:00<05:46, 2.13it/s]
26%|βββ | 262/1000 [02:00<05:44, 2.14it/s]
26%|βββ | 263/1000 [02:01<05:33, 2.21it/s]
26%|βββ | 264/1000 [02:01<05:37, 2.18it/s]
26%|βββ | 265/1000 [02:01<05:26, 2.25it/s]
27%|βββ | 266/1000 [02:02<05:33, 2.20it/s]
27%|βββ | 267/1000 [02:02<05:28, 2.23it/s]
27%|βββ | 268/1000 [02:03<05:24, 2.26it/s]
27%|βββ | 269/1000 [02:03<05:48, 2.10it/s]
27%|βββ | 270/1000 [02:04<05:46, 2.11it/s]
{'loss': 0.5755, 'grad_norm': 2.2068142890930176, 'learning_rate': 8.745082927659046e-07, 'train/length': 3013.1, 'epoch': 0.27}
{'loss': 0.5649, 'grad_norm': 1.9521008729934692, 'learning_rate': 8.633504448242504e-07, 'train/length': 2889.3, 'epoch': 0.28}
27%|βββ | 271/1000 [02:04<05:41, 2.13it/s]
27%|βββ | 272/1000 [02:05<05:33, 2.19it/s]
27%|βββ | 273/1000 [02:05<05:20, 2.27it/s]
27%|βββ | 274/1000 [02:05<05:15, 2.30it/s]
28%|βββ | 275/1000 [02:06<05:00, 2.41it/s]
28%|βββ | 276/1000 [02:06<05:11, 2.33it/s]
28%|βββ | 277/1000 [02:07<06:25, 1.87it/s]
28%|βββ | 278/1000 [02:08<06:16, 1.92it/s]
28%|βββ | 279/1000 [02:08<06:05, 1.97it/s]
28%|βββ | 280/1000 [02:09<06:01, 1.99it/s]
28%|βββ | 281/1000 [02:09<05:43, 2.09it/s]
28%|βββ | 282/1000 [02:09<05:34, 2.15it/s]
28%|βββ | 283/1000 [02:10<05:32, 2.16it/s]
28%|βββ | 284/1000 [02:10<05:35, 2.14it/s]
{'loss': 0.5507, 'grad_norm': 1.9443554878234863, 'learning_rate': 8.517952785058384e-07, 'train/length': 2762.5, 'epoch': 0.29}
 28%|βββ       | 285/1000 [02:11<05:23, 2.21it/s]
29%|βββ | 286/1000 [02:11<05:25, 2.20it/s]
29%|βββ | 287/1000 [02:12<05:27, 2.18it/s]
29%|βββ | 288/1000 [02:12<05:22, 2.21it/s]
29%|βββ | 289/1000 [02:13<05:22, 2.20it/s]
29%|βββ | 290/1000 [02:13<06:02, 1.96it/s]
29%|βββ | 291/1000 [02:14<05:45, 2.05it/s]
29%|βββ | 292/1000 [02:14<05:34, 2.12it/s]
29%|βββ | 293/1000 [02:15<05:30, 2.14it/s]
29%|βββ | 294/1000 [02:15<05:35, 2.11it/s]
30%|βββ | 295/1000 [02:15<05:29, 2.14it/s]
30%|βββ | 296/1000 [02:16<05:24, 2.17it/s]
30%|βββ | 297/1000 [02:16<05:16, 2.22it/s]
30%|βββ | 298/1000 [02:17<04:58, 2.35it/s]
30%|βββ | 299/1000 [02:17<05:01, 2.33it/s]
30%|βββ | 300/1000 [02:18<04:55, 2.37it/s]
{'loss': 0.5616, 'grad_norm': 2.26216197013855, 'learning_rate': 8.398554292153865e-07, 'train/length': 3278.6, 'epoch': 0.3}
{'loss': 0.5401, 'grad_norm': 2.2310731410980225, 'learning_rate': 8.275439530027947e-07, 'train/length': 2845.7, 'epoch': 0.31}
30%|βββ | 301/1000 [02:18<04:43, 2.47it/s]
30%|βββ | 302/1000 [02:18<04:45, 2.45it/s]
30%|βββ | 303/1000 [02:19<04:36, 2.52it/s]
30%|βββ | 304/1000 [02:19<04:44, 2.45it/s]
30%|βββ | 305/1000 [02:20<04:43, 2.45it/s]
31%|βββ | 306/1000 [02:20<05:00, 2.31it/s]
31%|βββ | 307/1000 [02:20<04:57, 2.33it/s]
31%|βββ | 308/1000 [02:21<04:54, 2.35it/s]
31%|βββ | 309/1000 [02:21<04:52, 2.36it/s]
31%|βββ | 310/1000 [02:22<04:41, 2.45it/s]
31%|βββ | 311/1000 [02:22<04:46, 2.40it/s]
31%|βββ | 312/1000 [02:23<04:44, 2.42it/s]
31%|ββββ | 313/1000 [02:23<04:46, 2.40it/s]
31%|ββββ | 314/1000 [02:23<04:50, 2.36it/s]
{'loss': 0.5392, 'grad_norm': 2.2278757095336914, 'learning_rate': 8.148743122865463e-07, 'train/length': 2786.0, 'epoch': 0.32}
 32%|ββββ      | 315/1000 [02:24<04:59, 2.29it/s]
32%|ββββ | 316/1000 [02:24<04:55, 2.32it/s]
32%|ββββ | 317/1000 [02:25<04:41, 2.43it/s]
32%|ββββ | 318/1000 [02:25<04:42, 2.42it/s]
32%|ββββ | 319/1000 [02:25<04:47, 2.37it/s]
32%|ββββ | 320/1000 [02:26<04:56, 2.29it/s]
32%|ββββ | 321/1000 [02:26<04:55, 2.30it/s]
32%|ββββ | 322/1000 [02:27<05:06, 2.21it/s]
32%|ββββ | 323/1000 [02:27<05:01, 2.24it/s]
32%|ββββ | 324/1000 [02:28<04:56, 2.28it/s]
32%|ββββ | 325/1000 [02:28<04:52, 2.31it/s]
33%|ββββ | 326/1000 [02:29<04:48, 2.33it/s]
33%|ββββ | 327/1000 [02:29<05:10, 2.17it/s]
33%|ββββ | 328/1000 [02:29<04:50, 2.31it/s]
33%|ββββ | 329/1000 [02:30<04:51, 2.30it/s]
{'loss': 0.5439, 'grad_norm': 1.8770525455474854, 'learning_rate': 8.018603611327504e-07, 'train/length': 2700.0, 'epoch': 0.33}
{'loss': 0.5488, 'grad_norm': 2.062666177749634, 'learning_rate': 7.88516330105925e-07, 'train/length': 3127.7, 'epoch': 0.34}
 33%|ββββ      | 330/1000 [02:30<04:50, 2.31it/s]
33%|ββββ | 331/1000 [02:31<04:46, 2.34it/s]
33%|ββββ | 332/1000 [02:31<04:55, 2.26it/s]
33%|ββββ | 333/1000 [02:32<05:04, 2.19it/s]
33%|ββββ | 334/1000 [02:32<04:54, 2.26it/s]
34%|ββββ | 335/1000 [02:33<05:04, 2.19it/s]
34%|ββββ | 336/1000 [02:33<05:26, 2.03it/s]
34%|ββββ | 337/1000 [02:34<05:18, 2.08it/s]
34%|ββββ | 338/1000 [02:34<05:09, 2.14it/s]
34%|ββββ | 339/1000 [02:35<05:56, 1.85it/s]
34%|ββββ | 340/1000 [02:35<05:37, 1.96it/s]
34%|ββββ | 341/1000 [02:36<05:19, 2.06it/s]
34%|ββββ | 342/1000 [02:36<04:56, 2.22it/s]
34%|ββββ | 343/1000 [02:37<05:00, 2.19it/s]
{'loss': 0.5308, 'grad_norm': 2.0756351947784424, 'learning_rate': 7.74856810708083e-07, 'train/length': 2808.5, 'epoch': 0.35}
 34%|ββββ      | 344/1000 [02:37<04:52, 2.24it/s]
34%|ββββ | 345/1000 [02:37<04:50, 2.26it/s]
35%|ββββ | 346/1000 [02:38<04:43, 2.31it/s]
35%|ββββ | 347/1000 [02:38<04:37, 2.35it/s]
35%|ββββ | 348/1000 [02:39<04:39, 2.33it/s]
35%|ββββ | 349/1000 [02:39<04:43, 2.30it/s]
35%|ββββ | 350/1000 [02:40<04:47, 2.26it/s]
35%|ββββ | 351/1000 [02:40<04:43, 2.29it/s]
35%|ββββ | 352/1000 [02:40<04:47, 2.25it/s]
35%|ββββ | 353/1000 [02:41<04:45, 2.27it/s]
35%|ββββ | 354/1000 [02:41<04:41, 2.29it/s]
36%|ββββ | 355/1000 [02:42<04:36, 2.33it/s]
36%|ββββ | 356/1000 [02:42<04:25, 2.42it/s]
36%|ββββ | 357/1000 [02:43<04:36, 2.33it/s]
36%|ββββ | 358/1000 [02:43<04:35, 2.33it/s]
{'loss': 0.5257, 'grad_norm': 1.9412819147109985, 'learning_rate': 7.608967394231386e-07, 'train/length': 2808.8, 'epoch': 0.36}
{'loss': 0.5122, 'grad_norm': 2.1651999950408936, 'learning_rate': 7.466513813840824e-07, 'train/length': 2965.7, 'epoch': 0.37}
 36%|ββββ      | 359/1000 [02:43<04:38, 2.30it/s]
36%|ββββ | 360/1000 [02:44<04:42, 2.26it/s]
36%|ββββ | 361/1000 [02:44<04:28, 2.38it/s]
36%|ββββ | 362/1000 [02:45<04:33, 2.33it/s]
36%|ββββ | 363/1000 [02:45<04:34, 2.32it/s]
36%|ββββ | 364/1000 [02:46<04:36, 2.30it/s]
36%|ββββ | 365/1000 [02:46<04:36, 2.30it/s]
37%|ββββ | 366/1000 [02:46<04:37, 2.29it/s]
37%|ββββ | 367/1000 [02:47<04:43, 2.24it/s]
37%|ββββ | 368/1000 [02:47<04:42, 2.24it/s]
37%|ββββ | 369/1000 [02:48<04:59, 2.11it/s]
37%|ββββ | 370/1000 [02:48<04:49, 2.17it/s]
37%|ββββ | 371/1000 [02:49<04:50, 2.17it/s]
37%|ββββ | 372/1000 [02:49<04:53, 2.14it/s]
{'loss': 0.5302, 'grad_norm': 1.9450197219848633, 'learning_rate': 7.321363136807818e-07, 'train/length': 2680.5, 'epoch': 0.38}
 37%|ββββ      | 373/1000 [02:50<04:46, 2.19it/s]
37%|ββββ | 374/1000 [02:50<04:37, 2.25it/s]
38%|ββββ | 375/1000 [02:51<04:42, 2.22it/s]
38%|ββββ | 376/1000 [02:51<04:36, 2.26it/s]
38%|ββββ | 377/1000 [02:51<04:40, 2.22it/s]
38%|ββββ | 378/1000 [02:52<04:24, 2.35it/s]
38%|ββββ | 379/1000 [02:52<04:22, 2.37it/s]
38%|ββββ | 380/1000 [02:53<04:54, 2.10it/s]
38%|ββββ | 381/1000 [02:53<04:45, 2.17it/s]
38%|ββββ | 382/1000 [02:54<04:36, 2.24it/s]
38%|ββββ | 383/1000 [02:54<04:40, 2.20it/s]
38%|ββββ | 384/1000 [02:55<04:33, 2.25it/s]
38%|ββββ | 385/1000 [02:55<04:31, 2.27it/s]
39%|ββββ | 386/1000 [02:55<04:31, 2.26it/s]
39%|ββββ | 387/1000 [02:56<04:49, 2.12it/s]
{'loss': 0.5379, 'grad_norm': 2.3489904403686523, 'learning_rate': 7.173674083266623e-07, 'train/length': 2875.9, 'epoch': 0.39}
{'loss': 0.5241, 'grad_norm': 2.02032470703125, 'learning_rate': 7.023608149028936e-07, 'train/length': 2936.9, 'epoch': 0.4}
 39%|ββββ      | 388/1000 [02:56<04:37, 2.21it/s]
39%|ββββ | 389/1000 [02:57<04:34, 2.22it/s]
39%|ββββ | 390/1000 [02:57<04:19, 2.35it/s]
39%|ββββ | 391/1000 [02:58<04:25, 2.30it/s]
39%|ββββ | 392/1000 [02:58<04:27, 2.27it/s]
39%|ββββ | 393/1000 [02:59<04:30, 2.24it/s]
39%|ββββ | 394/1000 [02:59<04:51, 2.08it/s]
40%|ββββ | 395/1000 [03:00<04:42, 2.14it/s]
40%|ββββ | 396/1000 [03:00<04:41, 2.15it/s]
40%|ββββ | 397/1000 [03:00<04:23, 2.29it/s]
40%|ββββ | 398/1000 [03:01<04:16, 2.35it/s]
40%|ββββ | 399/1000 [03:01<04:04, 2.46it/s]
40%|ββββ | 400/1000 [03:02<04:05, 2.45it/s]
40%|ββββ | 401/1000 [03:02<04:04, 2.45it/s]
{'loss': 0.5156, 'grad_norm': 2.202767848968506, 'learning_rate': 6.871329428990601e-07, 'train/length': 3036.7, 'epoch': 0.41}
 40%|ββββ      | 402/1000 [03:02<04:10, 2.38it/s]
40%|ββββ | 403/1000 [03:03<04:23, 2.27it/s]
40%|ββββ | 404/1000 [03:03<04:24, 2.25it/s]
40%|ββββ | 405/1000 [03:04<04:30, 2.20it/s]
41%|ββββ | 406/1000 [03:04<04:33, 2.17it/s]
41%|ββββ | 407/1000 [03:05<04:29, 2.20it/s]
41%|ββββ | 408/1000 [03:05<04:30, 2.19it/s]
41%|ββββ | 409/1000 [03:06<04:27, 2.21it/s]
41%|ββββ | 410/1000 [03:06<04:19, 2.27it/s]
41%|ββββ | 411/1000 [03:07<04:26, 2.21it/s]
41%|ββββ | 412/1000 [03:07<04:25, 2.21it/s]
41%|βββββ | 413/1000 [03:07<04:22, 2.24it/s]
41%|βββββ | 414/1000 [03:08<04:22, 2.23it/s]
42%|βββββ | 415/1000 [03:08<04:26, 2.20it/s]
42%|βββββ | 416/1000 [03:09<04:17, 2.27it/s]
{'loss': 0.513, 'grad_norm': 1.9685007333755493, 'learning_rate': 6.717004437696249e-07, 'train/length': 2671.5, 'epoch': 0.42}
{'loss': 0.4952, 'grad_norm': 2.1214869022369385, 'learning_rate': 6.560801927258079e-07, 'train/length': 2967.7, 'epoch': 0.43}
 42%|βββββ     | 417/1000 [03:09<04:22, 2.22it/s]
42%|βββββ | 418/1000 [03:10<04:19, 2.24it/s]
42%|βββββ | 419/1000 [03:10<04:17, 2.26it/s]
42%|βββββ | 420/1000 [03:11<04:16, 2.26it/s]
42%|βββββ | 421/1000 [03:11<04:08, 2.33it/s]
42%|βββββ | 422/1000 [03:11<04:05, 2.35it/s]
42%|βββββ | 423/1000 [03:12<04:10, 2.30it/s]
42%|βββββ | 424/1000 [03:12<04:14, 2.26it/s]
42%|βββββ | 425/1000 [03:13<04:37, 2.07it/s]
43%|βββββ | 426/1000 [03:13<04:27, 2.15it/s]
43%|βββββ | 427/1000 [03:14<04:22, 2.18it/s]
43%|βββββ | 428/1000 [03:14<04:37, 2.06it/s]
43%|βββββ | 429/1000 [03:15<04:35, 2.07it/s]
43%|βββββ | 430/1000 [03:15<04:27, 2.13it/s]
{'loss': 0.499, 'grad_norm': 1.9519883394241333, 'learning_rate': 6.402892702827916e-07, 'train/length': 3022.3, 'epoch': 0.44}
43%|βββββ | 431/1000 [03:16<04:10, 2.27it/s]
43%|βββββ | 432/1000 [03:16<04:12, 2.25it/s]
43%|βββββ | 433/1000 [03:16<04:09, 2.28it/s]
43%|βββββ | 434/1000 [03:17<04:02, 2.34it/s]
44%|βββββ | 435/1000 [03:17<04:13, 2.23it/s]
44%|βββββ | 436/1000 [03:18<04:13, 2.22it/s]
44%|βββββ | 437/1000 [03:18<04:08, 2.27it/s]
44%|βββββ | 438/1000 [03:19<04:29, 2.08it/s]
44%|βββββ | 439/1000 [03:19<04:22, 2.14it/s]
44%|βββββ | 440/1000 [03:20<04:46, 1.96it/s]
44%|βββββ | 441/1000 [03:20<04:21, 2.14it/s]
44%|βββββ | 442/1000 [03:21<04:15, 2.18it/s]
44%|βββββ | 443/1000 [03:21<04:08, 2.24it/s]
44%|βββββ | 444/1000 [03:22<04:09, 2.23it/s]
{'loss': 0.5099, 'grad_norm': 2.002332925796509, 'learning_rate': 6.243449435824276e-07, 'train/length': 2944.6, 'epoch': 0.45}
 44%|βββββ     | 445/1000 [03:22<04:02, 2.29it/s]
45%|βββββ | 446/1000 [03:22<03:57, 2.34it/s]
45%|βββββ | 447/1000 [03:23<03:47, 2.43it/s]
45%|βββββ | 448/1000 [03:23<03:40, 2.50it/s]
45%|βββββ | 449/1000 [03:24<03:49, 2.40it/s]
45%|βββββ | 450/1000 [03:24<03:53, 2.35it/s]
45%|βββββ | 451/1000 [03:24<03:54, 2.35it/s]
45%|βββββ | 452/1000 [03:25<03:53, 2.34it/s]
45%|βββββ | 453/1000 [03:25<04:01, 2.26it/s]
45%|βββββ | 454/1000 [03:26<04:07, 2.21it/s]
46%|βββββ | 455/1000 [03:26<04:03, 2.24it/s]
46%|βββββ | 456/1000 [03:27<04:08, 2.19it/s]
46%|βββββ | 457/1000 [03:27<03:52, 2.33it/s]
46%|βββββ | 458/1000 [03:28<03:52, 2.34it/s]
46%|βββββ | 459/1000 [03:28<03:51, 2.34it/s]
{'loss': 0.5062, 'grad_norm': 1.8899197578430176, 'learning_rate': 6.082646475118699e-07, 'train/length': 3218.9, 'epoch': 0.46}
{'loss': 0.5162, 'grad_norm': 2.0483741760253906, 'learning_rate': 5.920659656387836e-07, 'train/length': 2673.4, 'epoch': 0.47}
 46%|βββββ     | 460/1000 [03:29<04:39, 1.93it/s]
46%|βββββ | 461/1000 [03:29<04:23, 2.04it/s]
46%|βββββ | 462/1000 [03:30<04:11, 2.14it/s]
46%|βββββ | 463/1000 [03:30<04:02, 2.21it/s]
46%|βββββ | 464/1000 [03:30<03:56, 2.26it/s]
46%|βββββ | 465/1000 [03:31<03:54, 2.28it/s]
47%|βββββ | 466/1000 [03:31<03:42, 2.40it/s]
47%|βββββ | 467/1000 [03:32<03:50, 2.31it/s]
47%|βββββ | 468/1000 [03:32<03:47, 2.34it/s]
47%|βββββ | 469/1000 [03:33<04:30, 1.96it/s]
47%|βββββ | 470/1000 [03:33<04:20, 2.04it/s]
47%|βββββ | 471/1000 [03:34<04:16, 2.06it/s]
47%|βββββ | 472/1000 [03:34<04:08, 2.12it/s]
{'loss': 0.4957, 'grad_norm': 2.184000015258789, 'learning_rate': 5.757666109839702e-07, 'train/length': 2849.0, 'epoch': 0.48}
 47%|βββββ     | 473/1000 [03:35<04:03, 2.16it/s]
47%|βββββ | 474/1000 [03:35<03:47, 2.31it/s]
48%|βββββ | 475/1000 [03:35<03:51, 2.27it/s]
48%|βββββ | 476/1000 [03:36<03:52, 2.25it/s]
48%|βββββ | 477/1000 [03:36<03:56, 2.22it/s]
48%|βββββ | 478/1000 [03:37<03:49, 2.27it/s]
48%|βββββ | 479/1000 [03:37<03:37, 2.39it/s]
48%|βββββ | 480/1000 [03:37<03:35, 2.41it/s]
48%|βββββ | 481/1000 [03:38<03:43, 2.32it/s]
48%|βββββ | 482/1000 [03:39<04:07, 2.10it/s]
48%|βββββ | 483/1000 [03:39<03:59, 2.15it/s]
48%|βββββ | 484/1000 [03:40<04:49, 1.79it/s]
48%|βββββ | 485/1000 [03:40<04:31, 1.89it/s]
49%|βββββ | 486/1000 [03:41<04:26, 1.93it/s]
49%|βββββ | 487/1000 [03:41<04:28, 1.91it/s]
{'loss': 0.5248, 'grad_norm': 2.5239667892456055, 'learning_rate': 5.5938440665244e-07, 'train/length': 3359.9, 'epoch': 0.49}
{'loss': 0.4806, 'grad_norm': 2.0843234062194824, 'learning_rate': 5.429372663441085e-07, 'train/length': 2714.9, 'epoch': 0.5}
 49%|βββββ     | 488/1000 [03:42<04:23, 1.94it/s]
49%|βββββ | 489/1000 [03:42<04:11, 2.03it/s]
49%|βββββ | 490/1000 [03:43<04:04, 2.08it/s]
49%|βββββ | 491/1000 [03:43<03:53, 2.18it/s]
49%|βββββ | 492/1000 [03:43<03:52, 2.18it/s]
49%|βββββ | 493/1000 [03:44<03:44, 2.26it/s]
49%|βββββ | 494/1000 [03:44<03:39, 2.30it/s]
50%|βββββ | 495/1000 [03:45<03:35, 2.34it/s]
50%|βββββ | 496/1000 [03:45<03:25, 2.46it/s]
50%|βββββ | 497/1000 [03:45<03:17, 2.54it/s]
50%|βββββ | 498/1000 [03:46<03:21, 2.49it/s]
50%|βββββ | 499/1000 [03:46<03:25, 2.44it/s]
50%|βββββ | 500/1000 [03:47<03:26, 2.43it/s]
{'loss': 0.4968, 'grad_norm': 1.9533307552337646, 'learning_rate': 5.264431747654283e-07, 'train/length': 2707.7, 'epoch': 0.51}
 50%|βββββ     | 501/1000 [03:47<03:25, 2.43it/s]
50%|βββββ | 502/1000 [03:48<03:23, 2.44it/s]
50%|βββββ | 503/1000 [03:48<03:26, 2.41it/s]
50%|βββββ | 504/1000 [03:48<03:23, 2.44it/s]
50%|βββββ | 505/1000 [03:49<03:44, 2.20it/s]
51%|βββββ | 506/1000 [03:49<04:05, 2.01it/s]
51%|βββββ | 507/1000 [03:50<03:58, 2.07it/s]
51%|βββββ | 508/1000 [03:50<03:57, 2.07it/s]
51%|βββββ | 509/1000 [03:51<03:52, 2.11it/s]
51%|βββββ | 510/1000 [03:51<03:47, 2.15it/s]
51%|βββββ | 511/1000 [03:52<03:51, 2.12it/s]
51%|βββββ | 512/1000 [03:53<04:41, 1.73it/s]
51%|ββββββ | 513/1000 [03:53<04:26, 1.83it/s]
51%|ββββββ | 514/1000 [03:54<04:14, 1.91it/s]
52%|ββββββ | 515/1000 [03:54<03:56, 2.05it/s]
{'loss': 0.4917, 'grad_norm': 1.9566922187805176, 'learning_rate': 5.099201679633768e-07, 'train/length': 3217.4, 'epoch': 0.52}
 52%|ββββββ    | 516/1000 [03:54<03:56, 2.04it/s]
52%|ββββββ | 517/1000 [03:55<03:49, 2.10it/s]
52%|ββββββ | 518/1000 [03:55<03:45, 2.14it/s]
52%|ββββββ | 519/1000 [03:56<03:44, 2.14it/s]
52%|ββββββ | 520/1000 [03:56<03:44, 2.13it/s]
52%|ββββββ | 521/1000 [03:57<03:54, 2.05it/s]
52%|ββββββ | 522/1000 [03:57<03:47, 2.10it/s]
52%|ββββββ | 523/1000 [03:58<03:31, 2.26it/s]
52%|ββββββ | 524/1000 [03:58<03:29, 2.28it/s]
52%|ββββββ | 525/1000 [03:59<03:25, 2.31it/s]
53%|ββββββ | 526/1000 [03:59<03:24, 2.32it/s]
53%|ββββββ | 527/1000 [04:00<03:43, 2.11it/s]
53%|ββββββ | 528/1000 [04:00<03:33, 2.22it/s]
53%|ββββββ | 529/1000 [04:00<03:38, 2.16it/s]
53%|ββββββ | 530/1000 [04:01<03:33, 2.20it/s]
{'loss': 0.4968, 'grad_norm': 2.134997606277466, 'learning_rate': 4.933863136033039e-07, 'train/length': 3204.5, 'epoch': 0.53}
{'loss': 0.5039, 'grad_norm': 1.9608609676361084, 'learning_rate': 4.768596912122045e-07, 'train/length': 3039.2, 'epoch': 0.54}
53%|ββββββ | 531/1000 [04:01<03:30, 2.23it/s]
53%|ββββββ | 532/1000 [04:02<03:29, 2.24it/s]
53%|ββββββ | 533/1000 [04:02<03:22, 2.31it/s]
53%|ββββββ | 534/1000 [04:03<03:22, 2.30it/s]
54%|ββββββ | 535/1000 [04:03<03:23, 2.28it/s]
54%|ββββββ | 536/1000 [04:03<03:27, 2.23it/s]
54%|ββββββ | 537/1000 [04:04<03:33, 2.17it/s]
54%|ββββββ | 538/1000 [04:04<03:32, 2.17it/s]
54%|ββββββ | 539/1000 [04:05<03:29, 2.20it/s]
54%|ββββββ | 540/1000 [04:05<03:32, 2.17it/s]
54%|ββββββ | 541/1000 [04:06<03:48, 2.01it/s]
54%|ββββββ | 542/1000 [04:06<03:41, 2.07it/s]
{'loss': 0.4914, 'grad_norm': 2.0940401554107666, 'learning_rate': 4.60358372409022e-07, 'train/length': 3083.5, 'epoch': 0.55}
 54%|ββββββ    | 543/1000 [04:07<03:51, 1.98it/s]
54%|ββββββ | 544/1000 [04:07<03:42, 2.05it/s]
55%|ββββββ | 545/1000 [04:08<03:29, 2.17it/s]
55%|ββββββ | 546/1000 [04:08<03:24, 2.22it/s]
55%|ββββββ | 547/1000 [04:09<03:21, 2.25it/s]
55%|ββββββ | 548/1000 [04:09<03:25, 2.20it/s]
55%|ββββββ | 549/1000 [04:10<03:37, 2.07it/s]
55%|ββββββ | 550/1000 [04:10<03:29, 2.15it/s]
55%|ββββββ | 551/1000 [04:11<03:26, 2.18it/s]
55%|ββββββ | 552/1000 [04:11<03:24, 2.19it/s]
55%|ββββββ | 553/1000 [04:11<03:19, 2.24it/s]
55%|ββββββ | 554/1000 [04:12<03:17, 2.26it/s]
56%|ββββββ | 555/1000 [04:12<03:07, 2.37it/s]
56%|ββββββ | 556/1000 [04:13<03:16, 2.26it/s]
56%|ββββββ | 557/1000 [04:13<03:16, 2.26it/s]
 56%|ββββββ    | 558/1000 [04:14<03:36, 2.05it/s]
56%|ββββββ | 559/1000 [04:14<03:54, 1.88it/s]
56%|ββββββ | 560/1000 [04:15<03:32, 2.07it/s]
56%|ββββββ | 561/1000 [04:15<03:24, 2.15it/s]
56%|ββββββ | 562/1000 [04:16<03:17, 2.22it/s]
56%|ββββββ | 563/1000 [04:16<03:05, 2.36it/s]
56%|ββββββ | 564/1000 [04:16<03:09, 2.30it/s]
56%|ββββββ | 565/1000 [04:17<03:07, 2.33it/s]
57%|ββββββ | 566/1000 [04:17<03:09, 2.29it/s]
57%|ββββββ | 567/1000 [04:18<03:12, 2.25it/s]
57%|ββββββ | 568/1000 [04:18<03:01, 2.38it/s]
57%|ββββββ | 569/1000 [04:19<03:22, 2.13it/s]
57%|ββββββ | 570/1000 [04:19<03:22, 2.12it/s]
{'loss': 0.4795, 'grad_norm': 2.235196113586426, 'learning_rate': 4.1118642034694565e-07, 'train/length': 2963.6, 'epoch': 0.58}
57%|ββββββ | 571/1000 [04:20<03:14, 2.21it/s]
57%|ββββββ | 572/1000 [04:20<03:13, 2.21it/s]
57%|ββββββ | 573/1000 [04:20<03:12, 2.22it/s]
57%|ββββββ | 574/1000 [04:21<03:05, 2.30it/s]
57%|ββββββ | 575/1000 [04:21<03:09, 2.25it/s]
58%|ββββββ | 576/1000 [04:22<03:03, 2.30it/s]
58%|ββββββ | 577/1000 [04:22<03:23, 2.08it/s]
58%|ββββββ | 578/1000 [04:23<03:16, 2.15it/s]
58%|ββββββ | 579/1000 [04:23<03:16, 2.15it/s]
58%|ββββββ | 580/1000 [04:24<03:14, 2.16it/s]
58%|ββββββ | 581/1000 [04:24<03:05, 2.25it/s]
58%|ββββββ | 582/1000 [04:25<03:04, 2.27it/s]
58%|ββββββ | 583/1000 [04:25<03:02, 2.29it/s]
58%|ββββββ | 584/1000 [04:25<03:01, 2.29it/s]
{'loss': 0.4867, 'grad_norm': 2.000352144241333, 'learning_rate': 3.9496618307341713e-07, 'train/length': 2880.3, 'epoch': 0.59}
 58%|ββββββ    | 585/1000 [04:26<03:06, 2.23it/s]
59%|ββββββ | 586/1000 [04:26<03:05, 2.23it/s]
59%|ββββββ | 587/1000 [04:27<03:02, 2.26it/s]
59%|ββββββ | 588/1000 [04:27<03:00, 2.28it/s]
59%|ββββββ | 589/1000 [04:28<03:17, 2.08it/s]
59%|ββββββ | 590/1000 [04:28<03:11, 2.14it/s]
59%|ββββββ | 591/1000 [04:29<03:07, 2.18it/s]
59%|ββββββ | 592/1000 [04:29<03:01, 2.25it/s]
59%|ββββββ | 593/1000 [04:29<02:57, 2.29it/s]
59%|ββββββ | 594/1000 [04:30<02:55, 2.31it/s]
60%|ββββββ | 595/1000 [04:30<02:57, 2.29it/s]
60%|ββββββ | 596/1000 [04:31<02:57, 2.27it/s]
60%|ββββββ | 597/1000 [04:31<02:48, 2.39it/s]
60%|ββββββ | 598/1000 [04:32<02:52, 2.33it/s]
{'loss': 0.4843, 'grad_norm': 2.068528175354004, 'learning_rate': 3.788607987366069e-07, 'train/length': 3026.9, 'epoch': 0.6}
{'loss': 0.498, 'grad_norm': 2.0729715824127197, 'learning_rate': 3.6288787833783016e-07, 'train/length': 3130.2, 'epoch': 0.61}
 60%|ββββββ    | 599/1000 [04:32<02:53, 2.31it/s]
60%|ββββββ | 600/1000 [04:32<02:56, 2.26it/s]
60%|ββββββ | 601/1000 [04:33<02:57, 2.24it/s]
60%|ββββββ | 602/1000 [04:34<03:13, 2.05it/s]
60%|ββββββ | 603/1000 [04:34<03:11, 2.07it/s]
60%|ββββββ | 604/1000 [04:34<03:04, 2.14it/s]
60%|ββββββ | 605/1000 [04:35<03:01, 2.17it/s]
61%|ββββββ | 606/1000 [04:35<02:56, 2.24it/s]
61%|ββββββ | 607/1000 [04:36<02:54, 2.25it/s]
61%|ββββββ | 608/1000 [04:36<02:51, 2.28it/s]
61%|ββββββ | 609/1000 [04:37<02:51, 2.28it/s]
61%|ββββββ | 610/1000 [04:37<02:50, 2.29it/s]
61%|ββββββ | 611/1000 [04:37<02:48, 2.31it/s]
 61%|βββββββ   | 612/1000 [04:38<03:04, 2.10it/s]
61%|βββββββ | 613/1000 [04:38<03:04, 2.10it/s]
61%|βββββββ | 614/1000 [04:39<02:51, 2.25it/s]
62%|βββββββ | 615/1000 [04:39<02:49, 2.27it/s]
62%|βββββββ | 616/1000 [04:40<02:49, 2.26it/s]
62%|βββββββ | 617/1000 [04:40<02:52, 2.22it/s]
62%|βββββββ | 618/1000 [04:41<02:47, 2.28it/s]
62%|βββββββ | 619/1000 [04:41<02:45, 2.30it/s]
62%|βββββββ | 620/1000 [04:42<02:48, 2.26it/s]
62%|βββββββ | 621/1000 [04:42<02:44, 2.30it/s]
62%|βββββββ | 622/1000 [04:42<02:46, 2.27it/s]
62%|βββββββ | 623/1000 [04:43<02:42, 2.33it/s]
62%|βββββββ | 624/1000 [04:43<02:34, 2.43it/s]
62%|βββββββ | 625/1000 [04:44<02:41, 2.33it/s]
 63%|βββββββ   | 626/1000 [04:44<02:40, 2.33it/s]
63%|βββββββ | 627/1000 [04:44<02:41, 2.32it/s]
63%|βββββββ | 628/1000 [04:45<02:44, 2.26it/s]
63%|βββββββ | 629/1000 [04:45<02:46, 2.23it/s]
63%|βββββββ | 630/1000 [04:46<02:48, 2.20it/s]
63%|βββββββ | 631/1000 [04:47<03:15, 1.88it/s]
63%|βββββββ | 632/1000 [04:47<03:03, 2.01it/s]
63%|βββββββ | 633/1000 [04:47<02:58, 2.06it/s]
63%|βββββββ | 634/1000 [04:48<02:51, 2.13it/s]
64%|βββββββ | 635/1000 [04:48<02:39, 2.29it/s]
64%|βββββββ | 636/1000 [04:49<02:38, 2.30it/s]
64%|βββββββ | 637/1000 [04:49<02:30, 2.41it/s]
64%|βββββββ | 638/1000 [04:49<02:30, 2.40it/s]
64%|βββββββ | 639/1000 [04:50<02:45, 2.18it/s]
64%|βββββββ | 640/1000 [04:51<02:46, 2.16it/s]
64%|βββββββ | 641/1000 [04:51<02:35, 2.31it/s]
64%|βββββββ | 642/1000 [04:51<02:37, 2.27it/s]
64%|βββββββ | 643/1000 [04:52<02:49, 2.11it/s]
64%|βββββββ | 644/1000 [04:52<02:41, 2.20it/s]
64%|βββββββ | 645/1000 [04:53<02:41, 2.19it/s]
65%|βββββββ | 646/1000 [04:53<02:38, 2.23it/s]
65%|βββββββ | 647/1000 [04:54<02:36, 2.25it/s]
65%|βββββββ | 648/1000 [04:54<02:47, 2.10it/s]
65%|βββββββ | 649/1000 [04:55<02:45, 2.13it/s]
65%|βββββββ | 650/1000 [04:55<02:40, 2.17it/s]
65%|βββββββ | 651/1000 [04:56<02:39, 2.18it/s]
 65%|βββββββ   | 652/1000 [04:56<02:38, 2.19it/s]
65%|βββββββ | 653/1000 [04:56<02:36, 2.22it/s]
65%|βββββββ | 654/1000 [04:57<02:48, 2.05it/s]
66%|βββββββ | 655/1000 [04:57<02:39, 2.16it/s]
66%|βββββββ | 656/1000 [04:58<02:36, 2.19it/s]
66%|βββββββ | 657/1000 [04:58<02:34, 2.22it/s]
66%|βββββββ | 658/1000 [04:59<02:30, 2.27it/s]
66%|βββββββ | 659/1000 [04:59<02:26, 2.33it/s]
66%|βββββββ | 660/1000 [05:00<02:25, 2.34it/s]
66%|βββββββ | 661/1000 [05:00<02:37, 2.16it/s]
66%|βββββββ | 662/1000 [05:01<02:35, 2.17it/s]
66%|βββββββ | 663/1000 [05:01<02:34, 2.17it/s]
66%|βββββββ | 664/1000 [05:01<02:33, 2.19it/s]
66%|βββββββ | 665/1000 [05:02<02:31, 2.22it/s]
 67%|βββββββ   | 666/1000 [05:02<02:33, 2.18it/s]
67%|βββββββ | 667/1000 [05:03<02:31, 2.20it/s]
67%|βββββββ | 668/1000 [05:03<02:31, 2.19it/s]
67%|βββββββ | 669/1000 [05:04<02:27, 2.24it/s]
67%|βββββββ | 670/1000 [05:05<03:16, 1.68it/s]
67%|βββββββ | 671/1000 [05:05<03:04, 1.78it/s]
67%|βββββββ | 672/1000 [05:06<02:51, 1.91it/s]
67%|βββββββ | 673/1000 [05:06<02:41, 2.03it/s]
67%|βββββββ | 674/1000 [05:06<02:28, 2.19it/s]
68%|βββββββ | 675/1000 [05:07<02:30, 2.16it/s]
68%|βββββββ | 676/1000 [05:07<02:26, 2.21it/s]
68%|βββββββ | 677/1000 [05:08<02:34, 2.09it/s]
68%|βββββββ | 678/1000 [05:08<02:26, 2.19it/s]
68%|βββββββ | 679/1000 [05:09<02:29, 2.14it/s]
68%|βββββββ | 680/1000 [05:09<02:26, 2.19it/s]
68%|βββββββ | 681/1000 [05:10<02:23, 2.22it/s]
68%|βββββββ | 682/1000 [05:10<02:15, 2.35it/s]
68%|βββββββ | 683/1000 [05:10<02:17, 2.31it/s]
68%|βββββββ | 684/1000 [05:11<02:13, 2.36it/s]
68%|βββββββ | 685/1000 [05:11<02:15, 2.33it/s]
69%|βββββββ | 686/1000 [05:12<02:14, 2.34it/s]
69%|βββββββ | 687/1000 [05:12<02:17, 2.27it/s]
69%|βββββββ | 688/1000 [05:13<02:16, 2.28it/s]
69%|βββββββ | 689/1000 [05:13<02:14, 2.31it/s]
69%|βββββββ | 690/1000 [05:13<02:08, 2.41it/s]
69%|βββββββ | 691/1000 [05:14<02:25, 2.13it/s]
{'loss': 0.4724, 'grad_norm': 1.8839915990829468, 'learning_rate': 2.2791164325437046e-07, 'train/length': 2926.8, 'epoch': 0.7}
 69%|βββββββ   | 692/1000 [05:14<02:20, 2.20it/s]
69%|βββββββ | 693/1000 [05:15<02:18, 2.22it/s]
69%|βββββββ | 694/1000 [05:15<02:19, 2.19it/s]
70%|βββββββ | 695/1000 [05:16<02:16, 2.23it/s]
70%|βββββββ | 696/1000 [05:16<02:14, 2.26it/s]
70%|βββββββ | 697/1000 [05:17<02:16, 2.22it/s]
70%|βββββββ | 698/1000 [05:17<02:16, 2.22it/s]
70%|βββββββ | 699/1000 [05:17<02:08, 2.34it/s]
70%|βββββββ | 700/1000 [05:18<02:10, 2.30it/s]
70%|βββββββ | 701/1000 [05:18<02:12, 2.26it/s]
70%|βββββββ | 702/1000 [05:19<02:10, 2.28it/s]
70%|βββββββ | 703/1000 [05:19<02:07, 2.33it/s]
70%|βββββββ | 704/1000 [05:20<02:08, 2.30it/s]
70%|βββββββ | 705/1000 [05:20<02:09, 2.28it/s]
{'loss': 0.4901, 'grad_norm': 1.9471701383590698, 'learning_rate': 2.1419080780610122e-07, 'train/length': 2825.8, 'epoch': 0.71}
 71%|βββββββ   | 706/1000 [05:21<02:18, 2.12it/s]
71%|βββββββ | 707/1000 [05:21<02:25, 2.01it/s]
71%|βββββββ | 708/1000 [05:22<02:21, 2.06it/s]
71%|βββββββ | 709/1000 [05:22<02:16, 2.13it/s]
71%|βββββββ | 710/1000 [05:22<02:12, 2.19it/s]
71%|βββββββ | 711/1000 [05:23<02:09, 2.24it/s]
71%|βββββββ | 712/1000 [05:23<02:09, 2.22it/s]
71%|ββββββββ | 713/1000 [05:24<02:34, 1.86it/s]
71%|ββββββββ | 714/1000 [05:25<02:27, 1.94it/s]
72%|ββββββββ | 715/1000 [05:25<02:19, 2.04it/s]
72%|ββββββββ | 716/1000 [05:25<02:09, 2.20it/s]
72%|ββββββββ | 717/1000 [05:26<02:08, 2.21it/s]
72%|ββββββββ | 718/1000 [05:26<02:06, 2.22it/s]
72%|ββββββββ | 719/1000 [05:27<02:05, 2.24it/s]
72%|ββββββββ | 720/1000 [05:27<02:20, 1.99it/s]
72%|ββββββββ | 721/1000 [05:28<02:30, 1.86it/s]
72%|ββββββββ | 722/1000 [05:28<02:21, 1.97it/s]
72%|ββββββββ | 723/1000 [05:29<02:17, 2.02it/s]
72%|ββββββββ | 724/1000 [05:29<02:11, 2.10it/s]
72%|ββββββββ | 725/1000 [05:30<02:06, 2.17it/s]
73%|ββββββββ | 726/1000 [05:30<02:06, 2.16it/s]
73%|ββββββββ | 727/1000 [05:31<02:04, 2.20it/s]
73%|ββββββββ | 728/1000 [05:31<02:06, 2.15it/s]
73%|ββββββββ | 729/1000 [05:32<02:03, 2.19it/s]
73%|ββββββββ | 730/1000 [05:32<01:59, 2.27it/s]
73%|ββββββββ | 730/1000 [05:32<01:59, 2.27it/s]
73%|ββββββββ  | 731/1000 [05:32<01:57,  2.28it/s]
{'loss': 0.4644, 'grad_norm': 2.1221508979797363, 'learning_rate': 1.7496175976529337e-07, 'train/length': 2679.0, 'epoch': 0.74}
73%|ββββββββ | 732/1000 [05:33<01:58, 2.27it/s]
73%|ββββββββ | 733/1000 [05:33<01:58, 2.26it/s]
73%|ββββββββ | 734/1000 [05:34<01:56, 2.27it/s]
74%|ββββββββ | 735/1000 [05:34<01:58, 2.23it/s]
74%|ββββββββ | 736/1000 [05:35<01:54, 2.30it/s]
74%|ββββββββ | 737/1000 [05:35<01:56, 2.26it/s]
74%|ββββββββ | 738/1000 [05:35<01:52, 2.33it/s]
74%|ββββββββ | 739/1000 [05:36<01:52, 2.31it/s]
74%|ββββββββ | 740/1000 [05:36<01:52, 2.30it/s]
74%|ββββββββ | 740/1000 [05:36<01:52, 2.30it/s]
74%|ββββββββ | 741/1000 [05:37<01:52, 2.31it/s]
74%|ββββββββ | 742/1000 [05:37<02:01, 2.13it/s]
74%|ββββββββ | 743/1000 [05:38<01:57, 2.19it/s]
74%|ββββββββ | 744/1000 [05:38<01:50, 2.32it/s]
74%|ββββββββ  | 745/1000 [05:38<01:47,  2.37it/s]
75%|ββββββββ | 746/1000 [05:39<01:59, 2.13it/s]
75%|ββββββββ | 747/1000 [05:40<01:58, 2.13it/s]
75%|ββββββββ | 748/1000 [05:40<01:57, 2.15it/s]
75%|ββββββββ | 749/1000 [05:40<01:55, 2.18it/s]
75%|ββββββββ | 750/1000 [05:41<01:47, 2.33it/s]
75%|ββββββββ | 750/1000 [05:41<01:47, 2.33it/s]
75%|ββββββββ | 751/1000 [05:41<01:44, 2.38it/s]
75%|ββββββββ | 752/1000 [05:42<01:48, 2.28it/s]
75%|ββββββββ | 753/1000 [05:42<01:46, 2.32it/s]
75%|ββββββββ | 754/1000 [05:43<01:47, 2.30it/s]
76%|ββββββββ | 755/1000 [05:43<01:45, 2.33it/s]
76%|ββββββββ | 756/1000 [05:44<01:56, 2.09it/s]
76%|ββββββββ | 757/1000 [05:44<02:03, 1.96it/s]
76%|ββββββββ  | 758/1000 [05:45<02:01,  1.99it/s]
{'loss': 0.4778, 'grad_norm': 1.9940024614334106, 'learning_rate': 1.5056232853991208e-07, 'train/length': 2819.2, 'epoch': 0.76}
{'loss': 0.4668, 'grad_norm': 2.2041783332824707, 'learning_rate': 1.389292011321498e-07, 'train/length': 2242.3, 'epoch': 0.77}
76%|ββββββββ | 759/1000 [05:45<01:55, 2.09it/s]
76%|ββββββββ | 760/1000 [05:46<02:00, 2.00it/s]
76%|ββββββββ | 760/1000 [05:46<02:00, 2.00it/s]
76%|ββββββββ | 761/1000 [05:46<02:02, 1.95it/s]
76%|ββββββββ | 762/1000 [05:47<01:56, 2.04it/s]
76%|ββββββββ | 763/1000 [05:47<01:54, 2.06it/s]
76%|ββββββββ | 764/1000 [05:47<01:48, 2.17it/s]
76%|ββββββββ | 765/1000 [05:48<01:41, 2.31it/s]
77%|ββββββββ | 766/1000 [05:48<01:45, 2.22it/s]
77%|ββββββββ | 767/1000 [05:49<01:43, 2.25it/s]
77%|ββββββββ | 768/1000 [05:49<01:44, 2.23it/s]
77%|ββββββββ | 769/1000 [05:50<01:46, 2.17it/s]
77%|ββββββββ | 770/1000 [05:50<01:42, 2.24it/s]
77%|ββββββββ  | 770/1000 [05:50<01:42,  2.24it/s]
{'loss': 0.4786, 'grad_norm': 1.9703295230865479, 'learning_rate': 1.2769089934176126e-07, 'train/length': 2731.4, 'epoch': 0.78}
77%|ββββββββ | 771/1000 [05:51<01:42, 2.24it/s]
77%|ββββββββ | 772/1000 [05:51<01:39, 2.29it/s]
77%|ββββββββ | 773/1000 [05:51<01:40, 2.26it/s]
77%|ββββββββ | 774/1000 [05:52<01:46, 2.12it/s]
78%|ββββββββ | 775/1000 [05:52<01:43, 2.17it/s]
78%|ββββββββ | 776/1000 [05:53<01:39, 2.25it/s]
78%|ββββββββ | 777/1000 [05:53<01:36, 2.30it/s]
78%|ββββββββ | 778/1000 [05:54<01:36, 2.29it/s]
78%|ββββββββ | 779/1000 [05:54<01:37, 2.26it/s]
78%|ββββββββ | 780/1000 [05:55<01:38, 2.22it/s]
78%|ββββββββ | 780/1000 [05:55<01:38, 2.22it/s]
78%|ββββββββ | 781/1000 [05:55<01:36, 2.28it/s]
78%|ββββββββ | 782/1000 [05:55<01:35, 2.28it/s]
78%|ββββββββ | 783/1000 [05:56<01:34, 2.30it/s]
{'loss': 0.4671, 'grad_norm': 2.061434030532837, 'learning_rate': 1.1685971208675538e-07, 'train/length': 2960.5, 'epoch': 0.79}
78%|ββββββββ  | 784/1000 [05:56<01:43,  2.09it/s]
78%|ββββββββ | 785/1000 [05:57<01:42, 2.10it/s]
79%|ββββββββ | 786/1000 [05:57<01:37, 2.20it/s]
79%|ββββββββ | 787/1000 [05:58<01:36, 2.22it/s]
79%|ββββββββ | 788/1000 [05:58<01:33, 2.26it/s]
79%|ββββββββ | 789/1000 [05:59<01:28, 2.38it/s]
79%|ββββββββ | 790/1000 [05:59<01:29, 2.33it/s]
79%|ββββββββ | 790/1000 [05:59<01:29, 2.33it/s]
79%|ββββββββ | 791/1000 [05:59<01:33, 2.24it/s]
79%|ββββββββ | 792/1000 [06:00<01:33, 2.22it/s]
79%|ββββββββ | 793/1000 [06:00<01:33, 2.21it/s]
79%|ββββββββ | 794/1000 [06:01<01:39, 2.06it/s]
80%|ββββββββ | 795/1000 [06:01<01:36, 2.13it/s]
80%|ββββββββ | 796/1000 [06:02<01:31, 2.22it/s]
80%|ββββββββ  | 797/1000 [06:02<01:27,  2.32it/s]
{'loss': 0.4786, 'grad_norm': 2.1371705532073975, 'learning_rate': 1.0644748311137375e-07, 'train/length': 2973.1, 'epoch': 0.8}
80%|ββββββββ | 798/1000 [06:03<01:23, 2.42it/s]
80%|ββββββββ | 799/1000 [06:03<01:25, 2.34it/s]
80%|ββββββββ | 800/1000 [06:03<01:25, 2.34it/s]
80%|ββββββββ | 800/1000 [06:03<01:25, 2.34it/s]
80%|ββββββββ | 801/1000 [06:04<01:25, 2.34it/s]
80%|ββββββββ | 802/1000 [06:04<01:26, 2.30it/s]
80%|ββββββββ | 803/1000 [06:05<01:29, 2.21it/s]
80%|ββββββββ | 804/1000 [06:05<01:27, 2.25it/s]
80%|ββββββββ | 805/1000 [06:06<01:27, 2.23it/s]
81%|ββββββββ | 806/1000 [06:06<01:29, 2.17it/s]
81%|ββββββββ | 807/1000 [06:07<01:26, 2.23it/s]
81%|ββββββββ | 808/1000 [06:07<01:25, 2.25it/s]
81%|ββββββββ | 809/1000 [06:07<01:26, 2.22it/s]
81%|ββββββββ | 810/1000 [06:08<01:31, 2.08it/s]
{'loss': 0.4768, 'grad_norm': 1.92405366897583, 'learning_rate': 9.646559803512993e-08, 'train/length': 3054.2, 'epoch': 0.81}
{'loss': 0.4819, 'grad_norm': 2.0029549598693848, 'learning_rate': 8.692497190280224e-08, 'train/length': 2740.3, 'epoch': 0.82}
81%|ββββββββ | 810/1000 [06:08<01:31, 2.08it/s]
81%|ββββββββ | 811/1000 [06:09<01:34, 2.01it/s]
81%|ββββββββ | 812/1000 [06:09<01:26, 2.16it/s]
81%|βββββββββ | 813/1000 [06:09<01:23, 2.24it/s]
81%|βββββββββ | 814/1000 [06:10<01:23, 2.22it/s]
82%|βββββββββ | 815/1000 [06:10<01:23, 2.23it/s]
82%|βββββββββ | 816/1000 [06:11<01:20, 2.27it/s]
82%|βββββββββ | 817/1000 [06:11<01:20, 2.27it/s]
82%|βββββββββ | 818/1000 [06:12<01:17, 2.35it/s]
82%|βββββββββ | 819/1000 [06:12<01:20, 2.24it/s]
82%|βββββββββ | 820/1000 [06:12<01:20, 2.23it/s]
82%|βββββββββ | 820/1000 [06:12<01:20, 2.23it/s]
82%|βββββββββ | 821/1000 [06:13<01:34, 1.90it/s]
{'loss': 0.4837, 'grad_norm': 2.049612283706665, 'learning_rate': 7.783603724899257e-08, 'train/length': 2878.8, 'epoch': 0.83}
82%|βββββββββ | 822/1000 [06:14<01:43,  1.72it/s]
82%|βββββββββ | 823/1000 [06:14<01:34, 1.86it/s]
82%|βββββββββ | 824/1000 [06:15<01:30, 1.94it/s]
82%|βββββββββ | 825/1000 [06:15<01:27, 1.99it/s]
83%|βββββββββ | 826/1000 [06:16<01:23, 2.09it/s]
83%|βββββββββ | 827/1000 [06:16<01:16, 2.25it/s]
83%|βββββββββ | 828/1000 [06:17<01:16, 2.24it/s]
83%|βββββββββ | 829/1000 [06:17<01:14, 2.31it/s]
83%|βββββββββ | 830/1000 [06:17<01:14, 2.28it/s]
83%|βββββββββ | 830/1000 [06:17<01:14, 2.28it/s]
83%|βββββββββ | 831/1000 [06:18<01:13, 2.31it/s]
83%|βββββββββ | 832/1000 [06:18<01:14, 2.25it/s]
83%|βββββββββ | 833/1000 [06:19<01:13, 2.27it/s]
83%|βββββββββ | 834/1000 [06:19<01:13, 2.25it/s]
84%|βββββββββ | 835/1000 [06:20<01:14,  2.23it/s]
{'loss': 0.4735, 'grad_norm': 2.04294753074646, 'learning_rate': 6.92087326903022e-08, 'train/length': 3054.8, 'epoch': 0.84}
84%|βββββββββ | 836/1000 [06:20<01:19, 2.08it/s]
84%|βββββββββ | 837/1000 [06:21<01:17, 2.11it/s]
84%|βββββββββ | 838/1000 [06:21<01:15, 2.14it/s]
84%|βββββββββ | 839/1000 [06:21<01:13, 2.19it/s]
84%|βββββββββ | 840/1000 [06:22<01:10, 2.26it/s]
84%|βββββββββ | 840/1000 [06:22<01:10, 2.26it/s]
84%|βββββββββ | 841/1000 [06:22<01:08, 2.32it/s]
84%|βββββββββ | 842/1000 [06:23<01:07, 2.32it/s]
84%|βββββββββ | 843/1000 [06:23<01:09, 2.25it/s]
84%|βββββββββ | 844/1000 [06:24<01:05, 2.38it/s]
84%|βββββββββ | 845/1000 [06:24<01:06, 2.32it/s]
85%|βββββββββ | 846/1000 [06:24<01:07, 2.27it/s]
85%|βββββββββ | 847/1000 [06:25<01:09, 2.20it/s]
85%|βββββββββ | 848/1000 [06:25<01:09,  2.19it/s]
{'loss': 0.4745, 'grad_norm': 2.017137050628662, 'learning_rate': 6.105249205760127e-08, 'train/length': 2701.7, 'epoch': 0.85}
{'loss': 0.4867, 'grad_norm': 1.8250972032546997, 'learning_rate': 5.337623408027292e-08, 'train/length': 3675.4, 'epoch': 0.86}
85%|βββββββββ | 849/1000 [06:26<01:07, 2.24it/s]
85%|βββββββββ | 850/1000 [06:26<01:05, 2.30it/s]
85%|βββββββββ | 850/1000 [06:26<01:05, 2.30it/s]
85%|βββββββββ | 851/1000 [06:27<01:06, 2.26it/s]
85%|βββββββββ | 852/1000 [06:27<01:09, 2.11it/s]
85%|βββββββββ | 853/1000 [06:28<01:07, 2.17it/s]
85%|βββββββββ | 854/1000 [06:29<01:47, 1.36it/s]
86%|βββββββββ | 855/1000 [06:30<01:38, 1.47it/s]
86%|βββββββββ | 856/1000 [06:30<01:28, 1.63it/s]
86%|βββββββββ | 857/1000 [06:31<01:19, 1.81it/s]
86%|βββββββββ | 858/1000 [06:31<01:14, 1.90it/s]
86%|βββββββββ | 859/1000 [06:31<01:12, 1.96it/s]
86%|βββββββββ | 860/1000 [06:32<01:12, 1.92it/s]
{'loss': 0.4644, 'grad_norm': 1.9171910285949707, 'learning_rate': 4.6188352633713956e-08, 'train/length': 2700.5, 'epoch': 0.87}
86%|βββββββββ | 860/1000 [06:32<01:12, 1.92it/s]
86%|βββββββββ | 861/1000 [06:32<01:09, 2.00it/s]
86%|βββββββββ | 862/1000 [06:33<01:06, 2.08it/s]
86%|βββββββββ | 863/1000 [06:33<01:05, 2.09it/s]
86%|βββββββββ | 864/1000 [06:34<01:00, 2.24it/s]
86%|βββββββββ | 865/1000 [06:34<01:01, 2.20it/s]
87%|βββββββββ | 866/1000 [06:35<01:00, 2.22it/s]
87%|βββββββββ | 867/1000 [06:35<00:56, 2.35it/s]
87%|βββββββββ | 868/1000 [06:36<01:00, 2.17it/s]
87%|βββββββββ | 869/1000 [06:36<00:59, 2.20it/s]
87%|βββββββββ | 870/1000 [06:36<00:58, 2.21it/s]
87%|βββββββββ | 870/1000 [06:36<00:58, 2.21it/s]
87%|βββββββββ | 871/1000 [06:37<00:58, 2.20it/s]
87%|βββββββββ | 872/1000 [06:37<00:56, 2.25it/s]
87%|βββββββββ | 873/1000 [06:38<00:57,  2.21it/s]
87%|βββββββββ | 874/1000 [06:38<00:57, 2.18it/s]
88%|βββββββββ | 875/1000 [06:39<00:56, 2.23it/s]
88%|βββββββββ | 876/1000 [06:39<00:54, 2.28it/s]
88%|βββββββββ | 877/1000 [06:39<00:52, 2.36it/s]
88%|βββββββββ | 878/1000 [06:40<00:51, 2.36it/s]
88%|βββββββββ | 879/1000 [06:40<00:53, 2.27it/s]
88%|βββββββββ | 880/1000 [06:41<00:52, 2.27it/s]
88%|βββββββββ | 880/1000 [06:41<00:52, 2.27it/s]
88%|βββββββββ | 881/1000 [06:41<00:51, 2.29it/s]
88%|βββββββββ | 882/1000 [06:42<00:52, 2.24it/s]
88%|βββββββββ | 883/1000 [06:42<00:53, 2.19it/s]
88%|βββββββββ | 884/1000 [06:43<00:51, 2.24it/s]
88%|βββββββββ | 885/1000 [06:43<00:52, 2.20it/s]
89%|βββββββββ | 886/1000 [06:44<00:50,  2.24it/s]
89%|βββββββββ | 887/1000 [06:44<00:49, 2.27it/s]
89%|βββββββββ | 888/1000 [06:44<00:46, 2.39it/s]
89%|βββββββββ | 889/1000 [06:45<00:47, 2.35it/s]
89%|βββββββββ | 890/1000 [06:45<00:46, 2.35it/s]
89%|βββββββββ | 890/1000 [06:45<00:46, 2.35it/s]
89%|βββββββββ | 891/1000 [06:46<00:47, 2.27it/s]
89%|βββββββββ | 892/1000 [06:46<00:47, 2.26it/s]
89%|βββββββββ | 893/1000 [06:47<00:46, 2.31it/s]
89%|βββββββββ | 894/1000 [06:47<00:47, 2.22it/s]
90%|βββββββββ | 895/1000 [06:47<00:46, 2.28it/s]
90%|βββββββββ | 896/1000 [06:48<00:45, 2.29it/s]
90%|βββββββββ | 897/1000 [06:48<00:44, 2.30it/s]
90%|βββββββββ | 898/1000 [06:49<00:45, 2.24it/s]
90%|βββββββββ | 899/1000 [06:49<00:44,  2.29it/s]
90%|βββββββββ | 900/1000 [06:50<00:44, 2.23it/s]
90%|βββββββββ | 900/1000 [06:50<00:44, 2.23it/s]
90%|βββββββββ | 901/1000 [06:50<00:45, 2.18it/s]
90%|βββββββββ | 902/1000 [06:51<00:44, 2.20it/s]
90%|βββββββββ | 903/1000 [06:51<00:44, 2.19it/s]
90%|βββββββββ | 904/1000 [06:52<00:44, 2.14it/s]
90%|βββββββββ | 905/1000 [06:52<00:43, 2.17it/s]
91%|βββββββββ | 906/1000 [06:52<00:43, 2.16it/s]
91%|βββββββββ | 907/1000 [06:53<00:41, 2.22it/s]
91%|βββββββββ | 908/1000 [06:53<00:40, 2.29it/s]
91%|βββββββββ | 909/1000 [06:54<00:42, 2.12it/s]
91%|βββββββββ | 910/1000 [06:54<00:41, 2.17it/s]
91%|βββββββββ | 910/1000 [06:54<00:41,  2.17it/s]
{'loss': 0.4854, 'grad_norm': 1.9634671211242676, 'learning_rate': 1.7830532106104746e-08, 'train/length': 3004.0, 'epoch': 0.92}
91%|βββββββββ | 911/1000 [06:55<00:40, 2.22it/s]
91%|βββββββββ | 912/1000 [06:55<00:39, 2.26it/s]
91%|ββββββββββ| 913/1000 [06:56<00:37, 2.29it/s]
91%|ββββββββββ| 914/1000 [06:56<00:36, 2.34it/s]
92%|ββββββββββ| 915/1000 [06:56<00:36, 2.31it/s]
92%|ββββββββββ| 916/1000 [06:57<00:37, 2.21it/s]
92%|ββββββββββ| 917/1000 [06:58<00:44, 1.86it/s]
92%|ββββββββββ| 918/1000 [06:58<00:42, 1.93it/s]
92%|ββββββββββ| 919/1000 [06:59<00:40, 2.00it/s]
92%|ββββββββββ| 920/1000 [06:59<00:41, 1.95it/s]
92%|ββββββββββ| 920/1000 [06:59<00:41, 1.95it/s]
92%|ββββββββββ| 921/1000 [07:00<00:39, 1.99it/s]
92%|ββββββββββ| 922/1000 [07:00<00:42, 1.84it/s]
{'loss': 0.4847, 'grad_norm': 1.7138712406158447, 'learning_rate': 1.3718707247769134e-08, 'train/length': 3362.2, 'epoch': 0.93}
92%|ββββββββββ| 923/1000 [07:01<00:38,  1.98it/s]
92%|ββββββββββ| 924/1000 [07:01<00:37, 2.04it/s]
92%|ββββββββββ| 925/1000 [07:02<00:35, 2.12it/s]
93%|ββββββββββ| 926/1000 [07:02<00:36, 2.00it/s]
93%|ββββββββββ| 927/1000 [07:03<00:35, 2.06it/s]
93%|ββββββββββ| 928/1000 [07:03<00:33, 2.14it/s]
93%|ββββββββββ| 929/1000 [07:03<00:32, 2.21it/s]
93%|ββββββββββ| 930/1000 [07:04<00:32, 2.15it/s]
93%|ββββββββββ| 930/1000 [07:04<00:32, 2.15it/s]
93%|ββββββββββ| 931/1000 [07:04<00:32, 2.14it/s]
93%|ββββββββββ| 932/1000 [07:05<00:30, 2.23it/s]
93%|ββββββββββ| 933/1000 [07:05<00:32, 2.04it/s]
93%|ββββββββββ| 934/1000 [07:06<00:32, 2.04it/s]
94%|ββββββββββ| 935/1000 [07:06<00:30, 2.13it/s]
{'loss': 0.4756, 'grad_norm': 2.3641197681427, 'learning_rate': 1.0138623843548078e-08, 'train/length': 3000.4, 'epoch': 0.94}
94%|ββββββββββ| 936/1000 [07:07<00:30,  2.12it/s]
94%|ββββββββββ| 937/1000 [07:07<00:28, 2.19it/s]
94%|ββββββββββ| 938/1000 [07:08<00:28, 2.20it/s]
94%|ββββββββββ| 939/1000 [07:08<00:27, 2.25it/s]
94%|ββββββββββ| 940/1000 [07:08<00:26, 2.25it/s]
94%|ββββββββββ| 940/1000 [07:08<00:26, 2.25it/s]
94%|ββββββββββ| 941/1000 [07:09<00:25, 2.32it/s]
94%|ββββββββββ| 942/1000 [07:09<00:25, 2.28it/s]
94%|ββββββββββ| 943/1000 [07:10<00:25, 2.25it/s]
94%|ββββββββββ| 944/1000 [07:10<00:24, 2.24it/s]
94%|ββββββββββ| 945/1000 [07:11<00:24, 2.26it/s]
95%|ββββββββββ| 946/1000 [07:11<00:23, 2.31it/s]
95%|ββββββββββ| 947/1000 [07:12<00:23, 2.27it/s]
95%|ββββββββββ| 948/1000 [07:12<00:25,  2.03it/s]
{'loss': 0.4773, 'grad_norm': 1.9625495672225952, 'learning_rate': 7.09419666208183e-09, 'train/length': 2867.9, 'epoch': 0.95}
95%|ββββββββββ| 949/1000 [07:13<00:23, 2.14it/s]
95%|ββββββββββ| 950/1000 [07:13<00:23, 2.16it/s]
95%|ββββββββββ| 950/1000 [07:13<00:23, 2.16it/s]
95%|ββββββββββ| 951/1000 [07:13<00:22, 2.15it/s]
95%|ββββββββββ| 952/1000 [07:14<00:21, 2.20it/s]
95%|ββββββββββ| 953/1000 [07:14<00:21, 2.15it/s]
95%|ββββββββββ| 954/1000 [07:15<00:21, 2.16it/s]
96%|ββββββββββ| 955/1000 [07:15<00:21, 2.14it/s]
96%|ββββββββββ| 956/1000 [07:16<00:19, 2.21it/s]
96%|ββββββββββ| 957/1000 [07:16<00:19, 2.24it/s]
96%|ββββββββββ| 958/1000 [07:17<00:17, 2.36it/s]
96%|ββββββββββ| 959/1000 [07:17<00:18, 2.27it/s]
96%|ββββββββββ| 960/1000 [07:17<00:17, 2.24it/s]
{'loss': 0.4695, 'grad_norm': 2.008470296859741, 'learning_rate': 4.588754739795586e-09, 'train/length': 3086.9, 'epoch': 0.96}
{'loss': 0.4822, 'grad_norm': 1.901129126548767, 'learning_rate': 2.6250377406467627e-09, 'train/length': 2927.7, 'epoch': 0.97}
96%|ββββββββββ| 960/1000 [07:17<00:17, 2.24it/s]
96%|ββββββββββ| 961/1000 [07:18<00:17, 2.24it/s]
96%|ββββββββββ| 962/1000 [07:18<00:17, 2.19it/s]
96%|ββββββββββ| 963/1000 [07:19<00:18, 2.05it/s]
96%|ββββββββββ| 964/1000 [07:19<00:16, 2.13it/s]
96%|ββββββββββ| 965/1000 [07:20<00:15, 2.22it/s]
97%|ββββββββββ| 966/1000 [07:20<00:16, 2.00it/s]
97%|ββββββββββ| 967/1000 [07:21<00:16, 2.05it/s]
97%|ββββββββββ| 968/1000 [07:21<00:14, 2.17it/s]
97%|ββββββββββ| 969/1000 [07:22<00:14, 2.16it/s]
97%|ββββββββββ| 970/1000 [07:22<00:14, 2.04it/s]
97%|ββββββββββ| 970/1000 [07:22<00:14, 2.04it/s]
97%|ββββββββββ| 971/1000 [07:23<00:13, 2.13it/s]
97%|ββββββββββ| 972/1000 [07:23<00:12,  2.17it/s]
97%|ββββββββββ| 973/1000 [07:24<00:12, 2.23it/s]
97%|ββββββββββ| 974/1000 [07:24<00:11, 2.30it/s]
98%|ββββββββββ| 975/1000 [07:24<00:11, 2.22it/s]
98%|ββββββββββ| 976/1000 [07:25<00:10, 2.21it/s]
98%|ββββββββββ| 977/1000 [07:25<00:10, 2.26it/s]
98%|ββββββββββ| 978/1000 [07:26<00:09, 2.38it/s]
98%|ββββββββββ| 979/1000 [07:26<00:09, 2.33it/s]
98%|ββββββββββ| 980/1000 [07:27<00:08, 2.33it/s]
98%|ββββββββββ| 980/1000 [07:27<00:08, 2.33it/s]
98%|ββββββββββ| 981/1000 [07:27<00:08, 2.28it/s]
98%|ββββββββββ| 982/1000 [07:27<00:07, 2.33it/s]
98%|ββββββββββ| 983/1000 [07:28<00:07, 2.29it/s]
98%|ββββββββββ| 984/1000 [07:28<00:07, 2.25it/s]
98%|ββββββββββ| 985/1000 [07:29<00:06,  2.19it/s]
99%|ββββββββββ| 986/1000 [07:29<00:06, 2.18it/s]
99%|ββββββββββ| 987/1000 [07:30<00:06, 2.15it/s]
99%|ββββββββββ| 988/1000 [07:30<00:05, 2.20it/s]
99%|ββββββββββ| 989/1000 [07:31<00:04, 2.22it/s]
99%|ββββββββββ| 990/1000 [07:31<00:04, 2.27it/s]
99%|ββββββββββ| 990/1000 [07:31<00:04, 2.27it/s]
99%|ββββββββββ| 991/1000 [07:32<00:04, 2.24it/s]
99%|ββββββββββ| 992/1000 [07:32<00:03, 2.36it/s]
99%|ββββββββββ| 993/1000 [07:32<00:03, 2.32it/s]
99%|ββββββββββ| 994/1000 [07:33<00:02, 2.32it/s]
100%|ββββββββββ| 995/1000 [07:33<00:02, 2.30it/s]
100%|ββββββββββ| 996/1000 [07:34<00:01, 2.25it/s]
100%|ββββββββββ| 997/1000 [07:34<00:01,  2.26it/s]
{'loss': 0.4858, 'grad_norm': 1.9469858407974243, 'learning_rate': 2.7339599464326622e-12, 'train/length': 2764.4, 'epoch': 1.0}
100%|ββββββββββ| 998/1000 [07:35<00:00, 2.12it/s]
100%|ββββββββββ| 999/1000 [07:35<00:00, 2.28it/s]
100%|ββββββββββ| 1000/1000 [07:35<00:00, 2.31it/s]
100%|ββββββββββ| 1000/1000 [07:35<00:00,  2.31it/s]
[INFO|trainer.py:3993] 2025-09-23 23:06:27,875 >> Saving model checkpoint to /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000
[INFO|configuration_utils.py:424] 2025-09-23 23:06:27,883 >> Configuration saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/config.json
[INFO|configuration_utils.py:904] 2025-09-23 23:06:27,887 >> Configuration saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/generation_config.json
[INFO|trainer.py:2676] 2025-09-23 23:06:27,911 >>
Training completed. Do not forget to share your model on huggingface.co/models =)
[INFO|modeling_utils.py:3724] 2025-09-23 23:06:31,827 >> Model weights saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/model.safetensors
[INFO|tokenization_utils_base.py:2356] 2025-09-23 23:06:31,835 >> chat template saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/chat_template.jinja
[INFO|tokenization_utils_base.py:2525] 2025-09-23 23:06:31,841 >> tokenizer config file saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/tokenizer_config.json
[INFO|tokenization_utils_base.py:2534] 2025-09-23 23:06:31,845 >> Special tokens file saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/special_tokens_map.json
{'train_runtime': 465.1346, 'train_samples_per_second': 8.595, 'train_steps_per_second': 2.15, 'train_loss': 0.5689593420028687, 'epoch': 1.0}
100%|ββββββββββ| 1000/1000 [07:40<00:00, 2.31it/s]
100%|ββββββββββ| 1000/1000 [07:40<00:00, 2.17it/s]
[INFO|trainer.py:3993] 2025-09-23 23:06:32,510 >> Saving model checkpoint to /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints
[INFO|configuration_utils.py:424] 2025-09-23 23:06:32,517 >> Configuration saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/config.json
[INFO|configuration_utils.py:904] 2025-09-23 23:06:32,521 >> Configuration saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/generation_config.json
c619-111:3463818:3464187 [0] NCCL INFO comm 0x51c28a80 rank 2 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-111:3463818:3464189 [0] NCCL INFO comm 0x4a0c9a70 rank 2 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-112:920311:920681 [0] NCCL INFO comm 0x16b43940 rank 3 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-102:3669486:3669856 [0] NCCL INFO comm 0x3ed369e0 rank 1 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-112:920311:920683 [0] NCCL INFO comm 0x146a66f0 rank 3 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-102:3669486:3669858 [0] NCCL INFO comm 0x36c20920 rank 1 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
[INFO|modeling_utils.py:3724] 2025-09-23 23:06:35,975 >> Model weights saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/model.safetensors
[INFO|tokenization_utils_base.py:2356] 2025-09-23 23:06:35,982 >> chat template saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/chat_template.jinja
[INFO|tokenization_utils_base.py:2525] 2025-09-23 23:06:35,986 >> tokenizer config file saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/tokenizer_config.json
[INFO|tokenization_utils_base.py:2534] 2025-09-23 23:06:35,988 >> Special tokens file saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/special_tokens_map.json
***** train metrics *****
epoch = 1.0
total_flos = 86028158GF
train_loss = 0.569
train_runtime = 0:07:45.13
train_samples_per_second = 8.595
train_steps_per_second = 2.15
Figure saved at: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/training_loss.png
[WARNING|2025-09-23 23:06:37] llamafactory.extras.ploting:148 >> No metric eval_loss to plot.
[WARNING|2025-09-23 23:06:37] llamafactory.extras.ploting:148 >> No metric eval_accuracy to plot.
[INFO|modelcard.py:450] 2025-09-23 23:06:37,589 >> Dropping the following result as it does not have all the necessary fields:
{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}}
c619-101:845961:846415 [0] NCCL INFO comm 0x31f63cc0 rank 0 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-101:845961:846417 [0] NCCL INFO comm 0x28280b30 rank 0 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
wandb:
wandb: View run BASELINE_r1_distillation_sft at:
wandb: Find logs at: wandb/run-20250923_225847-rlr328gt/logs
W0923 23:06:43.090000 3463814 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1341] The node 'c619-111.vista.tacc.utexas.edu_3463814_0' has failed to send a keep-alive heartbeat to the rendezvous '375339' due to an error of type RendezvousTimeoutError.
================================================================================
[INFO] Training completed successfully
[INFO] Merging trained model...
[DEBUG] Loaded 72 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 73)
[INFO] Starting model merge with real-time output...
============================================================
[WARNING|2025-09-23 23:07:08] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 23:07:13,087] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
INFO 09-23 23:07:23 [__init__.py:244] Automatically detected platform cuda.
Traceback (most recent call last):
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/llamafactory-cli", line 7, in <module>
sys.exit(main())
^^^^^^
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/cli.py", line 40, in main
from .api.app import run_api
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/api/app.py", line 21, in <module>
from ..chat import ChatModel
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/chat/__init__.py", line 16, in <module>
from .chat_model import ChatModel
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/chat/chat_model.py", line 29, in <module>
from .vllm_engine import VllmEngine
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/chat/vllm_engine.py", line 33, in <module>
from vllm import AsyncEngineArgs, AsyncLLMEngine, RequestOutput, SamplingParams
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/__init__.py", line 64, in __getattr__
module = import_module(module_name, __package__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/importlib/__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/engine/arg_utils.py", line 24, in <module>
from vllm.config import (BlockSize, CacheConfig, CacheDType, CompilationConfig,
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/config.py", line 36, in <module>
from vllm.platforms import current_platform
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/platforms/__init__.py", line 276, in __getattr__
_current_platform = resolve_obj_by_qualname(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/utils/__init__.py", line 2304, in resolve_obj_by_qualname
module = importlib.import_module(module_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/importlib/__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/platforms/cuda.py", line 18, in <module>
import vllm._C # noqa
^^^^^^^^^^^^^^
ImportError: /opt/apps/gcc/13.2.0/lib64/libstdc++.so.6: version `CXXABI_1.3.15' not found (required by /work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/_C.abi3.so)
============================================================
[ERROR] Model merge failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Model merge failed
[ERROR] Stage error: RuntimeError: Model merge failed
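The merge step above failed because the vLLM extension `_C.abi3.so` requires the `CXXABI_1.3.15` symbol version, which the `libstdc++.so.6` that got loaded (from the gcc 13.2.0 module tree) does not export. A minimal diagnostic sketch, not part of the pipeline: the helper name is ours, and the path is just the one quoted in the error message. It scans a shared object for the CXXABI tags it actually provides:

```python
# Hedged sketch: list the CXXABI_x.y(.z) version tags a libstdc++ binary
# exports, to diagnose "version `CXXABI_1.3.15' not found" import errors.
import re


def cxxabi_versions(blob: bytes):
    """Return the sorted, de-duplicated CXXABI tags found in a binary blob."""
    tags = set(re.findall(rb"CXXABI_\d+\.\d+(?:\.\d+)?", blob))
    return sorted(t.decode() for t in tags)


if __name__ == "__main__":
    # Path taken from the ImportError above; adjust for your system.
    path = "/opt/apps/gcc/13.2.0/lib64/libstdc++.so.6"
    try:
        with open(path, "rb") as f:
            print(cxxabi_versions(f.read()))
    except OSError as exc:
        print(f"could not read {path}: {exc}")
```

If `CXXABI_1.3.15` is missing from the output, pointing `LD_LIBRARY_PATH` at a newer GCC runtime (for instance the `lib64` of the `gcc/14` module that the training script loads) before the merge step is the usual remedy.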
|
Uploading the dataset shards: 0%| | 0/1 [00:00<?, ? shards/s]
Creating parquet from Arrow format: 0%| | 0/4 [00:00<?, ?ba/s][A
Creating parquet from Arrow format: 100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 4/4 [00:00<00:00, 44.31ba/s]
Uploading...: 0%| | 0.00/12.9M [00:00<?, ?B/s][A
Uploading...: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 12.9M/12.9M [00:00<00:00, 36.3MB/s][A
Uploading...: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 12.9M/12.9M [00:00<00:00, 23.1MB/s]
Uploading the dataset shards: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 1/1 [00:00<00:00, 1.20 shards/s]
Uploading the dataset shards: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 1/1 [00:00<00:00, 1.20 shards/s]
README.md: 0.00B [00:00, ?B/s]
README.md: 1.26kB [00:00, 10.1MB/s]
metadata/train-00000-of-00001.parquet: 0%| | 0.00/9.43k [00:00<?, ?B/s]
metadata/train-00000-of-00001.parquet: 100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 9.43k/9.43k [00:00<00:00, 19.6kB/s]
metadata/train-00000-of-00001.parquet: 100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 9.43k/9.43k [00:00<00:00, 19.6kB/s]
Generating train split: 0%| | 0/71 [00:00<?, ? examples/s]
Generating train split: 100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 71/71 [00:00<00:00, 16084.02 examples/s]
Uploading the dataset shards: 0%| | 0/1 [00:00<?, ? shards/s]
Creating parquet from Arrow format: 0%| | 0/1 [00:00<?, ?ba/s][A
Creating parquet from Arrow format: 100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 1/1 [00:00<00:00, 2392.64ba/s]
Uploading...: 0%| | 0.00/9.55k [00:00<?, ?B/s][A
Uploading...: 0%| | 0.00/9.55k [00:00<?, ?B/s][A
Uploading...: 0%| | 0.00/9.55k [00:00<?, ?B/s][A
Uploading...: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 9.55k/9.55k [00:00<00:00, 47.7kB/s][A
Uploading...: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 9.55k/9.55k [00:00<00:00, 11.2kB/s]
Uploading the dataset shards: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 1/1 [00:01<00:00, 1.06s/ shards]
Uploading the dataset shards: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 1/1 [00:01<00:00, 1.06s/ shards]
README.md: 0.00B [00:00, ?B/s]
README.md: 1.26kB [00:00, 11.4MB/s]
metadata/train-00000-of-00001.parquet: 0%| | 0.00/9.55k [00:00<?, ?B/s]
metadata/train-00000-of-00001.parquet: 100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 9.55k/9.55k [00:00<00:00, 15.8kB/s]
metadata/train-00000-of-00001.parquet: 100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 9.55k/9.55k [00:00<00:00, 15.8kB/s]
Generating train split: 0%| | 0/72 [00:00<?, ? examples/s]
Generating train split: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 72/72 [00:00<00:00, 3063.83 examples/s]
Uploading the dataset shards: 0%| | 0/1 [00:00<?, ? shards/s]
Creating parquet from Arrow format: 0%| | 0/1 [00:00<?, ?ba/s][A
Creating parquet from Arrow format: 100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 1/1 [00:00<00:00, 2629.66ba/s]
Uploading...: 0%| | 0.00/9.63k [00:00<?, ?B/s][A
Uploading...: 0%| | 0.00/9.63k [00:00<?, ?B/s][A
Uploading...: 0%| | 0.00/9.63k [00:00<?, ?B/s][A
Uploading...: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 9.63k/9.63k [00:00<00:00, 48.2kB/s][A
Uploading...: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 9.63k/9.63k [00:00<00:00, 10.2kB/s]
Uploading the dataset shards: 100%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 1/1 [00:01<00:00, 1.15s/ shards]
| BASELINE_r1_distillation | 570.121299 | true |
2025-09-23T23:23:12.244140 | 2025-09-23T23:32:21.191950 | llamafactory_sft | 1 | INFO | Complete log capture for stage: llamafactory_sft | [INFO] Starting stage: LLaMAFactory training - sft
[INFO] Starting LLaMAFactory Training
[INFO] Found existing dataset registration: TAUR_dev__D_SFT_C_BASELINE_r1_distillation_sft_data__sft_train
[INFO] Created training config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
[INFO] Created merge config: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/merge_config.yaml
[INFO] Starting LLaMAFactory training...
[DEBUG] Loaded 76 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 77)
[DEBUG] Training Script:
#!/bin/bash
cd /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
source ~/.profile; source /opt/apps/lmod/lmod/init/bash; module load gcc/14 cuda/12.6 nccl/12.4 nvidia_math/12.4
module load gcc/14 cuda/12.6 nvidia_math/12.4 nccl/12.4 && source /home1/10286/georgetsoukalas/miniconda3/etc/profile.d/conda.sh && conda deactivate && conda deactivate && conda activate && conda activate vllm && echo PYTHON: $(which python)
# Verify python environment is working
PYTHON_PATH=$(which python)
echo "Python environment check: $PYTHON_PATH"
export HF_HOME="/scratch/10286/georgetsoukalas/hf_cache"
export TRITON_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/triton"
export OUTLINES_CACHE_DIR="/scratch/10286/georgetsoukalas/.cache/outlines"
export PYTHONPATH="/scratch/10286/georgetsoukalas/skillfactory/skill-factory"
export CUDA_LAUNCH_BLOCKING="0"
export DISABLE_VERSION_CHECK="1"
export CC="gcc"
export CXX="g++"
export FORCE_TORCHRUN="1"
export FI_EFA_FORK_SAFE="1"
export FI_LOG_LEVEL="1"
export FI_EFA_USE_DEVICE_RDMA="1"
export PYTHONFAULTHANDLER="1"
export OMPI_MCA_mtl_base_verbose="1"
export FI_EFA_ENABLE_SHM_TRANSFER="0"
export FI_EFA_TX_MIN_CREDITS="64"
export NCCL_TREE_THRESHOLD="0"
export NCCL_DEBUG="INFO"
export HF_DATASETS_DISABLE_MEMMAP="1"
export DATASETS_DISABLE_MEMMAP="1"
export HF_DATASETS_CACHE="/tmp/.sf_cache/datasets"
export HF_DATASETS_DISABLE_MEMMAP=1
export DATASETS_DISABLE_MEMMAP=1
# Master node coordination
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=12802
# Python path setup
export PYTHONPATH=$PWD:$PYTHONPATH
echo "Multi-node setup: MASTER_ADDR=$MASTER_ADDR, MASTER_PORT=$MASTER_PORT"
echo "Starting multi-node training with 1 GPU per node across $SLURM_JOB_NUM_NODES nodes"
echo "Working directory: $(pwd)"
echo "Python path: $(which python)"
echo "Torchrun path: $(which torchrun)"
srun /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python -m torch.distributed.run \
--nproc-per-node 1 \
--nnodes $SLURM_JOB_NUM_NODES \
--rdzv_id=$SLURM_JOB_ID \
--rdzv_backend=c10d \
--rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/train.py /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/configs/training_config.yaml
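The launch block above has every node run one process and rendezvous at the master through torchrun's c10d backend. As a hedged sketch (the helper `build_torchrun_args` is hypothetical, not part of skill-factory or LLaMA-Factory), the flag list can be assembled like this:

```python
# Hypothetical helper mirroring the srun/torchrun invocation above:
# build the torch.distributed.run argument list from SLURM-style values.

def build_torchrun_args(master_addr, master_port, num_nodes, job_id,
                        nproc_per_node=1):
    """Assemble torchrun flags for a multi-node c10d rendezvous launch."""
    return [
        "-m", "torch.distributed.run",
        "--nproc-per-node", str(nproc_per_node),
        "--nnodes", str(num_nodes),
        f"--rdzv_id={job_id}",            # unique per SLURM job
        "--rdzv_backend=c10d",            # TCP store hosted on the master
        f"--rdzv_endpoint={master_addr}:{master_port}",
    ]

args = build_torchrun_args("c619-101", 12802, 4, "123456")
print(args[-1])  # the endpoint every node connects to
```

Using the SLURM job ID as `--rdzv_id` keeps concurrent jobs on shared nodes from colliding at the same rendezvous store.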
Starting training with real-time output...
================================================================================
The following have been reloaded with a version change:
1) cuda/12.8 => cuda/12.6 2) gcc/13.2.0 => gcc/14.2.0
PYTHON: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Python environment check: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Multi-node setup: MASTER_ADDR=c619-101, MASTER_PORT=12802
Starting multi-node training with 1 GPU per node across 4 nodes
Working directory: /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory
Python path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/python
Torchrun path: /home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/torchrun
[WARNING|2025-09-23 23:23:28] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 23:23:29,947] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-09-23 23:23:30,022] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[2025-09-23 23:23:30,090] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
[2025-09-23 23:23:30,135] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
[2025-09-23 23:23:34,288] [INFO] [comm.py:669:init_distributed] cdb=None
[2025-09-23 23:23:34,288] [INFO] [comm.py:700:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
[INFO|2025-09-23 23:23:34] llamafactory.hparams.parser:406 >> Process rank: 0, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[2025-09-23 23:23:34,338] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 23:23:34] llamafactory.hparams.parser:406 >> Process rank: 2, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2023] 2025-09-23 23:23:34,486 >> loading file vocab.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/vocab.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 23:23:34,486 >> loading file merges.txt from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/merges.txt
[INFO|tokenization_utils_base.py:2023] 2025-09-23 23:23:34,486 >> loading file tokenizer.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 23:23:34,486 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 23:23:34,486 >> loading file special_tokens_map.json from cache at None
[INFO|tokenization_utils_base.py:2023] 2025-09-23 23:23:34,486 >> loading file tokenizer_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/tokenizer_config.json
[INFO|tokenization_utils_base.py:2023] 2025-09-23 23:23:34,486 >> loading file chat_template.jinja from cache at None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 23:23:34,668 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[2025-09-23 23:23:34,711] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|tokenization_utils_base.py:2299] 2025-09-23 23:23:34,714 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO|2025-09-23 23:23:34] llamafactory.hparams.parser:406 >> Process rank: 1, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[2025-09-23 23:23:34,758] [INFO] [comm.py:669:init_distributed] cdb=None
[INFO|2025-09-23 23:23:34] llamafactory.hparams.parser:406 >> Process rank: 3, world size: 4, device: cuda:0, distributed training: True, compute dtype: torch.bfloat16
[INFO|configuration_utils.py:698] 2025-09-23 23:23:35,073 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 23:23:35,077 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
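The Qwen2Config dump above implies the usual Qwen2.5-1.5B attention geometry. A quick sanity check in plain Python, using only values copied from the log (no external dependencies):

```python
# Values copied verbatim from the Qwen2Config dump in the log above.
cfg = {
    "hidden_size": 1536,
    "num_attention_heads": 12,
    "num_key_value_heads": 2,
    "num_hidden_layers": 28,
    "intermediate_size": 8960,
}

# Per-head width: hidden_size split evenly across query heads.
head_dim = cfg["hidden_size"] // cfg["num_attention_heads"]
# Grouped-query attention: how many query heads share each KV head.
gqa_group = cfg["num_attention_heads"] // cfg["num_key_value_heads"]

print(head_dim)   # 128
print(gqa_group)  # 6
```

With 2 KV heads against 12 query heads, the KV cache per layer is a sixth of what full multi-head attention would need at the same `head_dim`.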
[INFO|2025-09-23 23:23:35] llamafactory.data.loader:143 >> Loading dataset TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data...
`trust_remote_code` is not supported anymore.
Please check that the Hugging Face dataset 'TAUR-dev/D-SFT_C-BASELINE_r1_distillation-sft-data' isn't based on a loading script and remove `trust_remote_code`.
If the dataset is based on a loading script, please ask the dataset author to remove it and convert it to a standard format like Parquet.
Converting format of dataset: 100%|ββββββββββ| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6140 examples [00:00, 20268.92 examples/s]
Converting format of dataset: 7996 examples [00:00, 21120.04 examples/s]
Converting format of dataset: 100%|ββββββββββ| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6139 examples [00:00, 20710.53 examples/s]
Converting format of dataset: 7996 examples [00:00, 21279.40 examples/s]
[rank2]:[W923 23:23:36.923344421 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 2]  using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
[rank0]:[W923 23:23:36.965227574 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 0]  using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-101:850874:850874 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.212<0>
c619-101:850874:850874 [0] NCCL INFO cudaDriverVersion 12080
c619-101:850874:850874 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-101:850874:850874 [0] NCCL INFO Comm config Blocking set to 1
c619-111:3464458:3464458 [0] NCCL INFO cudaDriverVersion 12080
c619-111:3464458:3464458 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.214<0>
c619-111:3464458:3464458 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-111:3464458:3464458 [0] NCCL INFO Comm config Blocking set to 1
c619-111:3464458:3464572 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-111:3464458:3464572 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.214<0>
c619-111:3464458:3464572 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-111:3464458:3464572 [0] NCCL INFO Using network IB
c619-111:3464458:3464572 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3464458:3464572 [0] NCCL INFO ncclCommInitRankConfig comm 0x2ee276b0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xf2ba7110912945b1 - Init START
c619-101:850874:850990 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-101:850874:850990 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.212<0>
c619-101:850874:850990 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-101:850874:850990 [0] NCCL INFO Using network IB
c619-101:850874:850990 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-101:850874:850990 [0] NCCL INFO ncclCommInitRankConfig comm 0x39f30fe0 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xf2ba7110912945b1 - Init START
Converting format of dataset: 100%|ββββββββββ| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6163 examples [00:00, 21247.29 examples/s]
Converting format of dataset: 7996 examples [00:00, 21476.28 examples/s]
[rank1]:[W923 23:23:37.557353938 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 1]  using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
c619-102:3670125:3670125 [0] NCCL INFO cudaDriverVersion 12080
c619-102:3670125:3670125 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.213<0>
c619-102:3670125:3670125 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-102:3670125:3670125 [0] NCCL INFO Comm config Blocking set to 1
Converting format of dataset: 100%|ββββββββββ| 3998/3998 [00:00<?, ? examples/s]
Converting format of dataset: 6081 examples [00:00, 20731.41 examples/s]
Converting format of dataset: 7996 examples [00:00, 20904.90 examples/s]
c619-102:3670125:3670238 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-102:3670125:3670238 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.213<0>
c619-102:3670125:3670238 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-102:3670125:3670238 [0] NCCL INFO Using network IB
c619-102:3670125:3670238 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3670125:3670238 [0] NCCL INFO ncclCommInitRankConfig comm 0x431bf9b0 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xf2ba7110912945b1 - Init START
c619-102:3670125:3670238 [0] NCCL INFO RAS client listening socket at ::1<28028>
[rank3]:[W923 23:23:37.767981080 ProcessGroupNCCL.cpp:4715] [PG ID 0 PG GUID 0 Rank 3] using GPU 0 as device used by this process is currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. You can specify device_id in init_process_group() to force use of a particular device.
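Editor's note: the ProcessGroupNCCL warning above recommends passing device_id to init_process_group() so NCCL does not have to guess the rank-to-GPU mapping. A minimal sketch of deriving the per-rank device, assuming a torchrun-style launcher that sets the LOCAL_RANK environment variable; the init_process_group() call itself is left as a comment because it requires a live rendezvous and a torch install:

```python
import os

# Map each process to a GPU index explicitly so the NCCL process group
# does not have to guess (this silences the warning logged above).
# LOCAL_RANK is set by torchrun; default to 0 here for illustration only.
local_rank = int(os.environ.get("LOCAL_RANK", "0"))
device = f"cuda:{local_rank}"

# In the actual training script (assumes a recent torch with device_id support):
# torch.distributed.init_process_group(backend="nccl",
#                                      device_id=torch.device(device))
print(device)
```

On the single-GPU-per-node layout in this log (each rank reports cudaDev 0), the derived device is cuda:0 on every node, which matches the `using GPU 0` line in the warning.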
c619-112:920946:920946 [0] NCCL INFO cudaDriverVersion 12080
c619-112:920946:920946 [0] NCCL INFO Bootstrap: Using ibP2s2:192.168.17.215<0>
c619-112:920946:920946 [0] NCCL INFO NCCL version 2.26.2+cuda12.8
c619-112:920946:920946 [0] NCCL INFO Comm config Blocking set to 1
c619-112:920946:921059 [0] NCCL INFO NET/Plugin: Could not find: libnccl-net.so. Using internal net plugin.
c619-112:920946:921059 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB [RO]; OOB ibP2s2:192.168.17.215<0>
c619-112:920946:921059 [0] NCCL INFO PROFILER/Plugin: Could not find: libnccl-profiler.so.
c619-112:920946:921059 [0] NCCL INFO Using network IB
c619-112:920946:921059 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-112:920946:921059 [0] NCCL INFO ncclCommInitRankConfig comm 0x24f81300 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xf2ba7110912945b1 - Init START
c619-111:3464458:3464572 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-101:850874:850990 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-112:920946:921059 [0] NCCL INFO RAS client listening socket at ::1<28028>
c619-111:3464458:3464572 [0] NCCL INFO Bootstrap timings total 0.749800 (create 0.000027, send 0.000159, recv 0.720326, ring 0.000218, delay 0.000001)
c619-112:920946:921059 [0] NCCL INFO Bootstrap timings total 0.038970 (create 0.000029, send 0.000140, recv 0.000204, ring 0.000249, delay 0.000001)
c619-101:850874:850990 [0] NCCL INFO Bootstrap timings total 0.761089 (create 0.000023, send 0.000073, recv 0.590384, ring 0.000258, delay 0.000000)
c619-102:3670125:3670238 [0] NCCL INFO Bootstrap timings total 0.180148 (create 0.000028, send 0.000149, recv 0.000271, ring 0.118365, delay 0.000001)
c619-111:3464458:3464572 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:850874:850990 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-112:920946:921059 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-102:3670125:3670238 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-111:3464458:3464572 [0] NCCL INFO comm 0x2ee276b0 rank 2 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:850874:850990 [0] NCCL INFO comm 0x39f30fe0 rank 0 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-112:920946:921059 [0] NCCL INFO comm 0x24f81300 rank 3 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:850874:850990 [0] NCCL INFO Channel 00/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 01/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 02/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 03/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 04/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 05/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 06/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 07/64 : 0 1 2 3
c619-102:3670125:3670238 [0] NCCL INFO comm 0x431bf9b0 rank 1 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:850874:850990 [0] NCCL INFO Channel 08/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 09/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 10/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 11/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 12/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 13/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 14/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 15/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 16/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 17/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 18/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 19/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 20/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 21/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 22/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 23/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 24/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 25/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 26/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 27/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 28/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 29/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 30/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 31/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 32/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 33/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 34/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 35/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 36/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 37/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 38/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 39/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 40/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 41/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 42/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 43/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 44/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 45/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 46/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 47/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 48/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 49/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 50/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 51/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 52/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 53/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 54/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 55/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 56/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 57/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 58/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 59/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 60/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 61/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 62/64 : 0 1 2 3
c619-101:850874:850990 [0] NCCL INFO Channel 63/64 : 0 1 2 3
c619-111:3464458:3464572 [0] NCCL INFO Trees [0] 1/3/-1->2->0 [1] 1/3/-1->2->0 [2] 1/3/-1->2->0 [3] 1/3/-1->2->0 [4] 1/3/-1->2->0 [5] 1/3/-1->2->0 [6] 1/3/-1->2->0 [7] 1/3/-1->2->0 [8] 1/3/-1->2->0 [9] 1/3/-1->2->0 [10] 1/3/-1->2->0 [11] 1/3/-1->2->0 [12] 1/3/-1->2->0 [13] 1/3/-1->2->0 [14] 1/3/-1->2->0 [15] 1/3/-1->2->0 [16] 1/3/-1->2->0 [17] 1/3/-1->2->0 [18] 1/3/-1->2->0 [19] 1/3/-1->2->0 [20] 1/3/-1->2->0 [21] 1/3/-1->2->0 [22] 1/3/-1->2->0 [23] 1/3/-1->2->0 [24] 1/3/-1->2->0 [25] 1/3/-1->2->0 [26] 1/3/-1->2->0 [27] 1/3/-1->2->0 [28] 1/3/-1->2->0 [29] 1/3/-1->2->0 [30] 1/3/-1->2->0 [31] 1/3/-1->2->0 [32] -1/-1/-1->2->1 [33] -1/-1/-1->2->1 [34] -1/-1/-1->2->1 [35] -1/-1/-1->2->1 [36] -1/-1/-1->2->1 [37] -1/-1/-1->2->1 [38] -1/-1/-1->2->1 [39] -1/-1/-1->2->1 [40] -1/-1/-1->2->1 [41] -1/-1/-1->2->1 [42] -1/-1/-1->2->1 [43] -1/-1/-1->2->1 [44] -1/-1/-1->2->1 [45] -1/-1/-1->2->1 [46] -1/-1/-1->2->1 [47] -1/-1/-1->2->1 [48] -1/-1/-1->2->1 [49] -1/-1/-1->2->1 [50] -1/-1/-1->2->1 [51] -1/-1/-1->2->1 [52] -1/-1/-
c619-111:3464458:3464572 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:850874:850990 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 2/-1/-1->0->-1 [3] 2/-1/-1->0->-1 [4] 2/-1/-1->0->-1 [5] 2/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 2/-1/-1->0->-1 [9] 2/-1/-1->0->-1 [10] 2/-1/-1->0->-1 [11] 2/-1/-1->0->-1 [12] 2/-1/-1->0->-1 [13] 2/-1/-1->0->-1 [14] 2/-1/-1->0->-1 [15] 2/-1/-1->0->-1 [16] 2/-1/-1->0->-1 [17] 2/-1/-1->0->-1 [18] 2/-1/-1->0->-1 [19] 2/-1/-1->0->-1 [20] 2/-1/-1->0->-1 [21] 2/-1/-1->0->-1 [22] 2/-1/-1->0->-1 [23] 2/-1/-1->0->-1 [24] 2/-1/-1->0->-1 [25] 2/-1/-1->0->-1 [26] 2/-1/-1->0->-1 [27] 2/-1/-1->0->-1 [28] 2/-1/-1->0->-1 [29] 2/-1/-1->0->-1 [30] 2/-1/-1->0->-1 [31] 2/-1/-1->0->-1 [32] -1/-1/-1->0->1 [33] -1/-1/-1->0->1 [34] -1/-1/-1->0->1 [35] -1/-1/-1->0->1 [36] -1/-1/-1->0->1 [37] -1/-1/-1->0->1 [38] -1/-1/-1->0->1 [39] -1/-1/-1->0->1 [40] -1/-1/-1->0->1 [41] -1/-1/-1->0->1 [42] -1/-1/-1->0->1 [43] -1/-1/-1->0->1 [44] -1/-1/-1->0->1 [45] -1/-1/-1->0->1 [46] -1/-1/-1->0->1 [47] -1/-1/-1->0->1 [48] -1/-1/-1->0->1 [49] -1/-1
c619-101:850874:850990 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:850874:850990 [0] NCCL INFO Check P2P Type intraNodeP2pSupport 0 directMode 0
c619-112:920946:921059 [0] NCCL INFO Trees [0] -1/-1/-1->3->2 [1] -1/-1/-1->3->2 [2] -1/-1/-1->3->2 [3] -1/-1/-1->3->2 [4] -1/-1/-1->3->2 [5] -1/-1/-1->3->2 [6] -1/-1/-1->3->2 [7] -1/-1/-1->3->2 [8] -1/-1/-1->3->2 [9] -1/-1/-1->3->2 [10] -1/-1/-1->3->2 [11] -1/-1/-1->3->2 [12] -1/-1/-1->3->2 [13] -1/-1/-1->3->2 [14] -1/-1/-1->3->2 [15] -1/-1/-1->3->2 [16] -1/-1/-1->3->2 [17] -1/-1/-1->3->2 [18] -1/-1/-1->3->2 [19] -1/-1/-1->3->2 [20] -1/-1/-1->3->2 [21] -1/-1/-1->3->2 [22] -1/-1/-1->3->2 [23] -1/-1/-1->3->2 [24] -1/-1/-1->3->2 [25] -1/-1/-1->3->2 [26] -1/-1/-1->3->2 [27] -1/-1/-1->3->2 [28] -1/-1/-1->3->2 [29] -1/-1/-1->3->2 [30] -1/-1/-1->3->2 [31] -1/-1/-1->3->2 [32] 1/-1/-1->3->-1 [33] 1/-1/-1->3->-1 [34] 1/-1/-1->3->-1 [35] 1/-1/-1->3->-1 [36] 1/-1/-1->3->-1 [37] 1/-1/-1->3->-1 [38] 1/-1/-1->3->-1 [39] 1/-1/-1->3->-1 [40] 1/-1/-1->3->-1 [41] 1/-1/-1->3->-1 [42] 1/-1/-1->3->-1 [43] 1/-1/-1->3->-1 [44] 1/-1/-1->3->-1 [45] 1/-1/-1->3->-1 [46] 1/-1/-1->3->-1 [47] 1/-1/-1->3->-1 [48] 1/-1/-1->3->-1 [49] 1/-1/
c619-102:3670125:3670238 [0] NCCL INFO Trees [0] -1/-1/-1->1->2 [1] -1/-1/-1->1->2 [2] -1/-1/-1->1->2 [3] -1/-1/-1->1->2 [4] -1/-1/-1->1->2 [5] -1/-1/-1->1->2 [6] -1/-1/-1->1->2 [7] -1/-1/-1->1->2 [8] -1/-1/-1->1->2 [9] -1/-1/-1->1->2 [10] -1/-1/-1->1->2 [11] -1/-1/-1->1->2 [12] -1/-1/-1->1->2 [13] -1/-1/-1->1->2 [14] -1/-1/-1->1->2 [15] -1/-1/-1->1->2 [16] -1/-1/-1->1->2 [17] -1/-1/-1->1->2 [18] -1/-1/-1->1->2 [19] -1/-1/-1->1->2 [20] -1/-1/-1->1->2 [21] -1/-1/-1->1->2 [22] -1/-1/-1->1->2 [23] -1/-1/-1->1->2 [24] -1/-1/-1->1->2 [25] -1/-1/-1->1->2 [26] -1/-1/-1->1->2 [27] -1/-1/-1->1->2 [28] -1/-1/-1->1->2 [29] -1/-1/-1->1->2 [30] -1/-1/-1->1->2 [31] -1/-1/-1->1->2 [32] 2/0/-1->1->3 [33] 2/0/-1->1->3 [34] 2/0/-1->1->3 [35] 2/0/-1->1->3 [36] 2/0/-1->1->3 [37] 2/0/-1->1->3 [38] 2/0/-1->1->3 [39] 2/0/-1->1->3 [40] 2/0/-1->1->3 [41] 2/0/-1->1->3 [42] 2/0/-1->1->3 [43] 2/0/-1->1->3 [44] 2/0/-1->1->3 [45] 2/0/-1->1->3 [46] 2/0/-1->1->3 [47] 2/0/-1->1->3 [48] 2/0/-1->1->3 [49] 2/0/-1->1->3 [50] 2/0/-1->1->3 [51] 2
c619-112:920946:921059 [0] NCCL INFO P2P Chunksize set to 131072
c619-102:3670125:3670238 [0] NCCL INFO P2P Chunksize set to 131072
c619-111:3464458:3464576 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 13
c619-111:3464458:3464575 [0] NCCL INFO [Proxy Service] Device 0 CPU core 12
c619-112:920946:921062 [0] NCCL INFO [Proxy Service] Device 0 CPU core 63
c619-112:920946:921063 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 64
c619-102:3670125:3670241 [0] NCCL INFO [Proxy Service] Device 0 CPU core 63
c619-102:3670125:3670242 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 64
c619-101:850874:850993 [0] NCCL INFO [Proxy Service] Device 0 CPU core 64
c619-101:850874:850994 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 65
c619-111:3464458:3464572 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-111:3464458:3464572 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-111:3464458:3464572 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-111:3464458:3464572 [0] NCCL INFO ncclCommInitRankConfig comm 0x2ee276b0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xf2ba7110912945b1 - Init COMPLETE
c619-111:3464458:3464572 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 2 nranks 4 total 0.86 (kernels 0.06, alloc 0.01, bootstrap 0.75, allgathers 0.03, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:850874:850990 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:850874:850990 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:850874:850990 [0] NCCL INFO CC Off, workFifoBytes 1048576
c619-102:3670125:3670238 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-102:3670125:3670238 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:850874:850990 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-101:850874:850990 [0] NCCL INFO ncclCommInitRankConfig comm 0x39f30fe0 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xf2ba7110912945b1 - Init COMPLETE
c619-101:850874:850990 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 0 nranks 4 total 0.86 (kernels 0.07, alloc 0.01, bootstrap 0.76, allgathers 0.01, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-112:920946:921059 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-112:920946:921059 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-102:3670125:3670238 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-102:3670125:3670238 [0] NCCL INFO ncclCommInitRankConfig comm 0x431bf9b0 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xf2ba7110912945b1 - Init COMPLETE
c619-102:3670125:3670238 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 1 nranks 4 total 0.26 (kernels 0.06, alloc 0.01, bootstrap 0.18, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-112:920946:921059 [0] NCCL INFO TUNER/Plugin: Could not find: libnccl-tuner.so. Using internal tuner plugin.
c619-112:920946:921059 [0] NCCL INFO ncclCommInitRankConfig comm 0x24f81300 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0xf2ba7110912945b1 - Init COMPLETE
c619-112:920946:921059 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 3 nranks 4 total 0.14 (kernels 0.06, alloc 0.00, bootstrap 0.04, allgathers 0.02, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-111:3464458:3464578 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 14
c619-111:3464458:3464577 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850996 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 67
c619-101:850874:850995 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670244 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 66
c619-101:850874:850995 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921065 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 65
c619-101:850874:850995 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channels 14/0-63/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channels 00/0-31/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channels 00/0-31/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channels 57/0-63/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channels 00/0-31/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channels 32/0-56/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channels 00/0-31/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channels 32/0-56/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channels 00/0-31/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channels 00/0-31/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channels 32/0-56/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464577 [0] NCCL INFO Channels 00/0-63/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921064 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670243 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:850995 [0] NCCL INFO Connected all trees
c619-112:920946:921064 [0] NCCL INFO Connected all trees
c619-102:3670125:3670243 [0] NCCL INFO Connected all trees
c619-111:3464458:3464577 [0] NCCL INFO Connected all trees
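The per-channel send/receive dump above is what NCCL emits when running with `NCCL_DEBUG=INFO`. Once inter-node connectivity has been verified, verbosity can usually be reduced so future logs only capture problems. A minimal sketch, assuming the standard NCCL environment variables `NCCL_DEBUG` and `NCCL_DEBUG_SUBSYS` (which subsystems to keep is a judgment call):

```shell
# Lower NCCL verbosity so subsequent runs log only warnings and errors,
# instead of the per-channel INFO dump seen above.
export NCCL_DEBUG=WARN

# Alternatively, keep INFO but restrict it to init/network subsystems:
# export NCCL_DEBUG=INFO
# export NCCL_DEBUG_SUBSYS=INIT,NET

echo "NCCL_DEBUG=$NCCL_DEBUG"
```

Setting this in the job script before `torchrun`/`llamafactory-cli` launch keeps the captured logs focused on warnings such as IB transport failures rather than routine channel setup.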
Running tokenizer on dataset: 7996 examples [00:17, 226.89 examples/s]
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, 11, 3465, 553, 54364, 14817, 13, 1446, 525, 264, 10950, 17847, 13, 151645, 198, 151644, 872, 198, 16141, 279, 2701, 3491, 13, 81917, 697, 32711, 3019, 553, 3019, 13, 3197, 498, 525, 8060, 11, 2968, 697, 4226, 304, 419, 3561, 25, 366, 9217, 2235, 21732, 4226, 12533, 9217, 94367, 2, 22079, 198, 16429, 279, 5109, 304, 279, 1140, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 1855, 458, 23606, 429, 16819, 220, 16, 15, 24, 13, 1446, 646, 990, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 8, 323, 1817, 1372, 646, 1172, 387, 1483, 3055, 13, 4615, 6291, 1265, 2924, 264, 4013, 315, 7354, 330, 8304, 1599, 2974, 1380, 1817, 3019, 374, 264, 35972, 5666, 323, 279, 1590, 3019, 13653, 11508, 311, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 382, 35127, 697, 4226, 304, 279, 2701, 3561, 510, 27, 9217, 397, 7021, 413, 4226, 340, 522, 9217, 1339, 9064, 11993, 21732, 4226, 9940, 374, 279, 1140, 315, 7354, 311, 5545, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 13, 4710, 2461, 3110, 510, 2679, 279, 1140, 315, 5109, 572, 508, 16, 11, 220, 17, 11, 220, 18, 60, 323, 279, 2169, 572, 220, 16, 11, 498, 1410, 3270, 510, 27, 9217, 397, 8304, 220, 16, 25, 220, 16, 488, 220, 17, 284, 220, 18, 198, 8304, 220, 17, 25, 220, 18, 608, 220, 18, 284, 220, 16, 198, 522, 9217, 1339, 269, 4710, 27, 9217, 397, 7, 16, 488, 220, 17, 8, 608, 220, 18, 198, 522, 9217, 1339, 10061, 594, 1744, 3019, 553, 3019, 13, 151645, 198, 151644, 77091, 198, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 
3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 
15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 
28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 
1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 
382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 
369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 
10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 
16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109, so that doesn't reach the target.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
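The multiplication and division dead-ends checked above can be confirmed exhaustively. A minimal brute-force sketch (the `countdown_solutions` helper is illustrative, not part of the training pipeline) that tries every permutation of the list with every operation sequence under strict left-to-right evaluation:

```python
from itertools import permutations, product

def countdown_solutions(numbers, target):
    """Return all left-to-right expressions over a permutation of
    `numbers` (each used exactly once) that evaluate to `target`."""
    ops = {'+': lambda a, b: a + b,
           '-': lambda a, b: a - b,
           '*': lambda a, b: a * b,
           '/': lambda a, b: a / b if b != 0 else float('inf')}
    found = []
    for perm in permutations(numbers):
        for symbols in product(ops, repeat=len(numbers) - 1):
            acc = perm[0]
            expr = str(perm[0])
            for sym, n in zip(symbols, perm[1:]):
                acc = ops[sym](acc, n)
                expr += f" {sym} {n}"
            if acc == target:
                found.append(expr)
    return found

print(countdown_solutions([99, 57, 67], 109))
```

For this list the search reports only addition/subtraction expressions, matching the conclusion above that multiplication and division never land on 109.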
Another combination: 99 + 67 = 166, then subtract 57: 166 - 57 = 109? Let me verify 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
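Either two-step presentation can be checked mechanically. A small sketch (the `run_steps` helper is hypothetical) that parses each "a OP b = c" step, verifies the claimed result, and returns the final value:

```python
def run_steps(steps):
    """Execute 'a OP b = c' step strings, checking each claimed result."""
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b}
    result = None
    for step in steps:
        lhs, claimed = step.split("=")
        a, op, b = lhs.split()
        result = ops[op](int(a), int(b))
        assert result == int(claimed), f"bad step: {step}"
    return result

print(run_steps(["67 - 57 = 10", "99 + 10 = 109"]))   # 109
print(run_steps(["99 + 67 = 166", "166 - 57 = 109"]))  # 109
```

Both step sequences reach 109, confirming that the two methods are interchangeable.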
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, and 10 is not one of the original numbers; it's an intermediate result. In context that should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
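The left-to-right precedence claim is easy to verify mechanically; a quick sketch:

```python
# + and - share precedence and associate left to right in Python,
# as in standard arithmetic, so neither ordering needs parentheses.
assert 99 + 67 - 57 == 109
assert 67 - 57 + 99 == 109
# Grouping matters, though: forcing the right-hand pair first in the
# second expression gives a completely different value.
assert 67 - (57 + 99) == -89
print("left-to-right evaluation verified")
```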
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
99 is first in the list, though that ordering isn't required.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
In the list, 57 is second, but the equation 99 + 67 - 57 uses all three numbers, so position in the list doesn't matter.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big; 134 - 25 = 109 would need a 25, which I don't have.
57 * 2 = 114, close to 109; 114 - 5 = 109 would need a 5, but I only have 99 and 67 left.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 β 116. something, not 109.
Not good.
67 / 57 β 1.175, times 99 β 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
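The final single-equation answer can be validated programmatically. A small sketch (the `check_answer` helper is hypothetical, not part of the training pipeline) that splits the equation at `=`, checks that each listed number appears exactly once, and re-evaluates the expression:

```python
import ast
from collections import Counter

def check_answer(answer: str, numbers: list, target: int) -> bool:
    """Validate a single-equation answer of the form '<expr> = <target>'."""
    expr, _, claimed = answer.partition("=")
    if int(claimed) != target:
        return False
    tree = ast.parse(expr.strip(), mode="eval")
    # Collect every numeric literal in the expression.
    used = [n.value for n in ast.walk(tree) if isinstance(n, ast.Constant)]
    # Each source number must appear exactly once, and the expression
    # must actually evaluate to the target.
    return Counter(used) == Counter(numbers) and \
        eval(compile(tree, "<expr>", "eval")) == target

print(check_answer("99 + 67 - 57 = 109", [99, 57, 67], 109))  # True
```

This accepts the model's answer and rejects equations that misuse the numbers or claim a wrong total.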
label_ids:
[-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 
2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 
16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 
382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 
2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 
11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
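The long run of -100 values in the label_ids above reflects the usual Hugging Face convention: prompt tokens are masked so that only the assistant completion contributes to the training loss (PyTorch's `CrossEntropyLoss` skips targets equal to -100 via its default `ignore_index`). A minimal illustrative sketch of that masking, not LLaMAFactory's actual code:

```python
IGNORE_INDEX = -100  # default ignore_index of torch.nn.CrossEntropyLoss

def build_labels(input_ids: list, prompt_len: int) -> list:
    """Mask the prompt portion of input_ids so the loss is computed
    only on the completion tokens."""
    return [IGNORE_INDEX] * prompt_len + input_ids[prompt_len:]

# Toy example: a 3-token prompt followed by a 2-token completion.
print(build_labels([11, 22, 33, 44, 55], prompt_len=3))
# [-100, -100, -100, 44, 55]
```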
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109, so that doesn't reach the target.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then subtract 57: 166 - 57 = 109? Let me verify 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 β 116. something, not 109.
Not good.
67 / 57 β 1.175, times 99 β 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
Running tokenizer on dataset: 100%|██████████| 3998/3998 [00:00<?, ? examples/s]
Running tokenizer on dataset: 7996 examples [00:17, 225.35 examples/s]
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, 11, 3465, 553, 54364, 14817, 13, 1446, 525, 264, 10950, 17847, 13, 151645, 198, 151644, 872, 198, 16141, 279, 2701, 3491, 13, 81917, 697, 32711, 3019, 553, 3019, 13, 3197, 498, 525, 8060, 11, 2968, 697, 4226, 304, 419, 3561, 25, 366, 9217, 2235, 21732, 4226, 12533, 9217, 94367, 2, 22079, 198, 16429, 279, 5109, 304, 279, 1140, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 1855, 458, 23606, 429, 16819, 220, 16, 15, 24, 13, 1446, 646, 990, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 8, 323, 1817, 1372, 646, 1172, 387, 1483, 3055, 13, 4615, 6291, 1265, 2924, 264, 4013, 315, 7354, 330, 8304, 1599, 2974, 1380, 1817, 3019, 374, 264, 35972, 5666, 323, 279, 1590, 3019, 13653, 11508, 311, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 382, 35127, 697, 4226, 304, 279, 2701, 3561, 510, 27, 9217, 397, 7021, 413, 4226, 340, 522, 9217, 1339, 9064, 11993, 21732, 4226, 9940, 374, 279, 1140, 315, 7354, 311, 5545, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 13, 4710, 2461, 3110, 510, 2679, 279, 1140, 315, 5109, 572, 508, 16, 11, 220, 17, 11, 220, 18, 60, 323, 279, 2169, 572, 220, 16, 11, 498, 1410, 3270, 510, 27, 9217, 397, 8304, 220, 16, 25, 220, 16, 488, 220, 17, 284, 220, 18, 198, 8304, 220, 17, 25, 220, 18, 608, 220, 18, 284, 220, 16, 198, 522, 9217, 1339, 269, 4710, 27, 9217, 397, 7, 16, 488, 220, 17, 8, 608, 220, 18, 198, 522, 9217, 1339, 10061, 594, 1744, 3019, 553, 3019, 13, 151645, 198, 151644, 77091, 198, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 
3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 
15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 
28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 
1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 
382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 
369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 
10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 
16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 β 116. something, not 109.
Not good.
67 / 57 β 1.175, times 99 β 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
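The decoded example above searches by hand over pairings of [99, 57, 67] and the four operations. That search can be sketched as a small brute-force program (illustrative only — `find_expressions` is a hypothetical helper, not part of the training pipeline), evaluating `(a op1 b) op2 c` left to right exactly as the trace reasons:

```python
# Hedged sketch: enumerate left-to-right expressions over three numbers and
# the four basic operations, collecting those that hit the target.
from itertools import permutations

def find_expressions(numbers, target):
    ops = {
        '+': lambda a, b: a + b,
        '-': lambda a, b: a - b,
        '*': lambda a, b: a * b,
        '/': lambda a, b: a / b if b else None,  # guard division by zero
    }
    found = set()
    for a, b, c in permutations(numbers):
        for s1, f1 in ops.items():
            mid = f1(a, b)
            if mid is None:
                continue
            for s2, f2 in ops.items():
                res = f2(mid, c)  # left-to-right: (a s1 b) s2 c
                if res is not None and abs(res - target) < 1e-9:
                    found.add(f"{a} {s1} {b} {s2} {c}")
    return found

solutions = find_expressions([99, 57, 67], 109)
```

Both orderings the trace considers, `99 + 67 - 57` and `67 - 57 + 99`, appear among the results.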
label_ids:
[-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 
2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 
16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 
382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 
2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 
11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
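The prompt in this example instructs the model to wrap its solution in `<answer>` tags. Checking such a completion can be sketched as a small verifier (illustrative only — `check_answer` is a hypothetical helper, not pipeline code): extract the tagged block, confirm each listed number is used exactly once, and evaluate the expression.

```python
# Hedged sketch: verify a countdown-style completion like the one logged above.
import re

def check_answer(completion, numbers, target):
    m = re.search(r"<answer>\s*(.*?)\s*</answer>", completion, re.DOTALL)
    if not m:
        return False
    expr = m.group(1).split("=")[0]              # keep the expression side
    used = sorted(int(n) for n in re.findall(r"\d+", expr))
    if used != sorted(numbers):                  # each number used exactly once
        return False
    return eval(expr) == target                  # arithmetic-only expression

ok = check_answer("<answer>\n99 + 67 - 57 = 109\n</answer>", [99, 57, 67], 109)
```

Using `eval` here is only acceptable in a sketch where the expression is known to contain digits and operators; a production verifier would parse the expression instead.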
Running tokenizer on dataset: 100%|██████████| 3998/3998 [00:00<?, ? examples/s]
Running tokenizer on dataset: 7996 examples [00:17, 224.04 examples/s]
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, 11, 3465, 553, 54364, 14817, 13, 1446, 525, 264, 10950, 17847, 13, 151645, 198, 151644, 872, 198, 16141, 279, 2701, 3491, 13, 81917, 697, 32711, 3019, 553, 3019, 13, 3197, 498, 525, 8060, 11, 2968, 697, 4226, 304, 419, 3561, 25, 366, 9217, 2235, 21732, 4226, 12533, 9217, 94367, 2, 22079, 198, 16429, 279, 5109, 304, 279, 1140, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 1855, 458, 23606, 429, 16819, 220, 16, 15, 24, 13, 1446, 646, 990, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 8, 323, 1817, 1372, 646, 1172, 387, 1483, 3055, 13, 4615, 6291, 1265, 2924, 264, 4013, 315, 7354, 330, 8304, 1599, 2974, 1380, 1817, 3019, 374, 264, 35972, 5666, 323, 279, 1590, 3019, 13653, 11508, 311, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 382, 35127, 697, 4226, 304, 279, 2701, 3561, 510, 27, 9217, 397, 7021, 413, 4226, 340, 522, 9217, 1339, 9064, 11993, 21732, 4226, 9940, 374, 279, 1140, 315, 7354, 311, 5545, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 13, 4710, 2461, 3110, 510, 2679, 279, 1140, 315, 5109, 572, 508, 16, 11, 220, 17, 11, 220, 18, 60, 323, 279, 2169, 572, 220, 16, 11, 498, 1410, 3270, 510, 27, 9217, 397, 8304, 220, 16, 25, 220, 16, 488, 220, 17, 284, 220, 18, 198, 8304, 220, 17, 25, 220, 18, 608, 220, 18, 284, 220, 16, 198, 522, 9217, 1339, 269, 4710, 27, 9217, 397, 7, 16, 488, 220, 17, 8, 608, 220, 18, 198, 522, 9217, 1339, 10061, 594, 1744, 3019, 553, 3019, 13, 151645, 198, 151644, 77091, 198, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 
3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 
[input_ids token array elided; the decoded text appears below under "inputs:"]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left, and 156 - 67 = 89, which is less than 109. Too small, so that doesn't work.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. Still, that should be fine, since the example also uses intermediate results.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then subtract 57. Check: 166 - 50 = 116, and 116 - 7 = 109. Yes! So 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly. In the first method, I had 67 - 57 first, then added the result to 99; here, it's 99 + 67 first, then subtract 57. Both are valid.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation; the problem allows either "a series of steps" or "a single equation", so I can choose.
Division doesn't seem necessary: 99 is almost 100, and 99 + 10 = 109 with the 10 coming from 67 - 57, so addition and subtraction suffice.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, and 10 is not one of the original numbers; it's an intermediate result. Still, that should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
The problem says "create an equation", but it allows either steps or a single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, 67 - 57 + 99 evaluates to 10 + 99 = 109, also correct.
So, in both cases, it works without parentheses because of left-to-right evaluation.
I can write it as 99 + 67 - 57 = 109, or as 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural, since 99 comes first in the list [99, 57, 67], though the order of use isn't specified and doesn't matter for the equation.
Now, to present.
I'll write the single equation, enclosed in <answer> tags as the format requires: <answer>99 + 67 - 57 = 109</answer>.
The problem asks for an expression that equals 109, and this one checks out: 99 + 67 is 166, minus 57 is 109, and all three numbers are used exactly once.
Parentheses would be more explicit but aren't necessary, and since the problem accepts either a series of steps or a single equation, the single equation is the concise choice.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, which is close to 109, and 114 - 5 = 109, but I have 99 and 67, not a 5. And I don't have a 2 either; each number can be used only once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116.3, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.4, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
Writing it with the numbers in list order isn't necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
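The case analysis in the trace above (trying sums, differences, products, and quotients of [99, 57, 67]) can be automated. The following brute-force sketch is not part of the training sample; it enumerates both groupings of two binary operations over all permutations of the three numbers and confirms that 99 + 67 - 57 lands on 109. `eval` is used only for brevity on these generated arithmetic strings.

```python
from itertools import permutations, product

def find_expressions(nums, target, eps=1e-9):
    """Brute-force all ways to combine three numbers with +, -, *, /
    (each number used exactly once) into the target value."""
    hits = set()
    for a, b, c in permutations(nums):
        for o1, o2 in product("+-*/", repeat=2):
            # Two possible groupings of the two binary operations.
            for expr in (f"(({a} {o1} {b}) {o2} {c})",
                         f"({a} {o1} ({b} {o2} {c}))"):
                try:
                    if abs(eval(expr) - target) < eps:
                        hits.add(expr)
                except ZeroDivisionError:
                    continue
    return hits

solutions = find_expressions([99, 57, 67], 109)
```

On this instance the search finds only additive/subtractive solutions, matching the trace's conclusion that multiplication and division don't reach 109.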
label_ids:
[a run of -100 entries masking the prompt tokens, followed by the token IDs of the assistant response; array truncated in this log]
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
labels:
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
Another combination: 99 + 67 = 166, then minus 57 = 166 - 57 = 109? 166 - 57.
166 - 50 = 116, minus 7 more is 109? 116 - 7 = 109, yes! 166 - 57 = 109.
166 - 57: 166 - 50 = 116, 116 - 7 = 109, yes.
And 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109, 114 - 5, but no 5.
114 - 5, but I have 99 and 67.
114 - 5, but 5 not available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 ≈ 116. something, not 109.
Not good.
67 / 57 ≈ 1.175, times 99 ≈ 116.325, not 109.
No.
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
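As an aside on the sample above: the two-step combination search the trace works through by hand can be sketched as a small brute force over left-to-right expressions. This is an illustration only, not part of the LLaMAFactory output; `solve` and `OPS` are names invented here, and only the `((a op b) op c)` shape discussed in the trace is enumerated.

```python
from itertools import permutations, product
import operator

# Map operator symbols to functions; division is true division.
OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def solve(numbers, target):
    """Return every left-to-right expression ((a op b) op c) over a
    permutation of `numbers` that evaluates to `target`, with each
    number used exactly once."""
    solutions = []
    for a, b, c in permutations(numbers):
        for op1, op2 in product(OPS, repeat=2):
            try:
                value = OPS[op2](OPS[op1](a, b), c)
            except ZeroDivisionError:
                continue  # skip divisions by an intermediate zero
            if abs(value - target) < 1e-9:
                solutions.append(f"({a} {op1} {b}) {op2} {c}")
    return solutions

print(solve([99, 57, 67], 109))
```

Besides the two expressions the trace finds, `(99 + 67) - 57` and `(67 - 57) + 99`, this search also surfaces `(99 - 57) + 67`, which the trace never considers.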
Running tokenizer on dataset: 7996 examples [00:18, 221.43 examples/s]
training example:
input_ids:
[151644, 8948, 198, 2610, 525, 1207, 16948, 11, 3465, 553, 54364, 14817, 13, 1446, 525, 264, 10950, 17847, 13, 151645, 198, 151644, 872, 198, 16141, 279, 2701, 3491, 13, 81917, 697, 32711, 3019, 553, 3019, 13, 3197, 498, 525, 8060, 11, 2968, 697, 4226, 304, 419, 3561, 25, 366, 9217, 2235, 21732, 4226, 12533, 9217, 94367, 2, 22079, 198, 16429, 279, 5109, 304, 279, 1140, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 1855, 458, 23606, 429, 16819, 220, 16, 15, 24, 13, 1446, 646, 990, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 8, 323, 1817, 1372, 646, 1172, 387, 1483, 3055, 13, 4615, 6291, 1265, 2924, 264, 4013, 315, 7354, 330, 8304, 1599, 2974, 1380, 1817, 3019, 374, 264, 35972, 5666, 323, 279, 1590, 3019, 13653, 11508, 311, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 382, 35127, 697, 4226, 304, 279, 2701, 3561, 510, 27, 9217, 397, 7021, 413, 4226, 340, 522, 9217, 1339, 9064, 11993, 21732, 4226, 9940, 374, 279, 1140, 315, 7354, 311, 5545, 279, 2169, 1372, 476, 432, 1265, 387, 264, 3175, 23606, 429, 3059, 304, 279, 2169, 13, 4710, 2461, 3110, 510, 2679, 279, 1140, 315, 5109, 572, 508, 16, 11, 220, 17, 11, 220, 18, 60, 323, 279, 2169, 572, 220, 16, 11, 498, 1410, 3270, 510, 27, 9217, 397, 8304, 220, 16, 25, 220, 16, 488, 220, 17, 284, 220, 18, 198, 8304, 220, 17, 25, 220, 18, 608, 220, 18, 284, 220, 16, 198, 522, 9217, 1339, 269, 4710, 27, 9217, 397, 7, 16, 488, 220, 17, 8, 608, 220, 18, 198, 522, 9217, 1339, 10061, 594, 1744, 3019, 553, 3019, 13, 151645, 198, 151644, 77091, 198, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 
3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 
15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 
28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 
1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 
382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 
369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 
10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 
16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
inputs:
<|im_start|>system
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>
<|im_start|>user
Answer the following problem. Explain your reasoning step by step. When you are finished, give your answer in this format: <answer>(your answer)</answer>.
# Problem
Using the numbers in the list [99, 57, 67], create an equation that equals 109. You can use basic arithmetic operations (+, -, *, /) and each number can only be used once. Your solution should include a series of steps "Step X:" where each step is a mathematical operation and the final step ultimately leads to the target number or it should be a single equation that results in the target.
Give your answer in the following format:
<answer>
(your answer)
</answer>
Where "(your answer)" is the list of steps to reach the target number or it should be a single equation that results in the target.
For example:
If the list of numbers was [1, 2, 3] and the target was 1, you could write:
<answer>
Step 1: 1 + 2 = 3
Step 2: 3 / 3 = 1
</answer>
or
<answer>
(1 + 2) / 3
</answer>
Let's think step by step.<|im_end|>
<|im_start|>assistant
<think>
I have the numbers [99, 57, 67] and need to make 109 using basic arithmetic operations: addition, subtraction, multiplication, and division. Each number can only be used once.
First, I should consider what operations might work. Since 109 is a bit larger than the numbers, multiplication might be involved, but 99 and 67 are both large, and 57 is smaller. Let me see the differences or sums.
If I add 99 and 57, that's 99 + 57 = 156, which is bigger than 109. Then I have 67 left. If I subtract 67 from 156, that's 156 - 67 = 89, which is less than 109. 89 is too small. If I do 156 - 67 = 89, not 109.
Maybe I can multiply. 99 times something. 99 * 1 = 99, too small. 99 * 2 = 198, too big. But I have to use the numbers given.
Perhaps division. 99 divided by something. 99 / 1 = 99, not helpful. 99 / 0.9 or something, but I have to use the numbers.
Another idea: maybe I can do 67 + 57 = 124, then 124 - 15, but I don't have 15, I have 99. 124 - 99 = 25, not 109.
Not working. Let's list the numbers: 99, 57, 67. Target 109.
Perhaps I need to use subtraction or division to get smaller numbers.
Maybe (99 * something) + something.
99 * 1 = 99, then I need +10, but I have 57 and 67, both larger than 10.
67 - 57 = 10, then 99 + 10 = 109. Oh! That could work.
So, first, 67 - 57 = 10, then 99 + 10 = 109.
But in terms of steps, I need to use the numbers only once, and in this case, I used 67 and 57 to make 10, then added to 99.
So, Step 1: 67 - 57 = 10
Step 2: 99 + 10 = 109
But 10 is not one of the original numbers; it's derived. But in the context, it should be fine, as per the example.
In the example with [1,2,3] to make 1, they did (1+2)/3 or steps like 1+2=3 then 3/3=1.
Similarly here, I can have steps.
But let me confirm: using 67, 57, and 99.
67 - 57 = 10, then 99 + 10 = 109. Yes.
I could write it as an equation: 99 + (67 - 57) = 99 + 10 = 109.
That seems correct.
But is there another way? Maybe with multiplication or division.
For example, 99 * 67 is huge, like 6633, divided by 57 or something. 6633 / 57. Let me calculate: 57 * 116 = 57*100=5700, 57*16=912, total 5700+912=6612, 6633-6612=21, so 116 + 21/57, not integer, and not 109.
Not good.
57 * 67 = ? 50*67=3350, 7*67=469, total 3350+469=3819, then divided by 99? 3819 / 99. 99*38=3762, 3819-3762=57, so 38 + 57/99, not 109.
Not working.
99 * 57 = 5643, divided by 67? 67*84=67*80=5360, 67*4=268, total 5360+268=5628, 5643-5628=15, so 84 + 15/67, not integer.
Not 109.
So, addition and subtraction seem better.
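The pairwise products checked above can also be confirmed exhaustively. As a quick sanity check (a minimal sketch, not part of the original trace; `solve` is a hypothetical helper), a brute-force search over all orderings and operator pairs shows that only addition/subtraction combinations reach 109:

```python
from itertools import permutations
import operator

# Map operator symbols to functions for the search.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def solve(numbers, target):
    """Try every ordering of the numbers with every operator pair,
    in both association orders, and collect expressions that hit target."""
    hits = set()
    for a, b, c in permutations(numbers):
        for s1, f1 in OPS.items():
            for s2, f2 in OPS.items():
                try:
                    if f2(f1(a, b), c) == target:      # (a s1 b) s2 c
                        hits.add(f"({a} {s1} {b}) {s2} {c}")
                except ZeroDivisionError:
                    pass
                try:
                    if f1(a, f2(b, c)) == target:      # a s1 (b s2 c)
                        hits.add(f"{a} {s1} ({b} {s2} {c})")
                except ZeroDivisionError:
                    pass
    return hits

print(solve([99, 57, 67], 109))
```

Every solution the search finds uses only + and -, matching the conclusion that multiplication and division paths miss the target.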
Another combination: 99 + 67 = 166, then subtract 57. Checking: 166 - 50 = 116, and 116 - 7 = 109, yes! So 99 + 67 = 166, then 166 - 57 = 109.
So, Step 1: 99 + 67 = 166
Step 2: 166 - 57 = 109
That also works.
In this case, I used all three numbers directly.
Similarly, in the first method, I used subtraction first.
Both are valid.
But in the first method, I had 67 - 57 first, then add to 99.
Here, 99 + 67 first, then subtract 57.
Now, which one to present? I think either is fine.
But let me see the target: 109.
In the second method, it's straightforward: (99 + 67) - 57 = 166 - 57 = 109.
In the first: 99 + (67 - 57) = 99 + 10 = 109.
Both are correct.
Perhaps I should write it as a single equation.
The problem says: "a series of steps" or "a single equation".
I can do either.
In the answer format, I can choose.
But let me see if there's division or something, but not necessary.
Another way: 109 is close to 100, but 99 is almost 100.
99 + 10 = 109, and 10 from 67-57.
Or directly.
I think it's fine.
Now, for the answer, I need to put it in the format.
Also, in the steps, I need to specify the operations.
In the first step-by-step: Step 1: 67 - 57 = 10, Step 2: 99 + 10 = 109.
But in Step 2, I'm adding 99 and 10, but 10 is not a number, it's the result. But in the context, it should be acceptable, as per the example.
Similarly, in the second: Step 1: 99 + 67 = 166, Step 2: 166 - 57 = 109.
Here, 166 is intermediate.
I could write the equation directly.
Since the problem says "create an equation", but it allows steps or single equation.
I think I'll go with the single equation for simplicity.
So, (99 + 67) - 57 = 109
Or 99 + (67 - 57) = 109
Either way.
Perhaps without parentheses, but order matters.
99 + 67 - 57.
Now, addition and subtraction have same precedence, left to right.
So, 99 + 67 = 166, then 166 - 57 = 109, which is correct.
Similarly, if I do 67 - 57 + 99, that's 10 + 99 = 109, also correct.
So, in both cases, without parentheses, it works because of left to right evaluation.
67 - 57 + 99: first 67 - 57 = 10, then 10 + 99 = 109.
99 + 67 - 57: 99 + 67 = 166, 166 - 57 = 109.
Both fine.
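Both orderings can be confirmed directly; in most programming languages (Python shown here as an illustration), + and - share precedence and associate left to right, so no parentheses are needed:

```python
# + and - share precedence and evaluate left to right,
# so both unparenthesized orderings already equal 109.
assert 99 + 67 - 57 == (99 + 67) - 57 == 109
assert 67 - 57 + 99 == (67 - 57) + 99 == 109
print(99 + 67 - 57)  # 109
```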
So, I can write it as 99 + 67 - 57 = 109.
Or 67 - 57 + 99 = 109.
I think 99 + 67 - 57 is more natural.
Since 99 is first in the list, but not necessary.
In the list, it's [99, 57, 67], but order doesn't matter for the equation.
In the steps, I might need to use them in order, but not specified.
I think for the equation, it's fine.
Now, to present.
I think I'll write the single equation.
But in the answer format, it should be enclosed in <answer> tags.
Also, for the equation, I need to show it equals 109.
Perhaps I can write: 99 + 67 - 57 = 109
But in the context, since it's the solution, it should be clear.
Another thing: in the problem, it says "create an equation that equals 109", so I should have an expression equal to 109.
So, (99 + 67) - 57 or something.
But as I said, without parentheses it works.
To be precise, I can use parentheses.
But not necessary.
I think it's fine.
Perhaps I should write the steps to be consistent with the example.
In the example, they had two options.
I think I'll do both for clarity, but the problem says "give your answer" in the format, so one is enough.
I'll go with the equation.
So, <answer>99 + 67 - 57 = 109</answer>
But in the equation, I need to use the numbers, and it should evaluate to 109.
99 + 67 is 166, minus 57 is 109, yes.
To write it as an expression: 99 + 67 - 57
But I need to show it equals 109.
In the answer, I can write the equation.
Similarly, in the step-by-step.
I think either is fine.
But let me see the problem: "your solution should include a series of steps... or it should be a single equation"
I can choose.
I think the single equation is concise.
So, I'll do that.
But in the list, 57 is second, but in the equation, I have 99 + 67 - 57, which uses all.
Another combination: what if I use division?
For example, 109 = 100 + 9, but not helpful.
67 * 2 = 134, too big, minus 25, no.
57 * 2 = 114, close to 109; 114 - 5 would work, but I have 99 and 67, so no 5 is available.
(57 * 2) - something, but I don't have 2, I have to use the numbers once, so I can't multiply by 2 unless I derive it.
For example, I could do division to get fractions.
Suppose I do 99 / 57 or something.
99 / 57 = 33/19, approximately 1.736, not integer.
Then multiply by 67? 1.736 * 67 β 116.3, not 109.
Not good.
67 / 57 β 1.175, times 99 β 116.4, not 109.
No.
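The two division attempts can be double-checked with exact rational arithmetic (a small sketch using Python's `fractions` module, not part of the original trace); both routes are the same exact value, 2211/19, which is about 116.37 rather than 109:

```python
from fractions import Fraction

# Both division-first candidates reduce to the same exact value: 99*67/57.
a = Fraction(99, 57) * 67   # the "99 / 57, then times 67" route
b = Fraction(67, 57) * 99   # the "67 / 57, then times 99" route
assert a == b == Fraction(2211, 19)
assert a != 109
print(float(a))  # about 116.37, not 109
```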
So, addition and subtraction are the way to go.
Therefore, 99 + 67 - 57 = 109 is correct.
To write it with the numbers in order: but not necessary.
I think I'm good.
</think>
To solve the problem of creating an equation equal to 109 using the numbers [99, 57, 67] with basic arithmetic operations (+, -, *, /), each used once, I considered possible combinations. Since 109 is close to the sum of 99 and 67 (which is 166), subtracting 57 from this sum results in 166 - 57 = 109. This approach uses addition and subtraction, which are allowed operations.
The equation can be written as:
\[
99 + 67 - 57 = 109
\]
This is valid because:
- Addition and subtraction are performed left to right: first, \(99 + 67 = 166\), then \(166 - 57 = 109\).
- All numbers from the list are used exactly once.
Alternatively, the equation can be expressed with parentheses for clarity, but they are not necessary due to left-to-right evaluation for operations of the same precedence:
\[
(99 + 67) - 57 = 109
\]
Both forms yield the target number 109.
<answer>
99 + 67 - 57 = 109
</answer><|im_end|>
label_ids:
[-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 13708, 766, 397, 40, 614, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 323, 1184, 311, 1281, 220, 16, 15, 24, 1667, 6770, 34784, 7525, 25, 5256, 11, 75240, 11, 46444, 11, 323, 12804, 13, 8886, 1372, 646, 1172, 387, 1483, 3055, 382, 5338, 11, 358, 1265, 
2908, 1128, 7525, 2578, 975, 13, 8704, 220, 16, 15, 24, 374, 264, 2699, 8131, 1091, 279, 5109, 11, 46444, 2578, 387, 6398, 11, 714, 220, 24, 24, 323, 220, 21, 22, 525, 2176, 3460, 11, 323, 220, 20, 22, 374, 9155, 13, 6771, 752, 1490, 279, 11799, 476, 36398, 382, 2679, 358, 912, 220, 24, 24, 323, 220, 20, 22, 11, 429, 594, 220, 24, 24, 488, 220, 20, 22, 284, 220, 16, 20, 21, 11, 892, 374, 11243, 1091, 220, 16, 15, 24, 13, 5005, 358, 614, 220, 21, 22, 2115, 13, 1416, 358, 32256, 220, 21, 22, 504, 220, 16, 20, 21, 11, 429, 594, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 892, 374, 2686, 1091, 220, 16, 15, 24, 13, 220, 23, 24, 374, 2238, 2613, 13, 1416, 358, 653, 220, 16, 20, 21, 481, 220, 21, 22, 284, 220, 23, 24, 11, 537, 220, 16, 15, 24, 382, 21390, 358, 646, 30270, 13, 220, 24, 24, 3039, 2494, 13, 220, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 2238, 2613, 13, 220, 24, 24, 353, 220, 17, 284, 220, 16, 24, 23, 11, 2238, 2409, 13, 1988, 358, 614, 311, 990, 279, 5109, 2661, 382, 31476, 12804, 13, 220, 24, 24, 17779, 553, 2494, 13, 220, 24, 24, 608, 220, 16, 284, 220, 24, 24, 11, 537, 10950, 13, 220, 24, 24, 608, 220, 15, 13, 24, 476, 2494, 11, 714, 358, 614, 311, 990, 279, 5109, 382, 14037, 4522, 25, 7196, 358, 646, 653, 220, 21, 22, 488, 220, 20, 22, 284, 220, 16, 17, 19, 11, 1221, 220, 16, 17, 19, 481, 220, 16, 20, 11, 714, 358, 1513, 944, 614, 220, 16, 20, 11, 358, 614, 220, 24, 24, 13, 220, 16, 17, 19, 481, 220, 24, 24, 284, 220, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 13, 6771, 594, 1140, 279, 5109, 25, 220, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 13, 13483, 220, 16, 15, 24, 382, 31476, 358, 1184, 311, 990, 75240, 476, 12804, 311, 633, 9155, 5109, 382, 21390, 320, 24, 24, 353, 2494, 8, 488, 2494, 382, 24, 24, 353, 220, 16, 284, 220, 24, 24, 11, 1221, 358, 1184, 488, 16, 15, 11, 714, 358, 614, 220, 20, 22, 323, 220, 21, 22, 11, 2176, 8131, 1091, 220, 16, 15, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 
16, 15, 284, 220, 16, 15, 24, 13, 8670, 0, 2938, 1410, 975, 382, 4416, 11, 1156, 11, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 3793, 315, 7354, 11, 358, 1184, 311, 990, 279, 5109, 1172, 3055, 11, 323, 304, 419, 1142, 11, 358, 1483, 220, 21, 22, 323, 220, 20, 22, 311, 1281, 220, 16, 15, 11, 1221, 3694, 311, 220, 24, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 271, 8304, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 271, 3983, 220, 16, 15, 374, 537, 825, 315, 279, 4024, 5109, 26, 432, 594, 14257, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 6915, 11, 438, 817, 279, 3110, 382, 641, 279, 3110, 448, 508, 16, 11, 17, 11, 18, 60, 311, 1281, 220, 16, 11, 807, 1521, 320, 16, 10, 17, 5620, 18, 476, 7354, 1075, 220, 16, 10, 17, 28, 18, 1221, 220, 18, 14, 18, 28, 16, 382, 67691, 1588, 11, 358, 646, 614, 7354, 382, 3983, 1077, 752, 7683, 25, 1667, 220, 21, 22, 11, 220, 20, 22, 11, 323, 220, 24, 24, 382, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 13, 7414, 382, 40, 1410, 3270, 432, 438, 458, 23606, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 4792, 4977, 4396, 382, 3983, 374, 1052, 2441, 1616, 30, 10696, 448, 46444, 476, 12804, 382, 2461, 3110, 11, 220, 24, 24, 353, 220, 21, 22, 374, 6765, 11, 1075, 220, 21, 21, 18, 18, 11, 17779, 553, 220, 20, 22, 476, 2494, 13, 220, 21, 21, 18, 18, 608, 220, 20, 22, 13, 6771, 752, 11047, 25, 220, 20, 22, 353, 220, 16, 16, 21, 284, 220, 20, 22, 9, 16, 15, 15, 28, 20, 22, 15, 15, 11, 220, 20, 22, 9, 16, 21, 28, 24, 16, 17, 11, 2790, 220, 20, 22, 15, 15, 10, 24, 16, 17, 28, 21, 21, 16, 17, 11, 220, 21, 21, 18, 18, 12, 21, 21, 16, 17, 28, 17, 16, 11, 773, 220, 16, 16, 21, 488, 220, 17, 16, 14, 20, 22, 11, 537, 7546, 11, 323, 537, 220, 16, 15, 24, 382, 2623, 1661, 
382, 20, 22, 353, 220, 21, 22, 284, 937, 220, 20, 15, 9, 21, 22, 28, 18, 18, 20, 15, 11, 220, 22, 9, 21, 22, 28, 19, 21, 24, 11, 2790, 220, 18, 18, 20, 15, 10, 19, 21, 24, 28, 18, 23, 16, 24, 11, 1221, 17779, 553, 220, 24, 24, 30, 220, 18, 23, 16, 24, 608, 220, 24, 24, 13, 220, 24, 24, 9, 18, 23, 28, 18, 22, 21, 17, 11, 220, 18, 23, 16, 24, 12, 18, 22, 21, 17, 28, 20, 22, 11, 773, 220, 18, 23, 488, 220, 20, 22, 14, 24, 24, 11, 537, 220, 16, 15, 24, 382, 2623, 3238, 382, 24, 24, 353, 220, 20, 22, 284, 220, 20, 21, 19, 18, 11, 17779, 553, 220, 21, 22, 30, 220, 21, 22, 9, 23, 19, 28, 21, 22, 9, 23, 15, 28, 20, 18, 21, 15, 11, 220, 21, 22, 9, 19, 28, 17, 21, 23, 11, 2790, 220, 20, 18, 21, 15, 10, 17, 21, 23, 28, 20, 21, 17, 23, 11, 220, 20, 21, 19, 18, 12, 20, 21, 17, 23, 28, 16, 20, 11, 773, 220, 23, 19, 488, 220, 16, 20, 14, 21, 22, 11, 537, 7546, 382, 2623, 220, 16, 15, 24, 382, 4416, 11, 5256, 323, 75240, 2803, 2664, 382, 14037, 10601, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 27283, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 30, 220, 16, 21, 21, 481, 220, 20, 22, 382, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 27283, 220, 22, 803, 374, 220, 16, 15, 24, 30, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 0, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 16, 21, 21, 481, 220, 20, 22, 25, 220, 16, 21, 21, 481, 220, 20, 15, 284, 220, 16, 16, 21, 11, 220, 16, 16, 21, 481, 220, 22, 284, 220, 16, 15, 24, 11, 9834, 382, 3036, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 4416, 11, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 271, 8304, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 4792, 1083, 4278, 382, 641, 419, 1142, 11, 358, 1483, 678, 2326, 5109, 5961, 382, 67691, 11, 304, 279, 1156, 1714, 11, 358, 1483, 75240, 1156, 382, 20629, 525, 
2697, 382, 3983, 304, 279, 1156, 1714, 11, 358, 1030, 220, 21, 22, 481, 220, 20, 22, 1156, 11, 1221, 912, 311, 220, 24, 24, 382, 8420, 11, 220, 24, 24, 488, 220, 21, 22, 1156, 11, 1221, 32256, 220, 20, 22, 382, 7039, 11, 892, 825, 311, 3042, 30, 358, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 2169, 25, 220, 16, 15, 24, 382, 641, 279, 2086, 1714, 11, 432, 594, 30339, 25, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 641, 279, 1156, 25, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 20629, 525, 4396, 382, 31476, 358, 1265, 3270, 432, 438, 264, 3175, 23606, 382, 785, 3491, 2727, 25, 330, 64, 4013, 315, 7354, 1, 476, 330, 64, 3175, 23606, 11436, 40, 646, 653, 2987, 382, 641, 279, 4226, 3561, 11, 358, 646, 5157, 382, 3983, 1077, 752, 1490, 421, 1052, 594, 12804, 476, 2494, 11, 714, 537, 5871, 382, 14037, 1616, 25, 220, 16, 15, 24, 374, 3265, 311, 220, 16, 15, 15, 11, 714, 220, 24, 24, 374, 4558, 220, 16, 15, 15, 382, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 11, 323, 220, 16, 15, 504, 220, 21, 22, 12, 20, 22, 382, 2195, 5961, 382, 40, 1744, 432, 594, 6915, 382, 7039, 11, 369, 279, 4226, 11, 358, 1184, 311, 2182, 432, 304, 279, 3561, 382, 13394, 11, 304, 279, 7354, 11, 358, 1184, 311, 13837, 279, 7525, 382, 641, 279, 1156, 3019, 14319, 29208, 25, 14822, 220, 16, 25, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 14822, 220, 17, 25, 220, 24, 24, 488, 220, 16, 15, 284, 220, 16, 15, 24, 382, 3983, 304, 14822, 220, 17, 11, 358, 2776, 7842, 220, 24, 24, 323, 220, 16, 15, 11, 714, 220, 16, 15, 374, 537, 264, 1372, 11, 432, 594, 279, 1102, 13, 1988, 304, 279, 2266, 11, 432, 1265, 387, 21555, 11, 438, 817, 279, 3110, 382, 67691, 11, 304, 279, 2086, 25, 14822, 220, 16, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 14822, 220, 17, 25, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 8420, 
11, 220, 16, 21, 21, 374, 28439, 382, 40, 1410, 3270, 279, 23606, 5961, 382, 12549, 279, 3491, 2727, 330, 3182, 458, 23606, 497, 714, 432, 6147, 7354, 476, 3175, 23606, 382, 40, 1744, 358, 3278, 728, 448, 279, 3175, 23606, 369, 38975, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 2195, 220, 24, 24, 488, 320, 21, 22, 481, 220, 20, 22, 8, 284, 220, 16, 15, 24, 271, 49244, 1616, 382, 31476, 2041, 73975, 11, 714, 1973, 12850, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 382, 7039, 11, 5256, 323, 75240, 614, 1852, 53056, 11, 2115, 311, 1290, 382, 4416, 11, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 1221, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 11, 892, 374, 4396, 382, 67691, 11, 421, 358, 653, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 11, 429, 594, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 11, 1083, 4396, 382, 4416, 11, 304, 2176, 5048, 11, 2041, 73975, 11, 432, 4278, 1576, 315, 2115, 311, 1290, 16460, 382, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 25, 1156, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 11, 1221, 220, 16, 15, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 25, 220, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 11, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 20629, 6915, 382, 4416, 11, 358, 646, 3270, 432, 438, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 382, 2195, 220, 21, 22, 481, 220, 20, 22, 488, 220, 24, 24, 284, 220, 16, 15, 24, 382, 40, 1744, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 374, 803, 5810, 382, 12549, 220, 24, 24, 374, 1156, 304, 279, 1140, 11, 714, 537, 5871, 382, 641, 279, 1140, 11, 432, 594, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 1125, 714, 1973, 3171, 944, 4925, 369, 279, 23606, 382, 641, 279, 7354, 11, 358, 2578, 1184, 311, 990, 1105, 304, 1973, 11, 714, 537, 5189, 382, 40, 1744, 369, 279, 23606, 11, 432, 594, 6915, 382, 7039, 
11, 311, 3042, 382, 40, 1744, 358, 3278, 3270, 279, 3175, 23606, 382, 3983, 304, 279, 4226, 3561, 11, 432, 1265, 387, 43810, 304, 366, 9217, 29, 9492, 382, 13394, 11, 369, 279, 23606, 11, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 31476, 358, 646, 3270, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 271, 3983, 304, 279, 2266, 11, 2474, 432, 594, 279, 6291, 11, 432, 1265, 387, 2797, 382, 14037, 3166, 25, 304, 279, 3491, 11, 432, 2727, 330, 3182, 458, 23606, 429, 16819, 220, 16, 15, 24, 497, 773, 358, 1265, 614, 458, 7493, 6144, 311, 220, 16, 15, 24, 382, 4416, 11, 320, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 476, 2494, 382, 3983, 438, 358, 1053, 11, 2041, 73975, 432, 4278, 382, 1249, 387, 23560, 11, 358, 646, 990, 73975, 382, 3983, 537, 5871, 382, 40, 1744, 432, 594, 6915, 382, 31476, 358, 1265, 3270, 279, 7354, 311, 387, 12966, 448, 279, 3110, 382, 641, 279, 3110, 11, 807, 1030, 1378, 2606, 382, 40, 1744, 358, 3278, 653, 2176, 369, 31273, 11, 714, 279, 3491, 2727, 330, 46430, 697, 4226, 1, 304, 279, 3561, 11, 773, 825, 374, 3322, 382, 40, 3278, 728, 448, 279, 23606, 382, 4416, 11, 366, 9217, 29, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 522, 9217, 1339, 3983, 304, 279, 23606, 11, 358, 1184, 311, 990, 279, 5109, 11, 323, 432, 1265, 15442, 311, 220, 16, 15, 24, 382, 24, 24, 488, 220, 21, 22, 374, 220, 16, 21, 21, 11, 27283, 220, 20, 22, 374, 220, 16, 15, 24, 11, 9834, 382, 1249, 3270, 432, 438, 458, 7493, 25, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 271, 3983, 358, 1184, 311, 1473, 432, 16819, 220, 16, 15, 24, 382, 641, 279, 4226, 11, 358, 646, 3270, 279, 23606, 382, 67691, 11, 304, 279, 3019, 14319, 29208, 382, 40, 1744, 2987, 374, 6915, 382, 3983, 1077, 752, 1490, 279, 3491, 25, 330, 21732, 6291, 1265, 2924, 264, 4013, 315, 7354, 1112, 476, 432, 1265, 387, 264, 3175, 23606, 1837, 40, 646, 5157, 382, 40, 1744, 279, 3175, 23606, 374, 63594, 382, 4416, 11, 358, 3278, 653, 429, 382, 3983, 
304, 279, 1140, 11, 220, 20, 22, 374, 2086, 11, 714, 304, 279, 23606, 11, 358, 614, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 11, 892, 5711, 678, 382, 14037, 10601, 25, 1128, 421, 358, 990, 12804, 1939, 2461, 3110, 11, 220, 16, 15, 24, 284, 220, 16, 15, 15, 488, 220, 24, 11, 714, 537, 10950, 382, 21, 22, 353, 220, 17, 284, 220, 16, 18, 19, 11, 2238, 2409, 11, 27283, 220, 17, 20, 11, 902, 382, 20, 22, 353, 220, 17, 284, 220, 16, 16, 19, 11, 3265, 311, 220, 16, 15, 24, 11, 220, 16, 16, 19, 481, 220, 20, 11, 714, 902, 220, 20, 382, 16, 16, 19, 481, 220, 20, 11, 714, 358, 614, 220, 24, 24, 323, 220, 21, 22, 382, 16, 16, 19, 481, 220, 20, 11, 714, 220, 20, 537, 2500, 382, 7, 20, 22, 353, 220, 17, 8, 481, 2494, 11, 714, 358, 1513, 944, 614, 220, 17, 11, 358, 614, 311, 990, 279, 5109, 3055, 11, 773, 358, 646, 944, 30270, 553, 220, 17, 7241, 358, 42430, 432, 382, 2461, 3110, 11, 358, 1410, 653, 12804, 311, 633, 64895, 382, 10048, 2900, 358, 653, 220, 24, 24, 608, 220, 20, 22, 476, 2494, 382, 24, 24, 608, 220, 20, 22, 284, 220, 18, 18, 14, 16, 24, 11, 13187, 220, 16, 13, 22, 18, 21, 11, 537, 7546, 382, 12209, 30270, 553, 220, 21, 22, 30, 220, 16, 13, 22, 18, 21, 353, 220, 21, 22, 21103, 230, 220, 16, 16, 21, 13, 2494, 11, 537, 220, 16, 15, 24, 382, 2623, 1661, 382, 21, 22, 608, 220, 20, 22, 21103, 230, 220, 16, 13, 16, 22, 20, 11, 3039, 220, 24, 24, 21103, 230, 220, 16, 16, 21, 13, 18, 17, 20, 11, 537, 220, 16, 15, 24, 382, 2753, 382, 4416, 11, 5256, 323, 75240, 525, 279, 1616, 311, 728, 382, 54815, 11, 220, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 374, 4396, 382, 1249, 3270, 432, 448, 279, 5109, 304, 1973, 25, 714, 537, 5871, 382, 40, 1744, 358, 2776, 1661, 624, 522, 26865, 397, 1249, 11625, 279, 3491, 315, 6825, 458, 23606, 6144, 311, 220, 16, 15, 24, 1667, 279, 5109, 508, 24, 24, 11, 220, 20, 22, 11, 220, 21, 22, 60, 448, 6770, 34784, 7525, 17973, 11, 85922, 11777, 608, 701, 1817, 1483, 3055, 11, 358, 6509, 3204, 27459, 13, 8704, 220, 16, 
15, 24, 374, 3265, 311, 279, 2629, 315, 220, 24, 24, 323, 220, 21, 22, 320, 8206, 374, 220, 16, 21, 21, 701, 32256, 287, 220, 20, 22, 504, 419, 2629, 3059, 304, 220, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 13, 1096, 5486, 5711, 5256, 323, 75240, 11, 892, 525, 5420, 7525, 382, 785, 23606, 646, 387, 5326, 438, 510, 59, 9640, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 1986, 374, 2697, 1576, 510, 12, 78646, 323, 75240, 525, 10660, 2115, 311, 1290, 25, 1156, 11, 17767, 24, 24, 488, 220, 21, 22, 284, 220, 16, 21, 21, 59, 701, 1221, 17767, 16, 21, 21, 481, 220, 20, 22, 284, 220, 16, 15, 24, 59, 4292, 12, 2009, 5109, 504, 279, 1140, 525, 1483, 6896, 3055, 382, 92014, 11, 279, 23606, 646, 387, 13302, 448, 73975, 369, 31273, 11, 714, 807, 525, 537, 5871, 4152, 311, 2115, 4686, 6701, 16460, 369, 7525, 315, 279, 1852, 53056, 510, 59, 9640, 7, 24, 24, 488, 220, 21, 22, 8, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 59, 2533, 20629, 7586, 7540, 279, 2169, 1372, 220, 16, 15, 24, 382, 27, 9217, 397, 24, 24, 488, 220, 21, 22, 481, 220, 20, 22, 284, 220, 16, 15, 24, 198, 522, 9217, 29, 151645, 198]
labels:
[INFO|configuration_utils.py:698] 2025-09-23 23:23:56,719 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 23:23:56,719 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|2025-09-23 23:23:56] llamafactory.model.model_utils.kv_cache:143 >> KV cache is disabled during training.
[INFO|configuration_utils.py:770] 2025-09-23 23:23:56,719 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|2025-09-23 23:23:56] llamafactory.model.model_utils.kv_cache:143 >> KV cache is disabled during training.
[INFO|configuration_utils.py:698] 2025-09-23 23:23:56,721 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 23:23:56,722 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|2025-09-23 23:23:56] llamafactory.model.model_utils.kv_cache:143 >> KV cache is disabled during training.
[INFO|configuration_utils.py:698] 2025-09-23 23:23:56,724 >> loading configuration file config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/config.json
[INFO|configuration_utils.py:770] 2025-09-23 23:23:56,724 >> Model config Qwen2Config {
"architectures": [
"Qwen2ForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 1536,
"initializer_range": 0.02,
"intermediate_size": 8960,
"max_position_embeddings": 32768,
"max_window_layers": 21,
"model_type": "qwen2",
"num_attention_heads": 12,
"num_hidden_layers": 28,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.52.3",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
[INFO|2025-09-23 23:23:56] llamafactory.model.model_utils.kv_cache:143 >> KV cache is disabled during training.
[INFO|modeling_utils.py:1150] 2025-09-23 23:23:57,233 >> loading weights file model.safetensors from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/model.safetensors
[INFO|modeling_utils.py:2240] 2025-09-23 23:23:57,234 >> Instantiating Qwen2ForCausalLM model under default dtype torch.bfloat16.
[INFO|configuration_utils.py:1135] 2025-09-23 23:23:57,235 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"eos_token_id": 151645,
"use_cache": false
}
[INFO|modeling_utils.py:1150] 2025-09-23 23:23:57,275 >> loading weights file model.safetensors from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/model.safetensors
[INFO|modeling_utils.py:2240] 2025-09-23 23:23:57,276 >> Instantiating Qwen2ForCausalLM model under default dtype torch.bfloat16.
[INFO|configuration_utils.py:1135] 2025-09-23 23:23:57,277 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"eos_token_id": 151645,
"use_cache": false
}
[INFO|modeling_utils.py:1150] 2025-09-23 23:23:57,277 >> loading weights file model.safetensors from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/model.safetensors
[INFO|modeling_utils.py:2240] 2025-09-23 23:23:57,279 >> Instantiating Qwen2ForCausalLM model under default dtype torch.bfloat16.
[INFO|configuration_utils.py:1135] 2025-09-23 23:23:57,280 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"eos_token_id": 151645,
"use_cache": false
}
[INFO|modeling_utils.py:1150] 2025-09-23 23:23:57,457 >> loading weights file model.safetensors from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/model.safetensors
[INFO|modeling_utils.py:2240] 2025-09-23 23:23:57,458 >> Instantiating Qwen2ForCausalLM model under default dtype torch.bfloat16.
[INFO|configuration_utils.py:1135] 2025-09-23 23:23:57,459 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"eos_token_id": 151645,
"use_cache": false
}
[INFO|modeling_utils.py:5130] 2025-09-23 23:23:57,653 >> All model checkpoint weights were used when initializing Qwen2ForCausalLM.
[INFO|modeling_utils.py:5138] 2025-09-23 23:23:57,653 >> All the weights of Qwen2ForCausalLM were initialized from the model checkpoint at Qwen/Qwen2.5-1.5B-Instruct.
If your task is similar to the task the model of the checkpoint was trained on, you can already use Qwen2ForCausalLM for predictions without further training.
[INFO|modeling_utils.py:5130] 2025-09-23 23:23:57,712 >> All model checkpoint weights were used when initializing Qwen2ForCausalLM.
[INFO|modeling_utils.py:5138] 2025-09-23 23:23:57,712 >> All the weights of Qwen2ForCausalLM were initialized from the model checkpoint at Qwen/Qwen2.5-1.5B-Instruct.
If your task is similar to the task the model of the checkpoint was trained on, you can already use Qwen2ForCausalLM for predictions without further training.
[INFO|modeling_utils.py:5130] 2025-09-23 23:23:57,715 >> All model checkpoint weights were used when initializing Qwen2ForCausalLM.
[INFO|modeling_utils.py:5138] 2025-09-23 23:23:57,715 >> All the weights of Qwen2ForCausalLM were initialized from the model checkpoint at Qwen/Qwen2.5-1.5B-Instruct.
If your task is similar to the task the model of the checkpoint was trained on, you can already use Qwen2ForCausalLM for predictions without further training.
[INFO|configuration_utils.py:1090] 2025-09-23 23:23:57,726 >> loading configuration file generation_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/generation_config.json
[INFO|configuration_utils.py:1135] 2025-09-23 23:23:57,726 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"repetition_penalty": 1.1,
"temperature": 0.7,
"top_k": 20,
"top_p": 0.8
}
[INFO|configuration_utils.py:1090] 2025-09-23 23:23:57,788 >> loading configuration file generation_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/generation_config.json
[INFO|configuration_utils.py:1135] 2025-09-23 23:23:57,788 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"repetition_penalty": 1.1,
"temperature": 0.7,
"top_k": 20,
"top_p": 0.8
}
[INFO|configuration_utils.py:1090] 2025-09-23 23:23:57,792 >> loading configuration file generation_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/generation_config.json
[INFO|configuration_utils.py:1135] 2025-09-23 23:23:57,792 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"repetition_penalty": 1.1,
"temperature": 0.7,
"top_k": 20,
"top_p": 0.8
}
[INFO|2025-09-23 23:23:57] llamafactory.model.model_utils.checkpointing:143 >> Gradient checkpointing enabled.
[INFO|2025-09-23 23:23:57] llamafactory.model.model_utils.attention:143 >> Using torch SDPA for faster training and inference.
[INFO|2025-09-23 23:23:57] llamafactory.model.adapter:143 >> Upcasting trainable params to float32.
[INFO|2025-09-23 23:23:57] llamafactory.model.adapter:143 >> Fine-tuning method: Full
[INFO|2025-09-23 23:23:57] llamafactory.model.loader:143 >> trainable params: 1,543,714,304 || all params: 1,543,714,304 || trainable%: 100.0000
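The "trainable params: 1,543,714,304" figure above can be recomputed from the Qwen2Config values printed earlier in the log. The sketch below assumes the standard Qwen2 layer layout (grouped-query attention with biases on Q/K/V only, SwiGLU MLP, two RMSNorms per layer, tied input/output embeddings):

```python
# Config values as printed in the log for Qwen/Qwen2.5-1.5B-Instruct.
hidden   = 1536    # hidden_size
layers   = 28      # num_hidden_layers
inter    = 8960    # intermediate_size
vocab    = 151936  # vocab_size
heads    = 12      # num_attention_heads
kv_heads = 2       # num_key_value_heads
head_dim = hidden // heads  # 128

embed = vocab * hidden  # tied with lm_head, so counted once
kv_dim = kv_heads * head_dim
attn = (hidden * hidden + hidden        # q_proj weight + bias
        + 2 * (hidden * kv_dim + kv_dim)  # k_proj and v_proj, each with bias
        + hidden * hidden)              # o_proj (no bias in Qwen2)
mlp = 3 * hidden * inter                # gate_proj, up_proj, down_proj
norms = 2 * hidden                      # input and post-attention RMSNorm
per_layer = attn + mlp + norms

total = embed + layers * per_layer + hidden  # + final RMSNorm
print(f"{total:,}")  # 1,543,714,304 -- matches the loader's trainable params
```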
[INFO|trainer.py:756] 2025-09-23 23:23:57,837 >> Using auto half precision backend
[INFO|2025-09-23 23:23:57] llamafactory.model.model_utils.checkpointing:143 >> Gradient checkpointing enabled.
[INFO|2025-09-23 23:23:57] llamafactory.model.model_utils.attention:143 >> Using torch SDPA for faster training and inference.
[INFO|2025-09-23 23:23:57] llamafactory.model.adapter:143 >> Upcasting trainable params to float32.
[INFO|2025-09-23 23:23:57] llamafactory.model.adapter:143 >> Fine-tuning method: Full
[INFO|2025-09-23 23:23:57] llamafactory.model.model_utils.checkpointing:143 >> Gradient checkpointing enabled.
[INFO|2025-09-23 23:23:57] llamafactory.model.model_utils.attention:143 >> Using torch SDPA for faster training and inference.
[INFO|2025-09-23 23:23:57] llamafactory.model.adapter:143 >> Upcasting trainable params to float32.
[INFO|2025-09-23 23:23:57] llamafactory.model.adapter:143 >> Fine-tuning method: Full
[INFO|2025-09-23 23:23:57] llamafactory.model.loader:143 >> trainable params: 1,543,714,304 || all params: 1,543,714,304 || trainable%: 100.0000
[INFO|2025-09-23 23:23:57] llamafactory.model.loader:143 >> trainable params: 1,543,714,304 || all params: 1,543,714,304 || trainable%: 100.0000
[INFO|trainer.py:756] 2025-09-23 23:23:57,891 >> Using auto half precision backend
[INFO|modeling_utils.py:5130] 2025-09-23 23:23:57,895 >> All model checkpoint weights were used when initializing Qwen2ForCausalLM.
[INFO|modeling_utils.py:5138] 2025-09-23 23:23:57,895 >> All the weights of Qwen2ForCausalLM were initialized from the model checkpoint at Qwen/Qwen2.5-1.5B-Instruct.
If your task is similar to the task the model of the checkpoint was trained on, you can already use Qwen2ForCausalLM for predictions without further training.
[INFO|trainer.py:756] 2025-09-23 23:23:57,900 >> Using auto half precision backend
[INFO|configuration_utils.py:1090] 2025-09-23 23:23:57,976 >> loading configuration file generation_config.json from cache at /scratch/10286/georgetsoukalas/hf_cache/hub/models--Qwen--Qwen2.5-1.5B-Instruct/snapshots/989aa7980e4cf806f80c7fef2b1adb7bc71aa306/generation_config.json
[INFO|configuration_utils.py:1135] 2025-09-23 23:23:57,976 >> Generate config GenerationConfig {
"bos_token_id": 151643,
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"repetition_penalty": 1.1,
"temperature": 0.7,
"top_k": 20,
"top_p": 0.8
}
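The generation config just above enables sampling with top_k=20 and top_p=0.8. As a rough illustration of what those two filters do together (a minimal standalone sketch on a toy distribution, not the transformers implementation), candidates are first capped at the k most likely tokens, then further restricted to the smallest prefix whose cumulative probability reaches p, and renormalized:

```python
def filter_top_k_top_p(probs, top_k, top_p):
    """Keep at most top_k tokens; within those, keep the smallest
    most-likely-first prefix whose cumulative probability reaches top_p;
    renormalize the survivors so they sum to 1."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    order = order[:top_k]
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    total = sum(probs[i] for i in kept)
    return {i: probs[i] / total for i in kept}

# Toy 4-token distribution: tokens 0 and 1 already cover p = 0.8,
# so tokens 2 and 3 are filtered out before sampling.
print(filter_top_k_top_p([0.5, 0.3, 0.15, 0.05], top_k=20, top_p=0.8))
```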
[INFO|2025-09-23 23:23:58] llamafactory.model.model_utils.checkpointing:143 >> Gradient checkpointing enabled.
[INFO|2025-09-23 23:23:58] llamafactory.model.model_utils.attention:143 >> Using torch SDPA for faster training and inference.
[INFO|2025-09-23 23:23:58] llamafactory.model.adapter:143 >> Upcasting trainable params to float32.
[INFO|2025-09-23 23:23:58] llamafactory.model.adapter:143 >> Fine-tuning method: Full
[2025-09-23 23:23:58,058] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
[INFO|2025-09-23 23:23:58] llamafactory.model.loader:143 >> trainable params: 1,543,714,304 || all params: 1,543,714,304 || trainable%: 100.0000
[INFO|trainer.py:756] 2025-09-23 23:23:58,083 >> Using auto half precision backend
[WARNING|2025-09-23 23:23:58] llamafactory.train.callbacks:154 >> Previous trainer log in this folder will be deleted.
[2025-09-23 23:23:58,116] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
[2025-09-23 23:23:58,125] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
[2025-09-23 23:23:58,309] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed info: version=0.16.9, git-hash=unknown, git-branch=unknown
[2025-09-23 23:23:58,309] [INFO] [config.py:735:__init__] Config mesh_device None world_size = 4
c619-101:850874:850874 [0] NCCL INFO Comm config Blocking set to 1
c619-101:850874:851072 [0] NCCL INFO Using network IB
c619-112:920946:920946 [0] NCCL INFO Comm config Blocking set to 1
c619-101:850874:851072 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3670125:3670125 [0] NCCL INFO Comm config Blocking set to 1
c619-111:3464458:3464458 [0] NCCL INFO Comm config Blocking set to 1
c619-101:850874:851072 [0] NCCL INFO ncclCommInitRankConfig comm 0x41cf8330 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x3ecf665277c1bda8 - Init START
c619-102:3670125:3670319 [0] NCCL INFO Using network IB
c619-112:920946:921140 [0] NCCL INFO Using network IB
c619-111:3464458:3464653 [0] NCCL INFO Using network IB
c619-102:3670125:3670319 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-112:920946:921140 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-111:3464458:3464653 [0] NCCL INFO DMA-BUF is available on GPU device 0
c619-102:3670125:3670319 [0] NCCL INFO ncclCommInitRankConfig comm 0x4ae2db90 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x3ecf665277c1bda8 - Init START
c619-112:920946:921140 [0] NCCL INFO ncclCommInitRankConfig comm 0x2872c050 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x3ecf665277c1bda8 - Init START
c619-111:3464458:3464653 [0] NCCL INFO ncclCommInitRankConfig comm 0x327fd1d0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x3ecf665277c1bda8 - Init START
c619-101:850874:851072 [0] NCCL INFO Bootstrap timings total 0.000939 (create 0.000023, send 0.000069, recv 0.000322, ring 0.000108, delay 0.000000)
c619-112:920946:921140 [0] NCCL INFO Bootstrap timings total 0.000778 (create 0.000031, send 0.000114, recv 0.000354, ring 0.000137, delay 0.000000)
c619-102:3670125:3670319 [0] NCCL INFO Bootstrap timings total 0.000800 (create 0.000033, send 0.000107, recv 0.000220, ring 0.000299, delay 0.000000)
c619-111:3464458:3464653 [0] NCCL INFO Bootstrap timings total 0.000761 (create 0.000032, send 0.000101, recv 0.000269, ring 0.000181, delay 0.000000)
c619-101:850874:851072 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-112:920946:921140 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-102:3670125:3670319 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-111:3464458:3464653 [0] NCCL INFO Setting affinity for GPU 0 to ff,ffffffff,ffffffff
c619-101:850874:851072 [0] NCCL INFO comm 0x41cf8330 rank 0 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:850874:851072 [0] NCCL INFO Channel 00/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 01/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 02/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 03/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 04/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 05/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 06/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 07/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 08/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 09/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 10/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 11/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 12/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 13/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 14/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 15/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 16/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 17/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 18/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 19/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 20/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 21/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 22/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 23/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 24/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 25/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 26/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 27/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 28/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 29/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 30/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 31/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 32/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 33/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 34/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 35/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 36/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 37/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 38/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 39/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 40/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 41/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 42/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 43/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 44/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 45/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 46/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 47/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 48/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 49/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 50/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 51/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 52/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 53/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 54/64 : 0 1 2 3
c619-102:3670125:3670319 [0] NCCL INFO comm 0x4ae2db90 rank 1 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-101:850874:851072 [0] NCCL INFO Channel 55/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 56/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 57/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 58/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 59/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 60/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 61/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 62/64 : 0 1 2 3
c619-101:850874:851072 [0] NCCL INFO Channel 63/64 : 0 1 2 3
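Every channel in the listing above uses the ring order 0 1 2 3 across the four nodes. The toy simulation below illustrates the communication pattern of a ring all-reduce over such a 4-rank ring (a sketch only; `ring_allreduce` is a hypothetical helper, not NCCL's actual kernel): a reduce-scatter phase passes partial sums around the ring, then an all-gather phase circulates the finished chunks.

```python
def ring_allreduce(values):
    """Simulate ring all-reduce (sum) over n ranks arranged
    0 -> 1 -> ... -> n-1 -> 0, each rank holding n scalar chunks."""
    n = len(values)
    data = [list(v) for v in values]
    # Phase 1: reduce-scatter, n-1 steps. At step s, rank r forwards
    # chunk (r - s) mod n to its ring successor, which accumulates it.
    for step in range(n - 1):
        sends = [(r, (r - step) % n, data[r][(r - step) % n]) for r in range(n)]
        for r, chunk, val in sends:
            data[(r + 1) % n][chunk] += val
    # Phase 2: all-gather, n-1 steps. Each rank forwards the chunk it
    # most recently completed; receivers overwrite instead of adding.
    for step in range(n - 1):
        sends = [(r, (r + 1 - step) % n, data[r][(r + 1 - step) % n]) for r in range(n)]
        for r, chunk, val in sends:
            data[(r + 1) % n][chunk] = val
    return data

ranks = [[1, 2, 3, 4], [10, 20, 30, 40],
         [100, 200, 300, 400], [1000, 2000, 3000, 4000]]
print(ring_allreduce(ranks))
# every rank ends with the elementwise sum [1111, 2222, 3333, 4444]
```

Each rank sends and receives only one chunk per step, which is why NCCL favors rings for bandwidth-bound collectives across nodes like these.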
c619-101:850874:851072 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 2/-1/-1->0->-1 [3] 2/-1/-1->0->-1 [4] 2/-1/-1->0->-1 [5] 2/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 2/-1/-1->0->-1 [9] 2/-1/-1->0->-1 [10] 2/-1/-1->0->-1 [11] 2/-1/-1->0->-1 [12] 2/-1/-1->0->-1 [13] 2/-1/-1->0->-1 [14] 2/-1/-1->0->-1 [15] 2/-1/-1->0->-1 [16] 2/-1/-1->0->-1 [17] 2/-1/-1->0->-1 [18] 2/-1/-1->0->-1 [19] 2/-1/-1->0->-1 [20] 2/-1/-1->0->-1 [21] 2/-1/-1->0->-1 [22] 2/-1/-1->0->-1 [23] 2/-1/-1->0->-1 [24] 2/-1/-1->0->-1 [25] 2/-1/-1->0->-1 [26] 2/-1/-1->0->-1 [27] 2/-1/-1->0->-1 [28] 2/-1/-1->0->-1 [29] 2/-1/-1->0->-1 [30] 2/-1/-1->0->-1 [31] 2/-1/-1->0->-1 [32] -1/-1/-1->0->1 [33] -1/-1/-1->0->1 [34] -1/-1/-1->0->1 [35] -1/-1/-1->0->1 [36] -1/-1/-1->0->1 [37] -1/-1/-1->0->1 [38] -1/-1/-1->0->1 [39] -1/-1/-1->0->1 [40] -1/-1/-1->0->1 [41] -1/-1/-1->0->1 [42] -1/-1/-1->0->1 [43] -1/-1/-1->0->1 [44] -1/-1/-1->0->1 [45] -1/-1/-1->0->1 [46] -1/-1/-1->0->1 [47] -1/-1/-1->0->1 [48] -1/-1/-1->0->1 [49] -1/-1
c619-101:850874:851072 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:850874:851072 [0] NCCL INFO Check P2P Type intraNodeP2pSupport 0 directMode 0
c619-101:850874:851073 [0] NCCL INFO [Proxy Service] Device 0 CPU core 6
c619-111:3464458:3464653 [0] NCCL INFO comm 0x327fd1d0 rank 2 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-112:920946:921140 [0] NCCL INFO comm 0x2872c050 rank 3 nRanks 4 nNodes 4 localRanks 1 localRank 0 MNNVL 0
c619-102:3670125:3670319 [0] NCCL INFO Trees [0] -1/-1/-1->1->2 [1] -1/-1/-1->1->2 [2] -1/-1/-1->1->2 [3] -1/-1/-1->1->2 [4] -1/-1/-1->1->2 [5] -1/-1/-1->1->2 [6] -1/-1/-1->1->2 [7] -1/-1/-1->1->2 [8] -1/-1/-1->1->2 [9] -1/-1/-1->1->2 [10] -1/-1/-1->1->2 [11] -1/-1/-1->1->2 [12] -1/-1/-1->1->2 [13] -1/-1/-1->1->2 [14] -1/-1/-1->1->2 [15] -1/-1/-1->1->2 [16] -1/-1/-1->1->2 [17] -1/-1/-1->1->2 [18] -1/-1/-1->1->2 [19] -1/-1/-1->1->2 [20] -1/-1/-1->1->2 [21] -1/-1/-1->1->2 [22] -1/-1/-1->1->2 [23] -1/-1/-1->1->2 [24] -1/-1/-1->1->2 [25] -1/-1/-1->1->2 [26] -1/-1/-1->1->2 [27] -1/-1/-1->1->2 [28] -1/-1/-1->1->2 [29] -1/-1/-1->1->2 [30] -1/-1/-1->1->2 [31] -1/-1/-1->1->2 [32] 2/0/-1->1->3 [33] 2/0/-1->1->3 [34] 2/0/-1->1->3 [35] 2/0/-1->1->3 [36] 2/0/-1->1->3 [37] 2/0/-1->1->3 [38] 2/0/-1->1->3 [39] 2/0/-1->1->3 [40] 2/0/-1->1->3 [41] 2/0/-1->1->3 [42] 2/0/-1->1->3 [43] 2/0/-1->1->3 [44] 2/0/-1->1->3 [45] 2/0/-1->1->3 [46] 2/0/-1->1->3 [47] 2/0/-1->1->3 [48] 2/0/-1->1->3 [49] 2/0/-1->1->3 [50] 2/0/-1->1->3 [51] 2
c619-102:3670125:3670319 [0] NCCL INFO P2P Chunksize set to 131072
c619-101:850874:851074 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 7
c619-111:3464458:3464653 [0] NCCL INFO Trees [0] 1/3/-1->2->0 [1] 1/3/-1->2->0 [2] 1/3/-1->2->0 [3] 1/3/-1->2->0 [4] 1/3/-1->2->0 [5] 1/3/-1->2->0 [6] 1/3/-1->2->0 [7] 1/3/-1->2->0 [8] 1/3/-1->2->0 [9] 1/3/-1->2->0 [10] 1/3/-1->2->0 [11] 1/3/-1->2->0 [12] 1/3/-1->2->0 [13] 1/3/-1->2->0 [14] 1/3/-1->2->0 [15] 1/3/-1->2->0 [16] 1/3/-1->2->0 [17] 1/3/-1->2->0 [18] 1/3/-1->2->0 [19] 1/3/-1->2->0 [20] 1/3/-1->2->0 [21] 1/3/-1->2->0 [22] 1/3/-1->2->0 [23] 1/3/-1->2->0 [24] 1/3/-1->2->0 [25] 1/3/-1->2->0 [26] 1/3/-1->2->0 [27] 1/3/-1->2->0 [28] 1/3/-1->2->0 [29] 1/3/-1->2->0 [30] 1/3/-1->2->0 [31] 1/3/-1->2->0 [32] -1/-1/-1->2->1 [33] -1/-1/-1->2->1 [34] -1/-1/-1->2->1 [35] -1/-1/-1->2->1 [36] -1/-1/-1->2->1 [37] -1/-1/-1->2->1 [38] -1/-1/-1->2->1 [39] -1/-1/-1->2->1 [40] -1/-1/-1->2->1 [41] -1/-1/-1->2->1 [42] -1/-1/-1->2->1 [43] -1/-1/-1->2->1 [44] -1/-1/-1->2->1 [45] -1/-1/-1->2->1 [46] -1/-1/-1->2->1 [47] -1/-1/-1->2->1 [48] -1/-1/-1->2->1 [49] -1/-1/-1->2->1 [50] -1/-1/-1->2->1 [51] -1/-1/-1->2->1 [52] -1/-1/-
c619-112:920946:921140 [0] NCCL INFO Trees [0] -1/-1/-1->3->2 [1] -1/-1/-1->3->2 [2] -1/-1/-1->3->2 [3] -1/-1/-1->3->2 [4] -1/-1/-1->3->2 [5] -1/-1/-1->3->2 [6] -1/-1/-1->3->2 [7] -1/-1/-1->3->2 [8] -1/-1/-1->3->2 [9] -1/-1/-1->3->2 [10] -1/-1/-1->3->2 [11] -1/-1/-1->3->2 [12] -1/-1/-1->3->2 [13] -1/-1/-1->3->2 [14] -1/-1/-1->3->2 [15] -1/-1/-1->3->2 [16] -1/-1/-1->3->2 [17] -1/-1/-1->3->2 [18] -1/-1/-1->3->2 [19] -1/-1/-1->3->2 [20] -1/-1/-1->3->2 [21] -1/-1/-1->3->2 [22] -1/-1/-1->3->2 [23] -1/-1/-1->3->2 [24] -1/-1/-1->3->2 [25] -1/-1/-1->3->2 [26] -1/-1/-1->3->2 [27] -1/-1/-1->3->2 [28] -1/-1/-1->3->2 [29] -1/-1/-1->3->2 [30] -1/-1/-1->3->2 [31] -1/-1/-1->3->2 [32] 1/-1/-1->3->-1 [33] 1/-1/-1->3->-1 [34] 1/-1/-1->3->-1 [35] 1/-1/-1->3->-1 [36] 1/-1/-1->3->-1 [37] 1/-1/-1->3->-1 [38] 1/-1/-1->3->-1 [39] 1/-1/-1->3->-1 [40] 1/-1/-1->3->-1 [41] 1/-1/-1->3->-1 [42] 1/-1/-1->3->-1 [43] 1/-1/-1->3->-1 [44] 1/-1/-1->3->-1 [45] 1/-1/-1->3->-1 [46] 1/-1/-1->3->-1 [47] 1/-1/-1->3->-1 [48] 1/-1/-1->3->-1 [49] 1/-1/
c619-111:3464458:3464653 [0] NCCL INFO P2P Chunksize set to 131072
c619-112:920946:921140 [0] NCCL INFO P2P Chunksize set to 131072
c619-102:3670125:3670321 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 66
c619-102:3670125:3670320 [0] NCCL INFO [Proxy Service] Device 0 CPU core 65
c619-112:920946:921141 [0] NCCL INFO [Proxy Service] Device 0 CPU core 53
c619-112:920946:921142 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 54
c619-111:3464458:3464654 [0] NCCL INFO [Proxy Service] Device 0 CPU core 59
c619-111:3464458:3464655 [0] NCCL INFO [Proxy Service UDS] Device 0 CPU core 60
c619-101:850874:851072 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:850874:851072 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:850874:851072 [0] NCCL INFO CC Off, workFifoBytes 1048576
c619-111:3464458:3464653 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-111:3464458:3464653 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:850874:851072 [0] NCCL INFO ncclCommInitRankConfig comm 0x41cf8330 rank 0 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x3ecf665277c1bda8 - Init COMPLETE
c619-101:850874:851072 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 0 nranks 4 total 0.01 (kernels 0.00, alloc 0.00, bootstrap 0.00, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-111:3464458:3464653 [0] NCCL INFO ncclCommInitRankConfig comm 0x327fd1d0 rank 2 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x3ecf665277c1bda8 - Init COMPLETE
c619-111:3464458:3464653 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 2 nranks 4 total 0.01 (kernels 0.00, alloc 0.00, bootstrap 0.00, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:850874:851076 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 9
c619-101:850874:851075 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 01/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 02/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670319 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-102:3670125:3670319 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-111:3464458:3464657 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 61
c619-101:850874:851075 [0] NCCL INFO Channel 03/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 04/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 05/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 06/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 07/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 08/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 09/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 10/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 11/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921140 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 512 | 512
c619-101:850874:851075 [0] NCCL INFO Channel 12/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670319 [0] NCCL INFO ncclCommInitRankConfig comm 0x4ae2db90 rank 1 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x3ecf665277c1bda8 - Init COMPLETE
c619-101:850874:851075 [0] NCCL INFO Channel 13/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670319 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 1 nranks 4 total 0.02 (kernels 0.00, alloc 0.00, bootstrap 0.00, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:850874:851075 [0] NCCL INFO Channel 14/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921140 [0] NCCL INFO 64 coll channels, 64 collnet channels, 0 nvls channels, 64 p2p channels, 2 p2p channels per peer
c619-101:850874:851075 [0] NCCL INFO Channel 15/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 16/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 17/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 18/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 19/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 20/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 21/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 22/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921140 [0] NCCL INFO ncclCommInitRankConfig comm 0x2872c050 rank 3 nranks 4 cudaDev 0 nvmlDev 0 busId 901000 commId 0x3ecf665277c1bda8 - Init COMPLETE
c619-101:850874:851075 [0] NCCL INFO Channel 23/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921140 [0] NCCL INFO Init timings - ncclCommInitRankConfig: rank 3 nranks 4 total 0.02 (kernels 0.00, alloc 0.00, bootstrap 0.00, allgathers 0.00, topo 0.00, graphs 0.00, connections 0.01, rest 0.00)
c619-101:850874:851075 [0] NCCL INFO Channel 24/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 25/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 26/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 27/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 28/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 29/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 30/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 31/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 32/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 33/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 34/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 35/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 36/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 37/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 38/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 39/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 40/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 41/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 42/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 43/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 44/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 45/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 46/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670323 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 68
c619-111:3464458:3464656 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 00/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 01/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 02/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 03/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 04/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 05/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 06/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 07/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921144 [0] NCCL INFO [Proxy Progress] Device 0 CPU core 56
c619-111:3464458:3464656 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 08/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 09/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 10/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 11/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 00/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 01/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 02/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 12/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 13/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 14/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 03/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 04/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 15/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 16/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 05/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 06/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 17/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 18/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 07/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 08/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 09/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 10/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 11/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 19/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 20/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 12/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 13/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 21/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 22/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 14/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 15/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 23/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 24/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 16/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 25/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 17/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 26/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 18/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 27/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 19/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 20/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 28/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 29/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 21/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 22/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 30/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 23/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 24/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 31/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 32/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 25/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 26/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 33/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 34/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 27/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 28/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 35/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 29/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 30/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 36/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 37/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 31/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 38/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 39/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 40/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 41/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 37/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 42/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 37/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 43/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 44/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 38/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 39/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 38/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 39/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 40/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 45/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 40/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 41/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 41/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 42/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 43/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 46/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 47/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 42/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 43/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 48/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 44/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 45/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 46/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 44/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 45/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 49/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 50/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 47/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 48/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 49/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 51/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 52/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 50/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 46/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 51/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 52/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 53/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 53/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 54/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 54/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 55/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 55/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 56/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 57/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 56/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 57/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 58/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 59/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 58/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 59/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 60/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 60/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 61/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 61/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 62/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 62/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464656 [0] NCCL INFO Channel 63/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 63/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 00/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 01/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 02/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 03/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 04/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 05/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 06/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 07/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 08/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 09/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 10/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 11/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 12/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 13/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 14/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 15/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 16/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 17/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 18/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 19/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 20/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 21/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 22/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 23/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 24/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 25/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 26/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 27/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 28/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 29/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 30/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 31/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 32/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 33/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 34/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 35/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 36/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 37/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 38/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 39/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 40/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 41/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 42/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 43/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 44/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 45/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 46/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 47/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 48/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 49/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 50/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 51/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 52/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 53/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 54/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 55/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 56/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 57/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 58/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 59/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 60/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 61/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 62/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670322 [0] NCCL INFO Channel 63/0 : 1[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 47/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 48/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 49/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 50/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 51/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 52/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 53/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 54/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 55/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 56/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 57/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 58/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 59/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 60/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 61/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 62/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 63/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 47/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 48/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 01/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 02/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 49/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 50/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 03/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 04/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 51/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 52/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 05/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 06/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 53/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 54/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 07/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 08/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 55/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 56/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 09/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 10/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921143 [0] NCCL INFO Channel 11/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851075 [0] NCCL INFO Channel 57/0 : 3[0] -> 0[0] [receive] via NET/IB/1
[repeated NCCL channel setup lines elided: c619-112 (rank 3) opened send channels 11-63 to rank 0; c619-101 (rank 0) opened receive channels 57-63 from rank 3 and send channels 00-63 to rank 1; all via NET/IB/1]
c619-101:850874:851075 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-102:3670125:3670322 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-111:3464458:3464656 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-112:920946:921143 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
[2025-09-23 23:23:58,765] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed Flops Profiler Enabled: False
[2025-09-23 23:23:58,765] [INFO] [logging.py:107:log_dist] [Rank 0] Using client Optimizer as basic optimizer
[2025-09-23 23:23:58,765] [INFO] [logging.py:107:log_dist] [Rank 0] Removing param_group that has no 'params' in the basic Optimizer
[2025-09-23 23:23:58,773] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed Basic Optimizer = AdamW
[2025-09-23 23:23:58,773] [INFO] [utils.py:59:is_zero_supported_optimizer] Checking ZeRO support for optimizer=AdamW type=<class 'torch.optim.adamw.AdamW'>
[2025-09-23 23:23:58,773] [INFO] [logging.py:107:log_dist] [Rank 0] Creating torch.bfloat16 ZeRO stage 2 optimizer
[2025-09-23 23:23:58,773] [INFO] [stage_1_and_2.py:150:__init__] Reduce bucket size 500000000
[2025-09-23 23:23:58,773] [INFO] [stage_1_and_2.py:151:__init__] Allgather bucket size 500000000
[2025-09-23 23:23:58,773] [INFO] [stage_1_and_2.py:152:__init__] CPU Offload: False
[2025-09-23 23:23:58,773] [INFO] [stage_1_and_2.py:153:__init__] Round robin gradient partitioning: True
[INFO|trainer.py:2409] 2025-09-23 23:23:59,255 >> ***** Running training *****
[INFO|trainer.py:2410] 2025-09-23 23:23:59,255 >> Num examples = 3,998
[INFO|trainer.py:2411] 2025-09-23 23:23:59,255 >> Num Epochs = 1
[INFO|trainer.py:2412] 2025-09-23 23:23:59,255 >> Instantaneous batch size per device = 1
[INFO|trainer.py:2415] 2025-09-23 23:23:59,255 >> Total train batch size (w. parallel, distributed & accumulation) = 4
[INFO|trainer.py:2416] 2025-09-23 23:23:59,255 >> Gradient Accumulation steps = 1
[INFO|trainer.py:2417] 2025-09-23 23:23:59,255 >> Total optimization steps = 1,000
[INFO|trainer.py:2418] 2025-09-23 23:23:59,256 >> Number of trainable parameters = 1,543,714,304
[2025-09-23 23:23:59,518] [INFO] [utils.py:781:see_memory_usage] Before initializing optimizer states
[2025-09-23 23:23:59,519] [INFO] [utils.py:782:see_memory_usage] MA 4.31 GB Max_MA 4.31 GB CA 4.32 GB Max_CA 4 GB
[2025-09-23 23:23:59,519] [INFO] [utils.py:789:see_memory_usage] CPU Virtual Memory: used = 39.55 GB, percent = 18.6%
[2025-09-23 23:23:59,708] [INFO] [utils.py:781:see_memory_usage] After initializing optimizer states
[2025-09-23 23:23:59,709] [INFO] [utils.py:782:see_memory_usage] MA 4.31 GB Max_MA 5.75 GB CA 5.76 GB Max_CA 6 GB
[2025-09-23 23:23:59,709] [INFO] [utils.py:789:see_memory_usage] CPU Virtual Memory: used = 40.98 GB, percent = 19.3%
[2025-09-23 23:23:59,709] [INFO] [stage_1_and_2.py:557:__init__] optimizer state initialized
[2025-09-23 23:23:59,893] [INFO] [utils.py:781:see_memory_usage] After initializing ZeRO optimizer
[2025-09-23 23:23:59,893] [INFO] [utils.py:782:see_memory_usage] MA 4.31 GB Max_MA 4.31 GB CA 5.76 GB Max_CA 6 GB
[2025-09-23 23:23:59,893] [INFO] [utils.py:789:see_memory_usage] CPU Virtual Memory: used = 40.99 GB, percent = 19.3%
[2025-09-23 23:23:59,896] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed Final Optimizer = DeepSpeedZeroOptimizer
[2025-09-23 23:23:59,896] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed using configured LR scheduler = None
[2025-09-23 23:23:59,896] [INFO] [logging.py:107:log_dist] [Rank 0] DeepSpeed LR Scheduler = None
[2025-09-23 23:23:59,896] [INFO] [logging.py:107:log_dist] [Rank 0] step=0, skipped=0, lr=[0.0, 0.0], mom=[(0.9, 0.95), (0.9, 0.95)]
[2025-09-23 23:23:59,897] [INFO] [config.py:1003:print] DeepSpeedEngine configuration:
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] activation_checkpointing_config {
"partition_activations": false,
"contiguous_memory_optimization": false,
"cpu_checkpointing": false,
"number_checkpoints": null,
"synchronize_checkpoint_boundary": false,
"profile": false
}
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] aio_config ................... {'block_size': 1048576, 'queue_depth': 8, 'intra_op_parallelism': 1, 'single_submit': False, 'overlap_events': True, 'use_gds': False}
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] amp_enabled .................. False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] amp_params ................... False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] autotuning_config ............ {
"enabled": false,
"start_step": null,
"end_step": null,
"metric_path": null,
"arg_mappings": null,
"metric": "throughput",
"model_info": null,
"results_dir": "autotuning_results",
"exps_dir": "autotuning_exps",
"overwrite": true,
"fast": true,
"start_profile_step": 3,
"end_profile_step": 5,
"tuner_type": "gridsearch",
"tuner_early_stopping": 5,
"tuner_num_trials": 50,
"model_info_path": null,
"mp_size": 1,
"max_train_batch_size": null,
"min_train_batch_size": 1,
"max_train_micro_batch_size_per_gpu": 1.024000e+03,
"min_train_micro_batch_size_per_gpu": 1,
"num_tuning_micro_batch_sizes": 3
}
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] bfloat16_enabled ............. True
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] bfloat16_immediate_grad_update True
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] checkpoint_parallel_write_pipeline False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] checkpoint_tag_validation_enabled True
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] checkpoint_tag_validation_fail False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] comms_config ................. <deepspeed.comm.config.DeepSpeedCommsConfig object at 0x40031eac5dc0>
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] communication_data_type ...... None
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] compile_config ............... deepcompile=False free_activation=False offload_activation=False offload_opt_states=False double_buffer=True symmetric_memory=False debug_log=False offload_parameters=False sync_before_reduce=False sync_after_reduce=False sync_before_allgather=False sync_after_allgather=False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] compression_config ........... {'weight_quantization': {'shared_parameters': {'enabled': False, 'quantizer_kernel': False, 'schedule_offset': 0, 'quantize_groups': 1, 'quantize_verbose': False, 'quantization_type': 'symmetric', 'quantize_weight_in_forward': False, 'rounding': 'nearest', 'fp16_mixed_quantize': False, 'quantize_change_ratio': 0.001}, 'different_groups': {}}, 'activation_quantization': {'shared_parameters': {'enabled': False, 'quantization_type': 'symmetric', 'range_calibration': 'dynamic', 'schedule_offset': 1000}, 'different_groups': {}}, 'sparse_pruning': {'shared_parameters': {'enabled': False, 'method': 'l1', 'schedule_offset': 1000}, 'different_groups': {}}, 'row_pruning': {'shared_parameters': {'enabled': False, 'method': 'l1', 'schedule_offset': 1000}, 'different_groups': {}}, 'head_pruning': {'shared_parameters': {'enabled': False, 'method': 'topk', 'schedule_offset': 1000}, 'different_groups': {}}, 'channel_pruning': {'shared_parameters': {'enabled': False, 'method': 'l1', 'schedule_offset': 1000}, 'different_groups': {}}, 'layer_reduction': {'enabled': False}}
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] curriculum_enabled_legacy .... False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] curriculum_params_legacy ..... False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] data_efficiency_config ....... {'enabled': False, 'seed': 1234, 'data_sampling': {'enabled': False, 'num_epochs': 1000, 'num_workers': 0, 'pin_memory': False, 'curriculum_learning': {'enabled': False}, 'dynamic_batching': {'enabled': False, 'lr_scaling_method': 'linear', 'min_batch_size': 1, 'max_batch_size': None, 'sequence_picking_order': 'dataloader', 'verbose': False}}, 'data_routing': {'enabled': False, 'random_ltd': {'enabled': False, 'layer_token_lr_schedule': {'enabled': False}}}}
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] data_efficiency_enabled ...... False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] dataloader_drop_last ......... False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] disable_allgather ............ False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] dump_state ................... False
[2025-09-23 23:23:59,898] [INFO] [config.py:1007:print] dynamic_loss_scale_args ...... None
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] eigenvalue_enabled ........... False
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] eigenvalue_gas_boundary_resolution 1
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] eigenvalue_layer_name ........ bert.encoder.layer
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] eigenvalue_layer_num ......... 0
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] eigenvalue_max_iter .......... 100
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] eigenvalue_stability ......... 1e-06
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] eigenvalue_tol ............... 0.01
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] eigenvalue_verbose ........... False
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] elasticity_enabled ........... False
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] flops_profiler_config ........ {
"enabled": false,
"recompute_fwd_factor": 0.0,
"profile_step": 1,
"module_depth": -1,
"top_modules": 1,
"detailed": true,
"output_file": null
}
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] fp16_auto_cast ............... None
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] fp16_enabled ................. False
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] fp16_master_weights_and_gradients False
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] global_rank .................. 0
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] grad_accum_dtype ............. None
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] gradient_accumulation_steps .. 1
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] gradient_clipping ............ 1.0
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] gradient_predivide_factor .... 1.0
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] graph_harvesting ............. False
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] hybrid_engine ................ enabled=False max_out_tokens=512 inference_tp_size=1 release_inference_cache=False pin_parameters=True tp_gather_partition_size=8
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] initial_dynamic_scale ........ 1
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] load_universal_checkpoint .... False
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] loss_scale ................... 1.0
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] memory_breakdown ............. False
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] mics_hierarchial_params_gather False
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] mics_shard_size .............. -1
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] monitor_config ............... tensorboard=TensorBoardConfig(enabled=False, output_path='', job_name='DeepSpeedJobName') comet=CometConfig(enabled=False, samples_log_interval=100, project=None, workspace=None, api_key=None, experiment_name=None, experiment_key=None, online=None, mode=None) wandb=WandbConfig(enabled=False, group=None, team=None, project='deepspeed') csv_monitor=CSVConfig(enabled=False, output_path='', job_name='DeepSpeedJobName')
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] nebula_config ................ {
"enabled": false,
"persistent_storage_path": null,
"persistent_time_interval": 100,
"num_of_version_in_retention": 2,
"enable_nebula_load": true,
"load_path": null
}
[2025-09-23 23:23:59,899] [INFO] [config.py:1007:print] optimizer_legacy_fusion ...... False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] optimizer_name ............... None
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] optimizer_params ............. None
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] pipeline ..................... {'stages': 'auto', 'partition': 'best', 'seed_layers': False, 'activation_checkpoint_interval': 0, 'pipe_partitioned': True, 'grad_partitioned': True}
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] pld_enabled .................. False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] pld_params ................... False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] prescale_gradients ........... False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] scheduler_name ............... None
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] scheduler_params ............. None
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] seq_parallel_communication_data_type torch.float32
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] sparse_attention ............. None
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] sparse_gradients_enabled ..... False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] steps_per_print .............. inf
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] tensor_parallel_config ....... dtype=torch.float16 autotp_size=0 tp_overlap_comm=False tensor_parallel=TPConfig(tp_size=1, tp_grain_size=1, mpu=None, tp_group=None) injection_policy_tuple=None keep_module_on_host=False replace_with_kernel_inject=False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] timers_config ................ enabled=True synchronized=True
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] train_batch_size ............. 4
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] train_micro_batch_size_per_gpu 1
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] use_data_before_expert_parallel_ False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] use_node_local_storage ....... False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] wall_clock_breakdown ......... False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] weight_quantization_config ... None
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] world_size ................... 4
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] zero_allow_untested_optimizer True
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] zero_config .................. stage=2 contiguous_gradients=True reduce_scatter=True reduce_bucket_size=500000000 use_multi_rank_bucket_allreduce=True allgather_partitions=True allgather_bucket_size=500000000 overlap_comm=False load_from_fp32_weights=True elastic_checkpoint=False offload_param=None offload_optimizer=None sub_group_size=1000000000 cpu_offload_param=None cpu_offload_use_pin_memory=None cpu_offload=None prefetch_bucket_size=50000000 param_persistence_threshold=100000 model_persistence_threshold=9223372036854775807 max_live_parameters=1000000000 max_reuse_distance=1000000000 gather_16bit_weights_on_model_save=False module_granularity_threshold=0 use_all_reduce_for_fetch_params=False stage3_gather_fp16_weights_on_model_save=False ignore_unused_parameters=True legacy_stage1=False round_robin_gradients=True zero_hpz_partition_size=1 zero_quantized_weights=False zero_quantized_nontrainable_weights=False zero_quantized_gradients=False zeropp_loco_param=None mics_shard_size=-1 mics_hierarchical_params_gather=False memory_efficient_linear=True pipeline_loading_checkpoint=False override_module_apply=True log_trace_cache_warnings=False
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] zero_enabled ................. True
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] zero_force_ds_cpu_optimizer .. True
[2025-09-23 23:23:59,900] [INFO] [config.py:1007:print] zero_optimization_stage ...... 2
[2025-09-23 23:23:59,900] [INFO] [config.py:993:print_user_config] json = {
"train_batch_size": 4,
"train_micro_batch_size_per_gpu": 1,
"gradient_accumulation_steps": 1,
"gradient_clipping": 1.0,
"zero_allow_untested_optimizer": true,
"fp16": {
"enabled": false,
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": true
},
"zero_optimization": {
"stage": 2,
"allgather_partitions": true,
"allgather_bucket_size": 5.000000e+08,
"overlap_comm": false,
"reduce_scatter": true,
"reduce_bucket_size": 5.000000e+08,
"contiguous_gradients": true,
"round_robin_gradients": true
},
"steps_per_print": inf
}
[INFO|integration_utils.py:832] 2025-09-23 23:23:59,904 >> Automatic Weights & Biases logging enabled, to disable set os.environ["WANDB_DISABLED"] = "true"
wandb: Currently logged in as: zsprague (ut_nlp_deduce) to https://api.wandb.ai. Use `wandb login --relogin` to force relogin
wandb: creating run
wandb: Tracking run with wandb version 0.22.0
wandb: Run data is saved locally in /scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/wandb/run-20250923_232400-iql4ovmz
wandb: Run `wandb offline` to turn off syncing.
wandb: Syncing run BASELINE_r1_distillation_sft
wandb: βοΈ View project at https://wandb.ai/ut_nlp_deduce/BASELINE_r1_distillation_sft
wandb: π View run at https://wandb.ai/ut_nlp_deduce/BASELINE_r1_distillation_sft/runs/iql4ovmz
c619-102:3670125:3670325 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 00/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 01/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 02/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 32/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 33/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 34/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 03/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 32/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 00/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 01/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 04/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 05/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 33/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 34/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 35/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 36/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 37/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 06/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 02/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 03/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 07/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 35/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 36/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 08/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 38/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 39/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 40/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 09/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 04/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 05/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 06/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 37/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 38/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 10/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 41/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 42/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 43/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 11/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 07/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 08/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 09/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 39/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 12/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 40/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 41/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 44/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 45/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 13/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 14/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 10/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 11/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 15/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 42/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 43/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 46/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 47/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 16/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 12/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 13/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 17/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 44/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 45/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 48/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 49/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 18/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 19/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 14/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 15/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 46/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 20/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 50/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 47/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 51/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 48/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 21/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 16/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 17/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 18/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 22/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 52/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 49/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 53/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 50/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 54/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 23/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 19/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 20/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 24/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 55/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 51/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 56/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 52/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 25/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 26/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 57/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 21/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 58/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 22/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 53/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 27/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 23/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 54/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 28/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 29/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 59/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 60/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 61/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 55/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 56/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 57/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 30/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 24/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 25/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 26/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 31/0 : 0[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 62/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 63/0 : 1[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 58/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 59/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 27/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 28/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 29/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 60/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 61/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 30/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 31/0 : 2[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 62/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 63/0 : 3[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 00/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 01/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 02/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 03/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 04/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 05/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 06/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 07/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 08/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 09/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 10/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 11/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 12/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 13/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 14/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921146 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 15/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 16/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 17/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 18/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 19/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 20/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 21/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 22/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 23/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 24/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 25/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 26/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 27/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 28/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 29/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 30/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 31/0 : 3[0] -> 2[0] [receive] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 00/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 01/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 02/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 03/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 04/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 05/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 06/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 07/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 08/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 09/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 10/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 11/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 12/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 13/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 14/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464659 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 15/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 16/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 17/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 18/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 19/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 20/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 21/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 22/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 23/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 24/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 25/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 26/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 27/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 28/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 29/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 30/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 31/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 32/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 33/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 34/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 35/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 36/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 37/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 38/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 39/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 40/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 41/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 42/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 43/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 44/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 45/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 46/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 47/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 48/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 49/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 50/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 51/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 52/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 53/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 54/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 55/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 56/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 57/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 58/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 59/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 60/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 61/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 62/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 63/0 : 2[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 32/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 33/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 34/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 35/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 36/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 37/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 38/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 39/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 40/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 41/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 42/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 43/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 44/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 45/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 46/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 47/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 48/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 49/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 50/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 51/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 52/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 53/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 54/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 55/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 56/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 57/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 58/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 59/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 60/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 61/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 62/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670325 [0] NCCL INFO Channel 63/0 : 1[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851128 [0] NCCL INFO Connected all trees
c619-102:3670125:3670325 [0] NCCL INFO Connected all trees
c619-111:3464458:3464659 [0] NCCL INFO Connected all trees
c619-112:920946:921146 [0] NCCL INFO Connected all trees
c619-111:3464458:3464660 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 00/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 32/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 01/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 33/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 02/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 34/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 35/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 37/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 01/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 03/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 04/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 02/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 36/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 38/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 39/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 05/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 03/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 40/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 41/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 04/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 37/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 38/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 06/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 07/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 05/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 42/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 43/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 39/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 40/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 06/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 08/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 07/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 44/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 45/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 41/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 42/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 09/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 10/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 08/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 46/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 47/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 09/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 11/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 43/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 12/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 44/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 10/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 48/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 49/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 11/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 45/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 13/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 50/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 51/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 12/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 46/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 47/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 13/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 14/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 15/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 52/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 14/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 53/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 48/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 49/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 15/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 16/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 17/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 54/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 55/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 16/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 50/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 51/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 17/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 18/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 56/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 57/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 18/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 52/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 53/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 58/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 19/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 59/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 19/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 20/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 20/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 60/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 54/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 21/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 21/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 22/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 61/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 22/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 62/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 55/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 56/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 23/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 23/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Channel 63/0 : 2[0] -> 3[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 57/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 24/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 58/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 24/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 25/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 25/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 26/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 26/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 27/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 59/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 60/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 27/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 61/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 28/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 28/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 29/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 29/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 62/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 63/0 : 2[0] -> 3[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 00/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 30/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 30/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 31/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 01/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 02/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-102:3670125:3670326 [0] NCCL INFO Channel 31/0 : 0[0] -> 1[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 32/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 03/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 04/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 33/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 34/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 05/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 06/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 35/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 36/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 07/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 08/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 09/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 37/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 38/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 10/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 11/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 39/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 40/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 12/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 13/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 41/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 42/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 14/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 15/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 43/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 44/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 16/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 17/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 45/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 46/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 18/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 19/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 47/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 20/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 48/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 21/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 49/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 22/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 50/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 23/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 24/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 51/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 25/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 26/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 52/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 53/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 27/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 28/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 54/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 55/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 29/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 30/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 56/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 57/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 31/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 32/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 58/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 59/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 33/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 34/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 60/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 61/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 35/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 36/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 62/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 37/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 63/0 : 3[0] -> 0[0] [receive] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 38/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 39/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 00/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 01/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 40/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 41/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 02/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 03/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 42/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 43/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 04/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 05/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 44/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 45/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 06/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 07/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 46/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 47/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 08/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 09/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 48/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 49/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 10/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 11/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 50/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 12/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 51/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 13/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 14/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 52/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 53/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 54/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 15/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 16/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 55/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 56/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 17/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 18/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 57/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 58/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 19/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 20/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 59/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 21/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 60/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 61/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 22/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 23/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 62/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-112:920946:921147 [0] NCCL INFO Channel 63/0 : 3[0] -> 0[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 24/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 25/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 26/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 27/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 28/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 29/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 30/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-101:850874:851132 [0] NCCL INFO Channel 31/0 : 0[0] -> 1[0] [send] via NET/IB/1
c619-111:3464458:3464660 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-101:850874:851132 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-112:920946:921147 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
c619-102:3670125:3670326 [0] NCCL INFO Connected all rings, use ring PXN 0 GDR 0
{'loss': 1.1671, 'grad_norm': 5.7659831047058105, 'learning_rate': 1.8e-07, 'train/length': 195.6, 'epoch': 0.01}
0%| | 0/1000 [00:00<?, ?it/s]
0%| | 1/1000 [00:01<19:38, 1.18s/it]
0%| | 2/1000 [00:01<12:52, 1.29it/s]
0%| | 3/1000 [00:02<10:26, 1.59it/s]
0%| | 4/1000 [00:02<09:09, 1.81it/s]
0%| | 5/1000 [00:02<08:29, 1.95it/s]
1%| | 6/1000 [00:03<08:06, 2.04it/s]
1%| | 7/1000 [00:03<07:56, 2.08it/s]
1%| | 8/1000 [00:04<07:32, 2.19it/s]
1%| | 9/1000 [00:04<07:43, 2.14it/s]
1%| | 10/1000 [00:05<07:25, 2.22it/s]
1%| | 11/1000 [00:05<08:47, 1.88it/s]
1%| | 12/1000 [00:06<08:11, 2.01it/s]
1%|β | 13/1000 [00:06<07:49, 2.10it/s]
1%|β | 14/1000 [00:07<07:17, 2.25it/s]
2%|β | 15/1000 [00:07<07:04, 2.32it/s]
2%|β | 16/1000 [00:07<07:07, 2.30it/s]
2%|β | 17/1000 [00:08<06:57, 2.36it/s]
{'loss': 1.1475, 'grad_norm': 4.98866605758667, 'learning_rate': 3.7999999999999996e-07, 'train/length': 196.0, 'epoch': 0.02}
{'loss': 1.1243, 'grad_norm': 4.064605236053467, 'learning_rate': 5.8e-07, 'train/length': 195.9, 'epoch': 0.03}
2%|β | 18/1000 [00:08<07:05, 2.31it/s]
2%|β | 19/1000 [00:09<07:13, 2.26it/s]
2%|β | 20/1000 [00:09<07:17, 2.24it/s]
2%|β | 21/1000 [00:10<07:06, 2.30it/s]
2%|β | 22/1000 [00:10<07:03, 2.31it/s]
2%|β | 23/1000 [00:11<07:13, 2.25it/s]
2%|β | 24/1000 [00:11<06:51, 2.37it/s]
2%|β | 25/1000 [00:11<06:56, 2.34it/s]
3%|β | 26/1000 [00:12<06:47, 2.39it/s]
3%|β | 27/1000 [00:12<06:54, 2.35it/s]
3%|β | 28/1000 [00:13<06:55, 2.34it/s]
3%|β | 29/1000 [00:13<06:49, 2.37it/s]
3%|β | 30/1000 [00:14<07:29, 2.16it/s]
3%|β | 31/1000 [00:14<07:27, 2.16it/s]
3%|β | 32/1000 [00:15<07:17, 2.21it/s]
3%|β | 33/1000 [00:15<07:49, 2.06it/s]
{'loss': 1.1121, 'grad_norm': 3.9239065647125244, 'learning_rate': 7.799999999999999e-07, 'train/length': 195.9, 'epoch': 0.04}
{'loss': 1.0552, 'grad_norm': 3.072298526763916, 'learning_rate': 9.8e-07, 'train/length': 1466.0, 'epoch': 0.05}
3%|β | 34/1000 [00:15<07:31, 2.14it/s]
4%|β | 35/1000 [00:16<07:28, 2.15it/s]
4%|β | 36/1000 [00:16<07:19, 2.19it/s]
4%|β | 37/1000 [00:17<06:54, 2.32it/s]
4%|β | 38/1000 [00:17<07:36, 2.11it/s]
4%|β | 39/1000 [00:18<07:25, 2.16it/s]
4%|β | 40/1000 [00:18<07:08, 2.24it/s]
4%|β | 41/1000 [00:19<07:42, 2.07it/s]
4%|β | 42/1000 [00:19<07:35, 2.10it/s]
4%|β | 43/1000 [00:20<07:33, 2.11it/s]
4%|β | 44/1000 [00:20<07:02, 2.26it/s]
4%|β | 45/1000 [00:21<07:12, 2.21it/s]
5%|β | 46/1000 [00:21<08:27, 1.88it/s]
5%|β | 47/1000 [00:22<08:50, 1.80it/s]
5%|β | 48/1000 [00:22<07:54, 2.01it/s]
5%|β | 49/1000 [00:23<07:33, 2.10it/s]
5%|β | 50/1000 [00:23<07:18, 2.17it/s]
{'loss': 0.9946, 'grad_norm': 2.5179457664489746, 'learning_rate': 9.997785653888834e-07, 'train/length': 3088.0, 'epoch': 0.06}
5%|β | 51/1000 [00:24<07:22, 2.15it/s]
5%|β | 52/1000 [00:24<07:10, 2.20it/s]
5%|β | 53/1000 [00:24<07:14, 2.18it/s]
5%|β | 54/1000 [00:25<07:04, 2.23it/s]
6%|β | 55/1000 [00:25<07:10, 2.19it/s]
6%|β | 56/1000 [00:26<07:18, 2.16it/s]
6%|β | 57/1000 [00:26<07:56, 1.98it/s]
6%|β | 58/1000 [00:27<07:36, 2.06it/s]
6%|β | 59/1000 [00:27<08:07, 1.93it/s]
6%|β | 60/1000 [00:28<07:39, 2.05it/s]
6%|β | 61/1000 [00:28<07:39, 2.04it/s]
6%|β | 62/1000 [00:29<07:24, 2.11it/s]
6%|β | 63/1000 [00:29<07:09, 2.18it/s]
6%|β | 64/1000 [00:30<07:06, 2.19it/s]
6%|β | 65/1000 [00:30<06:57, 2.24it/s]
7%|β | 66/1000 [00:31<07:00, 2.22it/s]
{'loss': 0.9412, 'grad_norm': 2.2130417823791504, 'learning_rate': 9.990133642141357e-07, 'train/length': 2918.3, 'epoch': 0.07}
{'loss': 0.8816, 'grad_norm': 1.9483217000961304, 'learning_rate': 9.977024992520601e-07, 'train/length': 3023.7, 'epoch': 0.08}
7%|β | 67/1000 [00:31<06:49, 2.28it/s]
7%|β | 68/1000 [00:31<06:30, 2.39it/s]
7%|β | 69/1000 [00:32<06:35, 2.35it/s]
7%|β | 70/1000 [00:32<06:38, 2.34it/s]
7%|β | 71/1000 [00:33<06:32, 2.36it/s]
7%|β | 72/1000 [00:33<06:46, 2.28it/s]
7%|β | 73/1000 [00:34<06:40, 2.31it/s]
7%|β | 74/1000 [00:34<06:32, 2.36it/s]
8%|β | 75/1000 [00:34<06:47, 2.27it/s]
8%|β | 76/1000 [00:35<06:56, 2.22it/s]
8%|β | 77/1000 [00:35<06:52, 2.24it/s]
8%|β | 78/1000 [00:36<06:59, 2.20it/s]
8%|β | 79/1000 [00:36<07:00, 2.19it/s]
8%|β | 80/1000 [00:37<06:55, 2.21it/s]
8%|β | 81/1000 [00:37<06:49, 2.25it/s]
8%|β | 82/1000 [00:37<06:28, 2.37it/s]
{'loss': 0.8489, 'grad_norm': 2.0010433197021484, 'learning_rate': 9.958474039142469e-07, 'train/length': 3057.5, 'epoch': 0.09}
8%|β | 83/1000 [00:38<06:14, 2.45it/s]
8%|β | 84/1000 [00:38<07:04, 2.16it/s]
8%|β | 85/1000 [00:39<06:58, 2.19it/s]
9%|β | 86/1000 [00:39<06:56, 2.19it/s]
9%|β | 87/1000 [00:40<06:45, 2.25it/s]
9%|β | 88/1000 [00:40<06:58, 2.18it/s]
9%|β | 89/1000 [00:41<07:06, 2.13it/s]
9%|β | 90/1000 [00:41<07:04, 2.15it/s]
9%|β | 91/1000 [00:42<06:56, 2.18it/s]
9%|β | 92/1000 [00:42<06:54, 2.19it/s]
9%|β | 93/1000 [00:43<06:47, 2.23it/s]
9%|β | 94/1000 [00:43<06:39, 2.27it/s]
10%|β | 95/1000 [00:43<06:20, 2.38it/s]
10%|β | 96/1000 [00:44<06:30, 2.31it/s]
10%|β | 97/1000 [00:44<06:41, 2.25it/s]
10%|β | 98/1000 [00:45<06:38, 2.26it/s]
10%|β | 99/1000 [00:45<06:20, 2.37it/s]
10%|β | 100/1000 [00:46<06:32, 2.29it/s]
{'loss': 0.7973, 'grad_norm': 1.9877142906188965, 'learning_rate': 9.934501067202117e-07, 'train/length': 2868.8, 'epoch': 0.1}
{'loss': 0.7529, 'grad_norm': 1.890247106552124, 'learning_rate': 9.905132290792392e-07, 'train/length': 2762.7, 'epoch': 0.11}
10%|β | 101/1000 [00:46<07:05, 2.11it/s]
10%|β | 102/1000 [00:47<07:34, 1.98it/s]
10%|β | 103/1000 [00:47<07:15, 2.06it/s]
10%|β | 104/1000 [00:48<07:07, 2.10it/s]
10%|β | 105/1000 [00:48<06:58, 2.14it/s]
11%|β | 106/1000 [00:49<06:58, 2.14it/s]
11%|β | 107/1000 [00:49<06:51, 2.17it/s]
11%|β | 108/1000 [00:49<06:50, 2.17it/s]
11%|β | 109/1000 [00:50<06:36, 2.25it/s]
11%|β | 110/1000 [00:50<06:34, 2.25it/s]
11%|β | 111/1000 [00:51<06:25, 2.30it/s]
11%|β | 112/1000 [00:51<06:20, 2.33it/s]
11%|ββ | 113/1000 [00:52<06:17, 2.35it/s]
11%|ββ | 114/1000 [00:52<06:26, 2.29it/s]
12%|ββ | 115/1000 [00:52<06:19, 2.33it/s]
{'loss': 0.7123, 'grad_norm': 1.7795876264572144, 'learning_rate': 9.870399824239114e-07, 'train/length': 2858.4, 'epoch': 0.12}
{'loss': 0.6909, 'grad_norm': 1.930146336555481, 'learning_rate': 9.83034164698452e-07, 'train/length': 2683.0, 'epoch': 0.13}
12%|ββ | 116/1000 [00:53<06:27, 2.28it/s]
12%|ββ | 117/1000 [00:53<06:27, 2.28it/s]
12%|ββ | 118/1000 [00:54<06:17, 2.34it/s]
12%|ββ | 119/1000 [00:54<06:16, 2.34it/s]
12%|ββ | 120/1000 [00:55<06:22, 2.30it/s]
12%|ββ | 121/1000 [00:55<06:37, 2.21it/s]
12%|ββ | 122/1000 [00:55<06:25, 2.28it/s]
12%|ββ | 123/1000 [00:56<06:51, 2.13it/s]
12%|ββ | 124/1000 [00:56<06:33, 2.22it/s]
12%|ββ | 125/1000 [00:57<06:23, 2.28it/s]
13%|ββ | 126/1000 [00:57<06:23, 2.28it/s]
13%|ββ | 127/1000 [00:58<07:03, 2.06it/s]
13%|ββ | 128/1000 [00:58<06:51, 2.12it/s]
13%|ββ | 129/1000 [00:59<06:47, 2.14it/s]
13%|ββ | 130/1000 [00:59<06:47, 2.13it/s]
13%|ββ | 131/1000 [01:00<06:40, 2.17it/s]
13%|ββ | 132/1000 [01:00<06:37, 2.18it/s]
13%|ββ | 133/1000 [01:00<06:16, 2.31it/s]
13%|ββ | 134/1000 [01:01<06:17, 2.29it/s]
14%|ββ | 135/1000 [01:01<06:26, 2.24it/s]
14%|ββ | 136/1000 [01:02<06:21, 2.27it/s]
14%|ββ | 137/1000 [01:02<06:17, 2.29it/s]
14%|ββ | 138/1000 [01:03<06:19, 2.27it/s]
14%|ββ | 139/1000 [01:03<06:16, 2.29it/s]
14%|ββ | 140/1000 [01:04<06:11, 2.31it/s]
14%|ββ | 141/1000 [01:04<06:18, 2.27it/s]
14%|ββ | 142/1000 [01:05<07:09, 2.00it/s]
14%|ββ | 143/1000 [01:05<06:50, 2.09it/s]
14%|ββ | 144/1000 [01:06<06:42, 2.13it/s]
14%|ββ | 145/1000 [01:06<07:13, 1.97it/s]
15%|ββ | 146/1000 [01:07<06:54, 2.06it/s]
{'loss': 0.66, 'grad_norm': 1.9510775804519653, 'learning_rate': 9.734429148174674e-07, 'train/length': 2772.4, 'epoch': 0.15}
{'loss': 0.6337, 'grad_norm': 1.9982353448867798, 'learning_rate': 9.678679705528698e-07, 'train/length': 2924.2, 'epoch': 0.16}
15%|ββ | 147/1000 [01:07<06:42, 2.12it/s]
15%|ββ | 148/1000 [01:07<06:29, 2.18it/s]
15%|ββ | 149/1000 [01:08<06:20, 2.24it/s]
15%|ββ | 150/1000 [01:08<06:09, 2.30it/s]
15%|ββ | 151/1000 [01:09<06:08, 2.30it/s]
15%|ββ | 152/1000 [01:09<06:11, 2.28it/s]
15%|ββ | 153/1000 [01:10<06:15, 2.26it/s]
15%|ββ | 154/1000 [01:10<05:57, 2.36it/s]
16%|ββ | 155/1000 [01:10<05:55, 2.38it/s]
16%|ββ | 156/1000 [01:11<06:01, 2.33it/s]
16%|ββ | 157/1000 [01:11<06:08, 2.29it/s]
16%|ββ | 158/1000 [01:12<06:13, 2.25it/s]
16%|ββ | 159/1000 [01:12<06:07, 2.29it/s]
16%|ββ | 160/1000 [01:13<06:00, 2.33it/s]
16%|ββ | 161/1000 [01:13<05:44, 2.43it/s]
{'loss': 0.652, 'grad_norm': 2.038006544113159, 'learning_rate': 9.61781419531641e-07, 'train/length': 2764.9, 'epoch': 0.17}
16%|ββ | 162/1000 [01:13<06:00, 2.32it/s]
16%|ββ | 163/1000 [01:14<06:01, 2.31it/s]
16%|ββ | 164/1000 [01:14<06:13, 2.24it/s]
16%|ββ | 165/1000 [01:15<06:08, 2.26it/s]
17%|ββ | 166/1000 [01:15<06:11, 2.24it/s]
17%|ββ | 167/1000 [01:16<06:06, 2.27it/s]
17%|ββ | 168/1000 [01:16<06:15, 2.22it/s]
17%|ββ | 169/1000 [01:16<05:55, 2.33it/s]
17%|ββ | 170/1000 [01:17<06:01, 2.29it/s]
17%|ββ | 171/1000 [01:17<05:45, 2.40it/s]
17%|ββ | 172/1000 [01:18<06:18, 2.19it/s]
17%|ββ | 173/1000 [01:18<06:12, 2.22it/s]
17%|ββ | 174/1000 [01:19<06:37, 2.08it/s]
18%|ββ | 175/1000 [01:19<06:21, 2.16it/s]
18%|ββ | 176/1000 [01:20<06:14, 2.20it/s]
18%|ββ | 177/1000 [01:20<06:23, 2.15it/s]
18%|ββ | 178/1000 [01:21<05:58, 2.29it/s]
{'loss': 0.6218, 'grad_norm': 1.9112772941589355, 'learning_rate': 9.551899173079606e-07, 'train/length': 2887.4, 'epoch': 0.18}
{'loss': 0.6423, 'grad_norm': 2.0989060401916504, 'learning_rate': 9.481006715927351e-07, 'train/length': 2856.2, 'epoch': 0.19}
18%|ββ | 179/1000 [01:21<05:54, 2.32it/s]
18%|ββ | 180/1000 [01:21<06:08, 2.23it/s]
18%|ββ | 181/1000 [01:22<05:56, 2.30it/s]
18%|ββ | 182/1000 [01:22<05:42, 2.39it/s]
18%|ββ | 183/1000 [01:23<05:47, 2.35it/s]
18%|ββ | 184/1000 [01:23<06:02, 2.25it/s]
18%|ββ | 185/1000 [01:24<06:03, 2.24it/s]
19%|ββ | 186/1000 [01:24<06:09, 2.20it/s]
19%|ββ | 187/1000 [01:25<06:55, 1.96it/s]
19%|ββ | 188/1000 [01:25<07:13, 1.87it/s]
19%|ββ | 189/1000 [01:26<06:57, 1.94it/s]
19%|ββ | 190/1000 [01:26<06:41, 2.02it/s]
19%|ββ | 191/1000 [01:27<06:32, 2.06it/s]
19%|ββ | 192/1000 [01:27<06:16, 2.14it/s]
{'loss': 0.6194, 'grad_norm': 2.389956474304199, 'learning_rate': 9.405214343720706e-07, 'train/length': 3069.7, 'epoch': 0.2}
19%|ββ | 193/1000 [01:28<06:05, 2.21it/s]
19%|ββ | 194/1000 [01:28<06:14, 2.15it/s]
20%|ββ | 195/1000 [01:29<06:32, 2.05it/s]
20%|ββ | 196/1000 [01:29<06:03, 2.21it/s]
20%|ββ | 197/1000 [01:29<06:01, 2.22it/s]
20%|ββ | 198/1000 [01:30<06:35, 2.03it/s]
20%|ββ | 199/1000 [01:30<06:26, 2.07it/s]
20%|ββ | 200/1000 [01:31<06:10, 2.16it/s]
20%|ββ | 201/1000 [01:32<07:12, 1.85it/s]
20%|ββ | 202/1000 [01:32<07:16, 1.83it/s]
20%|ββ | 203/1000 [01:33<06:51, 1.94it/s]
20%|ββ | 204/1000 [01:33<06:39, 1.99it/s]
20%|ββ | 205/1000 [01:34<06:20, 2.09it/s]
21%|ββ | 206/1000 [01:34<06:47, 1.95it/s]
21%|ββ | 207/1000 [01:35<06:20, 2.08it/s]
21%|ββ | 208/1000 [01:35<05:55, 2.23it/s]
21%|ββ | 209/1000 [01:35<05:53, 2.24it/s]
{'loss': 0.5923, 'grad_norm': 2.0992257595062256, 'learning_rate': 9.32460493430591e-07, 'train/length': 3422.0, 'epoch': 0.21}
{'loss': 0.5861, 'grad_norm': 2.1993329524993896, 'learning_rate': 9.239266632888658e-07, 'train/length': 3084.5, 'epoch': 0.22}
21%|ββ | 210/1000 [01:36<05:35, 2.35it/s]
21%|ββ | 211/1000 [01:36<05:37, 2.34it/s]
21%|ββ | 212/1000 [01:37<05:36, 2.34it/s]
21%|βββ | 213/1000 [01:37<06:10, 2.12it/s]
21%|βββ | 214/1000 [01:38<06:07, 2.14it/s]
22%|βββ | 215/1000 [01:38<05:57, 2.20it/s]
22%|βββ | 216/1000 [01:38<05:48, 2.25it/s]
22%|βββ | 217/1000 [01:39<05:42, 2.28it/s]
22%|βββ | 218/1000 [01:39<05:47, 2.25it/s]
22%|βββ | 219/1000 [01:40<05:51, 2.22it/s]
22%|βββ | 220/1000 [01:40<05:50, 2.22it/s]
22%|βββ | 221/1000 [01:41<05:42, 2.28it/s]
22%|βββ | 222/1000 [01:41<05:42, 2.27it/s]
22%|βββ | 223/1000 [01:42<05:33, 2.33it/s]
{'loss': 0.5917, 'grad_norm': 1.9166151285171509, 'learning_rate': 9.14929275564863e-07, 'train/length': 2711.2, 'epoch': 0.23}
22%|βββ | 224/1000 [01:42<05:43, 2.26it/s]
22%|βββ | 225/1000 [01:43<06:10, 2.09it/s]
23%|βββ | 226/1000 [01:43<05:49, 2.22it/s]
23%|βββ | 227/1000 [01:43<05:50, 2.20it/s]
23%|βββ | 228/1000 [01:44<05:44, 2.24it/s]
23%|βββ | 229/1000 [01:44<05:49, 2.20it/s]
23%|βββ | 230/1000 [01:45<05:58, 2.15it/s]
23%|βββ | 231/1000 [01:45<06:20, 2.02it/s]
23%|βββ | 232/1000 [01:46<06:35, 1.94it/s]
23%|βββ | 233/1000 [01:46<06:13, 2.05it/s]
23%|βββ | 234/1000 [01:47<06:10, 2.07it/s]
24%|βββ | 235/1000 [01:47<05:53, 2.17it/s]
24%|βββ | 236/1000 [01:48<05:56, 2.14it/s]
24%|βββ | 237/1000 [01:48<05:42, 2.23it/s]
24%|βββ | 238/1000 [01:49<05:39, 2.24it/s]
24%|βββ | 239/1000 [01:49<05:34, 2.27it/s]
{'loss': 0.5663, 'grad_norm': 2.2675297260284424, 'learning_rate': 9.0547816876996e-07, 'train/length': 3252.8, 'epoch': 0.24}
{'loss': 0.5901, 'grad_norm': 1.9638926982879639, 'learning_rate': 8.955836775506775e-07, 'train/length': 2892.5, 'epoch': 0.25}
24%|βββ | 240/1000 [01:49<05:31, 2.29it/s]
24%|βββ | 241/1000 [01:50<05:29, 2.30it/s]
24%|βββ | 242/1000 [01:50<05:21, 2.36it/s]
24%|βββ | 243/1000 [01:51<05:47, 2.18it/s]
24%|βββ | 244/1000 [01:51<05:46, 2.18it/s]
24%|βββ | 245/1000 [01:52<05:44, 2.19it/s]
25%|βββ | 246/1000 [01:52<05:36, 2.24it/s]
25%|βββ | 247/1000 [01:53<05:41, 2.20it/s]
25%|βββ | 248/1000 [01:53<05:31, 2.27it/s]
25%|βββ | 249/1000 [01:53<05:27, 2.29it/s]
25%|βββ | 250/1000 [01:54<05:28, 2.29it/s]
25%|βββ | 251/1000 [01:54<05:35, 2.23it/s]
25%|βββ | 252/1000 [01:55<05:40, 2.20it/s]
25%|βββ | 253/1000 [01:55<05:59, 2.08it/s]
25%|βββ | 254/1000 [01:56<05:50, 2.13it/s]
{'loss': 0.5627, 'grad_norm': 2.0307955741882324, 'learning_rate': 8.852566213878946e-07, 'train/length': 2908.4, 'epoch': 0.26}
26%|βββ | 255/1000 [01:56<05:45, 2.15it/s]
26%|βββ | 256/1000 [01:57<05:24, 2.30it/s]
26%|βββ | 257/1000 [01:57<05:24, 2.29it/s]
26%|βββ | 258/1000 [01:57<05:24, 2.29it/s]
26%|βββ | 259/1000 [01:58<05:33, 2.22it/s]
26%|βββ | 260/1000 [01:58<05:23, 2.28it/s]
26%|βββ | 261/1000 [01:59<05:47, 2.13it/s]
26%|βββ | 262/1000 [01:59<05:44, 2.14it/s]
26%|βββ | 263/1000 [02:00<05:33, 2.21it/s]
26%|βββ | 264/1000 [02:00<05:36, 2.18it/s]
26%|βββ | 265/1000 [02:01<05:26, 2.25it/s]
27%|βββ | 266/1000 [02:01<05:34, 2.20it/s]
27%|βββ | 267/1000 [02:02<05:28, 2.23it/s]
27%|βββ | 268/1000 [02:02<05:24, 2.26it/s]
27%|βββ | 269/1000 [02:03<05:48, 2.10it/s]
27%|βββ | 270/1000 [02:03<05:45, 2.11it/s]
{'loss': 0.5755, 'grad_norm': 2.2053990364074707, 'learning_rate': 8.745082927659046e-07, 'train/length': 3013.1, 'epoch': 0.27}
{'loss': 0.5648, 'grad_norm': 1.9379103183746338, 'learning_rate': 8.633504448242504e-07, 'train/length': 2889.3, 'epoch': 0.28}
27%|βββ | 271/1000 [02:03<05:41, 2.14it/s]
27%|βββ | 272/1000 [02:04<05:32, 2.19it/s]
27%|βββ | 273/1000 [02:04<05:21, 2.26it/s]
27%|βββ | 274/1000 [02:05<05:16, 2.30it/s]
28%|βββ | 275/1000 [02:05<05:01, 2.40it/s]
28%|βββ | 276/1000 [02:06<05:11, 2.32it/s]
28%|βββ | 277/1000 [02:06<06:25, 1.88it/s]
28%|βββ | 278/1000 [02:07<06:16, 1.92it/s]
28%|βββ | 279/1000 [02:07<06:04, 1.98it/s]
28%|βββ | 280/1000 [02:08<06:01, 1.99it/s]
28%|βββ | 281/1000 [02:08<05:42, 2.10it/s]
28%|βββ | 282/1000 [02:09<05:34, 2.15it/s]
28%|βββ | 283/1000 [02:09<05:32, 2.16it/s]
28%|βββ | 284/1000 [02:10<05:35, 2.14it/s]
{'loss': 0.5507, 'grad_norm': 1.9451686143875122, 'learning_rate': 8.517952785058384e-07, 'train/length': 2762.5, 'epoch': 0.29}
28%|βββ | 285/1000 [02:10<05:24, 2.20it/s]
29%|βββ | 286/1000 [02:10<05:25, 2.19it/s]
29%|βββ | 287/1000 [02:11<05:28, 2.17it/s]
29%|βββ | 288/1000 [02:11<05:22, 2.21it/s]
29%|βββ | 289/1000 [02:12<05:23, 2.20it/s]
29%|βββ | 290/1000 [02:12<06:03, 1.96it/s]
29%|βββ | 291/1000 [02:13<05:45, 2.05it/s]
29%|βββ | 292/1000 [02:13<05:34, 2.11it/s]
29%|βββ | 293/1000 [02:14<05:31, 2.13it/s]
29%|βββ | 294/1000 [02:14<05:35, 2.10it/s]
30%|βββ | 295/1000 [02:15<05:30, 2.13it/s]
30%|βββ | 296/1000 [02:15<05:24, 2.17it/s]
30%|βββ | 297/1000 [02:16<05:16, 2.22it/s]
30%|βββ | 298/1000 [02:16<05:00, 2.34it/s]
30%|βββ | 299/1000 [02:16<05:02, 2.31it/s]
30%|βββ | 300/1000 [02:17<04:56, 2.36it/s]
{'loss': 0.5616, 'grad_norm': 2.2565317153930664, 'learning_rate': 8.398554292153865e-07, 'train/length': 3278.6, 'epoch': 0.3}
{'loss': 0.5402, 'grad_norm': 2.2505862712860107, 'learning_rate': 8.275439530027947e-07, 'train/length': 2845.7, 'epoch': 0.31}
30%|βββ | 301/1000 [02:17<04:45, 2.45it/s]
30%|βββ | 302/1000 [02:18<04:46, 2.44it/s]
30%|βββ | 303/1000 [02:18<04:37, 2.51it/s]
30%|βββ | 304/1000 [02:18<04:45, 2.44it/s]
30%|βββ | 305/1000 [02:19<04:44, 2.44it/s]
31%|βββ | 306/1000 [02:19<05:01, 2.31it/s]
31%|βββ | 307/1000 [02:20<04:58, 2.32it/s]
31%|βββ | 308/1000 [02:20<04:55, 2.34it/s]
31%|βββ | 309/1000 [02:21<04:53, 2.35it/s]
31%|βββ | 310/1000 [02:21<04:42, 2.44it/s]
31%|βββ | 311/1000 [02:21<04:48, 2.39it/s]
31%|βββ | 312/1000 [02:22<04:45, 2.41it/s]
31%|ββββ | 313/1000 [02:22<04:47, 2.39it/s]
31%|ββββ | 314/1000 [02:23<04:51, 2.35it/s]
{'loss': 0.5391, 'grad_norm': 2.210747241973877, 'learning_rate': 8.148743122865463e-07, 'train/length': 2786.0, 'epoch': 0.32}
32%|ββββ | 315/1000 [02:23<05:00, 2.28it/s]
32%|ββββ | 316/1000 [02:24<04:55, 2.31it/s]
32%|ββββ | 317/1000 [02:24<04:43, 2.41it/s]
32%|ββββ | 318/1000 [02:24<04:43, 2.41it/s]
32%|ββββ | 319/1000 [02:25<04:48, 2.36it/s]
32%|ββββ | 320/1000 [02:25<04:57, 2.28it/s]
32%|ββββ | 321/1000 [02:26<04:56, 2.29it/s]
32%|ββββ | 322/1000 [02:26<05:07, 2.20it/s]
32%|ββββ | 323/1000 [02:27<05:02, 2.24it/s]
32%|ββββ | 324/1000 [02:27<04:56, 2.28it/s]
32%|ββββ | 325/1000 [02:27<04:52, 2.30it/s]
33%|ββββ | 326/1000 [02:28<04:49, 2.33it/s]
33%|ββββ | 327/1000 [02:28<05:10, 2.17it/s]
33%|ββββ | 328/1000 [02:29<04:50, 2.31it/s]
33%|ββββ | 329/1000 [02:29<04:51, 2.30it/s]
{'loss': 0.5439, 'grad_norm': 1.867335319519043, 'learning_rate': 8.018603611327504e-07, 'train/length': 2700.0, 'epoch': 0.33}
{'loss': 0.5485, 'grad_norm': 2.0507471561431885, 'learning_rate': 7.88516330105925e-07, 'train/length': 3127.7, 'epoch': 0.34}
33%|ββββ | 330/1000 [02:30<04:50, 2.30it/s]
33%|ββββ | 331/1000 [02:30<04:46, 2.34it/s]
33%|ββββ | 332/1000 [02:31<04:55, 2.26it/s]
33%|ββββ | 333/1000 [02:31<05:05, 2.18it/s]
33%|ββββ | 334/1000 [02:31<04:55, 2.26it/s]
34%|ββββ | 335/1000 [02:32<05:04, 2.18it/s]
34%|ββββ | 336/1000 [02:33<05:27, 2.03it/s]
34%|ββββ | 337/1000 [02:33<05:19, 2.08it/s]
34%|ββββ | 338/1000 [02:33<05:10, 2.13it/s]
34%|ββββ | 339/1000 [02:34<05:57, 1.85it/s]
34%|ββββ | 340/1000 [02:35<05:37, 1.95it/s]
34%|ββββ | 341/1000 [02:35<05:19, 2.06it/s]
34%|ββββ | 342/1000 [02:35<04:57, 2.21it/s]
34%|ββββ | 343/1000 [02:36<05:01, 2.18it/s]
{'loss': 0.5309, 'grad_norm': 2.066507577896118, 'learning_rate': 7.74856810708083e-07, 'train/length': 2808.5, 'epoch': 0.35}
34%|ββββ | 344/1000 [02:36<04:53, 2.23it/s]
34%|ββββ | 345/1000 [02:37<04:50, 2.25it/s]
35%|ββββ | 346/1000 [02:37<04:43, 2.30it/s]
35%|ββββ | 347/1000 [02:38<04:38, 2.35it/s]
35%|ββββ | 348/1000 [02:38<04:39, 2.33it/s]
35%|ββββ | 349/1000 [02:38<04:43, 2.30it/s]
35%|ββββ | 350/1000 [02:39<04:47, 2.26it/s]
35%|ββββ | 351/1000 [02:39<04:43, 2.29it/s]
35%|ββββ | 352/1000 [02:40<04:47, 2.25it/s]
35%|ββββ | 353/1000 [02:40<04:45, 2.27it/s]
35%|ββββ | 354/1000 [02:41<04:42, 2.29it/s]
36%|ββββ | 355/1000 [02:41<04:37, 2.33it/s]
36%|ββββ | 356/1000 [02:41<04:26, 2.42it/s]
36%|ββββ | 357/1000 [02:42<04:37, 2.32it/s]
36%|ββββ | 358/1000 [02:42<04:35, 2.33it/s]
{'loss': 0.5258, 'grad_norm': 1.9634464979171753, 'learning_rate': 7.608967394231386e-07, 'train/length': 2808.8, 'epoch': 0.36}
{'loss': 0.5122, 'grad_norm': 2.1704704761505127, 'learning_rate': 7.466513813840824e-07, 'train/length': 2965.7, 'epoch': 0.37}
36%|ββββ | 359/1000 [02:43<04:38, 2.30it/s]
36%|ββββ | 360/1000 [02:43<04:42, 2.27it/s]
36%|ββββ | 361/1000 [02:44<04:29, 2.37it/s]
36%|ββββ | 362/1000 [02:44<04:34, 2.32it/s]
36%|ββββ | 363/1000 [02:44<04:35, 2.31it/s]
36%|ββββ | 364/1000 [02:45<04:37, 2.29it/s]
36%|ββββ | 365/1000 [02:45<04:36, 2.29it/s]
37%|ββββ | 366/1000 [02:46<04:37, 2.28it/s]
37%|ββββ | 367/1000 [02:46<04:43, 2.23it/s]
37%|ββββ | 368/1000 [02:47<04:42, 2.24it/s]
37%|ββββ | 369/1000 [02:47<04:59, 2.11it/s]
37%|ββββ | 370/1000 [02:48<04:50, 2.17it/s]
37%|ββββ | 371/1000 [02:48<04:50, 2.16it/s]
37%|ββββ | 372/1000 [02:49<04:53, 2.14it/s]
{'loss': 0.5304, 'grad_norm': 1.9664157629013062, 'learning_rate': 7.321363136807818e-07, 'train/length': 2680.5, 'epoch': 0.38}
37%|ββββ | 373/1000 [02:49<04:46, 2.19it/s]
37%|ββββ | 374/1000 [02:49<04:37, 2.25it/s]
38%|ββββ | 375/1000 [02:50<04:42, 2.21it/s]
38%|ββββ | 376/1000 [02:50<04:36, 2.26it/s]
38%|ββββ | 377/1000 [02:51<04:41, 2.22it/s]
38%|ββββ | 378/1000 [02:51<04:25, 2.34it/s]
38%|ββββ | 379/1000 [02:52<04:23, 2.36it/s]
38%|ββββ | 380/1000 [02:52<04:55, 2.10it/s]
38%|ββββ | 381/1000 [02:53<04:46, 2.16it/s]
38%|ββββ | 382/1000 [02:53<04:36, 2.24it/s]
38%|ββββ | 383/1000 [02:54<04:41, 2.19it/s]
38%|ββββ | 384/1000 [02:54<04:34, 2.24it/s]
38%|ββββ | 385/1000 [02:54<04:32, 2.26it/s]
39%|ββββ | 386/1000 [02:55<04:32, 2.26it/s]
39%|ββββ | 387/1000 [02:55<04:49, 2.12it/s]
{'loss': 0.538, 'grad_norm': 2.3522841930389404, 'learning_rate': 7.173674083266623e-07, 'train/length': 2875.9, 'epoch': 0.39}
{'loss': 0.5241, 'grad_norm': 2.012702703475952, 'learning_rate': 7.023608149028936e-07, 'train/length': 2936.9, 'epoch': 0.4}
39%|ββββ | 388/1000 [02:56<04:37, 2.20it/s]
39%|ββββ | 389/1000 [02:56<04:35, 2.22it/s]
39%|ββββ | 390/1000 [02:57<04:21, 2.33it/s]
39%|ββββ | 391/1000 [02:57<04:26, 2.28it/s]
39%|ββββ | 392/1000 [02:58<04:28, 2.26it/s]
39%|ββββ | 393/1000 [02:58<04:31, 2.23it/s]
39%|ββββ | 394/1000 [02:59<04:51, 2.08it/s]
40%|ββββ | 395/1000 [02:59<04:43, 2.14it/s]
40%|ββββ | 396/1000 [02:59<04:41, 2.15it/s]
40%|ββββ | 397/1000 [03:00<04:24, 2.28it/s]
40%|ββββ | 398/1000 [03:00<04:17, 2.34it/s]
40%|ββββ | 399/1000 [03:01<04:05, 2.44it/s]
40%|ββββ | 400/1000 [03:01<04:06, 2.44it/s]
40%|ββββ | 401/1000 [03:01<04:06, 2.43it/s]
{'loss': 0.5156, 'grad_norm': 2.1949377059936523, 'learning_rate': 6.871329428990601e-07, 'train/length': 3036.7, 'epoch': 0.41}
40%|ββββ | 402/1000 [03:02<04:11, 2.37it/s]
40%|ββββ | 403/1000 [03:02<04:23, 2.26it/s]
40%|ββββ | 404/1000 [03:03<04:25, 2.25it/s]
40%|ββββ | 405/1000 [03:03<04:30, 2.20it/s]
41%|ββββ | 406/1000 [03:04<04:34, 2.17it/s]
41%|ββββ | 407/1000 [03:04<04:30, 2.19it/s]
41%|ββββ | 408/1000 [03:05<04:30, 2.19it/s]
41%|ββββ | 409/1000 [03:05<04:27, 2.21it/s]
41%|ββββ | 410/1000 [03:06<04:20, 2.27it/s]
41%|ββββ | 411/1000 [03:06<04:26, 2.21it/s]
41%|ββββ | 412/1000 [03:06<04:26, 2.21it/s]
41%|βββββ | 413/1000 [03:07<04:22, 2.23it/s]
41%|βββββ | 414/1000 [03:07<04:23, 2.23it/s]
42%|βββββ | 415/1000 [03:08<04:26, 2.20it/s]
42%|βββββ | 416/1000 [03:08<04:17, 2.26it/s]
{'loss': 0.513, 'grad_norm': 1.956830382347107, 'learning_rate': 6.717004437696249e-07, 'train/length': 2671.5, 'epoch': 0.42}
{'loss': 0.4954, 'grad_norm': 2.126512289047241, 'learning_rate': 6.560801927258079e-07, 'train/length': 2967.7, 'epoch': 0.43}
42%|βββββ | 417/1000 [03:09<04:22, 2.22it/s]
42%|βββββ | 418/1000 [03:09<04:20, 2.24it/s]
42%|βββββ | 419/1000 [03:10<04:17, 2.25it/s]
42%|βββββ | 420/1000 [03:10<04:16, 2.26it/s]
42%|βββββ | 421/1000 [03:10<04:09, 2.32it/s]
42%|βββββ | 422/1000 [03:11<04:05, 2.35it/s]
42%|βββββ | 423/1000 [03:11<04:10, 2.30it/s]
42%|βββββ | 424/1000 [03:12<04:15, 2.26it/s]
42%|βββββ | 425/1000 [03:12<04:37, 2.07it/s]
43%|βββββ | 426/1000 [03:13<04:28, 2.14it/s]
43%|βββββ | 427/1000 [03:13<04:23, 2.17it/s]
43%|βββββ | 428/1000 [03:14<04:37, 2.06it/s]
43%|βββββ | 429/1000 [03:14<04:36, 2.07it/s]
43%|βββββ | 430/1000 [03:15<04:27, 2.13it/s]
{'loss': 0.499, 'grad_norm': 1.9472200870513916, 'learning_rate': 6.402892702827916e-07, 'train/length': 3022.3, 'epoch': 0.44}
43%|βββββ | 431/1000 [03:15<04:11, 2.26it/s]
43%|βββββ | 432/1000 [03:15<04:12, 2.25it/s]
43%|βββββ | 433/1000 [03:16<04:09, 2.27it/s]
43%|βββββ | 434/1000 [03:16<04:02, 2.33it/s]
44%|βββββ | 435/1000 [03:17<04:13, 2.23it/s]
44%|βββββ | 436/1000 [03:17<04:13, 2.22it/s]
44%|βββββ | 437/1000 [03:18<04:08, 2.27it/s]
44%|βββββ | 438/1000 [03:18<04:30, 2.08it/s]
44%|βββββ | 439/1000 [03:19<04:22, 2.14it/s]
44%|βββββ | 440/1000 [03:19<04:46, 1.96it/s]
44%|βββββ | 441/1000 [03:20<04:22, 2.13it/s]
44%|βββββ | 442/1000 [03:20<04:16, 2.17it/s]
44%|βββββ | 443/1000 [03:21<04:09, 2.23it/s]
44%|βββββ | 444/1000 [03:21<04:10, 2.22it/s]
{'loss': 0.5099, 'grad_norm': 1.9985867738723755, 'learning_rate': 6.243449435824276e-07, 'train/length': 2944.6, 'epoch': 0.45}
44%|βββββ | 445/1000 [03:21<04:03, 2.28it/s]
45%|βββββ | 446/1000 [03:22<03:57, 2.33it/s]
45%|βββββ | 447/1000 [03:22<03:47, 2.43it/s]
45%|βββββ | 448/1000 [03:23<03:41, 2.49it/s]
45%|βββββ | 449/1000 [03:23<03:50, 2.40it/s]
45%|βββββ | 450/1000 [03:23<03:54, 2.35it/s]
45%|βββββ | 451/1000 [03:24<03:54, 2.34it/s]
45%|βββββ | 452/1000 [03:24<03:54, 2.33it/s]
45%|βββββ | 453/1000 [03:25<04:02, 2.25it/s]
45%|βββββ | 454/1000 [03:25<04:07, 2.20it/s]
46%|βββββ | 455/1000 [03:26<04:04, 2.23it/s]
46%|βββββ | 456/1000 [03:26<04:08, 2.19it/s]
46%|βββββ | 457/1000 [03:27<03:54, 2.32it/s]
46%|βββββ | 458/1000 [03:27<03:53, 2.32it/s]
46%|βββββ | 459/1000 [03:27<03:52, 2.33it/s]
{'loss': 0.5061, 'grad_norm': 1.8925042152404785, 'learning_rate': 6.082646475118699e-07, 'train/length': 3218.9, 'epoch': 0.46}
{'loss': 0.5161, 'grad_norm': 2.0513346195220947, 'learning_rate': 5.920659656387836e-07, 'train/length': 2673.4, 'epoch': 0.47}
46%|βββββ | 460/1000 [03:28<04:40, 1.93it/s]
46%|βββββ | 461/1000 [03:29<04:24, 2.04it/s]
46%|βββββ | 462/1000 [03:29<04:11, 2.14it/s]
46%|βββββ | 463/1000 [03:29<04:02, 2.21it/s]
46%|βββββ | 464/1000 [03:30<03:57, 2.26it/s]
46%|βββββ | 465/1000 [03:30<03:55, 2.28it/s]
47%|βββββ | 466/1000 [03:31<03:44, 2.38it/s]
47%|βββββ | 467/1000 [03:31<03:52, 2.30it/s]
47%|βββββ | 468/1000 [03:32<03:48, 2.33it/s]
47%|βββββ | 469/1000 [03:32<04:31, 1.95it/s]
47%|βββββ | 470/1000 [03:33<04:21, 2.03it/s]
47%|βββββ | 471/1000 [03:33<04:17, 2.05it/s]
47%|βββββ | 472/1000 [03:34<04:09, 2.12it/s]
{'loss': 0.4954, 'grad_norm': 2.1877002716064453, 'learning_rate': 5.757666109839702e-07, 'train/length': 2849.1, 'epoch': 0.48}
47%|βββββ | 473/1000 [03:34<04:04, 2.16it/s]
47%|βββββ | 474/1000 [03:34<03:48, 2.30it/s]
48%|βββββ | 475/1000 [03:35<03:51, 2.26it/s]
48%|βββββ | 476/1000 [03:35<03:53, 2.25it/s]
48%|βββββ | 477/1000 [03:36<03:56, 2.21it/s]
48%|βββββ | 478/1000 [03:36<03:49, 2.27it/s]
48%|βββββ | 479/1000 [03:37<03:38, 2.38it/s]
48%|βββββ | 480/1000 [03:37<03:36, 2.40it/s]
48%|βββββ | 481/1000 [03:37<03:44, 2.31it/s]
48%|βββββ | 482/1000 [03:38<04:07, 2.09it/s]
48%|βββββ | 483/1000 [03:38<04:00, 2.15it/s]
48%|βββββ | 484/1000 [03:39<04:49, 1.78it/s]
48%|βββββ | 485/1000 [03:40<04:32, 1.89it/s]
49%|βββββ | 486/1000 [03:40<04:26, 1.93it/s]
49%|βββββ | 487/1000 [03:41<04:29, 1.91it/s]
{'loss': 0.5249, 'grad_norm': 2.507943630218506, 'learning_rate': 5.5938440665244e-07, 'train/length': 3359.9, 'epoch': 0.49}
{'loss': 0.4807, 'grad_norm': 2.068692922592163, 'learning_rate': 5.429372663441085e-07, 'train/length': 2714.9, 'epoch': 0.5}
49%|βββββ | 488/1000 [03:41<04:23, 1.94it/s]
49%|βββββ | 489/1000 [03:42<04:12, 2.03it/s]
49%|βββββ | 490/1000 [03:42<04:05, 2.08it/s]
49%|βββββ | 491/1000 [03:43<03:54, 2.17it/s]
49%|βββββ | 492/1000 [03:43<03:53, 2.18it/s]
49%|βββββ | 493/1000 [03:43<03:44, 2.26it/s]
49%|βββββ | 494/1000 [03:44<03:40, 2.30it/s]
50%|βββββ | 495/1000 [03:44<03:36, 2.33it/s]
50%|βββββ | 496/1000 [03:45<03:25, 2.45it/s]
50%|βββββ | 497/1000 [03:45<03:18, 2.53it/s]
50%|βββββ | 498/1000 [03:45<03:22, 2.48it/s]
50%|βββββ | 499/1000 [03:46<03:25, 2.44it/s]
50%|βββββ | 500/1000 [03:46<03:26, 2.42it/s]
{'loss': 0.4968, 'grad_norm': 1.9514744281768799, 'learning_rate': 5.264431747654283e-07, 'train/length': 2707.7, 'epoch': 0.51}
50%|βββββ | 501/1000 [03:47<03:25, 2.43it/s]
50%|βββββ | 502/1000 [03:47<03:23, 2.44it/s]
50%|βββββ | 503/1000 [03:47<03:26, 2.41it/s]
50%|βββββ | 504/1000 [03:48<03:23, 2.44it/s]
50%|βββββ | 505/1000 [03:48<03:44, 2.20it/s]
51%|βββββ | 506/1000 [03:49<04:06, 2.01it/s]
51%|βββββ | 507/1000 [03:49<03:58, 2.06it/s]
51%|βββββ | 508/1000 [03:50<03:57, 2.07it/s]
51%|βββββ | 509/1000 [03:50<03:52, 2.11it/s]
51%|βββββ | 510/1000 [03:51<03:48, 2.15it/s]
51%|βββββ | 511/1000 [03:51<03:51, 2.11it/s]
51%|βββββ | 512/1000 [03:52<04:41, 1.73it/s]
51%|ββββββ | 513/1000 [03:53<04:26, 1.83it/s]
51%|ββββββ | 514/1000 [03:53<04:14, 1.91it/s]
52%|ββββββ | 515/1000 [03:54<03:57, 2.05it/s]
{'loss': 0.4916, 'grad_norm': 1.9575438499450684, 'learning_rate': 5.099201679633768e-07, 'train/length': 3217.4, 'epoch': 0.52}
52%|ββββββ | 516/1000 [03:54<03:57, 2.04it/s]
52%|ββββββ | 517/1000 [03:54<03:50, 2.10it/s]
52%|ββββββ | 518/1000 [03:55<03:46, 2.13it/s]
52%|ββββββ | 519/1000 [03:55<03:45, 2.14it/s]
52%|ββββββ | 520/1000 [03:56<03:45, 2.13it/s]
52%|ββββββ | 521/1000 [03:56<03:54, 2.04it/s]
52%|ββββββ | 522/1000 [03:57<03:47, 2.10it/s]
52%|ββββββ | 523/1000 [03:57<03:31, 2.25it/s]
52%|ββββββ | 524/1000 [03:58<03:29, 2.27it/s]
52%|ββββββ | 525/1000 [03:58<03:26, 2.30it/s]
53%|ββββββ | 526/1000 [03:58<03:25, 2.31it/s]
53%|ββββββ | 527/1000 [03:59<03:44, 2.11it/s]
53%|ββββββ | 528/1000 [03:59<03:33, 2.21it/s]
53%|ββββββ | 529/1000 [04:00<03:38, 2.15it/s]
53%|ββββββ | 530/1000 [04:00<03:34, 2.19it/s]
{'loss': 0.4968, 'grad_norm': 2.145535707473755, 'learning_rate': 4.933863136033039e-07, 'train/length': 3204.5, 'epoch': 0.53}
{'loss': 0.5037, 'grad_norm': 1.955794095993042, 'learning_rate': 4.768596912122045e-07, 'train/length': 3039.2, 'epoch': 0.54}
53%|ββββββ | 531/1000 [04:01<03:30, 2.23it/s]
53%|ββββββ | 532/1000 [04:01<03:29, 2.24it/s]
53%|ββββββ | 533/1000 [04:02<03:22, 2.30it/s]
53%|ββββββ | 534/1000 [04:02<03:22, 2.30it/s]
54%|ββββββ | 535/1000 [04:03<03:23, 2.28it/s]
54%|ββββββ | 536/1000 [04:03<03:28, 2.23it/s]
54%|ββββββ | 537/1000 [04:03<03:33, 2.17it/s]
54%|ββββββ | 538/1000 [04:04<03:32, 2.17it/s]
54%|ββββββ | 539/1000 [04:04<03:29, 2.20it/s]
54%|ββββββ | 540/1000 [04:05<03:32, 2.17it/s]
54%|ββββββ | 541/1000 [04:05<03:48, 2.00it/s]
54%|ββββββ | 542/1000 [04:06<03:41, 2.07it/s]
{'loss': 0.4913, 'grad_norm': 2.091341257095337, 'learning_rate': 4.60358372409022e-07, 'train/length': 3083.5, 'epoch': 0.55}
54%|ββββββ | 543/1000 [04:06<03:51, 1.98it/s]
54%|ββββββ | 544/1000 [04:07<03:42, 2.05it/s]
55%|ββββββ | 545/1000 [04:07<03:30, 2.17it/s]
55%|ββββββ | 546/1000 [04:08<03:24, 2.22it/s]
55%|ββββββ | 547/1000 [04:08<03:21, 2.25it/s]
55%|ββββββ | 548/1000 [04:09<03:25, 2.20it/s]
55%|ββββββ | 549/1000 [04:09<03:37, 2.07it/s]
55%|ββββββ | 550/1000 [04:10<03:29, 2.14it/s]
55%|ββββββ | 551/1000 [04:10<03:26, 2.17it/s]
55%|ββββββ | 552/1000 [04:11<03:24, 2.19it/s]
55%|ββββββ | 553/1000 [04:11<03:19, 2.24it/s]
55%|ββββββ | 554/1000 [04:11<03:17, 2.26it/s]
56%|ββββββ | 555/1000 [04:12<03:08, 2.36it/s]
56%|ββββββ | 556/1000 [04:12<03:16, 2.25it/s]
56%|ββββββ | 557/1000 [04:13<03:16, 2.25it/s]
56%|ββββββ | 558/1000 [04:13<03:36, 2.04it/s]
56%|ββββββ | 559/1000 [04:14<03:55, 1.88it/s]
56%|ββββββ | 560/1000 [04:14<03:33, 2.06it/s]
56%|ββββββ | 561/1000 [04:15<03:24, 2.14it/s]
56%|ββββββ | 562/1000 [04:15<03:17, 2.22it/s]
56%|ββββββ | 563/1000 [04:15<03:06, 2.35it/s]
56%|ββββββ | 564/1000 [04:16<03:09, 2.30it/s]
56%|ββββββ | 565/1000 [04:16<03:07, 2.32it/s]
57%|ββββββ | 566/1000 [04:17<03:10, 2.28it/s]
57%|ββββββ | 567/1000 [04:17<03:12, 2.25it/s]
57%|ββββββ | 568/1000 [04:18<03:02, 2.37it/s]
57%|ββββββ | 569/1000 [04:18<03:22, 2.13it/s]
57%|ββββββ | 570/1000 [04:19<03:23, 2.11it/s]
{'loss': 0.4797, 'grad_norm': 2.2423527240753174, 'learning_rate': 4.1118642034694565e-07, 'train/length': 2963.6, 'epoch': 0.58}
57%|ββββββ | 571/1000 [04:19<03:14, 2.20it/s]
57%|ββββββ | 572/1000 [04:20<03:14, 2.20it/s]
57%|ββββββ | 573/1000 [04:20<03:12, 2.22it/s]
57%|ββββββ | 574/1000 [04:20<03:05, 2.29it/s]
57%|ββββββ | 575/1000 [04:21<03:09, 2.24it/s]
58%|ββββββ | 576/1000 [04:21<03:04, 2.30it/s]
58%|ββββββ | 577/1000 [04:22<03:23, 2.07it/s]
58%|ββββββ | 578/1000 [04:22<03:16, 2.15it/s]
58%|ββββββ | 579/1000 [04:23<03:16, 2.14it/s]
58%|ββββββ | 580/1000 [04:23<03:14, 2.16it/s]
58%|ββββββ | 581/1000 [04:24<03:06, 2.25it/s]
58%|ββββββ | 582/1000 [04:24<03:04, 2.27it/s]
58%|ββββββ | 583/1000 [04:25<03:02, 2.28it/s]
58%|ββββββ | 584/1000 [04:25<03:02, 2.28it/s]
{'loss': 0.4867, 'grad_norm': 1.9821356534957886, 'learning_rate': 3.9496618307341713e-07, 'train/length': 2880.3, 'epoch': 0.59}
58%|ββββββ | 585/1000 [04:25<03:06, 2.22it/s]
59%|ββββββ | 586/1000 [04:26<03:05, 2.23it/s]
59%|ββββββ | 587/1000 [04:26<03:02, 2.26it/s]
59%|ββββββ | 588/1000 [04:27<03:01, 2.27it/s]
59%|ββββββ | 589/1000 [04:27<03:17, 2.08it/s]
59%|ββββββ | 590/1000 [04:28<03:12, 2.13it/s]
59%|ββββββ | 591/1000 [04:28<03:08, 2.17it/s]
59%|ββββββ | 592/1000 [04:29<03:01, 2.25it/s]
59%|ββββββ | 593/1000 [04:29<02:58, 2.28it/s]
59%|ββββββ | 594/1000 [04:29<02:56, 2.30it/s]
60%|ββββββ | 595/1000 [04:30<02:57, 2.29it/s]
60%|ββββββ | 596/1000 [04:30<02:57, 2.27it/s]
60%|ββββββ | 597/1000 [04:31<02:49, 2.38it/s]
60%|ββββββ | 598/1000 [04:31<02:53, 2.32it/s]
60%|ββββββ | 599/1000 [04:32<02:54, 2.30it/s]
{'loss': 0.4847, 'grad_norm': 2.069033622741699, 'learning_rate': 3.788607987366069e-07, 'train/length': 3026.9, 'epoch': 0.6}
{'loss': 0.498, 'grad_norm': 2.075808525085449, 'learning_rate': 3.6288787833783016e-07, 'train/length': 3130.2, 'epoch': 0.61}
60%|ββββββ | 600/1000 [04:32<02:57, 2.26it/s]
60%|ββββββ | 601/1000 [04:33<02:58, 2.24it/s]
60%|ββββββ | 602/1000 [04:33<03:13, 2.05it/s]
60%|ββββββ | 603/1000 [04:34<03:12, 2.07it/s]
60%|ββββββ | 604/1000 [04:34<03:05, 2.14it/s]
60%|ββββββ | 605/1000 [04:34<03:01, 2.17it/s]
61%|ββββββ | 606/1000 [04:35<02:56, 2.24it/s]
61%|ββββββ | 607/1000 [04:35<02:54, 2.25it/s]
61%|ββββββ | 608/1000 [04:36<02:51, 2.28it/s]
61%|ββββββ | 609/1000 [04:36<02:51, 2.28it/s]
61%|ββββββ | 610/1000 [04:37<02:51, 2.28it/s]
61%|ββββββ | 611/1000 [04:37<02:48, 2.31it/s]
61%|βββββββ | 612/1000 [04:38<03:04, 2.10it/s]
61%|βββββββ | 613/1000 [04:38<03:04, 2.10it/s]
61%|βββββββ | 614/1000 [04:38<02:52, 2.24it/s]
62%|βββββββ | 615/1000 [04:39<02:49, 2.27it/s]
62%|βββββββ | 616/1000 [04:39<02:50, 2.26it/s]
62%|βββββββ | 617/1000 [04:40<02:52, 2.22it/s]
62%|βββββββ | 618/1000 [04:40<02:48, 2.27it/s]
62%|βββββββ | 619/1000 [04:41<02:45, 2.30it/s]
62%|βββββββ | 620/1000 [04:41<02:48, 2.26it/s]
62%|βββββββ | 621/1000 [04:42<02:45, 2.29it/s]
62%|βββββββ | 622/1000 [04:42<02:46, 2.27it/s]
62%|βββββββ | 623/1000 [04:42<02:42, 2.32it/s]
62%|βββββββ | 624/1000 [04:43<02:35, 2.42it/s]
62%|βββββββ | 625/1000 [04:43<02:41, 2.32it/s]
63%|βββββββ | 626/1000 [04:44<02:40, 2.33it/s]
63%|βββββββ | 627/1000 [04:44<02:41, 2.31it/s]
63%|βββββββ | 628/1000 [04:45<02:45, 2.25it/s]
63%|βββββββ | 629/1000 [04:45<02:46, 2.23it/s]
63%|βββββββ | 630/1000 [04:46<02:48, 2.20it/s]
63%|βββββββ | 631/1000 [04:46<03:16, 1.88it/s]
63%|βββββββ | 632/1000 [04:47<03:03, 2.01it/s]
63%|βββββββ | 633/1000 [04:47<02:58, 2.05it/s]
63%|βββββββ | 634/1000 [04:48<02:52, 2.12it/s]
64%|βββββββ | 635/1000 [04:48<02:40, 2.28it/s]
64%|βββββββ | 636/1000 [04:48<02:38, 2.30it/s]
64%|βββββββ | 637/1000 [04:49<02:31, 2.40it/s]
64%|βββββββ | 638/1000 [04:49<02:31, 2.40it/s]
64%|βββββββ | 639/1000 [04:50<02:46, 2.17it/s]
64%|βββββββ | 640/1000 [04:50<02:46, 2.16it/s]
64%|βββββββ | 641/1000 [04:51<02:36, 2.29it/s]
64%|βββββββ | 642/1000 [04:51<02:38, 2.25it/s]
64%|βββββββ | 643/1000 [04:52<02:49, 2.10it/s]
64%|βββββββ | 644/1000 [04:52<02:42, 2.20it/s]
64%|βββββββ | 645/1000 [04:52<02:42, 2.19it/s]
65%|βββββββ | 646/1000 [04:53<02:38, 2.23it/s]
65%|βββββββ | 647/1000 [04:53<02:37, 2.25it/s]
65%|βββββββ | 648/1000 [04:54<02:48, 2.09it/s]
65%|βββββββ | 649/1000 [04:54<02:45, 2.12it/s]
65%|βββββββ | 650/1000 [04:55<02:41, 2.17it/s]
65%|βββββββ | 651/1000 [04:55<02:40, 2.18it/s]
65%|βββββββ | 652/1000 [04:56<02:39, 2.19it/s]
65%|βββββββ | 653/1000 [04:56<02:36, 2.21it/s]
65%|βββββββ | 654/1000 [04:57<02:48, 2.05it/s]
66%|βββββββ | 655/1000 [04:57<02:40, 2.15it/s]
66%|βββββββ | 656/1000 [04:57<02:36, 2.19it/s]
66%|βββββββ | 657/1000 [04:58<02:34, 2.22it/s]
66%|βββββββ | 658/1000 [04:58<02:30, 2.27it/s]
66%|βββββββ | 659/1000 [04:59<02:26, 2.33it/s]
66%|βββββββ | 660/1000 [04:59<02:25, 2.33it/s]
66%|βββββββ | 661/1000 [05:00<02:37, 2.16it/s]
66%|βββββββ | 662/1000 [05:00<02:35, 2.17it/s]
66%|βββββββ | 663/1000 [05:01<02:35, 2.17it/s]
66%|βββββββ | 664/1000 [05:01<02:33, 2.19it/s]
66%|βββββββ | 665/1000 [05:02<02:31, 2.21it/s]
67%|βββββββ | 666/1000 [05:02<02:33, 2.17it/s]
67%|βββββββ | 667/1000 [05:02<02:31, 2.20it/s]
67%|βββββββ | 668/1000 [05:03<02:31, 2.19it/s]
67%|βββββββ | 669/1000 [05:03<02:27, 2.24it/s]
67%|βββββββ | 670/1000 [05:04<03:16, 1.68it/s]
67%|βββββββ | 671/1000 [05:05<03:04, 1.78it/s]
67%|βββββββ | 672/1000 [05:05<02:51, 1.91it/s]
67%|βββββββ | 673/1000 [05:06<02:41, 2.03it/s]
67%|βββββββ | 674/1000 [05:06<02:29, 2.19it/s]
68%|βββββββ | 675/1000 [05:06<02:30, 2.16it/s]
68%|βββββββ | 676/1000 [05:07<02:27, 2.20it/s]
68%|βββββββ | 677/1000 [05:07<02:35, 2.08it/s]
68%|βββββββ | 678/1000 [05:08<02:27, 2.19it/s]
68%|βββββββ | 679/1000 [05:08<02:30, 2.14it/s]
68%|βββββββ | 680/1000 [05:09<02:26, 2.18it/s]
68%|βββββββ | 681/1000 [05:09<02:23, 2.22it/s]
68%|βββββββ | 682/1000 [05:10<02:15, 2.34it/s]
68%|βββββββ | 683/1000 [05:10<02:17, 2.30it/s]
68%|βββββββ | 684/1000 [05:10<02:14, 2.36it/s]
68%|βββββββ | 685/1000 [05:11<02:15, 2.33it/s]
69%|βββββββ | 686/1000 [05:11<02:14, 2.33it/s]
69%|βββββββ | 687/1000 [05:12<02:18, 2.27it/s]
69%|βββββββ | 688/1000 [05:12<02:17, 2.27it/s]
69%|βββββββ | 689/1000 [05:13<02:15, 2.30it/s]
69%|βββββββ | 690/1000 [05:13<02:09, 2.40it/s]
69%|βββββββ | 691/1000 [05:14<02:25, 2.12it/s]
{'loss': 0.4723, 'grad_norm': 1.8892021179199219, 'learning_rate': 2.2791164325437046e-07, 'train/length': 2926.8, 'epoch': 0.7}
69%|βββββββ | 692/1000 [05:14<02:20, 2.19it/s]
69%|βββββββ | 693/1000 [05:14<02:18, 2.22it/s]
69%|βββββββ | 694/1000 [05:15<02:19, 2.19it/s]
70%|βββββββ | 695/1000 [05:15<02:17, 2.22it/s]
70%|βββββββ | 696/1000 [05:16<02:14, 2.26it/s]
70%|βββββββ | 697/1000 [05:16<02:16, 2.22it/s]
70%|βββββββ | 698/1000 [05:17<02:16, 2.21it/s]
70%|βββββββ | 699/1000 [05:17<02:09, 2.33it/s]
70%|βββββββ | 700/1000 [05:18<02:10, 2.29it/s]
70%|βββββββ | 701/1000 [05:18<02:12, 2.26it/s]
70%|βββββββ | 702/1000 [05:18<02:10, 2.28it/s]
70%|βββββββ | 703/1000 [05:19<02:07, 2.33it/s]
70%|βββββββ | 704/1000 [05:19<02:09, 2.29it/s]
70%|βββββββ | 705/1000 [05:20<02:09, 2.27it/s]
{'loss': 0.4901, 'grad_norm': 1.9484336376190186, 'learning_rate': 2.1419080780610122e-07, 'train/length': 2825.8, 'epoch': 0.71}
71%|βββββββ | 706/1000 [05:20<02:19, 2.11it/s]
71%|βββββββ | 707/1000 [05:21<02:25, 2.01it/s]
71%|βββββββ | 708/1000 [05:21<02:22, 2.06it/s]
71%|βββββββ | 709/1000 [05:22<02:17, 2.12it/s]
71%|βββββββ | 710/1000 [05:22<02:12, 2.19it/s]
71%|βββββββ | 711/1000 [05:23<02:09, 2.23it/s]
71%|βββββββ | 712/1000 [05:23<02:10, 2.21it/s]
71%|ββββββββ | 713/1000 [05:24<02:34, 1.86it/s]
71%|ββββββββ | 714/1000 [05:24<02:27, 1.94it/s]
72%|ββββββββ | 715/1000 [05:25<02:20, 2.03it/s]
72%|ββββββββ | 716/1000 [05:25<02:10, 2.18it/s]
72%|ββββββββ | 717/1000 [05:26<02:08, 2.20it/s]
72%|ββββββββ | 718/1000 [05:26<02:07, 2.21it/s]
72%|ββββββββ | 719/1000 [05:26<02:05, 2.23it/s]
72%|ββββββββ | 720/1000 [05:27<02:21, 1.98it/s]
72%|ββββββββ | 721/1000 [05:28<02:30, 1.85it/s]
72%|ββββββββ | 722/1000 [05:28<02:21, 1.96it/s]
72%|ββββββββ | 723/1000 [05:29<02:17, 2.02it/s]
72%|ββββββββ | 724/1000 [05:29<02:11, 2.09it/s]
72%|ββββββββ | 725/1000 [05:29<02:06, 2.17it/s]
73%|ββββββββ | 726/1000 [05:30<02:07, 2.15it/s]
73%|ββββββββ | 727/1000 [05:30<02:04, 2.19it/s]
73%|ββββββββ | 728/1000 [05:31<02:07, 2.14it/s]
73%|ββββββββ | 729/1000 [05:31<02:04, 2.18it/s]
73%|ββββββββ | 730/1000 [05:32<01:59, 2.26it/s]
73%|ββββββββ | 731/1000 [05:32<01:57, 2.28it/s]
{'loss': 0.4646, 'grad_norm': 2.1329450607299805, 'learning_rate': 1.7496175976529337e-07, 'train/length': 2679.0, 'epoch': 0.74}
73%|ββββββββ | 732/1000 [05:33<01:58, 2.27it/s]
73%|ββββββββ | 733/1000 [05:33<01:58, 2.25it/s]
73%|ββββββββ | 734/1000 [05:33<01:57, 2.27it/s]
74%|ββββββββ | 735/1000 [05:34<01:59, 2.22it/s]
74%|ββββββββ | 736/1000 [05:34<01:55, 2.29it/s]
74%|ββββββββ | 737/1000 [05:35<01:56, 2.26it/s]
74%|ββββββββ | 738/1000 [05:35<01:52, 2.32it/s]
74%|ββββββββ | 739/1000 [05:36<01:53, 2.31it/s]
74%|ββββββββ | 740/1000 [05:36<01:53, 2.30it/s]
74%|ββββββββ | 741/1000 [05:36<01:52, 2.31it/s]
74%|ββββββββ | 742/1000 [05:37<02:01, 2.12it/s]
74%|ββββββββ | 743/1000 [05:37<01:57, 2.19it/s]
74%|ββββββββ | 744/1000 [05:38<01:50, 2.32it/s]
74%|ββββββββ | 745/1000 [05:38<01:47, 2.36it/s]
75%|ββββββββ | 746/1000 [05:39<01:59, 2.12it/s]
75%|ββββββββ | 747/1000 [05:39<01:59, 2.12it/s]
75%|ββββββββ | 748/1000 [05:40<01:57, 2.15it/s]
75%|ββββββββ | 749/1000 [05:40<01:55, 2.18it/s]
75%|ββββββββ | 750/1000 [05:41<01:47, 2.32it/s]
75%|ββββββββ | 751/1000 [05:41<01:45, 2.36it/s]
75%|ββββββββ | 752/1000 [05:41<01:49, 2.28it/s]
75%|ββββββββ | 753/1000 [05:42<01:47, 2.31it/s]
75%|ββββββββ | 754/1000 [05:42<01:47, 2.29it/s]
76%|ββββββββ | 755/1000 [05:43<01:45, 2.32it/s]
76%|ββββββββ | 756/1000 [05:43<01:57, 2.08it/s]
76%|ββββββββ | 757/1000 [05:44<02:03, 1.96it/s]
76%|ββββββββ | 758/1000 [05:44<02:02, 1.98it/s]
{'loss': 0.478, 'grad_norm': 1.991397738456726, 'learning_rate': 1.5056232853991208e-07, 'train/length': 2819.2, 'epoch': 0.76}
{'loss': 0.467, 'grad_norm': 2.1963040828704834, 'learning_rate': 1.389292011321498e-07, 'train/length': 2242.3, 'epoch': 0.77}
76%|ββββββββ | 759/1000 [05:45<01:55, 2.09it/s]
76%|ββββββββ | 760/1000 [05:45<02:00, 1.99it/s]
76%|ββββββββ | 761/1000 [05:46<02:02, 1.95it/s]
76%|ββββββββ | 762/1000 [05:46<01:56, 2.04it/s]
76%|ββββββββ | 763/1000 [05:47<01:55, 2.06it/s]
76%|ββββββββ | 764/1000 [05:47<01:48, 2.17it/s]
76%|ββββββββ | 765/1000 [05:48<01:42, 2.29it/s]
77%|ββββββββ | 766/1000 [05:48<01:46, 2.21it/s]
77%|ββββββββ | 767/1000 [05:48<01:44, 2.24it/s]
77%|ββββββββ | 768/1000 [05:49<01:44, 2.22it/s]
77%|ββββββββ | 769/1000 [05:49<01:46, 2.16it/s]
77%|ββββββββ | 770/1000 [05:50<01:43, 2.23it/s]
{'loss': 0.4787, 'grad_norm': 1.970973253250122, 'learning_rate': 1.2769089934176126e-07, 'train/length': 2731.4, 'epoch': 0.78}
77%|ββββββββ | 771/1000 [05:50<01:42, 2.24it/s]
77%|ββββββββ | 772/1000 [05:51<01:39, 2.28it/s]
77%|ββββββββ | 773/1000 [05:51<01:40, 2.25it/s]
77%|ββββββββ | 774/1000 [05:52<01:46, 2.12it/s]
78%|ββββββββ | 775/1000 [05:52<01:44, 2.16it/s]
78%|ββββββββ | 776/1000 [05:53<01:39, 2.24it/s]
78%|ββββββββ | 777/1000 [05:53<01:36, 2.30it/s]
78%|ββββββββ | 778/1000 [05:53<01:37, 2.28it/s]
78%|ββββββββ | 779/1000 [05:54<01:37, 2.26it/s]
78%|ββββββββ | 780/1000 [05:54<01:39, 2.22it/s]
78%|ββββββββ | 781/1000 [05:55<01:36, 2.28it/s]
78%|ββββββββ | 782/1000 [05:55<01:35, 2.27it/s]
78%|ββββββββ | 783/1000 [05:56<01:34, 2.30it/s]
{'loss': 0.4673, 'grad_norm': 2.0588560104370117, 'learning_rate': 1.1685971208675538e-07, 'train/length': 2960.5, 'epoch': 0.79}
78%|ββββββββ | 784/1000 [05:56<01:43, 2.09it/s]
78%|ββββββββ | 785/1000 [05:57<01:42, 2.10it/s]
79%|ββββββββ | 786/1000 [05:57<01:37, 2.19it/s]
79%|ββββββββ | 787/1000 [05:58<01:36, 2.21it/s]
79%|ββββββββ | 788/1000 [05:58<01:33, 2.26it/s]
79%|ββββββββ | 789/1000 [05:58<01:28, 2.37it/s]
79%|ββββββββ | 790/1000 [05:59<01:30, 2.33it/s]
79%|ββββββββ | 791/1000 [05:59<01:33, 2.23it/s]
79%|ββββββββ | 792/1000 [06:00<01:33, 2.23it/s]
79%|ββββββββ | 793/1000 [06:00<01:33, 2.21it/s]
79%|ββββββββ | 794/1000 [06:01<01:39, 2.06it/s]
80%|ββββββββ | 795/1000 [06:01<01:36, 2.12it/s]
80%|ββββββββ | 796/1000 [06:02<01:32, 2.22it/s]
80%|ββββββββ | 797/1000 [06:02<01:27, 2.32it/s]
{'loss': 0.4786, 'grad_norm': 2.1473374366760254, 'learning_rate': 1.0644748311137375e-07, 'train/length': 2973.1, 'epoch': 0.8}
80%|ββββββββ | 798/1000 [06:02<01:23, 2.41it/s]
80%|ββββββββ | 799/1000 [06:03<01:26, 2.34it/s]
80%|ββββββββ | 800/1000 [06:03<01:25, 2.33it/s]
80%|ββββββββ | 801/1000 [06:04<01:25, 2.33it/s]
80%|ββββββββ | 802/1000 [06:04<01:26, 2.29it/s]
80%|ββββββββ | 803/1000 [06:05<01:29, 2.21it/s]
80%|ββββββββ | 804/1000 [06:05<01:27, 2.25it/s]
80%|ββββββββ | 805/1000 [06:05<01:27, 2.22it/s]
81%|ββββββββ | 806/1000 [06:06<01:29, 2.16it/s]
81%|ββββββββ | 807/1000 [06:06<01:26, 2.22it/s]
81%|ββββββββ | 808/1000 [06:07<01:25, 2.25it/s]
81%|ββββββββ | 809/1000 [06:07<01:26, 2.21it/s]
81%|ββββββββ | 810/1000 [06:08<01:31, 2.07it/s]
{'loss': 0.4769, 'grad_norm': 1.9274868965148926, 'learning_rate': 9.646559803512993e-08, 'train/length': 3054.2, 'epoch': 0.81}
{'loss': 0.4819, 'grad_norm': 2.0100791454315186, 'learning_rate': 8.692497190280224e-08, 'train/length': 2740.3, 'epoch': 0.82}
81%|ββββββββ | 811/1000 [06:08<01:34, 2.00it/s]
81%|ββββββββ | 812/1000 [06:09<01:27, 2.16it/s]
81%|βββββββββ | 813/1000 [06:09<01:23, 2.24it/s]
81%|βββββββββ | 814/1000 [06:10<01:23, 2.22it/s]
82%|βββββββββ | 815/1000 [06:10<01:23, 2.23it/s]
82%|βββββββββ | 816/1000 [06:10<01:20, 2.27it/s]
82%|βββββββββ | 817/1000 [06:11<01:20, 2.27it/s]
82%|βββββββββ | 818/1000 [06:11<01:17, 2.35it/s]
82%|βββββββββ | 819/1000 [06:12<01:20, 2.24it/s]
82%|βββββββββ | 820/1000 [06:12<01:20, 2.23it/s]
82%|βββββββββ | 821/1000 [06:13<01:34, 1.90it/s]
{'loss': 0.4838, 'grad_norm': 2.0422775745391846, 'learning_rate': 7.783603724899257e-08, 'train/length': 2878.8, 'epoch': 0.83}
82%|βββββββββ | 822/1000 [06:14<01:43, 1.72it/s]
82%|βββββββββ | 823/1000 [06:14<01:34, 1.86it/s]
82%|βββββββββ | 824/1000 [06:15<01:30, 1.94it/s]
82%|βββββββββ | 825/1000 [06:15<01:28, 1.99it/s]
83%|βββββββββ | 826/1000 [06:15<01:23, 2.09it/s]
83%|βββββββββ | 827/1000 [06:16<01:17, 2.24it/s]
83%|βββββββββ | 828/1000 [06:16<01:17, 2.23it/s]
83%|βββββββββ | 829/1000 [06:17<01:14, 2.30it/s]
83%|βββββββββ | 830/1000 [06:17<01:14, 2.27it/s]
83%|βββββββββ | 831/1000 [06:18<01:13, 2.30it/s]
83%|βββββββββ | 832/1000 [06:18<01:14, 2.25it/s]
83%|βββββββββ | 833/1000 [06:18<01:13, 2.26it/s]
83%|βββββββββ | 834/1000 [06:19<01:14, 2.24it/s]
84%|βββββββββ | 835/1000 [06:19<01:14, 2.22it/s]
{'loss': 0.4734, 'grad_norm': 2.044065237045288, 'learning_rate': 6.92087326903022e-08, 'train/length': 3054.8, 'epoch': 0.84}
84%|βββββββββ | 836/1000 [06:20<01:19, 2.07it/s]
84%|βββββββββ | 837/1000 [06:20<01:17, 2.11it/s]
84%|βββββββββ | 838/1000 [06:21<01:15, 2.14it/s]
84%|βββββββββ | 839/1000 [06:21<01:13, 2.19it/s]
84%|βββββββββ | 840/1000 [06:22<01:10, 2.26it/s]
84%|βββββββββ | 841/1000 [06:22<01:08, 2.32it/s]
84%|βββββββββ | 842/1000 [06:23<01:08, 2.32it/s]
84%|βββββββββ | 843/1000 [06:23<01:09, 2.25it/s]
84%|βββββββββ | 844/1000 [06:23<01:05, 2.36it/s]
84%|βββββββββ | 845/1000 [06:24<01:07, 2.31it/s]
85%|βββββββββ | 846/1000 [06:24<01:07, 2.27it/s]
85%|βββββββββ | 847/1000 [06:25<01:09, 2.19it/s]
85%|βββββββββ | 848/1000 [06:25<01:09, 2.18it/s]
{'loss': 0.4744, 'grad_norm': 2.017573118209839, 'learning_rate': 6.105249205760127e-08, 'train/length': 2701.7, 'epoch': 0.85}
{'loss': 0.4867, 'grad_norm': 1.8446524143218994, 'learning_rate': 5.337623408027292e-08, 'train/length': 3675.4, 'epoch': 0.86}
85%|βββββββββ | 849/1000 [06:26<01:07, 2.24it/s]
85%|βββββββββ | 850/1000 [06:26<01:05, 2.29it/s]
85%|βββββββββ | 851/1000 [06:27<01:06, 2.26it/s]
85%|βββββββββ | 852/1000 [06:27<01:10, 2.11it/s]
85%|βββββββββ | 853/1000 [06:28<01:07, 2.16it/s]
85%|βββββββββ | 854/1000 [06:29<01:39, 1.46it/s]
86%|βββββββββ | 855/1000 [06:29<01:29, 1.61it/s]
86%|βββββββββ | 856/1000 [06:30<01:22, 1.75it/s]
86%|βββββββββ | 857/1000 [06:30<01:14, 1.91it/s]
86%|βββββββββ | 858/1000 [06:31<01:11, 1.98it/s]
86%|βββββββββ | 859/1000 [06:31<01:09, 2.02it/s]
86%|βββββββββ | 860/1000 [06:32<01:11, 1.96it/s]
{'loss': 0.4643, 'grad_norm': 1.9179342985153198, 'learning_rate': 4.6188352633713956e-08, 'train/length': 2700.5, 'epoch': 0.87}
86%|βββββββββ | 861/1000 [06:32<01:08, 2.03it/s]
86%|βββββββββ | 862/1000 [06:32<01:05, 2.10it/s]
86%|βββββββββ | 863/1000 [06:33<01:05, 2.10it/s]
86%|βββββββββ | 864/1000 [06:33<01:00, 2.25it/s]
86%|βββββββββ | 865/1000 [06:34<01:01, 2.20it/s]
87%|βββββββββ | 866/1000 [06:34<01:00, 2.22it/s]
87%|βββββββββ | 867/1000 [06:35<00:56, 2.35it/s]
87%|βββββββββ | 868/1000 [06:35<01:00, 2.17it/s]
87%|βββββββββ | 869/1000 [06:36<00:59, 2.20it/s]
87%|βββββββββ | 870/1000 [06:36<00:58, 2.20it/s]
87%|βββββββββ | 871/1000 [06:36<00:58, 2.19it/s]
87%|βββββββββ | 872/1000 [06:37<00:56, 2.25it/s]
87%|βββββββββ | 873/1000 [06:37<00:57, 2.21it/s]
87%|βββββββββ | 874/1000 [06:38<00:57, 2.17it/s]
88%|βββββββββ | 875/1000 [06:38<00:56, 2.23it/s]
88%|βββββββββ | 876/1000 [06:39<00:54, 2.27it/s]
88%|βββββββββ | 877/1000 [06:39<00:52, 2.35it/s]
88%|βββββββββ | 878/1000 [06:40<00:51, 2.36it/s]
88%|βββββββββ | 879/1000 [06:40<00:53, 2.27it/s]
88%|βββββββββ | 880/1000 [06:40<00:52, 2.27it/s]
88%|βββββββββ | 881/1000 [06:41<00:51, 2.29it/s]
88%|βββββββββ | 882/1000 [06:41<00:52, 2.24it/s]
88%|βββββββββ | 883/1000 [06:42<00:53, 2.19it/s]
88%|βββββββββ | 884/1000 [06:42<00:51, 2.23it/s]
88%|βββββββββ | 885/1000 [06:43<00:52, 2.20it/s]
89%|βββββββββ | 886/1000 [06:43<00:51, 2.23it/s]
89%|βββββββββ | 887/1000 [06:44<00:49, 2.27it/s]
89%|βββββββββ | 888/1000 [06:44<00:47, 2.37it/s]
89%|βββββββββ | 889/1000 [06:44<00:47, 2.34it/s]
89%|βββββββββ | 890/1000 [06:45<00:46, 2.34it/s]
89%|βββββββββ | 891/1000 [06:45<00:48, 2.26it/s]
89%|βββββββββ | 892/1000 [06:46<00:47, 2.26it/s]
89%|βββββββββ | 893/1000 [06:46<00:46, 2.30it/s]
89%|βββββββββ | 894/1000 [06:47<00:47, 2.21it/s]
90%|βββββββββ | 895/1000 [06:47<00:46, 2.27it/s]
90%|βββββββββ | 896/1000 [06:47<00:45, 2.29it/s]
90%|βββββββββ | 897/1000 [06:48<00:44, 2.30it/s]
90%|βββββββββ | 898/1000 [06:48<00:45, 2.24it/s]
90%|βββββββββ | 899/1000 [06:49<00:44, 2.29it/s]
90%|βββββββββ | 900/1000 [06:49<00:44, 2.23it/s]
90%|βββββββββ | 901/1000 [06:50<00:45, 2.18it/s]
90%|βββββββββ | 902/1000 [06:50<00:44, 2.20it/s]
90%|βββββββββ | 903/1000 [06:51<00:44, 2.18it/s]
90%|βββββββββ | 904/1000 [06:51<00:45, 2.13it/s]
90%|βββββββββ | 905/1000 [06:52<00:43, 2.16it/s]
91%|βββββββββ | 906/1000 [06:52<00:43, 2.15it/s]
91%|βββββββββ | 907/1000 [06:52<00:42, 2.21it/s]
91%|βββββββββ | 908/1000 [06:53<00:40, 2.29it/s]
91%|βββββββββ | 909/1000 [06:53<00:42, 2.12it/s]
91%|βββββββββ | 910/1000 [06:54<00:41, 2.17it/s]
{'loss': 0.4855, 'grad_norm': 1.970093011856079, 'learning_rate': 1.7830532106104746e-08, 'train/length': 3004.0, 'epoch': 0.92}
91%|βββββββββ | 911/1000 [06:54<00:40, 2.22it/s]
91%|βββββββββ | 912/1000 [06:55<00:39, 2.25it/s]
91%|ββββββββββ| 913/1000 [06:55<00:38, 2.29it/s]
91%|ββββββββββ| 914/1000 [06:56<00:36, 2.33it/s]
92%|ββββββββββ| 915/1000 [06:56<00:36, 2.30it/s]
92%|ββββββββββ| 916/1000 [06:57<00:38, 2.21it/s]
92%|ββββββββββ| 917/1000 [06:57<00:44, 1.86it/s]
92%|ββββββββββ| 918/1000 [06:58<00:42, 1.93it/s]
92%|ββββββββββ| 919/1000 [06:58<00:40, 1.99it/s]
92%|ββββββββββ| 920/1000 [06:59<00:41, 1.95it/s]
92%|ββββββββββ| 921/1000 [06:59<00:39, 1.98it/s]
92%|ββββββββββ| 922/1000 [07:00<00:42, 1.84it/s]
{'loss': 0.4847, 'grad_norm': 1.7198147773742676, 'learning_rate': 1.3718707247769134e-08, 'train/length': 3362.2, 'epoch': 0.93}
92%|ββββββββββ| 923/1000 [07:00<00:38, 1.98it/s]
92%|ββββββββββ| 924/1000 [07:01<00:37, 2.04it/s]
92%|ββββββββββ| 925/1000 [07:01<00:35, 2.12it/s]
93%|ββββββββββ| 926/1000 [07:02<00:36, 2.00it/s]
93%|ββββββββββ| 927/1000 [07:02<00:35, 2.06it/s]
93%|ββββββββββ| 928/1000 [07:03<00:33, 2.14it/s]
93%|ββββββββββ| 929/1000 [07:03<00:32, 2.20it/s]
93%|ββββββββββ| 930/1000 [07:04<00:32, 2.15it/s]
93%|ββββββββββ| 931/1000 [07:04<00:32, 2.14it/s]
93%|ββββββββββ| 932/1000 [07:04<00:30, 2.23it/s]
93%|ββββββββββ| 933/1000 [07:05<00:32, 2.04it/s]
93%|ββββββββββ| 934/1000 [07:05<00:32, 2.03it/s]
94%|ββββββββββ| 935/1000 [07:06<00:30, 2.12it/s]
{'loss': 0.4755, 'grad_norm': 2.3541147708892822, 'learning_rate': 1.0138623843548078e-08, 'train/length': 3000.4, 'epoch': 0.94}
94%|ββββββββββ| 936/1000 [07:06<00:30, 2.11it/s]
94%|ββββββββββ| 937/1000 [07:07<00:28, 2.18it/s]
94%|ββββββββββ| 938/1000 [07:07<00:28, 2.20it/s]
94%|ββββββββββ| 939/1000 [07:08<00:27, 2.24it/s]
94%|ββββββββββ| 940/1000 [07:08<00:26, 2.25it/s]
94%|ββββββββββ| 941/1000 [07:09<00:25, 2.31it/s]
94%|ββββββββββ| 942/1000 [07:09<00:25, 2.28it/s]
94%|ββββββββββ| 943/1000 [07:09<00:25, 2.25it/s]
94%|ββββββββββ| 944/1000 [07:10<00:24, 2.24it/s]
94%|ββββββββββ| 945/1000 [07:10<00:24, 2.25it/s]
95%|ββββββββββ| 946/1000 [07:11<00:23, 2.31it/s]
95%|ββββββββββ| 947/1000 [07:11<00:23, 2.27it/s]
95%|ββββββββββ| 948/1000 [07:12<00:25, 2.03it/s]
{'loss': 0.4774, 'grad_norm': 1.9618563652038574, 'learning_rate': 7.09419666208183e-09, 'train/length': 2867.9, 'epoch': 0.95}
95%|ββββββββββ| 949/1000 [07:12<00:23, 2.14it/s]
95%|ββββββββββ| 950/1000 [07:13<00:23, 2.15it/s]
95%|ββββββββββ| 951/1000 [07:13<00:22, 2.14it/s]
95%|ββββββββββ| 952/1000 [07:14<00:21, 2.20it/s]
95%|ββββββββββ| 953/1000 [07:14<00:21, 2.15it/s]
95%|ββββββββββ| 954/1000 [07:15<00:21, 2.15it/s]
96%|ββββββββββ| 955/1000 [07:15<00:21, 2.14it/s]
96%|ββββββββββ| 956/1000 [07:15<00:19, 2.21it/s]
96%|ββββββββββ| 957/1000 [07:16<00:19, 2.23it/s]
96%|ββββββββββ| 958/1000 [07:16<00:17, 2.35it/s]
96%|ββββββββββ| 959/1000 [07:17<00:18, 2.26it/s]
96%|ββββββββββ| 960/1000 [07:17<00:17, 2.24it/s]
{'loss': 0.4695, 'grad_norm': 1.9971542358398438, 'learning_rate': 4.588754739795586e-09, 'train/length': 3086.9, 'epoch': 0.96}
{'loss': 0.4823, 'grad_norm': 1.9023200273513794, 'learning_rate': 2.6250377406467627e-09, 'train/length': 2927.7, 'epoch': 0.97}
96%|ββββββββββ| 961/1000 [07:18<00:17, 2.24it/s]
96%|ββββββββββ| 962/1000 [07:18<00:17, 2.19it/s]
96%|ββββββββββ| 963/1000 [07:19<00:18, 2.05it/s]
96%|ββββββββββ| 964/1000 [07:19<00:16, 2.12it/s]
96%|ββββββββββ| 965/1000 [07:19<00:15, 2.22it/s]
97%|ββββββββββ| 966/1000 [07:20<00:17, 2.00it/s]
97%|ββββββββββ| 967/1000 [07:21<00:16, 2.05it/s]
97%|ββββββββββ| 968/1000 [07:21<00:14, 2.16it/s]
97%|ββββββββββ| 969/1000 [07:21<00:14, 2.15it/s]
97%|ββββββββββ| 970/1000 [07:22<00:14, 2.04it/s]
97%|ββββββββββ| 971/1000 [07:22<00:13, 2.13it/s]
97%|ββββββββββ| 972/1000 [07:23<00:12, 2.16it/s]
97%|ββββββββββ| 973/1000 [07:23<00:12, 2.23it/s]
97%|ββββββββββ| 974/1000 [07:24<00:11, 2.29it/s]
98%|ββββββββββ| 975/1000 [07:24<00:11, 2.21it/s]
98%|ββββββββββ| 976/1000 [07:25<00:10, 2.21it/s]
98%|ββββββββββ| 977/1000 [07:25<00:10, 2.26it/s]
98%|ββββββββββ| 978/1000 [07:25<00:09, 2.37it/s]
98%|ββββββββββ| 979/1000 [07:26<00:09, 2.32it/s]
98%|ββββββββββ| 980/1000 [07:26<00:08, 2.32it/s]
98%|ββββββββββ| 981/1000 [07:27<00:08, 2.27it/s]
98%|ββββββββββ| 982/1000 [07:27<00:07, 2.33it/s]
98%|ββββββββββ| 983/1000 [07:28<00:07, 2.28it/s]
98%|ββββββββββ| 984/1000 [07:28<00:07, 2.25it/s]
98%|ββββββββββ| 985/1000 [07:29<00:06, 2.18it/s]
99%|ββββββββββ| 986/1000 [07:29<00:06, 2.18it/s]
99%|ββββββββββ| 987/1000 [07:30<00:06, 2.15it/s]
99%|ββββββββββ| 988/1000 [07:30<00:05, 2.20it/s]
99%|ββββββββββ| 989/1000 [07:30<00:04, 2.22it/s]
99%|ββββββββββ| 990/1000 [07:31<00:04, 2.27it/s]
99%|ββββββββββ| 991/1000 [07:31<00:04, 2.24it/s]
99%|ββββββββββ| 992/1000 [07:32<00:03, 2.36it/s]
99%|ββββββββββ| 993/1000 [07:32<00:03, 2.33it/s]
99%|ββββββββββ| 994/1000 [07:33<00:02, 2.32it/s]
100%|ββββββββββ| 995/1000 [07:33<00:02, 2.31it/s]
100%|ββββββββββ| 996/1000 [07:33<00:01, 2.25it/s]
100%|ββββββββββ| 997/1000 [07:34<00:01, 2.26it/s]
{'loss': 0.4857, 'grad_norm': 1.9484124183654785, 'learning_rate': 2.7339599464326622e-12, 'train/length': 2764.4, 'epoch': 1.0}
[INFO|trainer.py:2676] 2025-09-23 23:31:39,845 >>
Training completed. Do not forget to share your model on huggingface.co/models =)
100%|ββββββββββ| 998/1000 [07:34<00:00, 2.12it/s]
100%|ββββββββββ| 999/1000 [07:35<00:00, 2.26it/s]
100%|ββββββββββ| 1000/1000 [07:35<00:00, 2.30it/s]
[INFO|trainer.py:3993] 2025-09-23 23:31:40,264 >> Saving model checkpoint to /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000
[INFO|configuration_utils.py:424] 2025-09-23 23:31:40,271 >> Configuration saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/config.json
[INFO|configuration_utils.py:904] 2025-09-23 23:31:40,281 >> Configuration saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/generation_config.json
[INFO|modeling_utils.py:3724] 2025-09-23 23:31:44,274 >> Model weights saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/model.safetensors
[INFO|tokenization_utils_base.py:2356] 2025-09-23 23:31:44,283 >> chat template saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/chat_template.jinja
[INFO|tokenization_utils_base.py:2525] 2025-09-23 23:31:44,290 >> tokenizer config file saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/tokenizer_config.json
[INFO|tokenization_utils_base.py:2534] 2025-09-23 23:31:44,294 >> Special tokens file saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/checkpoint-1000/special_tokens_map.json
[INFO|trainer.py:2676] 2025-09-23 23:31:44,484 >>
Training completed. Do not forget to share your model on huggingface.co/models =)
{'train_runtime': 464.5812, 'train_samples_per_second': 8.606, 'train_steps_per_second': 2.152, 'train_loss': 0.5689675788879395, 'epoch': 1.0}
100%|ββββββββββ| 1000/1000 [07:40<00:00, 2.30it/s]
100%|ββββββββββ| 1000/1000 [07:40<00:00, 2.17it/s]
c619-112:920946:921303 [0] NCCL INFO comm 0x2872c050 rank 3 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-111:3464458:3464823 [0] NCCL INFO comm 0x327fd1d0 rank 2 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
[INFO|trainer.py:3993] 2025-09-23 23:31:44,895 >> Saving model checkpoint to /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints
c619-102:3670125:3670482 [0] NCCL INFO comm 0x4ae2db90 rank 1 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
[INFO|configuration_utils.py:424] 2025-09-23 23:31:44,902 >> Configuration saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/config.json
[INFO|configuration_utils.py:904] 2025-09-23 23:31:44,906 >> Configuration saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/generation_config.json
c619-112:920946:921305 [0] NCCL INFO comm 0x24f81300 rank 3 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-111:3464458:3464825 [0] NCCL INFO comm 0x2ee276b0 rank 2 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-102:3670125:3670484 [0] NCCL INFO comm 0x431bf9b0 rank 1 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
[INFO|modeling_utils.py:3724] 2025-09-23 23:31:48,263 >> Model weights saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/model.safetensors
[INFO|tokenization_utils_base.py:2356] 2025-09-23 23:31:48,272 >> chat template saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/chat_template.jinja
[INFO|tokenization_utils_base.py:2525] 2025-09-23 23:31:48,276 >> tokenizer config file saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/tokenizer_config.json
[INFO|tokenization_utils_base.py:2534] 2025-09-23 23:31:48,279 >> Special tokens file saved in /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/special_tokens_map.json
***** train metrics *****
epoch = 1.0
total_flos = 86028158GF
train_loss = 0.569
train_runtime = 0:07:44.58
train_samples_per_second = 8.606
train_steps_per_second = 2.152
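The dict-style records interleaved with the progress bar above (`{'loss': ..., 'epoch': ...}`) are plain Python literals, so they can be recovered from a saved log for plotting. A minimal sketch, assuming each record sits on its own line; it also cross-checks the summary above (1000 steps over a 464.58 s train_runtime is ~2.152 steps per second):

```python
import ast

def parse_trainer_records(lines):
    """Collect the {'loss': ...} style records emitted during training."""
    records = []
    for line in lines:
        line = line.strip()
        if line.startswith("{") and line.endswith("}"):
            records.append(ast.literal_eval(line))
    return records

log_lines = [
    "{'loss': 0.4916, 'grad_norm': 1.96, 'learning_rate': 5.1e-07, 'epoch': 0.52}",
    " 52%|......| 516/1000 [03:54<03:57, 2.04it/s]",  # progress lines are skipped
]
records = parse_trainer_records(log_lines)
print(records[0]["loss"])          # 0.4916

# Sanity check on the train metrics block above.
print(round(1000 / 464.5812, 3))   # 2.152
```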
Figure saved at: /scratch/10286/georgetsoukalas/skill_inject_outputs/llamafactory/checkpoints/training_loss.png
[WARNING|2025-09-23 23:31:48] llamafactory.extras.ploting:148 >> No metric eval_loss to plot.
[WARNING|2025-09-23 23:31:48] llamafactory.extras.ploting:148 >> No metric eval_accuracy to plot.
[INFO|modelcard.py:450] 2025-09-23 23:31:48,862 >> Dropping the following result as it does not have all the necessary fields:
{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}}
c619-101:850874:851318 [0] NCCL INFO comm 0x41cf8330 rank 0 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
c619-101:850874:851320 [0] NCCL INFO comm 0x39f30fe0 rank 0 nranks 4 cudaDev 0 busId 901000 - Destroy COMPLETE
wandb:
wandb: View run BASELINE_r1_distillation at:
wandb: Find logs at: wandb/run-20250923_232400-iql4ovmz/logs
W0923 23:31:52.424000 3464455 /work/10286/georgetsoukalas/vista/miniconda3/envs/vllm/lib/python3.12/site-packages/torch/distributed/elastic/rendezvous/dynamic_rendezvous.py:1341] The node 'c619-111.vista.tacc.utexas.edu_3464455_0' has failed to send a keep-alive heartbeat to the rendezvous '375339' due to an error of type RendezvousTimeoutError.
================================================================================
[INFO] Training completed successfully
[INFO] Merging trained model...
[DEBUG] Loaded 77 existing entries from metadata
[DEBUG] Successfully appended 1 entries to metadata (total: 78)
[INFO] Starting model merge with real-time output...
============================================================
[WARNING|2025-09-23 23:32:10] llamafactory.extras.misc:154 >> Version checking has been disabled, may lead to unexpected behaviors.
[2025-09-23 23:32:12,543] [INFO] [real_accelerator.py:254:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Warning: The cache directory for DeepSpeed Triton autotune, /scratch/10286/georgetsoukalas/.cache/triton, appears to be on an NFS system. While this is generally acceptable, if you experience slowdowns or hanging when DeepSpeed exits, it is recommended to set the TRITON_CACHE_DIR environment variable to a non-NFS path.
Registered source: longmult
Registered source: countdown
Registered source: gsm8k
Registered source: arc
Registered source: arc_challenge
Registered source: arc_easy
Registered source: piqa
Registered source: mmlu
Registered source: mmlu_pro
Registered source: csqa
Registered source: social_iqa
Registered source: strategy_qa
Registered source: winogrande
Registered source: bbh
Registered source: letter_countdown
Registered source: acronym
INFO 09-23 23:32:19 [__init__.py:244] Automatically detected platform cuda.
Traceback (most recent call last):
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/bin/llamafactory-cli", line 7, in <module>
sys.exit(main())
^^^^^^
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/cli.py", line 40, in main
from .api.app import run_api
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/api/app.py", line 21, in <module>
from ..chat import ChatModel
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/chat/__init__.py", line 16, in <module>
from .chat_model import ChatModel
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/chat/chat_model.py", line 29, in <module>
from .vllm_engine import VllmEngine
File "/scratch/10286/georgetsoukalas/skillfactory/skill-factory/thirdparty/LLaMA-Factory/src/llamafactory/chat/vllm_engine.py", line 33, in <module>
from vllm import AsyncEngineArgs, AsyncLLMEngine, RequestOutput, SamplingParams
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/__init__.py", line 64, in __getattr__
module = import_module(module_name, __package__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/importlib/__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/engine/arg_utils.py", line 24, in <module>
from vllm.config import (BlockSize, CacheConfig, CacheDType, CompilationConfig,
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/config.py", line 36, in <module>
from vllm.platforms import current_platform
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/platforms/__init__.py", line 276, in __getattr__
_current_platform = resolve_obj_by_qualname(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/utils/__init__.py", line 2304, in resolve_obj_by_qualname
module = importlib.import_module(module_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home1/10286/georgetsoukalas/miniconda3/envs/vllm/lib/python3.12/importlib/__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/platforms/cuda.py", line 18, in <module>
import vllm._C # noqa
^^^^^^^^^^^^^^
ImportError: /opt/apps/gcc/13.2.0/lib64/libstdc++.so.6: version `CXXABI_1.3.15' not found (required by /work/10286/georgetsoukalas/vista/vllm-v0.9.2/vllm/vllm/_C.abi3.so)
============================================================
[ERROR] Model merge failed with return code 1
[ERROR] LLaMAFactory stage 'sft' failed: Model merge failed
[ERROR] Stage error: RuntimeError: Model merge failed
Creating parquet from Arrow format: 100% 4/4 [00:00<00:00, 15.40ba/s]
Uploading...: 100% 12.9M/12.9M [00:00<00:00, 22.6MB/s]
Uploading the dataset shards: 100% 1/1 [00:01<00:00, 1.22s/ shards]
README.md: 1.26kB [00:00, 10.7MB/s]
metadata/train-00000-of-00001.parquet: 100% 9.72k/9.72k [00:00<00:00, 26.8kB/s]
Generating train split: 100% 76/76 [00:00<00:00, 17623.13 examples/s]
Creating parquet from Arrow format: 100% 1/1 [00:00<00:00, 2395.38ba/s]
Uploading...: 100% 9.86k/9.86k [00:00<00:00, 11.3kB/s]
Uploading the dataset shards: 100% 1/1 [00:01<00:00, 1.10s/ shards]
README.md: 1.26kB [00:00, 10.8MB/s]
metadata/train-00000-of-00001.parquet: 100% 9.86k/9.86k [00:00<00:00, 18.4kB/s]
Generating train split: 100% 77/77 [00:00<00:00, 3625.93 examples/s]
Creating parquet from Arrow format: 100% 1/1 [00:00<00:00, 2493.64ba/s]
Uploading...: 100% 9.87k/9.87k [00:00<00:00, 10.3kB/s]
Uploading the dataset shards: 100% 1/1 [00:01<00:00, 1.18s/ shards]
| experiment_name | elapsed_time_seconds | stage_complete |
| BASELINE_r1_distillation | 548.94781 | true |