ds7b-ep2-data10-ood-math-taskwise-lambda05 / logs /save_merged_model_20251118_132648.log
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - Starting merged model save process
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - Arguments: {'lambdas_path': '/work/gj26/b20042/LLM-AdaMerge/outputs/deepseek-7b/k-fold/task-wise/math/cross_entropy-ep2-10%dataset-lambda05/llm_adamerge_lambdas.json', 'model_config': '/work/gj26/b20042/LLM-AdaMerge/outputs/deepseek-7b/k-fold/task-wise/math/cross_entropy-ep2-10%dataset-lambda05/model_config.yaml', 'output_dir': '/work/gj26/b20042/LLM-AdaMerge/mergekit/outputs/deepseek-7b/k-fold/task-wise/math/cross_entropy-ep2-10%dataset/lambda05', 'model_name': 'merged-model', 'push_to_hub': False, 'hub_repo_id': 'lejelly/ds7b-ep2-data10-ood-math-taskwise-lambda05', 'private': False, 'device': 'cuda', 'debug': False}
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - Loading lambdas from /work/gj26/b20042/LLM-AdaMerge/outputs/deepseek-7b/k-fold/task-wise/math/cross_entropy-ep2-10%dataset-lambda05/llm_adamerge_lambdas.json
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - Auto-detected parameter-wise merge from JSON structure
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - Merge type: parameter_wise
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - [Initial] Memory Usage:
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - Process: 0.42 GB (0.2%)
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - System: 8.91 GB / 212.49 GB (8.8%)
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - Available: 193.75 GB
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-11-18 13:26:48 - experiment_save_merged_model - INFO - Loading models
2025-11-18 13:27:01 - experiment_save_merged_model - INFO - [After loading models] Memory Usage:
2025-11-18 13:27:01 - experiment_save_merged_model - INFO - Process: 0.67 GB (0.3%)
2025-11-18 13:27:01 - experiment_save_merged_model - INFO - System: 50.56 GB / 212.49 GB (31.7%)
2025-11-18 13:27:01 - experiment_save_merged_model - INFO - Available: 145.04 GB
2025-11-18 13:27:01 - experiment_save_merged_model - INFO - GPU 0: Allocated: 38.61 GB, Reserved: 40.64 GB, Total: 94.50 GB
2025-11-18 13:27:01 - experiment_save_merged_model - INFO - Initializing parameter_wise AdaMerge
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Loading learned lambdas
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Deleting original models to free memory (task vectors already computed)
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - [Before deleting models] Memory Usage:
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Process: 39.11 GB (18.4%)
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - System: 104.76 GB / 212.49 GB (57.2%)
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Available: 90.85 GB
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - GPU 0: Allocated: 38.61 GB, Reserved: 40.64 GB, Total: 94.50 GB
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Clearing model_loader references
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Deleting model variables
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Running garbage collection
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - [After deleting models and GC] Memory Usage:
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Process: 39.11 GB (18.4%)
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - System: 63.91 GB / 212.49 GB (38.0%)
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Available: 131.70 GB
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - [After loading lambdas] Memory Usage:
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Process: 39.11 GB (18.4%)
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - System: 63.91 GB / 212.49 GB (38.0%)
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Available: 131.70 GB
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Creating merged model with learned lambdas
2025-11-18 13:31:27 - experiment_save_merged_model - INFO - Using merge_models_for_save()
2025-11-18 13:33:26 - experiment_save_merged_model - INFO - [After merging models] Memory Usage:
2025-11-18 13:33:26 - experiment_save_merged_model - INFO - Process: 39.06 GB (18.4%)
2025-11-18 13:33:26 - experiment_save_merged_model - INFO - System: 102.93 GB / 212.49 GB (56.4%)
2025-11-18 13:33:26 - experiment_save_merged_model - INFO - Available: 92.68 GB
2025-11-18 13:33:26 - experiment_save_merged_model - INFO - GPU 0: Allocated: 12.87 GB, Reserved: 53.44 GB, Total: 94.50 GB
2025-11-18 13:33:26 - experiment_save_merged_model - INFO - Freeing memory from AdaMerge object (task vectors and base params no longer needed)
2025-11-18 13:33:26 - experiment_save_merged_model - INFO - Deleting task vectors
2025-11-18 13:33:27 - experiment_save_merged_model - INFO - Deleting base params
2025-11-18 13:33:27 - experiment_save_merged_model - INFO - Deleting functional model
2025-11-18 13:33:27 - experiment_save_merged_model - INFO - [After freeing AdaMerge memory] Memory Usage:
2025-11-18 13:33:27 - experiment_save_merged_model - INFO - Process: 0.41 GB (0.2%)
2025-11-18 13:33:27 - experiment_save_merged_model - INFO - System: 22.90 GB / 212.49 GB (18.7%)
2025-11-18 13:33:27 - experiment_save_merged_model - INFO - Available: 172.71 GB
2025-11-18 13:33:27 - experiment_save_merged_model - INFO - GPU 0: Allocated: 12.87 GB, Reserved: 13.05 GB, Total: 94.50 GB
2025-11-18 13:33:27 - experiment_save_merged_model - INFO - Saving merged model to /work/gj26/b20042/LLM-AdaMerge/mergekit/outputs/deepseek-7b/k-fold/task-wise/math/cross_entropy-ep2-10%dataset/lambda05
2025-11-18 13:33:27 - experiment_save_merged_model - INFO - Moving merged model to CPU for saving
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - Successfully saved 3 safetensors files:
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - - model-00003-of-00003.safetensors (3674.14 MB)
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - - model-00002-of-00003.safetensors (4750.20 MB)
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - - model-00001-of-00003.safetensors (4756.17 MB)
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - [After saving model] Memory Usage:
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - Process: 13.30 GB (6.3%)
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - System: 22.78 GB / 212.49 GB (18.7%)
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - Available: 172.82 GB
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-11-18 13:34:19 - experiment_save_merged_model - INFO - Saving tokenizer
2025-11-18 13:34:20 - experiment_save_merged_model - INFO - Copied lambdas file to /work/gj26/b20042/LLM-AdaMerge/mergekit/outputs/deepseek-7b/k-fold/task-wise/math/cross_entropy-ep2-10%dataset/lambda05/learned_lambdas.json
2025-11-18 13:34:20 - experiment_save_merged_model - INFO - Creating model card
2025-11-18 13:34:20 - experiment_save_merged_model - INFO - Cleaning up models
2025-11-18 13:34:20 - experiment_save_merged_model - INFO - [After cleanup] Memory Usage:
2025-11-18 13:34:20 - experiment_save_merged_model - INFO - Process: 13.31 GB (6.3%)
2025-11-18 13:34:20 - experiment_save_merged_model - INFO - System: 22.78 GB / 212.49 GB (18.7%)
2025-11-18 13:34:20 - experiment_save_merged_model - INFO - Available: 172.83 GB
2025-11-18 13:34:20 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-11-18 13:34:20 - experiment_save_merged_model - INFO - Model saved successfully to /work/gj26/b20042/LLM-AdaMerge/mergekit/outputs/deepseek-7b/k-fold/task-wise/math/cross_entropy-ep2-10%dataset/lambda05
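Note on the merge step logged above ("Creating merged model with learned lambdas"): the actual `merge_models_for_save()` implementation is not shown in this log. As an illustrative sketch only, parameter-wise AdaMerge-style merging typically adds each task vector (finetuned weights minus base weights) to the base weights, scaled by a learned lambda per parameter; the function name `merge_parameter_wise` and the plain-dict representation below are assumptions for illustration, not the project's API.

```python
def merge_parameter_wise(base, task_vectors, lambdas):
    """Illustrative sketch (not the project's actual code).

    base:         {param_name: value} -- base model weights
    task_vectors: [{param_name: value}, ...] -- finetuned minus base, per task
    lambdas:      [{param_name: float}, ...] -- learned scale per parameter,
                  one dict per task vector

    merged[p] = base[p] + sum_i( lambda_i[p] * tau_i[p] )
    """
    merged = {}
    for name, base_val in base.items():
        # Sum the lambda-scaled task-vector contributions for this parameter.
        delta = sum(lam[name] * tv[name]
                    for tv, lam in zip(task_vectors, lambdas))
        merged[name] = base_val + delta
    return merged
```

With a single task vector and lambda 0.5 (matching the `lambda05` naming in the paths above, if that is indeed what the suffix denotes), a weight of 1.0 with task delta 0.5 would merge to 1.0 + 0.5 * 0.5 = 1.25.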