project_name string | class_name string | class_modifiers string | class_implements int64 | class_extends int64 | function_name string | function_body string | cyclomatic_complexity int64 | NLOC int64 | num_parameter int64 | num_token int64 | num_variable int64 | start_line int64 | end_line int64 | function_index int64 | function_params string | function_variable string | function_return_type string | function_body_line_type string | function_num_functions int64 | function_num_lines int64 | outgoing_function_count int64 | outgoing_function_names string | incoming_function_count int64 | incoming_function_names string | lexical_representation string |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
aws-deadline_deadline-cloud | IncrementalDownloadTest | public | 0 | 0 | run_incremental_download_without_storage_profiles | def run_incremental_download_without_storage_profiles(self,output_dir: str,force_bootstrap: bool = False,conflict_resolution: Optional[str] = None,lookback_window: Optional[int] = None,test_name: Optional[str] = None,) -> subprocess.CompletedProcess:"""Run the incremental download CLI command."""# Use test-specific checkpoint directory to avoid conflicts between parallel testscheckpoint_suffix = f"_{test_name}" if test_name else ""checkpoint_dir = os.path.join(output_dir, f"checkpoints{checkpoint_suffix}")os.makedirs(checkpoint_dir, exist_ok=True)cmd = ["deadline","queue","sync-output","--farm-id",self.farm_id,"--queue-id",self.queue_id,"--checkpoint-dir",checkpoint_dir,"--ignore-storage-profiles",]if force_bootstrap:cmd.append("--force-bootstrap")if conflict_resolution:cmd.extend(["--conflict-resolution", conflict_resolution])if lookback_window is not None:cmd.extend(["--bootstrap-lookback-minutes", str(lookback_window)])# Run from the specified output directory so files are downloaded to their manifest paths# The CLI will create the necessary directory structure based on job manifestsresult = subprocess.run(cmd, capture_output=True, text=True, check=False, cwd=output_dir)# Print CLI output for debuggingif result.stdout:print(f"[sync-output] STDOUT:\n{result.stdout}")if result.stderr:print(f"[sync-output] STDERR:\n{result.stderr}")if result.returncode != 0:print(f"[sync-output] Exit code: {result.returncode}")return result | 8 | 37 | 6 | 206 | 0 | 108 | 156 | 108 | self,output_dir,force_bootstrap,conflict_resolution,lookback_window,test_name | [] | subprocess.CompletedProcess | {"Assign": 4, "Expr": 8, "If": 6, "Return": 1} | 10 | 49 | 10 | ["os.path.join", "os.makedirs", "cmd.append", "cmd.extend", "cmd.extend", "str", "subprocess.run", "print", "print", "print"] | 0 | [] | The function (run_incremental_download_without_storage_profiles) 
is defined within the public class IncrementalDownloadTest. The function starts at line 108 and ends at line 156. It contains 37 lines of code and has a cyclomatic complexity of 8. It takes 6 parameters (self, output_dir, force_bootstrap, conflict_resolution, lookback_window, test_name) and returns a subprocess.CompletedProcess. It makes 10 function calls, which are ["os.path.join", "os.makedirs", "cmd.append", "cmd.extend", "cmd.extend", "str", "subprocess.run", "print", "print", "print"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | incremental_download_test | def incremental_download_test(deadline_cli_test: DeadlineCliTest):"""Fixture to get the IncrementalDownloadTest object."""return IncrementalDownloadTest(farm_id=deadline_cli_test.farm_id, queue_id=deadline_cli_test.queue_id) | 1 | 4 | 1 | 23 | 0 | 160 | 164 | 160 | deadline_cli_test | [] | Returns | {"Expr": 1, "Return": 1} | 2 | 5 | 2 | ["IncrementalDownloadTest", "pytest.fixture"] | 0 | [] | The function (incremental_download_test) is defined within the public class called public. The function starts at line 160 and ends at line 164. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (deadline_cli_test) and returns a value. It makes 2 function calls, which are ["IncrementalDownloadTest", "pytest.fixture"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_incremental_download_many_small_files | def test_incremental_download_many_small_files(incremental_download_test, tmp_path):"""Test incremental download with many small files (10,000 files total)."""files_per_task = 100task_count = 100total_files = files_per_task * task_count# 10,000 filesunique_output_dir = f"{tmp_path}/many_small_files_output"print(f"[make_many_small_files] Submitting job with {total_files} files ({task_count} tasks × {files_per_task} files)")job_id = submit_make_many_small_files_job(farm_id=incremental_download_test.farm_id,queue_id=incremental_download_test.queue_id,files_per_task=files_per_task,task_count=task_count,output_dir=unique_output_dir,)print(f"[make_many_small_files] Job submitted with ID: {job_id}")# Run incremental download in a loop until job completesjob_complete = Falseincremental_download_iteration_number = 0start_time = time.time()job_completion_timeout = 600# 10 minuteswhile not job_complete:incremental_download_iteration_number += 1job = incremental_download_test.deadline_client.get_job(farmId=incremental_download_test.farm_id,queueId=incremental_download_test.queue_id,jobId=job_id,)task_run_status = job.get("taskRunStatus")# Check if job failedif task_run_status == "FAILED":print(f"[make_many_small_files] Job {job_id} FAILED - stopping test")assert False, f"Job {job_id} failed with status: {task_run_status}"incremental_download_test.run_incremental_download_without_storage_profiles(str(tmp_path),force_bootstrap=(incremental_download_iteration_number == 1),lookback_window=2,test_name="many_small_files",)job_complete = task_run_status in ["SUCCEEDED", "FAILED", "CANCELED"]# Check timeoutif time.time() - start_time > job_completion_timeout:assert False, f"Job {job_id} did not complete within {job_completion_timeout} seconds"# Wait 5 secs if job's still not completeif not job_complete:time.sleep(5)print(f"[make_many_small_files] Job completed after 
{incremental_download_iteration_number} download iterations")# Run final incremental download to ensure all files are capturedincremental_download_test.wait_for_all_files(tmp_path=tmp_path,expected_files={"count": total_files},test_name="make_many_small_files",file_pattern="**/*.txt",count_only=True,)print(f"[make_many_small_files] Successfully verified all {total_files} files were downloaded") | 5 | 53 | 2 | 235 | 11 | 172 | 245 | 172 | incremental_download_test,tmp_path | ['total_files', 'job_complete', 'files_per_task', 'incremental_download_iteration_number', 'task_count', 'task_run_status', 'job_id', 'job_completion_timeout', 'unique_output_dir', 'job', 'start_time'] | None | {"Assign": 12, "AugAssign": 1, "Expr": 9, "If": 3, "While": 1} | 16 | 74 | 16 | ["print", "submit_make_many_small_files_job", "print", "time.time", "incremental_download_test.deadline_client.get_job", "job.get", "print", "incremental_download_test.run_incremental_download_without_storage_profiles", "str", "time.time", "time.sleep", "print", "incremental_download_test.wait_for_all_files", "print", "pytest.mark.timeout", "pytest.mark.xfail"] | 0 | [] | The function (test_incremental_download_many_small_files) is defined within the public class called public. The function starts at line 172 and ends at line 245. It contains 53 lines of code and has a cyclomatic complexity of 5. It takes 2 parameters (incremental_download_test, tmp_path) and does not return a value. It makes 16 function calls, which are ["print", "submit_make_many_small_files_job", "print", "time.time", "incremental_download_test.deadline_client.get_job", "job.get", "print", "incremental_download_test.run_incremental_download_without_storage_profiles", "str", "time.time", "time.sleep", "print", "incremental_download_test.wait_for_all_files", "print", "pytest.mark.timeout", "pytest.mark.xfail"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_incremental_download_dep_data_flow | def test_incremental_download_dep_data_flow(incremental_download_test, tmp_path):"""Test incremental download with dep_data_flow template."""# Create data and input directories in tmp_pathunique_data_dir = tmp_path / "data_dir"unique_data_dir.mkdir()# Create the required input files for the job templatecreate_job_input_content = "1. Input to CreateJob from data_dir\n"# Create 2 files in the data directory (template expects exactly 2)(unique_data_dir / "create_job_in.txt").write_text(create_job_input_content)(unique_data_dir / "initial_data.txt").write_text("Initial data file for job\n")# Also create input_dir with 2 required files (template expects exactly 2)unique_input_dir = tmp_path / "input_dir"unique_input_dir.mkdir()(unique_input_dir / "create_job_in.txt").write_text(create_job_input_content)(unique_input_dir / "initial_input.txt").write_text("Initial input file for job\n")job_id = submit_dep_data_flow_job(farm_id=incremental_download_test.farm_id,queue_id=incremental_download_test.queue_id,data_dir=str(unique_data_dir),input_dir=str(unique_input_dir),)print(f"[dep_data_flow] Job submitted with ID: {job_id}")# Run incremental download in a loop until job completesjob_complete = Falseincremental_download_iteration_number = 0force_bootstrap_first = True# Force bootstrap on first run onlystart_time = time.time()job_completion_timeout = 600# 10 minuteswhile not job_complete:incremental_download_iteration_number += 1print(f"[dep_data_flow] Running incremental download iteration {incremental_download_iteration_number}...")job = incremental_download_test.deadline_client.get_job(farmId=incremental_download_test.farm_id,queueId=incremental_download_test.queue_id,jobId=job_id,)task_run_status = job.get("taskRunStatus")# Check if job failedif task_run_status == "FAILED":print(f"[dep_data_flow] Job {job_id} FAILED - stopping test")assert False, f"Job {job_id} failed with 
status: {task_run_status}"incremental_download_test.run_incremental_download_without_storage_profiles(str(tmp_path),force_bootstrap=force_bootstrap_first,lookback_window=2,test_name="dep_data_flow",)force_bootstrap_first = False# Only bootstrap on first iterationjob_complete = task_run_status in ["SUCCEEDED", "FAILED", "CANCELED"]# Check timeoutif time.time() - start_time > job_completion_timeout:assert False, f"Job {job_id} did not complete within {job_completion_timeout} seconds"# Wait 5 secs if job's still not completeif not job_complete:time.sleep(5)print(f"[dep_data_flow] Job completed after {incremental_download_iteration_number} download iterations")# Wait for all output files to be available with incremental downloadexpected_files = {"Step1.out": "2. Processed in Step1","Step1-2.8.out": "3.8 Processed in Step1-2.8","Step1-2.9.out": "3.9 Processed in Step1-2.9","Step1-2.10.out": "3.10 Processed in Step1-2.10","Step1-2.11.out": "3.11 Processed in Step1-2.11","Step1-2-3.out": "4. Processed in Step1-2-3","Step1-2-4.out": "4. Processed in Step1-2-4","Step1-2-34-5.out": "5. Processed in Step1-2-34-5",}incremental_download_test.wait_for_all_files(tmp_path=tmp_path,expected_files=expected_files,test_name="dep_data_flow",file_pattern="**/*.out",content_check="1. 
Input to CreateJob from data_dir",)print("[dep_data_flow] Successfully verified all 8 expected output files with correct content") | 5 | 69 | 2 | 318 | 12 | 253 | 349 | 253 | incremental_download_test,tmp_path | ['force_bootstrap_first', 'job_complete', 'incremental_download_iteration_number', 'unique_input_dir', 'unique_data_dir', 'task_run_status', 'job_id', 'job_completion_timeout', 'start_time', 'job', 'expected_files', 'create_job_input_content'] | None | {"Assign": 14, "AugAssign": 1, "Expr": 15, "If": 3, "While": 1} | 24 | 97 | 24 | ["unique_data_dir.mkdir", "write_text", "write_text", "unique_input_dir.mkdir", "write_text", "write_text", "submit_dep_data_flow_job", "str", "str", "print", "time.time", "print", "incremental_download_test.deadline_client.get_job", "job.get", "print", "incremental_download_test.run_incremental_download_without_storage_profiles", "str", "time.time", "time.sleep", "print", "incremental_download_test.wait_for_all_files", "print", "pytest.mark.timeout", "pytest.mark.xfail"] | 0 | [] | The function (test_incremental_download_dep_data_flow) is defined within the public class called public. The function starts at line 253 and ends at line 349. It contains 69 lines of code and has a cyclomatic complexity of 5. It takes 2 parameters (incremental_download_test, tmp_path) and does not return a value. It makes 24 function calls, which are ["unique_data_dir.mkdir", "write_text", "write_text", "unique_input_dir.mkdir", "write_text", "write_text", "submit_dep_data_flow_job", "str", "str", "print", "time.time", "print", "incremental_download_test.deadline_client.get_job", "job.get", "print", "incremental_download_test.run_incremental_download_without_storage_profiles", "str", "time.time", "time.sleep", "print", "incremental_download_test.wait_for_all_files", "print", "pytest.mark.timeout", "pytest.mark.xfail"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_incremental_download_dependency_chain | def test_incremental_download_dependency_chain(incremental_download_test, tmp_path):"""Test incremental download with dep_chain template."""unique_output_dir = f"{tmp_path}/dep_chain_output"job_id = submit_dep_chain_job(farm_id=incremental_download_test.farm_id,queue_id=incremental_download_test.queue_id,output_dir=unique_output_dir,)# Run incremental download in a loop until job completesjob_complete = Falseforce_bootstrap = Truestart_time = time.time()job_completion_timeout = 600# 10 minuteswhile not job_complete:job = incremental_download_test.deadline_client.get_job(farmId=incremental_download_test.farm_id,queueId=incremental_download_test.queue_id,jobId=job_id,)task_run_status = job.get("taskRunStatus")# Check if job failedif task_run_status == "FAILED":assert False, f"Job {job_id} failed with status: {task_run_status}"incremental_download_test.run_incremental_download_without_storage_profiles(str(tmp_path),force_bootstrap=force_bootstrap,lookback_window=2,test_name="dep_chain",)force_bootstrap = Falsejob_complete = task_run_status in ["SUCCEEDED", "FAILED", "CANCELED"]# Check timeoutif time.time() - start_time > job_completion_timeout:assert False, f"Job {job_id} did not complete within {job_completion_timeout} seconds"if not job_complete:time.sleep(5)# Run final incremental download to ensure all files are capturedtime.sleep(5)# Verify expected output files from dep_chain templateexpected_files = {}for i in range(6):# A through F = 6 filesstep_name = chr(ord("A") + i)filename = f"{step_name}.txt"expected_content = f"Step {step_name} is correct"expected_files[filename] = expected_contentprint(f"[dep_chain] Expected files: {expected_files}")incremental_download_test.wait_for_all_files(tmp_path=tmp_path,expected_files=expected_files,test_name="dep_chain",file_pattern="**/*.txt",exact_match=True,)print("[dep_chain] Successfully verified all 6 expected chain files 
with correct content") | 6 | 48 | 2 | 235 | 12 | 357 | 425 | 357 | incremental_download_test,tmp_path | ['job_complete', 'expected_content', 'force_bootstrap', 'task_run_status', 'step_name', 'filename', 'job_id', 'job_completion_timeout', 'unique_output_dir', 'job', 'expected_files', 'start_time'] | None | {"Assign": 15, "Expr": 7, "For": 1, "If": 3, "While": 1} | 17 | 69 | 17 | ["submit_dep_chain_job", "time.time", "incremental_download_test.deadline_client.get_job", "job.get", "incremental_download_test.run_incremental_download_without_storage_profiles", "str", "time.time", "time.sleep", "time.sleep", "range", "chr", "ord", "print", "incremental_download_test.wait_for_all_files", "print", "pytest.mark.timeout", "pytest.mark.xfail"] | 0 | [] | The function (test_incremental_download_dependency_chain) is defined within the public class called public. The function starts at line 357 and ends at line 425. It contains 48 lines of code and has a cyclomatic complexity of 6. It takes 2 parameters (incremental_download_test, tmp_path) and does not return a value. It makes 17 function calls, which are ["submit_dep_chain_job", "time.time", "incremental_download_test.deadline_client.get_job", "job.get", "incremental_download_test.run_incremental_download_without_storage_profiles", "str", "time.time", "time.sleep", "time.sleep", "range", "chr", "ord", "print", "incremental_download_test.wait_for_all_files", "print", "pytest.mark.timeout", "pytest.mark.xfail"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_conflict_resolution_with_requeue | def test_conflict_resolution_with_requeue(incremental_download_test, requeue_level, tmp_path):"""Test incremental download with re-queuing at different levels and conflict resolution."""files_per_task = 10task_count = 2expected_initial_files = files_per_task * task_countunique_output_dir = tmp_path / f"output_{requeue_level}"unique_output_dir.mkdir()# Added this to be sure that the unique output directory is not being sharedassert os.listdir(unique_output_dir) == []# Submit and wait for initial jobjob_id = submit_make_many_small_files_slow_job(farm_id=incremental_download_test.farm_id,queue_id=incremental_download_test.queue_id,files_per_task=files_per_task,task_count=task_count,output_dir=unique_output_dir,)job_completed, final_status = incremental_download_test.wait_for_job_completion(job_id, timeout=600)assert job_completed, f"Initial job failed with status: {final_status}"# Download initial files_run_download_until_complete(incremental_download_test,tmp_path,unique_output_dir,expected_initial_files,requeue_level,"initial",)# Requeue at specified level_requeue_at_level(incremental_download_test, job_id, requeue_level)# Wait for requeue completionrequeue_completed, requeue_status = incremental_download_test.wait_for_job_completion(job_id, timeout=600)assert requeue_completed, f"Requeue failed with status: {requeue_status}"# Download files after requeueexpected_final = _get_expected_file_count(requeue_level, expected_initial_files, files_per_task)_run_download_until_complete(incremental_download_test,tmp_path,unique_output_dir,expected_final,requeue_level,"requeue",) | 1 | 40 | 3 | 157 | 6 | 434 | 489 | 434 | incremental_download_test,requeue_level,tmp_path | ['expected_initial_files', 'files_per_task', 'task_count', 'job_id', 'expected_final', 'unique_output_dir'] | None | {"Assign": 8, "Expr": 5} | 12 | 56 | 12 | ["unique_output_dir.mkdir", "os.listdir", 
"submit_make_many_small_files_slow_job", "incremental_download_test.wait_for_job_completion", "_run_download_until_complete", "_requeue_at_level", "incremental_download_test.wait_for_job_completion", "_get_expected_file_count", "_run_download_until_complete", "pytest.mark.timeout", "pytest.mark.xfail", "pytest.mark.parametrize"] | 0 | [] | The function (test_conflict_resolution_with_requeue) is defined within the public class called public. The function starts at line 434 and ends at line 489. It contains 40 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (incremental_download_test, requeue_level, tmp_path) and does not return a value. It makes 12 function calls, which are ["unique_output_dir.mkdir", "os.listdir", "submit_make_many_small_files_slow_job", "incremental_download_test.wait_for_job_completion", "_run_download_until_complete", "_requeue_at_level", "incremental_download_test.wait_for_job_completion", "_get_expected_file_count", "_run_download_until_complete", "pytest.mark.timeout", "pytest.mark.xfail", "pytest.mark.parametrize"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _run_download_until_complete | def _run_download_until_complete(test_instance, tmp_path, unique_output_dir, expected_count, level, phase):"""Run incremental download until expected file count is reached."""# Use the actual unique output directory passed from the testtest_output_dir = Path(unique_output_dir)for iteration in range(1, 11):# Max 10 iterationsresult = test_instance.run_incremental_download_without_storage_profiles(str(tmp_path),force_bootstrap=(phase == "initial" and iteration == 1),conflict_resolution="create_copy",lookback_window=1,test_name=f"requeue_{level}",)if result.returncode != 0 and "had incorrect size 0" not in result.stdout:assert False, f"{phase.title()} download failed: {result.stderr}"# Only look for files in the test-specific output directoryfiles = list(test_output_dir.glob("**/file_*")) if test_output_dir.exists() else []if len(files) >= expected_count:breaktime.sleep(2)assert len(files) == expected_count, (f"Expected {expected_count} files after {phase} in {test_output_dir}, got {len(files)}") | 7 | 21 | 6 | 135 | 3 | 492 | 520 | 492 | test_instance,tmp_path,unique_output_dir,expected_count,level,phase | ['test_output_dir', 'result', 'files'] | None | {"Assign": 3, "Expr": 2, "For": 1, "If": 2} | 12 | 29 | 12 | ["Path", "range", "test_instance.run_incremental_download_without_storage_profiles", "str", "phase.title", "test_output_dir.exists", "list", "test_output_dir.glob", "len", "time.sleep", "len", "len"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py.test_conflict_resolution_with_requeue"] | The function (_run_download_until_complete) is defined within the public class called public. The function starts at line 492 and ends at line 520. It contains 21 lines of code and has a cyclomatic complexity of 7. 
It takes 6 parameters (test_instance, tmp_path, unique_output_dir, expected_count, level, phase) and does not return a value. It makes 12 function calls, which are ["Path", "range", "test_instance.run_incremental_download_without_storage_profiles", "str", "phase.title", "test_output_dir.exists", "list", "test_output_dir.glob", "len", "time.sleep", "len", "len"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py.test_conflict_resolution_with_requeue"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _wait_for_requeue_to_take_effect | def _wait_for_requeue_to_take_effect(test_instance, job_id, timeout=60):"""Wait for requeue to take effect by checking job status changes from SUCCEEDED."""client = test_instance.deadline_clientfarm_id = test_instance.farm_idqueue_id = test_instance.queue_idprint(f"Waiting for requeue to take effect for job {job_id}...")start_time = time.time()while time.time() - start_time < timeout:job = client.get_job(farmId=farm_id, queueId=queue_id, jobId=job_id)task_run_status = job.get("taskRunStatus")print(f"Current job taskRunStatus: {task_run_status}")# Job has been re-queued if it's no longer SUCCEEDEDif task_run_status in ["READY", "ASSIGNED", "STARTING", "SCHEDULED", "RUNNING"]:print(f"Requeue took effect - job status is now: {task_run_status}")return Truetime.sleep(2)# If we get here, requeue didn't take effect within timeoutjob = client.get_job(farmId=farm_id, queueId=queue_id, jobId=job_id)current_status = job.get("taskRunStatus")raise AssertionError(f"Requeue did not take effect within {timeout}s. Job status is still: {current_status}") | 3 | 19 | 3 | 141 | 7 | 523 | 550 | 523 | test_instance,job_id,timeout | ['client', 'current_status', 'farm_id', 'task_run_status', 'start_time', 'job', 'queue_id'] | Returns | {"Assign": 8, "Expr": 5, "If": 1, "Return": 1, "While": 1} | 11 | 28 | 11 | ["print", "time.time", "time.time", "client.get_job", "job.get", "print", "print", "time.sleep", "client.get_job", "job.get", "AssertionError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py._requeue_at_level"] | The function (_wait_for_requeue_to_take_effect) is defined within the public class called public. The function starts at line 523 and ends at line 550. It contains 19 lines of code and has a cyclomatic complexity of 3. 
It takes 3 parameters (test_instance, job_id, timeout) and returns a value. It makes 11 function calls, which are ["print", "time.time", "time.time", "client.get_job", "job.get", "print", "print", "time.sleep", "client.get_job", "job.get", "AssertionError"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py._requeue_at_level"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _requeue_at_level | def _requeue_at_level(test_instance, job_id, level):"""Requeue job at the specified level."""client = test_instance.deadline_clientfarm_id = test_instance.farm_idqueue_id = test_instance.queue_idif level == "job":client.update_job(farmId=farm_id, queueId=queue_id, jobId=job_id, targetTaskRunStatus="READY")else:steps = client.list_steps(farmId=farm_id, queueId=queue_id, jobId=job_id)["steps"]assert steps, "No steps found in job"step_id = steps[0]["stepId"]if level == "step":client.update_step(farmId=farm_id,queueId=queue_id,jobId=job_id,stepId=step_id,targetTaskRunStatus="READY",)else:# tasktasks = client.list_tasks(farmId=farm_id, queueId=queue_id, jobId=job_id, stepId=step_id)["tasks"]assert tasks, "No tasks found in step"task_id = tasks[0]["taskId"]client.update_task(farmId=farm_id,queueId=queue_id,jobId=job_id,stepId=step_id,taskId=task_id,targetRunStatus="READY",)print(f"Requeue API call completed for {level} level")# Wait for the requeue to actually take effect_wait_for_requeue_to_take_effect(test_instance, job_id) | 3 | 36 | 3 | 196 | 7 | 553 | 594 | 553 | test_instance,job_id,level | ['client', 'task_id', 'farm_id', 'steps', 'tasks', 'step_id', 'queue_id'] | None | {"Assign": 7, "Expr": 6, "If": 2} | 7 | 42 | 7 | ["client.update_job", "client.list_steps", "client.update_step", "client.list_tasks", "client.update_task", "print", "_wait_for_requeue_to_take_effect"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py.test_conflict_resolution_with_requeue"] | The function (_requeue_at_level) is defined within the public class called public. The function starts at line 553 and ends at line 594. It contains 36 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters (test_instance, job_id, level) and does not return a value. 
It makes 7 function calls, which are ["client.update_job", "client.list_steps", "client.update_step", "client.list_tasks", "client.update_task", "print", "_wait_for_requeue_to_take_effect"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py.test_conflict_resolution_with_requeue"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_expected_file_count | def _get_expected_file_count(level, initial_count, files_per_task):"""Calculate expected file count after requeue based on level."""return initial_count * 2 if level in ["job", "step"] else initial_count + files_per_task | 2 | 2 | 3 | 26 | 0 | 597 | 599 | 597 | level,initial_count,files_per_task | [] | Returns | {"Expr": 1, "Return": 1} | 0 | 3 | 0 | [] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py.test_conflict_resolution_with_requeue"] | The function (_get_expected_file_count) is defined within the public class called public. The function starts at line 597 and ends at line 599. It contains 2 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters (level, initial_count, files_per_task) and returns a value. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py.test_conflict_resolution_with_requeue"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | temp_dir | def temp_dir(self):with tempfile.TemporaryDirectory() as tmpdir_path:yield tmpdir_path | 1 | 3 | 1 | 16 | 0 | 32 | 34 | 32 | self | [] | None | {"Expr": 1, "With": 1} | 1 | 3 | 1 | ["tempfile.TemporaryDirectory"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3924769_jaraco_zipp.tests.test_path_py.TestPath.zipfile_ondisk"] | The function (temp_dir) is defined within the public class called TestManifestDownload. The function starts at line 32 and ends at line 34. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and does not return a value. It makes 1 function call, which is ["tempfile.TemporaryDirectory"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3924769_jaraco_zipp.tests.test_path_py.TestPath.zipfile_ondisk"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | _assert_input_mainfests_exist | def _assert_input_mainfests_exist(self, files):assert "inputs/textures", "brick.png" in filesassert "inputs/textures", "cloth.png" in filesassert "inputs/scene.ma" in files | 1 | 4 | 2 | 23 | 0 | 36 | 39 | 36 | self,files | [] | None | {} | 0 | 4 | 0 | [] | 0 | [] | The function (_assert_input_mainfests_exist) is defined within the public class called TestManifestDownload. The function starts at line 36 and ends at line 39. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, files) and does not return a value. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | _assert_output_manifests_exist | def _assert_output_manifests_exist(self, files):assert "output_file" in filesassert "output/nested_output_file" in files | 1 | 3 | 2 | 15 | 0 | 41 | 43 | 41 | self,files | [] | None | {} | 0 | 3 | 0 | [] | 0 | [] | The function (_assert_output_manifests_exist) is defined within the public class called TestManifestDownload. The function starts at line 41 and ends at line 43. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, files) and does not return a value. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | _assert_dependent_step_output_exist | def _assert_dependent_step_output_exist(self, files):assert "dependent_step_output_file" in filesassert "dependent_step_output/nested_output_file" in files | 1 | 3 | 2 | 15 | 0 | 45 | 47 | 45 | self,files | [] | None | {} | 0 | 3 | 0 | [] | 0 | [] | The function (_assert_dependent_step_output_exist) is defined within the public class called TestManifestDownload. The function starts at line 45 and ends at line 47. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, files) and does not return a value. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | check_json_mode_contains_files | def check_json_mode_contains_files(self, result):# If JSON mode was specified, make sure the output is JSON and contains the downloaded manifest file.download = json.loads(result.output)assert download is not Noneassert len(download["downloaded"]) == 1return download | 1 | 5 | 2 | 34 | 0 | 49 | 54 | 49 | self,result | [] | Returns | {"Assign": 1, "Return": 1} | 2 | 6 | 2 | ["json.loads", "len"] | 0 | [] | The function (check_json_mode_contains_files) is defined within the public class called TestManifestDownload. The function starts at line 49 and ends at line 54. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, result) and returns a value. It makes 2 function calls, which are ["json.loads", "len"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | validate_result_exit_code_0 | def validate_result_exit_code_0(self, job_attachment_test, result):# Thenassert result.exit_code == 0, (f"{result.output}, {job_attachment_test.farm_id}, {job_attachment_test.queue_id}") | 1 | 4 | 3 | 20 | 0 | 56 | 60 | 56 | self,job_attachment_test,result | [] | None | {} | 0 | 5 | 0 | [] | 0 | [] | The function (validate_result_exit_code_0) is defined within the public class TestManifestDownload. It starts at line 56 and ends at line 60, contains 4 lines of code, and has a cyclomatic complexity of 1. It takes 3 parameters and does not return a value. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | run_cli_with_params | def run_cli_with_params(self, asset_type, job_attachment_test, job_id, json_output, step_id, temp_dir):runner = CliRunner()# Download for farm, queue, job to temp dir.args = ["manifest","download","--farm-id",job_attachment_test.farm_id,"--queue-id",job_attachment_test.queue_id,"--job-id",job_id,"--asset-type",asset_type,temp_dir,]if json_output:args.append("--json")if step_id:args.append("--step-id")args.append(step_id)result = runner.invoke(main, args)return result | 3 | 24 | 7 | 88 | 0 | 62 | 89 | 62 | self,asset_type,job_attachment_test,job_id,json_output,step_id,temp_dir | [] | Returns | {"Assign": 3, "Expr": 3, "If": 2, "Return": 1} | 5 | 28 | 5 | ["CliRunner", "args.append", "args.append", "args.append", "runner.invoke"] | 0 | [] | The function (run_cli_with_params) is defined within the public class TestManifestDownload. It starts at line 62 and ends at line 89, contains 24 lines of code, and has a cyclomatic complexity of 3. It takes 7 parameters and returns a value. It makes 5 function calls: ["CliRunner", "args.append", "args.append", "args.append", "runner.invoke"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | _setup_create_job | def _setup_create_job(self,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,job_template: str,job_attachment_test: JobAttachmentTest,) -> str:"""Create a job with the provided template and wait for the job to be created."""farm_id: str = job_attachment_test.farm_idqueue_id: str = job_attachment_test.queue_id# Setup failure for the test.assert farm_idassert queue_id# Create a job w/ CAS data already created.job_response = job_attachment_test.deadline_client.create_job(farmId=farm_id,queueId=queue_id,attachments=upload_input_files_one_asset_in_cas.attachments.to_dict(),# type: ignoretargetTaskRunStatus="SUSPENDED",template=job_template,templateType="JSON",priority=50,)job_id: str = job_response["jobId"]# Wait for the job to be created.waiter = job_attachment_test.deadline_client.get_waiter("job_create_complete")waiter.wait(jobId=job_id,queueId=job_attachment_test.queue_id,farmId=job_attachment_test.farm_id,)# Return the created Job ID.return job_id | 1 | 27 | 4 | 123 | 0 | 91 | 129 | 91 | self,upload_input_files_one_asset_in_cas,job_template,job_attachment_test | [] | str | {"AnnAssign": 3, "Assign": 2, "Expr": 2, "Return": 1} | 4 | 39 | 4 | ["job_attachment_test.deadline_client.create_job", "upload_input_files_one_asset_in_cas.attachments.to_dict", "job_attachment_test.deadline_client.get_waiter", "waiter.wait"] | 0 | [] | The function (_setup_create_job) is defined within the public class TestManifestDownload. It starts at line 91 and ends at line 129, contains 27 lines of code, and has a cyclomatic complexity of 1. It takes 4 parameters and returns a str. It makes 4 function calls: ["job_attachment_test.deadline_client.create_job", "upload_input_files_one_asset_in_cas.attachments.to_dict", "job_attachment_test.deadline_client.get_waiter", "waiter.wait"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | validate_manifest_is_not_None | def validate_manifest_is_not_None(self, manifest_file):manifest: BaseAssetManifest = decode_manifest(manifest_file.read())assert manifest is not Nonereturn manifest | 1 | 4 | 2 | 26 | 0 | 131 | 134 | 131 | self,manifest_file | [] | Returns | {"AnnAssign": 1, "Return": 1} | 2 | 4 | 2 | ["decode_manifest", "manifest_file.read"] | 0 | [] | The function (validate_manifest_is_not_None) is defined within the public class TestManifestDownload. It starts at line 131 and ends at line 134, contains 4 lines of code, and has a cyclomatic complexity of 1. It takes 2 parameters and returns a value. It makes 2 function calls: ["decode_manifest", "manifest_file.read"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | _sync_mock_output_file | def _sync_mock_output_file(self,job_attachment_test: JobAttachmentTest,job_id: str,first_step_name: str,second_step_name: Optional[str],asset_root_path: str,output_file_path: str,nested_output_file_path: str,) -> str:"""Create a fake manifest file, upload it as a step output and return the step ID that is dependent.job_attachment_test: JobAttachmentTest test harnessjob_id: str, self-explanatory.first_step_name: str, independent step.second_step_name: Optional[str], dependent on first step.asset_root_path: Asset root to upload an output file."""list_steps_response = job_attachment_test.deadline_client.list_steps(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=job_id,)# Find the IDs of the steps:step_ids = {step["name"]: step["stepId"] for step in list_steps_response["steps"]}first_step_id = step_ids[first_step_name]second_step_id = step_ids[second_step_name] if second_step_name is not None else None# Get the task of the first step so we can upload a fake manifest.first_step_first_task_id = job_attachment_test.deadline_client.list_tasks(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=job_id,stepId=first_step_id,)["tasks"][0]["taskId"]assert first_step_first_task_id is not None# Create a fake manifest as output and upload it to S3.asset_sync = AssetSync(job_attachment_test.farm_id)output_manifest = AssetManifest(hash_alg=HashAlgorithm("xxh128"),total_size=10,paths=[ManifestPath(path=output_file_path, hash="a", size=1, mtime=167907934333848),ManifestPath(path=nested_output_file_path, hash="b", size=1, mtime=1479079344833848),],)session_action_id_with_time_stamp = (f"{_float_to_iso_datetime_string(time.time())}_session-86231a00283449158900410c7d58051e")full_output_prefix = job_attachment_test.job_attachment_settings.full_output_prefix(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_id=job_id,step_id=first_step_id,task_id=first_step_first_task_id,session_action_id=session_action_id_with_time_stamp,)asset_sync._upload_output_manifest_to_s3(s3_settings=job_attachment_test.job_attachment_settings,output_manifest=output_manifest,full_output_prefix=full_output_prefix,root_path=asset_root_path,)return second_step_id if second_step_id is not None else first_step_id | 4 | 54 | 8 | 289 | 0 | 136 | 206 | 136 | self,job_attachment_test,job_id,first_step_name,second_step_name,asset_root_path,output_file_path,nested_output_file_path | [] | str | {"Assign": 9, "Expr": 2, "Return": 1} | 11 | 71 | 11 | ["job_attachment_test.deadline_client.list_steps", "job_attachment_test.deadline_client.list_tasks", "AssetSync", "AssetManifest", "HashAlgorithm", "ManifestPath", "ManifestPath", "_float_to_iso_datetime_string", "time.time", "job_attachment_test.job_attachment_settings.full_output_prefix", "asset_sync._upload_output_manifest_to_s3"] | 0 | [] | The function (_sync_mock_output_file) is defined within the public class TestManifestDownload. It starts at line 136 and ends at line 206, contains 54 lines of code, and has a cyclomatic complexity of 4. It takes 8 parameters and returns a str. It makes 11 function calls: ["job_attachment_test.deadline_client.list_steps", "job_attachment_test.deadline_client.list_tasks", "AssetSync", "AssetManifest", "HashAlgorithm", "ManifestPath", "ManifestPath", "_float_to_iso_datetime_string", "time.time", "job_attachment_test.job_attachment_settings.full_output_prefix", "asset_sync._upload_output_manifest_to_s3"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | test_manifest_download_job_asset_type_with_no_step_dependency | def test_manifest_download_job_asset_type_with_no_step_dependency(self,temp_dir: str,json_output: bool,asset_type: str,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,default_job_template: str,job_attachment_test: JobAttachmentTest,):# Given:# Create a jobjob_id: str = self._setup_create_job(upload_input_files_one_asset_in_cas, default_job_template, job_attachment_test)# Upload the task output manifest for the stepasset_root_path: str = upload_input_files_one_asset_in_cas.attachments.manifests[0].rootPathstep_id: str = self._sync_mock_output_file(job_attachment_test,job_id,"custom-step-2",None,asset_root_path,"output_file","output/nested_output_file",)# Whenresult = self.run_cli_with_params(asset_type=asset_type,job_attachment_test=job_attachment_test,job_id=job_id,json_output=json_output,temp_dir=temp_dir,step_id=step_id,)self.validate_result_exit_code_0(job_attachment_test, result)if json_output:download = self.check_json_mode_contains_files(result)# With JSON mode, we can also check the manifest file itself.with open(download["downloaded"][0]["local_manifest_path"]) as manifest_file:manifest = self.validate_manifest_is_not_None(manifest_file)# Create a list of files we know should be in the input paths.files: List[str] = [path.path for path in manifest.paths]if asset_type == AssetType.INPUT:self._assert_input_mainfests_exist(files)if asset_type == AssetType.OUTPUT:assert "inputs/textures", "brick.png" not in filesself._assert_output_manifests_exist(files)if asset_type == AssetType.ALL:self._assert_input_mainfests_exist(files)self._assert_output_manifests_exist(files) | 6 | 44 | 7 | 226 | 0 | 218 | 274 | 218 | self,temp_dir,json_output,asset_type,upload_input_files_one_asset_in_cas,default_job_template,job_attachment_test | [] | None | {"AnnAssign": 4, "Assign": 3, "Expr": 5, "If": 4, "With": 1} | 15 | 57 | 15 | ["self._setup_create_job", "self._sync_mock_output_file", "self.run_cli_with_params", "self.validate_result_exit_code_0", "self.check_json_mode_contains_files", "open", "self.validate_manifest_is_not_None", "self._assert_input_mainfests_exist", "self._assert_output_manifests_exist", "self._assert_input_mainfests_exist", "self._assert_output_manifests_exist", "pytest.mark.parametrize", "pytest.param", "pytest.param", "pytest.mark.parametrize"] | 0 | [] | The function (test_manifest_download_job_asset_type_with_no_step_dependency) is defined within the public class TestManifestDownload. It starts at line 218 and ends at line 274, contains 44 lines of code, and has a cyclomatic complexity of 6. It takes 7 parameters and does not return a value. It makes 15 function calls: ["self._setup_create_job", "self._sync_mock_output_file", "self.run_cli_with_params", "self.validate_result_exit_code_0", "self.check_json_mode_contains_files", "open", "self.validate_manifest_is_not_None", "self._assert_input_mainfests_exist", "self._assert_output_manifests_exist", "self._assert_input_mainfests_exist", "self._assert_output_manifests_exist", "pytest.mark.parametrize", "pytest.param", "pytest.param", "pytest.mark.parametrize"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | test_manifest_download_asset_type_with_job_step_dependency | def test_manifest_download_asset_type_with_job_step_dependency(self,temp_dir: str,asset_type: str,json_output: bool,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,default_job_template_step_step_dependency: str,job_attachment_test: JobAttachmentTest,):# Create a job, with step step dependency.job_id: str = self._setup_create_job(upload_input_files_one_asset_in_cas,default_job_template_step_step_dependency,job_attachment_test,)# Upload a dependent task output manifest.asset_root_path: str = upload_input_files_one_asset_in_cas.attachments.manifests[0].rootPathself._sync_mock_output_file(job_attachment_test,job_id,"custom-step","custom-step-2",asset_root_path,"dependent_step_output_file","dependent_step_output/nested_output_file",)# Upload the task output manifestself._sync_mock_output_file(job_attachment_test,job_id,"custom-step-2",None,asset_root_path,"output_file","output/nested_output_file",)result = self.run_cli_with_params(asset_type=asset_type,job_attachment_test=job_attachment_test,job_id=job_id,json_output=json_output,temp_dir=temp_dir,step_id=None,)self.validate_result_exit_code_0(job_attachment_test, result)if json_output:download = self.check_json_mode_contains_files(result)# With JSON mode, we can also check the manifest file itself.with open(download["downloaded"][0]["local_manifest_path"]) as manifest_file:manifest = self.validate_manifest_is_not_None(manifest_file)# Create a list of files we know should be in the input paths.files: List[str] = [path.path for path in manifest.paths]if asset_type == AssetType.INPUT:self._assert_input_mainfests_exist(files)# No step id in request hence no step dependencies outputsassert "dependent_step_output_file" not in filesif asset_type == AssetType.OUTPUT:assert "inputs/textures", "brick.png" not in filesself._assert_output_manifests_exist(files)self._assert_dependent_step_output_exist(files)if asset_type == AssetType.ALL:self._assert_input_mainfests_exist(files)self._assert_dependent_step_output_exist(files)self._assert_output_manifests_exist(files) | 6 | 58 | 7 | 259 | 0 | 286 | 357 | 286 | self,temp_dir,asset_type,json_output,upload_input_files_one_asset_in_cas,default_job_template_step_step_dependency,job_attachment_test | [] | None | {"AnnAssign": 3, "Assign": 3, "Expr": 9, "If": 4, "With": 1} | 18 | 72 | 18 | ["self._setup_create_job", "self._sync_mock_output_file", "self._sync_mock_output_file", "self.run_cli_with_params", "self.validate_result_exit_code_0", "self.check_json_mode_contains_files", "open", "self.validate_manifest_is_not_None", "self._assert_input_mainfests_exist", "self._assert_output_manifests_exist", "self._assert_dependent_step_output_exist", "self._assert_input_mainfests_exist", "self._assert_dependent_step_output_exist", "self._assert_output_manifests_exist", "pytest.mark.parametrize", "pytest.param", "pytest.param", "pytest.mark.parametrize"] | 0 | [] | The function (test_manifest_download_asset_type_with_job_step_dependency) is defined within the public class TestManifestDownload. It starts at line 286 and ends at line 357, contains 58 lines of code, and has a cyclomatic complexity of 6. It takes 7 parameters and does not return a value. It makes 18 function calls: ["self._setup_create_job", "self._sync_mock_output_file", "self._sync_mock_output_file", "self.run_cli_with_params", "self.validate_result_exit_code_0", "self.check_json_mode_contains_files", "open", "self.validate_manifest_is_not_None", "self._assert_input_mainfests_exist", "self._assert_output_manifests_exist", "self._assert_dependent_step_output_exist", "self._assert_input_mainfests_exist", "self._assert_dependent_step_output_exist", "self._assert_output_manifests_exist", "pytest.mark.parametrize", "pytest.param", "pytest.param", "pytest.mark.parametrize"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | test_manifest_download_output_only_for_one_step | def test_manifest_download_output_only_for_one_step(self,temp_dir: str,json_output: bool,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,default_job_template_step_step_dependency: str,job_attachment_test: JobAttachmentTest,):# Create a job, with step-step dependency.job_id: str = self._setup_create_job(upload_input_files_one_asset_in_cas,default_job_template_step_step_dependency,job_attachment_test,)# Upload a dependent task output manifest.asset_root_path: str = upload_input_files_one_asset_in_cas.attachments.manifests[0].rootPathsecond_step_id: str = self._sync_mock_output_file(job_attachment_test,job_id,"custom-step","custom-step-2",asset_root_path,"dependent_step_output_file","dependent_step_output/nested_output_file",)# Upload the task output manifest for the stepself._sync_mock_output_file(job_attachment_test,job_id,"custom-step-2",None,asset_root_path,"output_file","output/nested_output_file",)# Whenresult = self.run_cli_with_params(asset_type="output",job_attachment_test=job_attachment_test,job_id=job_id,json_output=json_output,temp_dir=temp_dir,step_id=second_step_id,)self.validate_result_exit_code_0(job_attachment_test, result)if json_output:download = self.check_json_mode_contains_files(result)# With JSON mode, we can also check the manifest file itself.with open(download["downloaded"][0]["local_manifest_path"]) as manifest_file:manifest = self.validate_manifest_is_not_None(manifest_file)# Create a list of files we know should be in the input paths.files: List[str] = [path.path for path in manifest.paths]assert "inputs/textures", "brick.png" not in filesassert "dependent_step_output_file" not in filesself._assert_output_manifests_exist(files) | 3 | 49 | 6 | 208 | 0 | 366 | 426 | 366 | self,temp_dir,json_output,upload_input_files_one_asset_in_cas,default_job_template_step_step_dependency,job_attachment_test | [] | None | {"AnnAssign": 4, "Assign": 3, "Expr": 3, "If": 1, "With": 1} | 12 | 61 | 12 | ["self._setup_create_job", "self._sync_mock_output_file", "self._sync_mock_output_file", "self.run_cli_with_params", "self.validate_result_exit_code_0", "self.check_json_mode_contains_files", "open", "self.validate_manifest_is_not_None", "self._assert_output_manifests_exist", "pytest.mark.parametrize", "pytest.param", "pytest.param"] | 0 | [] | The function (test_manifest_download_output_only_for_one_step) is defined within the public class TestManifestDownload. It starts at line 366 and ends at line 426, contains 49 lines of code, and has a cyclomatic complexity of 3. It takes 6 parameters and does not return a value. It makes 12 function calls: ["self._setup_create_job", "self._sync_mock_output_file", "self._sync_mock_output_file", "self.run_cli_with_params", "self.validate_result_exit_code_0", "self.check_json_mode_contains_files", "open", "self.validate_manifest_is_not_None", "self._assert_output_manifests_exist", "pytest.mark.parametrize", "pytest.param", "pytest.param"]. |
aws-deadline_deadline-cloud | TestManifestDownload | public | 0 | 0 | test_manifest_download_job_input_with_given_step_and_step_dependency | def test_manifest_download_job_input_with_given_step_and_step_dependency(self,temp_dir: str,json_output: bool,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,default_job_template_step_step_dependency: str,job_attachment_test: JobAttachmentTest,):# Given:# Create a job, with step step dependency.job_id: str = self._setup_create_job(upload_input_files_one_asset_in_cas,default_job_template_step_step_dependency,job_attachment_test,)# Upload the task output manifest for the stepasset_root_path: str = upload_input_files_one_asset_in_cas.attachments.manifests[0].rootPathself._sync_mock_output_file(job_attachment_test,job_id,"custom-step","custom-step-2",asset_root_path,"dependent_step_output_file","dependent_step_output/nested_output_file",)# Upload the task output manifest for the stepsecond_step_id: str = self._sync_mock_output_file(job_attachment_test,job_id,"custom-step-2",None,asset_root_path,"output_file","output/nested_output_file",)# Whenresult = self.run_cli_with_params(asset_type="input",job_attachment_test=job_attachment_test,job_id=job_id,json_output=json_output,temp_dir=temp_dir,step_id=second_step_id,)self.validate_result_exit_code_0(job_attachment_test, result)if json_output:download = self.check_json_mode_contains_files(result)# With JSON mode, we can also check the manifest file itself.with open(download["downloaded"][0]["local_manifest_path"]) as manifest_file:manifest = self.validate_manifest_is_not_None(manifest_file)# Create a list of files we know should be in the input paths.files: List[str] = [path.path for path in manifest.paths]self._assert_input_mainfests_exist(files)self._assert_dependent_step_output_exist(files) | 3 | 48 | 6 | 202 | 0 | 435 | 495 | 435 | self,temp_dir,json_output,upload_input_files_one_asset_in_cas,default_job_template_step_step_dependency,job_attachment_test | [] | None | {"AnnAssign": 4, "Assign": 3, "Expr": 4, "If": 1, "With": 1} | 13 | 61 | 13 | ["self._setup_create_job", "self._sync_mock_output_file", "self._sync_mock_output_file", "self.run_cli_with_params", "self.validate_result_exit_code_0", "self.check_json_mode_contains_files", "open", "self.validate_manifest_is_not_None", "self._assert_input_mainfests_exist", "self._assert_dependent_step_output_exist", "pytest.mark.parametrize", "pytest.param", "pytest.param"] | 0 | [] | The function (test_manifest_download_job_input_with_given_step_and_step_dependency) is defined within the public class TestManifestDownload. It starts at line 435 and ends at line 495, contains 48 lines of code, and has a cyclomatic complexity of 3. It takes 6 parameters and does not return a value. It makes 13 function calls: ["self._setup_create_job", "self._sync_mock_output_file", "self._sync_mock_output_file", "self.run_cli_with_params", "self.validate_result_exit_code_0", "self.check_json_mode_contains_files", "open", "self.validate_manifest_is_not_None", "self._assert_input_mainfests_exist", "self._assert_dependent_step_output_exist", "pytest.mark.parametrize", "pytest.param", "pytest.param"]. |
aws-deadline_deadline-cloud | TestManifestSnapshot | public | 0 | 0 | temp_dir | def temp_dir(self):with tempfile.TemporaryDirectory() as tmpdir_path:yield tmpdir_path | 1 | 3 | 1 | 16 | 0 | 24 | 26 | 24 | self | [] | None | {"Expr": 1, "With": 1} | 1 | 3 | 1 | ["tempfile.TemporaryDirectory"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3924769_jaraco_zipp.tests.test_path_py.TestPath.zipfile_ondisk"] | The function (temp_dir) is defined within the public class TestManifestSnapshot. It starts at line 24 and ends at line 26, contains 3 lines of code, and has a cyclomatic complexity of 1. It takes 1 parameter and does not return a value. It makes 1 function call: ["tempfile.TemporaryDirectory"], and it has 1 incoming caller: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3924769_jaraco_zipp.tests.test_path_py.TestPath.zipfile_ondisk"]. |
aws-deadline_deadline-cloud | TestManifestSnapshot | public | 0 | 0 | manifest_dir | def manifest_dir(self, temp_dir):manifest_dir = os.path.join(temp_dir, "manifest_dir")os.makedirs(manifest_dir)yield manifest_dir | 1 | 4 | 2 | 27 | 0 | 29 | 33 | 29 | self,temp_dir | [] | None | {"Assign": 1, "Expr": 2} | 2 | 5 | 2 | ["os.path.join", "os.makedirs"] | 0 | [] | The function (manifest_dir) is defined within the public class TestManifestSnapshot. It starts at line 29 and ends at line 33, contains 4 lines of code, and has a cyclomatic complexity of 1. It takes 2 parameters and does not return a value. It makes 2 function calls: ["os.path.join", "os.makedirs"]. |
aws-deadline_deadline-cloud | TestManifestSnapshot | public | 0 | 0 | _create_test_manifest | def _create_test_manifest(self, tmp_path: str, root_dir: str) -> str:"""Create some test files in the temp dir, snapshot it and return the manifest file."""TEST_MANIFEST_DIR = "manifest_dir"# Givenmanifest_dir = os.path.join(tmp_path, TEST_MANIFEST_DIR)os.makedirs(manifest_dir)subdir1 = os.path.join(root_dir, "subdir1")subdir2 = os.path.join(root_dir, "subdir2")os.makedirs(subdir1)os.makedirs(subdir2)Path(os.path.join(subdir1, "file1.txt")).touch()Path(os.path.join(subdir2, "file2.txt")).touch()# When snapshot is called.runner = CliRunner()result = runner.invoke(main,["manifest","snapshot","--root",root_dir,"--destination",manifest_dir,"--name","test",],)assert result.exit_code == 0, result.outputmanifest_files = os.listdir(manifest_dir)assert len(manifest_files) == 1, (f"Expected exactly one manifest file, but got {len(manifest_files)}")manifest = manifest_files[0]assert "test" in manifest, f"Expected test in manifest file name, got {manifest}"# Return the manifest that we found.return os.path.join(manifest_dir, manifest) | 1 | 32 | 3 | 194 | 0 | 35 | 77 | 35 | self,tmp_path,root_dir | [] | str | {"Assign": 8, "Expr": 6, "Return": 1} | 18 | 43 | 18 | ["os.path.join", "os.makedirs", "os.path.join", "os.path.join", "os.makedirs", "os.makedirs", "touch", "Path", "os.path.join", "touch", "Path", "os.path.join", "CliRunner", "runner.invoke", "os.listdir", "len", "len", "os.path.join"] | 0 | [] | The function (_create_test_manifest) is defined within the public class TestManifestSnapshot. It starts at line 35 and ends at line 77, contains 32 lines of code, and has a cyclomatic complexity of 1. It takes 3 parameters and returns a str. It makes 18 function calls: ["os.path.join", "os.makedirs", "os.path.join", "os.path.join", "os.makedirs", "os.makedirs", "touch", "Path", "os.path.join", "touch", "Path", "os.path.join", "CliRunner", "runner.invoke", "os.listdir", "len", "len", "os.path.join"]. |
aws-deadline_deadline-cloud | TestManifestSnapshot | public | 0 | 0 | test_manifest_diff | def test_manifest_diff(self, tmp_path: str, json_output: bool):"""Tests if manifest diff CLI works, basic case. Variation on JSON as printout.Business logic testing will be done at the API level where we can check the outputs."""TEST_ROOT_DIR = "root_dir"# Given a created manifest file...root_dir = os.path.join(tmp_path, TEST_ROOT_DIR)manifest = self._create_test_manifest(tmp_path, root_dir)# Lets add another file.new_file = "file3.txt"Path(os.path.join(root_dir, new_file)).touch()# Whenrunner = CliRunner()args = ["manifest", "diff", "--root", root_dir, "--manifest", manifest]if json_output:args.append("--json")result = runner.invoke(main, args)# Thenassert result.exit_code == 0, result.outputif json_output:# If JSON mode was specified, make sure the output is JSON and contains the new file.diff = json.loads(result.output)assert len(diff["new"]) == 1assert new_file in diff["new"] | 3 | 16 | 3 | 138 | 0 | 86 | 114 | 86 | self,tmp_path,json_output | [] | None | {"Assign": 8, "Expr": 3, "If": 2} | 13 | 29 | 13 | ["os.path.join", "self._create_test_manifest", "touch", "Path", "os.path.join", "CliRunner", "args.append", "runner.invoke", "json.loads", "len", "pytest.mark.parametrize", "pytest.param", "pytest.param"] | 0 | [] | The function (test_manifest_diff) is defined within the public class TestManifestSnapshot. It starts at line 86 and ends at line 114, contains 16 lines of code, and has a cyclomatic complexity of 3. It takes 3 parameters and does not return a value. It makes 13 function calls: ["os.path.join", "self._create_test_manifest", "touch", "Path", "os.path.join", "CliRunner", "args.append", "runner.invoke", "json.loads", "len", "pytest.mark.parametrize", "pytest.param", "pytest.param"]. |
aws-deadline_deadline-cloud | TestManifestSnapshot | public | 0 | 0 | test_manifest_snapshot_and_diff_with_include | def test_manifest_snapshot_and_diff_with_include(self, tmp_path: str, manifest_dir: str):"""Tests creating a snapshot of the whole root directory and then running a diff with an include filter.This verifies that:1. The initial snapshot contains all files2. The diff correctly identifies changes only in the included subdirectory"""TEST_ROOT_DIR = "root_dir"# Given a created manifest file...root_dir = os.path.join(tmp_path, TEST_ROOT_DIR)manifest = self._create_test_manifest(tmp_path, root_dir)# Lets add another file.new_file = "subdir1/file3.txt"Path(os.path.join(root_dir, new_file)).touch()new_nested_file = "subdir1/nested/file4.txt"nested_file_path = Path(os.path.join(root_dir, new_nested_file))nested_file_path.parent.mkdir(parents=True, exist_ok=True)# Create parent directories if they don't existnested_file_path.touch()# Whenrunner = CliRunner()args = ["manifest","snapshot","--root",root_dir,"--diff",manifest,"--include","subdir1/**","--destination",manifest_dir,]result = runner.invoke(main, args)assert result.exit_code == 0, result.output# Find the manifest filemanifest_files = os.listdir(manifest_dir)assert len(manifest_files) == 1, (f"Expected exactly one manifest file, but got {len(manifest_files)}")manifest_path = os.path.join(manifest_dir, manifest_files[0])# Read the manifest file and verify its contentswith open(manifest_path, "r") as f:manifest_data = json.load(f)actual_files = set()for file_entry in manifest_data.get("paths", []):actual_files.add(file_entry.get("path"))assert actual_files == {new_file, new_nested_file}, (f"Expected manifest to contain only 'subdir1/file3.txt' and 'subdir1/nested/file4.txt', "f"but got {actual_files}") | 2 | 41 | 3 | 242 | 0 | 116 | 175 | 116 | self,tmp_path,manifest_dir | [] | None | {"Assign": 13, "Expr": 5, "For": 1, "With": 1} | 21 | 60 | 21 | ["os.path.join", "self._create_test_manifest", "touch", "Path", "os.path.join", "Path", "os.path.join", "nested_file_path.parent.mkdir", "nested_file_path.touch", "CliRunner", "runner.invoke", "os.listdir", "len", "len", "os.path.join", "open", "json.load", "set", "manifest_data.get", "actual_files.add", "file_entry.get"] | 0 | [] | The function (test_manifest_snapshot_and_diff_with_include) is defined within the public class TestManifestSnapshot. It starts at line 116 and ends at line 175, contains 41 lines of code, and has a cyclomatic complexity of 2. It takes 3 parameters and does not return a value. It makes 21 function calls: ["os.path.join", "self._create_test_manifest", "touch", "Path", "os.path.join", "Path", "os.path.join", "nested_file_path.parent.mkdir", "nested_file_path.touch", "CliRunner", "runner.invoke", "os.listdir", "len", "len", "os.path.join", "open", "json.load", "set", "manifest_data.get", "actual_files.add", "file_entry.get"]. |
aws-deadline_deadline-cloud | TestManifestSnapshot | public | 0 | 0 | test_manifest_snapshot_over_windows_path_limit | def test_manifest_snapshot_over_windows_path_limit(self, tmp_path: WindowsPath):"""Tests that if the snapshot root directory is almost as long as the Windows path length limit,the snapshot will still work despite reaching over the Windows path length limit.See https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation"""# Giventmp_path_len: int = len(str(tmp_path))# Make the file root directoryroot_directory: str = os.path.join(tmp_path, "root")os.makedirs(root_directory)Path(os.path.join(root_directory, "test.txt")).touch()# Create a manifest directory that is almost as long as the Windows path length limit.manifest_directory_remaining_length: int = WINDOWS_MAX_PATH_LENGTH - tmp_path_len - 30manifest_directory: str = os.path.join(tmp_path,*["path"]* math.floor(manifest_directory_remaining_length / 5),# Create a temp path that barely does not exceed the windows path limit)os.makedirs(manifest_directory)assert len(manifest_directory) <= WINDOWS_MAX_PATH_LENGTH - 30runner = CliRunner()result = runner.invoke(main,["manifest","snapshot","--root",root_directory,"--destination",manifest_directory,"--name","testLongPath",],)assert result.exit_code == 0, result.outputassert "warning" in result.stdout.lower()files: List[str] = os.listdir(manifest_directory)assert len(files) == 1, f"Expected exactly one manifest file, but got {len(files)}"manifest: str = files[0]assert "testLongPath" in manifest, (f"Expected testLongPath in manifest file name, got {manifest}")assert len(os.path.join(manifest_directory, manifest)) > 260, (f"Expected full manifest file path to be over the windows path length limit of {WINDOWS_MAX_PATH_LENGTH}, got {len(os.path.join(manifest_directory, manifest))}") | 1 | 40 | 2 | 223 | 0 | 182 | 240 | 182 | self,tmp_path | [] | None | {"AnnAssign": 6, "Assign": 2, "Expr": 4} | 23 | 59 | 23 | ["len", "str", "os.path.join", "os.makedirs", "touch", "Path", "os.path.join", "os.path.join", "math.floor", "os.makedirs", "len", "CliRunner", "runner.invoke", "result.stdout.lower", "os.listdir", "len", "len", "len", "os.path.join", "len", "os.path.join", "pytest.mark.skipif", "_is_windows_long_path_registry_enabled"] | 0 | [] | The function (test_manifest_snapshot_over_windows_path_limit) is defined within the public class TestManifestSnapshot. It starts at line 182 and ends at line 240, contains 40 lines of code, and has a cyclomatic complexity of 1. It takes 2 parameters and does not return a value. It makes 23 function calls: ["len", "str", "os.path.join", "os.makedirs", "touch", "Path", "os.path.join", "os.path.join", "math.floor", "os.makedirs", "len", "CliRunner", "runner.invoke", "result.stdout.lower", "os.listdir", "len", "len", "len", "os.path.join", "len", "os.path.join", "pytest.mark.skipif", "_is_windows_long_path_registry_enabled"]. |
aws-deadline_deadline-cloud | TestManifestSnapshot | public | 0 | 0 | test_manifest_snapshot_over_windows_path_limit_json | def test_manifest_snapshot_over_windows_path_limit_json(self, tmp_path: WindowsPath):"""Tests that if the snapshot root directory is almost as long as the Windows path length limit,the snapshot will still work despite reaching over the Windows path length limit.This test uses json output in the cli options and verifies that the output json is expected.See https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation"""# Giventmp_path_len: int = len(str(tmp_path))# Make the file root directoryroot_directory: str = os.path.join(tmp_path, "root")os.makedirs(root_directory)Path(os.path.join(root_directory, "test.txt")).touch()# Create a manifest directory that is almost as long as the Windows path length limit.manifest_directory_remaining_length: int = WINDOWS_MAX_PATH_LENGTH - tmp_path_len - 30manifest_directory: str = os.path.join(tmp_path,*["path"]* math.floor(manifest_directory_remaining_length / 5),# Create a temp path that barely does not exceed the windows path limit)os.makedirs(manifest_directory)assert len(manifest_directory) <= WINDOWS_MAX_PATH_LENGTH - 30runner = CliRunner()result = runner.invoke(main,["manifest","snapshot","--root",root_directory,"--destination",manifest_directory,"--name","testLongPath","--json",],)assert result.exit_code == 0, result.outputassert json.loads(result.stdout).get("warning") is not Nonefiles: List[str] = os.listdir(manifest_directory)assert len(files) == 1, f"Expected exactly one manifest file, but got {len(files)}"manifest: str = files[0]assert "testLongPath" in manifest, (f"Expected testLongPath in manifest file name, got {manifest}")assert len(os.path.join(manifest_directory, manifest)) > 260, (f"Expected full manifest file path to be over the windows path length limit of {WINDOWS_MAX_PATH_LENGTH}, got {len(os.path.join(manifest_directory, manifest))}") | 1 | 41 | 2 | 232 | 0 | 
247 | 308 | 247 | self,tmp_path | [] | None | {"AnnAssign": 6, "Assign": 2, "Expr": 4} | 24 | 62 | 24 | ["len", "str", "os.path.join", "os.makedirs", "touch", "Path", "os.path.join", "os.path.join", "math.floor", "os.makedirs", "len", "CliRunner", "runner.invoke", "get", "json.loads", "os.listdir", "len", "len", "len", "os.path.join", "len", "os.path.join", "pytest.mark.skipif", "_is_windows_long_path_registry_enabled"] | 0 | [] | The function (test_manifest_snapshot_over_windows_path_limit_json) is defined within the public class called TestManifestSnapshot. The function starts at line 247 and ends at 308. It contains 41 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and does not return any value. It has 24 functions called inside, which are ["len", "str", "os.path.join", "os.makedirs", "touch", "Path", "os.path.join", "os.path.join", "math.floor", "os.makedirs", "len", "CliRunner", "runner.invoke", "get", "json.loads", "os.listdir", "len", "len", "len", "os.path.join", "len", "os.path.join", "pytest.mark.skipif", "_is_windows_long_path_registry_enabled"]. |
aws-deadline_deadline-cloud | TestManifestUpload | public | 0 | 0 | temp_dir | def temp_dir(self):with tempfile.TemporaryDirectory() as tmpdir_path:yield tmpdir_path | 1 | 3 | 1 | 16 | 0 | 44 | 46 | 44 | self | [] | None | {"Expr": 1, "With": 1} | 1 | 3 | 1 | ["tempfile.TemporaryDirectory"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3924769_jaraco_zipp.tests.test_path_py.TestPath.zipfile_ondisk"] | The function (temp_dir) is defined within the public class called TestManifestUpload. The function starts at line 44 and ends at 46. It contains 3 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It has 1 function called inside, which is ["tempfile.TemporaryDirectory"], and 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3924769_jaraco_zipp.tests.test_path_py.TestPath.zipfile_ondisk"]. |
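The `temp_dir` row above is a standard generator-style pytest fixture: the `with` block holds the temporary directory open across the `yield`, and cleanup runs when the generator is closed. A minimal sketch of the same lifecycle, driven by hand instead of by pytest (the `@pytest.fixture` decorator is omitted here, as it is in the stored body):

```python
import os
import tempfile


def temp_dir():
    # yield the directory path; the context manager's cleanup runs on generator close
    with tempfile.TemporaryDirectory() as tmpdir_path:
        yield tmpdir_path


# Driving the generator manually shows the setup/teardown phases pytest would run:
gen = temp_dir()
path = next(gen)                 # setup: directory now exists
existed = os.path.isdir(path)
gen.close()                      # teardown: GeneratorExit unwinds the with-block
removed = not os.path.exists(path)
```

`gen.close()` raises `GeneratorExit` at the `yield`, which exits the `with` block and deletes the directory, mirroring what pytest does after the dependent test finishes.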
aws-deadline_deadline-cloud | TestManifestUpload | public | 0 | 0 | create_manifest_file | def create_manifest_file(self, root_directory: str, destination_directory: str) -> str:"""Create a test manifest file, and return the full path for testing."""# Given a snapshot file:test_file_name = "test_file"test_file = os.path.join(root_directory, test_file_name)os.makedirs(os.path.dirname(test_file), exist_ok=True)with open(test_file, "w") as f:f.write("testing123")# Whenmanifest: Optional[ManifestSnapshot] = _manifest_snapshot(root=root_directory, destination=destination_directory, name="test")# Thenassert manifest is not Noneassert manifest.manifest is not Nonereturn manifest.manifest | 1 | 12 | 3 | 101 | 0 | 48 | 68 | 48 | self,root_directory,destination_directory | [] | str | {"AnnAssign": 1, "Assign": 2, "Expr": 3, "Return": 1, "With": 1} | 6 | 21 | 6 | ["os.path.join", "os.makedirs", "os.path.dirname", "open", "f.write", "_manifest_snapshot"] | 0 | [] | The function (create_manifest_file) is defined within the public class called TestManifestUpload. The function starts at line 48 and ends at 68. It contains 12 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters and returns a string (the manifest path). It has 6 functions called inside, which are ["os.path.join", "os.makedirs", "os.path.dirname", "open", "f.write", "_manifest_snapshot"]. |
aws-deadline_deadline-cloud | TestManifestUpload | public | 0 | 0 | test_manifest_upload | def test_manifest_upload(self, temp_dir: str):"""Simple test to generate a manifest, and then call the upload CLI to upload to S3.The test verifies the manifest is uploaded by doing a S3 get call."""# Given a snapshot file:manifest_file = self.create_manifest_file(temp_dir, temp_dir)manifest_file_name = Path(manifest_file).name# Now that we have a manifest file, execute the CLI and upload it to S3# The manifest file name is unique, so it will not collide with prior test runs.s3_bucket = os.environ.get("JOB_ATTACHMENTS_BUCKET")runner = CliRunner()# Temporary, always add cli_manifest until launched.main.add_command(cli_manifest)result = runner.invoke(main,["manifest","upload","--s3-cas-uri",f"s3://{s3_bucket}/DeadlineCloud",manifest_file,],)assert result.exit_code == 0, f"Non-Zero exit code, CLI output {result.output}"# Then validate the Manifest file is uploaded to S3 by checking the file actually exists.manifest_s3_path = f"DeadlineCloud/Manifests/{manifest_file_name}"s3_client = boto3.client("s3")s3_client.head_object(Bucket=s3_bucket, Key=manifest_s3_path)# Cleanup.s3_client.delete_object(Bucket=s3_bucket, Key=manifest_s3_path) | 1 | 21 | 2 | 117 | 0 | 70 | 104 | 70 | self,temp_dir | [] | None | {"Assign": 7, "Expr": 4} | 9 | 35 | 9 | ["self.create_manifest_file", "Path", "os.environ.get", "CliRunner", "main.add_command", "runner.invoke", "boto3.client", "s3_client.head_object", "s3_client.delete_object"] | 0 | [] | The function (test_manifest_upload) is defined within the public class called TestManifestUpload. The function starts at line 70 and ends at 104. It contains 21 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and does not return any value.
It has 9 functions called inside, which are ["self.create_manifest_file", "Path", "os.environ.get", "CliRunner", "main.add_command", "runner.invoke", "boto3.client", "s3_client.head_object", "s3_client.delete_object"]. |
aws-deadline_deadline-cloud | TestManifestUpload | public | 0 | 0 | test_manifest_upload_by_farm_queue | def test_manifest_upload_by_farm_queue(self, temp_dir: str):"""Simple test to generate a manifest, and then call the upload CLI to upload to S3.This test case uses --farm-id and --queue-idThe test verifies the manifest is uploaded by doing a S3 get call."""# Given a snapshot file:manifest_file = self.create_manifest_file(temp_dir, temp_dir)manifest_file_name = Path(manifest_file).name# Input:farm_id = os.environ.get("FARM_ID", "")queue_id = os.environ.get("QUEUE_ID", "")# Now that we have a manifest file, execute the CLI and upload it to S3# The manifest file name is unique, so it will not collide with prior test runs.s3_bucket = os.environ.get("JOB_ATTACHMENTS_BUCKET", "")runner = CliRunner()result = runner.invoke(main,["manifest","upload","--farm-id",farm_id,"--queue-id",queue_id,manifest_file,],)assert result.exit_code == 0, f"Non-Zero exit code, CLI output {result.output}"# Then validate the Manifest file is uploaded to S3 by checking the file actually exists.root_prefix = get_queue(farm_id=farm_id, queue_id=queue_id).jobAttachmentSettings.rootPrefix# type: ignore[union-attr]manifest_s3_path = f"{root_prefix}/Manifests/{manifest_file_name}"s3_client = boto3.client("s3")try:s3_client.head_object(Bucket=s3_bucket, Key=manifest_s3_path)except ClientError:pytest.fail(f"File not found at {s3_bucket}, {manifest_s3_path}")# Cleanup.s3_client.delete_object(Bucket=s3_bucket, Key=manifest_s3_path) | 2 | 28 | 2 | 168 | 0 | 106 | 149 | 106 | self,temp_dir | [] | None | {"Assign": 10, "Expr": 4, "Try": 1} | 12 | 44 | 12 | ["self.create_manifest_file", "Path", "os.environ.get", "os.environ.get", "os.environ.get", "CliRunner", "runner.invoke", "get_queue", "boto3.client", "s3_client.head_object", "pytest.fail", "s3_client.delete_object"] | 0 | [] | The function (test_manifest_upload_by_farm_queue) is defined within the public class called TestManifestUpload. The function starts
at line 106 and ends at 149. It contains 28 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters and does not return any value. It has 12 functions called inside, which are ["self.create_manifest_file", "Path", "os.environ.get", "os.environ.get", "os.environ.get", "CliRunner", "runner.invoke", "get_queue", "boto3.client", "s3_client.head_object", "pytest.fail", "s3_client.delete_object"]. |
aws-deadline_deadline-cloud | TestManifestUpload | public | 0 | 0 | test_manifest_upload_over_windows_path_limit | def test_manifest_upload_over_windows_path_limit(self, tmp_path):"""Tests that when a manifest is created over the windows path limit, it is able to be uploaded and there are no issues."""tmp_path_str = str(tmp_path)# Create a manifest directory that is almost as long as the Windows path length limit.manifest_directory_remaining_length: int = WINDOWS_MAX_PATH_LENGTH - len(tmp_path_str) - 30manifest_directory: str = os.path.join(tmp_path_str,*["path"]* math.floor(manifest_directory_remaining_length / 5),# Create a temp path for the manifest directory that barely does not exceed the windows path limit)manifest_file: str = self.create_manifest_file(os.path.join(tmp_path_str, "root"), manifest_directory)assert len(manifest_file) >= WINDOWS_MAX_PATH_LENGTHmanifest_file_name: str = Path(manifest_file).name# Now that we have a manifest file, execute the CLI and upload it to S3# The manifest file name is unique, so it will not collide with prior test runs.s3_bucket = os.environ.get("JOB_ATTACHMENTS_BUCKET")runner = CliRunner()# Temporary, always add cli_manifest until launched.main.add_command(cli_manifest)result = runner.invoke(main,["manifest","upload","--s3-cas-uri",f"s3://{s3_bucket}/DeadlineCloud",manifest_file,],)assert result.exit_code == 0, f"Non-Zero exit code, CLI output {result.output}"# Then validate the Manifest file is uploaded to S3 by checking the file actually exists.manifest_s3_path = f"DeadlineCloud/Manifests/{manifest_file_name}"s3_client = boto3.client("s3")s3_client.head_object(Bucket=s3_bucket, Key=manifest_s3_path)# Cleanup.s3_client.delete_object(Bucket=s3_bucket, Key=manifest_s3_path) | 1 | 33 | 2 | 180 | 0 | 155 | 203 | 155 | self,tmp_path | [] | None | {"AnnAssign": 4, "Assign": 6, "Expr": 4} | 17 | 49 | 17 | ["str", "len", "os.path.join", "math.floor", "self.create_manifest_file", "os.path.join", "len", "Path", "os.environ.get", 
"CliRunner", "main.add_command", "runner.invoke", "boto3.client", "s3_client.head_object", "s3_client.delete_object", "pytest.mark.skipif", "_is_windows_long_path_registry_enabled"] | 0 | [] | The function (test_manifest_upload_over_windows_path_limit) defined within the public class called TestManifestUpload.The function start at line 155 and ends at 203. It contains 33 lines of code and it has a cyclomatic complexity of 1. It takes 2 parameters, represented as [155.0] and does not return any value. It declares 17.0 functions, and It has 17.0 functions called inside which are ["str", "len", "os.path.join", "math.floor", "self.create_manifest_file", "os.path.join", "len", "Path", "os.environ.get", "CliRunner", "main.add_command", "runner.invoke", "boto3.client", "s3_client.head_object", "s3_client.delete_object", "pytest.mark.skipif", "_is_windows_long_path_registry_enabled"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_queue_get | def test_queue_get(deadline_cli_test: DeadlineCliTest) -> None:runner = CliRunner()result = runner.invoke(main,["queue","get","--queue-id",deadline_cli_test.queue_id,"--farm-id",deadline_cli_test.farm_id,],)assert result.exit_code == 0assert f"queueId: {deadline_cli_test.queue_id}" in result.output# The following vary from queue to queue, so just make sure the general layout is there.# Unit tests are able to test the output more thoroughly. We'll only look for the required fields.assert "displayName:" in result.outputassert f"farmId: {deadline_cli_test.farm_id}" in result.outputassert "status" in result.outputassert "defaultBudgetAction" in result.output | 1 | 19 | 1 | 80 | 2 | 12 | 35 | 12 | deadline_cli_test | ['runner', 'result'] | None | {"Assign": 2} | 2 | 24 | 2 | ["CliRunner", "runner.invoke"] | 0 | [] | The function (test_queue_get) is defined at module level. The function starts at line 12 and ends at 35. It contains 19 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter and does not return any value. It has 2 functions called inside, which are ["CliRunner", "runner.invoke"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_queue_list | def test_queue_list(deadline_cli_test: DeadlineCliTest) -> None:runner = CliRunner()result = runner.invoke(main,["queue", "list", "--farm-id", deadline_cli_test.farm_id],)assert result.exit_code == 0assert f"- queueId: {deadline_cli_test.queue_id}" in result.output# The following vary from queue to queue, so just make sure the general layout is there.# Unit tests are able to test the output more thoroughly. We'll only look for the required fields.assert "displayName:" in result.output | 1 | 9 | 1 | 54 | 2 | 38 | 51 | 38 | deadline_cli_test | ['runner', 'result'] | None | {"Assign": 2} | 2 | 14 | 2 | ["CliRunner", "runner.invoke"] | 0 | [] | The function (test_queue_list) is defined at module level. The function starts at line 38 and ends at 51. It contains 9 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter and does not return any value. It has 2 functions called inside, which are ["CliRunner", "runner.invoke"]. |
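The queue tests above, and `run_incremental_download_without_storage_profiles` in the header row, all build their CLI invocation as a Python argument list, appending optional flags conditionally. That list-building pattern can be sketched without Click or the `deadline` CLI installed; the helper name and the `farm-123`/`queue-456` IDs below are hypothetical placeholders:

```python
from typing import List, Optional


def build_sync_output_args(
    farm_id: str,
    queue_id: str,
    force_bootstrap: bool = False,
    conflict_resolution: Optional[str] = None,
) -> List[str]:
    """Assemble argv for a `queue sync-output` style invocation."""
    # required options first; optional flags are appended only when set
    cmd = ["queue", "sync-output", "--farm-id", farm_id, "--queue-id", queue_id]
    if force_bootstrap:
        cmd.append("--force-bootstrap")
    if conflict_resolution:
        cmd.extend(["--conflict-resolution", conflict_resolution])
    return cmd


args = build_sync_output_args("farm-123", "queue-456", force_bootstrap=True)
```

The resulting list is what the tests pass to `runner.invoke(main, args)` or `subprocess.run(["deadline", *args], ...)`.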
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_queue_export_credentials | def test_queue_export_credentials(deadline_cli_test: DeadlineCliTest) -> None:runner = CliRunner()result = runner.invoke(main,["queue","export-credentials","--farm-id",deadline_cli_test.farm_id,"--queue-id",deadline_cli_test.queue_id,],)try:assert result.exit_code == 0creds = json.loads(result.output)# Transform credential_process output back into what boto expects# We want to make sure the date is in an expected format, but we're not going to further validate this date_ = datetime.datetime.fromisoformat(creds["Expiration"])assert creds["Version"] == 1session = boto3.Session(aws_access_key_id=creds["AccessKeyId"],aws_secret_access_key=creds["SecretAccessKey"],aws_session_token=creds["SessionToken"],)# Use the sessionsts = session.client("sts")# If this works it means our creds are good, so we're happysts.get_caller_identity()except json.JSONDecodeError:# If JSON parsing fails, don't include the raw output in the error as this could print credentialsassert False, "Failed to parse JSON output from export-credentials"except Exception as e:# For any other exception, make sure we don't include credentials in the error, so we need to be careful with what we print.assert False, f"Test failed: {type(e).__name__}" | 3 | 29 | 1 | 140 | 6 | 54 | 95 | 54 | deadline_cli_test | ['creds', '_', 'sts', 'runner', 'session', 'result'] | None | {"Assign": 6, "Expr": 1, "Try": 1} | 8 | 42 | 8 | ["CliRunner", "runner.invoke", "json.loads", "datetime.datetime.fromisoformat", "boto3.Session", "session.client", "sts.get_caller_identity", "type"] | 0 | [] | The function (test_queue_export_credentials) is defined at module level. The function starts at line 54 and ends at 95. It contains 29 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter and does not return any value.
It has 8 functions called inside, which are ["CliRunner", "runner.invoke", "json.loads", "datetime.datetime.fromisoformat", "boto3.Session", "session.client", "sts.get_caller_identity", "type"]. |
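The credential-handling step in `test_queue_export_credentials` — parse the `credential_process`-style JSON, check `Version == 1`, and confirm `Expiration` is ISO-8601 before handing the keys to a boto3 `Session` — can be sketched without calling AWS. The helper name is hypothetical and the credential values below are obviously fake placeholders:

```python
import datetime
import json


def parse_credential_process_output(raw: str) -> dict:
    """Validate the credential_process JSON shape before building a boto3 Session."""
    creds = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    if creds.get("Version") != 1:
        raise ValueError("credential_process output must have Version == 1")
    # raises ValueError if the expiry is not ISO-8601; we only check the format
    datetime.datetime.fromisoformat(creds["Expiration"])
    return creds


fake_output = json.dumps({
    "Version": 1,
    "AccessKeyId": "AKIA-FAKE",          # placeholder, not a real key
    "SecretAccessKey": "fake-secret",
    "SessionToken": "fake-token",
    "Expiration": "2030-01-01T00:00:00+00:00",
})
creds = parse_credential_process_output(fake_output)
```

As in the test, error paths avoid echoing the raw output, so real credentials never land in a failure message.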
aws-deadline_deadline-cloud | DeadlineCliTest | public | 0 | 0 | __init__ | def __init__(self):"""Sets up resources that the integ tests will need"""self.farm_id: str = os.environ["FARM_ID"]self.queue_id: str = os.environ["QUEUE_ID"] | 1 | 3 | 1 | 30 | 0 | 21 | 27 | 21 | self | [] | None | {"AnnAssign": 2, "Expr": 1} | 0 | 7 | 0 | [] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called DeadlineCliTest.The function start at line 21 and ends at 27. It contains 3 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. 
It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | DeadlineCliTest | public | 0 | 0 | __init__ | def __init__(self,tmp_path_factory: TempPathFactory,manifest_version: ManifestVersion,):"""Sets up resource that these integration tests will need."""self.farm_id: str = os.environ.get("FARM_ID", "")self.queue_id: str = os.environ.get("QUEUE_ID", "")self.bucket = boto3.resource("s3").Bucket(os.environ.get("JOB_ATTACHMENTS_BUCKET", ""))self.deadline_client = DeadlineClient(boto3.client("deadline"))self.hash_cache_dir = tmp_path_factory.mktemp("hash_cache")self.s3_cache_dir = tmp_path_factory.mktemp("s3_check_cache")self.session = boto3.Session()self.deadline_endpoint = os.getenv("AWS_ENDPOINT_URL_DEADLINE",f"https://deadline.{self.session.region_name}.amazonaws.com",)self.job_attachment_settings: JobAttachmentS3Settings = get_queue(farm_id=self.farm_id,queue_id=self.queue_id,deadline_endpoint_url=self.deadline_endpoint,).jobAttachmentSettings# type: ignore[union-attr,assignment]self.manifest_version = manifest_version | 1 | 22 | 3 | 161 | 0 | 44 | 73 | 44 | self | [] | None | {"AnnAssign": 2, "Expr": 1} | 0 | 7 | 0 | [] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called DeadlineCliTest.The function start at line 44 and ends at 73. It contains 22 lines of code and it has a cyclomatic complexity of 1. It takes 3 parameters, represented as [44.0] and does not return any value. 
It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | is_windows_non_admin | def is_windows_non_admin():return sys.platform == "win32" and getpass.getuser() != "Administrator" | 2 | 2 | 0 | 18 | 0 | 7 | 8 | 7 | [] | Returns | {"Return": 1} | 1 | 2 | 1 | ["getpass.getuser"] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_sync_outputs_with_symlink", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.job_bundle.test_job_bundle_loader_py.test_validate_directory_symlink_containment_success", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_asset_sync_py.TestAssetSync.test_is_file_within_directory_with_symlink", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_OutputDownloader_set_root_path_with_symlinks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_upload_py.TestUpload.test_manage_assets_with_symlinks"] | The function (is_windows_non_admin) is defined within the public class called public. The function starts at line 7 and ends at line 8. It contains 2 lines of code and has a cyclomatic complexity of 2. The function does not take any parameters, and it returns a value. 
It has 1 function called inside, which is ["getpass.getuser"], and it has 5 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_sync_outputs_with_symlink", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.job_bundle.test_job_bundle_loader_py.test_validate_directory_symlink_containment_success", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_asset_sync_py.TestAssetSync.test_is_file_within_directory_with_symlink", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_OutputDownloader_set_root_path_with_symlinks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_upload_py.TestUpload.test_manage_assets_with_symlinks"]. | |
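The `is_windows_non_admin` helper recorded in the row above is compact enough to reproduce as a runnable sketch; the logic is taken directly from the stored function body, and only the demonstration call at the end is added:

```python
import getpass
import sys

def is_windows_non_admin() -> bool:
    # True only when running on Windows as a user other than the
    # built-in "Administrator" account. The callers listed in the row
    # are all symlink-related tests, which suggests it gates behavior
    # that depends on symlink-creation privileges.
    return sys.platform == "win32" and getpass.getuser() != "Administrator"

# On non-Windows platforms the first operand short-circuits to False.
print(is_windows_non_admin())
```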
aws-deadline_deadline-cloud | public | public | 0 | 0 | notifier_callback | def notifier_callback(progress: float, message: str) -> None:pass | 1 | 2 | 2 | 14 | 0 | 44 | 45 | 44 | progress,message | [] | None | {} | 0 | 2 | 0 | [] | 0 | [] | The function (notifier_callback) is defined within the public class called public. The function starts at line 44 and ends at line 45. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and does not return a value. |
aws-deadline_deadline-cloud | JobAttachmentTest | public | 0 | 0 | __init__ | def __init__(self,deploy_job_attachment_resources: JobAttachmentManager,tmp_path_factory: TempPathFactory,manifest_version: ManifestVersion,):"""Sets up resource that these integration tests will need."""self.job_attachment_resources = deploy_job_attachment_resourcesif self.job_attachment_resources.farm_id is None:raise TypeError("The Farm ID was not properly retrieved when initializing resources.")if (self.job_attachment_resources.queue is Noneor self.job_attachment_resources.queue_with_no_settings is None):raise TypeError("The Queues were not properly created when initializing resources.")self.farm_id = self.job_attachment_resources.farm_idself.queue_id = self.job_attachment_resources.queue.idself.queue_with_no_settings_id = self.job_attachment_resources.queue_with_no_settings.idself.bucket = boto3.resource("s3").Bucket(self.job_attachment_resources.bucket_name)self.deadline_client = self.job_attachment_resources.deadline_clientself.bucket_root_prefix = self.job_attachment_resources.bucket_root_prefixself.hash_cache_dir = tmp_path_factory.mktemp("hash_cache")self.s3_cache_dir = tmp_path_factory.mktemp("s3_check_cache")self.session = boto3.Session()self.deadline_endpoint = os.getenv("AWS_ENDPOINT_URL_DEADLINE",f"https://deadline.{self.session.region_name}.amazonaws.com",)self.manifest_version = manifest_version | 4 | 28 | 4 | 178 | 0 | 65 | 100 | 65 | self,deploy_job_attachment_resources,tmp_path_factory,manifest_version | [] | None | {"Assign": 12, "Expr": 1, "If": 2} | 8 | 36 | 8 | ["TypeError", "TypeError", "Bucket", "boto3.resource", "tmp_path_factory.mktemp", "tmp_path_factory.mktemp", "boto3.Session", "os.getenv"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called JobAttachmentTest.The function start at line 65 and ends at 100. It contains 28 lines of code and it has a cyclomatic complexity of 4. It takes 4 parameters, represented as [65.0] and does not return any value. It declares 8.0 functions, It has 8.0 functions called inside which are ["TypeError", "TypeError", "Bucket", "boto3.resource", "tmp_path_factory.mktemp", "tmp_path_factory.mktemp", "boto3.Session", "os.getenv"], It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | job_attachment_test | def job_attachment_test(deploy_job_attachment_resources: JobAttachmentManager,tmp_path_factory: TempPathFactory,request: pytest.FixtureRequest,):"""Fixture to get the session's JobAttachmentTest object."""return JobAttachmentTest(deploy_job_attachment_resources, tmp_path_factory, manifest_version=request.param) | 1 | 8 | 3 | 32 | 0 | 104 | 115 | 104 | deploy_job_attachment_resources,tmp_path_factory,request | [] | Returns | {"Expr": 1, "Return": 1} | 2 | 12 | 2 | ["JobAttachmentTest", "pytest.fixture"] | 0 | [] | The function (job_attachment_test) is defined within the public class called public. The function starts at line 104 and ends at line 115. It contains 8 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters, and it returns a value. It has 2 functions called inside, which are ["JobAttachmentTest", "pytest.fixture"]. |
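The `job_attachment_test` fixture above builds its object from `request.param`, i.e. pytest's parametrized-fixture mechanism. A minimal, self-contained sketch of that pattern follows; the class and version strings are hypothetical stand-ins, not the real `JobAttachmentTest` or its manifest versions:

```python
import pytest

class FakeAttachmentTest:
    # Hypothetical stand-in for the per-session object built once
    # per manifest version.
    def __init__(self, manifest_version: str) -> None:
        self.manifest_version = manifest_version

@pytest.fixture(params=["2023-03-03", "2025-01-01"])
def attachment_test(request: pytest.FixtureRequest) -> FakeAttachmentTest:
    # Each entry in `params` yields one fixture instantiation;
    # request.param carries the current entry.
    return FakeAttachmentTest(manifest_version=request.param)

def test_version_is_threaded_through(attachment_test: FakeAttachmentTest) -> None:
    assert attachment_test.manifest_version in {"2023-03-03", "2025-01-01"}
```

Running this module under pytest would execute the test once per `params` entry; the real fixture differs only in constructing `JobAttachmentTest` with the deployed resources.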
aws-deadline_deadline-cloud | public | public | 0 | 0 | upload_input_files_assets_not_in_cas | def upload_input_files_assets_not_in_cas(job_attachment_test: JobAttachmentTest):"""When no assets are in the CAS, make sure all files are uploaded."""# IFjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise TypeError("Job attachment settings must be set for this test.")asset_manager = upload.S3AssetManager(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_attachment_settings=job_attachment_settings,asset_manifest_version=job_attachment_test.manifest_version,)mock_on_preparing_to_submit = MagicMock(return_value=True)mock_on_uploading_files = MagicMock(return_value=True)# WHENupload_group = asset_manager.prepare_paths_for_upload(input_paths=[str(job_attachment_test.SCENE_MA_PATH)],output_paths=[str(job_attachment_test.OUTPUT_PATH)],referenced_paths=[],)(_, manifests) = asset_manager.hash_assets_and_create_manifest(asset_groups=upload_group.asset_groups,total_input_files=upload_group.total_input_files,total_input_bytes=upload_group.total_input_bytes,hash_cache_dir=str(job_attachment_test.hash_cache_dir),on_preparing_to_submit=mock_on_preparing_to_submit,)asset_manager.upload_assets(manifests,on_uploading_assets=mock_on_uploading_files,s3_check_cache_dir=str(job_attachment_test.s3_cache_dir),)# THENscene_ma_s3_path = (f"{job_attachment_settings.full_cas_prefix()}/{job_attachment_test.SCENE_MA_HASH}.xxh128")object_summary_iterator = job_attachment_test.bucket.objects.filter(Prefix=scene_ma_s3_path,)assert list(object_summary_iterator)[0].key == scene_ma_s3_path | 2 | 40 | 1 | 217 | 7 | 119 | 172 | 119 | job_attachment_test | ['object_summary_iterator', 'job_attachment_settings', 'mock_on_uploading_files', 'scene_ma_s3_path', 'upload_group', 'mock_on_preparing_to_submit', 
'asset_manager'] | None | {"Assign": 8, "Expr": 2, "If": 1} | 16 | 54 | 16 | ["get_queue", "TypeError", "upload.S3AssetManager", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "job_attachment_settings.full_cas_prefix", "job_attachment_test.bucket.objects.filter", "list", "pytest.fixture"] | 0 | [] | The function (upload_input_files_assets_not_in_cas) defined within the public class called public.The function start at line 119 and ends at 172. It contains 40 lines of code and it has a cyclomatic complexity of 2. The function does not take any parameters and does not return any value. It declares 16.0 functions, and It has 16.0 functions called inside which are ["get_queue", "TypeError", "upload.S3AssetManager", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "job_attachment_settings.full_cas_prefix", "job_attachment_test.bucket.objects.filter", "list", "pytest.fixture"]. |
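In the row above, both progress callbacks are stubbed with `MagicMock(return_value=True)`, since a `True` return tells the uploader to keep going. A small sketch of that contract; the `upload_assets` consumer here is a hypothetical simplification, not the real `S3AssetManager` method:

```python
from unittest.mock import MagicMock

def upload_assets(manifests, on_uploading_assets):
    # Hypothetical consumer: call the callback once per manifest and
    # treat a falsy return as a request to cancel the upload.
    for manifest in manifests:
        if not on_uploading_assets(manifest):
            return False
    return True

on_uploading = MagicMock(return_value=True)
assert upload_assets(["manifest-a", "manifest-b"], on_uploading) is True
assert on_uploading.call_count == 2  # invoked once per manifest

# A callback returning False cancels at the first manifest.
assert upload_assets(["manifest-a"], MagicMock(return_value=False)) is False
```

Stubbing with `MagicMock` also lets the test assert afterward that the callback was actually invoked, which a plain `lambda: True` would not record.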
aws-deadline_deadline-cloud | public | public | 0 | 0 | upload_input_files_one_asset_in_cas | def upload_input_files_one_asset_in_cas(job_attachment_test: JobAttachmentTest, upload_input_files_assets_not_in_cas: None) -> UploadInputFilesOneAssetInCasOutputs:"""Test that when one asset is already in the CAS, that every file except for the one in the CAS is uploaded."""# IFjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise Exception("Job attachment settings must be set for this test.")asset_manager = upload.S3AssetManager(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_attachment_settings=job_attachment_settings,asset_manifest_version=job_attachment_test.manifest_version,)input_paths = [str(job_attachment_test.SCENE_MA_PATH),str(job_attachment_test.BRICK_PNG_PATH),str(job_attachment_test.CLOTH_PNG_PATH),str(job_attachment_test.INPUT_IN_OUTPUT_DIR_PATH),]scene_ma_s3_path = (f"{job_attachment_settings.full_cas_prefix()}/{job_attachment_test.SCENE_MA_HASH}.xxh128")# This file has already been uploadedscene_ma_upload_time = job_attachment_test.bucket.Object(scene_ma_s3_path).last_modifiedmock_on_preparing_to_submit = MagicMock(return_value=True)mock_on_uploading_files = MagicMock(return_value=True)# WHENupload_group = asset_manager.prepare_paths_for_upload(input_paths=input_paths,output_paths=[str(job_attachment_test.OUTPUT_PATH)],referenced_paths=[],)(_, manifests) = asset_manager.hash_assets_and_create_manifest(asset_groups=upload_group.asset_groups,total_input_files=upload_group.total_input_files,total_input_bytes=upload_group.total_input_bytes,hash_cache_dir=str(job_attachment_test.hash_cache_dir),on_preparing_to_submit=mock_on_preparing_to_submit,)(_, attachments) = 
asset_manager.upload_assets(manifests,on_uploading_assets=mock_on_uploading_files,s3_check_cache_dir=str(job_attachment_test.s3_cache_dir),)# THENbrick_png_hash = hash_file(str(job_attachment_test.BRICK_PNG_PATH), HashAlgorithm.XXH128)cloth_png_hash = hash_file(str(job_attachment_test.CLOTH_PNG_PATH), HashAlgorithm.XXH128)input_in_output_dir_hash = hash_file(str(job_attachment_test.INPUT_IN_OUTPUT_DIR_PATH), HashAlgorithm.XXH128)brick_png_s3_path = f"{job_attachment_settings.full_cas_prefix()}/{brick_png_hash}.xxh128"cloth_png_s3_path = f"{job_attachment_settings.full_cas_prefix()}/{cloth_png_hash}.xxh128"input_in_output_dir_s3_path = (f"{job_attachment_settings.full_cas_prefix()}/{input_in_output_dir_hash}.xxh128")object_summary_iterator = job_attachment_test.bucket.objects.filter(Prefix=f"{job_attachment_settings.full_cas_prefix()}/",)s3_objects = {obj.key: obj for obj in object_summary_iterator}assert {brick_png_s3_path, cloth_png_s3_path, input_in_output_dir_s3_path} <= set(map(lambda x: x.key, object_summary_iterator))assert brick_png_s3_path in s3_objectsassert cloth_png_s3_path in s3_objectsassert input_in_output_dir_s3_path in s3_objects# Make sure that the file hasn't been modified/reuploadedassert s3_objects[scene_ma_s3_path].last_modified == scene_ma_upload_timereturn UploadInputFilesOneAssetInCasOutputs(attachments) | 3 | 67 | 2 | 376 | 16 | 181 | 270 | 181 | job_attachment_test,upload_input_files_assets_not_in_cas | ['object_summary_iterator', 'brick_png_s3_path', 'job_attachment_settings', 'cloth_png_hash', 's3_objects', 'input_in_output_dir_hash', 'mock_on_uploading_files', 'scene_ma_s3_path', 'input_paths', 'brick_png_hash', 'cloth_png_s3_path', 'input_in_output_dir_s3_path', 'upload_group', 'scene_ma_upload_time', 'mock_on_preparing_to_submit', 'asset_manager'] | UploadInputFilesOneAssetInCasOutputs | {"Assign": 18, "Expr": 1, "If": 1, "Return": 1} | 32 | 90 | 32 | ["get_queue", "Exception", "upload.S3AssetManager", "str", "str", "str", "str", 
"job_attachment_settings.full_cas_prefix", "job_attachment_test.bucket.Object", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "hash_file", "str", "hash_file", "str", "hash_file", "str", "job_attachment_settings.full_cas_prefix", "job_attachment_settings.full_cas_prefix", "job_attachment_settings.full_cas_prefix", "job_attachment_test.bucket.objects.filter", "job_attachment_settings.full_cas_prefix", "set", "map", "UploadInputFilesOneAssetInCasOutputs", "pytest.fixture"] | 0 | [] | The function (upload_input_files_one_asset_in_cas) defined within the public class called public.The function start at line 181 and ends at 270. It contains 67 lines of code and it has a cyclomatic complexity of 3. It takes 2 parameters, represented as [181.0] and does not return any value. It declares 32.0 functions, and It has 32.0 functions called inside which are ["get_queue", "Exception", "upload.S3AssetManager", "str", "str", "str", "str", "job_attachment_settings.full_cas_prefix", "job_attachment_test.bucket.Object", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "hash_file", "str", "hash_file", "str", "hash_file", "str", "job_attachment_settings.full_cas_prefix", "job_attachment_settings.full_cas_prefix", "job_attachment_settings.full_cas_prefix", "job_attachment_test.bucket.objects.filter", "job_attachment_settings.full_cas_prefix", "set", "map", "UploadInputFilesOneAssetInCasOutputs", "pytest.fixture"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_upload_input_files_all_assets_in_cas | def test_upload_input_files_all_assets_in_cas(job_attachment_test: JobAttachmentTest,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,) -> None:"""Test that when all assets are already in the CAS, that no files are uploaded."""# IFjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise Exception("Job attachment settings must be set for this test.")asset_manager = upload.S3AssetManager(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_attachment_settings=job_attachment_settings,asset_manifest_version=job_attachment_test.manifest_version,)input_paths = [str(job_attachment_test.SCENE_MA_PATH),str(job_attachment_test.BRICK_PNG_PATH),str(job_attachment_test.CLOTH_PNG_PATH),str(job_attachment_test.INPUT_IN_OUTPUT_DIR_PATH),]# This file has already been uploadedasset_upload_time = {obj.key: obj.last_modifiedfor obj in job_attachment_test.bucket.objects.filter(Prefix=f"{job_attachment_settings.full_cas_prefix()}/")}mock_on_preparing_to_submit = MagicMock(return_value=True)mock_on_uploading_files = MagicMock(return_value=True)# WHENupload_group = asset_manager.prepare_paths_for_upload(input_paths=input_paths,output_paths=[str(job_attachment_test.OUTPUT_PATH)],referenced_paths=[],)(_, manifests) = asset_manager.hash_assets_and_create_manifest(asset_groups=upload_group.asset_groups,total_input_files=upload_group.total_input_files,total_input_bytes=upload_group.total_input_bytes,hash_cache_dir=str(job_attachment_test.hash_cache_dir),on_preparing_to_submit=mock_on_preparing_to_submit,)(_, attachments) = asset_manager.upload_assets(manifests,on_uploading_assets=mock_on_uploading_files,s3_check_cache_dir=str(job_attachment_test.s3_cache_dir),)# THENassert 
attachments.manifests[0].inputManifestPath is not None# Confirm nothing was uploadedfor obj in job_attachment_test.bucket.objects.filter(Prefix=f"{job_attachment_settings.full_cas_prefix()}/"):if (f"{attachments.manifests[0].inputManifestPath}"== f"s3://{job_attachment_test.bucket.name}/{obj.key}"):# Skip checking the manifest filecontinueassert obj.last_modified == asset_upload_time[obj.key] | 5 | 58 | 2 | 299 | 7 | 274 | 350 | 274 | job_attachment_test,upload_input_files_one_asset_in_cas | ['asset_manager', 'job_attachment_settings', 'mock_on_uploading_files', 'input_paths', 'upload_group', 'mock_on_preparing_to_submit', 'asset_upload_time'] | None | {"Assign": 9, "Expr": 1, "For": 1, "If": 2} | 19 | 77 | 19 | ["get_queue", "Exception", "upload.S3AssetManager", "str", "str", "str", "str", "job_attachment_test.bucket.objects.filter", "job_attachment_settings.full_cas_prefix", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "job_attachment_test.bucket.objects.filter", "job_attachment_settings.full_cas_prefix"] | 0 | [] | The function (test_upload_input_files_all_assets_in_cas) defined within the public class called public.The function start at line 274 and ends at 350. It contains 58 lines of code and it has a cyclomatic complexity of 5. It takes 2 parameters, represented as [274.0] and does not return any value. It declares 19.0 functions, and It has 19.0 functions called inside which are ["get_queue", "Exception", "upload.S3AssetManager", "str", "str", "str", "str", "job_attachment_test.bucket.objects.filter", "job_attachment_settings.full_cas_prefix", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "job_attachment_test.bucket.objects.filter", "job_attachment_settings.full_cas_prefix"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | sync_inputs.on_downloading_files | def on_downloading_files(*args, **kwargs):return True | 1 | 2 | 2 | 11 | 0 | 392 | 393 | 392 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The nested callback (sync_inputs.on_downloading_files) starts at line 392 and ends at line 393. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (*args, **kwargs) and always returns True. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | sync_inputs | def sync_inputs(job_attachment_test: JobAttachmentTest,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,tmp_path_factory: TempPathFactory,default_job_template: str,) -> SyncInputsOutputs:"""Test that all of the input files get synced locally."""# IFjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsjob_response = job_attachment_test.deadline_client.create_job(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,attachments=upload_input_files_one_asset_in_cas.attachments.to_dict(),# type: ignoretargetTaskRunStatus="SUSPENDED",template=default_job_template,templateType="JSON",priority=50,)syncer = asset_sync.AssetSync(job_attachment_test.farm_id)session_dir = tmp_path_factory.mktemp("session_dir")def on_downloading_files(*args, **kwargs):return True# WHENsyncer.sync_inputs(job_attachment_settings,upload_input_files_one_asset_in_cas.attachments,job_attachment_test.queue_id,job_response["jobId"],session_dir,on_downloading_files=on_downloading_files,)dest_dir = _get_unique_dest_dir_name(str(job_attachment_test.ASSET_ROOT))# THENassert Path(session_dir / dest_dir / job_attachment_test.SCENE_MA_PATH).exists()assert Path(session_dir / dest_dir / job_attachment_test.BRICK_PNG_PATH).exists()assert Path(session_dir / dest_dir / job_attachment_test.CLOTH_PNG_PATH).exists()assert Path(session_dir / dest_dir / job_attachment_test.INPUT_IN_OUTPUT_DIR_PATH).exists()return SyncInputsOutputs(session_dir=session_dir,dest_dir=Path(dest_dir),asset_syncer=syncer,attachments=upload_input_files_one_asset_in_cas.attachments,job_id=job_response["jobId"],) | 1 | 43 | 4 | 244 | 5 | 363 | 419 | 363 | job_attachment_test,upload_input_files_one_asset_in_cas,tmp_path_factory,default_job_template | ['job_attachment_settings', 'session_dir', 
'job_response', 'syncer', 'dest_dir'] | SyncInputsOutputs | {"Assign": 5, "Expr": 2, "Return": 2} | 19 | 57 | 19 | ["get_queue", "job_attachment_test.deadline_client.create_job", "upload_input_files_one_asset_in_cas.attachments.to_dict", "asset_sync.AssetSync", "tmp_path_factory.mktemp", "syncer.sync_inputs", "_get_unique_dest_dir_name", "str", "exists", "Path", "exists", "Path", "exists", "Path", "exists", "Path", "SyncInputsOutputs", "Path", "pytest.fixture"] | 0 | [] | The function (sync_inputs) starts at line 363 and ends at line 419. It contains 43 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (job_attachment_test, upload_input_files_one_asset_in_cas, tmp_path_factory, default_job_template) and returns a SyncInputsOutputs. It makes 19 outgoing calls, listed in the outgoing_function_names column. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | sync_inputs_no_job_attachment_s3_settings.on_downloading_files | def on_downloading_files(*args, **kwargs):return True | 1 | 2 | 2 | 11 | 0 | 460 | 461 | 460 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The nested callback (sync_inputs_no_job_attachment_s3_settings.on_downloading_files) starts at line 460 and ends at line 461. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (*args, **kwargs) and always returns True. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | sync_inputs_no_job_attachment_s3_settings | def sync_inputs_no_job_attachment_s3_settings(job_attachment_test: JobAttachmentTest,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,tmp_path_factory: TempPathFactory,default_job_template_one_task_one_step: str,caplog: LogCaptureFixture,) -> SyncInputsNoJobAttachmentS3SettingsOutput:"""Test that when there are no job attachment settings on a queue, the input sync is skipped."""# IFcaplog.set_level(logging.INFO)job_response = job_attachment_test.deadline_client.create_job(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_with_no_settings_id,attachments=upload_input_files_one_asset_in_cas.attachments.to_dict(),# type: ignoretargetTaskRunStatus="SUSPENDED",template=default_job_template_one_task_one_step,templateType="JSON",priority=50,)syncer = asset_sync.AssetSync(farm_id=job_attachment_test.farm_id,boto3_session=job_attachment_test.session,deadline_endpoint_url=job_attachment_test.deadline_endpoint,)session_dir = tmp_path_factory.mktemp("session_dir")def on_downloading_files(*args, **kwargs):return True# WHENassert syncer.sync_inputs(syncer.get_s3_settings(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_with_no_settings_id,),syncer.get_attachments(job_attachment_test.farm_id,job_attachment_test.queue_with_no_settings_id,job_response["jobId"],),job_attachment_test.queue_with_no_settings_id,job_response["jobId"],session_dir,on_downloading_files=on_downloading_files,) == (SummaryStatistics(), [])assert ("No Job Attachment settings configured for Queue "f"{job_attachment_test.queue_with_no_settings_id}, no inputs to sync." 
in caplog.text)return SyncInputsNoJobAttachmentS3SettingsOutput(job_id=job_response["jobId"], asset_syncer=syncer, session_dir=session_dir) | 1 | 46 | 5 | 212 | 3 | 430 | 487 | 430 | job_attachment_test,upload_input_files_one_asset_in_cas,tmp_path_factory,default_job_template_one_task_one_step,caplog | ['session_dir', 'syncer', 'job_response'] | SyncInputsNoJobAttachmentS3SettingsOutput | {"Assign": 3, "Expr": 2, "Return": 2} | 11 | 58 | 11 | ["caplog.set_level", "job_attachment_test.deadline_client.create_job", "upload_input_files_one_asset_in_cas.attachments.to_dict", "asset_sync.AssetSync", "tmp_path_factory.mktemp", "syncer.sync_inputs", "syncer.get_s3_settings", "syncer.get_attachments", "SummaryStatistics", "SyncInputsNoJobAttachmentS3SettingsOutput", "pytest.fixture"] | 0 | [] | The function (sync_inputs_no_job_attachment_s3_settings) starts at line 430 and ends at line 487. It contains 46 lines of code and has a cyclomatic complexity of 1. It takes 5 parameters (job_attachment_test, upload_input_files_one_asset_in_cas, tmp_path_factory, default_job_template_one_task_one_step, caplog) and returns a SyncInputsNoJobAttachmentS3SettingsOutput. It makes 11 outgoing calls, listed in the outgoing_function_names column. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | sync_inputs_no_job_attachment_settings_in_job.on_downloading_files | def on_downloading_files(*args, **kwargs):return True | 1 | 2 | 2 | 11 | 0 | 527 | 528 | 527 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The nested callback (sync_inputs_no_job_attachment_settings_in_job.on_downloading_files) starts at line 527 and ends at line 528. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (*args, **kwargs) and always returns True. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | sync_inputs_no_job_attachment_settings_in_job | def sync_inputs_no_job_attachment_settings_in_job(job_attachment_test: JobAttachmentTest,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,tmp_path_factory: TempPathFactory,default_job_template_one_task_one_step: str,caplog: LogCaptureFixture,) -> SyncInputsNoJobAttachmentSettingsInJobOutput:"""Test that when there are no job attachment settings on a job, the input sync is skipped."""# IFcaplog.set_level(logging.INFO)job_response = job_attachment_test.deadline_client.create_job(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,targetTaskRunStatus="SUSPENDED",template=default_job_template_one_task_one_step,templateType="JSON",priority=50,)syncer = asset_sync.AssetSync(farm_id=job_attachment_test.farm_id,boto3_session=job_attachment_test.session,deadline_endpoint_url=job_attachment_test.deadline_endpoint,)session_dir = tmp_path_factory.mktemp("session_dir")def on_downloading_files(*args, **kwargs):return True# WHENassert syncer.sync_inputs(syncer.get_s3_settings(farm_id=job_attachment_test.farm_id, queue_id=job_attachment_test.queue_id),syncer.get_attachments(job_attachment_test.farm_id, job_attachment_test.queue_id, job_response["jobId"]),job_attachment_test.queue_id,job_response["jobId"],session_dir,on_downloading_files=on_downloading_files,) == (SummaryStatistics(), [])assert (f"No attachments configured for Job {job_response['jobId']}, no inputs to sync."in caplog.text)return SyncInputsNoJobAttachmentSettingsInJobOutput(job_id=job_response["jobId"], asset_syncer=syncer, session_dir=session_dir) | 1 | 42 | 5 | 199 | 3 | 498 | 551 | 498 | job_attachment_test,upload_input_files_one_asset_in_cas,tmp_path_factory,default_job_template_one_task_one_step,caplog | ['session_dir', 'syncer', 'job_response'] | SyncInputsNoJobAttachmentSettingsInJobOutput | {"Assign": 3, "Expr": 2, "Return": 2} | 10 | 54 | 10 | 
["caplog.set_level", "job_attachment_test.deadline_client.create_job", "asset_sync.AssetSync", "tmp_path_factory.mktemp", "syncer.sync_inputs", "syncer.get_s3_settings", "syncer.get_attachments", "SummaryStatistics", "SyncInputsNoJobAttachmentSettingsInJobOutput", "pytest.fixture"] | 0 | [] | The function (sync_inputs_no_job_attachment_settings_in_job) starts at line 498 and ends at line 551. It contains 42 lines of code and has a cyclomatic complexity of 1. It takes 5 parameters (job_attachment_test, upload_input_files_one_asset_in_cas, tmp_path_factory, default_job_template_one_task_one_step, caplog) and returns a SyncInputsNoJobAttachmentSettingsInJobOutput. It makes 10 outgoing calls, listed in the outgoing_function_names column. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_outputs_no_job_attachment_settings_in_job | def test_sync_outputs_no_job_attachment_settings_in_job(job_attachment_test: JobAttachmentTest,sync_inputs_no_job_attachment_settings_in_job: SyncInputsNoJobAttachmentSettingsInJobOutput,caplog: LogCaptureFixture,) -> None:"""Test that syncing outputs is skipped when the job has no job attachment settings."""# IFcaplog.set_level(logging.INFO)waiter = job_attachment_test.deadline_client.get_waiter("job_create_complete")waiter.wait(jobId=sync_inputs_no_job_attachment_settings_in_job.job_id,queueId=job_attachment_test.queue_id,farmId=job_attachment_test.farm_id,)step_id = job_attachment_test.deadline_client.list_steps(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_inputs_no_job_attachment_settings_in_job.job_id,)["steps"][0]["stepId"]task_id = job_attachment_test.deadline_client.list_tasks(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_inputs_no_job_attachment_settings_in_job.job_id,stepId=step_id,)["tasks"][0]["taskId"]# WHENsync_inputs_no_job_attachment_settings_in_job.asset_syncer.sync_outputs(s3_settings=sync_inputs_no_job_attachment_settings_in_job.asset_syncer.get_s3_settings(job_attachment_test.farm_id, job_attachment_test.queue_id),attachments=sync_inputs_no_job_attachment_settings_in_job.asset_syncer.get_attachments(job_attachment_test.farm_id,job_attachment_test.queue_id,sync_inputs_no_job_attachment_settings_in_job.job_id,),queue_id=job_attachment_test.queue_id,job_id=sync_inputs_no_job_attachment_settings_in_job.job_id,step_id=step_id,task_id=task_id,session_action_id="session_action_id",start_time=time.time(),session_dir=sync_inputs_no_job_attachment_settings_in_job.session_dir,)# THENassert ("No attachments configured for Job "f"{sync_inputs_no_job_attachment_settings_in_job.job_id}, no outputs to sync."in caplog.text) | 1 | 45 | 3 | 230 | 3 | 555 | 610 | 555 | 
job_attachment_test,sync_inputs_no_job_attachment_settings_in_job,caplog | ['waiter', 'step_id', 'task_id'] | None | {"Assign": 3, "Expr": 4} | 9 | 56 | 9 | ["caplog.set_level", "job_attachment_test.deadline_client.get_waiter", "waiter.wait", "job_attachment_test.deadline_client.list_steps", "job_attachment_test.deadline_client.list_tasks", "sync_inputs_no_job_attachment_settings_in_job.asset_syncer.sync_outputs", "sync_inputs_no_job_attachment_settings_in_job.asset_syncer.get_s3_settings", "sync_inputs_no_job_attachment_settings_in_job.asset_syncer.get_attachments", "time.time"] | 0 | [] | The function (test_sync_outputs_no_job_attachment_settings_in_job) starts at line 555 and ends at line 610. It contains 45 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (job_attachment_test, sync_inputs_no_job_attachment_settings_in_job, caplog) and does not return a value. It makes 9 outgoing calls, listed in the outgoing_function_names column. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_outputs_no_job_attachment_s3_settings | def test_sync_outputs_no_job_attachment_s3_settings(job_attachment_test: JobAttachmentTest,sync_inputs_no_job_attachment_s3_settings: SyncInputsNoJobAttachmentS3SettingsOutput,caplog: LogCaptureFixture,) -> None:"""Test that syncing outputs is skipped when the queue has no job attachment settings."""# IFcaplog.set_level(logging.INFO)waiter = job_attachment_test.deadline_client.get_waiter("job_create_complete")waiter.wait(jobId=sync_inputs_no_job_attachment_s3_settings.job_id,queueId=job_attachment_test.queue_with_no_settings_id,farmId=job_attachment_test.farm_id,)step_id = job_attachment_test.deadline_client.list_steps(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_with_no_settings_id,jobId=sync_inputs_no_job_attachment_s3_settings.job_id,)["steps"][0]["stepId"]task_id = job_attachment_test.deadline_client.list_tasks(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_with_no_settings_id,jobId=sync_inputs_no_job_attachment_s3_settings.job_id,stepId=step_id,)["tasks"][0]["taskId"]# WHENsync_inputs_no_job_attachment_s3_settings.asset_syncer.sync_outputs(s3_settings=sync_inputs_no_job_attachment_s3_settings.asset_syncer.get_s3_settings(job_attachment_test.farm_id, job_attachment_test.queue_with_no_settings_id),attachments=sync_inputs_no_job_attachment_s3_settings.asset_syncer.get_attachments(job_attachment_test.farm_id,job_attachment_test.queue_with_no_settings_id,sync_inputs_no_job_attachment_s3_settings.job_id,),queue_id=job_attachment_test.queue_with_no_settings_id,job_id=sync_inputs_no_job_attachment_s3_settings.job_id,step_id=step_id,task_id=task_id,session_action_id="session_action_id",start_time=time.time(),session_dir=sync_inputs_no_job_attachment_s3_settings.session_dir,)# THENassert ("No Job Attachment settings configured for Queue "f"{job_attachment_test.queue_with_no_settings_id}, no outputs to sync." 
in caplog.text) | 1 | 44 | 3 | 230 | 3 | 614 | 668 | 614 | job_attachment_test,sync_inputs_no_job_attachment_s3_settings,caplog | ['waiter', 'step_id', 'task_id'] | None | {"Assign": 3, "Expr": 4} | 9 | 55 | 9 | ["caplog.set_level", "job_attachment_test.deadline_client.get_waiter", "waiter.wait", "job_attachment_test.deadline_client.list_steps", "job_attachment_test.deadline_client.list_tasks", "sync_inputs_no_job_attachment_s3_settings.asset_syncer.sync_outputs", "sync_inputs_no_job_attachment_s3_settings.asset_syncer.get_s3_settings", "sync_inputs_no_job_attachment_s3_settings.asset_syncer.get_attachments", "time.time"] | 0 | [] | The function (test_sync_outputs_no_job_attachment_s3_settings) starts at line 614 and ends at line 668. It contains 44 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (job_attachment_test, sync_inputs_no_job_attachment_s3_settings, caplog) and does not return a value. It makes 9 outgoing calls, listed in the outgoing_function_names column. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | sync_outputs | def sync_outputs(job_attachment_test: JobAttachmentTest,sync_inputs: SyncInputsOutputs,) -> SyncOutputsOutput:"""Test that all outputs from the job get synced to the JobAttachment S3 Bucket."""# IFjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise Exception("Job attachment settings must be set for this test.")waiter = job_attachment_test.deadline_client.get_waiter("job_create_complete")waiter.wait(jobId=sync_inputs.job_id,queueId=job_attachment_test.queue_id,farmId=job_attachment_test.farm_id,)list_steps_response = job_attachment_test.deadline_client.list_steps(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_inputs.job_id,)step_ids = {step["name"]: step["stepId"] for step in list_steps_response["steps"]}step0_id = step_ids["custom-step"]step1_id = step_ids["custom-step-2"]list_tasks_response = job_attachment_test.deadline_client.list_tasks(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_inputs.job_id,stepId=step0_id,)task_ids = {task["parameters"]["frame"]["int"]: task["taskId"] for task in list_tasks_response["tasks"]}step0_task0_id = task_ids["0"]step0_task1_id = task_ids["1"]step1_task0_id = list_tasks_response = job_attachment_test.deadline_client.list_tasks(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_inputs.job_id,stepId=step1_id,)["tasks"][0]["taskId"]file_to_be_synced_step0_task0_base = job_attachment_test.FIRST_RENDER_OUTPUT_PATHfile_to_be_synced_step0_task1_base = job_attachment_test.SECOND_RENDER_OUTPUT_PATHfile_to_be_synced_step1_task0_base = job_attachment_test.MOV_FILE_OUTPUT_PATHfile_to_be_synced_step0_task0 = (sync_inputs.session_dir / sync_inputs.dest_dir / 
file_to_be_synced_step0_task0_base)file_to_be_synced_step0_task1 = (sync_inputs.session_dir / sync_inputs.dest_dir / file_to_be_synced_step0_task1_base)file_to_be_synced_step1_task0 = (sync_inputs.session_dir / sync_inputs.dest_dir / file_to_be_synced_step1_task0_base)render_start_time = time.time()# WHENmock_on_uploading_files = MagicMock(return_value=True)# First step and task# Create files after the render start time in the output dir, these should be syncedwith open(file_to_be_synced_step0_task0, "w") as f:f.write("this is the first render")summary_stats = sync_inputs.asset_syncer.sync_outputs(s3_settings=job_attachment_settings,attachments=sync_inputs.attachments,queue_id=job_attachment_test.queue_id,job_id=sync_inputs.job_id,step_id=step0_id,task_id=step0_task0_id,session_action_id="session_action_id",start_time=render_start_time,session_dir=sync_inputs.session_dir,on_uploading_files=mock_on_uploading_files,)# There should be one synced output for this task, Step 0 - Task 0assert summary_stats.total_files == 1render_start_time = time.time()# First step and second taskwith open(file_to_be_synced_step0_task1, "w") as f:f.write("this is a second render")summary_stats = sync_inputs.asset_syncer.sync_outputs(s3_settings=job_attachment_settings,attachments=sync_inputs.attachments,queue_id=job_attachment_test.queue_id,job_id=sync_inputs.job_id,step_id=step0_id,task_id=step0_task1_id,session_action_id="session_action_id",start_time=render_start_time,session_dir=sync_inputs.session_dir,on_uploading_files=mock_on_uploading_files,)# There should be one synced output for this task, Step 0 - Task 1assert summary_stats.total_files == 1render_start_time = time.time()# Second step and first taskwith open(file_to_be_synced_step1_task0, "w") as f:f.write("this is a comp")summary_stats = 
sync_inputs.asset_syncer.sync_outputs(s3_settings=job_attachment_settings,attachments=sync_inputs.attachments,queue_id=job_attachment_test.queue_id,job_id=sync_inputs.job_id,step_id=step1_id,task_id=step1_task0_id,session_action_id="session_action_id",start_time=render_start_time,session_dir=sync_inputs.session_dir,on_uploading_files=mock_on_uploading_files,)# There should be one synced output for this task, Step 1 - Task 0assert summary_stats.total_files == 1# THENobject_summary_iterator = job_attachment_test.bucket.objects.filter(Prefix=f"{job_attachment_settings.full_cas_prefix()}/",)object_key_set = set(obj.key for obj in object_summary_iterator)assert (f"{job_attachment_settings.full_cas_prefix()}/{hash_file(str(file_to_be_synced_step0_task0), HashAlgorithm.XXH128)}.xxh128"in object_key_set)return SyncOutputsOutput(step0_id=step0_id,step1_id=step1_id,step0_task0_id=step0_task0_id,step0_task1_id=step0_task1_id,step1_task0_id=step1_task0_id,job_id=sync_inputs.job_id,attachments=sync_inputs.attachments,step0_task0_output_file=file_to_be_synced_step0_task0_base,step0_task1_output_file=file_to_be_synced_step0_task1_base,step1_task0_output_file=file_to_be_synced_step1_task0_base,) | 5 | 123 | 2 | 656 | 22 | 686 | 846 | 686 | job_attachment_test,sync_inputs | ['render_start_time', 'object_summary_iterator', 'list_steps_response', 'job_attachment_settings', 'step0_id', 'file_to_be_synced_step1_task0', 'mock_on_uploading_files', 'step1_task0_id', 'summary_stats', 'step_ids', 'waiter', 'file_to_be_synced_step0_task1_base', 'file_to_be_synced_step0_task0', 'file_to_be_synced_step0_task1', 'step1_id', 'task_ids', 'step0_task0_id', 'step0_task1_id', 'list_tasks_response', 'file_to_be_synced_step1_task0_base', 'file_to_be_synced_step0_task0_base', 'object_key_set'] | SyncOutputsOutput | {"Assign": 26, "Expr": 5, "If": 1, "Return": 1, "With": 3} | 28 | 161 | 28 | ["get_queue", "Exception", "job_attachment_test.deadline_client.get_waiter", "waiter.wait", 
"job_attachment_test.deadline_client.list_steps", "job_attachment_test.deadline_client.list_tasks", "job_attachment_test.deadline_client.list_tasks", "time.time", "MagicMock", "open", "f.write", "sync_inputs.asset_syncer.sync_outputs", "time.time", "open", "f.write", "sync_inputs.asset_syncer.sync_outputs", "time.time", "open", "f.write", "sync_inputs.asset_syncer.sync_outputs", "job_attachment_test.bucket.objects.filter", "job_attachment_settings.full_cas_prefix", "set", "job_attachment_settings.full_cas_prefix", "hash_file", "str", "SyncOutputsOutput", "pytest.fixture"] | 0 | [] | The function (sync_outputs) starts at line 686 and ends at line 846. It contains 123 lines of code and has a cyclomatic complexity of 5. It takes 2 parameters (job_attachment_test, sync_inputs) and returns a SyncOutputsOutput. It makes 28 outgoing calls, listed in the outgoing_function_names column. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_outputs_with_symlink | def test_sync_outputs_with_symlink(job_attachment_test: JobAttachmentTest,sync_inputs: SyncInputsOutputs,tmp_path,) -> None:"""Test that a symlink pointing to a file outside the session directory is not synced as output."""job_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise Exception("Job attachment settings must be set for this test.")waiter = job_attachment_test.deadline_client.get_waiter("job_create_complete")waiter.wait(jobId=sync_inputs.job_id,queueId=job_attachment_test.queue_id,farmId=job_attachment_test.farm_id,)# Get the Step IDlist_steps_response = job_attachment_test.deadline_client.list_steps(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_inputs.job_id,)step_ids = {step["name"]: step["stepId"] for step in list_steps_response["steps"]}step0_id = step_ids["custom-step"]# Get the Task IDlist_tasks_response = job_attachment_test.deadline_client.list_tasks(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_inputs.job_id,stepId=step0_id,)task_ids = {task["parameters"]["frame"]["int"]: task["taskId"] for task in list_tasks_response["tasks"]}step0_task0_id = task_ids["0"]# Create a symlink, in the output directory, pointing to a file located outside the session directory.symlink_output = Path("outputs/symlink")symlink_path = sync_inputs.session_dir / sync_inputs.dest_dir / symlink_outputtmp_dir = tmp_path / "tmp_dir"tmp_dir.mkdir()symlink_target_path = tmp_dir / "symlink_target"symlink_target_path.write_text("this is a symlink target, located outside the session directory")symlink_path.symlink_to(symlink_target_path)assert symlink_path.is_symlink()mock_on_uploading_files = MagicMock(return_value=True)summary_stats = 
sync_inputs.asset_syncer.sync_outputs(s3_settings=job_attachment_settings,attachments=sync_inputs.attachments,queue_id=job_attachment_test.queue_id,job_id=sync_inputs.job_id,step_id=step0_id,task_id=step0_task0_id,session_action_id="session_action_id",start_time=time.time(),session_dir=sync_inputs.session_dir,on_uploading_files=mock_on_uploading_files,)# The symlink should not be synced as output.assert summary_stats.total_files == 0 | 4 | 59 | 3 | 326 | 14 | 854 | 926 | 854 | job_attachment_test,sync_inputs,tmp_path | ['list_steps_response', 'job_attachment_settings', 'step0_id', 'symlink_output', 'mock_on_uploading_files', 'summary_stats', 'symlink_path', 'step_ids', 'list_tasks_response', 'tmp_dir', 'symlink_target_path', 'waiter', 'task_ids', 'step0_task0_id'] | None | {"Assign": 14, "Expr": 5, "If": 1} | 16 | 73 | 16 | ["get_queue", "Exception", "job_attachment_test.deadline_client.get_waiter", "waiter.wait", "job_attachment_test.deadline_client.list_steps", "job_attachment_test.deadline_client.list_tasks", "Path", "tmp_dir.mkdir", "symlink_target_path.write_text", "symlink_path.symlink_to", "symlink_path.is_symlink", "MagicMock", "sync_inputs.asset_syncer.sync_outputs", "time.time", "pytest.mark.skipif", "is_windows_non_admin"] | 0 | [] | The function (test_sync_outputs_with_symlink) starts at line 854 and ends at line 926. It contains 59 lines of code and has a cyclomatic complexity of 4. It takes 3 parameters (job_attachment_test, sync_inputs, tmp_path) and does not return a value. It makes 16 outgoing calls, listed in the outgoing_function_names column. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_inputs_with_step_dependencies.on_downloading_files | def on_downloading_files(*args, **kwargs):return True | 1 | 2 | 2 | 11 | 0 | 959 | 960 | 959 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The nested callback (test_sync_inputs_with_step_dependencies.on_downloading_files) starts at line 959 and ends at line 960. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (*args, **kwargs) and always returns True. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_inputs_with_step_dependencies | def test_sync_inputs_with_step_dependencies(job_attachment_test: JobAttachmentTest,tmp_path_factory: TempPathFactory,sync_outputs: SyncOutputsOutput,):"""Test that sync_inputs() syncs the inputs specified in job settings, and the outputs from other stepsspecified in step dependencies."""# IFjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingslist_steps_response = job_attachment_test.deadline_client.list_steps(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_outputs.job_id,)step_ids = {step["name"]: step["stepId"] for step in list_steps_response["steps"]}step0_id = step_ids["custom-step"]session_dir = tmp_path_factory.mktemp("session_dir")# WHENsyncer = asset_sync.AssetSync(job_attachment_test.farm_id)def on_downloading_files(*args, **kwargs):return Truesyncer.sync_inputs(job_attachment_settings,sync_outputs.attachments,job_attachment_test.queue_id,sync_outputs.job_id,session_dir,step_dependencies=[step0_id],on_downloading_files=on_downloading_files,)dest_dir = _get_unique_dest_dir_name(str(job_attachment_test.ASSET_ROOT))# THEN# Check if the inputs specified in job settings were downloadedassert Path(session_dir / dest_dir / job_attachment_test.SCENE_MA_PATH).exists()assert Path(session_dir / dest_dir / job_attachment_test.BRICK_PNG_PATH).exists()assert Path(session_dir / dest_dir / job_attachment_test.CLOTH_PNG_PATH).exists()assert Path(session_dir / dest_dir / job_attachment_test.INPUT_IN_OUTPUT_DIR_PATH).exists()# Check if the outputs from step0_id ("custom-step") were downloadedassert Path(session_dir / dest_dir / job_attachment_test.FIRST_RENDER_OUTPUT_PATH).exists()assert Path(session_dir / dest_dir / job_attachment_test.SECOND_RENDER_OUTPUT_PATH).exists()# Check if the outputs from the 
other step ("custom-step-2") were not downloadedassert not Path(session_dir / dest_dir / job_attachment_test.MOV_FILE_OUTPUT_PATH).exists() | 2 | 37 | 3 | 263 | 7 | 930 | 984 | 930 | job_attachment_test,tmp_path_factory,sync_outputs | ['list_steps_response', 'job_attachment_settings', 'step0_id', 'session_dir', 'step_ids', 'syncer', 'dest_dir'] | Returns | {"Assign": 7, "Expr": 2, "Return": 1} | 21 | 55 | 21 | ["get_queue", "job_attachment_test.deadline_client.list_steps", "tmp_path_factory.mktemp", "asset_sync.AssetSync", "syncer.sync_inputs", "_get_unique_dest_dir_name", "str", "exists", "Path", "exists", "Path", "exists", "Path", "exists", "Path", "exists", "Path", "exists", "Path", "exists", "Path"] | 0 | [] | The function (test_sync_inputs_with_step_dependencies) starts at line 930 and ends at line 984. It contains 37 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters (job_attachment_test, tmp_path_factory, sync_outputs) and has no annotated return type. It makes 21 outgoing calls, listed in the outgoing_function_names column. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_download_outputs_with_job_id_step_id_task_id_and_download_directory | def test_download_outputs_with_job_id_step_id_task_id_and_download_directory(job_attachment_test: JobAttachmentTest, tmp_path: Path, sync_outputs: SyncOutputsOutput):"""Test that outputs for a task are downloaded to the correct location locally"""# GIVENjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise TypeError("Job attachment settings must be set for this test.")# WHENtry:job_output_downloader = download.OutputDownloader(s3_settings=job_attachment_settings,farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_id=sync_outputs.job_id,step_id=sync_outputs.step0_id,task_id=sync_outputs.step0_task0_id,)job_output_downloader.download_job_output()# THENassert Path(job_attachment_test.ASSET_ROOT / sync_outputs.step0_task0_output_file).exists()finally:_cleanup_outputs_dir(job_attachment_test) | 3 | 23 | 3 | 120 | 2 | 988 | 1,019 | 988 | job_attachment_test,tmp_path,sync_outputs | ['job_output_downloader', 'job_attachment_settings'] | None | {"Assign": 2, "Expr": 3, "If": 1, "Try": 1} | 7 | 32 | 7 | ["get_queue", "TypeError", "download.OutputDownloader", "job_output_downloader.download_job_output", "exists", "Path", "_cleanup_outputs_dir"] | 0 | [] | The function (test_download_outputs_with_job_id_step_id_task_id_and_download_directory) is defined within the public class public. It starts at line 988 and ends at line 1019. It contains 23 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters (job_attachment_test, tmp_path, sync_outputs) and does not return a value. 
It makes 7 function calls: ["get_queue", "TypeError", "download.OutputDownloader", "job_output_downloader.download_job_output", "exists", "Path", "_cleanup_outputs_dir"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_download_outputs_with_job_id_step_id_and_download_directory | def test_download_outputs_with_job_id_step_id_and_download_directory(job_attachment_test: JobAttachmentTest, tmp_path: Path, sync_outputs: SyncOutputsOutput):"""Test that outputs for a step are downloaded to the correct location locally"""# GIVENjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise TypeError("Job attachment settings must be set for this test.")# WHENtry:job_output_downloader = download.OutputDownloader(s3_settings=job_attachment_settings,farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_id=sync_outputs.job_id,step_id=sync_outputs.step0_id,task_id=None,)job_output_downloader.download_job_output()# THENassert Path(job_attachment_test.ASSET_ROOT / sync_outputs.step0_task0_output_file).exists()assert Path(job_attachment_test.ASSET_ROOT / sync_outputs.step0_task1_output_file).exists()finally:_cleanup_outputs_dir(job_attachment_test) | 3 | 24 | 3 | 133 | 2 | 1,023 | 1,055 | 1,023 | job_attachment_test,tmp_path,sync_outputs | ['job_output_downloader', 'job_attachment_settings'] | None | {"Assign": 2, "Expr": 3, "If": 1, "Try": 1} | 9 | 33 | 9 | ["get_queue", "TypeError", "download.OutputDownloader", "job_output_downloader.download_job_output", "exists", "Path", "exists", "Path", "_cleanup_outputs_dir"] | 0 | [] | The function (test_download_outputs_with_job_id_step_id_and_download_directory) is defined within the public class public. It starts at line 1023 and ends at line 1055. It contains 24 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters (job_attachment_test, tmp_path, sync_outputs) and does not return a value. 
It makes 9 function calls: ["get_queue", "TypeError", "download.OutputDownloader", "job_output_downloader.download_job_output", "exists", "Path", "exists", "Path", "_cleanup_outputs_dir"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_download_outputs_with_job_id_and_download_directory | def test_download_outputs_with_job_id_and_download_directory(job_attachment_test: JobAttachmentTest, tmp_path: Path, sync_outputs: SyncOutputsOutput):"""Test that outputs for a job are downloaded to the correct location locally"""# GIVENjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise TypeError("Job attachment settings must be set for this test.")# WHENtry:job_output_downloader = download.OutputDownloader(s3_settings=job_attachment_settings,farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_id=sync_outputs.job_id,step_id=None,task_id=None,)job_output_downloader.download_job_output()# THENassert Path(job_attachment_test.ASSET_ROOT / sync_outputs.step0_task0_output_file).exists()assert Path(job_attachment_test.ASSET_ROOT / sync_outputs.step0_task1_output_file).exists()assert Path(job_attachment_test.ASSET_ROOT / sync_outputs.step1_task0_output_file).exists()finally:_cleanup_outputs_dir(job_attachment_test) | 3 | 25 | 3 | 146 | 2 | 1,059 | 1,092 | 1,059 | job_attachment_test,tmp_path,sync_outputs | ['job_output_downloader', 'job_attachment_settings'] | None | {"Assign": 2, "Expr": 3, "If": 1, "Try": 1} | 11 | 34 | 11 | ["get_queue", "TypeError", "download.OutputDownloader", "job_output_downloader.download_job_output", "exists", "Path", "exists", "Path", "exists", "Path", "_cleanup_outputs_dir"] | 0 | [] | The function (test_download_outputs_with_job_id_and_download_directory) is defined within the public class public. It starts at line 1059 and ends at line 1092. It contains 25 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters (job_attachment_test, tmp_path, sync_outputs) and does not return a value. 
It makes 11 function calls: ["get_queue", "TypeError", "download.OutputDownloader", "job_output_downloader.download_job_output", "exists", "Path", "exists", "Path", "exists", "Path", "_cleanup_outputs_dir"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _cleanup_outputs_dir | def _cleanup_outputs_dir(job_attachment_test: JobAttachmentTest) -> None:shutil.rmtree(job_attachment_test.OUTPUT_PATH)# Revive the INPUT_IN_OUTPUT_DIR_PATH file.job_attachment_test.OUTPUT_PATH.mkdir(parents=True, exist_ok=True)with open(job_attachment_test.INPUT_IN_OUTPUT_DIR_PATH, "w") as f:f.write("Although it is in the output directory, it is actually an input file. It should be"" downloaded (to the worker's session working directory) during sync_inputs, and"" should not be captured as an output file when sync_outputs.") | 1 | 9 | 1 | 51 | 0 | 1,095 | 1,104 | 1,095 | job_attachment_test | [] | None | {"Expr": 3, "With": 1} | 4 | 10 | 4 | ["shutil.rmtree", "job_attachment_test.OUTPUT_PATH.mkdir", "open", "f.write"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_download_outputs_with_job_id_and_download_directory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_download_outputs_with_job_id_step_id_and_download_directory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_download_outputs_with_job_id_step_id_task_id_and_download_directory"] | The function (_cleanup_outputs_dir) is defined within the public class public. It starts at line 1095 and ends at line 1104. It contains 9 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (job_attachment_test) and does not return a value. 
It makes 4 function calls: ["shutil.rmtree", "job_attachment_test.OUTPUT_PATH.mkdir", "open", "f.write"], and it is called by 3 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_download_outputs_with_job_id_and_download_directory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_download_outputs_with_job_id_step_id_and_download_directory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_download_outputs_with_job_id_step_id_task_id_and_download_directory"]. |
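The `_cleanup_outputs_dir` helper in the row above wipes the output tree and then re-creates a single sentinel input file inside it. The same rmtree-then-revive pattern can be sketched against a throwaway temp directory; all paths and file names below are illustrative stand-ins for `OUTPUT_PATH` and `INPUT_IN_OUTPUT_DIR_PATH`:

```python
import shutil
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
output_dir = root / "scene" / "outputs"          # stand-in for OUTPUT_PATH
input_in_output = output_dir / "input_file.txt"  # stand-in for INPUT_IN_OUTPUT_DIR_PATH
output_dir.mkdir(parents=True)
input_in_output.write_text("original")
(output_dir / "render.exr").write_text("stale output")

# Remove the whole output tree, then revive only the input sentinel,
# so later assertions see a clean directory containing just that file.
shutil.rmtree(output_dir)
output_dir.mkdir(parents=True, exist_ok=True)
input_in_output.write_text("revived input")

remaining = sorted(p.name for p in output_dir.iterdir())
print(remaining)  # ['input_file.txt']
shutil.rmtree(root)
```

`rmtree` plus `mkdir(parents=True, exist_ok=True)` is the idiomatic way to reset a directory between tests without caring what accumulated in it.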
aws-deadline_deadline-cloud | public | public | 0 | 0 | upload_input_files_no_input_paths | def upload_input_files_no_input_paths(job_attachment_test: JobAttachmentTest,) -> UploadInputFilesNoInputPathsOutput:"""Test that the created job settings object doesn't have the requiredAssets field when there are no input files."""# IFjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise TypeError("Job attachment settings must be set for this test.")asset_manager = upload.S3AssetManager(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_attachment_settings=job_attachment_settings,asset_manifest_version=job_attachment_test.manifest_version,)mock_on_preparing_to_submit = MagicMock(return_value=True)mock_on_uploading_files = MagicMock(return_value=True)# WHENupload_group = asset_manager.prepare_paths_for_upload(input_paths=[],output_paths=[str(job_attachment_test.OUTPUT_PATH)],referenced_paths=[],)(_, manifests) = asset_manager.hash_assets_and_create_manifest(asset_groups=upload_group.asset_groups,total_input_files=upload_group.total_input_files,total_input_bytes=upload_group.total_input_bytes,hash_cache_dir=str(job_attachment_test.hash_cache_dir),on_preparing_to_submit=mock_on_preparing_to_submit,)(_, attachments) = asset_manager.upload_assets(manifests,on_uploading_assets=mock_on_uploading_files,s3_check_cache_dir=str(job_attachment_test.s3_cache_dir),)# THENmock_host_path_format_name = PathFormat.get_host_path_format_string()assert attachments.manifests == [ManifestProperties(rootPath=str(job_attachment_test.OUTPUT_PATH),rootPathFormat=PathFormat(mock_host_path_format_name),outputRelativeDirectories=["."],)]return UploadInputFilesNoInputPathsOutput(attachments=attachments) | 2 | 44 | 1 | 233 | 6 | 1,113 | 1,169 | 1,113 | job_attachment_test | ['job_attachment_settings', 
'mock_on_uploading_files', 'mock_host_path_format_name', 'upload_group', 'mock_on_preparing_to_submit', 'asset_manager'] | UploadInputFilesNoInputPathsOutput | {"Assign": 8, "Expr": 1, "If": 1, "Return": 1} | 17 | 57 | 17 | ["get_queue", "TypeError", "upload.S3AssetManager", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "PathFormat.get_host_path_format_string", "ManifestProperties", "str", "PathFormat", "UploadInputFilesNoInputPathsOutput", "pytest.fixture"] | 0 | [] | The function (upload_input_files_no_input_paths) is defined within the public class public. It starts at line 1113 and ends at line 1169. It contains 44 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (job_attachment_test) and returns an UploadInputFilesNoInputPathsOutput. It makes 17 function calls: ["get_queue", "TypeError", "upload.S3AssetManager", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "PathFormat.get_host_path_format_string", "ManifestProperties", "str", "PathFormat", "UploadInputFilesNoInputPathsOutput", "pytest.fixture"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_upload_input_files_no_download_paths | def test_upload_input_files_no_download_paths(job_attachment_test: JobAttachmentTest) -> None:"""Test that if there are no output directories, when upload_assets is called,then the resulting attachments object has no output directories in it."""# IFjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise TypeError("Job attachment settings must be set for this test.")asset_manager = upload.S3AssetManager(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_attachment_settings=job_attachment_settings,asset_manifest_version=job_attachment_test.manifest_version,)mock_on_preparing_to_submit = MagicMock(return_value=True)mock_on_uploading_files = MagicMock(return_value=True)# WHENupload_group = asset_manager.prepare_paths_for_upload(input_paths=[str(job_attachment_test.SCENE_MA_PATH)],output_paths=[],referenced_paths=[],)(_, manifests) = asset_manager.hash_assets_and_create_manifest(asset_groups=upload_group.asset_groups,total_input_files=upload_group.total_input_files,total_input_bytes=upload_group.total_input_bytes,hash_cache_dir=str(job_attachment_test.hash_cache_dir),on_preparing_to_submit=mock_on_preparing_to_submit,)(_, attachments) = asset_manager.upload_assets(manifests,on_uploading_assets=mock_on_uploading_files,s3_check_cache_dir=str(job_attachment_test.s3_cache_dir),)# THENif manifests[0].asset_manifest is None:raise TypeError("Asset manifest must be set for this test.")mock_host_path_format_name = PathFormat.get_host_path_format_string()asset_root_hash = hash_data(str(job_attachment_test.INPUT_PATH).encode(), HashAlgorithm.XXH128)manifest_hash = hash_data(bytes(manifests[0].asset_manifest.encode(), "utf-8"), HashAlgorithm.XXH128)assert len(attachments.manifests) 
== 1assert attachments.manifests[0].fileSystemLocationName is Noneassert attachments.manifests[0].rootPath == str(job_attachment_test.INPUT_PATH)assert attachments.manifests[0].rootPathFormat == PathFormat(mock_host_path_format_name)assert attachments.manifests[0].outputRelativeDirectories == []assert attachments.manifests[0].inputManifestPath is not Noneassert attachments.manifests[0].inputManifestPath.startswith(f"{job_attachment_test.farm_id}/{job_attachment_test.queue_id}/Inputs/")assert attachments.manifests[0].inputManifestPath.endswith(f"/{asset_root_hash}_input")assert attachments.manifests[0].inputManifestHash == manifest_hash | 3 | 51 | 1 | 366 | 8 | 1,173 | 1,238 | 1,173 | job_attachment_test | ['job_attachment_settings', 'mock_on_uploading_files', 'mock_host_path_format_name', 'asset_root_hash', 'upload_group', 'mock_on_preparing_to_submit', 'manifest_hash', 'asset_manager'] | None | {"Assign": 10, "Expr": 1, "If": 2} | 24 | 66 | 24 | ["get_queue", "TypeError", "upload.S3AssetManager", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "TypeError", "PathFormat.get_host_path_format_string", "hash_data", "encode", "str", "hash_data", "bytes", "asset_manifest.encode", "len", "str", "PathFormat", "inputManifestPath.startswith", "inputManifestPath.endswith"] | 0 | [] | The function (test_upload_input_files_no_download_paths) is defined within the public class public. It starts at line 1173 and ends at line 1238. It contains 51 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (job_attachment_test) and does not return a value. 
It makes 24 function calls: ["get_queue", "TypeError", "upload.S3AssetManager", "MagicMock", "MagicMock", "asset_manager.prepare_paths_for_upload", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str", "TypeError", "PathFormat.get_host_path_format_string", "hash_data", "encode", "str", "hash_data", "bytes", "asset_manifest.encode", "len", "str", "PathFormat", "inputManifestPath.startswith", "inputManifestPath.endswith"]. |
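The assertions in the row above check that the uploaded input manifest's S3 key starts with `f"{farm_id}/{queue_id}/Inputs/"` and ends with `f"/{asset_root_hash}_input"`. That key shape can be sketched without the library; `hashlib.sha256` stands in for the real `hash_data(..., HashAlgorithm.XXH128)` call, and the IDs and the middle key segment are illustrative, not the library's actual layout:

```python
import hashlib

farm_id, queue_id = "farm-123", "queue-456"  # illustrative IDs
asset_root = "/projects/scene"               # illustrative asset root path

# Stand-in for hash_data(str(asset_root).encode(), HashAlgorithm.XXH128);
# the real code uses xxhash, not sha256.
asset_root_hash = hashlib.sha256(asset_root.encode()).hexdigest()
guid = "0a1b2c"  # illustrative per-upload key segment

manifest_key = f"{farm_id}/{queue_id}/Inputs/{guid}/{asset_root_hash}_input"

# The same prefix/suffix checks the test performs:
assert manifest_key.startswith(f"{farm_id}/{queue_id}/Inputs/")
assert manifest_key.endswith(f"/{asset_root_hash}_input")
```

Hashing the root path into the key makes the manifest location deterministic for a given input root, which is what lets the suffix check predict the key before upload.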
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_inputs_no_inputs.on_downloading_files | def on_downloading_files(*args, **kwargs):return True | 1 | 2 | 2 | 11 | 0 | 1,272 | 1,273 | 1,272 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (test_sync_inputs_no_inputs.on_downloading_files) is defined within the public class public. It starts at line 1272 and ends at line 1273. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (*args, **kwargs) and always returns True. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_inputs_no_inputs | def test_sync_inputs_no_inputs(job_attachment_test: JobAttachmentTest,upload_input_files_no_input_paths: UploadInputFilesNoInputPathsOutput,tmp_path: Path,default_job_template_one_task_one_step: str,) -> None:"""Test that all of the input files get synced locally."""# IFjob_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsjob_response = job_attachment_test.deadline_client.create_job(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,attachments=upload_input_files_no_input_paths.attachments.to_dict(),# type: ignoretargetTaskRunStatus="SUSPENDED",template=default_job_template_one_task_one_step,templateType="JSON",priority=50,)syncer = asset_sync.AssetSync(job_attachment_test.farm_id)session_dir = tmp_path / "session_dir"session_dir.mkdir()def on_downloading_files(*args, **kwargs):return True# WHENsyncer.sync_inputs(job_attachment_settings,upload_input_files_no_input_paths.attachments,job_attachment_test.queue_id,job_response["jobId"],session_dir,on_downloading_files=on_downloading_files,)# THENassert not any(Path(session_dir).iterdir()) | 1 | 33 | 4 | 156 | 4 | 1,242 | 1,286 | 1,242 | job_attachment_test,upload_input_files_no_input_paths,tmp_path,default_job_template_one_task_one_step | ['session_dir', 'syncer', 'job_attachment_settings', 'job_response'] | None | {"Assign": 4, "Expr": 3, "Return": 1} | 9 | 45 | 9 | ["get_queue", "job_attachment_test.deadline_client.create_job", "upload_input_files_no_input_paths.attachments.to_dict", "asset_sync.AssetSync", "session_dir.mkdir", "syncer.sync_inputs", "any", "iterdir", "Path"] | 0 | [] | The function (test_sync_inputs_no_inputs) defined within the public class called public.The function start at line 1242 and ends at 1286. 
It contains 33 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (job_attachment_test, upload_input_files_no_input_paths, tmp_path, default_job_template_one_task_one_step) and does not return a value. It makes 9 function calls: ["get_queue", "job_attachment_test.deadline_client.create_job", "upload_input_files_no_input_paths.attachments.to_dict", "asset_sync.AssetSync", "session_dir.mkdir", "syncer.sync_inputs", "any", "iterdir", "Path"]. |
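The no-inputs test above ends with `assert not any(Path(session_dir).iterdir())` to prove the session directory stayed empty. That emptiness probe can be sketched on its own against a throwaway directory:

```python
import tempfile
from pathlib import Path

session_dir = Path(tempfile.mkdtemp())  # fresh, empty directory

# any(...) over iterdir() is a cheap "directory is non-empty" probe:
# iterdir() yields entries lazily, so any() stops at the first one found.
assert not any(session_dir.iterdir())

(session_dir / "downloaded.txt").write_text("x")
assert any(session_dir.iterdir())
```

Compared with `len(os.listdir(d)) == 0`, the `any(...iterdir())` form avoids materialising the whole listing for large directories.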
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_upload_bucket_wrong_account | def test_upload_bucket_wrong_account(external_bucket: str, job_attachment_test: JobAttachmentTest):"""Test that if trying to upload to a bucket that isn't in the farm's AWS account, the correct error is thrown."""# IFjob_attachment_settings = JobAttachmentS3Settings(s3BucketName=external_bucket,rootPrefix=job_attachment_test.bucket_root_prefix,)asset_manager = upload.S3AssetManager(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_attachment_settings=job_attachment_settings,asset_manifest_version=job_attachment_test.manifest_version,)mock_on_preparing_to_submit = MagicMock(return_value=True)mock_on_uploading_files = MagicMock(return_value=True)# WHENwith pytest.raises(# Note: This error is raised in this case when the s3:PutObject operation is denied# due to the ExpectedBucketOwner check on our s3 operation. If the bucket is in the expected# account, then the error is a different access denied error.JobAttachmentsS3ClientError,match=".*when calling the PutObject operation: Access Denied",):# The attempt to upload the asset manifest should be blocked.upload_group = asset_manager.prepare_paths_for_upload(input_paths=[str(job_attachment_test.SCENE_MA_PATH)],output_paths=[str(job_attachment_test.OUTPUT_PATH)],referenced_paths=[],)(_, manifests) = asset_manager.hash_assets_and_create_manifest(asset_groups=upload_group.asset_groups,total_input_files=upload_group.total_input_files,total_input_bytes=upload_group.total_input_bytes,hash_cache_dir=str(job_attachment_test.hash_cache_dir),on_preparing_to_submit=mock_on_preparing_to_submit,)asset_manager.upload_assets(manifests,on_uploading_assets=mock_on_uploading_files,s3_check_cache_dir=str(job_attachment_test.s3_cache_dir),) | 1 | 34 | 2 | 181 | 5 | 1,291 | 1,336 | 1,291 | external_bucket,job_attachment_test | ['job_attachment_settings', 'mock_on_uploading_files', 'upload_group', 
'mock_on_preparing_to_submit', 'asset_manager'] | None | {"Assign": 6, "Expr": 2, "With": 1} | 12 | 46 | 12 | ["JobAttachmentS3Settings", "upload.S3AssetManager", "MagicMock", "MagicMock", "pytest.raises", "asset_manager.prepare_paths_for_upload", "str", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str"] | 0 | [] | The function (test_upload_bucket_wrong_account) is defined within the public class public. It starts at line 1291 and ends at line 1336. It contains 34 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (external_bucket, job_attachment_test) and does not return a value. It makes 12 function calls: ["JobAttachmentS3Settings", "upload.S3AssetManager", "MagicMock", "MagicMock", "pytest.raises", "asset_manager.prepare_paths_for_upload", "str", "str", "asset_manager.hash_assets_and_create_manifest", "str", "asset_manager.upload_assets", "str"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_inputs_bucket_wrong_account.on_downloading_files | def on_downloading_files(*args, **kwargs):return True | 1 | 2 | 2 | 11 | 0 | 1,370 | 1,371 | 1,370 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (test_sync_inputs_bucket_wrong_account.on_downloading_files) is defined within the public class public. It starts at line 1370 and ends at line 1371. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (*args, **kwargs) and always returns True. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_inputs_bucket_wrong_account | def test_sync_inputs_bucket_wrong_account(external_bucket: str,job_attachment_test: JobAttachmentTest,upload_input_files_one_asset_in_cas: UploadInputFilesOneAssetInCasOutputs,default_job_template: str,tmp_path_factory: TempPathFactory,):"""Test that if trying to sync inputs to a bucket that isn't in the farm's AWS account, the correct error is thrown."""# IFjob_attachment_settings = JobAttachmentS3Settings(s3BucketName=external_bucket,rootPrefix=job_attachment_test.bucket_root_prefix,)job_response = job_attachment_test.deadline_client.create_job(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,attachments=upload_input_files_one_asset_in_cas.attachments.to_dict(),# type: ignoretargetTaskRunStatus="SUSPENDED",template=default_job_template,templateType="JSON",priority=50,)syncer = asset_sync.AssetSync(job_attachment_test.farm_id)session_dir = tmp_path_factory.mktemp("session_dir")def on_downloading_files(*args, **kwargs):return True# WHENwith pytest.raises(JobAttachmentsS3ClientError,match=f"Error downloading binary file in bucket '{external_bucket}'",):syncer.sync_inputs(job_attachment_settings,upload_input_files_one_asset_in_cas.attachments,job_attachment_test.queue_id,job_response["jobId"],session_dir,on_downloading_files=on_downloading_files,) | 1 | 35 | 5 | 147 | 4 | 1,341 | 1,385 | 1,341 | external_bucket,job_attachment_test,upload_input_files_one_asset_in_cas,default_job_template,tmp_path_factory | ['session_dir', 'syncer', 'job_attachment_settings', 'job_response'] | Returns | {"Assign": 4, "Expr": 2, "Return": 1, "With": 1} | 7 | 45 | 7 | ["JobAttachmentS3Settings", "job_attachment_test.deadline_client.create_job", "upload_input_files_one_asset_in_cas.attachments.to_dict", "asset_sync.AssetSync", "tmp_path_factory.mktemp", "pytest.raises", "syncer.sync_inputs"] | 0 | [] | The function (test_sync_inputs_bucket_wrong_account) defined within 
the public class public. It starts at line 1341 and ends at line 1385. It contains 35 lines of code and has a cyclomatic complexity of 1. It takes 5 parameters (external_bucket, job_attachment_test, upload_input_files_one_asset_in_cas, default_job_template, tmp_path_factory) and returns a value. It makes 7 function calls: ["JobAttachmentS3Settings", "job_attachment_test.deadline_client.create_job", "upload_input_files_one_asset_in_cas.attachments.to_dict", "asset_sync.AssetSync", "tmp_path_factory.mktemp", "pytest.raises", "syncer.sync_inputs"]. |
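The wrong-account tests in the rows above wrap the failing call in `pytest.raises(ExceptionType, match=...)`, where `match` applies `re.search` to the string form of the raised exception. That contract can be sketched without pytest; the exception class and the stub function below are illustrative stand-ins, not the library's real types:

```python
import re

class JobAttachmentsS3ClientError(Exception):  # stand-in for the real exception
    pass

def sync_inputs_stub(bucket: str) -> None:
    # Simulates the library call failing against a bucket in another account.
    raise JobAttachmentsS3ClientError(
        f"Error downloading binary file in bucket '{bucket}'"
    )

# Equivalent of: with pytest.raises(JobAttachmentsS3ClientError, match=...):
matched = False
try:
    sync_inputs_stub("someone-elses-bucket")
except JobAttachmentsS3ClientError as exc:
    matched = bool(re.search(r"Error downloading binary file in bucket '.*'", str(exc)))

assert matched
```

Matching on the message, not just the exception type, is what lets these tests distinguish the ExpectedBucketOwner denial from an ordinary access-denied error.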
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_sync_outputs_bucket_wrong_account | def test_sync_outputs_bucket_wrong_account(job_attachment_test: JobAttachmentTest,sync_inputs: SyncInputsOutputs,external_bucket: str,) -> None:"""Test that if trying to sync outputs to a bucket that isn't in the farm's AWS account, the correct error is thrown.This is ensuring that the S3 file upload is passing the ExpectedBucketOwner property and verifying that the returnederror is what we expect when using that property (rather than just plain not having access to the bucket)."""# IFjob_attachment_settings = JobAttachmentS3Settings(s3BucketName=external_bucket,rootPrefix=job_attachment_test.bucket_root_prefix,)waiter = job_attachment_test.deadline_client.get_waiter("job_create_complete")waiter.wait(jobId=sync_inputs.job_id,queueId=job_attachment_test.queue_id,farmId=job_attachment_test.farm_id,)list_steps_response = job_attachment_test.deadline_client.list_steps(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_inputs.job_id,)step_ids = {step["name"]: step["stepId"] for step in list_steps_response["steps"]}step0_id = step_ids["custom-step"]list_tasks_response = job_attachment_test.deadline_client.list_tasks(farmId=job_attachment_test.farm_id,queueId=job_attachment_test.queue_id,jobId=sync_inputs.job_id,stepId=step0_id,)task_ids = {task["parameters"]["frame"]["int"]: task["taskId"] for task in list_tasks_response["tasks"]}step0_task0_id = task_ids["0"]Path(sync_inputs.session_dir / sync_inputs.dest_dir / "outputs").mkdir(exist_ok=True)file_to_be_synced_step0_task0_base = job_attachment_test.FIRST_RENDER_OUTPUT_PATHfile_to_be_synced_step0_task0 = (sync_inputs.session_dir / sync_inputs.dest_dir / file_to_be_synced_step0_task0_base)render_start_time = time.time()# WHEN# First step and task# Create files after the render start time in the output dir, these should be syncedwith open(file_to_be_synced_step0_task0, "w") as f:f.write("this is 
the first render")mock_on_uploading_files = MagicMock(return_value=True)# WHENwith pytest.raises(AssetSyncError, match=f"Error checking if object exists in bucket '{external_bucket}'"):sync_inputs.asset_syncer.sync_outputs(s3_settings=job_attachment_settings,attachments=sync_inputs.attachments,queue_id=job_attachment_test.queue_id,job_id=sync_inputs.job_id,step_id=step0_id,task_id=step0_task0_id,session_action_id="session_action_id",start_time=render_start_time,session_dir=sync_inputs.session_dir,on_uploading_files=mock_on_uploading_files,) | 3 | 56 | 3 | 319 | 12 | 1,390 | 1,469 | 1,390 | job_attachment_test,sync_inputs,external_bucket | ['list_steps_response', 'job_attachment_settings', 'step0_id', 'mock_on_uploading_files', 'file_to_be_synced_step0_task0', 'step_ids', 'list_tasks_response', 'render_start_time', 'file_to_be_synced_step0_task0_base', 'waiter', 'task_ids', 'step0_task0_id'] | None | {"Assign": 12, "Expr": 5, "With": 2} | 13 | 80 | 13 | ["JobAttachmentS3Settings", "job_attachment_test.deadline_client.get_waiter", "waiter.wait", "job_attachment_test.deadline_client.list_steps", "job_attachment_test.deadline_client.list_tasks", "mkdir", "Path", "time.time", "open", "f.write", "MagicMock", "pytest.raises", "sync_inputs.asset_syncer.sync_outputs"] | 0 | [] | The function (test_sync_outputs_bucket_wrong_account) is defined within the public class public. It starts at line 1390 and ends at line 1469. It contains 56 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters (job_attachment_test, sync_inputs, external_bucket) and does not return a value. It makes 13 function calls: ["JobAttachmentS3Settings", "job_attachment_test.deadline_client.get_waiter", "waiter.wait", "job_attachment_test.deadline_client.list_steps", "job_attachment_test.deadline_client.list_tasks", "mkdir", "Path", "time.time", "open", "f.write", "MagicMock", "pytest.raises", "sync_inputs.asset_syncer.sync_outputs"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_download_outputs_bucket_wrong_account | def test_download_outputs_bucket_wrong_account(job_attachment_test: JobAttachmentTest,tmp_path: Path,sync_outputs: SyncOutputsOutput,external_bucket: str,):"""Test that if trying to download outputs to a bucketthat isn't in the farm's AWS account, the correct error is thrown."""# GIVENjob_attachment_settings = JobAttachmentS3Settings(s3BucketName=external_bucket,rootPrefix=job_attachment_test.bucket_root_prefix,)# WHENwith pytest.raises(JobAttachmentsS3ClientError,match=f"Error listing bucket contents in bucket '{external_bucket}'",):job_output_downloader = download.OutputDownloader(s3_settings=job_attachment_settings,farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_id=sync_outputs.job_id,step_id=sync_outputs.step0_id,task_id=sync_outputs.step0_task0_id,)job_output_downloader.download_job_output() | 1 | 23 | 4 | 96 | 2 | 1,474 | 1,503 | 1,474 | job_attachment_test,tmp_path,sync_outputs,external_bucket | ['job_output_downloader', 'job_attachment_settings'] | None | {"Assign": 2, "Expr": 2, "With": 1} | 4 | 30 | 4 | ["JobAttachmentS3Settings", "pytest.raises", "download.OutputDownloader", "job_output_downloader.download_job_output"] | 0 | [] | The function (test_download_outputs_bucket_wrong_account) is defined within the public class public. It starts at line 1474 and ends at line 1503. It contains 23 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (job_attachment_test, tmp_path, sync_outputs, external_bucket) and does not return a value. It makes 4 function calls: ["JobAttachmentS3Settings", "pytest.raises", "download.OutputDownloader", "job_output_downloader.download_job_output"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_download_outputs_no_outputs_dir | def test_download_outputs_no_outputs_dir(job_attachment_test: JobAttachmentTest,sync_outputs: SyncOutputsOutput,):"""Test that if trying to download outputs but not specify a file pathDownload will be saved to the current directory."""download_path = Path(os.path.normpath(Path("").absolute()) / sync_outputs.step0_task0_output_file)job_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise Exception("Job attachment settings must be set for this test.")job_output_downloader = download.OutputDownloader(s3_settings=job_attachment_settings,farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_id=sync_outputs.job_id,step_id=sync_outputs.step0_id,task_id=sync_outputs.step0_task0_id,)job_output_downloader.set_root_path(str(job_attachment_test.ASSET_ROOT), "")# WHENtry:job_output_downloader.download_job_output()# THEN# The output file should be downloaded to the current directoryassert download_path.exists()finally:shutil.rmtree(download_path.parent) | 3 | 28 | 2 | 149 | 3 | 1,507 | 1,546 | 1,507 | job_attachment_test,sync_outputs | ['job_output_downloader', 'job_attachment_settings', 'download_path'] | None | {"Assign": 3, "Expr": 4, "If": 1, "Try": 1} | 12 | 40 | 12 | ["Path", "os.path.normpath", "absolute", "Path", "get_queue", "Exception", "download.OutputDownloader", "job_output_downloader.set_root_path", "str", "job_output_downloader.download_job_output", "download_path.exists", "shutil.rmtree"] | 0 | [] | The function (test_download_outputs_no_outputs_dir) is defined within the public class called public. It starts at line 1507 and ends at line 1546, contains 28 lines of code, and has a cyclomatic complexity of 3. It takes 2 parameters and does not return any value. It makes 12 function calls: ["Path", "os.path.normpath", "absolute", "Path", "get_queue", "Exception", "download.OutputDownloader", "job_output_downloader.set_root_path", "str", "job_output_downloader.download_job_output", "download_path.exists", "shutil.rmtree"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_download_outputs_windows_long_file_path | def test_download_outputs_windows_long_file_path(job_attachment_test: JobAttachmentTest, sync_outputs: SyncOutputsOutput, tmp_path: WindowsPath):"""Test that when trying to download outputs to a file path thatlonger than 260 chars in Windows, the download is successful."""tmp_path_len: int = len(str(tmp_path))long_root_path_remaining_length: int = WINDOWS_MAX_PATH_LENGTH - tmp_path_len - 20long_root_path: str = os.path.join(tmp_path,*["path"]* math.floor(long_root_path_remaining_length / 5),# Create a temp path that barely does not exceed the windows path limit)os.makedirs(long_root_path)assert len(long_root_path) <= WINDOWS_MAX_PATH_LENGTH - 10job_attachment_settings = get_queue(farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,deadline_endpoint_url=job_attachment_test.deadline_endpoint,).jobAttachmentSettingsif job_attachment_settings is None:raise Exception("Job attachment settings must be set for this test.")job_output_downloader = download.OutputDownloader(s3_settings=job_attachment_settings,farm_id=job_attachment_test.farm_id,queue_id=job_attachment_test.queue_id,job_id=sync_outputs.job_id,step_id=sync_outputs.step0_id,task_id=sync_outputs.step0_task0_id,)job_output_downloader.set_root_path(str(job_attachment_test.ASSET_ROOT), str(long_root_path))# WHENtry:job_output_downloader.download_job_output()# THEN# The output file should be downloaded to the current directory# Prepend \\?\ when checking the file exists, otherwise Python will not find itoutput_file_path = Path(WINDOWS_UNC_PATH_STRING_PREFIX + long_root_path, sync_outputs.step0_task0_output_file)assert output_file_path.exists()assert len(str(output_file_path)) > 260, (f"Expected full output file path to be over the windows path length limit of {WINDOWS_MAX_PATH_LENGTH}, got {len(str(output_file_path))}")finally:shutil.rmtree(WINDOWS_UNC_PATH_STRING_PREFIX + long_root_path) | 3 | 
41 | 3 | 220 | 3 | 1,554 | 1,608 | 1,554 | job_attachment_test,sync_outputs,tmp_path | ['job_output_downloader', 'output_file_path', 'job_attachment_settings'] | None | {"AnnAssign": 3, "Assign": 3, "Expr": 5, "If": 1, "Try": 1} | 21 | 55 | 21 | ["len", "str", "os.path.join", "math.floor", "os.makedirs", "len", "get_queue", "Exception", "download.OutputDownloader", "job_output_downloader.set_root_path", "str", "str", "job_output_downloader.download_job_output", "Path", "output_file_path.exists", "len", "str", "len", "str", "shutil.rmtree", "pytest.mark.skipif"] | 0 | [] | The function (test_download_outputs_windows_long_file_path) is defined within the public class called public. It starts at line 1554 and ends at line 1608, contains 41 lines of code, and has a cyclomatic complexity of 3. It takes 3 parameters and does not return any value. It makes 21 function calls: ["len", "str", "os.path.join", "math.floor", "os.makedirs", "len", "get_queue", "Exception", "download.OutputDownloader", "job_output_downloader.set_root_path", "str", "str", "job_output_downloader.download_job_output", "Path", "output_file_path.exists", "len", "str", "len", "str", "shutil.rmtree", "pytest.mark.skipif"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_boto_session | def get_boto_session():"""Fixture to establish authenticated session for MCP integration tests."""session = boto3.Session()sts_client = session.client("sts")_ = sts_client.get_caller_identity()return session | 1 | 5 | 0 | 29 | 3 | 24 | 31 | 24 |  | ['sts_client', '_', 'session'] | Returns | {"Assign": 3, "Expr": 1, "Return": 1} | 4 | 8 | 4 | ["boto3.Session", "session.client", "sts_client.get_caller_identity", "pytest.fixture"] | 0 | [] | The function (get_boto_session) is defined within the public class called public. It starts at line 24 and ends at line 31, contains 5 lines of code, and has a cyclomatic complexity of 1. It takes no parameters and returns a value. It makes 4 function calls: ["boto3.Session", "session.client", "sts_client.get_caller_identity", "pytest.fixture"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_mcp_server_integration | async def test_mcp_server_integration(get_boto_session):"""Test that MCP server starts, registers tools, and handles tool calls."""with patch("deadline.client.api._session.get_boto3_session") as mock_session:mock_session.return_value = get_boto_sessionapp = server.app# 1. Check if all tools in TOOL_REGISTRY are availabletools_result = await app.list_tools()tools = tools_result if isinstance(tools_result, list) else tools_result.toolsavailable_tool_names = [tool.name for tool in tools]expected_tools = [f"deadline_{tool_name}" for tool_name in TOOL_REGISTRY.keys()]missing_tools = [tool for tool in expected_tools if tool not in available_tool_names]assert not missing_tools, f"Missing expected tools from TOOL_REGISTRY: {missing_tools}"# 2. Call deadline_check_authentication_status tool and verify responseauth_result = await app.call_tool("deadline_check_authentication_status", {})assert auth_result is not None, "Authentication tool call should return a result"if auth_result and hasattr(auth_result[0], "text"):# type: ignore[index]try:auth_data = json.loads(auth_result[0].text)# type: ignore[index]assert "error" not in auth_data, (f"Authentication check failed with error: {auth_data.get('error')}")except json.JSONDecodeError as e:pytest.fail(f"Check authentication tool returned non-JSON response: {auth_result[0].text[:200]}... (JSONDecodeError: {e})"# type: ignore[index])# 3. 
Call deadline_list_farms tool and verify responsefarms_result = await app.call_tool("deadline_list_farms", {})assert farms_result is not None, "List farms tool call should return a result"if farms_result and hasattr(farms_result[0], "text"):# type: ignore[index]try:farms_data = json.loads(farms_result[0].text)# type: ignore[index]assert "error" not in farms_data, (f"List farms failed with error: {farms_data.get('error')}")except json.JSONDecodeError as e:pytest.fail(f"List farms returned non-JSON response: {farms_result[0].text[:200]}... (JSONDecodeError: {e})"# type: ignore[index])# 4. Check metadata on deadline_list_queues toollist_queues_tool = Nonefor tool in tools:if tool.name == "deadline_list_queues":list_queues_tool = toolbreakassert list_queues_tool is not None, "deadline_list_queues tool not found"expected_params = get_tool_definition("list_queues")["param_names"]assert expected_params is not None, ("list_queues should have parameters defined in TOOL_REGISTRY")assert hasattr(list_queues_tool, "inputSchema"), ("deadline_list_queues tool should have an inputSchema")input_schema = list_queues_tool.inputSchemaassert input_schema is not None, "inputSchema should not be None"available_params = []if isinstance(input_schema, dict) and "properties" in input_schema:available_params = list(input_schema["properties"].keys())elif hasattr(input_schema, "properties") and input_schema.properties:available_params = list(input_schema.properties.keys())elif hasattr(input_schema, "model_fields"):available_params = list(input_schema.model_fields.keys())assert "farmId" in available_params, (f"farmId parameter should be available in list_queues tool. "f"Available params: {available_params}, Expected params: {expected_params}")missing_params = [param for param in expected_params if param not in available_params]assert not missing_params, (f"Missing expected parameters in list_queues tool: {missing_params}. 
"f"Available: {available_params}, Expected: {expected_params}")return True | 21 | 66 | 1 | 410 | 15 | 36 | 120 | 36 | get_boto_session | ['list_queues_tool', 'farms_data', 'farms_result', 'available_params', 'auth_result', 'auth_data', 'app', 'input_schema', 'expected_params', 'tools', 'missing_params', 'missing_tools', 'tools_result', 'available_tool_names', 'expected_tools'] | Returns | {"Assign": 20, "Expr": 3, "For": 1, "If": 6, "Return": 1, "Try": 2, "With": 1} | 25 | 85 | 25 | ["patch", "app.list_tools", "isinstance", "TOOL_REGISTRY.keys", "app.call_tool", "hasattr", "json.loads", "auth_data.get", "pytest.fail", "app.call_tool", "hasattr", "json.loads", "farms_data.get", "pytest.fail", "get_tool_definition", "hasattr", "isinstance", "list", "keys", "hasattr", "list", "input_schema.properties.keys", "hasattr", "list", "input_schema.model_fields.keys"] | 0 | [] | The function (test_mcp_server_integration) is defined within the public class called public. It starts at line 36 and ends at line 120, contains 66 lines of code, and has a cyclomatic complexity of 21. It takes 1 parameter and returns a value. It makes 25 function calls: ["patch", "app.list_tools", "isinstance", "TOOL_REGISTRY.keys", "app.call_tool", "hasattr", "json.loads", "auth_data.get", "pytest.fail", "app.call_tool", "hasattr", "json.loads", "farms_data.get", "pytest.fail", "get_tool_definition", "hasattr", "isinstance", "list", "keys", "hasattr", "list", "input_schema.properties.keys", "hasattr", "list", "input_schema.model_fields.keys"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_tool_description_extraction | async def test_tool_description_extraction(get_boto_session):"""Test that tool descriptions are properly extracted from original function docstrings."""app = server.apptools_result = await app.list_tools()tools = tools_result if isinstance(tools_result, list) else tools_result.toolslist_farms_tool = Nonefor tool in tools:if tool.name == "deadline_list_farms":list_farms_tool = toolbreakassert list_farms_tool is not None, "deadline_list_farms tool not found"original_docstring = list_farms.__doc__assert original_docstring is not None, "Original docstring should be available"assert list_farms_tool.description == original_docstring, (f"Tool description should match original docstring.\n"f"Expected: {original_docstring}\n"f"Got: {list_farms_tool.description}")assert "Calls the deadline:ListFarms API call" in list_farms_tool.description, ("Tool description should come from the function's doc string")return True | 4 | 21 | 1 | 98 | 5 | 125 | 153 | 125 | get_boto_session | ['original_docstring', 'app', 'list_farms_tool', 'tools', 'tools_result'] | Returns | {"Assign": 6, "Expr": 1, "For": 1, "If": 1, "Return": 1} | 2 | 29 | 2 | ["app.list_tools", "isinstance"] | 0 | [] | The function (test_tool_description_extraction) is defined within the public class called public. It starts at line 125 and ends at line 153, contains 21 lines of code, and has a cyclomatic complexity of 4. It takes 1 parameter and returns a value. It makes 2 function calls: ["app.list_tools", "isinstance"]. |
aws-deadline_deadline-cloud | MCPClient | public | 0 | 0 | __init__ | def __init__(self, process):self.process = processself.request_id = 0 | 1 | 3 | 2 | 17 | 0 | 159 | 161 | 159 | self,process | [] | None | {"Assign": 2} | 0 | 3 | 0 | [] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) is defined within the public class called MCPClient. It starts at line 159 and ends at line 161, contains 3 lines of code, and has a cyclomatic complexity of 1. It takes 2 parameters and does not return any value. It has 4993 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | MCPClient | public | 0 | 0 | send_request | async def send_request(self, method, params=None):"""Send a JSON-RPC request to the MCP server."""self.request_id += 1request = {"jsonrpc": "2.0","id": self.request_id,"method": method,"params": params or {},}request_json = json.dumps(request) + "\n"self.process.stdin.write(request_json.encode())await self.process.stdin.drain()response_line = await self.process.stdout.readline()if not response_line:raise RuntimeError("No response from MCP server")return json.loads(response_line.decode().strip()) | 3 | 15 | 3 | 112 | 0 | 163 | 181 | 163 | self,method,params | [] | Returns | {"Assign": 3, "AugAssign": 1, "Expr": 3, "If": 1, "Return": 1} | 9 | 19 | 9 | ["json.dumps", "self.process.stdin.write", "request_json.encode", "self.process.stdin.drain", "self.process.stdout.readline", "RuntimeError", "json.loads", "strip", "response_line.decode"] | 0 | [] | The function (send_request) is defined within the public class called MCPClient. It starts at line 163 and ends at line 181, contains 15 lines of code, and has a cyclomatic complexity of 3. It takes 3 parameters and returns a value. It makes 9 function calls: ["json.dumps", "self.process.stdin.write", "request_json.encode", "self.process.stdin.drain", "self.process.stdout.readline", "RuntimeError", "json.loads", "strip", "response_line.decode"]. |
aws-deadline_deadline-cloud | MCPClient | public | 0 | 0 | initialize | async def initialize(self):"""Initialize the MCP session."""response = await self.send_request("initialize",{"protocolVersion": "2024-11-05","capabilities": {"tools": {}},"clientInfo": {"name": "test-client", "version": "1.0.0"},},)if "error" in response:raise RuntimeError(f"Initialize failed: {response['error']}")# Send initialized notification (no response expected)await self.send_notification("notifications/initialized")return response | 2 | 13 | 1 | 64 | 0 | 183 | 199 | 183 | self | [] | Returns | {"Assign": 1, "Expr": 2, "If": 1, "Return": 1} | 3 | 17 | 3 | ["self.send_request", "RuntimeError", "self.send_notification"] | 89 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.Base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.garbage_check.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.pretests.docker_test_images.docker_test_images_py.docker_test_images.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.attach_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.sig_proxy_off_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.simple_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildBase.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildSubSubtest.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.expose_py.expose.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.commit.commit_py.commit_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.CpBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.cp_symlink.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.every_last.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.simple.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.volume_mount.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_py.create_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_remote_tag_py.create_remote_tag.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_signal_py.create_signal.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_tmpfs_py.create_tmpfs.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.deferred_deletion.deferred_deletion_py.deferred_deletion.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.diff.diff_py.diff_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.dockerhelp.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_class_factory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerimport.empty_py.empty.initialize"] | The function (initialize) is defined within the public class called MCPClient. It starts at line 183 and ends at line 199, contains 13 lines of code, and has a cyclomatic complexity of 2. It takes 1 parameter and returns a value. It makes 3 function calls: ["self.send_request", "RuntimeError", "self.send_notification"]. It has 89 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.Base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.garbage_check.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.pretests.docker_test_images.docker_test_images_py.docker_test_images.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.attach_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.sig_proxy_off_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.simple_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildSubSubtest.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.expose_py.expose.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.commit.commit_py.commit_base.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.CpBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.cp_symlink.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.every_last.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.simple.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.volume_mount.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_py.create_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_remote_tag_py.create_remote_tag.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_signal_py.create_signal.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_tmpfs_py.create_tmpfs.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.deferred_deletion.deferred_deletion_py.deferred_deletion.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.diff.diff_py.diff_base.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.dockerhelp.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_class_factory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerimport.empty_py.empty.initialize"]. |
aws-deadline_deadline-cloud | MCPClient | public | 0 | 0 | send_notification | async def send_notification(self, method, params=None):"""Send a JSON-RPC notification (no response expected)."""request = {"jsonrpc": "2.0", "method": method, "params": params or {}}request_json = json.dumps(request) + "\n"self.process.stdin.write(request_json.encode())await self.process.stdin.drain() | 2 | 5 | 3 | 64 | 0 | 201 | 207 | 201 | self,method,params | [] | None | {"Assign": 2, "Expr": 3} | 4 | 7 | 4 | ["json.dumps", "self.process.stdin.write", "request_json.encode", "self.process.stdin.drain"] | 0 | [] | The function (send_notification) is defined within the public class called MCPClient. It starts at line 201 and ends at line 207, contains 5 lines of code, and has a cyclomatic complexity of 2. It takes 3 parameters and does not return any value. It makes 4 function calls: ["json.dumps", "self.process.stdin.write", "request_json.encode", "self.process.stdin.drain"]. |
aws-deadline_deadline-cloud | MCPClient | public | 0 | 0 | list_tools | async def list_tools(self):"""List available tools."""response = await self.send_request("tools/list", {})if "error" in response:raise RuntimeError(f"List tools failed: {response['error']}")return response["result"]["tools"] | 2 | 5 | 1 | 37 | 0 | 209 | 214 | 209 | self | [] | Returns | {"Assign": 1, "Expr": 1, "If": 1, "Return": 1} | 2 | 6 | 2 | ["self.send_request", "RuntimeError"] | 0 | [] | The function (list_tools) is defined within the public class called MCPClient. The function starts at line 209 and ends at line 214. It contains 5 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter, and it returns a value. It declares 2 functions, and it has 2 functions called inside, which are ["self.send_request", "RuntimeError"]. |
aws-deadline_deadline-cloud | MCPClient | public | 0 | 0 | call_tool | async def call_tool(self, name, arguments=None):"""Call a tool."""response = await self.send_request("tools/call", {"name": name, "arguments": arguments or {}})if "error" in response:raise RuntimeError(f"Tool call failed: {response['error']}")return response["result"] | 3 | 7 | 3 | 50 | 0 | 216 | 223 | 216 | self,name,arguments | [] | Returns | {"Assign": 1, "Expr": 1, "If": 1, "Return": 1} | 2 | 8 | 2 | ["self.send_request", "RuntimeError"] | 0 | [] | The function (call_tool) is defined within the public class called MCPClient. The function starts at line 216 and ends at line 223. It contains 7 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters, and it returns a value. It declares 2 functions, and it has 2 functions called inside, which are ["self.send_request", "RuntimeError"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_mcp_server_process_integration | async def test_mcp_server_process_integration(get_boto_session):"""Test MCP server as a separate process via stdio communication."""with patch("deadline.client.api._session.get_boto3_session") as mock_session:mock_session.return_value = get_boto_sessionprocess = await asyncio.create_subprocess_exec(sys.executable,"-c","from deadline._mcp.server import main; main()",stdin=asyncio.subprocess.PIPE,stdout=asyncio.subprocess.PIPE,)try:client = MCPClient(process)init_response = await client.initialize()assert "result" in init_response, "Initialize should return a result"assert "capabilities" in init_response["result"], "Server should return capabilities"tools = await client.list_tools()assert len(tools) > 0, "Server should have tools available"tool_names = [tool["name"] for tool in tools]expected_tools = [f"deadline_{tool_name}" for tool_name in TOOL_REGISTRY.keys()]missing_tools = [tool for tool in expected_tools if tool not in tool_names]assert not missing_tools, f"Missing expected tools: {missing_tools}"farms_result = await client.call_tool("deadline_list_farms")assert "content" in farms_result, "List farms should return content"assert len(farms_result["content"]) > 0, "List farms should return non-empty content"farms_content = farms_result["content"][0]["text"]try:farms_data = json.loads(farms_content)if "error" in farms_data:print(f"List farms returned error (may be expected): {farms_data['error']}")except json.JSONDecodeError:pytest.fail(f"List farms tool returned invalid JSON: {farms_content[:200]}...")return Truefinally:if process and process.returncode is None:process.terminate()try:await asyncio.wait_for(process.wait(), timeout=5.0)except asyncio.TimeoutError:process.kill()await process.wait() | 11 | 40 | 1 | 269 | 10 | 228 | 276 | 228 | get_boto_session | ['client', 'farms_result', 'farms_data', 'init_response', 'tool_names', 'expected_tools', 'farms_content', 
'tools', 'missing_tools', 'process'] | Returns | {"Assign": 11, "Expr": 7, "If": 2, "Return": 1, "Try": 3, "With": 1} | 17 | 49 | 17 | ["patch", "asyncio.create_subprocess_exec", "MCPClient", "client.initialize", "client.list_tools", "len", "TOOL_REGISTRY.keys", "client.call_tool", "len", "json.loads", "print", "pytest.fail", "process.terminate", "asyncio.wait_for", "process.wait", "process.kill", "process.wait"] | 0 | [] | The function (test_mcp_server_process_integration) is defined within the public class called public. The function starts at line 228 and ends at line 276. It contains 40 lines of code and has a cyclomatic complexity of 11. It takes 1 parameter, and it returns a value. It declares 17 functions, and it has 17 functions called inside, which are ["patch", "asyncio.create_subprocess_exec", "MCPClient", "client.initialize", "client.list_tools", "len", "TOOL_REGISTRY.keys", "client.call_tool", "len", "json.loads", "print", "pytest.fail", "process.terminate", "asyncio.wait_for", "process.wait", "process.kill", "process.wait"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_mcp_tool_telemetry | async def test_mcp_tool_telemetry(get_boto_session):"""Test that MCP tools record telemetry events."""with patch("deadline.client.api._session.get_boto3_session") as mock_session:mock_session.return_value = get_boto_sessionmock_telemetry_client = MagicMock()with patch("deadline._mcp.utils.get_deadline_cloud_library_telemetry_client") as mock_get_client:mock_get_client.return_value = mock_telemetry_clientapp = server.appauth_result = await app.call_tool("deadline_check_authentication_status", {})assert auth_result is not None, "Authentication tool call should return a result"assert mock_telemetry_client.record_event.call_count >= 2, ("Should record at least 2 telemetry events (latency and usage)")recorded_calls = mock_telemetry_client.record_event.call_args_listevent_types = [call[1]["event_type"] for call in recorded_calls]assert "com.amazon.rum.deadline.mcp.latency" in event_types, ("Should record latency telemetry event")assert "com.amazon.rum.deadline.mcp.usage" in event_types, ("Should record usage telemetry event")latency_events = [callfor call in recorded_callsif call[1]["event_type"] == "com.amazon.rum.deadline.mcp.latency"]usage_events = [callfor call in recorded_callsif call[1]["event_type"] == "com.amazon.rum.deadline.mcp.usage"]assert len(latency_events) >= 1, "Should have at least one latency event"assert len(usage_events) >= 1, "Should have at least one usage event"# Assert latency eventlatency_event_details = latency_events[0][1]["event_details"]expected_latency_keys = {"latency", "tool_name", "usage_mode"}assert set(latency_event_details.keys()) == expected_latency_keys, (f"Latency event should contain exactly {expected_latency_keys}, "f"but got {set(latency_event_details.keys())}")assert isinstance(latency_event_details["latency"], int), "Latency should be an integer"assert latency_event_details["tool_name"] == "check_authentication_status"assert 
latency_event_details["usage_mode"] == "MCP"# Assert usage eventusage_event_details = usage_events[0][1]["event_details"]expected_usage_keys = {"tool_name", "is_success", "error_type", "usage_mode"}assert set(usage_event_details.keys()) == expected_usage_keys, (f"Usage event should contain exactly {expected_usage_keys}, "f"but got {set(usage_event_details.keys())}")assert usage_event_details["tool_name"] == "check_authentication_status"assert isinstance(usage_event_details["is_success"], bool), ("is_success should be a boolean")assert usage_event_details["usage_mode"] == "MCP"assert usage_event_details["error_type"] is None, ("error_type should be None for successful calls")return True | 6 | 58 | 1 | 314 | 11 | 281 | 351 | 281 | get_boto_session | ['latency_events', 'expected_usage_keys', 'usage_event_details', 'event_types', 'auth_result', 'app', 'latency_event_details', 'usage_events', 'expected_latency_keys', 'mock_telemetry_client', 'recorded_calls'] | Returns | {"Assign": 13, "Expr": 1, "Return": 1, "With": 2} | 16 | 71 | 16 | ["patch", "MagicMock", "patch", "app.call_tool", "len", "len", "set", "latency_event_details.keys", "set", "latency_event_details.keys", "isinstance", "set", "usage_event_details.keys", "set", "usage_event_details.keys", "isinstance"] | 0 | [] | The function (test_mcp_tool_telemetry) is defined within the public class called public. The function starts at line 281 and ends at line 351. It contains 58 lines of code and has a cyclomatic complexity of 6. It takes 1 parameter, and it returns a value. It declares 16 functions, and it has 16 functions called inside, which are ["patch", "MagicMock", "patch", "app.call_tool", "len", "len", "set", "latency_event_details.keys", "set", "latency_event_details.keys", "isinstance", "set", "usage_event_details.keys", "set", "usage_event_details.keys", "isinstance"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_mcp_server_startup_telemetry | def test_mcp_server_startup_telemetry():"""Test that MCP server startup records telemetry."""mock_telemetry_client = MagicMock()with patch("deadline.client.cli._mcp_server.get_deadline_cloud_library_telemetry_client") as mock_get_client, patch("deadline._mcp.server.main") as mock_mcp_main:mock_get_client.return_value = mock_telemetry_clientrunner = CliRunner()result = runner.invoke(cli_mcp_server, [])assert result.exit_code == 0, f"Command failed with output: {result.output}"mock_telemetry_client.record_event.assert_called_once_with(event_type="com.amazon.rum.deadline.mcp.server_startup",event_details={"usage_mode": "MCP", "startup_method": "cli"},)mock_mcp_main.assert_called_once() | 1 | 14 | 0 | 83 | 3 | 355 | 375 | 355 | ['runner', 'result', 'mock_telemetry_client'] | None | {"Assign": 4, "Expr": 3, "With": 1} | 7 | 21 | 7 | ["MagicMock", "patch", "patch", "CliRunner", "runner.invoke", "mock_telemetry_client.record_event.assert_called_once_with", "mock_mcp_main.assert_called_once"] | 0 | [] | The function (test_mcp_server_startup_telemetry) is defined within the public class called public. The function starts at line 355 and ends at line 375. It contains 14 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 7 functions, and it has 7 functions called inside, which are ["MagicMock", "patch", "patch", "CliRunner", "runner.invoke", "mock_telemetry_client.record_event.assert_called_once_with", "mock_mcp_main.assert_called_once"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_mcp_tool_error_handling.parse_tool_result | def parse_tool_result(result):"""Helper to parse and validate tool result JSON."""assert result is not None, "Tool call should return a result"if result and hasattr(result[0], "text"):return json.loads(result[0].text)return {} | 3 | 5 | 1 | 41 | 0 | 383 | 388 | 383 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (test_mcp_tool_error_handling.parse_tool_result) is defined within the public class called public. The function starts at line 383 and ends at line 388. It contains 5 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter and does not return any value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | test_mcp_tool_error_handling | async def test_mcp_tool_error_handling():"""Test that MCP tools properly handle and return errors using real AWS service validation."""def parse_tool_result(result):"""Helper to parse and validate tool result JSON."""assert result is not None, "Tool call should return a result"if result and hasattr(result[0], "text"):return json.loads(result[0].text)return {}mock_session = MagicMock()deadline_client = boto3.client("deadline", region_name="us-west-2")mock_session.client.side_effect = lambda service, **kwargs: {"deadline": deadline_client}.get(service, MagicMock())with patch("deadline.client.api._session.get_boto3_session") as mock_get_session:mock_get_session.return_value = mock_sessionapp = server.app# Test 1: Invalid farm ID formatqueues_data = parse_tool_result(await app.call_tool("deadline_list_queues", {"farmId": "invalid-farm-id-format"}))assert "error" in queues_data, f"Expected AWS error, got: {queues_data}"assert "type" in queues_data, "Error response should contain 'type' field"# Check for specific AWS exception types that indicate authentication/authorization issuesexpected_error_types = ["AccessDeniedException","ValidationException","UnauthorizedException","ExpiredTokenException",]assert any(error_type in queues_data["type"] for error_type in expected_error_types), (f"Expected one of {expected_error_types}, got: {queues_data['type']}")assert "error occurred" in queues_data["error"].lower()print(f"✅ Test 1 - Got expected error type: {queues_data['type']}")# Test 2: Invalid queue ID formatjobs_data = parse_tool_result(await app.call_tool("deadline_list_jobs",{"farmId": "farm-1234567890abcdef1234567890abcdef",# Valid format but likely non-existent"queueId": "invalid-queue-format",# Invalid format will trigger validation error},))assert "error" in jobs_data, f"Expected AWS error, got: {jobs_data}"assert "type" in jobs_data, "Error response should contain 'type' 
field"assert any(error_type in jobs_data["type"] for error_type in expected_error_types), (f"Expected one of {expected_error_types}, got: {jobs_data['type']}")assert "error occurred" in jobs_data["error"].lower()print(f"✅ Test 2 - Got expected error type: {jobs_data['type']}")# Test 3: Non-existent IDsfleets_data = parse_tool_result(await app.call_tool("deadline_list_fleets",{"farmId": "farm-0000000000000000000000000000000"# Properly formatted but non-existent},))assert "error" in fleets_data, f"Expected AWS error, got: {fleets_data}"assert "type" in fleets_data, "Error response should contain 'type' field"resource_error_types = expected_error_types + ["ResourceNotFoundException","ForbiddenException",]assert any(error_type in fleets_data["type"] for error_type in resource_error_types), (f"Expected one of {resource_error_types}, got: {fleets_data['type']}") | 4 | 59 | 0 | 279 | 8 | 380 | 456 | 380 | ['resource_error_types', 'deadline_client', 'app', 'jobs_data', 'queues_data', 'fleets_data', 'mock_session', 'expected_error_types'] | Returns | {"Assign": 10, "Expr": 4, "If": 1, "Return": 2, "With": 1} | 20 | 77 | 20 | ["hasattr", "json.loads", "MagicMock", "boto3.client", "get", "MagicMock", "patch", "parse_tool_result", "app.call_tool", "any", "lower", "print", "parse_tool_result", "app.call_tool", "any", "lower", "print", "parse_tool_result", "app.call_tool", "any"] | 0 | [] | The function (test_mcp_tool_error_handling) defined within the public class called public.The function start at line 380 and ends at 456. It contains 59 lines of code and it has a cyclomatic complexity of 4. The function does not take any parameters, and this function return a value. 
It declares 20 functions, and it has 20 functions called inside, which are ["hasattr", "json.loads", "MagicMock", "boto3.client", "get", "MagicMock", "patch", "parse_tool_result", "app.call_tool", "any", "lower", "print", "parse_tool_result", "app.call_tool", "any", "lower", "print", "parse_tool_result", "app.call_tool", "any"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | launch_jobbundle_dir | def launch_jobbundle_dir():squish.startApplication("deadline bundle gui-submit --browse")test.log("Launched Choose Job Bundle Directory.") | 1 | 3 | 0 | 16 | 0 | 15 | 17 | 15 | [] | None | {"Expr": 2} | 2 | 3 | 2 | ["squish.startApplication", "test.log"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.squish.suite_deadline_gui.shared.scripts.choose_jobbundledir_helpers_py.detect_platform_and_launch_jobbundle_guisubmitter"] | The function (launch_jobbundle_dir) is defined within the public class called public. The function starts at line 15 and ends at line 17. It contains 3 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 2 functions, it has 2 functions called inside, which are ["squish.startApplication", "test.log"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.squish.suite_deadline_gui.shared.scripts.choose_jobbundledir_helpers_py.detect_platform_and_launch_jobbundle_guisubmitter"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | launch_jobbundle_dir_windows_only | def launch_jobbundle_dir_windows_only():squish.startApplication(f"python {config.windows_deadline_path_envvar} bundle gui-submit --browse")test.log("Launched Choose Job Bundle Directory on Windows.") | 1 | 5 | 0 | 17 | 0 | 21 | 25 | 21 | [] | None | {"Expr": 2} | 2 | 5 | 2 | ["squish.startApplication", "test.log"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.squish.suite_deadline_gui.shared.scripts.choose_jobbundledir_helpers_py.detect_platform_and_launch_jobbundle_guisubmitter"] | The function (launch_jobbundle_dir_windows_only) is defined within the public class called public. The function starts at line 21 and ends at line 25. It contains 5 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 2 functions, it has 2 functions called inside, which are ["squish.startApplication", "test.log"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.squish.suite_deadline_gui.shared.scripts.choose_jobbundledir_helpers_py.detect_platform_and_launch_jobbundle_guisubmitter"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | detect_platform_and_launch_jobbundle_guisubmitter | def detect_platform_and_launch_jobbundle_guisubmitter():if platform.system() == "Linux":test.log("Detected test running on Linux OS")test.log("Launching Choose Job Bundle GUI Submitter on Linux")launch_jobbundle_dir()elif platform.system() == "Windows":test.log("Detected test running on Windows OS")test.log("Launching Choose Job Bundle GUI Submitter on Windows")launch_jobbundle_dir_windows_only()elif platform.system() == "Darwin":test.log("Detected test running on macOS")test.log("Launching Choose Job Bundle GUI Submitter on macOS")launch_jobbundle_dir()else:test.log("Unknown operating system") | 4 | 15 | 0 | 84 | 0 | 29 | 43 | 29 | [] | None | {"Expr": 10, "If": 3} | 13 | 15 | 13 | ["platform.system", "test.log", "test.log", "launch_jobbundle_dir", "platform.system", "test.log", "test.log", "launch_jobbundle_dir_windows_only", "platform.system", "test.log", "test.log", "launch_jobbundle_dir", "test.log"] | 0 | [] | The function (detect_platform_and_launch_jobbundle_guisubmitter) is defined within the public class called public. The function starts at line 29 and ends at line 43. It contains 15 lines of code and has a cyclomatic complexity of 4. The function does not take any parameters and does not return any value. It declares 13 functions, and it has 13 functions called inside, which are ["platform.system", "test.log", "test.log", "launch_jobbundle_dir", "platform.system", "test.log", "test.log", "launch_jobbundle_dir_windows_only", "platform.system", "test.log", "test.log", "launch_jobbundle_dir", "test.log"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | select_jobbundle | def select_jobbundle(filepath: str):# enter job bundle directory file path in directory text inputsquish.type(squish.waitForObject(choose_jobbundledir_locators.jobbundle_filepath_input), filepath)test.log("Entered job bundle file path in Choose Job Bundle Directory.")# verify text input appearstest.compare(str(squish.waitForObjectExists(choose_jobbundledir_locators.jobbundle_filepath_input).displayText),filepath,"Expect job bundle file path to be input in dialogue.",)# hit 'choose' buttontest.log("Hitting 'Choose' button to open Submitter dialogue for selected job bundle.")squish.clickButton(squish.waitForObject(choose_jobbundledir_locators.choose_jobbundledir_button)) | 1 | 18 | 1 | 70 | 0 | 46 | 66 | 46 | filepath | [] | None | {"Expr": 5} | 9 | 21 | 9 | ["squish.type", "squish.waitForObject", "test.log", "test.compare", "str", "squish.waitForObjectExists", "test.log", "squish.clickButton", "squish.waitForObject"] | 0 | [] | The function (select_jobbundle) is defined within the public class called public. The function starts at line 46 and ends at line 66. It contains 18 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter and does not return any value. It declares 9 functions, and it has 9 functions called inside, which are ["squish.type", "squish.waitForObject", "test.log", "test.compare", "str", "squish.waitForObjectExists", "test.log", "squish.clickButton", "squish.waitForObject"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | load_different_job_bundle | def load_different_job_bundle():# click on job specific settings tab to navigate and ensure tests are on correct tabgui_submitter_helpers.navigate_job_specific_settings()# verify load different job bundle button exists and contains correct button texttest.compare(str(squish.waitForObjectExists(gui_submitter_locators.load_different_job_bundle_button).text),"Load a different job bundle","Expect Load a different job bundle button to contain correct text.",)# verify load a different job bundle button is enabledtest.compare(squish.waitForObjectExists(gui_submitter_locators.load_different_job_bundle_button).enabled,True,"Expect Load a different job bundle button to be enabled.",)# click on load a different job bundle buttontest.log("Hitting `Load a different job bundle` button.")squish.clickButton(squish.waitForObject(gui_submitter_locators.load_different_job_bundle_button))# verify Choose Job Bundle directory is opentest.compare(str(squish.waitForObjectExists(choose_jobbundledir_locators.choose_job_bundle_dir).windowTitle),"Choose job bundle directory","Expect Choose job bundle directory window title to be present.",)test.compare(squish.waitForObjectExists(choose_jobbundledir_locators.choose_job_bundle_dir).visible,True,"Expect Choose job bundle directory to be open.",) | 1 | 32 | 0 | 114 | 0 | 69 | 105 | 69 | [] | None | {"Expr": 7} | 14 | 37 | 14 | ["gui_submitter_helpers.navigate_job_specific_settings", "test.compare", "str", "squish.waitForObjectExists", "test.compare", "squish.waitForObjectExists", "test.log", "squish.clickButton", "squish.waitForObject", "test.compare", "str", "squish.waitForObjectExists", "test.compare", "squish.waitForObjectExists"] | 0 | [] | The function (load_different_job_bundle) defined within the public class called public.The function start at line 69 and ends at 105. It contains 32 lines of code and it has a cyclomatic complexity of 1. 
The function does not take any parameters and does not return any value. It declares 14 functions, and it has 14 functions called inside, which are ["gui_submitter_helpers.navigate_job_specific_settings", "test.compare", "str", "squish.waitForObjectExists", "test.compare", "squish.waitForObjectExists", "test.log", "squish.clickButton", "squish.waitForObject", "test.compare", "str", "squish.waitForObjectExists", "test.compare", "squish.waitForObjectExists"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | navigate_shared_job_settings | def navigate_shared_job_settings():# click on Shared job settings tabtest.log("Navigate to Shared job settings tab.")squish.clickTab(squish.waitForObject(gui_submitter_locators.shared_jobsettings_tab), "Shared job settings")# verify on shared job settings tabtest.compare(squish.waitForObjectExists(gui_submitter_locators.shared_jobsettings_properties_box).visible,True,"Expect user to be on Shared job settings tab.",) | 1 | 12 | 0 | 45 | 0 | 10 | 23 | 10 | [] | None | {"Expr": 3} | 5 | 14 | 5 | ["test.log", "squish.clickTab", "squish.waitForObject", "test.compare", "squish.waitForObjectExists"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.squish.suite_deadline_gui.shared.scripts.gui_submitter_helpers_py.verify_shared_job_settings"] | The function (navigate_shared_job_settings) is defined within the public class called public. The function starts at line 10 and ends at line 23. It contains 12 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 5 functions, it has 5 functions called inside, which are ["test.log", "squish.clickTab", "squish.waitForObject", "test.compare", "squish.waitForObjectExists"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.squish.suite_deadline_gui.shared.scripts.gui_submitter_helpers_py.verify_shared_job_settings"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | navigate_job_specific_settings | def navigate_job_specific_settings():# click on Job-specific settings tabtest.log("Navigate to Job-specific settings tab.")squish.clickTab(squish.waitForObject(gui_submitter_locators.job_specificsettings_tab),"Job-specific settings",)# verify on job specific settings tabtest.compare(squish.waitForObjectExists(gui_submitter_locators.job_specificsettings_properties).visible,True,"Expect user to be on Job-specific settings tab.",) | 1 | 11 | 0 | 46 | 0 | 26 | 38 | 26 | [] | None | {"Expr": 3} | 5 | 13 | 5 | ["test.log", "squish.clickTab", "squish.waitForObject", "test.compare", "squish.waitForObjectExists"] | 0 | [] | The function (navigate_job_specific_settings) is defined within the public class called public. The function starts at line 26 and ends at line 38. It contains 11 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 5 functions, and it has 5 functions called inside, which are ["test.log", "squish.clickTab", "squish.waitForObject", "test.compare", "squish.waitForObjectExists"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | verify_shared_job_settings | def verify_shared_job_settings(job_name: str,):# click on shared job settings tab to navigate and ensure tests are on correct tabnavigate_shared_job_settings()# verify job name is set correctlytest.compare(str(squish.waitForObjectExists(gui_submitter_locators.job_properties_name_input).displayText),job_name,"Expect correct job bundle job name to be displayed by default.",) | 1 | 11 | 1 | 34 | 0 | 41 | 53 | 41 | job_name | [] | None | {"Expr": 2} | 4 | 13 | 4 | ["navigate_shared_job_settings", "test.compare", "str", "squish.waitForObjectExists"] | 0 | [] | The function (verify_shared_job_settings) is defined within the public class called public. The function starts at line 41 and ends at line 53. It contains 11 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter and does not return any value. It declares 4 functions, and it has 4 functions called inside, which are ["navigate_shared_job_settings", "test.compare", "str", "squish.waitForObjectExists"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | deadlinecloud_farmname_locator | def deadlinecloud_farmname_locator(farm_name):return {"container": deadline_cloud_settings_widget,"text": farm_name,"type": "QLabel","unnamed": 1,"visible": 1,} | 1 | 8 | 1 | 28 | 0 | 242 | 249 | 242 | farm_name | [] | Returns | {"Return": 1} | 0 | 8 | 0 | [] | 0 | [] | The function (deadlinecloud_farmname_locator) is defined within the public class called public. The function starts at line 242 and ends at line 249. It contains 8 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter, and it returns a value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | deadlinecloud_queuename_locator | def deadlinecloud_queuename_locator(queue_name):return {"container": deadline_cloud_settings_widget,"text": queue_name,"type": "QLabel","unnamed": 1,"visible": 1,} | 1 | 8 | 1 | 28 | 0 | 252 | 259 | 252 | queue_name | [] | Returns | {"Return": 1} | 0 | 8 | 0 | [] | 0 | [] | The function (deadlinecloud_queuename_locator) is defined within the public class called public. The function starts at line 252 and ends at line 259. It contains 8 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter, and it returns a value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | open_job_hist_directory | def open_job_hist_directory():# hit '...' button to open Choose Job history directory file browsersquish.clickButton(squish.waitForObject(workstation_config_locators.open_job_hist_dir_button))test.log("Opened job history directory dialogue.")# verify job history directory dialogue is opentest.compare(str(squish.waitForObjectExists(workstation_config_locators.choosejobhistdir_filebrowser).windowTitle),"Choose Job history directory","Expect Choose Job history directory dialogue window title to be present.",)test.compare(squish.waitForObjectExists(workstation_config_locators.choosejobhistdir_filebrowser).visible,True,"Expect Choose Job history directory dialogue to be open.",)# hit 'choose' button to set job history directory and close file browsersquish.clickButton(squish.waitForObject(workstation_config_locators.choosejobhistdir_choose_button))test.log("Set job history and closed job history directory dialogue.") | 1 | 23 | 0 | 85 | 0 | 12 | 37 | 12 | [] | None | {"Expr": 6} | 11 | 26 | 11 | ["squish.clickButton", "squish.waitForObject", "test.log", "test.compare", "str", "squish.waitForObjectExists", "test.compare", "squish.waitForObjectExists", "squish.clickButton", "squish.waitForObject", "test.log"] | 0 | [] | The function (open_job_hist_directory) is defined within the public class called public. The function starts at line 12 and ends at line 37. It contains 23 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 11 functions, and it has 11 functions called inside, which are ["squish.clickButton", "squish.waitForObject", "test.log", "test.compare", "str", "squish.waitForObjectExists", "test.compare", "squish.waitForObjectExists", "squish.clickButton", "squish.waitForObject", "test.log"]. |
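The function_body cells above store source code with all line breaks stripped. As an illustration of what such a cell contains once restored to normal Python layout, here is a minimal sketch around the MCPClient.send_notification body from the table. The `_FakeStdin`/`_FakeProcess` scaffolding is an assumption added so the snippet runs standalone; in the original code the process is an asyncio subprocess with a real stdin pipe.

```python
import asyncio
import json


class _FakeStdin:
    """Stand-in for an asyncio subprocess stdin pipe (assumption for this demo)."""

    def __init__(self):
        self.buffer = b""

    def write(self, data):
        # Accumulate bytes instead of writing to a real pipe
        self.buffer += data

    async def drain(self):
        # No backpressure to wait for in the fake
        pass


class _FakeProcess:
    def __init__(self):
        self.stdin = _FakeStdin()


class MCPClient:
    """Sketch of the client row above; only send_notification is reproduced."""

    def __init__(self, process):
        self.process = process

    async def send_notification(self, method, params=None):
        """Send a JSON-RPC notification (no response expected)."""
        request = {"jsonrpc": "2.0", "method": method, "params": params or {}}
        request_json = json.dumps(request) + "\n"
        self.process.stdin.write(request_json.encode())
        await self.process.stdin.drain()


client = MCPClient(_FakeProcess())
asyncio.run(client.send_notification("notifications/initialized"))
sent = json.loads(client.process.stdin.buffer.decode())
```

Note the newline appended after each JSON object: the notification is framed as line-delimited JSON on the subprocess's stdin, which is why the corresponding row records calls to `json.dumps`, `self.process.stdin.write`, `request_json.encode`, and `self.process.stdin.drain`.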