project_name string | class_name string | class_modifiers string | class_implements int64 | class_extends int64 | function_name string | function_body string | cyclomatic_complexity int64 | NLOC int64 | num_parameter int64 | num_token int64 | num_variable int64 | start_line int64 | end_line int64 | function_index int64 | function_params string | function_variable string | function_return_type string | function_body_line_type string | function_num_functions int64 | function_num_lines int64 | outgoing_function_count int64 | outgoing_function_names string | incoming_function_count int64 | incoming_function_names string | lexical_representation string |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
aws-deadline_deadline-cloud | public | public | 0 | 0 | _save_debug_snapshot.sh_line | def sh_line(val: str = ""):sh_fh.write(val.encode("utf-8"))# type: ignoresh_fh.write(b"\n")# type: ignore | 1 | 3 | 1 | 27 | 0 | 332 | 334 | 332 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_save_debug_snapshot.sh_line) is defined within the public class called public. The function starts at line 332 and ends at 334. It contains 3 lines of code and has a cyclomatic complexity of 1. The function takes one parameter and does not return a value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _save_debug_snapshot.bat_line | def bat_line(val: str = ""):bat_fh.write(val.encode("utf-8"))# type: ignorebat_fh.write(b"\r\n")# type: ignore | 1 | 3 | 1 | 27 | 0 | 346 | 348 | 346 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_save_debug_snapshot.bat_line) is defined within the public class called public. The function starts at line 346 and ends at 348. It contains 3 lines of code and has a cyclomatic complexity of 1. The function takes one parameter and does not return a value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _save_debug_snapshot | def _save_debug_snapshot(debug_snapshot_dir: str,create_job_args: dict,asset_manager: S3AssetManager,queue: dict,storage_profile_id: str,storage_profile: Optional[StorageProfile],):# Save the full set of arguments for passing to the deadline.create_job APIwith open(os.path.join(debug_snapshot_dir, "create_job_args.json"), "w", encoding="utf-8") as fh:json.dump(create_job_args, fh, indent=1)# Add all the parameters, saving JSON values and multi-line strings in files.cli_args: list[tuple] = []for param_name, param_value in create_job_args.items():words = re.findall("(?:^|[A-Z])[a-z]+", param_name)kebab_name = "-".join(word.lower() for word in words)if isinstance(param_value, (dict, list)):param_file = f"{kebab_name}_param.json"with open(os.path.join(debug_snapshot_dir, param_file), "w", encoding="utf-8") as fh:json.dump(param_value, fh, indent=1)cli_args.append((f"--{kebab_name}", f"file://{param_file}"))elif isinstance(param_value, str) and "\n" in param_value:param_file = f"{kebab_name}_param.data"with open(os.path.join(debug_snapshot_dir, param_file), "w", encoding="utf-8") as fh:fh.write(param_value)cli_args.append((f"--{kebab_name}", f"file://{param_file}"))else:cli_args.append((f"--{kebab_name}", str(param_value)))def write_commands(write_line: Callable, continuation: str):if "attachments" in create_job_args:for subdir in ("Data", "Manifests"):write_line(f"aws s3 cp {continuation}")write_line(f"--recursive {continuation}")write_line(f"./{subdir} {continuation}")write_line(f"s3://{asset_manager.job_attachment_settings.s3BucketName}/{asset_manager.job_attachment_settings.rootPrefix}/{subdir}"# type: ignore)write_line()write_line(f"aws deadline create-job {continuation}")for param_opts in cli_args[:-1]:write_line(f"{shlex.join(param_opts)} {continuation}")write_line(f"{shlex.join(cli_args[-1])}")# Write a shell script that submits the job using AWS CLI commandswith 
open(os.path.join(debug_snapshot_dir, "submit_job.sh"), "wb") as sh_fh:def sh_line(val: str = ""):sh_fh.write(val.encode("utf-8"))# type: ignoresh_fh.write(b"\n")# type: ignoresh_line("#!/bin/sh")sh_line("# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.")sh_line("set -xeuo pipefail")sh_line('cd "$(dirname "$0")"')sh_line()write_commands(sh_line, "\\")# Write a batch file that submits the job using AWS CLI commandswith open(os.path.join(debug_snapshot_dir, "submit_job.bat"), "wb") as bat_fh:def bat_line(val: str = ""):bat_fh.write(val.encode("utf-8"))# type: ignorebat_fh.write(b"\r\n")# type: ignorebat_line("REM Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.")bat_line('cd /d "%~dp0"')bat_line()write_commands(bat_line, "^")# Write the queue and storage profile resourceswith open(os.path.join(debug_snapshot_dir, "queue.json"), "w") as fh:queue_with_str = {key: (value.isoformat() if isinstance(value, datetime) else value)for key, value in queue.items()if key != "ResponseMetadata"}json.dump(queue_with_str, fh, indent=1)if storage_profile_id and storage_profile is not None:with open(os.path.join(debug_snapshot_dir, "storage_profile.json"), "w") as fh:json.dump(storage_profile.to_dict(), fh, indent=1)return None | 11 | 54 | 6 | 446 | 4 | 282 | 366 | 282 | debug_snapshot_dir,create_job_args,asset_manager,queue,storage_profile_id,storage_profile | ['queue_with_str', 'kebab_name', 'words', 'param_file'] | Returns | {"AnnAssign": 1, "Assign": 5, "Expr": 30, "For": 3, "If": 4, "Return": 1, "With": 7} | 59 | 85 | 59 | ["open", "os.path.join", "json.dump", "create_job_args.items", "re.findall", "join", "word.lower", "isinstance", "open", "os.path.join", "json.dump", "cli_args.append", "isinstance", "open", "os.path.join", "fh.write", "cli_args.append", "cli_args.append", "str", "write_line", "write_line", "write_line", "write_line", "write_line", "write_line", "write_line", "shlex.join", "write_line", "shlex.join", "open", "os.path.join", 
"sh_fh.write", "val.encode", "sh_fh.write", "sh_line", "sh_line", "sh_line", "sh_line", "sh_line", "write_commands", "open", "os.path.join", "bat_fh.write", "val.encode", "bat_fh.write", "bat_line", "bat_line", "bat_line", "write_commands", "open", "os.path.join", "isinstance", "value.isoformat", "queue.items", "json.dump", "open", "os.path.join", "json.dump", "storage_profile.to_dict"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"] | The function (_save_debug_snapshot) defined within the public class called public.The function start at line 282 and ends at 366. It contains 54 lines of code and it has a cyclomatic complexity of 11. It takes 6 parameters, represented as [282.0], and this function return a value. It declares 59.0 functions, It has 59.0 functions called inside which are ["open", "os.path.join", "json.dump", "create_job_args.items", "re.findall", "join", "word.lower", "isinstance", "open", "os.path.join", "json.dump", "cli_args.append", "isinstance", "open", "os.path.join", "fh.write", "cli_args.append", "cli_args.append", "str", "write_line", "write_line", "write_line", "write_line", "write_line", "write_line", "write_line", "shlex.join", "write_line", "shlex.join", "open", "os.path.join", "sh_fh.write", "val.encode", "sh_fh.write", "sh_line", "sh_line", "sh_line", "sh_line", "sh_line", "write_commands", "open", "os.path.join", "bat_fh.write", "val.encode", "bat_fh.write", "bat_line", "bat_line", "bat_line", "write_commands", "open", "os.path.join", "isinstance", "value.isoformat", "queue.items", "json.dump", "open", "os.path.join", "json.dump", "storage_profile.to_dict"], It has 1.0 function calling this function which is 
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | create_job_from_job_bundle._default_create_job_result_callback | def _default_create_job_result_callback() -> bool:return True | 1 | 2 | 0 | 8 | 0 | 808 | 809 | 808 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (create_job_from_job_bundle._default_create_job_result_callback) is defined within the public class called public. The function starts at line 808 and ends at 809. It contains 2 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and returns a boolean value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | create_job_from_job_bundle | def create_job_from_job_bundle(job_bundle_dir: str,job_parameters: list[dict[str, Any]] = [],*,name: Optional[str] = None,queue_parameter_definitions: Optional[list[JobParameter]] = None,job_attachments_file_system: Optional[str] = None,config: Optional[ConfigParser] = None,priority: Optional[int] = None,max_failed_tasks_count: Optional[int] = None,max_retries_per_task: Optional[int] = None,max_worker_count: Optional[int] = None,target_task_run_status: Optional[str] = None,require_paths_exist: bool = False,submitter_name: Optional[str] = None,known_asset_paths: Collection[str] = [],debug_snapshot_dir: Optional[str] = None,from_gui: bool = False,print_function_callback: Callable[[str], None] = print,interactive_confirmation_callback: Optional[Callable[[str, bool], bool]] = None,hashing_progress_callback: Optional[Callable[[ProgressReportMetadata], bool]] = None,upload_progress_callback: Optional[Callable[[ProgressReportMetadata], bool]] = None,create_job_result_callback: Optional[Callable[[], bool]] = None,) -> Optional[str]:"""Creates a job in the farm/queue configured as default for the workstation from the job bundle in the provided directory.The return value is the submitted job id except when debug_snapshot_dir is provided. When creating a debug snapshot,no job is submitted.A job bundle has the following directory structure:/template.json|yaml (required): An Open Job Description job template that specifies the work to be done. Job parametersare embedded here./parameter_values.json|yaml (optional): If provided, these are parameter values for the job template and forthe render farm. AWS Deadline Cloud-specific parameters are like "deadline:priority".Looks like:{"parameterValues": [{"name": "<name>", "value": "<value>"},...]}/asset_references.json|yaml (optional): If provided, these are references to the input and output assetsof the job. 
Looks like:{"assetReferences": {"inputs": {"filenames": ["/mnt/path/to/file.txt",...],"directories": ["/mnt/path/to/directory",...],},"outputs": {"directories": ["/mnt/path/to/output_directory",...],}}}Args:job_bundle_dir (str): The directory containing the job bundle.job_parameters (List[Dict[str, Any]], optional): A list of job parameters in the following format:[{"name": "<name>", "value": "<value>"}, ...]name (str, optional): The name of the job to submit, replacing the name defined in the job bundle.queue_parameter_definitions (list[JobParameter], optional) A list of queue_parameters to useinstead of retrieving queue_parameters from the queue with get_queue_parameter_definitions.job_attachments_file_system (str, optional): define which file system to use;(valid values: "COPIED", "VIRTUAL") instead of using the value in the config file.config (ConfigParser, optional): The AWS Deadline Cloud configurationobject to use instead of the config file.priority (int, optional): explicit value for the priority of the job.max_failed_tasks_count (int, optional): explicit value for the maximum allowed failed tasks.max_retries_per_task (int, optional): explicit value for the maximum retries per task.max_worker_count (int, optional): explicit value for the max worker count of the job.target_task_run_status (str, optional): explicit value for the target task run status of the job.Valid values are "READY" or "SUSPENDED".require_paths_exist (bool, optional): Whether to require that all input paths exist.submitter_name (str, optional): Name of the application submitting the bundle.known_asset_paths (list[str], optional): A list of paths that should not generatewarnings when outside storage profile locations. 
Defaults to an empty list.debug_snapshot_dir (str, optional): EXPERIMENTAL - A directory in which to save a debug snapshot of the data and commandsneeded to exactly replicate the deadline:CreateJob service API call.print_function_callback (Callable str -> None, optional): Callback to print messages produced in this function.By default calls print(), Can be replaced by click.echo or a logging function of choice.interactive_confirmation_callback (Callable [str, bool] -> bool): Callback arguments are (confirmation_message, default_response).This function should present the provided prompt, using default_response as the default value to respond with if the userdoes not make an explicit choice, and return True if the user wants to continue, False to cancel.hashing_progress_callback / upload_progress_callback / create_job_result_callback (Callable -> bool):Callbacks periodically called while hashing / uploading / waiting for job creation. If returns false,the operation will be cancelled. If return true, the operation continues. Default behavior for eachis to not cancel the operation. 
hashing_progress_callback and upload_progress_callback both receiveProgressReport as a parameter, which can be used for projecting remaining time, as in done in the CLI."""if not submitter_name:submitter_name = "Custom"session_context["submitter-name"] = submitter_name# Ensure the job bundle doesn't contain files that resolve outside of the bundle directoryvalidate_directory_symlink_containment(job_bundle_dir)# Read in the job templatefile_contents, file_type = read_yaml_or_json(job_bundle_dir, "template", required=True)# If requested, substitute the job name in the templateif name is not None:template_obj = parse_yaml_or_json_content(file_contents, file_type, job_bundle_dir, "template")template_obj["name"] = nameif file_type == "YAML":file_contents = deadline_yaml_dump(template_obj)else:file_contents = json.dumps(template_obj)deadline = api.get_boto3_client("deadline", config=config)queue_id = get_setting("defaults.queue_id", config=config)farm_id = get_setting("defaults.farm_id", config=config)if job_attachments_file_system is None:job_attachments_file_system = get_setting("defaults.job_attachments_file_system", config=config)queue = deadline.get_queue(farmId=farm_id,queueId=queue_id,)if not debug_snapshot_dir:print_function_callback(f"Submitting to Queue: {queue['displayName']}\n")else:print_function_callback(f"Snapshotting submission to Queue: {queue['displayName']}\n")create_job_args: Dict[str, Any] = {"farmId": farm_id,"queueId": queue_id,"template": file_contents,"templateType": file_type,"priority": 50,}storage_profile_id = get_setting("settings.storage_profile_id", config=config)storage_profile = Noneif storage_profile_id:create_job_args["storageProfileId"] = storage_profile_idstorage_profile = api.get_storage_profile_for_queue(farm_id, queue_id, storage_profile_id, deadline)# The job parametersjob_bundle_parameters = read_job_bundle_parameters(job_bundle_dir)asset_references_obj = read_yaml_or_json_object(job_bundle_dir, "asset_references", 
required=False)asset_references = AssetReferences.from_dict(asset_references_obj)if queue_parameter_definitions is None:queue_parameter_definitions = api.get_queue_parameter_definitions(farmId=farm_id, queueId=queue_id)parameters = merge_queue_job_parameters(queue_id=queue_id,job_parameters=job_bundle_parameters,queue_parameters=queue_parameter_definitions,)apply_job_parameters(job_parameters,job_bundle_dir,parameters,asset_references,)app_parameters_formatted, job_parameters_formatted = split_parameter_args(parameters, job_bundle_dir)# Extend known_asset_paths with all paths that are treated as known. These are# paths provided explicitly by the call to submit the job bundle:# * Paths in the known_asset_paths parameter to this function call# * Paths contained inside the job bundle.# * Paths configured in the locally configured storage profile as LOCAL (not SHARED).# * Paths configured in the local config file settings.known_asset_paths# * Paths provided within the job_parameters parameter to this function call# Paths that are treated as unknown (unless in one of the above categories). 
These can be# absolute paths referencing anywhere in the file system, not explicitly provided by the call,# so require that they be marked as known in the local configuration file or the associated# Storage Profile in the AWS account:# * Paths provided in the job bundle via the parameter_values.json/.yaml file# * Paths provided in the job bundle via the asset_references.json/.yaml filesknown_asset_paths = list(known_asset_paths) + [os.path.abspath(job_bundle_dir)]# Add the configured storage profile pathsif storage_profile:known_asset_paths.extend([fsl.pathfor fsl in storage_profile.fileSystemLocationsif fsl.type == FileSystemLocationType.LOCAL])# Add the configured known asset pathsconfigured_known_asset_paths = config_file.get_setting("settings.known_asset_paths", config=config).strip()if configured_known_asset_paths:known_asset_paths.extend(configured_known_asset_paths.split(os.pathsep))# Use the parameter names from job_parameters, but the values from parameters. If a value was provided# in job_parameters, it has been applied into parameters and normalized as necessary.known_parameter_names = {job_param.get("name") for job_param in job_parameters}for job_param in parameters:if job_param.get("type") == "PATH" and job_param.get("name") in known_parameter_names:job_param_value = job_param.get("value")if job_param_value:if job_param.get("objectType") == "FILE":# If the job parameter is a file, use its directory as the known path. When collecting# outputs for upload, only that directory is used, not the file path.known_asset_paths.append(os.path.dirname(job_param_value))else:known_asset_paths.append(job_param_value)# Filter known_asset_paths to remove any paths that have another one as a prefix. 
This can# reduce the amount of processing needed later, and produces a shorter warning message when presenting# to users.known_asset_paths = _filter_redundant_known_paths(known_asset_paths)# Hash and upload job attachments if there are anyfiles_processed = Falseif asset_references and "jobAttachmentSettings" in queue:# Extend input_filenames with all the files in the input_directoriesmissing_directories: set[str] = set()for directory in asset_references.input_directories:if not os.path.isdir(directory):if require_paths_exist:missing_directories.add(directory)else:logger.warning(f"Input path '{directory}' does not exist. Adding to referenced paths.")asset_references.referenced_paths.add(directory)continueis_dir_empty = Truefor root, _, files in os.walk(directory):if not files:continueis_dir_empty = Falseasset_references.input_filenames.update(os.path.normpath(os.path.join(root, file)) for file in files)# Empty directories just become references since the current asset manifest spec# version cannot represent them.if is_dir_empty:logger.info(f"Input directory '{directory}' is empty. 
Adding to referenced paths.")asset_references.referenced_paths.add(directory)asset_references.input_directories.clear()if missing_directories:all_missing_directories = "\n\t".join(sorted(list(missing_directories)))misconfigured_directories_msg = ("Job submission contains misconfigured input directories and cannot be submitted."" All input directories must exist."f"\nNon-existent directories:\n\t{all_missing_directories}")raise MisconfiguredInputsError(misconfigured_directories_msg)queue_role_session = api.get_queue_user_boto3_session(deadline=deadline,config=config,farm_id=farm_id,queue_id=queue_id,queue_display_name=queue["displayName"],)asset_manager = S3AssetManager(farm_id=farm_id,queue_id=queue_id,job_attachment_settings=JobAttachmentS3Settings(**queue["jobAttachmentSettings"]),session=queue_role_session,)upload_group = asset_manager.prepare_paths_for_upload(input_paths=sorted(asset_references.input_filenames),output_paths=sorted(asset_references.output_directories),referenced_paths=sorted(asset_references.referenced_paths),storage_profile=storage_profile,require_paths_exist=require_paths_exist,)if upload_group.asset_groups:# Generate warning message if neededasset_path_message, default_prompt_response = _generate_message_for_asset_paths(upload_group, storage_profile, known_asset_paths)if interactive_confirmation_callback is None:# In this case, no user prompt can be presented. 
The result of the function must# be the default that would be presented to the interactive prompt.print_function_callback(asset_path_message)if not default_prompt_response:print_function_callback("\nJob submission canceled (user input not enabled).")raise DeadlineOperationCanceled()elif config_file.str2bool(get_setting("settings.auto_accept", config=config)):if not default_prompt_response:if from_gui:# In the from_gui case, we present a prompt even though settings.auto_accept is enabled.if not interactive_confirmation_callback(asset_path_message + "Do you wish to proceed?", default_prompt_response):print_function_callback("Job submission canceled (user input).")raise UserInitiatedCancel()else:# In this case, no user prompt should be presented. The result of the function must# be the default that would be presented to the interactive prompt.print_function_callback(f"{asset_path_message}\nJob submission canceled (settings.auto_accept enabled and there were unknown paths).")raise DeadlineOperationCanceled()else:print_function_callback(asset_path_message)else:if not interactive_confirmation_callback(asset_path_message + "\nDo you wish to proceed?", default_prompt_response):print_function_callback("Job submission canceled (user input).")raise UserInitiatedCancel()_, asset_manifests = _hash_attachments(asset_manager=asset_manager,asset_groups=upload_group.asset_groups,total_input_files=upload_group.total_input_files,total_input_bytes=upload_group.total_input_bytes,print_function_callback=print_function_callback,hashing_progress_callback=hashing_progress_callback,)if not debug_snapshot_dir:attachment_settings = _upload_attachments(# type: ignoreasset_manager,asset_manifests,print_function_callback,upload_progress_callback,from_gui=from_gui,)else:attachment_settings = _snapshot_attachments(# type: ignoredebug_snapshot_dir,asset_manager,asset_manifests,print_function_callback,upload_progress_callback,from_gui=from_gui,)attachment_settings["fileSystem"] = 
JobAttachmentsFileSystem(job_attachments_file_system)create_job_args["attachments"] = attachment_settingsfiles_processed = Trueif not files_processed:# Call each callback once indicating nothing to do.if hashing_progress_callback is not None:hashing_progress_callback(ProgressReportMetadata(status=ProgressStatus.PREPARING_IN_PROGRESS,progress=0,transferRate=0,progressMessage="No files to hash",))if upload_progress_callback is not None:upload_progress_callback(ProgressReportMetadata(status=ProgressStatus.UPLOAD_IN_PROGRESS,progress=0,transferRate=0,progressMessage="No files to upload",))create_job_args.update(app_parameters_formatted)if job_parameters_formatted:create_job_args["parameters"] = job_parameters_formattedif priority is not None:create_job_args["priority"] = priorityif max_worker_count is not None:create_job_args["maxWorkerCount"] = max_worker_countif max_failed_tasks_count is not None:create_job_args["maxFailedTasksCount"] = max_failed_tasks_countif max_retries_per_task is not None:create_job_args["maxRetriesPerTask"] = max_retries_per_taskif target_task_run_status is not None:create_job_args["targetTaskRunStatus"] = target_task_run_statusif logging.DEBUG >= logger.getEffectiveLevel():logger.debug(json.dumps(create_job_args, indent=1))api.get_deadline_cloud_library_telemetry_client().record_event(event_type="com.amazon.rum.deadline.submission",event_details={"submitter_name": submitter_name},from_gui=from_gui,)if debug_snapshot_dir:return _save_debug_snapshot(debug_snapshot_dir,create_job_args,asset_manager,queue,storage_profile_id,storage_profile,)create_job_response = deadline.create_job(**create_job_args)logger.debug("CreateJob Response %r", create_job_response)if create_job_response and "jobId" in create_job_response:job_id = create_job_response["jobId"]print_function_callback("Waiting for Job to be created...")# If using the default config, set the default job id so it holds the# most-recently submitted job.if config is 
None:set_setting("defaults.job_id", job_id)def _default_create_job_result_callback() -> bool:return Trueif not create_job_result_callback:create_job_result_callback = _default_create_job_result_callbacksuccess, status_message = wait_for_create_job_to_complete(farm_id,queue_id,job_id,deadline,create_job_result_callback,)api.get_deadline_cloud_library_telemetry_client().record_event(event_type="com.amazon.rum.deadline.create_job",event_details={"is_success": success},from_gui=from_gui,)if not success:raise DeadlineOperationError(status_message)print_function_callback("Submitted job bundle:")print_function_callback(f" {job_bundle_dir}")print_function_callback(status_message + f"\n{job_id}")return job_idelse:raise DeadlineOperationError("CreateJob response was empty, or did not contain a Job ID.") | 53 | 305 | 26 | 1,525 | 30 | 370 | 837 | 370 | job_bundle_dir,job_parameters,name,queue_parameter_definitions,job_attachments_file_system,config,priority,max_failed_tasks_count,max_retries_per_task,max_worker_count,target_task_run_status,require_paths_exist,submitter_name,known_asset_paths,debug_snapshot_dir,from_gui,print_function_callback,interactive_confirmation_callback,hashing_progress_callback,upload_progress_callback,create_job_result_callback | ['attachment_settings', 'asset_references_obj', 'storage_profile', 'farm_id', 'job_attachments_file_system', 'storage_profile_id', 'upload_group', 'file_contents', 'is_dir_empty', 'asset_manager', 'queue', 'queue_parameter_definitions', 'known_asset_paths', 'parameters', 'job_param_value', 'deadline', 'configured_known_asset_paths', 'misconfigured_directories_msg', 'all_missing_directories', 'template_obj', 'job_id', 'create_job_result_callback', 'submitter_name', 'job_bundle_parameters', 'asset_references', 'files_processed', 'known_parameter_names', 'create_job_response', 'queue_role_session', 'queue_id'] | Optional[str] | {"AnnAssign": 2, "Assign": 52, "Expr": 34, "For": 3, "If": 42, "Return": 3} | 103 | 468 | 103 | 
["validate_directory_symlink_containment", "read_yaml_or_json", "parse_yaml_or_json_content", "deadline_yaml_dump", "json.dumps", "api.get_boto3_client", "get_setting", "get_setting", "get_setting", "deadline.get_queue", "print_function_callback", "print_function_callback", "get_setting", "api.get_storage_profile_for_queue", "read_job_bundle_parameters", "read_yaml_or_json_object", "AssetReferences.from_dict", "api.get_queue_parameter_definitions", "merge_queue_job_parameters", "apply_job_parameters", "split_parameter_args", "list", "os.path.abspath", "known_asset_paths.extend", "strip", "config_file.get_setting", "known_asset_paths.extend", "configured_known_asset_paths.split", "job_param.get", "job_param.get", "job_param.get", "job_param.get", "job_param.get", "known_asset_paths.append", "os.path.dirname", "known_asset_paths.append", "_filter_redundant_known_paths", "set", "os.path.isdir", "missing_directories.add", "logger.warning", "asset_references.referenced_paths.add", "os.walk", "asset_references.input_filenames.update", "os.path.normpath", "os.path.join", "logger.info", "asset_references.referenced_paths.add", "asset_references.input_directories.clear", "join", "sorted", "list", "MisconfiguredInputsError", "api.get_queue_user_boto3_session", "S3AssetManager", "JobAttachmentS3Settings", "asset_manager.prepare_paths_for_upload", "sorted", "sorted", "sorted", "_generate_message_for_asset_paths", "print_function_callback", "print_function_callback", "DeadlineOperationCanceled", "config_file.str2bool", "get_setting", "interactive_confirmation_callback", "print_function_callback", "UserInitiatedCancel", "print_function_callback", "DeadlineOperationCanceled", "print_function_callback", "interactive_confirmation_callback", "print_function_callback", "UserInitiatedCancel", "_hash_attachments", "_upload_attachments", "_snapshot_attachments", "JobAttachmentsFileSystem", "hashing_progress_callback", "ProgressReportMetadata", "upload_progress_callback", 
"ProgressReportMetadata", "create_job_args.update", "logger.getEffectiveLevel", "logger.debug", "json.dumps", "record_event", "api.get_deadline_cloud_library_telemetry_client", "_save_debug_snapshot", "deadline.create_job", "logger.debug", "print_function_callback", "set_setting", "wait_for_create_job_to_complete", "record_event", "api.get_deadline_cloud_library_telemetry_client", "DeadlineOperationError", "print_function_callback", "print_function_callback", "print_function_callback", "DeadlineOperationError", "api.record_function_latency_telemetry_event"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline._mcp.tools.job_py.submit_job", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.job_templates_py.submit_job_bundle"] | The function (create_job_from_job_bundle) defined within the public class called public.The function start at line 370 and ends at 837. It contains 305 lines of code and it has a cyclomatic complexity of 53. It takes 26 parameters, represented as [370.0] and does not return any value. 
It declares 103 functions and has 103 functions called inside, which are ["validate_directory_symlink_containment", "read_yaml_or_json", "parse_yaml_or_json_content", "deadline_yaml_dump", "json.dumps", "api.get_boto3_client", "get_setting", "get_setting", "get_setting", "deadline.get_queue", "print_function_callback", "print_function_callback", "get_setting", "api.get_storage_profile_for_queue", "read_job_bundle_parameters", "read_yaml_or_json_object", "AssetReferences.from_dict", "api.get_queue_parameter_definitions", "merge_queue_job_parameters", "apply_job_parameters", "split_parameter_args", "list", "os.path.abspath", "known_asset_paths.extend", "strip", "config_file.get_setting", "known_asset_paths.extend", "configured_known_asset_paths.split", "job_param.get", "job_param.get", "job_param.get", "job_param.get", "job_param.get", "known_asset_paths.append", "os.path.dirname", "known_asset_paths.append", "_filter_redundant_known_paths", "set", "os.path.isdir", "missing_directories.add", "logger.warning", "asset_references.referenced_paths.add", "os.walk", "asset_references.input_filenames.update", "os.path.normpath", "os.path.join", "logger.info", "asset_references.referenced_paths.add", "asset_references.input_directories.clear", "join", "sorted", "list", "MisconfiguredInputsError", "api.get_queue_user_boto3_session", "S3AssetManager", "JobAttachmentS3Settings", "asset_manager.prepare_paths_for_upload", "sorted", "sorted", "sorted", "_generate_message_for_asset_paths", "print_function_callback", "print_function_callback", "DeadlineOperationCanceled", "config_file.str2bool", "get_setting", "interactive_confirmation_callback", "print_function_callback", "UserInitiatedCancel", "print_function_callback", "DeadlineOperationCanceled", "print_function_callback", "interactive_confirmation_callback", "print_function_callback", "UserInitiatedCancel", "_hash_attachments", "_upload_attachments", "_snapshot_attachments", "JobAttachmentsFileSystem", "hashing_progress_callback", "ProgressReportMetadata", "upload_progress_callback", "ProgressReportMetadata", "create_job_args.update", "logger.getEffectiveLevel", "logger.debug", "json.dumps", "record_event", "api.get_deadline_cloud_library_telemetry_client", "_save_debug_snapshot", "deadline.create_job", "logger.debug", "print_function_callback", "set_setting", "wait_for_create_job_to_complete", "record_event", "api.get_deadline_cloud_library_telemetry_client", "DeadlineOperationError", "print_function_callback", "print_function_callback", "print_function_callback", "DeadlineOperationError", "api.record_function_latency_telemetry_event"], It has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline._mcp.tools.job_py.submit_job", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.job_templates_py.submit_job_bundle"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | wait_for_create_job_to_complete | def wait_for_create_job_to_complete(farm_id: str,queue_id: str,job_id: str,deadline_client: BaseClient,continue_callback: Callable,) -> Tuple[bool, str]:"""Wait until a job exits the CREATE_IN_PROGRESS state."""initial_delay = 0.3max_delay = 5.0timeout_seconds = 300creating_statuses = {"CREATE_IN_PROGRESS",}failure_statuses = {"CREATE_FAILED"}start_time = time.time()delay = initial_delay# Initial wait before first attempttime.sleep(initial_delay)while time.time() - start_time < timeout_seconds:if not continue_callback():raise CreateJobWaiterCanceled()job = deadline_client.get_job(jobId=job_id, queueId=queue_id, farmId=farm_id)current_status = job["lifecycleStatus"] if "lifecycleStatus" in job else job["state"]if current_status in creating_statuses:time.sleep(delay)delay = min(delay * 2, max_delay)elif current_status in failure_statuses:return False, job["lifecycleStatusMessage"]else:return True, job["lifecycleStatusMessage"]raise TimeoutError(f"Timed out after {timeout_seconds} seconds while waiting for Job to be created: {job_id}") | 6 | 32 | 5 | 174 | 9 | 840 | 882 | 840 | farm_id,queue_id,job_id,deadline_client,continue_callback | ['creating_statuses', 'timeout_seconds', 'initial_delay', 'delay', 'start_time', 'failure_statuses', 'max_delay', 'job', 'current_status'] | Tuple[bool, str] | {"Assign": 10, "Expr": 3, "If": 3, "Return": 2, "While": 1} | 9 | 43 | 9 | ["time.time", "time.sleep", "time.time", "continue_callback", "CreateJobWaiterCanceled", "deadline_client.get_job", "time.sleep", "min", "TimeoutError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"] | The function (wait_for_create_job_to_complete) defined within the public class called public.The function start at line 840 and ends at 882. 
It contains 32 lines of code and has a cyclomatic complexity of 6. It takes 5 parameters (farm_id, queue_id, job_id, deadline_client, continue_callback) and returns a Tuple[bool, str]. It makes 9 function calls inside its body: ["time.time", "time.sleep", "time.time", "continue_callback", "CreateJobWaiterCanceled", "deadline_client.get_job", "time.sleep", "min", "TimeoutError"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"]. |
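The wait_for_create_job_to_complete row above describes a capped exponential-backoff polling loop (initial delay 0.3 s, doubling per attempt up to 5 s, 300 s timeout). A minimal sketch of that pattern, assuming made-up status strings and shortened delays for illustration:

```python
import time

def wait_with_backoff(poll, is_done, initial_delay=0.3, max_delay=5.0, timeout=300.0):
    """Poll `poll()` until `is_done(status)` is True, doubling the sleep
    between attempts up to `max_delay` (the shape of
    wait_for_create_job_to_complete, minus the cancel callback)."""
    delay = initial_delay
    start = time.time()
    time.sleep(initial_delay)  # initial wait before the first poll
    while time.time() - start < timeout:
        status = poll()
        if is_done(status):
            return status
        time.sleep(delay)
        delay = min(delay * 2, max_delay)  # exponential backoff, capped
    raise TimeoutError(f"Timed out after {timeout} seconds")

# Simulate a job that leaves CREATE_IN_PROGRESS after three polls.
statuses = iter(["CREATE_IN_PROGRESS", "CREATE_IN_PROGRESS", "CREATE_COMPLETE"])
result = wait_with_backoff(lambda: next(statuses),
                           lambda s: s != "CREATE_IN_PROGRESS",
                           initial_delay=0.01, max_delay=0.02, timeout=5.0)
```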
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_deadline_endpoint_url | def get_deadline_endpoint_url(config: Optional[ConfigParser] = None,) -> str:# Use boto3's built-in logic to get the correct endpoint URLclient = get_boto3_client("deadline", config=config)return client.meta.endpoint_url | 1 | 5 | 1 | 31 | 1 | 43 | 48 | 43 | config | ['client'] | str | {"Assign": 1, "Return": 1} | 1 | 6 | 1 | ["get_boto3_client"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.TelemetryClient.initialize"] | The function (get_deadline_endpoint_url) is defined within the public class called public. The function starts at line 43 and ends at 48. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 1 optional parameter (config) and returns a str. It makes 1 function call inside its body: ["get_boto3_client"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.TelemetryClient.initialize"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | __init__ | def __init__(self,package_name: str,package_ver: str,config: Optional[ConfigParser] = None,):self._initialized: bool = Falseself.package_name = package_nameself.package_ver = ".".join(package_ver.split(".")[:3])# IDs for this sessionself.session_id: str = str(uuid.uuid4())self.telemetry_id: str = self._get_telemetry_identifier(config=config)# If a different base package is provided, include info from this library as supplementary infoif package_name != "deadline-cloud-library":self._common_details["deadline-cloud-version"] = versionself._system_metadata = self._get_system_metadata(config=config)self.set_opt_out(config=config)self.initialize(config=config) | 2 | 16 | 4 | 123 | 0 | 93 | 111 | 93 | self,package_name,package_ver,config | [] | None | {"AnnAssign": 3, "Assign": 4, "Expr": 2, "If": 1} | 8 | 19 | 8 | ["join", "package_ver.split", "str", "uuid.uuid4", "self._get_telemetry_identifier", "self._get_system_metadata", "self.set_opt_out", "self.initialize"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called TelemetryClient.The function start at line 93 and ends at 111. It contains 16 lines of code and it has a cyclomatic complexity of 2. It takes 4 parameters, represented as [93.0] and does not return any value. 
It declares 8.0 functions, It has 8.0 functions called inside which are ["join", "package_ver.split", "str", "uuid.uuid4", "self._get_telemetry_identifier", "self._get_system_metadata", "self.set_opt_out", "self.initialize"], It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | set_opt_out | def set_opt_out(self, config: Optional[ConfigParser] = None) -> None:"""Checks whether telemetry has been opted out by checking the DEADLINE_CLOUD_TELEMETRY_OPT_OUTenvironment variable and the 'telemetry.opt_out' config file setting.Note the environment variable supersedes the config file setting."""env_var_value = os.environ.get("DEADLINE_CLOUD_TELEMETRY_OPT_OUT")if env_var_value:self.telemetry_opted_out = env_var_value in config_file._TRUE_VALUESelse:self.telemetry_opted_out = config_file.str2bool(config_file.get_setting("telemetry.opt_out", config=config))logger.info("Deadline Cloud telemetry is "+ ("not enabled." if self.telemetry_opted_out else "enabled.")) | 3 | 12 | 2 | 76 | 0 | 113 | 129 | 113 | self,config | [] | None | {"Assign": 3, "Expr": 2, "If": 1} | 4 | 17 | 4 | ["os.environ.get", "config_file.str2bool", "config_file.get_setting", "logger.info"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.ui.dialogs.deadline_config_dialog_py.DeadlineWorkstationConfigWidget.apply"] | The function (set_opt_out) is defined within the public class called TelemetryClient. The function starts at line 113 and ends at 129. It contains 12 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters (self, config) and returns None. It makes 4 function calls inside its body: ["os.environ.get", "config_file.str2bool", "config_file.get_setting", "logger.info"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.ui.dialogs.deadline_config_dialog_py.DeadlineWorkstationConfigWidget.apply"]. |
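The set_opt_out row encodes a precedence rule: a non-empty DEADLINE_CLOUD_TELEMETRY_OPT_OUT environment variable supersedes the telemetry.opt_out config setting. A standalone sketch of that rule; the _TRUE_VALUES set and the resolve_opt_out helper are assumptions for illustration, not the library's actual definitions:

```python
# Assumed truthy set; the real config_file._TRUE_VALUES may differ.
_TRUE_VALUES = {"1", "true", "True", "on", "yes"}

def resolve_opt_out(env, config_value):
    """Mirror set_opt_out's precedence: a non-empty
    DEADLINE_CLOUD_TELEMETRY_OPT_OUT env var wins over the config setting."""
    env_value = env.get("DEADLINE_CLOUD_TELEMETRY_OPT_OUT")
    if env_value:
        return env_value in _TRUE_VALUES
    return config_value in _TRUE_VALUES

# Env var set: config value "false" is ignored.
opted_out = resolve_opt_out({"DEADLINE_CLOUD_TELEMETRY_OPT_OUT": "true"}, "false")
# No env var: the config setting decides.
not_opted = resolve_opt_out({}, "false")
```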
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | initialize | def initialize(self, config: Optional[ConfigParser] = None) -> None:"""Starts up the telemetry background thread after getting settings from the boto3 client.Note that if this is called before boto3 is successfully configured / initialized,an error can be raised. In that case we silently fail and don't mark the client asinitialized."""if self.telemetry_opted_out:returntry:self.endpoint: str = self._get_prefixed_endpoint(f"{get_deadline_endpoint_url(config=config)}/2023-10-12/telemetry",TelemetryClient.ENDPOINT_PREFIX,)# Some environments might not have SSL, so we'll use the vendored botocore SSL contextfrom botocore.httpsession import create_urllib3_context, get_cert_pathself._urllib3_context = create_urllib3_context()self._urllib3_context.load_verify_locations(cafile=get_cert_path(True))user_id, _ = get_user_and_identity_store_id(config=config)if user_id:self._system_metadata["user_id"] = user_idmonitor_id: Optional[str] = get_monitor_id(config=config)if monitor_id:self._system_metadata["monitor_id"] = monitor_idself._initialized = Trueself._start_threads()except Exception:# Silently swallow any exceptionsreturn | 5 | 21 | 2 | 130 | 0 | 131 | 165 | 131 | self,config | [] | None | {"AnnAssign": 2, "Assign": 5, "Expr": 3, "If": 3, "Return": 2, "Try": 1} | 8 | 35 | 8 | ["self._get_prefixed_endpoint", "get_deadline_endpoint_url", "create_urllib3_context", "self._urllib3_context.load_verify_locations", "get_cert_path", "get_user_and_identity_store_id", "get_monitor_id", "self._start_threads"] | 89 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.Base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.garbage_check.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.pretests.docker_test_images.docker_test_images_py.docker_test_images.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.attach_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.sig_proxy_off_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.simple_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildSubSubtest.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.expose_py.expose.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.commit.commit_py.commit_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.CpBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.cp_symlink.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.every_last.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.simple.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.volume_mount.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_py.create_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_remote_tag_py.create_remote_tag.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_signal_py.create_signal.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_tmpfs_py.create_tmpfs.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.deferred_deletion.deferred_deletion_py.deferred_deletion.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.diff.diff_py.diff_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.dockerhelp.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_class_factory", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerimport.empty_py.empty.initialize"] | The function (initialize) defined within the public class called TelemetryClient.The function start at line 131 and ends at 165. It contains 21 lines of code and it has a cyclomatic complexity of 5. It takes 2 parameters, represented as [131.0] and does not return any value. It declares 8.0 functions, It has 8.0 functions called inside which are ["self._get_prefixed_endpoint", "get_deadline_endpoint_url", "create_urllib3_context", "self._urllib3_context.load_verify_locations", "get_cert_path", "get_user_and_identity_store_id", "get_monitor_id", "self._start_threads"], It has 89.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.Base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.garbage_check.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.pretests.docker_test_images.docker_test_images_py.docker_test_images.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.attach_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.sig_proxy_off_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.simple_base.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildSubSubtest.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.expose_py.expose.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.commit.commit_py.commit_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.CpBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.cp_symlink.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.every_last.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.simple.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.volume_mount.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_py.create_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_remote_tag_py.create_remote_tag.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_signal_py.create_signal.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_tmpfs_py.create_tmpfs.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.deferred_deletion.deferred_deletion_py.deferred_deletion.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.diff.diff_py.diff_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.dockerhelp.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_class_factory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerimport.empty_py.empty.initialize"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | is_initialized | def is_initialized(self) -> bool:return self._initialized | 1 | 2 | 1 | 11 | 0 | 168 | 169 | 168 | self | [] | bool | {"Return": 1} | 0 | 2 | 0 | [] | 0 | [] | The function (is_initialized) is defined within the public class called TelemetryClient. The function starts at line 168 and ends at 169. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a bool. It makes no function calls and has no recorded callers. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | _get_prefixed_endpoint | def _get_prefixed_endpoint(self, endpoint: str, prefix: str) -> str:"""Insert the prefix right after 'https://'"""if endpoint.startswith("https://"):prefixed_endpoint = endpoint[:8] + prefix + endpoint[8:]return prefixed_endpointreturn endpoint | 2 | 5 | 3 | 43 | 0 | 171 | 176 | 171 | self,endpoint,prefix | [] | str | {"Assign": 1, "Expr": 1, "If": 1, "Return": 2} | 1 | 6 | 1 | ["endpoint.startswith"] | 0 | [] | The function (_get_prefixed_endpoint) is defined within the public class called TelemetryClient. The function starts at line 171 and ends at 176. It contains 5 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters (self, endpoint, prefix) and returns a str. It makes 1 function call inside its body: ["endpoint.startswith"], and has no recorded callers. |
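The _get_prefixed_endpoint row splices a prefix immediately after the 'https://' scheme (the original's `endpoint[:8]` slice is `len("https://")`). A runnable sketch; the "management." prefix and hostname below are purely illustrative, since the real value of TelemetryClient.ENDPOINT_PREFIX is not shown in this data:

```python
def get_prefixed_endpoint(endpoint, prefix):
    """Insert `prefix` right after the 'https://' scheme, as
    _get_prefixed_endpoint does; other schemes pass through unchanged."""
    scheme = "https://"
    if endpoint.startswith(scheme):
        return scheme + prefix + endpoint[len(scheme):]
    return endpoint

# Hypothetical prefix and host, for illustration only.
prefixed = get_prefixed_endpoint("https://deadline.example.amazonaws.com", "management.")
untouched = get_prefixed_endpoint("http://deadline.example.amazonaws.com", "management.")
```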
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | _get_telemetry_identifier | def _get_telemetry_identifier(self, config: Optional[ConfigParser] = None):identifier = config_file.get_setting("telemetry.identifier", config=config)try:uuid.UUID(identifier, version=4)except ValueError:# Thrown if the user_id isn't in UUID4 formatidentifier = str(uuid.uuid4())config_file.set_setting("telemetry.identifier", identifier)return identifier | 2 | 8 | 2 | 61 | 0 | 178 | 185 | 178 | self,config | [] | Returns | {"Assign": 2, "Expr": 2, "Return": 1, "Try": 1} | 5 | 8 | 5 | ["config_file.get_setting", "uuid.UUID", "str", "uuid.uuid4", "config_file.set_setting"] | 0 | [] | The function (_get_telemetry_identifier) is defined within the public class called TelemetryClient. The function starts at line 178 and ends at 185. It contains 8 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (self, config) and returns a value. It makes 5 function calls inside its body: ["config_file.get_setting", "uuid.UUID", "str", "uuid.uuid4", "config_file.set_setting"], and has no recorded callers. |
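The _get_telemetry_identifier row keeps a stored identifier only if it parses as a UUID, minting a fresh uuid4 otherwise. A sketch of that validation step without the config-file persistence; ensure_uuid4 is a hypothetical helper name:

```python
import uuid

def ensure_uuid4(stored):
    """Return `stored` if it parses as a UUID, otherwise mint a fresh
    uuid4 (the _get_telemetry_identifier pattern, minus persistence).
    Note: `version=4` normalizes the version bits during parsing, so
    any well-formed UUID hex string passes, as in the original."""
    try:
        uuid.UUID(stored, version=4)
        return stored
    except (ValueError, TypeError, AttributeError):
        return str(uuid.uuid4())

kept = ensure_uuid4("b9f3a1a0-1234-4abc-8def-0123456789ab")  # valid, kept as-is
fresh = ensure_uuid4("not-a-uuid")                           # replaced by a new uuid4
```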
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | _start_threads | def _start_threads(self) -> None:"""Set up background threads for shutdown checking and request sending"""self.event_queue: Queue[Optional[TelemetryEvent]] = Queue(maxsize=TelemetryClient.MAX_QUEUE_SIZE)atexit.register(self._exit_cleanly)self.processing_thread: Thread = Thread(target=self._process_event_queue_thread, daemon=True)self.processing_thread.start() | 1 | 9 | 1 | 61 | 0 | 187 | 196 | 187 | self | [] | None | {"AnnAssign": 2, "Expr": 3} | 4 | 10 | 4 | ["Queue", "atexit.register", "Thread", "self.processing_thread.start"] | 0 | [] | The function (_start_threads) is defined within the public class called TelemetryClient. The function starts at line 187 and ends at 196. It contains 9 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns None. It makes 4 function calls inside its body: ["Queue", "atexit.register", "Thread", "self.processing_thread.start"], and has no recorded callers. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | _get_system_metadata | def _get_system_metadata(self, config: Optional[ConfigParser]) -> Dict[str, Any]:"""Builds up a dict of non-identifiable metadata about the system environment.This will be used in the Rum event metadata, which has a limit of 10 unique values."""platform_info = platform.uname()metadata: Dict[str, Any] = {"service": self.package_name,"version": self.package_ver,"python_version": platform.python_version(),"osName": "macOS" if platform_info.system == "Darwin" else platform_info.system,"osVersion": platform_info.release,}return metadata | 2 | 10 | 2 | 80 | 0 | 198 | 213 | 198 | self,config | [] | Dict[str, Any] | {"AnnAssign": 1, "Assign": 1, "Expr": 1, "Return": 1} | 2 | 16 | 2 | ["platform.uname", "platform.python_version"] | 0 | [] | The function (_get_system_metadata) is defined within the public class called TelemetryClient. The function starts at line 198 and ends at 213. It contains 10 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (self, config) and returns a Dict[str, Any]. It makes 2 function calls inside its body: ["platform.uname", "platform.python_version"], and has no recorded callers. |
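The _get_system_metadata row builds a small dict of non-identifying platform details, reporting "Darwin" as "macOS". A sketch of the same dict shape as a free function; the package name and version passed in are placeholders:

```python
import platform

def build_system_metadata(package_name, package_ver):
    """Non-identifying environment details, shaped like the dict
    _get_system_metadata returns (macOS is reported instead of Darwin)."""
    info = platform.uname()
    return {
        "service": package_name,
        "version": package_ver,
        "python_version": platform.python_version(),
        "osName": "macOS" if info.system == "Darwin" else info.system,
        "osVersion": info.release,
    }

# Placeholder package identity, for illustration only.
meta = build_system_metadata("example-package", "1.2.3")
```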
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | _exit_cleanly | def _exit_cleanly(self):self.event_queue.put(None)self.processing_thread.join() | 1 | 3 | 1 | 20 | 0 | 215 | 217 | 215 | self | [] | None | {"Expr": 2} | 2 | 3 | 2 | ["self.event_queue.put", "self.processing_thread.join"] | 0 | [] | The function (_exit_cleanly) is defined within the public class called TelemetryClient. The function starts at line 215 and ends at 217. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns None. It makes 2 function calls inside its body: ["self.event_queue.put", "self.processing_thread.join"], and has no recorded callers. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | _send_request | def _send_request(self, req: request.Request) -> None:attempts = 0success = Falsewhile not success:try:with request.urlopen(req, context=self._urllib3_context):logger.debug("Successfully sent telemetry.")success = Trueexcept error.HTTPError as httpe:if httpe.code == 429 or httpe.code == 500:logger.debug(f"Error received from service. Waiting to retry: {str(httpe)}")attempts += 1if attempts >= TelemetryClient.MAX_RETRY_ATTEMPTS:raise Exception("Max retries reached sending telemetry")backoff_sleep = random.uniform(0,min(TelemetryClient.MAX_BACKOFF_SECONDS,TelemetryClient.BASE_TIME * 2**attempts,),)time.sleep(backoff_sleep)else:# Reraise any exceptions we didn't expectraise | 6 | 24 | 2 | 124 | 0 | 219 | 244 | 219 | self,req | [] | None | {"Assign": 4, "AugAssign": 1, "Expr": 3, "If": 2, "Try": 1, "While": 1, "With": 1} | 8 | 26 | 8 | ["request.urlopen", "logger.debug", "logger.debug", "str", "Exception", "random.uniform", "min", "time.sleep"] | 0 | [] | The function (_send_request) is defined within the public class TelemetryClient. It starts at line 219 and ends at 244. It contains 24 lines of code and has a cyclomatic complexity of 6. It takes 2 parameters (self, req) and does not return a value. It calls 8 functions: ["request.urlopen", "logger.debug", "logger.debug", "str", "Exception", "random.uniform", "min", "time.sleep"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | _process_event_queue_thread | def _process_event_queue_thread(self):"""Background thread for processing the telemetry event data queue and sending telemetry requests."""while True:# Blocks until we get a new entry in the queueevent_data: Optional[TelemetryEvent] = self.event_queue.get()# We've received the shutdown signalif event_data is None:returnheaders = {"Accept": "application-json", "Content-Type": "application-json"}try:request_body = {"BatchId": str(uuid.uuid4()),"RumEvents": [{"details": str(json.dumps(event_data.event_details)),"id": str(uuid.uuid4()),"metadata": str(json.dumps(self._system_metadata)),"timestamp": int(datetime.now().timestamp()),"type": event_data.event_type,},],"UserDetails": {"sessionId": self.session_id, "userId": self.telemetry_id},}request_body_encoded = str(json.dumps(request_body)).encode("utf-8")except Exception as exc:logger.debug(f"Failed to serialize telemetry data. {str(exc)}")continuereq = request.Request(url=self.endpoint, data=request_body_encoded, headers=headers)try:logger.debug("Sending telemetry data: %s", request_body)self._send_request(req)except Exception as exc:# Swallow any kind of uncaught exception and stop sending telemetrylogger.debug(f"Error received from service. {str(exc)}")returnself.event_queue.task_done() | 5 | 32 | 1 | 226 | 0 | 246 | 283 | 246 | self | [] | None | {"AnnAssign": 1, "Assign": 4, "Expr": 6, "If": 1, "Return": 2, "Try": 2, "While": 1} | 23 | 38 | 23 | ["self.event_queue.get", "str", "uuid.uuid4", "str", "json.dumps", "str", "uuid.uuid4", "str", "json.dumps", "int", "timestamp", "datetime.now", "encode", "str", "json.dumps", "logger.debug", "str", "request.Request", "logger.debug", "self._send_request", "logger.debug", "str", "self.event_queue.task_done"] | 0 | [] | The function (_process_event_queue_thread) defined within the public class called TelemetryClient.The function start at line 246 and ends at 283. 
It contains 32 lines of code and has a cyclomatic complexity of 5. It takes 1 parameter (self) and does not return a value. It calls 23 functions: ["self.event_queue.get", "str", "uuid.uuid4", "str", "json.dumps", "str", "uuid.uuid4", "str", "json.dumps", "int", "timestamp", "datetime.now", "encode", "str", "json.dumps", "logger.debug", "str", "request.Request", "logger.debug", "self._send_request", "logger.debug", "str", "self.event_queue.task_done"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | _put_telemetry_record | def _put_telemetry_record(self, event: TelemetryEvent) -> None:if not self._initialized or self.telemetry_opted_out:returntry:self.event_queue.put_nowait(event)except Full:# Silently swallow the error if the event queue is full (due to throttling of the service)pass | 4 | 7 | 2 | 36 | 0 | 285 | 292 | 285 | self,event | [] | None | {"Expr": 1, "If": 1, "Return": 1, "Try": 1} | 1 | 8 | 1 | ["self.event_queue.put_nowait"] | 0 | [] | The function (_put_telemetry_record) is defined within the public class TelemetryClient. It starts at line 285 and ends at 292. It contains 7 lines of code and has a cyclomatic complexity of 4. It takes 2 parameters (self, event) and does not return a value. It calls 1 function: ["self.event_queue.put_nowait"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | _record_summary_statistics | def _record_summary_statistics(self, event_type: str, summary: SummaryStatistics, from_gui: bool):details: Dict[str, Any] = asdict(summary)self.record_event(event_type=event_type, event_details=details, from_gui=from_gui) | 1 | 5 | 4 | 46 | 0 | 294 | 298 | 294 | self,event_type,summary,from_gui | [] | None | {"AnnAssign": 1, "Expr": 1} | 2 | 5 | 2 | ["asdict", "self.record_event"] | 0 | [] | The function (_record_summary_statistics) is defined within the public class TelemetryClient. It starts at line 294 and ends at 298. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (self, event_type, summary, from_gui) and does not return a value. It calls 2 functions: ["asdict", "self.record_event"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | record_hashing_summary | def record_hashing_summary(self, summary: SummaryStatistics, *, from_gui: bool = False):self._record_summary_statistics("com.amazon.rum.deadline.job_attachments.hashing_summary", summary, from_gui) | 1 | 4 | 3 | 27 | 0 | 300 | 303 | 300 | self,summary,from_gui | [] | None | {"Expr": 1} | 1 | 4 | 1 | ["self._record_summary_statistics"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_attachment_py._hash_attachments"] | The function (record_hashing_summary) is defined within the public class TelemetryClient. It starts at line 300 and ends at 303. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (self, summary, from_gui) and does not return a value. It calls 1 function: ["self._record_summary_statistics"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_attachment_py._hash_attachments"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | record_upload_summary | def record_upload_summary(self, summary: SummaryStatistics, *, from_gui: bool = False):self._record_summary_statistics("com.amazon.rum.deadline.job_attachments.upload_summary", summary, from_gui) | 1 | 4 | 3 | 27 | 0 | 305 | 308 | 305 | self,summary,from_gui | [] | None | {"Expr": 1} | 1 | 4 | 1 | ["self._record_summary_statistics"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py._upload_attachments"] | The function (record_upload_summary) is defined within the public class TelemetryClient. It starts at line 305 and ends at 308. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (self, summary, from_gui) and does not return a value. It calls 1 function: ["self._record_summary_statistics"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py._upload_attachments"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | record_error | def record_error(self, event_details: Dict[str, Any], exception_type: str, from_gui: bool = False):event_details["exception_type"] = exception_type# Possibility to add stack trace hereself.record_event("com.amazon.rum.deadline.error", event_details, from_gui=from_gui) | 1 | 5 | 4 | 42 | 0 | 310 | 315 | 310 | self,event_details,exception_type,from_gui | [] | None | {"Assign": 1, "Expr": 1} | 1 | 6 | 1 | ["self.record_event"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.bundle_group_py.bundle_submit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.ui.dialogs.submit_job_to_deadline_dialog_py.SubmitJobToDeadlineDialog.on_submit"] | The function (record_error) is defined within the public class TelemetryClient. It starts at line 310 and ends at 315. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (self, event_details, exception_type, from_gui) and does not return a value. It calls 1 function: ["self.record_event"]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.bundle_group_py.bundle_submit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.ui.dialogs.submit_job_to_deadline_dialog_py.SubmitJobToDeadlineDialog.on_submit"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | record_event | def record_event(self, event_type: str, event_details: Dict[str, Any], *, from_gui: bool = False):try:self.update_common_details({"accountId": self.get_account_id(get_boto3_session())})except Exception as e:# Print any errors when getting the boto3 session, then proceedlogger.debug(f"Could not add account ID to telemetry: {str(e)}")event_details.update(self._common_details)event_details["usage_mode"] = "GUI" if from_gui else "CLI"self._put_telemetry_record(TelemetryEvent(event_type=event_type,event_details=event_details,)) | 3 | 15 | 4 | 91 | 0 | 317 | 333 | 317 | self,event_type,event_details,from_gui | [] | None | {"Assign": 1, "Expr": 4, "Try": 1} | 8 | 17 | 8 | ["self.update_common_details", "self.get_account_id", "get_boto3_session", "logger.debug", "str", "event_details.update", "self._put_telemetry_record", "TelemetryEvent"] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.record_function_latency_telemetry_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.record_success_fail_telemetry_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_export_credentials", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"] | The function (record_event) defined within the public class called TelemetryClient.The function start at line 317 and 
ends at 333. It contains 15 lines of code and has a cyclomatic complexity of 3. It takes 4 parameters (self, event_type, event_details, from_gui) and does not return a value. It calls 8 functions: ["self.update_common_details", "self.get_account_id", "get_boto3_session", "logger.debug", "str", "event_details.update", "self._put_telemetry_record", "TelemetryEvent"]. It is called by 5 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.record_function_latency_telemetry_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.record_success_fail_telemetry_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_export_credentials", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | get_account_id | def get_account_id(self, boto3_session) -> Optional[str]:"""Retrieves the AWS account ID for the current user.If the user is not authenticated, print an error message"""try:return boto3_session.client("sts").get_caller_identity()["Account"]except (ClientError, NoCredentialsError) as e:print(f"Could not add account ID to telemetry: {str(e)}")return None | 2 | 6 | 2 | 45 | 0 | 336 | 346 | 336 | self,boto3_session | [] | Optional[str] | {"Expr": 2, "Return": 2, "Try": 1} | 4 | 11 | 4 | ["get_caller_identity", "boto3_session.client", "print", "str"] | 7 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._aws.aws_clients_py.get_s3_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_file_with_get_object", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_file_with_transfer_manager", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._get_asset_root_and_manifest_from_s3_with_last_modified", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader.file_already_uploaded", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader.upload_bytes_to_s3"] | The function (get_account_id) defined within the public class called TelemetryClient.The function start at line 336 and ends at 346. It contains 6 lines of code and it has a cyclomatic complexity of 2. It takes 2 parameters, represented as [336.0] and does not return any value. It declares 4.0 functions, It has 4.0 functions called inside which are ["get_caller_identity", "boto3_session.client", "print", "str"], It has 7.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._aws.aws_clients_py.get_s3_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_file_with_get_object", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_file_with_transfer_manager", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._get_asset_root_and_manifest_from_s3_with_last_modified", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader.file_already_uploaded", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader.upload_bytes_to_s3"]. |
aws-deadline_deadline-cloud | TelemetryClient | public | 0 | 0 | update_common_details | def update_common_details(self, details: Dict[str, Any]):"""Updates the dict of common data that is included in every telemetry request."""self._common_details.update(details) | 1 | 2 | 2 | 23 | 0 | 348 | 350 | 348 | self,details | [] | None | {"Expr": 2} | 1 | 3 | 1 | ["self._common_details.update"] | 0 | [] | The function (update_common_details) is defined within the public class TelemetryClient. It starts at line 348 and ends at 350. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, details) and does not return a value. It calls 1 function: ["self._common_details.update"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_telemetry_client | def get_telemetry_client(package_name: str, package_ver: str, config: Optional[ConfigParser] = None) -> TelemetryClient:"""Retrieves the cached telemetry client, lazy-loading the first time this is called.:param package_name: Base package name to associate data by.:param package_ver: Base package version to associate data by.:param config: Optional configuration to use for the client. Loads defaults if not given.:return: Telemetry client to make requests with."""global __cached_telemetry_clientif not __cached_telemetry_client:__cached_telemetry_client = TelemetryClient(package_name=package_name,package_ver=package_ver,config=config,)elif not __cached_telemetry_client.is_initialized:__cached_telemetry_client.initialize(config=config)return __cached_telemetry_client | 3 | 13 | 3 | 62 | 1 | 353 | 373 | 353 | package_name,package_ver,config | ['__cached_telemetry_client'] | TelemetryClient | {"Assign": 1, "Expr": 2, "If": 2, "Return": 1} | 2 | 21 | 2 | ["TelemetryClient", "__cached_telemetry_client.initialize"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.get_deadline_cloud_library_telemetry_client"] | The function (get_telemetry_client) is defined at module level. It starts at line 353 and ends at 373. It contains 13 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters (package_name, package_ver, config) and returns TelemetryClient. It calls 2 functions: ["TelemetryClient", "__cached_telemetry_client.initialize"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.get_deadline_cloud_library_telemetry_client"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_deadline_cloud_library_telemetry_client | def get_deadline_cloud_library_telemetry_client(config: Optional[ConfigParser] = None,) -> TelemetryClient:"""Retrieves the cached telemetry client, specifying the Deadline Cloud Client Library's package information.:param config: Optional configuration to use for the client. Loads defaults if not given.:return: Telemetry client to make requests with."""return get_telemetry_client("deadline-cloud-library", version, config=config) | 1 | 4 | 1 | 27 | 0 | 376 | 384 | 376 | config | [] | TelemetryClient | {"Expr": 1, "Return": 1} | 1 | 9 | 1 | ["get_telemetry_client"] | 7 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline._mcp.utils_py._create_wrapper", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.record_function_latency_telemetry_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.record_success_fail_telemetry_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._mcp_server_py.cli_mcp_server", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_latency_decorator", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_record_decorator_fails", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_record_decorator_success"] | The function 
(get_deadline_cloud_library_telemetry_client) is defined at module level. It starts at line 376 and ends at 384. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (config) and returns TelemetryClient. It calls 1 function: ["get_telemetry_client"]. It is called by 7 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline._mcp.utils_py._create_wrapper", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.record_function_latency_telemetry_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.record_success_fail_telemetry_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._mcp_server_py.cli_mcp_server", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_latency_decorator", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_record_decorator_fails", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_record_decorator_success"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | record_success_fail_telemetry_event.record_success_fail_telemetry_event.inner.wrapper | def wrapper(*args: Any, **kwargs: Any) -> Any:"""Wrapper to try-catch a function for telemetry:param * Python variable argument. See https://docs.python.org/3/glossary.html#term-parameter:param ** Python variable argument. See https://docs.python.org/3/glossary.html#term-parameter"""success: bool = Falsetry:result = function(*args, **kwargs)success = Truereturn resultfinally:event_name = decorator_kwargs.get("metric_name", function.__name__)event_details: dict = decorator_kwargs.get("event_details", {})event_details["is_success"] = successraised_exception = sys.exc_info()[1]if raised_exception is not None:event_details["exception_type"] = type(raised_exception).__name__get_deadline_cloud_library_telemetry_client().record_event(event_type=f"com.amazon.rum.deadline.{event_name}",event_details=event_details,) | 3 | 17 | 2 | 114 | 0 | 394 | 417 | 394 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (record_success_fail_telemetry_event.record_success_fail_telemetry_event.inner.wrapper) is a nested function defined inside record_success_fail_telemetry_event.inner. It starts at line 394 and ends at 417. It contains 17 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters (*args, **kwargs) and returns Any. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | record_success_fail_telemetry_event.inner | def inner(function: F) -> F:def wrapper(*args: Any, **kwargs: Any) -> Any:"""Wrapper to try-catch a function for telemetry:param * Python variable argument. See https://docs.python.org/3/glossary.html#term-parameter:param ** Python variable argument. See https://docs.python.org/3/glossary.html#term-parameter"""success: bool = Falsetry:result = function(*args, **kwargs)success = Truereturn resultfinally:event_name = decorator_kwargs.get("metric_name", function.__name__)event_details: dict = decorator_kwargs.get("event_details", {})event_details["is_success"] = successraised_exception = sys.exc_info()[1]if raised_exception is not None:event_details["exception_type"] = type(raised_exception).__name__get_deadline_cloud_library_telemetry_client().record_event(event_type=f"com.amazon.rum.deadline.{event_name}",event_details=event_details,)wrapper.__doc__ = function.__doc__return cast(F, wrapper) | 1 | 4 | 1 | 25 | 0 | 393 | 420 | 393 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (record_success_fail_telemetry_event.inner) is a nested function defined inside record_success_fail_telemetry_event. It starts at line 393 and ends at 420. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (function) and returns F. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | record_success_fail_telemetry_event | def record_success_fail_telemetry_event(**decorator_kwargs: Any) -> Callable[[F], F]:"""Decorator to try catch a function. Sends a success / fail telemetry event.:param ** Python variable arguments. See https://docs.python.org/3/glossary.html#term-parameter."""def inner(function: F) -> F:def wrapper(*args: Any, **kwargs: Any) -> Any:"""Wrapper to try-catch a function for telemetry:param * Python variable argument. See https://docs.python.org/3/glossary.html#term-parameter:param ** Python variable argument. See https://docs.python.org/3/glossary.html#term-parameter"""success: bool = Falsetry:result = function(*args, **kwargs)success = Truereturn resultfinally:event_name = decorator_kwargs.get("metric_name", function.__name__)event_details: dict = decorator_kwargs.get("event_details", {})event_details["is_success"] = successraised_exception = sys.exc_info()[1]if raised_exception is not None:event_details["exception_type"] = type(raised_exception).__name__get_deadline_cloud_library_telemetry_client().record_event(event_type=f"com.amazon.rum.deadline.{event_name}",event_details=event_details,)wrapper.__doc__ = function.__doc__return cast(F, wrapper)return inner | 1 | 3 | 1 | 22 | 4 | 387 | 422 | 387 | **decorator_kwargs | ['success', 'result', 'event_name', 'raised_exception'] | Callable[[F], F] | {"AnnAssign": 2, "Assign": 7, "Expr": 3, "If": 1, "Return": 3, "Try": 1} | 8 | 36 | 8 | ["function", "decorator_kwargs.get", "decorator_kwargs.get", "sys.exc_info", "type", "record_event", "get_deadline_cloud_library_telemetry_client", "cast"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_record_decorator_fails", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_record_decorator_success"] | The function (record_success_fail_telemetry_event) defined within the public class called public.The function start at line 387 and ends at 422. It contains 3 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 8.0 functions, It has 8.0 functions called inside which are ["function", "decorator_kwargs.get", "decorator_kwargs.get", "sys.exc_info", "type", "record_event", "get_deadline_cloud_library_telemetry_client", "cast"], It has 2.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_record_decorator_fails", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_record_decorator_success"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | record_function_latency_telemetry_event.record_function_latency_telemetry_event.inner.wrapper | def wrapper(*args: Any, **kwargs: Any) -> Any:start_t = time.perf_counter_ns()ret_val = function(*args, **kwargs)end_t = time.perf_counter_ns()latency = end_t - start_tevent_name = decorator_kwargs.get("metric_name", function.__name__)get_deadline_cloud_library_telemetry_client().record_event(event_type="com.amazon.rum.deadline.latency",event_details={"latency": latency, "function_call": event_name},)return ret_val | 1 | 11 | 2 | 81 | 0 | 433 | 446 | 433 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (record_function_latency_telemetry_event.record_function_latency_telemetry_event.inner.wrapper) is a nested function defined inside record_function_latency_telemetry_event.inner. It starts at line 433 and ends at 446. It contains 11 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (*args, **kwargs) and returns Any. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | record_function_latency_telemetry_event.inner | def inner(function: F) -> F:@wraps(function)def wrapper(*args: Any, **kwargs: Any) -> Any:start_t = time.perf_counter_ns()ret_val = function(*args, **kwargs)end_t = time.perf_counter_ns()latency = end_t - start_tevent_name = decorator_kwargs.get("metric_name", function.__name__)get_deadline_cloud_library_telemetry_client().record_event(event_type="com.amazon.rum.deadline.latency",event_details={"latency": latency, "function_call": event_name},)return ret_valreturn cast(F, wrapper) | 1 | 4 | 1 | 23 | 0 | 431 | 448 | 431 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (record_function_latency_telemetry_event.inner) is a nested function defined inside record_function_latency_telemetry_event. It starts at line 431 and ends at 448. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (function) and returns F. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | record_function_latency_telemetry_event | def record_function_latency_telemetry_event(**decorator_kwargs: Any) -> Callable[[F], F]:"""Decorator to time a function. Sends a latency telemetry event.:param ** Python variable arguments. See https://docs.python.org/3/glossary.html#term-parameter."""def inner(function: F) -> F:@wraps(function)def wrapper(*args: Any, **kwargs: Any) -> Any:start_t = time.perf_counter_ns()ret_val = function(*args, **kwargs)end_t = time.perf_counter_ns()latency = end_t - start_tevent_name = decorator_kwargs.get("metric_name", function.__name__)get_deadline_cloud_library_telemetry_client().record_event(event_type="com.amazon.rum.deadline.latency",event_details={"latency": latency, "function_call": event_name},)return ret_valreturn cast(F, wrapper)return inner | 1 | 3 | 1 | 22 | 5 | 425 | 450 | 425 | **decorator_kwargs | ['end_t', 'event_name', 'ret_val', 'latency', 'start_t'] | Callable[[F], F] | {"Assign": 5, "Expr": 2, "Return": 3} | 8 | 26 | 8 | ["time.perf_counter_ns", "function", "time.perf_counter_ns", "decorator_kwargs.get", "record_event", "get_deadline_cloud_library_telemetry_client", "wraps", "cast"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_latency_decorator"] | The function (record_function_latency_telemetry_event) is defined within the public class called public. The function starts at line 425 and ends at 450. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (**decorator_kwargs) and returns Callable[[F], F]. 
It makes 8 function calls: ["time.perf_counter_ns", "function", "time.perf_counter_ns", "decorator_kwargs.get", "record_event", "get_deadline_cloud_library_telemetry_client", "wraps", "cast"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_telemetry_py.test_latency_decorator"]. |
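The row above embeds the full source of the latency decorator. The pattern can be sketched in a self-contained form; note that the project's telemetry client is replaced here by a hypothetical in-memory `recorded_events` list (not part of the library), so only the timing-and-record structure is faithful to the row:

```python
import time
from functools import wraps
from typing import Any, Callable, TypeVar, cast

F = TypeVar("F", bound=Callable[..., Any])

# Hypothetical stand-in for the telemetry client's record_event sink.
recorded_events = []

def record_function_latency_telemetry_event(**decorator_kwargs: Any) -> Callable[[F], F]:
    """Time the wrapped function and record a latency event per call."""
    def inner(function: F) -> F:
        @wraps(function)
        def wrapper(*args: Any, **kwargs: Any) -> Any:
            start_t = time.perf_counter_ns()
            ret_val = function(*args, **kwargs)
            latency = time.perf_counter_ns() - start_t
            # metric_name overrides the default event name, as in the row above.
            event_name = decorator_kwargs.get("metric_name", function.__name__)
            recorded_events.append({"latency": latency, "function_call": event_name})
            return ret_val
        return cast(F, wrapper)
    return inner

@record_function_latency_telemetry_event(metric_name="add")
def add(a: int, b: int) -> int:
    return a + b
```

Using `time.perf_counter_ns()` keeps the measurement monotonic and integer-valued, which avoids floating-point drift in latency telemetry.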
aws-deadline_deadline-cloud | public | public | 0 | 0 | _prompt_at_completion | def _prompt_at_completion(ctx: click.Context):"""If the click context has PROMPT_WHEN_COMPLETE set to True,prints out a prompt and waits for keyboard input."""if ctx.obj[_PROMPT_WHEN_COMPLETE]:click.prompt("Press Enter To Exit", prompt_suffix="", show_default=False, hide_input=True, default="") | 2 | 5 | 1 | 40 | 0 | 39 | 47 | 39 | ctx | [] | None | {"Expr": 2, "If": 1} | 1 | 9 | 1 | ["click.prompt"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._common_py._handle_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.handle_web_url_command_py.cli_handle_web_url"] | The function (_prompt_at_completion) is defined within the public class called public. The function starts at line 39 and ends at 47. It contains 5 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (ctx) and does not return any value. It makes 1 function call: ["click.prompt"]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._common_py._handle_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.handle_web_url_command_py.cli_handle_web_url"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _handle_error.wraps | def wraps(ctx: click.Context, *args, **kwargs):try:func(*args, **kwargs)except DeadlineOperationError as e:# The message from DeadlineOperationError is printed# out verbatim.click.echo(str(e))_prompt_at_completion(ctx)sys.exit(1)except click.ClickException:# Let click exceptions fall throughraiseexcept Exception:# Log and print out unfamiliar exceptions with additional# messaging.click.echo("The AWS Deadline Cloud CLI encountered the following exception:")click.echo(traceback.format_exc())_prompt_at_completion(ctx)sys.exit(1) | 4 | 14 | 3 | 84 | 0 | 57 | 75 | 57 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_handle_error.wraps) is defined within the public class called public. The function starts at line 57 and ends at 75. It contains 14 lines of code and has a cyclomatic complexity of 4. It takes 3 parameters (ctx, *args, **kwargs) and does not return any value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _handle_error | def _handle_error(func: Callable) -> Callable:"""Decorator that catches any exceptions raised in the passed in function,and handles their default printout."""@click.pass_contextdef wraps(ctx: click.Context, *args, **kwargs):try:func(*args, **kwargs)except DeadlineOperationError as e:# The message from DeadlineOperationError is printed# out verbatim.click.echo(str(e))_prompt_at_completion(ctx)sys.exit(1)except click.ClickException:# Let click exceptions fall throughraiseexcept Exception:# Log and print out unfamiliar exceptions with additional# messaging.click.echo("The AWS Deadline Cloud CLI encountered the following exception:")click.echo(traceback.format_exc())_prompt_at_completion(ctx)sys.exit(1)wraps.__doc__ = func.__doc__return wraps | 1 | 5 | 1 | 25 | 0 | 50 | 78 | 50 | func | [] | Callable | {"Assign": 1, "Expr": 9, "Return": 1, "Try": 1} | 10 | 29 | 10 | ["func", "click.echo", "str", "_prompt_at_completion", "sys.exit", "click.echo", "click.echo", "traceback.format_exc", "_prompt_at_completion", "sys.exit"] | 0 | [] | The function (_handle_error) is defined within the public class called public. The function starts at line 50 and ends at 78. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (func) and returns Callable. It makes 10 function calls: ["func", "click.echo", "str", "_prompt_at_completion", "sys.exit", "click.echo", "click.echo", "traceback.format_exc", "_prompt_at_completion", "sys.exit"]. No functions call this function. |
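The `_handle_error` row above shows a catch-print-exit decorator built on click. A minimal dependency-free sketch of the same pattern follows; `DeadlineOperationError` is re-declared here as a stand-in class, `click.echo` is replaced by `print`, and the click-specific pieces (`@click.pass_context`, `ClickException` pass-through, `_prompt_at_completion`) are omitted:

```python
import sys
import traceback
from typing import Callable


class DeadlineOperationError(Exception):
    """Stand-in for the library's known operation error."""


def handle_error(func: Callable) -> Callable:
    """Known errors print their message verbatim and exit(1);
    unfamiliar errors print a traceback before exiting."""
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except DeadlineOperationError as e:
            # Known domain error: message printed verbatim.
            print(str(e))
            sys.exit(1)
        except Exception:
            # Unfamiliar error: full traceback with extra messaging.
            print("Encountered the following exception:")
            print(traceback.format_exc())
            sys.exit(1)
    wrapper.__doc__ = func.__doc__
    return wrapper


@handle_error
def failing_command():
    raise DeadlineOperationError("farm not found")
```

Copying `__doc__` onto the wrapper (as the original does) keeps `--help` style docstring lookups intact; `functools.wraps` would achieve the same plus more metadata.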
aws-deadline_deadline-cloud | public | public | 0 | 0 | _apply_cli_options_to_config | def _apply_cli_options_to_config(*, config: Optional[ConfigParser] = None, required_options: Set[str] = set(), **args) -> Optional[ConfigParser]:"""Modifies an AWS Deadline Cloud config object to apply standard option names to it, such asthe AWS profile, AWS Deadline Cloud Farm, or AWS Deadline Cloud Queue to use.Args:config (ConfigParser, optional): an AWS Deadline Cloud config, read by config_file.read_config().If not provided, loads the config from disk."""# Only work with a custom config if there are standard options providedif any(value is not None for value in args.values()):if config is None:config = config_file.read_config()aws_profile_name = args.pop("profile", None)if aws_profile_name:config_file.set_setting("defaults.aws_profile_name", aws_profile_name, config=config)farm_id = args.pop("farm_id", None)if farm_id:config_file.set_setting("defaults.farm_id", farm_id, config=config)queue_id = args.pop("queue_id", None)if queue_id:config_file.set_setting("defaults.queue_id", queue_id, config=config)storage_profile_id = args.pop("storage_profile_id", None)if storage_profile_id:config_file.set_setting("settings.storage_profile_id", storage_profile_id, config=config)job_id = args.pop("job_id", None)if job_id:config_file.set_setting("defaults.job_id", job_id, config=config)auto_accept = args.pop("yes", None)if auto_accept:config_file.set_setting("settings.auto_accept", "true", config=config)conflict_resolution = args.pop("conflict_resolution", None)if conflict_resolution:config_file.set_setting("settings.conflict_resolution", conflict_resolution, config=config)else:# Remove the standard option names from the args listfor name in ["profile", "farm_id", "queue_id", "job_id", "storage_profile_id"]:args.pop(name, None)# Check that the required options have valuesif "farm_id" in required_options:required_options.remove("farm_id")if not config_file.get_setting("defaults.farm_id", 
config=config):raise click.UsageError("Missing '--farm-id' or default Farm ID configuration")if "queue_id" in required_options:required_options.remove("queue_id")if not config_file.get_setting("defaults.queue_id", config=config):raise click.UsageError("Missing '--queue-id' or default Queue ID configuration")if "job_id" in required_options:required_options.remove("job_id")if not config_file.get_setting("defaults.job_id", config=config):raise click.UsageError("Missing '--job-id' or default Job ID configuration")if required_options:raise RuntimeError(f"Unexpected required AWS Deadline Cloud CLI options: {required_options}")if args:raise RuntimeError(f"Option names {tuple(args.keys())} are not standard AWS Deadline Cloud CLI options, they need special handling")return config | 20 | 55 | 2 | 376 | 8 | 81 | 159 | 81 | config,required_options,**args | ['config', 'farm_id', 'auto_accept', 'storage_profile_id', 'conflict_resolution', 'job_id', 'aws_profile_name', 'queue_id'] | Optional[ConfigParser] | {"Assign": 8, "Expr": 12, "For": 1, "If": 17, "Return": 1} | 32 | 79 | 32 | ["set", "any", "args.values", "config_file.read_config", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "required_options.remove", "config_file.get_setting", "click.UsageError", "required_options.remove", "config_file.get_setting", "click.UsageError", "required_options.remove", "config_file.get_setting", "click.UsageError", "RuntimeError", "RuntimeError", "tuple", "args.keys"] | 25 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.attachment_group_py.attachment_download", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.attachment_group_py.attachment_upload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.auth_group_py.auth_status", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.bundle_group_py.bundle_submit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.farm_group_py.farm_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.farm_group_py.farm_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.fleet_group_py.fleet_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.fleet_group_py.fleet_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_cancel", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_download_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_list", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_requeue_tasks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_trace_schedule", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_wait_for_completion", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.manifest_group_py.manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.manifest_group_py.manifest_upload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_export_credentials", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_paramdefs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.sync_output", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.worker_group_py.worker_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.worker_group_py.worker_list"] | The function (_apply_cli_options_to_config) is defined within the public class called public. The function starts at line 81 and ends at 159. It contains 55 lines of code and has a cyclomatic complexity of 20. It takes 2 keyword-only parameters (config, required_options) plus **args and returns Optional[ConfigParser]. It makes 32 function calls: ["set", "any", "args.values", "config_file.read_config", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "config_file.set_setting", "args.pop", "required_options.remove", "config_file.get_setting", "click.UsageError", "required_options.remove", "config_file.get_setting", "click.UsageError", "required_options.remove", "config_file.get_setting", "click.UsageError", "RuntimeError", "RuntimeError", "tuple", "args.keys"]. It is called by 25 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.attachment_group_py.attachment_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.attachment_group_py.attachment_upload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.auth_group_py.auth_status", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.bundle_group_py.bundle_submit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.farm_group_py.farm_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.farm_group_py.farm_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.fleet_group_py.fleet_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.fleet_group_py.fleet_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_cancel", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_download_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_requeue_tasks", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_trace_schedule", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_wait_for_completion", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.manifest_group_py.manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.manifest_group_py.manifest_upload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_export_credentials", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_paramdefs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.sync_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.worker_group_py.worker_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.worker_group_py.worker_list"]. |
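The `_apply_cli_options_to_config` row above applies each standard CLI option onto a config object and then validates required options. A heavily trimmed sketch of that shape, for a single option (`farm_id`), follows; it uses the standard-library `ConfigParser` directly and raises `ValueError` in place of `click.UsageError`, so it illustrates the control flow rather than the library's implementation:

```python
from configparser import ConfigParser
from typing import Optional, Set


def apply_cli_options_to_config(
    *, config: Optional[ConfigParser] = None,
    required_options: Set[str] = frozenset(), **args
) -> Optional[ConfigParser]:
    # Only build/modify a config if at least one standard option was passed.
    if any(value is not None for value in args.values()):
        if config is None:
            config = ConfigParser()
            config.add_section("defaults")
        farm_id = args.pop("farm_id", None)
        if farm_id:
            config.set("defaults", "farm_id", farm_id)
    else:
        # Remove the standard option names so leftover args can be detected.
        args.pop("farm_id", None)
    # Check that required options ended up with values.
    if "farm_id" in required_options:
        if not (config and config.get("defaults", "farm_id", fallback=None)):
            raise ValueError("Missing '--farm-id' or default Farm ID configuration")
    return config
```

The two-phase structure (apply options, then validate requirements against the merged result) lets defaults from the on-disk config satisfy a `--farm-id` requirement even when the flag itself was not passed.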
aws-deadline_deadline-cloud | public | public | 0 | 0 | _fix_multiline_strings | def _fix_multiline_strings(obj: Any) -> Any:"""Fixes the multi-line strings in `obj` to end with "\n".Returns a new object that has been modified."""if isinstance(obj, str):if "\n" in obj and not obj.endswith("\n"):return obj + "\n"else:return objelif isinstance(obj, list):return [_fix_multiline_strings(item) for item in obj]elif isinstance(obj, tuple):return tuple(_fix_multiline_strings(item) for item in obj)elif isinstance(obj, dict):return {key: _fix_multiline_strings(value) for key, value in obj.items()}elif isinstance(obj, set):return {_fix_multiline_strings(item) for item in obj}else:return obj | 12 | 16 | 1 | 128 | 0 | 162 | 181 | 162 | obj | [] | Any | {"Expr": 1, "If": 6, "Return": 7} | 12 | 20 | 12 | ["isinstance", "obj.endswith", "isinstance", "_fix_multiline_strings", "isinstance", "tuple", "_fix_multiline_strings", "isinstance", "_fix_multiline_strings", "obj.items", "isinstance", "_fix_multiline_strings"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._common_py._cli_object_repr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._common_py._fix_multiline_strings"] | The function (_fix_multiline_strings) is defined within the public class called public. The function starts at line 162 and ends at 181. It contains 16 lines of code and has a cyclomatic complexity of 12. It takes 1 parameter (obj) and returns Any. 
It makes 12 function calls: ["isinstance", "obj.endswith", "isinstance", "_fix_multiline_strings", "isinstance", "tuple", "_fix_multiline_strings", "isinstance", "_fix_multiline_strings", "obj.items", "isinstance", "_fix_multiline_strings"]. It is called by 2 functions (including itself, recursively): ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._common_py._cli_object_repr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._common_py._fix_multiline_strings"]. |
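The body embedded in the row above is complete enough to reproduce as a runnable function (renamed here without the leading underscore, with line breaks restored):

```python
from typing import Any


def fix_multiline_strings(obj: Any) -> Any:
    """Return a copy of `obj` in which every multi-line string ends with "\n",
    recursing through lists, tuples, dicts, and sets as in the row above."""
    if isinstance(obj, str):
        if "\n" in obj and not obj.endswith("\n"):
            return obj + "\n"
        return obj
    elif isinstance(obj, list):
        return [fix_multiline_strings(item) for item in obj]
    elif isinstance(obj, tuple):
        return tuple(fix_multiline_strings(item) for item in obj)
    elif isinstance(obj, dict):
        return {key: fix_multiline_strings(value) for key, value in obj.items()}
    elif isinstance(obj, set):
        return {fix_multiline_strings(item) for item in obj}
    # Any other type (int, None, ...) passes through unchanged.
    return obj
```

The trailing newline matters because YAML emitters only use the readable `|` block style for strings that end in a newline; single-line strings are left untouched.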
aws-deadline_deadline-cloud | public | public | 0 | 0 | _cli_object_repr | def _cli_object_repr(obj: Any) -> str:"""Transforms an API response object into a string, for printing asCLI output. This formats the output as YAML, using the "|"-stylefor multi-line strings."""# If a multi-line string does not end with an "\n", the formatting# will not use the "|"-style yaml. We fix that up be modifying such# strings to end with "\n".obj = _fix_multiline_strings(obj)return deadline_yaml_dump(obj) | 1 | 3 | 1 | 21 | 1 | 184 | 194 | 184 | obj | ['obj'] | str | {"Assign": 1, "Expr": 1, "Return": 1} | 2 | 11 | 2 | ["_fix_multiline_strings", "deadline_yaml_dump"] | 17 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.farm_group_py.farm_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.farm_group_py.farm_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.fleet_group_py.fleet_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.fleet_group_py.fleet_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_cancel", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_list", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_requeue_tasks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_trace_schedule", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_wait_for_completion", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_paramdefs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.worker_group_py.worker_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.worker_group_py.worker_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._categorize_jobs_in_checkpoint", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_py.test_cli_object_repr"] | The function (_cli_object_repr) is defined within the public class called public. The function starts at line 184 and ends at 194. It contains 3 lines of code and has a cyclomatic complexity of 1. 
It takes 1 parameter (obj) and returns str. It makes 2 function calls: ["_fix_multiline_strings", "deadline_yaml_dump"]. It is called by 17 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.farm_group_py.farm_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.farm_group_py.farm_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.fleet_group_py.fleet_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.fleet_group_py.fleet_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_cancel", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_requeue_tasks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_trace_schedule", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_wait_for_completion", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.queue_paramdefs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.worker_group_py.worker_get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.worker_group_py.worker_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._categorize_jobs_in_checkpoint", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_py.test_cli_object_repr"]. |
aws-deadline_deadline-cloud | _ProgressBarCallbackManager | protected | 0 | 0 | __init__ | def __init__(self, length: int, label: str):self._length = lengthself._label = labelself._bar_status = self.BAR_NOT_CREATEDself._exit_stack = ExitStack() | 1 | 5 | 3 | 37 | 0 | 207 | 211 | 207 | self,length,label | [] | None | {"Assign": 4} | 1 | 5 | 1 | ["ExitStack"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the protected class called _ProgressBarCallbackManager.The function start at line 207 and ends at 211. It contains 5 lines of code and it has a cyclomatic complexity of 1. It takes 3 parameters, represented as [207.0] and does not return any value. 
It declares 1 function; the 1 function called inside is ["ExitStack"]; the 4993 functions calling this function are the same ones listed in the incoming_function_names column. |
aws-deadline_deadline-cloud | _ProgressBarCallbackManager | protected | 0 | 0 | callback | def callback(self, upload_metadata: ProgressReportMetadata) -> bool:if self._bar_status == self.BAR_CLOSED:# from multithreaded execution this can be called after completion somtimes.return sigint_handler.continue_operationelif self._bar_status == self.BAR_NOT_CREATED:# Note: click doesn't export the return type of progressbar(), so we suppress mypy warnings for# not annotating the type of hashing_progress.self._upload_progress = click.progressbar(length=self._length, label=self._label)# type: ignore[var-annotated]self._exit_stack.enter_context(self._upload_progress)self._bar_status = self.BAR_CREATEDtotal_progress = int(upload_metadata.progress)new_progress = total_progress - self._upload_progress.posif new_progress > 0:self._upload_progress.update(new_progress)if total_progress == self._length or not sigint_handler.continue_operation:self._bar_status = self.BAR_CLOSEDself._exit_stack.close()return sigint_handler.continue_operation | 6 | 15 | 2 | 130 | 0 | 213 | 233 | 213 | self,upload_metadata | [] | bool | {"Assign": 5, "Expr": 3, "If": 4, "Return": 2} | 5 | 21 | 5 | ["click.progressbar", "self._exit_stack.enter_context", "int", "self._upload_progress.update", "self._exit_stack.close"] | 42 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.asyncdns_py.DNSResolver._call_callback", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.asyncdns_py.DNSResolver.resolve", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.eventloop_py.EventLoop.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3695822_home_assistant_libs_aiohue.aiohue.v2.controllers.base_py.BaseResourcesController._handle_event", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3695822_home_assistant_libs_aiohue.aiohue.v2.controllers.events_py.EventStream.emit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928034_celery_py_amqp.amqp.channel_py.Channel._on_basic_ack", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928034_celery_py_amqp.amqp.channel_py.Channel._on_basic_cancel", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928034_celery_py_amqp.amqp.channel_py.Channel._on_basic_nack", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928034_celery_py_amqp.amqp.channel_py.Channel._on_basic_return", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928034_celery_py_amqp.amqp.connection_py.Connection.connect", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928034_celery_py_amqp.amqp.method_framing_py.frame_handler", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3931660_pyserial_pyserial.examples.port_publisher_py.Forwarder.close", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957198_kvesteri_sqlalchemy_utils.sqlalchemy_utils.observer_py.PropertyObserver.invoke_callbacks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957974_asottile_pyupgrade.pyupgrade._main_py._fix_plugins", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.invert_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3958470_robinostlund_homeassistant_volkswagencarnet.custom_components.volkswagencarnet.__init___py.VolkswagenEntity.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967974_boxed_mutmut.mutmut.__main___py.CatchOutput.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy._utils_py.parallelize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy.gr._nhood_py._nhood_enrichment_helper", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.control.discovery_py.DiscoveryService.start_service", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69951274_onekey_sec_unblob.python.unblob.sandbox_py.Sandbox.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.test.splitgraph.commands.test_layered_querying_py.test_disjoint_table_lq_temp_table_deletion_doesnt_lock_up", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.test.splitgraph.commands.test_layered_querying_py.test_disjoint_table_lq_two_singletons_one_overwritten_indirect", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.utils.parse_conf_py.try_to_encode", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py.Context.invoke"] | The function (callback) is defined within the protected class called _ProgressBarCallbackManager. It starts at line 213 and ends at 233, contains 15 lines of code, and has a cyclomatic complexity of 6. It takes 2 parameters and returns a bool. 
It declares 5 functions; the 5 functions called inside are ["click.progressbar", "self._exit_stack.enter_context", "int", "self._upload_progress.update", "self._exit_stack.close"]; the 42 functions calling this function are the same ones listed in the incoming_function_names column. |
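The callback row above drips progress into a click progress bar by forwarding only the positive delta between the reported total and the bar's current position, so repeated or stale reports from multithreaded uploads never move the bar backwards. The sketch below isolates that delta logic; `FakeBar` is a hypothetical stand-in for `click.progressbar`, and the boolean return is simplified to a completion flag rather than the row's `sigint_handler.continue_operation`.

```python
class FakeBar:
    """Hypothetical stand-in for click.progressbar; tracks position only."""

    def __init__(self, length: int):
        self.length = length
        self.pos = 0

    def update(self, n: int) -> None:
        self.pos += n


def make_progress_callback(bar: FakeBar):
    def callback(total_progress: int) -> bool:
        # Reports may repeat or regress; apply only a positive delta.
        new_progress = total_progress - bar.pos
        if new_progress > 0:
            bar.update(new_progress)
        # Simplified: the original returns a SIGINT continue-operation flag.
        return bar.pos < bar.length

    return callback


bar = FakeBar(100)
cb = make_progress_callback(bar)
cb(30)          # advances the bar by 30
cb(20)          # stale report: no backwards movement
print(bar.pos)  # 30
print(cb(100))  # False once the bar is full
```

The same pattern appears in the row's `new_progress = total_progress - self._upload_progress.pos` followed by a guarded `update()` call.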
aws-deadline_deadline-cloud | public | public | 0 | 0 | parse_query_string | def parse_query_string(query_string: str, parameter_names: List[str], required_parameter_names: List[str]) -> Dict[str, str]:"""Parses the URL query string into {"parameter": "value"} form.Any "-" in the parameter names are switched to "_" in the result.Args:query_string (str): The query string from a parsed web URL."""result: dict = {}if query_string:parsed_qs = urllib.parse.parse_qs(query_string, strict_parsing=True)else:parsed_qs = {}# Ensure the required parameters are providedmissing_required_parameters = set(required_parameter_names) - set(parsed_qs.keys())if missing_required_parameters:raise DeadlineOperationError(f"The URL query did not contain the required parameter(s) {list(missing_required_parameters)}")# Process all the valid parameter namesfor name in parameter_names:values = parsed_qs.pop(name, None)if values:if len(values) > 1:raise DeadlineOperationError(f"The URL query parameter {name} was provided multiple times, it may only be provided once.")result[name.replace("-", "_")] = values[0]# If there are any left, they are not validif parsed_qs:raise DeadlineOperationError(f"The URL query contained unsupported parameter names {parsed_qs.keys()}")return result | 7 | 26 | 3 | 141 | 3 | 29 | 70 | 29 | query_string,parameter_names,required_parameter_names | ['parsed_qs', 'missing_required_parameters', 'values'] | Dict[str, str] | {"AnnAssign": 1, "Assign": 5, "Expr": 1, "For": 1, "If": 5, "Return": 1} | 12 | 42 | 12 | ["urllib.parse.parse_qs", "set", "set", "parsed_qs.keys", "DeadlineOperationError", "list", "parsed_qs.pop", "len", "DeadlineOperationError", "name.replace", "DeadlineOperationError", "parsed_qs.keys"] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.handle_web_url_command_py.cli_handle_web_url", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_parse_query_string", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_parse_query_string_duplicate_parameters", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_parse_query_string_extra_parameters", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_parse_query_string_missing_required"] | The function (parse_query_string) is defined within the public class called public. It starts at line 29 and ends at 70, contains 26 lines of code, and has a cyclomatic complexity of 7. It takes 3 parameters and returns a Dict[str, str]. 
It declares 12 functions; the 12 functions called inside are ["urllib.parse.parse_qs", "set", "set", "parsed_qs.keys", "DeadlineOperationError", "list", "parsed_qs.pop", "len", "DeadlineOperationError", "name.replace", "DeadlineOperationError", "parsed_qs.keys"]; the 5 functions calling this function are the same ones listed in the incoming_function_names column. |
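The parse_query_string row stores its function_body with the newlines stripped; a runnable reconstruction is sketched below. `DeadlineOperationError` is stubbed locally so the sketch is self-contained; the real class lives in the deadline-cloud package.

```python
import urllib.parse
from typing import Dict, List


class DeadlineOperationError(Exception):
    """Local stub for the deadline-cloud exception type."""


def parse_query_string(
    query_string: str, parameter_names: List[str], required_parameter_names: List[str]
) -> Dict[str, str]:
    """Parses the URL query string into {"parameter": "value"} form.

    Any "-" in the parameter names are switched to "_" in the result.
    """
    result: dict = {}
    parsed_qs = urllib.parse.parse_qs(query_string, strict_parsing=True) if query_string else {}

    # Ensure the required parameters are provided
    missing = set(required_parameter_names) - set(parsed_qs.keys())
    if missing:
        raise DeadlineOperationError(
            f"The URL query did not contain the required parameter(s) {list(missing)}"
        )

    # Process all the valid parameter names
    for name in parameter_names:
        values = parsed_qs.pop(name, None)
        if values:
            if len(values) > 1:
                raise DeadlineOperationError(
                    f"The URL query parameter {name} was provided multiple times, it may only be provided once."
                )
            result[name.replace("-", "_")] = values[0]

    # If there are any left, they are not valid
    if parsed_qs:
        raise DeadlineOperationError(
            f"The URL query contained unsupported parameter names {parsed_qs.keys()}"
        )
    return result


print(parse_query_string("farm-id=farm-123&queue-id=queue-456", ["farm-id", "queue-id"], ["farm-id"]))
# {'farm_id': 'farm-123', 'queue_id': 'queue-456'}
```

Note that `parse_qs` returns a list per key, which is why a second value for the same parameter triggers the duplicate-parameter error.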
aws-deadline_deadline-cloud | public | public | 0 | 0 | validate_resource_ids | def validate_resource_ids(ids: Dict[str, str]) -> None:"""Validates that the resource IDs are all valid.i.e. "<name of resource>-<a hexadecimal string of length 32>"for Task ID: "task-<a hexadecimal string of length 32>-<0 or a number up to 10 digits long>"Args:ids (Dict[str, str]): The resource IDs to validate.Expected to be {"<resource type>_id": "<full id string>"} form."""for id_name, id_str in ids.items():resource_type = id_str.split("-")[0]if not id_name.startswith(resource_type) or not validate_id_format(resource_type, id_str):raise DeadlineOperationError(f'The given resource ID "{id_name}": "{id_str}" has invalid format.') | 4 | 7 | 1 | 60 | 1 | 73 | 88 | 73 | ids | ['resource_type'] | None | {"Assign": 1, "Expr": 1, "For": 1, "If": 1} | 5 | 16 | 5 | ["ids.items", "id_str.split", "id_name.startswith", "validate_id_format", "DeadlineOperationError"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.handle_web_url_command_py.cli_handle_web_url", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_validate_resource_ids_failed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_validate_resource_ids_successful"] | The function (validate_resource_ids) is defined within the public class called public. It starts at line 73 and ends at 88, contains 7 lines of code, and has a cyclomatic complexity of 4. It takes 1 parameter (ids) and does not return any value. 
It declares 5 functions; the 5 functions called inside are ["ids.items", "id_str.split", "id_name.startswith", "validate_id_format", "DeadlineOperationError"]; the 3 functions calling this function are the same ones listed in the incoming_function_names column. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | validate_id_format | def validate_id_format(resource_type: str, full_id_str: str) -> bool:"""Validates if the ID is in correct format. The ID must- start with one of the resource names, and followed by a hyphen ("-"), and- the string that follows must be a hexadecimal string of length 32.- Additional for "task": followed by another hyphen ("-"), and ends witha string of either 0 or a number up to 10 digits long.Args:full_id_str (str): The ID to validate.resource_type (str): "farm", "queue", "job", etc."""if resource_type not in VALID_RESOURCE_NAMES_IN_ID:return Falseprefix = f"{resource_type}-"if not full_id_str.startswith(prefix):return Falseid_str = full_id_str[len(prefix) :]if resource_type == "task":return bool(DEADLINE_TASK_ID_PATTERN.fullmatch(id_str))else:if len(id_str) != 32:return Falsereturn bool(DEADLINE_ID_PATTERN.fullmatch(id_str)) | 5 | 13 | 2 | 84 | 2 | 91 | 117 | 91 | resource_type,full_id_str | ['id_str', 'prefix'] | bool | {"Assign": 2, "Expr": 1, "If": 4, "Return": 5} | 7 | 27 | 7 | ["full_id_str.startswith", "len", "bool", "DEADLINE_TASK_ID_PATTERN.fullmatch", "len", "bool", "DEADLINE_ID_PATTERN.fullmatch"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._deadline_web_url_py.validate_resource_ids", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_validate_id_format_failed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_validate_id_format_successful"] | The function (validate_id_format) defined within the public class called public.The function start at line 91 and ends at 117. It contains 13 lines of code and it has a cyclomatic complexity of 5. 
It takes 2 parameters and returns a bool. It declares 7 functions; the 7 functions called inside are ["full_id_str.startswith", "len", "bool", "DEADLINE_TASK_ID_PATTERN.fullmatch", "len", "bool", "DEADLINE_ID_PATTERN.fullmatch"]; the 3 functions calling this function are the same ones listed in the incoming_function_names column. |
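The validate_id_format row depends on module-level constants that fall outside this excerpt. The sketch below reconstructs them from the row's docstring: the resource-name set and the two regex patterns are assumptions, not the package's actual definitions in `_deadline_web_url.py`.

```python
import re

# Assumed reconstructions of the excerpt's module-level constants.
VALID_RESOURCE_NAMES_IN_ID = {"farm", "queue", "fleet", "job", "step", "task"}
DEADLINE_ID_PATTERN = re.compile(r"[0-9a-f]{32}")
# Tasks append "-<0 or a number up to 10 digits long>" to the 32 hex chars.
DEADLINE_TASK_ID_PATTERN = re.compile(r"[0-9a-f]{32}-(0|[0-9]{1,10})")


def validate_id_format(resource_type: str, full_id_str: str) -> bool:
    """True if full_id_str is "<resource_type>-<32 hex chars>" (tasks allow a digit suffix)."""
    if resource_type not in VALID_RESOURCE_NAMES_IN_ID:
        return False
    prefix = f"{resource_type}-"
    if not full_id_str.startswith(prefix):
        return False
    id_str = full_id_str[len(prefix):]
    if resource_type == "task":
        return bool(DEADLINE_TASK_ID_PATTERN.fullmatch(id_str))
    if len(id_str) != 32:
        return False
    return bool(DEADLINE_ID_PATTERN.fullmatch(id_str))
```

The validate_resource_ids row in turn loops over `{"<resource type>_id": "<full id string>"}` pairs, derives `resource_type` from the id's prefix, and delegates to this check.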
aws-deadline_deadline-cloud | public | public | 0 | 0 | install_deadline_web_url_handler | def install_deadline_web_url_handler(all_users: bool) -> None:"""Installs the called AWS Deadline Cloud CLI command as the deadline:// web URL handler."""if sys.platform == "win32":import winreg# Get the CLI program path, either an .exe or a .py with the Python interpreterdeadline_cli_program = os.path.abspath(sys.argv[0])if deadline_cli_program.endswith(".py"):deadline_cli_prefix = f'"{sys.executable}" "{deadline_cli_program}"'else:deadline_cli_program = deadline_cli_program + ".exe"deadline_cli_prefix = f'"{deadline_cli_program}"'if not os.path.isfile(deadline_cli_program):raise DeadlineOperationError(f"Error determining the AWS Deadline Cloud CLI program, {deadline_cli_program} does not exist.")logger.info(f'Installing "{deadline_cli_program}" as the handler for {DEADLINE_URL_SCHEME_NAME} URLs')try:hkey: Any = Nonehkey_command: Any = Noneif all_users:hkey = winreg.CreateKeyEx(winreg.HKEY_CLASSES_ROOT, DEADLINE_URL_SCHEME_NAME)else:hkey = winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, f"Software\\Classes\\{DEADLINE_URL_SCHEME_NAME}")winreg.SetValueEx(hkey, None, 0, winreg.REG_SZ, "URL:AWS Deadline Cloud Protocol")winreg.SetValueEx(hkey, "URL Protocol", 0, winreg.REG_SZ, "")hkey_command = winreg.CreateKeyEx(hkey, "shell\\open\\command")winreg.SetValueEx(hkey_command,None,0,winreg.REG_SZ,f'{deadline_cli_prefix} handle-web-url "%1" --prompt-when-complete',)except OSError as e:if all_users and e.winerror == 5:# Access deniedraise DeadlineOperationError(f"Administrator access is required to install the {DEADLINE_URL_SCHEME_NAME} URL handler for all users:\n{e}")else:raise DeadlineOperationError(f"Failed to install the handler for {DEADLINE_URL_SCHEME_NAME} URLs:\n{e}")finally:if hkey_command:winreg.CloseKey(hkey_command)if hkey:winreg.CloseKey(hkey)elif sys.platform == "linux":import subprocessimport shutilif shutil.which("update-desktop-database") is None:raise 
DeadlineOperationError(f"Failed to install the handler for {DEADLINE_URL_SCHEME_NAME} URLs: update-desktop-database is not installed.")# Get the CLI program pathdeadline_cli_program = os.path.abspath(sys.argv[0])if all_users:entry_dir = "/usr/share/applications"mimeapps_list_file_path = "/usr/share/applications/mimeapps.list"else:entry_dir = os.path.expanduser("~/.local/share/applications")mimeapps_list_file_path = os.path.expanduser("~/.config/mimeapps.list")try:os.makedirs(entry_dir, exist_ok=True)except OSError as e:raise DeadlineOperationError(f"Failed to create a directory: {e}")desktop_file_path = os.path.join(entry_dir, "deadline.desktop")desktop_file_content = f"""[Desktop Entry]Type=ApplicationName={DEADLINE_URL_SCHEME_NAME}Exec={deadline_cli_program} handle-web-url %uType=ApplicationTerminal=trueMimeType=x-scheme-handler/{DEADLINE_URL_SCHEME_NAME}"""mimeapps_file_content = f"""[Default Applications]x-scheme-handler/{DEADLINE_URL_SCHEME_NAME}={DEADLINE_URL_SCHEME_NAME}.desktop;"""with open(desktop_file_path, "w") as desktop_file:desktop_file.write(desktop_file_content)with open(mimeapps_list_file_path, "w") as mimeapps_list_file:mimeapps_list_file.write(mimeapps_file_content)try:subprocess.run(["update-desktop-database", entry_dir], check=True)except subprocess.CalledProcessError as e:raise DeadlineOperationError(f"Failed to install the handler for {DEADLINE_URL_SCHEME_NAME} URLs:\n{e}") from eelse:raise DeadlineOperationError(f"Installing the web URL handler is not supported on OS {sys.platform}") | 16 | 82 | 1 | 418 | 9 | 120 | 230 | 120 | all_users | ['deadline_cli_prefix', 'mimeapps_file_content', 'hkey_command', 'entry_dir', 'desktop_file_path', 'desktop_file_content', 'deadline_cli_program', 'mimeapps_list_file_path', 'hkey'] | None | {"AnnAssign": 2, "Assign": 15, "Expr": 11, "If": 10, "Try": 3, "With": 2} | 30 | 111 | 30 | ["os.path.abspath", "deadline_cli_program.endswith", "os.path.isfile", "DeadlineOperationError", "logger.info", 
"winreg.CreateKeyEx", "winreg.CreateKeyEx", "winreg.SetValueEx", "winreg.SetValueEx", "winreg.CreateKeyEx", "winreg.SetValueEx", "DeadlineOperationError", "DeadlineOperationError", "winreg.CloseKey", "winreg.CloseKey", "shutil.which", "DeadlineOperationError", "os.path.abspath", "os.path.expanduser", "os.path.expanduser", "os.makedirs", "DeadlineOperationError", "os.path.join", "open", "desktop_file.write", "open", "mimeapps_list_file.write", "subprocess.run", "DeadlineOperationError", "DeadlineOperationError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.handle_web_url_command_py.cli_handle_web_url"] | The function (install_deadline_web_url_handler) is defined within the public class called public. It starts at line 120 and ends at 230, contains 82 lines of code, and has a cyclomatic complexity of 16. It takes 1 parameter (all_users) and does not return any value. It declares 30 functions; the 30 functions called inside are the same ones listed in the outgoing_function_names column; the 1 function calling this function is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.handle_web_url_command_py.cli_handle_web_url"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | uninstall_deadline_web_url_handler | def uninstall_deadline_web_url_handler(all_users: bool) -> None:"""Uninstalls the called AWS Deadline Cloud CLI command as the deadline:// web URL handler."""if sys.platform == "win32":import winreglogger.info(f"Unstalling the handler for {DEADLINE_URL_SCHEME_NAME} URLs")try:hkey: Any = Noneif all_users:hkey = winreg.HKEY_CLASSES_ROOTelse:hkey = winreg.OpenKeyEx(winreg.HKEY_CURRENT_USER, "Software\\Classes")winreg.DeleteKeyEx(hkey, f"{DEADLINE_URL_SCHEME_NAME}\\shell\\open\\command")winreg.DeleteKeyEx(hkey, f"{DEADLINE_URL_SCHEME_NAME}\\shell\\open")winreg.DeleteKeyEx(hkey, f"{DEADLINE_URL_SCHEME_NAME}\\shell")winreg.DeleteKeyEx(hkey, DEADLINE_URL_SCHEME_NAME)except OSError as e:if e.winerror == 2:# Cannot find the specified keyclick.echo(f"Nothing to uninstall, no handler for {DEADLINE_URL_SCHEME_NAME} URLs was installed")else:raise DeadlineOperationError(f"Failed to uninstall handler for {DEADLINE_URL_SCHEME_NAME} URLs:\n{e}")finally:if hkey and hkey != winreg.HKEY_CLASSES_ROOT:winreg.CloseKey(hkey)elif sys.platform == "linux":import subprocessimport shutilif shutil.which("update-desktop-database") is None:raise DeadlineOperationError(f"Failed to uninstall the handler for {DEADLINE_URL_SCHEME_NAME} URLs: update-desktop-database is not installed.")logger.info(f"Unstalling the handler for {DEADLINE_URL_SCHEME_NAME} URLs")if all_users:entry_dir = "/usr/share/applications"else:entry_dir = os.path.expanduser("~/.local/share/applications")desktop_file_path = f"{entry_dir}/{DEADLINE_URL_SCHEME_NAME}.desktop"try:os.remove(desktop_file_path)print(f"Removed {desktop_file_path}")except FileNotFoundError:print(f"{desktop_file_path} not found, nothing to remove")try:subprocess.run(["update-desktop-database", entry_dir], check=True)except subprocess.CalledProcessError as e:raise DeadlineOperationError(f"Failed to uninstall the handler for {DEADLINE_URL_SCHEME_NAME} URLs:\n{e}") from 
eelse:raise DeadlineOperationError(f"Uninstalling the web URL handler is not supported on OS {sys.platform}") | 13 | 54 | 1 | 250 | 3 | 233 | 298 | 233 | all_users | ['entry_dir', 'desktop_file_path', 'hkey'] | None | {"AnnAssign": 1, "Assign": 5, "Expr": 13, "If": 7, "Try": 3} | 19 | 66 | 19 | ["logger.info", "winreg.OpenKeyEx", "winreg.DeleteKeyEx", "winreg.DeleteKeyEx", "winreg.DeleteKeyEx", "winreg.DeleteKeyEx", "click.echo", "DeadlineOperationError", "winreg.CloseKey", "shutil.which", "DeadlineOperationError", "logger.info", "os.path.expanduser", "os.remove", "print", "print", "subprocess.run", "DeadlineOperationError", "DeadlineOperationError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.handle_web_url_command_py.cli_handle_web_url"] | The function (uninstall_deadline_web_url_handler) is defined within the public class called public. It starts at line 233 and ends at 298, contains 54 lines of code, and has a cyclomatic complexity of 13. It takes 1 parameter (all_users) and does not return any value. It declares 19 functions; the 19 functions called inside are the same ones listed in the outgoing_function_names column; the 1 function calling this function is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.handle_web_url_command_py.cli_handle_web_url"]. |
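On Linux, the two URL-handler rows above register and remove a freedesktop `x-scheme-handler`: a `.desktop` entry plus a `mimeapps.list` default, followed by `update-desktop-database`. A minimal sketch of building those file contents; the scheme name "deadline" is an assumed value of the rows' `DEADLINE_URL_SCHEME_NAME` constant, and the file layout is a simplification of the rows' f-string templates.

```python
from typing import Dict

DEADLINE_URL_SCHEME_NAME = "deadline"  # assumed value of the constant in the rows above


def build_handler_files(cli_program: str, scheme: str = DEADLINE_URL_SCHEME_NAME) -> Dict[str, str]:
    """Return {filename: contents} for a deadline:// x-scheme-handler registration."""
    desktop = (
        "[Desktop Entry]\n"
        "Type=Application\n"
        f"Name={scheme}\n"
        f"Exec={cli_program} handle-web-url %u\n"
        "Terminal=true\n"
        f"MimeType=x-scheme-handler/{scheme}\n"
    )
    mimeapps = (
        "[Default Applications]\n"
        f"x-scheme-handler/{scheme}={scheme}.desktop;\n"
    )
    return {"deadline.desktop": desktop, "mimeapps.list": mimeapps}
```

The install row writes these files under `/usr/share/applications` (all users) or `~/.local/share/applications` (current user) and then runs `update-desktop-database`; the uninstall row removes the `.desktop` file and reruns the same command.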
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_download_candidate_jobs | def _get_download_candidate_jobs(boto3_session: boto3.Session,farm_id: str,queue_id: str,starting_timestamp: datetime,print_function_callback: Callable[[str], None] = lambda msg: None,) -> dict[str, dict[str, Any]]:"""Uses deadline:SearchJobs queries to get a dict {job_id: job} of download candidates for the queue.This is a superset of all the jobs that have produced any output for download sincethe provided starting_timestamp.Args:boto3_session: The boto3.Session for accessing AWS.farm_id: The farm id for the operation.queue_id: The queue id for the operation.starting_timestamp: The point in time from which to look for new download outputs.print_function_callback: Callback for printing output to the terminal or log.Returns:A dictionary mapping job id to the job as returned by the deadline.search_jobs API."""print_function_callback("Retrieving updated data from Deadline Cloud...")start_time = datetime.now(tz=timezone.utc)# Construct the full set of jobs that may have new available downloads.# - Any active job (job with taskRunStatus in READY, ASSIGNED,# STARTING, SCHEDULED, or RUNNING), that has at least one SUCCEEDED task.download_candidate_jobs = {job["jobId"]: jobfor job in _list_jobs_by_filter_expression(boto3_session,farm_id,queue_id,filter_expression={"filters": [{"stringFilter": {"name": "TASK_RUN_STATUS","operator": "EQUAL","value": status_value,},}# Maximum of 3 filters are permitted, so the 5 statuses are splitfor status_value in ["READY", "ASSIGNED", "STARTING"]],"operator": "OR",},)}download_candidate_jobs.update({job["jobId"]: jobfor job in _list_jobs_by_filter_expression(boto3_session,farm_id,queue_id,filter_expression={"filters": [{"stringFilter": {"name": "TASK_RUN_STATUS","operator": "EQUAL","value": status_value,},}for status_value in ["SCHEDULED", "RUNNING"]],"operator": "OR",},)})print(f"DEBUG: Got {len(download_candidate_jobs)} active 
jobs")download_candidate_jobs = {job_id: _datetimes_to_str(job)for job_id, job in download_candidate_jobs.items()if job["taskRunStatusCounts"]["SUCCEEDED"] > 0}print(f"DEBUG: Filtered down to {len(download_candidate_jobs)} active jobs based on SUCCEEDED task filter")# - Any recently ended job (job went from active to terminal with a taskRunStatus# in SUSPENDED, CANCELED, FAILED, SUCCEEDED, NOT_COMPATIBLE), that has at least# one SUCCEEDED task. The endedAt timestamp field gets updated when that occurs.recently_ended_jobs = _list_jobs_by_filter_expression(boto3_session,farm_id,queue_id,filter_expression={"filters": [{"dateTimeFilter": {"name": "ENDED_AT","dateTime": starting_timestamp,"operator": "GREATER_THAN_EQUAL_TO",}}],"operator": "AND",},)print(f"DEBUG: Got {len(recently_ended_jobs)} jobs with job[endedAt] >= {starting_timestamp.astimezone().isoformat()}")# Filter to jobs where the count of SUCCEEDED tasks is positive.recently_ended_jobs = [job for job in recently_ended_jobs if job["taskRunStatusCounts"]["SUCCEEDED"] > 0]print(f"DEBUG: Filtered down to {len(recently_ended_jobs)} jobs based on SUCCEEDED task filter")download_candidate_jobs.update({job["jobId"]: _datetimes_to_str(job) for job in recently_ended_jobs})duration = datetime.now(tz=timezone.utc) - start_timeprint_function_callback(f"...retrieval completed in {duration}")return download_candidate_jobs | 10 | 92 | 6 | 351 | 4 | 70 | 188 | 70 | boto3_session,farm_id,queue_id,starting_timestamp,print_function_callback | ['start_time', 'duration', 'download_candidate_jobs', 'recently_ended_jobs'] | dict[str, dict[str, Any]] | {"Assign": 6, "Expr": 9, "Return": 1} | 22 | 119 | 22 | ["print_function_callback", "datetime.now", "_list_jobs_by_filter_expression", "download_candidate_jobs.update", "_list_jobs_by_filter_expression", "print", "len", "_datetimes_to_str", "download_candidate_jobs.items", "print", "len", "_list_jobs_by_filter_expression", "print", "len", "isoformat", "starting_timestamp.astimezone", 
"print", "len", "download_candidate_jobs.update", "_datetimes_to_str", "datetime.now", "print_function_callback"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"] | The function (_get_download_candidate_jobs) is defined within the public class public. It starts at line 70 and ends at 188. It contains 92 lines of code and has a cyclomatic complexity of 10. It takes 6 parameters and returns a dict[str, dict[str, Any]] mapping job id to job. It makes 22 function calls (listed in the outgoing_function_names column) and is called by 1 function (_incremental_output_download). |
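The row above notes that deadline:SearchJobs permits at most 3 filters per call, so the 5 active task-run statuses are split across two queries whose results are merged into one dict keyed by jobId. A small sketch of that batching step (a generic helper, not the module's actual code):

```python
def chunk_filter_values(values: list[str], max_filters: int = 3) -> list[list[str]]:
    """Split filter values into groups no larger than the per-call filter limit."""
    return [values[i:i + max_filters] for i in range(0, len(values), max_filters)]

# The five "active" statuses from the row above; each group would become one
# SearchJobs call, with the pages merged into a single {jobId: job} dict.
ACTIVE_STATUSES = ["READY", "ASSIGNED", "STARTING", "SCHEDULED", "RUNNING"]
groups = chunk_filter_values(ACTIVE_STATUSES)
```

Merging by jobId also deduplicates any job that appears in more than one query's results, since later `dict.update` calls overwrite earlier entries for the same key.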
aws-deadline_deadline-cloud | public | public | 0 | 0 | _categorize_jobs_in_checkpoint | def _categorize_jobs_in_checkpoint(boto3_session: boto3.Session,farm_id: str,queue_id: str,checkpoint: IncrementalDownloadState,download_candidate_jobs: dict[str, dict[str, Any]],new_completed_timestamp: datetime,print_function_callback: Callable[[str], None] = lambda msg: None,) -> CategorizedJobIds:"""Categorizes the provided download candidate jobs by id into a CategorizedJobIds object,updating the jobs within download_candidate_jobs where necessary.* Calls boto3 deadline.get_job() to get job attachments manifest information and storage profile id if it is not stored yet.Args:boto3_session: The boto3.Session for accessing AWS.farm_id: The farm id for the operation.queue_id: The queue id for the operation.checkpoint: The checkpoint for the incremental download.download_candidate_jobs: The result of a _get_download_candidate_jobs call, {job_id: job} wherejob is a result from a deadline.search_jobs() or deadline.get_job() call.new_completed_timestamp: This is the timestamp value that will be placed incheckpoint.downloads_completed_timestamp when saving the checkpoint.print_function_callback: Callback for printing output to the terminal or log."""deadline = get_session_client(boto3_session, "deadline")checkpoint_jobs = {job.job_id: job.job for job in checkpoint.jobs}checkpoint_job_ids = set(checkpoint_jobs.keys())download_candidate_job_ids = set(download_candidate_jobs.keys())print_function_callback(f"Categorizing {len(checkpoint_jobs)} checkpoint jobs against {len(download_candidate_jobs)} download candidate jobs...")start_time = datetime.now(tz=timezone.utc)finished_tracking_job_ids = checkpoint_job_ids.difference(download_candidate_job_ids)updated_job_ids = checkpoint_job_ids.intersection(download_candidate_job_ids)new_job_ids = download_candidate_job_ids.difference(checkpoint_job_ids)# The following sets get populated while analyzing the jobsunchanged_job_ids = 
set()attachments_free_job_ids = set()missing_storage_profile = set()completed_job_ids = set()# Copy the job attachments manifest data and storage profile id from the checkpoint to the new job objects. This data# is not returned by deadline:SearchJobs, so we need to call deadline:GetJob on every job to retrieve it. This data# on a job don't change, so after the call to deadline:GetJob we can cache it indefinitely.for job_id in updated_job_ids:ip_job = checkpoint_jobs[job_id]dc_job = download_candidate_jobs[job_id]if set(ip_job.keys()) == {"jobId"}:# If the job has a minimal placeholder, move the job id to the new job idsnew_job_ids.add(job_id)elif ip_job["attachments"] is None:# Carry over the minimal placeholder identifying the job as not using job attachmentsdownload_candidate_jobs[job_id] = ip_jobattachments_free_job_ids.add(job_id)elif ip_job["storageProfileId"] is None and checkpoint.local_storage_profile_id is not None:# Carry over the minimal placeholder identifying the job as missing a storage profiledownload_candidate_jobs[job_id] = ip_jobmissing_storage_profile.add(job_id)else:# Copy the attachments manifest metadata as it is not returned by deadline:SearchJobsdc_job["attachments"] = ip_job["attachments"]dc_job["storageProfileId"] = ip_job["storageProfileId"]updated_job_ids.difference_update(attachments_free_job_ids)updated_job_ids.difference_update(new_job_ids)updated_job_ids.difference_update(missing_storage_profile)# Prune jobs that we are (almost) certain have no changes by looking at its task status counts. We treat a job as unchanged if its# value job["taskRunStatusCounts"]["SUCCEEDED"] stayed the same and its timestamp job["endedAt"] stayed the same.## The case this misses (and causes a delay in task output download) is the following sequence: 1/ User requeues one or more steps/tasks.# 2/ Tasks succeed in the correct number to equal the previous value 3/ The incremental output download command sees an equal count# and miscategorizes it as unchanged. 
If that count is all the tasks, the job["endedAt"] timestamp will catch it, and if the count# is less, the next time a task completes the succeeded count will be different.## Because of this potential delay, the checkpoint needs to keep tracking all of the sessions it has seen, and cannot assume# that a session ending before the downloads completed timestamp was already processed.for job_id in updated_job_ids:ip_job = checkpoint_jobs[job_id]dc_job = download_candidate_jobs[job_id]if ip_job["taskRunStatusCounts"]["SUCCEEDED"] == dc_job["taskRunStatusCounts"]["SUCCEEDED"] and ip_job.get("endedAt") == dc_job.get("endedAt"):print_function_callback(f"UNCHANGED Job: {dc_job['name']} ({job_id})")unchanged_job_ids.add(job_id)updated_job_ids.difference_update(unchanged_job_ids)# First make note of any jobs that were dropped from tracking, for example if they were canceled or they failedfor job_id in finished_tracking_job_ids:ip_job = checkpoint_jobs[job_id]if "taskRunStatusCounts" in ip_job:ip_succeeded_task_count = ip_job["taskRunStatusCounts"]["SUCCEEDED"]ip_total_task_count = sum(value for _, value in ip_job["taskRunStatusCounts"].items())else:ip_succeeded_task_count = 0ip_total_task_count = -1# Print something only if the job is more than a minimal "jobId" trackerif set(ip_job.keys()) != {"jobId"}:print_function_callback(f"FINISHED TRACKING Job: {ip_job['name']} ({job_id})")if ip_job["attachments"] is None:print_function_callback("Job without job attachments is no longer active")elif ip_succeeded_task_count == ip_total_task_count:print_function_callback(" Job succeeded")else:print_function_callback(" Job is not a download candidate anymore (likely suspended, canceled or failed)")# Process all the jobs that have updatesfor job_id in updated_job_ids:ip_job = checkpoint_jobs[job_id]dc_job = download_candidate_jobs[job_id]ip_succeeded_task_count = ip_job["taskRunStatusCounts"]["SUCCEEDED"]ip_total_task_count = sum(value for _, value in 
ip_job["taskRunStatusCounts"].items())dc_succeeded_task_count = dc_job["taskRunStatusCounts"]["SUCCEEDED"]dc_total_task_count = sum(value for _, value in dc_job["taskRunStatusCounts"].items())print_function_callback(f"EXISTING Job: {ip_job['name']} ({job_id})")print_function_callback(f"Succeeded tasks (before): {ip_succeeded_task_count} / {ip_total_task_count}")print_function_callback(f"Succeeded tasks (now) : {dc_succeeded_task_count} / {dc_total_task_count}")# Use the CLI output format to produce a diff of the changesip_job_repr: list[str] = _cli_object_repr(ip_job).splitlines()dc_job_repr: list[str] = _cli_object_repr(dc_job).splitlines()for line in difflib.unified_diff(ip_job_repr,dc_job_repr,fromfile="Previous update",tofile="Current update",lineterm="",):print_function_callback(f"{line}")if (dc_succeeded_task_count == dc_total_task_countand "endedAt" in dc_joband datetime.fromisoformat(dc_job["endedAt"]) < new_completed_timestamp):completed_job_ids.add(job_id)updated_job_ids.difference_update(completed_job_ids)# Process all the jobs that are newfor job_id in new_job_ids:dc_job = download_candidate_jobs[job_id]# Call deadline:GetJob to retrieve attachments manifest informationjob = deadline.get_job(jobId=job_id, queueId=queue_id, farmId=farm_id)dc_job["attachments"] = job.get("attachments")dc_job["storageProfileId"] = job.get("storageProfileId")dc_succeeded_task_count = dc_job["taskRunStatusCounts"]["SUCCEEDED"]dc_total_task_count = sum(value for _, value in dc_job["taskRunStatusCounts"].items())print_function_callback(f"NEW Job: {dc_job['name']} ({job_id})")if (dc_job["attachments"] is not Noneand dc_job["storageProfileId"] is Noneand checkpoint.local_storage_profile_id is not None):print_function_callback("WARNING: THE JOB OUTPUT WILL NOT BE DOWNLOADED, IT HAS NO STORAGE PROFILE.")missing_storage_profile.add(job_id)continueprint_function_callback(f"Succeeded tasks: {dc_succeeded_task_count} / {dc_total_task_count}")if dc_job["attachments"] is None:# If the 
job does not use job attachments, save a minimal placeholder to avoid# repeatedly calling deadline:GetJob.download_candidate_jobs[job_id] = dc_job = {"jobId": job_id,"name": dc_job["name"],"attachments": None,}attachments_free_job_ids.add(job_id)print_function_callback("Job does not use job attachments.")else:print_function_callback("Manifest file system paths:")for manifest in dc_job["attachments"]["manifests"]:print_function_callback(f"- {manifest['rootPath']} ({manifest['rootPathFormat']})")if (dc_succeeded_task_count == dc_total_task_countand "endedAt" in dc_joband datetime.fromisoformat(dc_job["endedAt"]) < new_completed_timestamp):completed_job_ids.add(job_id)new_job_ids.difference_update(attachments_free_job_ids)new_job_ids.difference_update(completed_job_ids)new_job_ids.difference_update(missing_storage_profile)result = CategorizedJobIds()result.attachments_free = attachments_free_job_idsresult.missing_storage_profile = missing_storage_profileresult.completed = completed_job_idsresult.inactive = finished_tracking_job_idsresult.added = new_job_idsresult.unchanged = unchanged_job_idsresult.updated = updated_job_idsduration = datetime.now(tz=timezone.utc) - start_timeprint_function_callback(f"...categorization completed in {duration}")return result | 33 | 154 | 8 | 931 | 21 | 219 | 439 | 219 | boto3_session,farm_id,queue_id,checkpoint,download_candidate_jobs,new_completed_timestamp,print_function_callback | ['dc_job', 'new_job_ids', 'checkpoint_jobs', 'ip_job', 'ip_succeeded_task_count', 'checkpoint_job_ids', 'start_time', 'dc_total_task_count', 'attachments_free_job_ids', 'completed_job_ids', 'updated_job_ids', 'duration', 'deadline', 'unchanged_job_ids', 'result', 'download_candidate_job_ids', 'missing_storage_profile', 'finished_tracking_job_ids', 'ip_total_task_count', 'dc_succeeded_task_count', 'job'] | CategorizedJobIds | {"AnnAssign": 2, "Assign": 47, "Expr": 34, "For": 7, "If": 12, "Return": 1} | 74 | 221 | 74 | ["get_session_client", "set", 
"checkpoint_jobs.keys", "set", "download_candidate_jobs.keys", "print_function_callback", "len", "len", "datetime.now", "checkpoint_job_ids.difference", "checkpoint_job_ids.intersection", "download_candidate_job_ids.difference", "set", "set", "set", "set", "set", "ip_job.keys", "new_job_ids.add", "attachments_free_job_ids.add", "missing_storage_profile.add", "updated_job_ids.difference_update", "updated_job_ids.difference_update", "updated_job_ids.difference_update", "ip_job.get", "dc_job.get", "print_function_callback", "unchanged_job_ids.add", "updated_job_ids.difference_update", "sum", "items", "set", "ip_job.keys", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "sum", "items", "sum", "items", "print_function_callback", "print_function_callback", "print_function_callback", "splitlines", "_cli_object_repr", "splitlines", "_cli_object_repr", "difflib.unified_diff", "print_function_callback", "datetime.fromisoformat", "completed_job_ids.add", "updated_job_ids.difference_update", "deadline.get_job", "job.get", "job.get", "sum", "items", "print_function_callback", "print_function_callback", "missing_storage_profile.add", "print_function_callback", "attachments_free_job_ids.add", "print_function_callback", "print_function_callback", "print_function_callback", "datetime.fromisoformat", "completed_job_ids.add", "new_job_ids.difference_update", "new_job_ids.difference_update", "new_job_ids.difference_update", "CategorizedJobIds", "datetime.now", "print_function_callback"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"] | The function (_categorize_jobs_in_checkpoint) is defined within the public class public. It starts at line 219 and ends at 439. It contains 154 lines of code and has a cyclomatic complexity of 33. It takes 8 parameters and returns a CategorizedJobIds object. It makes 74 function calls (listed in the outgoing_function_names column) and is called by 1 function (_incremental_output_download). |
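The function in the row above begins by partitioning job ids with set algebra: checkpoint-only ids have finished tracking, ids in both sets are updates, and candidate-only ids are new. A minimal sketch of that first categorization pass:

```python
def categorize_job_ids(checkpoint_job_ids: set[str], candidate_job_ids: set[str]):
    """Split ids by presence in the checkpoint vs. the download candidates."""
    finished_tracking = checkpoint_job_ids - candidate_job_ids  # no longer a candidate
    updated = checkpoint_job_ids & candidate_job_ids            # tracked and still active
    new = candidate_job_ids - checkpoint_job_ids                # seen for the first time
    return finished_tracking, updated, new

finished, updated, new = categorize_job_ids({"job-1", "job-2"}, {"job-2", "job-3"})
```

The three results are disjoint and cover both input sets, which is what lets the later loops move ids between categories with `difference_update` without double-processing any job.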
aws-deadline_deadline-cloud | public | public | 0 | 0 | _retrieve_sessions_for_job | def _retrieve_sessions_for_job(deadline_client: BaseClient,farm_id: str,queue_id: str,job_id: str,session_ended_threshold: datetime,output_job_sessions: dict[str, list],):"""Uses deadline.list_sessions to get all sessions of the specified job that are still running orthat ended after session_ended_threshold.Places the output into output_job_sessions[job_id]Args:deadline_client: A boto3 client for accessing Deadline.farm_id: The farm id for the operation.queue_id: The queue id for the operation.job_id: The job id to process.session_ended_threshold: The timestamp threshold to filter out older sessions based on the endedAt field.output_job_sessions: A dictionary {job_id: session_list} to populate for the provided job id."""sessions_paginator = deadline_client.get_paginator("list_sessions")session_list: list[dict[str, Any]] = []for sessions_page in sessions_paginator.paginate(farmId=farm_id, queueId=queue_id, jobId=job_id):for session in sessions_page.get("sessions", []):if "endedAt" not in session or session["endedAt"] >= session_ended_threshold:session_list.append(session)if session_list:output_job_sessions[job_id] = session_list | 6 | 18 | 6 | 117 | 1 | 442 | 474 | 442 | deadline_client,farm_id,queue_id,job_id,session_ended_threshold,output_job_sessions | ['sessions_paginator'] | None | {"AnnAssign": 1, "Assign": 2, "Expr": 2, "For": 2, "If": 2} | 4 | 33 | 4 | ["deadline_client.get_paginator", "sessions_paginator.paginate", "sessions_page.get", "session_list.append"] | 0 | [] | The function (_retrieve_sessions_for_job) is defined within the public class public. It starts at line 442 and ends at 474. It contains 18 lines of code and has a cyclomatic complexity of 6. It takes 6 parameters and does not return a value; it populates output_job_sessions[job_id] instead. It makes 4 function calls (listed in the outgoing_function_names column) and has no recorded callers. |
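The row above keeps a session when it is still running (no endedAt field) or when it ended at or after the threshold. The filtering predicate can be sketched in isolation like this (sample session dicts are invented for illustration):

```python
from datetime import datetime, timezone

def keep_session(session: dict, ended_threshold: datetime) -> bool:
    """Keep sessions that are still running (no endedAt) or ended at/after the threshold."""
    return "endedAt" not in session or session["endedAt"] >= ended_threshold

threshold = datetime(2024, 1, 2, tzinfo=timezone.utc)
sessions = [
    {"sessionId": "a"},                                                        # still running
    {"sessionId": "b", "endedAt": datetime(2024, 1, 1, tzinfo=timezone.utc)},  # too old
    {"sessionId": "c", "endedAt": datetime(2024, 1, 3, tzinfo=timezone.utc)},  # recent enough
]
kept = [s["sessionId"] for s in sessions if keep_session(s, threshold)]
```

Checking for the field's absence first matters: a running session has no endedAt yet, so comparing directly would raise a KeyError.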
aws-deadline_deadline-cloud | public | public | 0 | 0 | _retrieve_session_actions_for_session | def _retrieve_session_actions_for_session(deadline_client: BaseClient,checkpoint_job_session_completed_indexes: dict[str, dict[str, int]],farm_id: str,queue_id: str,job_id: str,output_session: dict[str, Any],):"""Args:deadline_client: A boto3 client for accessing Deadline.checkpoint_job_session_completed_indexes: All the jobs' session action indexes loaded from the checkpoint.The value checkpoint_job_session_completed_indexes[job_id][session_id] is the session action index ofthe latest session action that is completed download.farm_id: The farm id for the operation.queue_id: The queue id for the operation.job_id: The job id to process.output_session: The session to populate with a sessionActions field."""session_actions_paginator = deadline_client.get_paginator("list_session_actions")session_action_list: list[dict[str, Any]] = []for session_actions_page in session_actions_paginator.paginate(farmId=farm_id,queueId=queue_id,jobId=job_id,sessionId=output_session["sessionId"],):# Include only succeeded taskRun actions.for session_action in session_actions_page.get("sessionActions", []):succeeded = session_action.get("status") == "SUCCEEDED"is_task_run = "taskRun" in session_action.get("definition", {})if succeeded and is_task_run:session_action_list.append(session_action)if session_action_list:# Extract the session action indexes from the idsfor session_action in session_action_list:# Session action IDs look like "sessionaction-abc123-12" for index 12session_action_index = int(session_action["sessionActionId"].rsplit("-", 1)[-1])session_action["sessionActionIndex"] = session_action_index# Include only session action indexes newer than latest downloaded ones from the checkpointsession_completed_index: Optional[int] = checkpoint_job_session_completed_indexes.get(job_id, {}).get(output_session["sessionId"])if session_completed_index is not None:# Filter out older session 
actions that were already downloadedsession_action_list = [session_actionfor session_action in session_action_listif session_action["sessionActionIndex"] > session_completed_index]if session_action_list:output_session["sessionActions"] = session_action_list | 11 | 36 | 6 | 230 | 5 | 477 | 530 | 477 | deadline_client,checkpoint_job_session_completed_indexes,farm_id,queue_id,job_id,output_session | ['succeeded', 'session_actions_paginator', 'session_action_list', 'session_action_index', 'is_task_run'] | None | {"AnnAssign": 2, "Assign": 7, "Expr": 2, "For": 3, "If": 4} | 10 | 54 | 10 | ["deadline_client.get_paginator", "session_actions_paginator.paginate", "session_actions_page.get", "session_action.get", "session_action.get", "session_action_list.append", "int", "rsplit", "get", "checkpoint_job_session_completed_indexes.get"] | 0 | [] | The function (_retrieve_session_actions_for_session) is defined within the public class public. It starts at line 477 and ends at 530. It contains 36 lines of code and has a cyclomatic complexity of 11. It takes 6 parameters and does not return a value; it populates output_session["sessionActions"] instead. It makes 10 function calls (listed in the outgoing_function_names column) and has no recorded callers. |
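The row above extracts each session action's index from the trailing number of its id (the source comment gives "sessionaction-abc123-12" as index 12) so that actions at or below the checkpoint's completed index can be skipped. The parsing step in isolation:

```python
def session_action_index(session_action_id: str) -> int:
    """The trailing number of a session action id is its index within the session."""
    return int(session_action_id.rsplit("-", 1)[-1])

index = session_action_index("sessionaction-abc123-12")
```

`rsplit("-", 1)` splits only at the last hyphen, so any hyphens inside the middle token of the id do not affect the result.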
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_job_sessions | def _get_job_sessions(boto3_session: boto3.Session,boto3_session_for_s3: boto3.Session,farm_id: str,queue: dict[str, Any],checkpoint_job_session_completed_indexes: dict[str, dict[str, int]],categorized_job_ids: CategorizedJobIds,checkpoint: IncrementalDownloadState,download_candidate_jobs: dict[str, dict[str, Any]],print_function_callback: Callable[[str], None] = lambda msg: None,) -> dict[str, list]:"""This function gets all the job sessions and session actions from the completed, added, and updated jobs.It uses the checkpoint's session_completed_indexes to filter out older session actions that are already downloaded.Args:boto3_session: The boto3.Session for accessing AWS.boto3_session_for_s3: The boto3.Session to use for accessing S3.farm_id: The farm id for the operation.queue: The queue as returned by boto3 deadline.get_queue().checkpoint_job_session_completed_indexes: All the jobs' session action indexes loaded from the checkpoint.The value checkpoint_job_session_completed_indexes[job_id][session_id] is the session action index ofthe latest session action that is completed download.categorized_job_ids: The categorized job ids as returned by _categorize_jobs_in_checkpoint().checkpoint: The checkpoint for the incremental download.download_candidate_jobs: The result of a _get_download_candidate_jobs call, {job_id: job} wherejob is a result from a deadline.search_jobs() or deadline.get_job() call.print_function_callback: Callback for printing output to the terminal or log.Returns:Access a session action in the returned job_sessions withjob_sessions[job_id][session_index]["sessionActions"][session_action_index]The returned structure looks like this:{"<job_id>": [{"sessionId": "<session_id>",...,"sessionActions": [{"sessionActionId": "<session_action_id>",...},...]},...],...}"""job_ids = 
categorized_job_ids.completed.union(categorized_job_ids.added).union(categorized_job_ids.updated)print_function_callback(f"Retrieving sessions for {len(job_ids)} jobs...")start_time = datetime.now(tz=timezone.utc)# The max timestamp of a downloaded session's endedAt provides a lower bound to filter sessions by.# This is tracked in the checkpoint.job_session_ended_timestamp: dict[str, datetime] = {job.job_id: job.session_ended_timestampfor job in checkpoint.jobsif job.session_ended_timestamp is not None}deadline = get_session_client(boto3_session, "deadline")job_sessions: dict[str, list] = {}# Retrieve all the sessions with some parallelismmax_workers = SESSIONS_API_MAX_CONCURRENCYprint_function_callback(f"Using {max_workers} threads")with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as executor:futures = []for job_id in job_ids:# Use the greater of the bootstrap command timestamp and the session ended timestamps# recorded in the checkpoint.session_ended_threshold = job_session_ended_timestamp.get(job_id)if session_ended_threshold is None:session_ended_threshold = checkpoint.downloads_started_timestamp# For all jobs that are not NEW (including re-queued jobs) - i.e. 
completed and updated jobs# Use an eventual consistency window to accept a little extraif job_id not in categorized_job_ids.added:session_ended_threshold = session_ended_threshold - timedelta(seconds=checkpoint.eventual_consistency_max_seconds)futures.append(executor.submit(_retrieve_sessions_for_job,deadline,farm_id,queue["queueId"],job_id,session_ended_threshold,job_sessions,))# surfaces any exceptions in the threadfor future in concurrent.futures.as_completed(futures):future.result()duration = datetime.now(tz=timezone.utc) - start_timeprint_function_callback(f"...retrieval completed in {duration}")print_function_callback("")print_function_callback(f"Retrieving session actions for {sum(len(session_list) for session_list in job_sessions.values())} sessions...")start_time = datetime.now(tz=timezone.utc)# Retrieve all the session actions with some parallelismmax_workers = SESSIONS_API_MAX_CONCURRENCYprint_function_callback(f"Using {max_workers} threads")with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as executor:futures = []for job_id, session_list in job_sessions.items():for session in session_list:futures.append(executor.submit(_retrieve_session_actions_for_session,deadline,checkpoint_job_session_completed_indexes,farm_id,queue["queueId"],job_id,session,))# surfaces any exceptions in the threadfor future in concurrent.futures.as_completed(futures):future.result()duration = datetime.now(tz=timezone.utc) - start_timeprint_function_callback(f"...retrieval completed in {duration}")print_function_callback("")print_function_callback("Populating missing manifest S3 keys...")start_time = datetime.now(tz=timezone.utc)_add_missing_output_manifests_to_job_sessions(boto3_session_for_s3, farm_id, queue, job_sessions, download_candidate_jobs)_filter_session_actions_without_manifests_from_job_sessions(job_sessions,download_candidate_jobs,print_function_callback,)duration = datetime.now(tz=timezone.utc) - start_timeprint_function_callback(f"...populated in 
{duration}")return job_sessions | 10 | 90 | 10 | 495 | 7 | 533 | 687 | 533 | boto3_session,boto3_session_for_s3,farm_id,queue,checkpoint_job_session_completed_indexes,categorized_job_ids,checkpoint,download_candidate_jobs,print_function_callback | ['futures', 'duration', 'deadline', 'session_ended_threshold', 'job_ids', 'max_workers', 'start_time'] | dict[str, list] | {"AnnAssign": 2, "Assign": 15, "Expr": 17, "For": 5, "If": 2, "Return": 1, "With": 2} | 38 | 155 | 38 | ["union", "categorized_job_ids.completed.union", "print_function_callback", "len", "datetime.now", "get_session_client", "print_function_callback", "concurrent.futures.ThreadPoolExecutor", "job_session_ended_timestamp.get", "timedelta", "futures.append", "executor.submit", "concurrent.futures.as_completed", "future.result", "datetime.now", "print_function_callback", "print_function_callback", "print_function_callback", "sum", "len", "job_sessions.values", "datetime.now", "print_function_callback", "concurrent.futures.ThreadPoolExecutor", "job_sessions.items", "futures.append", "executor.submit", "concurrent.futures.as_completed", "future.result", "datetime.now", "print_function_callback", "print_function_callback", "print_function_callback", "datetime.now", "_add_missing_output_manifests_to_job_sessions", "_filter_session_actions_without_manifests_from_job_sessions", "datetime.now", "print_function_callback"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"] | The function (_get_job_sessions) is defined within the public class public. It starts at line 533 and ends at 687. It contains 90 lines of code and has a cyclomatic complexity of 10. It takes 10 parameters and returns a dict[str, list] of sessions keyed by job id. It makes 38 function calls (listed in the outgoing_function_names column) and is called by 1 function (_incremental_output_download). |
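The function in the row above fans API calls out over a ThreadPoolExecutor and then iterates `as_completed(futures)`, calling `future.result()` solely to re-raise any exception thrown in a worker thread. A generic sketch of that fan-out pattern (the worker here is a stand-in, not the module's actual retrieval function):

```python
import concurrent.futures

def fan_out(fn, args_list, max_workers: int = 4) -> None:
    """Submit one task per argument tuple and wait for all of them;
    future.result() re-raises any exception raised inside a worker thread."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as executor:
        futures = [executor.submit(fn, *args) for args in args_list]
        for future in concurrent.futures.as_completed(futures):
            future.result()  # surfaces exceptions from the thread

# Workers write into a shared dict, like _retrieve_sessions_for_job populates
# its output_job_sessions argument rather than returning a value.
results: dict[int, int] = {}
fan_out(lambda job: results.__setitem__(job, job * 2), [(i,) for i in range(5)])
```

Without the `future.result()` loop, a worker exception would be silently swallowed when the future is garbage-collected, so this idiom is what makes thread failures fail the whole download pass.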
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_storage_profiles | def _get_storage_profiles(deadline: BaseClient,farm_id: str,queue: dict[str, Any],job_sessions: dict[str, list],checkpoint: IncrementalDownloadState,download_candidate_jobs: dict[str, dict[str, Any]],) -> dict[str, dict[str, Any]]:"""Retrieves all the needed storage profiles. To call this function,checkpoint.local_storage_profile_id must be set.Args:deadline: An AWS client for calling deadline APIs.farm_id: The farm id for the operation.queue: The queue as returned by boto3 deadline.get_queue().checkpoint: The checkpoint for the incremental download.job_sessions: Contains each job's sessions and session actions, structured as job_sessions[job_id][session_index]["sessionActions"][session_action_index].See the function _get_job_sessions for more details.download_candidate_jobs: The result of a _get_download_candidate_jobs call, {job_id: job} wherejob is a result from a deadline.search_jobs() or deadline.get_job() call."""if checkpoint.local_storage_profile_id is None:raise ValueError("The checkpoint local storage profile id must be set.")# Collect all the storage profile idsstorage_profile_ids: set[str] = {checkpoint.local_storage_profile_id}for job_id in job_sessions.keys():storage_profile_ids.add(download_candidate_jobs[job_id]["storageProfileId"])# Load all the storage profiles from Deadline Cloudstorage_profiles = {storage_profile_id: deadline.get_storage_profile_for_queue(farmId=farm_id,queueId=queue["queueId"],storageProfileId=storage_profile_id,)for storage_profile_id in storage_profile_ids}return storage_profiles | 4 | 22 | 6 | 138 | 1 | 690 | 730 | 690 | deadline,farm_id,queue,job_sessions,checkpoint,download_candidate_jobs | ['storage_profiles'] | dict[str, dict[str, Any]] | {"AnnAssign": 1, "Assign": 1, "Expr": 2, "For": 1, "If": 1, "Return": 1} | 4 | 41 | 4 | ["ValueError", "job_sessions.keys", "storage_profile_ids.add", "deadline.get_storage_profile_for_queue"] | 1 | 
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"] | The function (_get_storage_profiles) defined within the public class called public.The function start at line 690 and ends at 730. It contains 22 lines of code and it has a cyclomatic complexity of 4. It takes 6 parameters, represented as [690.0] and does not return any value. It declares 4.0 functions, It has 4.0 functions called inside which are ["ValueError", "job_sessions.keys", "storage_profile_ids.add", "deadline.get_storage_profile_for_queue"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _create_path_mapping_rule_appliers | def _create_path_mapping_rule_appliers(storage_profiles: dict[str, dict[str, Any]],checkpoint: IncrementalDownloadState,download_candidate_jobs: dict[str, dict[str, Any]],print_function_callback: Callable[[str], None] = lambda msg: None,) -> dict[str, Optional[_PathMappingRuleApplier]]:"""Retrieves all the needed storage profiles and constructs path mapping rule applies for them.To call this function, checkpoint.local_storage_profile_id must be set.Args:storage_profiles: A mapping from storage profile id to the storage profile as returned by boto3 deadline.get_storage_profile_for_queue.checkpoint: The checkpoint for the incremental download.job_sessions: Contains each job's sessions and session actions, structured as job_sessions[job_id][session_index]["sessionActions"][session_action_index].See the function _get_job_sessions for more details.download_candidate_jobs: The result of a _get_download_candidate_jobs call, {job_id: job} wherejob is a result from a deadline.search_jobs() or deadline.get_job() call.print_function_callback: Callback for printing output to the terminal or log."""if checkpoint.local_storage_profile_id is None:raise ValueError("The checkpoint local storage profile id must be set.")path_mapping_rule_appliers: dict[str, Optional[_PathMappingRuleApplier]] = {}# Create a path mapping rule applier for each storage profilelocal_storage_profile = storage_profiles[checkpoint.local_storage_profile_id]local_storage_profile_name = local_storage_profile["displayName"]print_function_callback("")print_function_callback(f"Local storage profile is {local_storage_profile_name} ({checkpoint.local_storage_profile_id})")print_function_callback(f"{len([job for job in download_candidate_jobs.values() if job.get('storageProfileId') == checkpoint.local_storage_profile_id])} download candidate jobs have the same storage profile and will be downloaded to their original 
specified paths")for storage_profile_id, storage_profile in storage_profiles.items():storage_profile_name = storage_profile["displayName"]if storage_profile_id == checkpoint.local_storage_profile_id:path_mapping_rule_appliers[storage_profile_id] = Noneelse:rules = _generate_path_mapping_rules(storage_profile, local_storage_profile)path_mapping_rule_appliers[storage_profile_id] = _PathMappingRuleApplier(rules)# Print the path mapping rules for each source storage profileprint_function_callback("")job_count = len([jobfor job in download_candidate_jobs.values()if job.get("storageProfileId") == storage_profile_id])print_function_callback(f"Path mapping rules for {job_count} download candidate jobs with storage profile {storage_profile_name} ({storage_profile_id})")print_function_callback(f"job storage profile: {storage_profile_name} ({storage_profile['osFamily']})")print_function_callback(f"local storage profile: {local_storage_profile_name} ({local_storage_profile['osFamily']})")if rules:for rule in rules:print_function_callback(f"- from: {rule.source_path}")print_function_callback(f"to: {rule.destination_path}")else:print_function_callback(f" No rules generated. 
Storage profiles {local_storage_profile_name} and {storage_profile_name} share no file system location names.")return path_mapping_rule_appliers | 8 | 51 | 5 | 237 | 5 | 733 | 801 | 733 | storage_profiles,checkpoint,download_candidate_jobs,print_function_callback | ['local_storage_profile_name', 'local_storage_profile', 'storage_profile_name', 'job_count', 'rules'] | dict[str, Optional[_PathMappingRuleApplier]] | {"AnnAssign": 1, "Assign": 7, "Expr": 11, "For": 2, "If": 3, "Return": 1} | 20 | 69 | 20 | ["ValueError", "print_function_callback", "print_function_callback", "print_function_callback", "len", "download_candidate_jobs.values", "job.get", "storage_profiles.items", "_generate_path_mapping_rules", "_PathMappingRuleApplier", "print_function_callback", "len", "download_candidate_jobs.values", "job.get", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"] | The function (_create_path_mapping_rule_appliers), defined within the public class public, starts at line 733 and ends at line 801. It contains 51 lines of code and has a cyclomatic complexity of 8. It takes 5 parameters and returns a dict[str, Optional[_PathMappingRuleApplier]]. It makes 20 calls to other functions (listed in the outgoing_function_names field) and has one caller, _incremental_output_download. |
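Twice in its body, _create_path_mapping_rule_appliers counts how many download-candidate jobs share a given storage profile via an inline `len([...])` comprehension. That step as a small standalone helper (the function name is illustrative; the comprehension is taken from the row's flattened body):

```python
from typing import Any


def count_jobs_with_profile(
    download_candidate_jobs: dict[str, dict[str, Any]],
    storage_profile_id: str,
) -> int:
    # Count the candidate jobs whose storageProfileId matches; jobs with
    # no storageProfileId key are skipped via .get().
    return len(
        [
            job
            for job in download_candidate_jobs.values()
            if job.get("storageProfileId") == storage_profile_id
        ]
    )
```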
aws-deadline_deadline-cloud | public | public | 0 | 0 | _add_missing_output_manifests_to_job_sessions | def _add_missing_output_manifests_to_job_sessions(boto3_session_for_s3: boto3.Session,farm_id: str,queue: dict[str, Any],job_sessions: dict[str, list],download_candidate_jobs: dict[str, dict[str, Any]],):"""Args:boto3_session_for_s3: The boto3.Session to use for accessing S3.farm_id: The farm id for the operation.queue: The queue as returned by boto3 deadline.get_queue().job_sessions: Contains each job's sessions and session actions, structured as job_sessions[job_id][session_index]["sessionActions"][session_action_index].See the function _get_job_sessions for more details.download_candidate_jobs: The result of a _get_download_candidate_jobs call, {job_id: job} wherejob is a result from a deadline.search_jobs() or deadline.get_job() call."""for job_id, session_list in job_sessions.items():job = download_candidate_jobs[job_id]session_action_list = [session_actionfor session in session_listfor session_action in session.get("sessionActions", [])]_add_output_manifests_from_s3(farm_id, queue, job, boto3_session_for_s3, session_action_list) | 4 | 17 | 5 | 97 | 2 | 804 | 830 | 804 | boto3_session_for_s3,farm_id,queue,job_sessions,download_candidate_jobs | ['session_action_list', 'job'] | None | {"Assign": 2, "Expr": 2, "For": 1} | 3 | 27 | 3 | ["job_sessions.items", "session.get", "_add_output_manifests_from_s3"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._get_job_sessions"] | The function (_add_missing_output_manifests_to_job_sessions), defined within the public class public, starts at line 804 and ends at line 830. It contains 17 lines of code and has a cyclomatic complexity of 4. It takes 5 parameters and does not return a value. It makes 3 calls to other functions, which are ["job_sessions.items", "session.get", "_add_output_manifests_from_s3"], and has one caller, _get_job_sessions. |
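The core of _add_missing_output_manifests_to_job_sessions is a double comprehension that flattens every session action across all of a job's sessions before handing the list to _add_output_manifests_from_s3. That flattening step, isolated as a sketch (helper name illustrative, logic from the row's flattened body):

```python
def flatten_session_actions(session_list: list[dict]) -> list[dict]:
    # One flat list of session actions across all sessions; sessions
    # without a "sessionActions" key contribute nothing via .get().
    return [
        session_action
        for session in session_list
        for session_action in session.get("sessionActions", [])
    ]
```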
aws-deadline_deadline-cloud | public | public | 0 | 0 | _filter_session_actions_without_manifests_from_job_sessions | def _filter_session_actions_without_manifests_from_job_sessions(job_sessions: dict[str, list],download_candidate_jobs: dict[str, dict[str, Any]],print_function_callback: Callable[[str], None] = lambda msg: None,):"""Modify job_sessions in place to filter out any session actions that lack any output manifests.Print a warning message for any job that had a session action like this.Args:job_sessions: Contains each job's sessions and session actions, structured as job_sessions[job_id][session_index]["sessionActions"][session_action_index].See the function _get_job_sessions for more details.download_candidate_jobs: The result of a _get_download_candidate_jobs call, {job_id: job} wherejob is a result from a deadline.search_jobs() or deadline.get_job() call.print_function_callback: Callback for printing output to the terminal or log."""for job_id, session_list in job_sessions.items():job = download_candidate_jobs[job_id]total_count = 0filtered_count = 0for session in session_list:total_count += len(session.get("sessionActions", []))# Filter out session actions with no manifest filesfiltered_session_action_list = [session_actionfor session_action in session.get("sessionActions", [])if any(item != {} for item in session_action["manifests"])]filtered_count += len(filtered_session_action_list)if total_count != filtered_count:session["sessionActions"] = filtered_session_action_listif total_count != filtered_count:print_function_callback(f"WARNING: Job {job['name']} ({job_id}) ran {total_count - filtered_count} / {total_count} session actions with no output.")print_function_callback(" This may indicate steps in the job that strictly perform validation or save results elsewhere like a shared file system or S3.") | 8 | 26 | 4 | 149 | 4 | 833 | 870 | 833 | job_sessions,download_candidate_jobs,print_function_callback | ['total_count', 'filtered_session_action_list', 
'job', 'filtered_count'] | None | {"Assign": 5, "AugAssign": 2, "Expr": 3, "For": 2, "If": 2} | 8 | 38 | 8 | ["job_sessions.items", "len", "session.get", "session.get", "any", "len", "print_function_callback", "print_function_callback"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._get_job_sessions"] | The function (_filter_session_actions_without_manifests_from_job_sessions), defined within the public class public, starts at line 833 and ends at line 870. It contains 26 lines of code and has a cyclomatic complexity of 8. It takes 4 parameters and does not return a value. It makes 8 calls to other functions (listed in the outgoing_function_names field) and has one caller, _get_job_sessions. |
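The filtering predicate in _filter_session_actions_without_manifests_from_job_sessions keeps a session action only when its "manifests" list contains at least one non-empty entry. Extracted as a per-session helper (the name is illustrative; the comprehension and `any` test are from the row's flattened body):

```python
def filter_actions_with_output(session: dict) -> list[dict]:
    # Drop session actions whose manifests are all empty dicts, i.e.
    # actions that produced no downloadable output.
    return [
        action
        for action in session.get("sessionActions", [])
        if any(item != {} for item in action["manifests"])
    ]
```

In the original, the filtered list is written back into `session["sessionActions"]` in place, and a warning is printed per job when any actions were dropped.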
aws-deadline_deadline-cloud | public | public | 0 | 0 | _update_checkpoint_jobs_list | def _update_checkpoint_jobs_list(checkpoint: IncrementalDownloadState,download_candidate_jobs: dict[str, dict[str, Any]],categorized_job_ids: CategorizedJobIds,job_sessions: dict[str, list],):"""Update the jobs list in the checkpoint object.Args:checkpoint: The checkpoint for the incremental download.download_candidate_jobs: The result of a _get_download_candidate_jobs call, {job_id: job} wherejob is a result from a deadline.search_jobs() or deadline.get_job() call.categorized_job_ids: The categorized job ids as returned by _categorize_jobs_in_checkpoint().job_sessions: Contains each job's sessions and session actions, structured as job_sessions[job_id][session_index]["sessionActions"][session_action_index].See the function _get_job_sessions for more details."""updated_jobs: list[IncrementalDownloadJob] = []# Produce the session_ended_timestamp for all the job ids. Start# with the values from the previous checkpoint, and then overwrite# them from job_sessionsjob_session_ended_timestamps: dict[str, Optional[datetime]] = {job.job_id: job.session_ended_timestampfor job in checkpoint.jobsif job.session_ended_timestamp is not None}for job_id, session_list in job_sessions.items():max_session_ended_timestamp = Nonefor session in session_list:if "endedAt" in session:if max_session_ended_timestamp is None:max_session_ended_timestamp = session["endedAt"]else:max_session_ended_timestamp = max(max_session_ended_timestamp, session["endedAt"])job_session_ended_timestamps[job_id] = max_session_ended_timestamp# Produce the session_completed_indexes for all the job ids. 
Start# with the values from the previous checkpoint, then overwrite# them from job_sessions.job_session_completed_indexes: dict[str, dict[str, int]] = {job.job_id: job.session_completed_indexes for job in checkpoint.jobs}for job_id, session_list in job_sessions.items():for session in session_list:session_actions = session.get("sessionActions", [])if session_actions:job_session_completed_indexes.setdefault(job_id, {})[session["sessionId"]] = max(session_action["sessionActionIndex"] for session_action in session_actions)job_session_ended_timestamps[job_id] = max_session_ended_timestamp# These categories keep the download_candidate_jobs job as is.for job_id in (categorized_job_ids.added | categorized_job_ids.updated | categorized_job_ids.unchanged):updated_jobs.append(IncrementalDownloadJob(download_candidate_jobs[job_id],job_session_ended_timestamps.get(job_id),job_session_completed_indexes.get(job_id, {}),))# This category keeps a signal that it has no job attachments to process by having an attachments field with None in itfor job_id in categorized_job_ids.attachments_free:updated_jobs.append(IncrementalDownloadJob({"jobId": job_id,"name": download_candidate_jobs[job_id]["name"],"attachments": None,},None,{},))# This category keeps a signal that it is missing a storage profile by populating the attachments but# having a storageProfileId field with None in it. 
By keeping the attachments in the checkpoint, someone# inspecting the checkpoint to understand what's happening can get an idea about the paths for the job# and diagnose problems quicker.for job_id in categorized_job_ids.missing_storage_profile:updated_jobs.append(IncrementalDownloadJob({"jobId": job_id,"name": download_candidate_jobs[job_id]["name"],"attachments": download_candidate_jobs[job_id]["attachments"],"storageProfileId": None,},None,{},))# Keep completed jobs around until they become inactivefor job_id in categorized_job_ids.completed:updated_jobs.append(IncrementalDownloadJob(download_candidate_jobs[job_id],job_session_ended_timestamps.get(job_id),job_session_completed_indexes.get(job_id, {}),))# When a job becomes inactive, keep it around in minimal form when it has a session_ended_timestamp.# This is necessary for the case where a completed job gets requeued later. We can't tell# that it was requeued from the deadline.search_jobs query, so we hold this metadata in the checkpoint.for job_id in categorized_job_ids.inactive:session_ended_timestamp = job_session_ended_timestamps.get(job_id)if session_ended_timestamp is not None:updated_jobs.append(IncrementalDownloadJob({"jobId": job_id}, session_ended_timestamp, {}))checkpoint.jobs = updated_jobs | 18 | 84 | 4 | 445 | 3 | 873 | 987 | 873 | checkpoint,download_candidate_jobs,categorized_job_ids,job_sessions | ['session_ended_timestamp', 'session_actions', 'max_session_ended_timestamp'] | None | {"AnnAssign": 3, "Assign": 9, "Expr": 6, "For": 9, "If": 4} | 21 | 115 | 21 | ["job_sessions.items", "max", "job_sessions.items", "session.get", "job_session_completed_indexes.setdefault", "max", "updated_jobs.append", "IncrementalDownloadJob", "job_session_ended_timestamps.get", "job_session_completed_indexes.get", "updated_jobs.append", "IncrementalDownloadJob", "updated_jobs.append", "IncrementalDownloadJob", "updated_jobs.append", "IncrementalDownloadJob", "job_session_ended_timestamps.get", 
"job_session_completed_indexes.get", "job_session_ended_timestamps.get", "updated_jobs.append", "IncrementalDownloadJob"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"] | The function (_update_checkpoint_jobs_list) defined within the public class called public.The function start at line 873 and ends at 987. It contains 84 lines of code and it has a cyclomatic complexity of 18. It takes 4 parameters, represented as [873.0] and does not return any value. It declares 21.0 functions, It has 21.0 functions called inside which are ["job_sessions.items", "max", "job_sessions.items", "session.get", "job_session_completed_indexes.setdefault", "max", "updated_jobs.append", "IncrementalDownloadJob", "job_session_ended_timestamps.get", "job_session_completed_indexes.get", "updated_jobs.append", "IncrementalDownloadJob", "updated_jobs.append", "IncrementalDownloadJob", "updated_jobs.append", "IncrementalDownloadJob", "job_session_ended_timestamps.get", "job_session_completed_indexes.get", "job_session_ended_timestamps.get", "updated_jobs.append", "IncrementalDownloadJob"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _incremental_output_download._update_download_progress | def _update_download_progress(download_metadata: ProgressReportMetadata,) -> bool:nonlocal last_call_time, printed_100_percentif not printed_100_percent and download_metadata.progress == 100:print_function_callback(f"{download_metadata.progressMessage}")last_call_time = time.time()printed_100_percent = Trueelif (not printed_100_percentand time.time() - last_call_time > MIN_DELAY_BETWEEN_PRINTOUTS):print_function_callback(f"{download_metadata.progressMessage}")last_call_time = time.time()return sigint_handler.continue_operation | 5 | 15 | 1 | 71 | 0 | 1202 | 1218 | 1202 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_incremental_output_download._update_download_progress), defined within the public class public, starts at line 1202 and ends at line 1218. It contains 15 lines of code and has a cyclomatic complexity of 5. It takes 1 parameter (download_metadata) and returns a bool. |
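The nested _update_download_progress callback throttles progress printing: the 100% message is always printed once, and any other message is printed at most every MIN_DELAY_BETWEEN_PRINTOUTS seconds. A self-contained sketch of that throttling (the factory shape and the injectable `clock` parameter are assumptions for testability; the original closes over `time.time()` and nonlocal state directly):

```python
import time


def make_throttled_printer(print_fn, min_delay: float = 20.0, clock=time.time):
    # Mirror the closure state: the last print time is seeded min_delay in
    # the past so the first report prints, and the 100% message prints once.
    state = {"last": clock() - min_delay, "done": False}

    def report(progress: int, message: str) -> None:
        if not state["done"] and progress == 100:
            print_fn(message)
            state["last"] = clock()
            state["done"] = True
        elif not state["done"] and clock() - state["last"] > min_delay:
            print_fn(message)
            state["last"] = clock()

    return report
```

The original additionally returns `sigint_handler.continue_operation` so a Ctrl-C can cancel the in-flight download; that wiring is omitted here.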
aws-deadline_deadline-cloud | public | public | 0 | 0 | _incremental_output_download | def _incremental_output_download(farm_id: str,queue: dict[str, Any],boto3_session: boto3.Session,checkpoint: IncrementalDownloadState,file_conflict_resolution: FileConflictResolution,config: Optional[ConfigParser] = None,print_function_callback: Callable[[str], None] = lambda msg: None,*,dry_run: bool = False,) -> IncrementalDownloadState:"""This function downloads all the task run outputs from the specified queue, that have becomeavailable since the last time the function was called. The checkpoint objectkeeps track of all state needed to keep track of what needs to be downloaded.Pre-condition: The input checkpoint holds all information needed to understand the state of downloadscompleted up to the timestamp checkpoint.downloads_completed_timestamp. See the documentationin the IncrementalDownloadState to understand the invariants of the checkpoint.Post-condition: The output checkpoint has an updated checkpoint.downloads_completed_timestamp,all downloads were performed up to at least this timestamp, and the checkpoint datais updated to satisfy the next call's pre-condition.Args:farm_id: The farm id for the operation.queue: The queue as returned by boto3 deadline.get_queue().boto3_session: The boto3.Session for accessing AWS.checkpoint: The checkpoint for the incremental download.config: Optional, a Deadline Cloud configuration as loaded from config_file.read_config().print_function_callback: Callback for printing output to the terminal or log.dry_run: If True, the operation will print out information but not perform any data downloads.Returns:An updated checkpoint object."""if sys.version_info < (3, 9):raise DeadlineOperationError("The sync-output command requires Python version 3.9 or later")durations = IncrementalOutputDownloadLatencies()deadline = get_session_client(boto3_session, "deadline")# When this function is done, we will be confident that downloads are complete up to# 
new_completed_timestamp. We subtract a duration from now() that gives a generous amount of# time for the deadline:SearchJobs API's eventual consistency to converge.current_timestamp = datetime.now(timezone.utc)new_completed_timestamp = max(checkpoint.downloads_started_timestamp,current_timestamp - timedelta(seconds=checkpoint.eventual_consistency_max_seconds),)# The queue role is used for accessing S3boto3_session_for_s3 = api.get_queue_user_boto3_session(deadline=deadline,config=config,farm_id=farm_id,queue_id=queue["queueId"],queue_display_name=queue["displayName"],)print_function_callback("Updating download state across time interval:")print_function_callback(f"From: {checkpoint.downloads_completed_timestamp.astimezone().isoformat()}")print_function_callback(f"To: {current_timestamp.astimezone().isoformat()}")update_length = current_timestamp - checkpoint.downloads_completed_timestampeventual_consistency_delta = timedelta(seconds=checkpoint.eventual_consistency_max_seconds)if update_length > eventual_consistency_delta:print_function_callback(f"Length: {update_length - eventual_consistency_delta} + {eventual_consistency_delta} (eventual consistency allowance)")else:# Immediately after bootstrapping, this length will be shorter than the eventual consistency windowprint_function_callback(f"Length: {update_length}")print_function_callback("")# Save all the jobs' session action indexes from the checkpoint, before we update the checkpoint's jobs listcheckpoint_job_session_completed_indexes: dict[str, dict[str, int]] = {job.job_id: job.session_completed_indexes for job in checkpoint.jobs}# Call deadline:SearchJobs to get a set of jobs that includes every job with downloads available.start_t = time.perf_counter_ns()download_candidate_jobs: dict[str, dict[str, Any]] = _get_download_candidate_jobs(boto3_session,farm_id,queue["queueId"],checkpoint.downloads_completed_timestamp,print_function_callback,)durations._get_download_candidate_jobs = time.perf_counter_ns() - 
start_tprint_function_callback("")# Compare the download candidates with the previously saved checkpoint state to categorize the jobsstart_t = time.perf_counter_ns()categorized_job_ids: CategorizedJobIds = _categorize_jobs_in_checkpoint(boto3_session,farm_id,queue["queueId"],checkpoint,download_candidate_jobs,new_completed_timestamp,print_function_callback,)durations._categorize_jobs_in_checkpoint = time.perf_counter_ns() - start_tprint_function_callback("")# All the completed, added, and updated jobs might have downloads available. Retrieve the sessions for these jobs.start_t = time.perf_counter_ns()job_sessions: dict[str, list] = _get_job_sessions(boto3_session,boto3_session_for_s3,farm_id,queue,checkpoint_job_session_completed_indexes,categorized_job_ids,checkpoint,download_candidate_jobs,print_function_callback,)durations._get_job_sessions = time.perf_counter_ns() - start_t# If storage profiles are being used, get them and construct all the path mapping rulespath_mapping_rule_appliers: dict[str, Optional[_PathMappingRuleApplier]] = {}if checkpoint.local_storage_profile_id:start_t = time.perf_counter_ns()storage_profiles = _get_storage_profiles(deadline, farm_id, queue, job_sessions, checkpoint, download_candidate_jobs)path_mapping_rule_appliers = _create_path_mapping_rule_appliers(storage_profiles,checkpoint,download_candidate_jobs,print_function_callback,)durations.path_mapping = time.perf_counter_ns() - start_t# Use the information collected so far to update the jobs list in checkpointstart_t = time.perf_counter_ns()_update_checkpoint_jobs_list(checkpoint, download_candidate_jobs, categorized_job_ids, job_sessions)durations._update_checkpoint_jobs_list = time.perf_counter_ns() - start_tstart_t = time.perf_counter_ns()unmapped_paths: dict[str, list[str]] = {}downloaded_manifests: list[tuple[datetime, BaseAssetManifest]] = 
(_download_all_manifests_with_absolute_paths(queue,download_candidate_jobs,job_sessions,path_mapping_rule_appliers,unmapped_paths,boto3_session_for_s3,print_function_callback,))durations._download_all_manifests_with_absolute_paths = time.perf_counter_ns() - start_t# Print warning messages about all the output paths that will not be downloaded due to lack of path mapping.if unmapped_paths:print_function_callback("")print_function_callback("WARNING: THE FOLLOWING FILES WILL NOT BE DOWNLOADED")for job_id, unmapped_path_list in unmapped_paths.items():print_function_callback(f"Job {download_candidate_jobs[job_id]['name']} ({job_id}) has outputs with unmapped paths that will not be downloaded")storage_profile = storage_profiles[download_candidate_jobs[job_id]["storageProfileId"]]print_function_callback(f"Job storage profile is {storage_profile['displayName']} ({storage_profile['storageProfileId']})")print_function_callback("Summary of unmapped paths:")path_format = (PathFormat.WINDOWSif storage_profile["osFamily"] == StorageProfileOperatingSystemFamily.WINDOWS.valueelse PathFormat.POSIX)paths_summary = summarize_path_list(unmapped_path_list, max_entries=30, path_format=path_format)print_function_callback(textwrap.indent(paths_summary, ""))# Merge the manifests ordered by the last modified timestampmanifest_paths_to_download: list[BaseManifestPath] = _merge_absolute_path_manifest_list(downloaded_manifests)# Print a summary of all the paths before starting the downloadlocal_path_list = [manifest_path.path for manifest_path in manifest_paths_to_download]file_size_by_path = {manifest_path.path: manifest_path.size for manifest_path in manifest_paths_to_download}print_function_callback("")print_function_callback("Summary of paths to download:")print_function_callback(summarize_path_list(local_path_list, total_size_by_path=file_size_by_path, max_entries=30))print_function_callback("")if not dry_run:print_function_callback(f"Downloading {len(manifest_paths_to_download)} files 
from S3...")start_t = time.perf_counter_ns()start_time = datetime.now(tz=timezone.utc)# Incremental download is mostly a background thing, so don't print status too often while downloadingMIN_DELAY_BETWEEN_PRINTOUTS = 20last_call_time = time.time() - MIN_DELAY_BETWEEN_PRINTOUTSprinted_100_percent = Falsedef _update_download_progress(download_metadata: ProgressReportMetadata,) -> bool:nonlocal last_call_time, printed_100_percentif not printed_100_percent and download_metadata.progress == 100:print_function_callback(f"{download_metadata.progressMessage}")last_call_time = time.time()printed_100_percent = Trueelif (not printed_100_percentand time.time() - last_call_time > MIN_DELAY_BETWEEN_PRINTOUTS):print_function_callback(f"{download_metadata.progressMessage}")last_call_time = time.time()return sigint_handler.continue_operation_download_manifest_paths(manifest_paths_to_download,HashAlgorithm.XXH128,queue,boto3_session_for_s3,file_conflict_resolution,on_downloading_files=_update_download_progress,print_function_callback=print_function_callback,)durations.download = time.perf_counter_ns() - start_tduration = datetime.now(tz=timezone.utc) - start_timeprint_function_callback(f"...downloaded in {duration}")else:print_function_callback("Skipping downloads due to DRY RUN")# Update the timestamp in the state object to reflect the downloads that were completedcheckpoint.downloads_completed_timestamp = new_completed_timestampstats: dict[str, Any] = {"downloaded_session_actions": sum(len(session.get("sessionActions", []))for session_list in job_sessions.values()for session in session_list),"downloaded_files": len(manifest_paths_to_download),"downloaded_bytes": sum(path.size for path in manifest_paths_to_download),"jobs_with_downloads": {"completed": len(categorized_job_ids.completed),"added": len(categorized_job_ids.added),"updated": len(categorized_job_ids.updated),},"jobs_without_downloads": {"not_using_job_attachments": 
len(categorized_job_ids.attachments_free),"missing_storage_profile": len(categorized_job_ids.missing_storage_profile),"unchanged": len(categorized_job_ids.unchanged),"inactive": len(categorized_job_ids.inactive),},"unmapped_paths": len(unmapped_paths),}api.get_deadline_cloud_library_telemetry_client().record_event(event_type="com.amazon.rum.deadline.queue_sync_output_stats",event_details={"latencies": asdict(durations),"dry_run": dry_run,**stats,},)print_function_callback("")if dry_run:print_function_callback("Summary of DRY RUN for incremental output download (no files were downloaded to the file system):")else:print_function_callback("Summary of incremental output download:")print_function_callback(f"Downloaded session actions: {stats['downloaded_session_actions']}")print_function_callback(f"Downloaded files: {stats['downloaded_files']}")print_function_callback(f"Downloaded bytes: {human_readable_file_size(stats['downloaded_bytes'])}")print_function_callback("Jobs with downloads:")print_function_callback(f"completed: {stats['jobs_with_downloads']['completed']}")print_function_callback(f"added: {stats['jobs_with_downloads']['added']}")print_function_callback(f"updated: {stats['jobs_with_downloads']['updated']}")print_function_callback("Jobs without downloads:")print_function_callback(f"not using job attachments: {stats['jobs_without_downloads']['not_using_job_attachments']}")print_function_callback(f"missing storage profile: {stats['jobs_without_downloads']['missing_storage_profile']}")print_function_callback(f"unchanged: {stats['jobs_without_downloads']['unchanged']}")print_function_callback(f"inactive: {stats['jobs_without_downloads']['inactive']}")return checkpoint | 15 | 223 | 9 | 1,063 | 20 | 991 | 1,295 | 991 | farm_id,queue,boto3_session,checkpoint,file_conflict_resolution,config,print_function_callback,dry_run | ['paths_summary', 'path_mapping_rule_appliers', 'boto3_session_for_s3', 'update_length', 'printed_100_percent', 'path_format', 'last_call_time', 
'storage_profile', 'current_timestamp', 'eventual_consistency_delta', 'start_time', 'MIN_DELAY_BETWEEN_PRINTOUTS', 'local_path_list', 'duration', 'deadline', 'new_completed_timestamp', 'file_size_by_path', 'start_t', 'durations', 'storage_profiles'] | IncrementalDownloadState | {"AnnAssign": 9, "Assign": 37, "Expr": 42, "For": 1, "If": 8, "Return": 2} | 103 | 305 | 103 | ["DeadlineOperationError", "IncrementalOutputDownloadLatencies", "get_session_client", "datetime.now", "max", "timedelta", "api.get_queue_user_boto3_session", "print_function_callback", "print_function_callback", "isoformat", "checkpoint.downloads_completed_timestamp.astimezone", "print_function_callback", "isoformat", "current_timestamp.astimezone", "timedelta", "print_function_callback", "print_function_callback", "print_function_callback", "time.perf_counter_ns", "_get_download_candidate_jobs", "time.perf_counter_ns", "print_function_callback", "time.perf_counter_ns", "_categorize_jobs_in_checkpoint", "time.perf_counter_ns", "print_function_callback", "time.perf_counter_ns", "_get_job_sessions", "time.perf_counter_ns", "time.perf_counter_ns", "_get_storage_profiles", "_create_path_mapping_rule_appliers", "time.perf_counter_ns", "time.perf_counter_ns", "_update_checkpoint_jobs_list", "time.perf_counter_ns", "time.perf_counter_ns", "_download_all_manifests_with_absolute_paths", "time.perf_counter_ns", "print_function_callback", "print_function_callback", "unmapped_paths.items", "print_function_callback", "print_function_callback", "print_function_callback", "summarize_path_list", "print_function_callback", "textwrap.indent", "_merge_absolute_path_manifest_list", "print_function_callback", "print_function_callback", "print_function_callback", "summarize_path_list", "print_function_callback", "print_function_callback", "len", "time.perf_counter_ns", "datetime.now", "time.time", "print_function_callback", "time.time", "time.time", "print_function_callback", "time.time", "_download_manifest_paths", 
"time.perf_counter_ns", "datetime.now", "print_function_callback", "print_function_callback", "sum", "len", "session.get", "job_sessions.values", "len", "sum", "len", "len", "len", "len", "len", "len", "len", "len", "record_event", "api.get_deadline_cloud_library_telemetry_client", "asdict", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "human_readable_file_size", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "api.record_function_latency_telemetry_event"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.sync_output"] | The function (_incremental_output_download) defined within the public class called public.The function start at line 991 and ends at 1295. It contains 223 lines of code and it has a cyclomatic complexity of 15. It takes 9 parameters, represented as [991.0] and does not return any value. 
It declares 103.0 functions, It has 103.0 functions called inside which are ["DeadlineOperationError", "IncrementalOutputDownloadLatencies", "get_session_client", "datetime.now", "max", "timedelta", "api.get_queue_user_boto3_session", "print_function_callback", "print_function_callback", "isoformat", "checkpoint.downloads_completed_timestamp.astimezone", "print_function_callback", "isoformat", "current_timestamp.astimezone", "timedelta", "print_function_callback", "print_function_callback", "print_function_callback", "time.perf_counter_ns", "_get_download_candidate_jobs", "time.perf_counter_ns", "print_function_callback", "time.perf_counter_ns", "_categorize_jobs_in_checkpoint", "time.perf_counter_ns", "print_function_callback", "time.perf_counter_ns", "_get_job_sessions", "time.perf_counter_ns", "time.perf_counter_ns", "_get_storage_profiles", "_create_path_mapping_rule_appliers", "time.perf_counter_ns", "time.perf_counter_ns", "_update_checkpoint_jobs_list", "time.perf_counter_ns", "time.perf_counter_ns", "_download_all_manifests_with_absolute_paths", "time.perf_counter_ns", "print_function_callback", "print_function_callback", "unmapped_paths.items", "print_function_callback", "print_function_callback", "print_function_callback", "summarize_path_list", "print_function_callback", "textwrap.indent", "_merge_absolute_path_manifest_list", "print_function_callback", "print_function_callback", "print_function_callback", "summarize_path_list", "print_function_callback", "print_function_callback", "len", "time.perf_counter_ns", "datetime.now", "time.time", "print_function_callback", "time.time", "time.time", "print_function_callback", "time.time", "_download_manifest_paths", "time.perf_counter_ns", "datetime.now", "print_function_callback", "print_function_callback", "sum", "len", "session.get", "job_sessions.values", "len", "sum", "len", "len", "len", "len", "len", "len", "len", "len", "record_event", "api.get_deadline_cloud_library_telemetry_client", "asdict", 
"print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "human_readable_file_size", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "print_function_callback", "api.record_function_latency_telemetry_event"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.sync_output"]. |
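The flattened `_update_download_progress` callback above throttles progress printouts: it prints at most once per `MIN_DELAY_BETWEEN_PRINTOUTS` seconds, but always prints the final 100% message exactly once and then goes quiet. A minimal standalone sketch of that pattern follows; the names (`ProgressThrottle`, `update`, `emit`) are illustrative, not from the library, and the clock is injectable so the behavior can be tested without sleeping:

```python
import time


class ProgressThrottle:
    """Rate-limit progress messages; always emit the 100% message once."""

    def __init__(self, min_delay: float = 20.0, clock=time.time):
        self._min_delay = min_delay
        self._clock = clock
        # Start far enough in the past that the first update prints promptly.
        self._last_print = clock() - min_delay
        self._printed_100 = False

    def update(self, percent: float, message: str, emit=print) -> bool:
        """Emit `message` if allowed; return True if it was emitted."""
        if self._printed_100:
            return False  # nothing more after completion
        now = self._clock()
        if percent >= 100:
            emit(message)  # the completion message always goes out
            self._printed_100 = True
            self._last_print = now
            return True
        if now - self._last_print > self._min_delay:
            emit(message)  # enough time elapsed since the last printout
            self._last_print = now
            return True
        return False
```

As in the original, intermediate updates inside the delay window are silently dropped rather than queued, which keeps a background download from flooding the console.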
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_default_log_level | def _get_default_log_level() -> str:"""Get the default log level from the config file."""# Set the default log level based on the setting, must do here so we can pass into the click option_SETTING_LOG_LEVEL = get_setting("settings.log_level").upper()_DEFAULT_LOG_LEVEL = get_setting_default("settings.log_level")_CLI_DEFAULT_LOG_LEVEL = _DEFAULT_LOG_LEVELif _SETTING_LOG_LEVEL not in _DEADLINE_LOG_LEVELS:logger.warning(f"Log Level '{_SETTING_LOG_LEVEL}' not in {_DEADLINE_LOG_LEVELS}. Defaulting to {_DEFAULT_LOG_LEVEL}")else:_CLI_DEFAULT_LOG_LEVEL = _SETTING_LOG_LEVELreturn _CLI_DEFAULT_LOG_LEVEL | 2 | 11 | 0 | 46 | 3 | 30 | 44 | 30 | ['_SETTING_LOG_LEVEL', '_DEFAULT_LOG_LEVEL', '_CLI_DEFAULT_LOG_LEVEL'] | str | {"Assign": 4, "Expr": 2, "If": 1, "Return": 1} | 4 | 15 | 4 | ["upper", "get_setting", "get_setting_default", "logger.warning"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._main_py.main"] | The function (_get_default_log_level) is defined within the public class called public. The function starts at line 30 and ends at 44. It contains 11 lines of code and has a cyclomatic complexity of 2. The function takes no parameters and returns a str value. It has 4 functions called inside, which are ["upper", "get_setting", "get_setting_default", "logger.warning"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._main_py.main"]. | |
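The `_get_default_log_level` row above validates a configured log level against an allowed set and falls back to the default with a warning when it is invalid. A dependency-free sketch of that validation follows; the level tuple and the plain string parameters (standing in for the `get_setting`/`get_setting_default` config lookups in the original) are assumptions:

```python
import logging

logger = logging.getLogger(__name__)

# Assumed allowed set; the real _DEADLINE_LOG_LEVELS lives in the CLI module.
_DEADLINE_LOG_LEVELS = ("ERROR", "WARNING", "INFO", "DEBUG")


def default_log_level(setting_value: str, setting_default: str) -> str:
    """Return the configured log level if valid, otherwise the default.

    `setting_value` and `setting_default` stand in for the get_setting()
    and get_setting_default() config-file lookups in the original.
    """
    level = setting_value.upper()
    if level not in _DEADLINE_LOG_LEVELS:
        # Warn but keep running: a bad config value should not break the CLI.
        logger.warning(
            f"Log Level '{level}' not in {_DEADLINE_LOG_LEVELS}. "
            f"Defaulting to {setting_default}"
        )
        return setting_default
    return level
```

Uppercasing before the membership check means config values like `"debug"` are accepted case-insensitively, matching the `.upper()` call in the original.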
aws-deadline_deadline-cloud | ContextTrackingCommand | public | 0 | 1 | invoke | def invoke(self, ctx: click.Context):# This is a global variable used to modify User Agent header in the default boto configsession_context["cli-command-name"] = ctx.command_path.replace(" ", ".")return super().invoke(ctx) | 1 | 3 | 2 | 35 | 0 | 52 | 55 | 52 | self,ctx | [] | Returns | {"Assign": 1, "Return": 1} | 3 | 4 | 3 | ["ctx.command_path.replace", "invoke", "super"] | 26 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_help_py.test_help_in_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_help_py.test_help_in_subcommand", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_logging_py.test_lowlevel_dumps_in_debug_mode", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_logging_py.test_no_lowlevel_dumps_in_nondebug", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_logging_py.test_verbosity", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_options_py.test_options_passed_to_preload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_options_py.test_options_passed_to_realrun", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_mixed_sources", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_nothing", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_one_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_one_module", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_two_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_two_modules", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.invocations.test_callbacks_py.test_explicit_args_passed_properly", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.invocations.test_callbacks_py.test_result_returned", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.invocations.test_callbacks_py.test_special_kwargs_added", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.invocations.test_callbacks_py.test_stacktrace_visibility", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.lifecycles.test_real_invocation_py.test_protocol_invocation", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.settings.test_executor_py.test_synchronous_calls_are_threaded", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.settings.test_executor_py.test_synchronous_calls_use_replaced_executor", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967974_boxed_mutmut.tests.e2e.test_cli_version_py.test_cli_version", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.splitgraph.commandline.__init___py.WithExceptionHandler.invoke", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70138775_pallets_quart.src.quart.testing.__init___py.QuartCliRunner.invoke", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._main_py.ContextTrackingCommand.invoke", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_config_py.test_log_level_updated"] | The function (invoke) defined within the public class called ContextTrackingCommand, that inherit another class.The function start at line 52 and ends at 55. It contains 3 lines of code and it has a cyclomatic complexity of 1. It takes 2 parameters, represented as [52.0], and this function return a value. It declares 3.0 functions, It has 3.0 functions called inside which are ["ctx.command_path.replace", "invoke", "super"], It has 26.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_help_py.test_help_in_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_help_py.test_help_in_subcommand", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_logging_py.test_lowlevel_dumps_in_debug_mode", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_logging_py.test_no_lowlevel_dumps_in_nondebug", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_logging_py.test_verbosity", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_options_py.test_options_passed_to_preload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_options_py.test_options_passed_to_realrun", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_mixed_sources", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_nothing", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_one_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_one_module", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_two_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.cli.test_preloading_py.test_two_modules", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.invocations.test_callbacks_py.test_explicit_args_passed_properly", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.invocations.test_callbacks_py.test_result_returned", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.invocations.test_callbacks_py.test_special_kwargs_added", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.invocations.test_callbacks_py.test_stacktrace_visibility", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.lifecycles.test_real_invocation_py.test_protocol_invocation", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.settings.test_executor_py.test_synchronous_calls_are_threaded", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.settings.test_executor_py.test_synchronous_calls_use_replaced_executor", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967974_boxed_mutmut.tests.e2e.test_cli_version_py.test_cli_version", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.splitgraph.commandline.__init___py.WithExceptionHandler.invoke", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70138775_pallets_quart.src.quart.testing.__init___py.QuartCliRunner.invoke", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._main_py.ContextTrackingCommand.invoke", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_config_py.test_log_level_updated"]. |
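The `ContextTrackingCommand.invoke` row above overrides a click command's `invoke` to record the full command path (e.g. "deadline queue sync-output") as a dotted name in a shared session context before delegating to the base class, so later code (such as a User-Agent header builder) can see which CLI command is running. A dependency-free sketch of that pattern follows; the `SESSION_CONTEXT` dict and minimal `Command` base class are stand-ins, not the click API:

```python
SESSION_CONTEXT: dict = {}


class Command:
    """Stand-in for click.Command, carrying only what the sketch needs."""

    def __init__(self, command_path: str):
        self.command_path = command_path

    def invoke(self) -> str:
        return f"ran {self.command_path}"


class ContextTrackingCommand(Command):
    def invoke(self) -> str:
        # Record the dotted command name in the shared context before
        # delegating to the normal invocation path.
        SESSION_CONTEXT["cli-command-name"] = self.command_path.replace(" ", ".")
        return super().invoke()
```

Because the recording happens inside `invoke`, every subcommand built on this class gets the tracking for free, with no per-command changes.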
aws-deadline_deadline-cloud | public | public | 0 | 0 | main | def main(ctx: click.Context, log_level: Optional[str], redirect_output: str, redirect_mode: str):"""The AWS Deadline Cloud CLI provides functionality to interact with the AWS Deadline Cloudservice."""if redirect_output:# Set both stdout and stderr to write to the specified file, writing in line buffering modeif redirect_mode == "append":open_mode = "a"else:open_mode = "w"sys.stdout = sys.stderr = open(redirect_output, open_mode, encoding="utf-8", buffering=1)if log_level is None:log_level = _get_default_log_level()logging.basicConfig(level=log_level)if log_level == "DEBUG":logger.debug("Debug logging is on")ctx.ensure_object(dict)# By default don't prompt when the operation is completectx.obj[_PROMPT_WHEN_COMPLETE] = False | 5 | 14 | 4 | 106 | 2 | 92 | 113 | 92 | ctx,log_level,redirect_output,redirect_mode | ['log_level', 'open_mode'] | None | {"Assign": 5, "Expr": 4, "If": 4} | 12 | 22 | 12 | ["open", "_get_default_log_level", "logging.basicConfig", "logger.debug", "ctx.ensure_object", "click.group", "click.version_option", "click.option", "click.Choice", "click.option", "click.option", "click.Choice"] | 134 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"] | The function (main) defined within the public class called public.The function start at line 92 and ends at 113. It contains 14 lines of code and it has a cyclomatic complexity of 5. It takes 4 parameters, represented as [92.0] and does not return any value. 
It declares 12.0 functions, It has 12.0 functions called inside which are ["open", "_get_default_log_level", "logging.basicConfig", "logger.debug", "ctx.ensure_object", "click.group", "click.version_option", "click.option", "click.Choice", "click.option", "click.option", "click.Choice"], It has 134.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"]. |
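The `main` row above redirects both stdout and stderr to a file in line-buffered mode, with an "append" option that preserves existing content. A small sketch of that redirect step follows; the function name `redirect_output` is illustrative, while the open-mode choice and `buffering=1` mirror the flattened body:

```python
import sys


def redirect_output(path: str, mode: str = "write"):
    """Send both stdout and stderr to `path`, line buffered.

    mode "append" keeps existing file content; anything else truncates.
    Returns the file handle so the caller can close it later.
    """
    open_mode = "a" if mode == "append" else "w"
    # buffering=1 selects line buffering for text files: each newline flushes,
    # so log lines appear in the file promptly.
    fh = open(path, open_mode, encoding="utf-8", buffering=1)
    sys.stdout = sys.stderr = fh
    return fh
```

Assigning one handle to both `sys.stdout` and `sys.stderr` interleaves normal output and errors in the order they were written, which is usually what you want in a redirected CLI log.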
aws-deadline_deadline-cloud | public | public | 0 | 0 | cli_mcp_server | def cli_mcp_server():"""Start the AWS Deadline Cloud MCP (Model Context Protocol) server.The MCP server provides LLM tools with access to AWS Deadline Cloud operationsthrough the Model Context Protocol. This allows AI assistants to interact withDeadline Cloud services on your behalf.The server will run until interrupted with Ctrl+C or Ctrl+D.Note: This command requires MCP dependencies. Install them with:pip install 'deadline[mcp]'"""try:from ..._mcp.server import main as mcp_mainexcept ImportError:click.echo("Error: MCP dependencies not installed.\n""Please install them with: pip install 'deadline[mcp]'",err=True,)sys.exit(1)# Record server startup telemetrytry:telemetry_client = get_deadline_cloud_library_telemetry_client()telemetry_client.record_event(event_type="com.amazon.rum.deadline.mcp.server_startup",event_details={"usage_mode": "MCP", "startup_method": "cli"},)except Exception:# Don't let telemetry errors affect server startuppassmcp_main() | 3 | 19 | 0 | 72 | 1 | 17 | 51 | 17 | ['telemetry_client'] | None | {"Assign": 1, "Expr": 5, "Try": 2} | 6 | 35 | 6 | ["click.echo", "sys.exit", "get_deadline_cloud_library_telemetry_client", "telemetry_client.record_event", "mcp_main", "main.command"] | 0 | [] | The function (cli_mcp_server) is defined within the public class called public. The function starts at line 17 and ends at 51. It contains 19 lines of code and has a cyclomatic complexity of 3. The function takes no parameters and does not return a value. It has 6 functions called inside, which are ["click.echo", "sys.exit", "get_deadline_cloud_library_telemetry_client", "telemetry_client.record_event", "mcp_main", "main.command"], and it has 0 functions calling this function. | |
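The `cli_mcp_server` row above guards an optional-dependency import: it tries to import the MCP server module and, on `ImportError`, prints an install hint and exits with status 1. A generic sketch of that guard follows; `import_optional` is an illustrative name and the message format merely echoes the one in the flattened body:

```python
import importlib
import sys


def import_optional(module_name: str, extra: str):
    """Import an optional dependency or exit with an install hint.

    `extra` is the pip extras name, e.g. "mcp" for pip install 'deadline[mcp]'.
    """
    try:
        return importlib.import_module(module_name)
    except ImportError:
        # Tell the user exactly how to get the missing feature, then exit
        # with a nonzero status so scripts can detect the failure.
        print(
            f"Error: {extra} dependencies not installed.\n"
            f"Please install them with: pip install 'deadline[{extra}]'",
            file=sys.stderr,
        )
        sys.exit(1)
```

Doing the import inside the command (rather than at module import time) means the rest of the CLI keeps working even when the optional extra is absent.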
aws-deadline_deadline-cloud | public | public | 0 | 0 | _pid_lock_temp_file_path | def _pid_lock_temp_file_path(pid_file_path: str) -> str:"""Construct the temporary file path used to populate the pid lock file before moving it to the lock file path."""return pid_file_path + f"{os.getpid()}~tmp" | 1 | 2 | 1 | 15 | 0 | 19 | 21 | 19 | pid_file_path | [] | str | {"Expr": 1, "Return": 1} | 1 | 3 | 1 | ["os.getpid"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py._release_pid_lock", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py._try_acquire_pid_lock", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_release_pid_lock_cleans_up_stale_temp_file"] | The function (_pid_lock_temp_file_path) is defined within the public class called public. The function starts at line 19 and ends at 21. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter and returns a str value. It has 1 function called inside, which is ["os.getpid"], and it has 3 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py._release_pid_lock", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py._try_acquire_pid_lock", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_release_pid_lock_cleans_up_stale_temp_file"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _claim_pid_lock_with_rename | def _claim_pid_lock_with_rename(tmp_file_name: str, pid_file_path: str) -> bool:# Atomic rename the temporary location to the pid lock file. This operation needs to be# atomic: it either succeeds or another process is holding the lock.## If the lock is not claimed, the temporary file will remain.try:if sys.platform.startswith("win32") or sys.platform.startswith("cygwin"):# On Windows systems, rename will raise if the destination file existsos.rename(tmp_file_name, pid_file_path)else:# On POSIX systems, link will raise if the destination file existsos.link(tmp_file_name, pid_file_path)os.remove(tmp_file_name)# Successfully claimed the lockreturn Trueexcept FileExistsError:# Another process was holding the lock, as the file exists.return False | 4 | 10 | 2 | 65 | 0 | 24 | 42 | 24 | tmp_file_name,pid_file_path | [] | bool | {"Expr": 3, "If": 1, "Return": 2, "Try": 1} | 5 | 19 | 5 | ["sys.platform.startswith", "sys.platform.startswith", "os.rename", "os.link", "os.remove"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py._try_acquire_pid_lock"] | The function (_claim_pid_lock_with_rename) is defined within the public class called public. The function starts at line 24 and ends at 42. It contains 10 lines of code and has a cyclomatic complexity of 4. It takes 2 parameters and returns a bool value. It has 5 functions called inside, which are ["sys.platform.startswith", "sys.platform.startswith", "os.rename", "os.link", "os.remove"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py._try_acquire_pid_lock"]. |
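The `_claim_pid_lock_with_rename` row above claims a pid lock atomically: on Windows `os.rename` fails if the destination exists, while on POSIX `os.link` does, so exactly one of several racing processes succeeds and the losers keep their temp file for cleanup. A runnable sketch of the same logic follows (`claim_pid_lock` is an illustrative name; the platform branch mirrors the flattened body):

```python
import os
import sys


def claim_pid_lock(tmp_file_name: str, pid_file_path: str) -> bool:
    """Atomically promote a temp file to the lock path.

    Either this process claims the lock, or another holder already has it;
    on failure the temp file is left in place for the caller to clean up.
    """
    try:
        if sys.platform.startswith(("win32", "cygwin")):
            # On Windows, os.rename raises if the destination already exists.
            os.rename(tmp_file_name, pid_file_path)
        else:
            # On POSIX, os.link raises FileExistsError if the destination
            # exists; after a successful link the temp file is removed.
            os.link(tmp_file_name, pid_file_path)
            os.remove(tmp_file_name)
        return True
    except FileExistsError:
        # Another process holds the lock: its file is already at the path.
        return False
```

The key design point is that the check-and-create is a single filesystem operation, so there is no window between "does the lock exist?" and "create it" for a second process to slip through.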
aws-deadline_deadline-cloud | public | public | 0 | 0 | _try_acquire_pid_lock | def _try_acquire_pid_lock(pid_file_path: str,operation_name: str = "the operation",):"""Checks if the specified pid lock file exists and executes as per the following:If the pid lock file does not exist:It creates a new pid file and acquires lock for this pid.It handles concurrent processes trying to obtain lock such that only one process gets the lock and others failIf the pid lock file exists:If the process with its pid is still running, it raises an exception that a download is in progress alreadyIf the process with its pid is not running it deletes the pid lock file after taking a primitive file lock on itto handle concurrent processes making the same check. This will not work if primitive file locks are disabled.:param pid_file_full_path: full path of the pid lock file:return: boolean, True if pid lock was obtained successfully, throws an exception otherwise"""current_process_id: int = os.getpid()# Generate a tmp file for writing the pid file as a whole and prevent corrupt datatmp_file_path = _pid_lock_temp_file_path(pid_file_path)try:# Write the pid lock file to the temporary locationwith open(tmp_file_path, "w+") as f:f.write(str(current_process_id))# Claim the lock if possible. We do this first so that the normal path does the fewest operations.if _claim_pid_lock_with_rename(tmp_file_path, pid_file_path):return# If we could not claim the lock, inspect it to see whether it's stale and should be deleted,# for example if a previous run of the command was terminated from task manager and could not clean up.try:# Read the lock file. If it does not exist, this raises FileNotFoundError.with open(pid_file_path, "r") as f:try:lock_holder_pid = int(f.read())except ValueError:lock_holder_pid = -1# If the lock holder PID is not running, that means it exited without a proper shutdown,# and we should clear the lock. Under normal operation this will not occur. 
There is# a race if another process is running and discovers this at the same time. To minimize# the time window of the race, reading the lock, checking the pid, and removing the file are# done in sequence.if lock_holder_pid == -1:os.remove(pid_file_path)logger.warning("Pid lock file contains incorrect data. Deleted pid lock file.")elif not psutil.pid_exists(lock_holder_pid):os.remove(pid_file_path)logger.warning(f"Process with pid {lock_holder_pid} is not running. Deleted pid lock file.")else:raise PidLockAlreadyHeld(f"Unable to perform {operation_name} as process with pid {lock_holder_pid} already holds the lock {pid_file_path}")except FileNotFoundError:# In this case, the pid lock is free to acquirepass# After possibly cleaning up a stale lock, try claiming it againif _claim_pid_lock_with_rename(tmp_file_path, pid_file_path):returnelse:raise PidLockAlreadyHeld(f"Unable to perform {operation_name} as process with pid {lock_holder_pid} already holds the lock {pid_file_path}")finally:# Clean up the pid lock temporary file if necessaryif os.path.exists(tmp_file_path):try:os.remove(tmp_file_path)except OSError as e:logger.warning(f"Failed to clean up pid lock temporary file: {e}") | 10 | 43 | 2 | 192 | 2 | 45 | 121 | 45 | pid_file_path,operation_name | ['tmp_file_path', 'lock_holder_pid'] | None | {"AnnAssign": 1, "Assign": 3, "Expr": 8, "If": 5, "Return": 2, "Try": 4, "With": 2} | 20 | 77 | 20 | ["os.getpid", "_pid_lock_temp_file_path", "open", "f.write", "str", "_claim_pid_lock_with_rename", "open", "int", "f.read", "os.remove", "logger.warning", "psutil.pid_exists", "os.remove", "logger.warning", "PidLockAlreadyHeld", "_claim_pid_lock_with_rename", "PidLockAlreadyHeld", "os.path.exists", "os.remove", "logger.warning"] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py.PidFileLock", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_check_pid_lock_when_process_not_running", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_check_pid_lock_when_process_running", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_check_pid_lock_with_corrupt_pidfile", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_pidlock_acquire_and_release"] | The function (_try_acquire_pid_lock) is defined within the public class called public. The function starts at line 45 and ends at 121. It contains 43 lines of code and has a cyclomatic complexity of 10. It takes 2 parameters (pid_file_path, operation_name) and does not return any value.
It calls 20 functions: ["os.getpid", "_pid_lock_temp_file_path", "open", "f.write", "str", "_claim_pid_lock_with_rename", "open", "int", "f.read", "os.remove", "logger.warning", "psutil.pid_exists", "os.remove", "logger.warning", "PidLockAlreadyHeld", "_claim_pid_lock_with_rename", "PidLockAlreadyHeld", "os.path.exists", "os.remove", "logger.warning"], and has 5 functions calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py.PidFileLock", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_check_pid_lock_when_process_not_running", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_check_pid_lock_when_process_running", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_check_pid_lock_with_corrupt_pidfile", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_pidlock_acquire_and_release"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _release_pid_lock | def _release_pid_lock(pid_file_path: str):"""Releases the pid lock by deleting the pid file.:param pid_file_full_path: full path of the pid lock file:return: boolean, True if pid lock released successfully"""# Get the current process's id to obtain lockcurrent_process_id: int = os.getpid()# If an issue occurred with the temporary file during acquisition, delete it here.tmp_file_name = _pid_lock_temp_file_path(pid_file_path)if os.path.exists(tmp_file_name):try:os.remove(tmp_file_name)logger.warning(f"Cleaned up stale pid lock temporary file: {tmp_file_name}")except OSError as e:logger.warning(f"Failed to clean up pid lock temporary file: {e}")# Check if pid lock file does not exist. Returns if file doesn't exist as there is no lock to be released.if not os.path.exists(pid_file_path):logger.warning(f"Expected pid lock file does not exist at {pid_file_path}")return# Try to open pid file at download progress location in read modewith open(pid_file_path, "r") as f:# Read pid file and obtain the process id from file contentslock_holder_pid = f.read()if lock_holder_pid == str(current_process_id):# Process pid from file is same as current process pid - release pid lockos.remove(pid_file_path)else:# Process pid from file is different from current process pid.logger.warning(f"Another process with pid {lock_holder_pid} claimed the pid lock {pid_file_path} while {current_process_id} was holding it. 
Skipping pid file deletion.") | 5 | 20 | 1 | 119 | 2 | 124 | 160 | 124 | pid_file_path | ['tmp_file_name', 'lock_holder_pid'] | None | {"AnnAssign": 1, "Assign": 2, "Expr": 7, "If": 3, "Return": 1, "Try": 1, "With": 1} | 13 | 37 | 13 | ["os.getpid", "_pid_lock_temp_file_path", "os.path.exists", "os.remove", "logger.warning", "logger.warning", "os.path.exists", "logger.warning", "open", "f.read", "str", "os.remove", "logger.warning"] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py.PidFileLock", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_pidlock_acquire_and_release", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_release_pid_lock_cleans_up_stale_temp_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_release_pid_lock_when_lock_not_held", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_release_pid_lock_when_pid_does_not_match"] | The function (_release_pid_lock) is defined within the public class called public. The function starts at line 124 and ends at 160. It contains 20 lines of code and has a cyclomatic complexity of 5. It takes 1 parameter (pid_file_path) and does not return any value.
It calls 13 functions: ["os.getpid", "_pid_lock_temp_file_path", "os.path.exists", "os.remove", "logger.warning", "logger.warning", "os.path.exists", "logger.warning", "open", "f.read", "str", "os.remove", "logger.warning"], and has 5 functions calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._pid_file_lock_py.PidFileLock", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_pidlock_acquire_and_release", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_release_pid_lock_cleans_up_stale_temp_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_release_pid_lock_when_lock_not_held", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_release_pid_lock_when_pid_does_not_match"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | PidFileLock | def PidFileLock(lock_file_path: str,operation_name: str = "the operation",):"""A context manager for holding a pid (process id) lock file during the scope of a 'with' statement.A pid lock file lets you prevent concurrent execution of the same CLI command. For example,a command to repeatedly download the new output available from a Deadline Cloud queue coulduse this to ensure only one running command is calculating and downloading what to output at a time.Example:with PidFileLock("/path/to/lock/file", operation_name="incremental output download"):# Code to load the checkpoint, do the download, save the new checkpoint...Args:lock_file_path (str): The file system path of the PID lock file.operation_name (Optional[str]): The name of the operation being performed in the lock, used for error messages."""_try_acquire_pid_lock(lock_file_path, operation_name)try:yield Nonefinally:_release_pid_lock(lock_file_path) | 2 | 9 | 2 | 31 | 0 | 164 | 186 | 164 | lock_file_path,operation_name | [] | None | {"Expr": 4, "Try": 1} | 2 | 23 | 2 | ["_try_acquire_pid_lock", "_release_pid_lock"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.sync_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_pidlock_contextmanager_acquire_and_release"] | The function (PidFileLock) is defined within the public class called public. The function starts at line 164 and ends at 186. It contains 9 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (lock_file_path, operation_name) and does not return any value.
It calls 2 functions: ["_try_acquire_pid_lock", "_release_pid_lock"], and has 2 functions calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.sync_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_pid_file_lock_py.test_pidlock_contextmanager_acquire_and_release"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | main | def main() -> None:from deadline.client.cli._deadline_cli import mainmain(# Override the program name to always be "deadline"prog_name="deadline",) | 1 | 5 | 0 | 23 | 0 | 9 | 15 | 9 | [] | None | {"Expr": 1} | 1 | 7 | 1 | ["main"] | 134 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"] | The function (main) defined within the public class called public.The function start at line 9 and ends at 15. It contains 5 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. 
It declare 1.0 function, It has 1.0 function called inside which is ["main"], It has 134.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | main | def main() -> None:import argparseimport loggingfrom deadline.client.ui.dev_application import appparser = argparse.ArgumentParser()parser.add_argument("--debug", help="Enable debug logging", action="store_true")args = parser.parse_args()if args.debug:log_level = "DEBUG"else:log_level = get_setting("settings.log_level")logging.basicConfig(level=log_level)app() | 2 | 13 | 0 | 75 | 3 | 12 | 28 | 12 | ['parser', 'args', 'log_level'] | None | {"Assign": 4, "Expr": 3, "If": 1} | 6 | 17 | 6 | ["argparse.ArgumentParser", "parser.add_argument", "parser.parse_args", "get_setting", "logging.basicConfig", "app"] | 134 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"] | The function (main) defined within the public class called public.The function start at line 12 and ends at 28. It contains 13 lines of code and it has a cyclomatic complexity of 2. The function does not take any parameters and does not return any value. 
It declares 6.0 functions, It has 6.0 functions called inside which are ["argparse.ArgumentParser", "parser.add_argument", "parser.parse_args", "get_setting", "logging.basicConfig", "app"], It has 134.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"]. | |
aws-deadline_deadline-cloud | SigIntHandler | public | 0 | 0 | __new__ | def __new__(cls, *args, **kwargs):if not cls._instance:cls._instance = super().__new__(cls)cls._instance.continue_operation = Truesignal.signal(signal.SIGINT, cls._instance._handle_sigint)return cls._instance | 2 | 6 | 3 | 54 | 0 | 15 | 20 | 15 | cls,*args,**kwargs | [] | Returns | {"Assign": 2, "Expr": 1, "If": 1, "Return": 1} | 3 | 6 | 3 | ["__new__", "super", "signal.signal"] | 30 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.pyoko.modelmeta_py.ModelMeta.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.dockertest.documentation_py.DefaultDoc.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.dockertest.output.dockertime_py.DockerTime.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.norm_py.Normalized.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.forms_py.TranslatableModelFormMetaclass.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3657361_openstack_archive_syntribos.syntribos.tests.base_py.TestType.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3718498_nylas_nylas_python.nylas.models.response_py.ListResponse.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3718498_nylas_nylas_python.nylas.models.response_py.Response.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3721936_jaraco_configparser.backports.configparser.__init___py._Line.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2.uiautomator2.xpath_py.XPath.__new__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953346_rsheftel_pandas_market_calendars.pandas_market_calendars.calendars.mirror_py.TradingCalendar.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953346_rsheftel_pandas_market_calendars.pandas_market_calendars.class_registry_py.RegisteryMeta.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3954802_petertodd_python_bitcoinlib.bitcoin.core.key_py.CPubKey.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3954802_petertodd_python_bitcoinlib.bitcoin.core.script_py.CScript.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3954802_petertodd_python_bitcoinlib.bitcoin.core.script_py.CScriptOp.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3985971_pyusb_pyusb.usb._objfinalizer_py.AutoFinalizedObject.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3985971_pyusb_pyusb.usb._objfinalizer_py._AutoFinalizedObjectBase.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.57092491_pinax_pinax_stripe_light.pinax.stripe.webhooks.base_py.Registerable.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69501567_pandora_analysis_pandora.pandora.storage_client_py.Storage.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy._constants._utils_py.ABCEnumMeta.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.tests.conftest_py.PlotTesterMeta.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.box.box_py.Box.__new__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.79438799_jazzband_django_oauth_toolkit.oauth2_provider.views.mixins_py.ReadWriteScopedResourceMixin.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94530662_scikit_build_scikit_build_core.src.scikit_build_core.settings.skbuild_model_py.CMakeSettingsDefine.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94705237_tenstorrent_tt_buda.pybuda.test.tvm.recommendation.pytorch.deepctr_torch.inputs_py.DenseFeat.__new__"] | The function (__new__) defined within the public class called SigIntHandler.The function start at line 15 and ends at 20. It contains 6 lines of code and it has a cyclomatic complexity of 2. It takes 3 parameters, represented as [15.0], and this function return a value. It declares 3.0 functions, It has 3.0 functions called inside which are ["__new__", "super", "signal.signal"], It has 30.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.pyoko.modelmeta_py.ModelMeta.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.dockertest.documentation_py.DefaultDoc.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.dockertest.output.dockertime_py.DockerTime.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.norm_py.Normalized.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.forms_py.TranslatableModelFormMetaclass.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3657361_openstack_archive_syntribos.syntribos.tests.base_py.TestType.__new__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3718498_nylas_nylas_python.nylas.models.response_py.ListResponse.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3718498_nylas_nylas_python.nylas.models.response_py.Response.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3721936_jaraco_configparser.backports.configparser.__init___py._Line.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2.uiautomator2.xpath_py.XPath.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953346_rsheftel_pandas_market_calendars.pandas_market_calendars.calendars.mirror_py.TradingCalendar.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953346_rsheftel_pandas_market_calendars.pandas_market_calendars.class_registry_py.RegisteryMeta.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3954802_petertodd_python_bitcoinlib.bitcoin.core.key_py.CPubKey.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3954802_petertodd_python_bitcoinlib.bitcoin.core.script_py.CScript.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3954802_petertodd_python_bitcoinlib.bitcoin.core.script_py.CScriptOp.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3985971_pyusb_pyusb.usb._objfinalizer_py.AutoFinalizedObject.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3985971_pyusb_pyusb.usb._objfinalizer_py._AutoFinalizedObjectBase.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.57092491_pinax_pinax_stripe_light.pinax.stripe.webhooks.base_py.Registerable.__new__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69501567_pandora_analysis_pandora.pandora.storage_client_py.Storage.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy._constants._utils_py.ABCEnumMeta.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.tests.conftest_py.PlotTesterMeta.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.box.box_py.Box.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.79438799_jazzband_django_oauth_toolkit.oauth2_provider.views.mixins_py.ReadWriteScopedResourceMixin.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94530662_scikit_build_scikit_build_core.src.scikit_build_core.settings.skbuild_model_py.CMakeSettingsDefine.__new__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94705237_tenstorrent_tt_buda.pybuda.test.tvm.recommendation.pytorch.deepctr_torch.inputs_py.DenseFeat.__new__"]. |
aws-deadline_deadline-cloud | SigIntHandler | public | 0 | 0 | _handle_sigint | def _handle_sigint(self, signum, frame):self.continue_operation = False | 1 | 2 | 3 | 14 | 0 | 22 | 23 | 22 | self,signum,frame | [] | None | {"Assign": 1} | 0 | 2 | 0 | [] | 0 | [] | The function (_handle_sigint) is defined within the public class called SigIntHandler. The function starts at line 22 and ends at 23. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters, represented as [22.0], and does not return any value. |
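The two SigIntHandler rows above describe a singleton that converts SIGINT into a cooperative-cancellation flag rather than letting KeyboardInterrupt propagate. A minimal runnable sketch, reconstructed from the flattened function bodies in the rows (the surrounding module layout is an assumption):

```python
import signal


class SigIntHandler:
    """Singleton sketch of the rows above: the first construction installs
    a SIGINT handler that flips a flag, so long-running loops can poll
    continue_operation and exit cleanly instead of being interrupted."""

    _instance = None

    def __new__(cls, *args, **kwargs):
        if not cls._instance:
            cls._instance = super().__new__(cls)
            cls._instance.continue_operation = True
            # Registered exactly once, on first construction.
            signal.signal(signal.SIGINT, cls._instance._handle_sigint)
        return cls._instance

    def _handle_sigint(self, signum, frame):
        # Cooperative cancellation: callers check continue_operation.
        self.continue_operation = False
```

Every later call to `SigIntHandler()` returns the same object, which is why callbacks such as `_check_create_job_wait_canceled` (line 201 of the source file, per the row further down) can simply read `sigint_handler.continue_operation`.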
aws-deadline_deadline-cloud | public | public | 0 | 0 | cli_attachment | def cli_attachment():"""Commands to work with Deadline Cloud Job Attachments.""" | 1 | 1 | 0 | 5 | 0 | 33 | 36 | 33 | [] | None | {"Expr": 1} | 1 | 4 | 1 | ["main.group"] | 0 | [] | The function (cli_attachment) is defined within the public class called public. The function starts at line 33 and ends at 36. It contains 1 line of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 1 function and has 1 function called inside, which is ["main.group"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | attachment_download | def attachment_download(manifests: list[str],s3_root_uri: str,path_mapping_rules: str,json: bool,**args,):"""Download data files of manifest root(s) to a machine for given manifest(s) from S3."""logger: ClickLogger = ClickLogger(is_json=json)# Setup configconfig = _apply_cli_options_to_config(**args)# Assuming when passing with config, session constructs from the profile id for S3 calls# TODO - add type for profile, if queue type, get queue session directlyboto3_session: boto3.session = api.get_boto3_session(config=config)# If profile is not provided via args, default to use local config fileif not args.pop("profile", None):queue_id: str = config_file.get_setting("defaults.queue_id", config=config)farm_id: str = config_file.get_setting("defaults.farm_id", config=config)s3_settings: Optional[JobAttachmentS3Settings] = get_queue(farm_id=farm_id,queue_id=queue_id,session=boto3_session,).jobAttachmentSettingsif not s3_settings:raise MissingJobAttachmentSettingsError(f"Queue {queue_id} has no attachment settings")s3_root_uri = s3_settings.to_s3_root_uri()deadline_client = boto3_session.client("deadline")boto3_session = api.get_queue_user_boto3_session(deadline=deadline_client, config=config)if not s3_root_uri:raise MissingJobAttachmentSettingsError("No valid s3 root path available")# Apply conflict resolution setting from Config.conflict_resolution = FileConflictResolution.CREATE_COPYconflict_resolution_setting = config_file.get_setting("settings.conflict_resolution", config=config)if (conflict_resolution_settingand conflict_resolution_setting != FileConflictResolution.NOT_SELECTED.name):conflict_resolution = FileConflictResolution[conflict_resolution_setting]_attachment_download(manifests=manifests,s3_root_uri=s3_root_uri,boto3_session=boto3_session,path_mapping_rules=path_mapping_rules,logger=logger,conflict_resolution=conflict_resolution,) | 6 | 42 | 5 | 232 | 6 | 76 | 134 | 76 | 
manifests,s3_root_uri,path_mapping_rules,json,**args | ['config', 'deadline_client', 'conflict_resolution_setting', 'conflict_resolution', 'boto3_session', 's3_root_uri'] | None | {"AnnAssign": 5, "Assign": 7, "Expr": 2, "If": 4} | 24 | 59 | 24 | ["ClickLogger", "_apply_cli_options_to_config", "api.get_boto3_session", "args.pop", "config_file.get_setting", "config_file.get_setting", "get_queue", "MissingJobAttachmentSettingsError", "s3_settings.to_s3_root_uri", "boto3_session.client", "api.get_queue_user_boto3_session", "MissingJobAttachmentSettingsError", "config_file.get_setting", "_attachment_download", "cli_attachment.command", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.Choice", "click.option"] | 0 | [] | The function (attachment_download) defined within the public class called public.The function start at line 76 and ends at 134. It contains 42 lines of code and it has a cyclomatic complexity of 6. It takes 5 parameters, represented as [76.0] and does not return any value. It declares 24.0 functions, and It has 24.0 functions called inside which are ["ClickLogger", "_apply_cli_options_to_config", "api.get_boto3_session", "args.pop", "config_file.get_setting", "config_file.get_setting", "get_queue", "MissingJobAttachmentSettingsError", "s3_settings.to_s3_root_uri", "boto3_session.client", "api.get_queue_user_boto3_session", "MissingJobAttachmentSettingsError", "config_file.get_setting", "_attachment_download", "cli_attachment.command", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.Choice", "click.option"]. |
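The attachment_download row above defaults conflict handling to CREATE_COPY and only overrides it when the `settings.conflict_resolution` config value is present and is not NOT_SELECTED. A self-contained sketch of that fallback, with a hypothetical enum standing in for Deadline's FileConflictResolution (only the NOT_SELECTED and CREATE_COPY member names appear in the row; the other members and all values are assumptions):

```python
from enum import Enum
from typing import Optional


class FileConflictResolution(Enum):
    # Hypothetical members/values; the real enum lives in the
    # job-attachments package.
    NOT_SELECTED = 0
    SKIP = 1
    OVERWRITE = 2
    CREATE_COPY = 3


def resolve_conflict_setting(setting: Optional[str]) -> FileConflictResolution:
    """Mirror the row's logic: fall back to CREATE_COPY unless the config
    supplies a concrete member name."""
    conflict_resolution = FileConflictResolution.CREATE_COPY
    if setting and setting != FileConflictResolution.NOT_SELECTED.name:
        # Enum lookup by member name, e.g. "OVERWRITE".
        conflict_resolution = FileConflictResolution[setting]
    return conflict_resolution
```

Treating NOT_SELECTED the same as an absent setting keeps a freshly created config file equivalent to no config at all.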
aws-deadline_deadline-cloud | public | public | 0 | 0 | attachment_upload | def attachment_upload(manifests: list[str],root_dirs: list[str],path_mapping_rules: str,s3_root_uri: str,upload_manifest_path: str,json: bool,**args,):"""Upload output files to s3. The files always include data files, optionally upload manifests prefixed by given path."""logger: ClickLogger = ClickLogger(is_json=json)# Setup configconfig = _apply_cli_options_to_config(**args)# Assuming when passing with config, session constructs from the profile id for S3 calls# TODO - add type for profile, if queue type, get queue session directlyboto3_session: boto3.session = api.get_boto3_session(config=config)# If profile is not provided via args, default to use local config fileif not args.pop("profile", None):queue_id: str = config_file.get_setting("defaults.queue_id", config=config)farm_id: str = config_file.get_setting("defaults.farm_id", config=config)s3_settings: Optional[JobAttachmentS3Settings] = get_queue(farm_id=farm_id,queue_id=queue_id,session=boto3_session,).jobAttachmentSettingsif not s3_settings:raise MissingJobAttachmentSettingsError(f"Queue {queue_id} has no attachment settings")s3_root_uri = s3_settings.to_s3_root_uri()deadline_client = boto3_session.client("deadline")boto3_session = api.get_queue_user_boto3_session(deadline=deadline_client, config=config)if not s3_root_uri:raise MissingJobAttachmentSettingsError("No valid s3 root path available")_attachment_upload(root_dirs=root_dirs,manifests=manifests,s3_root_uri=s3_root_uri,boto3_session=boto3_session,path_mapping_rules=path_mapping_rules,upload_manifest_path=upload_manifest_path,logger=logger,) | 4 | 36 | 7 | 211 | 4 | 168 | 218 | 168 | manifests,root_dirs,path_mapping_rules,s3_root_uri,upload_manifest_path,json,**args | ['boto3_session', 'deadline_client', 'config', 's3_root_uri'] | None | {"AnnAssign": 5, "Assign": 4, "Expr": 2, "If": 3} | 23 | 51 | 23 | ["ClickLogger", "_apply_cli_options_to_config", "api.get_boto3_session", 
"args.pop", "config_file.get_setting", "config_file.get_setting", "get_queue", "MissingJobAttachmentSettingsError", "s3_settings.to_s3_root_uri", "boto3_session.client", "api.get_queue_user_boto3_session", "MissingJobAttachmentSettingsError", "_attachment_upload", "cli_attachment.command", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option"] | 0 | [] | The function (attachment_upload) defined within the public class called public.The function start at line 168 and ends at 218. It contains 36 lines of code and it has a cyclomatic complexity of 4. It takes 7 parameters, represented as [168.0] and does not return any value. It declares 23.0 functions, and It has 23.0 functions called inside which are ["ClickLogger", "_apply_cli_options_to_config", "api.get_boto3_session", "args.pop", "config_file.get_setting", "config_file.get_setting", "get_queue", "MissingJobAttachmentSettingsError", "s3_settings.to_s3_root_uri", "boto3_session.client", "api.get_queue_user_boto3_session", "MissingJobAttachmentSettingsError", "_attachment_upload", "cli_attachment.command", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _cli_on_pending_authorization | def _cli_on_pending_authorization(**kwargs):"""Callback for `login`, to tell the user that Deadline Cloud monitor is opening"""if kwargs["credentials_source"] == AwsCredentialsSource.DEADLINE_CLOUD_MONITOR_LOGIN:click.echo("Opening Deadline Cloud monitor. Please log in and then return here.") | 2 | 3 | 1 | 23 | 0 | 26 | 32 | 26 | **kwargs | [] | None | {"Expr": 2, "If": 1} | 1 | 7 | 1 | ["click.echo"] | 0 | [] | The function (_cli_on_pending_authorization) is defined within the public class called public. The function starts at line 26 and ends at 32. It contains 3 lines of code and has a cyclomatic complexity of 2. The function takes only keyword arguments (**kwargs) and does not return any value. It declares 1 function and has 1 function called inside, which is ["click.echo"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | cli_auth | def cli_auth():"""Commands to handle authentication.""" | 1 | 1 | 0 | 5 | 0 | 37 | 40 | 37 | [] | None | {"Expr": 1} | 1 | 4 | 1 | ["main.group"] | 0 | [] | The function (cli_auth) is defined within the public class called public. The function starts at line 37 and ends at 40. It contains 1 line of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 1 function and has 1 function called inside, which is ["main.group"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | auth_login | def auth_login():"""Logs in to the Deadline-configured AWS profile.This is for any profile type that Deadline knows how to login toCurrently only supports Deadline Cloud monitor"""click.echo(f"Logging into AWS Profile {config_file.get_setting('defaults.aws_profile_name')!r} for AWS Deadline Cloud")message = api.login(on_pending_authorization=_cli_on_pending_authorization, on_cancellation_check=None)click.echo(f"\nSuccessfully logged in: {message}\n") | 1 | 8 | 0 | 33 | 1 | 45 | 60 | 45 | ['message'] | None | {"Assign": 1, "Expr": 3} | 5 | 16 | 5 | ["click.echo", "config_file.get_setting", "api.login", "click.echo", "cli_auth.command"] | 0 | [] | The function (auth_login) is defined within the public class called public. The function starts at line 45 and ends at 60. It contains 8 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 5 functions and has 5 functions called inside, which are ["click.echo", "config_file.get_setting", "api.login", "click.echo", "cli_auth.command"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | auth_logout | def auth_logout():"""Logs out of the Deadline Cloud monitor configured AWS profile."""api.logout()click.echo("Successfully logged out of all Deadline Cloud monitor AWS profiles") | 1 | 3 | 0 | 16 | 0 | 65 | 71 | 65 | [] | None | {"Expr": 3} | 3 | 7 | 3 | ["api.logout", "click.echo", "cli_auth.command"] | 0 | [] | The function (auth_logout) is defined within the public class called public. The function starts at line 65 and ends at 71. It contains 3 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 3 functions and has 3 functions called inside, which are ["api.logout", "click.echo", "cli_auth.command"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | auth_status | def auth_status(output, **args):"""Gets the authentication status for the given AWS profile"""# Get a temporary config object with the standard options handledconfig = _apply_cli_options_to_config(**args)profile_name = get_setting("defaults.aws_profile_name", config=config)is_json_format = True if output == "json" else Falsewith _modified_logging_level(logging.getLogger("deadline.client.api"), logging.CRITICAL):# always returns enum in AwsCredentialsSourcecreds_source = api.get_credentials_source(config=config)creds_source_result = creds_source.name# always returns enum in AwsAuthenticationStatusauth_status = api.check_authentication_status(config=config)auth_status_results = auth_status.name# always returns True/Falseapi_availability_result = api.check_deadline_api_available(config=config)if not is_json_format:width = 17click.echo(f"{'Profile Name:': >{width}} {profile_name}")click.echo(f"{'Source:': >{width}} {creds_source_result}")click.echo(f"{'Status:': >{width}} {auth_status_results}")click.echo(f"{'API Availability:': >{width}} {api_availability_result}")else:json_output = {JSON_FIELD_PROFILE_NAME: profile_name,JSON_FIELD_CREDS_SOURCE: creds_source_result,JSON_FIELD_AUTH_STATUS: auth_status_results,JSON_FIELD_AUTH_API_AVAILABLE: api_availability_result,}click.echo(json.dumps(json_output, ensure_ascii=True)) | 3 | 24 | 2 | 162 | 10 | 89 | 121 | 89 | output,**args | ['profile_name', 'config', 'creds_source_result', 'auth_status_results', 'json_output', 'creds_source', 'api_availability_result', 'auth_status', 'width', 'is_json_format'] | None | {"Assign": 10, "Expr": 6, "If": 1, "With": 1} | 17 | 33 | 17 | ["_apply_cli_options_to_config", "get_setting", "_modified_logging_level", "logging.getLogger", "api.get_credentials_source", "api.check_authentication_status", "api.check_deadline_api_available", "click.echo", "click.echo", "click.echo", "click.echo", "click.echo", "json.dumps", 
"cli_auth.command", "click.option", "click.option", "click.Choice"] | 0 | [] | The function (auth_status) defined within the public class called public.The function start at line 89 and ends at 121. It contains 24 lines of code and it has a cyclomatic complexity of 3. It takes 2 parameters, represented as [89.0] and does not return any value. It declares 17.0 functions, and It has 17.0 functions called inside which are ["_apply_cli_options_to_config", "get_setting", "_modified_logging_level", "logging.getLogger", "api.get_credentials_source", "api.check_authentication_status", "api.check_deadline_api_available", "click.echo", "click.echo", "click.echo", "click.echo", "click.echo", "json.dumps", "cli_auth.command", "click.option", "click.option", "click.Choice"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | cli_bundle | def cli_bundle():"""Commands to work with Open Job Description job bundles.""" | 1 | 1 | 0 | 5 | 0 | 47 | 50 | 47 | [] | None | {"Expr": 1} | 1 | 4 | 1 | ["main.group"] | 0 | [] | The function (cli_bundle) is defined within the public class called public. The function starts at line 47 and ends at 50. It contains 1 line of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 1 function and has 1 function called inside, which is ["main.group"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | validate_parameters | def validate_parameters(ctx, param, value):"""Validate provided --parameter values, ensuring that they are in the format "ParamName=Value", and convert them to a dict with thefollowing format:[{"name": "<name>", "value": "<value>"}, ...]"""parameters_split = []for parameter in value:regex_match = re.match("([^=]+)=(.*)", parameter)if not regex_match:raise click.BadParameter(f'Parameters must be provided in the format "ParamName=Value". Invalid parameter: {parameter}')if not re.match(_openjd_identifier_regex, regex_match[1]):raise click.BadParameter(f"Parameter names must be alphanumeric Open Job Description identifiers. Invalid parameter name: {regex_match[1]}")parameters_split.append({"name": regex_match[1], "value": regex_match[2]})return parameters_split | 4 | 14 | 3 | 85 | 2 | 57 | 78 | 57 | ctx,param,value | ['regex_match', 'parameters_split'] | Returns | {"Assign": 2, "Expr": 2, "For": 1, "If": 2, "Return": 1} | 5 | 22 | 5 | ["re.match", "click.BadParameter", "re.match", "click.BadParameter", "parameters_split.append"] | 0 | [] | The function (validate_parameters) defined within the public class called public.The function start at line 57 and ends at 78. It contains 14 lines of code and it has a cyclomatic complexity of 4. It takes 3 parameters, represented as [57.0], and this function return a value. It declares 5.0 functions, and It has 5.0 functions called inside which are ["re.match", "click.BadParameter", "re.match", "click.BadParameter", "parameters_split.append"]. |
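The validate_parameters row above splits each `ParamName=Value` string with the regex `([^=]+)=(.*)` and then checks the name against an Open Job Description identifier pattern. A runnable sketch with click.BadParameter replaced by a plain exception and an assumed identifier regex (the real `_openjd_identifier_regex` is defined elsewhere in the module):

```python
import re

# Assumption: OpenJD identifiers look like ordinary programming
# identifiers; the real pattern lives in the deadline CLI module.
_openjd_identifier_regex = r"^[A-Za-z_][A-Za-z0-9_]*$"


class BadParameter(ValueError):
    """Stand-in for click.BadParameter so the sketch runs without click."""


def validate_parameters(value):
    """Convert ["Name=Value", ...] into [{"name": ..., "value": ...}, ...]."""
    parameters_split = []
    for parameter in value:
        regex_match = re.match("([^=]+)=(.*)", parameter)
        if not regex_match:
            raise BadParameter(
                f'Parameters must be in the format "ParamName=Value": {parameter}'
            )
        if not re.match(_openjd_identifier_regex, regex_match[1]):
            raise BadParameter(f"Invalid parameter name: {regex_match[1]}")
        parameters_split.append({"name": regex_match[1], "value": regex_match[2]})
    return parameters_split
```

Because the name group excludes `=`, the value may itself contain `=`: `"Token=a=b"` parses to name `Token`, value `a=b`.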
aws-deadline_deadline-cloud | public | public | 0 | 0 | _interactive_confirmation_prompt | def _interactive_confirmation_prompt(message: str, default_response: bool) -> bool:"""Callback to decide if submission should continue or be canceled. Returns True to continue, False to cancel.Args:warning_message (str): The warning message to display.default_response (bool): The default to present as the response (True to continue, False to cancel)."""return click.confirm(message,default=default_response,) | 1 | 5 | 2 | 26 | 0 | 81 | 92 | 81 | message,default_response | [] | bool | {"Expr": 1, "Return": 1} | 1 | 12 | 1 | ["click.confirm"] | 0 | [] | The function (_interactive_confirmation_prompt) is defined within the public class called public. The function starts at line 81 and ends at 92. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters, represented as [81.0], and returns a bool. It declares 1 function and has 1 function called inside, which is ["click.confirm"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | bundle_submit._check_create_job_wait_canceled | def _check_create_job_wait_canceled() -> bool:return sigint_handler.continue_operation | 1 | 2 | 0 | 10 | 0 | 201 | 202 | 201 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (bundle_submit._check_create_job_wait_canceled) is defined within the public class called public. The function starts at line 201 and ends at 202. It contains 2 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and returns a bool. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | bundle_submit | def bundle_submit(job_bundle_dir,job_attachments_file_system,parameter,known_asset_path,name,priority,max_failed_tasks_count,max_retries_per_task,max_worker_count,target_task_run_status,require_paths_exist,submitter_name,save_debug_snapshot,**args,):"""Submits an Open Job Description job bundle."""# Apply the CLI args to the configconfig = _apply_cli_options_to_config(required_options={"farm_id", "queue_id"}, **args)hash_callback_manager = _ProgressBarCallbackManager(length=100, label="Hashing Attachments")upload_callback_manager = _ProgressBarCallbackManager(length=100, label="Uploading Attachments")def _check_create_job_wait_canceled() -> bool:return sigint_handler.continue_operationtry:snapshot_tmpdir = Noneif save_debug_snapshot:save_debug_snapshot = os.path.abspath(save_debug_snapshot)# If the debug snapshot is to a zip file, first put it in a temporary directoryif save_debug_snapshot.endswith(".zip"):snapshot_tmpdir = tempfile.TemporaryDirectory()job_id = api.create_job_from_job_bundle(job_bundle_dir=job_bundle_dir,job_parameters=parameter,name=name,job_attachments_file_system=job_attachments_file_system,config=config,priority=priority,max_failed_tasks_count=max_failed_tasks_count,max_retries_per_task=max_retries_per_task,max_worker_count=max_worker_count,target_task_run_status=target_task_run_status,hashing_progress_callback=hash_callback_manager.callback,upload_progress_callback=upload_callback_manager.callback,create_job_result_callback=_check_create_job_wait_canceled,print_function_callback=click.echo,interactive_confirmation_callback=_interactive_confirmation_prompt,require_paths_exist=require_paths_exist,submitter_name=submitter_name or "CLI",known_asset_paths=known_asset_path,debug_snapshot_dir=snapshot_tmpdir.name if snapshot_tmpdir else save_debug_snapshot,)if snapshot_tmpdir:# Put the snapshot in a zip fileos.makedirs(os.path.dirname(save_debug_snapshot), 
exist_ok=True)shutil.make_archive(save_debug_snapshot, "zip", snapshot_tmpdir.name)if save_debug_snapshot:click.echo("Saved job debug snapshot:")click.echo(f"{save_debug_snapshot}")# Check Whether the CLI options are modifying any of the default settings that affect# the job id. If not, we'll save the job id submitted as the default job id.# If a job snapshot directory was provided, the job_id will be None.if (args.get("profile") is Noneand args.get("farm_id") is Noneand args.get("queue_id") is Noneand args.get("storage_profile_id") is Noneand job_id):config_file.set_setting("defaults.job_id", job_id)except AssetSyncCancelledError as exc:if sigint_handler.continue_operation:raise DeadlineOperationError(f"Job submission unexpectedly canceled:\n{exc}") from excelse:click.echo("Job submission canceled.")sys.exit(1)except AssetSyncError as exc:raise DeadlineOperationError(f"Failed to upload job attachments:\n{exc}") from excexcept CreateJobWaiterCanceled as exc:if sigint_handler.continue_operation:raise DeadlineOperationError(f"Unexpectedly canceled during wait for final status of CreateJob:\n{exc}") from excelse:click.echo("Canceled waiting for final status of CreateJob.")sys.exit(1)except ClientError as exc:raise DeadlineOperationError(f"Failed to submit the job bundle to AWS Deadline Cloud:\n{exc}") from excexcept MisconfiguredInputsError as exc:click.echo(str(exc))click.echo("Job submission canceled.")sys.exit(1)except Exception as exc:api.get_deadline_cloud_library_telemetry_client().record_error(event_details={"exception_scope": "on_submit"},exception_type=str(type(exc)),)raisefinally:if snapshot_tmpdir:snapshot_tmpdir.cleanup() | 22 | 94 | 14 | 461 | 6 | 176 | 288 | 176 | job_bundle_dir,job_attachments_file_system,parameter,known_asset_path,name,priority,max_failed_tasks_count,max_retries_per_task,max_worker_count,target_task_run_status,require_paths_exist,submitter_name,save_debug_snapshot,**args | ['config', 'snapshot_tmpdir', 'upload_callback_manager', 
'hash_callback_manager', 'save_debug_snapshot', 'job_id'] | Returns | {"Assign": 7, "Expr": 15, "If": 8, "Return": 1, "Try": 1} | 55 | 113 | 55 | ["_apply_cli_options_to_config", "_ProgressBarCallbackManager", "_ProgressBarCallbackManager", "os.path.abspath", "save_debug_snapshot.endswith", "tempfile.TemporaryDirectory", "api.create_job_from_job_bundle", "os.makedirs", "os.path.dirname", "shutil.make_archive", "click.echo", "click.echo", "args.get", "args.get", "args.get", "args.get", "config_file.set_setting", "DeadlineOperationError", "click.echo", "sys.exit", "DeadlineOperationError", "DeadlineOperationError", "click.echo", "sys.exit", "DeadlineOperationError", "click.echo", "str", "click.echo", "sys.exit", "record_error", "api.get_deadline_cloud_library_telemetry_client", "str", "type", "snapshot_tmpdir.cleanup", "cli_bundle.command", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.Choice", "click.option", "click.Choice", "click.option", "click.option", "click.option", "click.option", "click.option", "click.argument"] | 0 | [] | The function (bundle_submit) is defined within the public class called public. The function starts at line 176 and ends at 288. It contains 94 lines of code and has a cyclomatic complexity of 22. It takes 14 parameters and returns a value. 
It contains 55 function calls, which are ["_apply_cli_options_to_config", "_ProgressBarCallbackManager", "_ProgressBarCallbackManager", "os.path.abspath", "save_debug_snapshot.endswith", "tempfile.TemporaryDirectory", "api.create_job_from_job_bundle", "os.makedirs", "os.path.dirname", "shutil.make_archive", "click.echo", "click.echo", "args.get", "args.get", "args.get", "args.get", "config_file.set_setting", "DeadlineOperationError", "click.echo", "sys.exit", "DeadlineOperationError", "DeadlineOperationError", "click.echo", "sys.exit", "DeadlineOperationError", "click.echo", "str", "click.echo", "sys.exit", "record_error", "api.get_deadline_cloud_library_telemetry_client", "str", "type", "snapshot_tmpdir.cleanup", "cli_bundle.command", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.option", "click.Choice", "click.option", "click.Choice", "click.option", "click.option", "click.option", "click.option", "click.option", "click.argument"]. |
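The bundle_submit row above shows the debug-snapshot pattern: files are staged in a `tempfile.TemporaryDirectory`, zipped with `shutil.make_archive`, and the staging directory is cleaned up in a `finally` block. A minimal stdlib-only sketch of that pattern (file names and output location here are illustrative, not the tool's real paths):

```python
import os
import shutil
import tempfile

# Stage files in a temporary directory, zip them, then clean up,
# mirroring the snapshot handling visible in the bundle_submit row.
snapshot_tmpdir = tempfile.TemporaryDirectory()
out_dir = tempfile.mkdtemp()  # hypothetical destination for the .zip
try:
    # Stage a file, as the real code stages create_job_args.json.
    with open(os.path.join(snapshot_tmpdir.name, "create_job_args.json"), "w", encoding="utf-8") as fh:
        fh.write("{}")
    # make_archive appends ".zip" to the base name and returns the full path.
    archive_path = shutil.make_archive(os.path.join(out_dir, "debug_snapshot"), "zip", snapshot_tmpdir.name)
finally:
    # The staging directory is always removed, even if archiving fails.
    snapshot_tmpdir.cleanup()
```

Note that `shutil.make_archive` takes the base name without an extension and returns the path including it, which is why the real code can echo the archive path after creation.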
aws-deadline_deadline-cloud | public | public | 0 | 0 | bundle_gui_submit | def bundle_gui_submit(parameter, job_bundle_dir, browse, output, install_gui, submitter_name, known_asset_path, **args):"""Opens a GUI to submit an Open Job Description job bundle."""from ...ui import gui_context_for_cliwith gui_context_for_cli(automatically_install_dependencies=install_gui) as app:from ...ui.job_bundle_submitter import show_job_bundle_submitterif not job_bundle_dir and not browse:raise DeadlineOperationError("Specify a job bundle directory or run the bundle command with the --browse flag")output = output.lower()submitter = show_job_bundle_submitter(input_job_bundle_dir=job_bundle_dir,browse=browse,submitter_name=submitter_name,known_asset_paths=known_asset_path,job_parameters=parameter,)if not submitter:returnsubmitter.show()app.exec()_print_response(output=output,job_bundle_dir=job_bundle_dir,job_history_bundle_dir=submitter.job_history_bundle_dir,job_id=submitter.job_id,) | 4 | 28 | 8 | 125 | 2 | 338 | 375 | 338 | parameter,job_bundle_dir,browse,output,install_gui,submitter_name,known_asset_path,**args | ['output', 'submitter'] | None | {"Assign": 2, "Expr": 4, "If": 2, "Return": 1, "With": 1} | 16 | 38 | 16 | ["gui_context_for_cli", "DeadlineOperationError", "output.lower", "show_job_bundle_submitter", "submitter.show", "app.exec", "_print_response", "cli_bundle.command", "click.option", "click.argument", "click.option", "click.option", "click.option", "click.option", "click.Choice", "click.option"] | 0 | [] | The function (bundle_gui_submit) is defined within the public class called public. The function starts at line 338 and ends at 375. It contains 28 lines of code and has a cyclomatic complexity of 4. It takes 8 parameters and does not return a value. 
It contains 16 function calls, which are ["gui_context_for_cli", "DeadlineOperationError", "output.lower", "show_job_bundle_submitter", "submitter.show", "app.exec", "_print_response", "cli_bundle.command", "click.option", "click.argument", "click.option", "click.option", "click.option", "click.option", "click.Choice", "click.option"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _print_response | def _print_response(output: str,job_bundle_dir: str,job_history_bundle_dir: Optional[str],job_id: Optional[str],):if output == "json":if job_id:response: dict[str, Any] = {"status": "SUBMITTED","jobId": job_id,"jobHistoryBundleDirectory": job_history_bundle_dir,}click.echo(json.dumps(response))else:click.echo(json.dumps({"status": "CANCELED"}))else:if job_id:click.echo("Submitted job bundle:")click.echo(f" {job_bundle_dir}")click.echo(f"Job ID: {job_id}")else:click.echo("Job submission canceled.") | 4 | 23 | 4 | 118 | 0 | 378 | 400 | 378 | output,job_bundle_dir,job_history_bundle_dir,job_id | [] | None | {"AnnAssign": 1, "Expr": 6, "If": 3} | 8 | 23 | 8 | ["click.echo", "json.dumps", "click.echo", "json.dumps", "click.echo", "click.echo", "click.echo", "click.echo"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.bundle_group_py.bundle_gui_submit"] | The function (_print_response) is defined within the public class called public. The function starts at line 378 and ends at 400. It contains 23 lines of code and has a cyclomatic complexity of 4. It takes 4 parameters and does not return a value. It contains 8 function calls, which are ["click.echo", "json.dumps", "click.echo", "json.dumps", "click.echo", "click.echo", "click.echo", "click.echo"], and it has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.bundle_group_py.bundle_gui_submit"]. 
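The _print_response row above shows two output modes: in JSON mode a successful submission emits a single JSON object, and a missing job id emits a bare CANCELED status. A sketch reproducing just that payload logic, following the function body in the row (the job id and directory values below are made up for illustration):

```python
import json

# Rebuilds the JSON payload _print_response emits in "json" output
# mode: "SUBMITTED" with job details on success, "CANCELED" otherwise.
def print_response_json(job_id, job_history_bundle_dir):
    if job_id:
        response = {
            "status": "SUBMITTED",
            "jobId": job_id,
            "jobHistoryBundleDirectory": job_history_bundle_dir,
        }
    else:
        response = {"status": "CANCELED"}
    return json.dumps(response)

submitted = print_response_json("job-1234", "/tmp/job_history")  # hypothetical values
canceled = print_response_json(None, None)
```

Because the dict is built with a fixed key order, downstream tooling that parses this output can rely on the `status` field always being present in both branches.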
aws-deadline_deadline-cloud | ClickLogger | public | 0 | 0 | __init__ | def __init__(self, is_json: bool):self._is_json = is_json | 1 | 2 | 2 | 14 | 0 | 15 | 16 | 15 | self,is_json | [] | None | {"Assign": 1} | 0 | 2 | 0 | [] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) is defined within the public class called ClickLogger. The function starts at line 15 and ends at 16. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and does not return a value. It has 4993 functions calling it, which include ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | ClickLogger | public | 0 | 0 | is_json | def is_json(self) -> bool:"""Is logging in JSON mode."""return self._is_json | 1 | 2 | 1 | 12 | 0 | 18 | 22 | 18 | self | [] | bool | {"Expr": 1, "Return": 1} | 0 | 5 | 0 | [] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.ee.db_connections.type_infer_py.get_parser", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.tests.test_mime_py.TestIsJsonFunction.test_is_json_with_empty_content_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.tests.test_mime_py.TestIsJsonFunction.test_is_json_with_non_json_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.tests.test_mime_py.TestIsJsonFunction.test_is_json_with_valid_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.tests.test_mime_py.TestIsJsonFunction.test_is_json_with_wrong_extension"] | The function (is_json) is defined within the public class called ClickLogger. The function starts at line 18 and ends at 22. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a bool. 
It has 5 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.ee.db_connections.type_infer_py.get_parser", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.tests.test_mime_py.TestIsJsonFunction.test_is_json_with_empty_content_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.tests.test_mime_py.TestIsJsonFunction.test_is_json_with_non_json_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.tests.test_mime_py.TestIsJsonFunction.test_is_json_with_valid_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.tests.test_mime_py.TestIsJsonFunction.test_is_json_with_wrong_extension"]. 
aws-deadline_deadline-cloud | ClickLogger | public | 0 | 0 | echo | def echo(self,message: t.Optional[t.Any] = None,file: t.Optional[t.IO[t.Any]] = None,nl: bool = True,err: bool = False,color: t.Optional[bool] = None,):if not self._is_json:click.echo(message, file, nl, err, color) | 2 | 10 | 6 | 80 | 0 | 24 | 33 | 24 | self,message,file,nl,err,color | [] | None | {"Expr": 1, "If": 1} | 1 | 10 | 1 | ["click.echo"] | 18 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.splitgraph.cloud.tunnel_client_py.launch_rathole_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click._bashcomplete_py.bashcomplete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click._bashcomplete_py.do_complete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click._bashcomplete_py.do_complete_fish", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click._termui_impl_py.ProgressBar.render_progress", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py.BaseCommand.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py.Command.get_help_option", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py.Command.parse_args", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py.MultiCommand.parse_args", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py._maybe_show_deprecated_notice", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.decorators_py.help_option", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.decorators_py.version_option", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.exceptions_py.ClickException.show", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.exceptions_py.UsageError.show", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.termui_py.confirm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.termui_py.pause", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.termui_py.prompt", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.termui_py.secho"] | The function (echo) defined within the public class called ClickLogger.The function start at line 24 and ends at 33. It contains 10 lines of code and it has a cyclomatic complexity of 2. It takes 6 parameters, represented as [24.0] and does not return any value. 
It contains 1 function call, which is ["click.echo"], and it has 18 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.splitgraph.cloud.tunnel_client_py.launch_rathole_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click._bashcomplete_py.bashcomplete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click._bashcomplete_py.do_complete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click._bashcomplete_py.do_complete_fish", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click._termui_impl_py.ProgressBar.render_progress", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py.BaseCommand.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py.Command.get_help_option", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py.Command.parse_args", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py.MultiCommand.parse_args", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.core_py._maybe_show_deprecated_notice", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.decorators_py.help_option", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.decorators_py.version_option", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.exceptions_py.ClickException.show", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.exceptions_py.UsageError.show", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.termui_py.confirm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.termui_py.pause", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.termui_py.prompt", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.vendor.click.termui_py.secho"]. |
aws-deadline_deadline-cloud | ClickLogger | public | 0 | 0 | json | def json(self,message: t.Optional[dict] = None,file: t.Optional[t.IO[t.Any]] = None,nl: bool = True,err: bool = False,color: t.Optional[bool] = None,indent=None,):if self._is_json:click.echo(json.dumps(obj=message, indent=indent), file, nl, err, color) | 2 | 11 | 7 | 92 | 0 | 35 | 45 | 35 | self,message,file,nl,err,color,indent | [] | None | {"Expr": 1, "If": 1} | 2 | 11 | 2 | ["click.echo", "json.dumps"] | 60 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.binance_api_utils_py.get_current_price", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.report_py.get_report", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.19906768_dragon_userbot_dragon_userbot.modules.loader_py.load_all_mods", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3548418_mozilla_mozilla_ci_tools.mozci.taskcluster.tc_py.get_latest_full_task", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3548418_mozilla_mozilla_ci_tools.mozci.taskcluster.tc_py.validate_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3708613_moogar0880_pytrakt.trakt.core_py.get_device_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.android_py.device", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.android_py.magisk", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.android_py.twrp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.misc_py.ud", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.misc_py.wall", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2._archived.webview_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2._archived.webview_py.runtest", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2._archived.webview_py.test_self_driver", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2.examples.batteryweb.main_py.battery_level", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953407_raphielgang_telegram_paperplane.userbot.modules.android_py.magisk", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953407_raphielgang_telegram_paperplane.userbot.modules.gen_direct_links_py.yandex_disk", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953407_raphielgang_telegram_paperplane.userbot.modules.qrcode_py.parseqr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3961680_n4s4_synology_api.synology_api.auth_py.Authentication.get_api_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3961752_openfun_openedx_docker.gitlint.gitlint_emoji_py.GitmojiTitle.validate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.ds_client_py.DSClient.get_user", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.utils_py.get_manifest", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.condor_crab_unique_users_py.get_crab_unique_users", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.condor_hs06coreHrPlot_py.get_hs06CpuTImeHr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.dbs_hdfs_crab_py.get_crab_popularity_ds"] | The function (json) is defined within the public class called ClickLogger. The function starts at line 35 and ends at 45. It contains 11 lines of code and has a cyclomatic complexity of 2. It takes 7 parameters and does not return a value. It contains 2 function calls, which are ["click.echo", "json.dumps"], and it has 60 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.binance_api_utils_py.get_current_price", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.report_py.get_report", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.19906768_dragon_userbot_dragon_userbot.modules.loader_py.load_all_mods", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3548418_mozilla_mozilla_ci_tools.mozci.taskcluster.tc_py.get_latest_full_task", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3548418_mozilla_mozilla_ci_tools.mozci.taskcluster.tc_py.validate_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3708613_moogar0880_pytrakt.trakt.core_py.get_device_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.android_py.device", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.android_py.magisk", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.android_py.twrp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.misc_py.ud", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913189_sensipeeps_skyleebot.skylee.modules.misc_py.wall", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2._archived.webview_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2._archived.webview_py.runtest", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2._archived.webview_py.test_self_driver", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928943_openatx_uiautomator2.examples.batteryweb.main_py.battery_level", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953407_raphielgang_telegram_paperplane.userbot.modules.android_py.magisk", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953407_raphielgang_telegram_paperplane.userbot.modules.gen_direct_links_py.yandex_disk", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953407_raphielgang_telegram_paperplane.userbot.modules.qrcode_py.parseqr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3961680_n4s4_synology_api.synology_api.auth_py.Authentication.get_api_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3961752_openfun_openedx_docker.gitlint.gitlint_emoji_py.GitmojiTitle.validate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.ds_client_py.DSClient.get_user", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.utils_py.get_manifest", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.condor_crab_unique_users_py.get_crab_unique_users", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.condor_hs06coreHrPlot_py.get_hs06CpuTImeHr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.dbs_hdfs_crab_py.get_crab_popularity_ds"]. |
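Taken together, the ClickLogger rows above describe a logger that is constructed with an `is_json` flag and then routes output through exactly one of two channels: `echo` for human-readable text (suppressed in JSON mode) and `json` for machine-readable output (suppressed in text mode). A simplified, self-contained sketch of that behavior, substituting an output-capturing list for `click.echo` (an assumption made so the sketch needs no click dependency):

```python
import json

class MiniClickLogger:
    """Simplified stand-in for ClickLogger; a list replaces click.echo."""

    def __init__(self, is_json):
        self._is_json = is_json
        self.lines = []  # captured output, for demonstration only

    def echo(self, message=""):
        # Human-readable output is dropped when JSON mode is active.
        if not self._is_json:
            self.lines.append(str(message))

    def json(self, message=None, indent=None):
        # JSON output is emitted only when JSON mode is active.
        if self._is_json:
            self.lines.append(json.dumps(message, indent=indent))

json_logger = MiniClickLogger(is_json=True)
json_logger.echo("human readable")   # dropped
json_logger.json({"status": "ok"})   # kept

text_logger = MiniClickLogger(is_json=False)
text_logger.echo("human readable")   # kept
text_logger.json({"status": "ok"})   # dropped
```

The design means callers can invoke both `echo` and `json` unconditionally at each logging point, and the constructor flag alone decides which stream reaches the user.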
aws-deadline_deadline-cloud | public | public | 0 | 0 | cli_config | def cli_config():"""Manage Deadline's workstation configuration.""" | 1 | 1 | 0 | 5 | 0 | 18 | 21 | 18 | [] | None | {"Expr": 1} | 1 | 4 | 1 | ["main.group"] | 0 | [] | The function (cli_config) is defined within the public class called public. The function starts at line 18 and ends at 21. It contains 1 line of code and has a cyclomatic complexity of 1. The function takes no parameters and does not return a value. It contains 1 function call, which is ["main.group"]. | 
aws-deadline_deadline-cloud | public | public | 0 | 0 | config_show | def config_show(output):"""Show all workstation configuration settings and current values."""settings_json = {}if output == "verbose":click.echo(f"AWS Deadline Cloud configuration file:\n {config_file.get_config_file_path()}")click.echo()for setting_name in config_file.SETTINGS.keys():setting_value = config_file.get_setting(setting_name)setting_default = config_file.get_setting_default(setting_name)# Wrap and indent the descriptions to 80 characters because they may be multiline.setting_description: str = config_file.SETTINGS[setting_name].get("description", "")setting_description = "\n".join(f" {line}" for line in textwrap.wrap(setting_description, width=77))click.echo(f"{setting_name}: {setting_value} {'(default)' if setting_value == setting_default else ''}")click.echo(setting_description)click.echo()else:settings_json["settings.config_file_path"] = str(config_file.get_config_file_path())for setting_name in config_file.SETTINGS.keys():setting_value = config_file.get_setting(setting_name)settings_json[setting_name] = setting_valueclick.echo(json.dumps(settings_json)) | 5 | 25 | 1 | 162 | 4 | 32 | 62 | 32 | output | ['settings_json', 'setting_default', 'setting_description', 'setting_value'] | None | {"AnnAssign": 1, "Assign": 7, "Expr": 7, "For": 2, "If": 1} | 21 | 31 | 21 | ["click.echo", "config_file.get_config_file_path", "click.echo", "config_file.SETTINGS.keys", "config_file.get_setting", "config_file.get_setting_default", "get", "join", "textwrap.wrap", "click.echo", "click.echo", "click.echo", "str", "config_file.get_config_file_path", "config_file.SETTINGS.keys", "config_file.get_setting", "click.echo", "json.dumps", "cli_config.command", "click.option", "click.Choice"] | 0 | [] | The function (config_show) is defined within the public class called public. The function starts at line 32 and ends at 62. It contains 25 lines of code and has a cyclomatic complexity of 5. 
It takes 1 parameter (output) and does not return any value. It declares 21 functions, and it has 21 functions called inside, which are ["click.echo", "config_file.get_config_file_path", "click.echo", "config_file.SETTINGS.keys", "config_file.get_setting", "config_file.get_setting_default", "get", "join", "textwrap.wrap", "click.echo", "click.echo", "click.echo", "str", "config_file.get_config_file_path", "config_file.SETTINGS.keys", "config_file.get_setting", "click.echo", "json.dumps", "cli_config.command", "click.option", "click.Choice"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | config_gui | def config_gui(install_gui: bool):"""Open the workstation configuration settings GUI."""from ...ui import gui_context_for_cliwith gui_context_for_cli(automatically_install_dependencies=install_gui):from ...ui.dialogs.deadline_config_dialog import DeadlineConfigDialogDeadlineConfigDialog.configure_settings() | 1 | 5 | 1 | 35 | 0 | 72 | 81 | 72 | install_gui | [] | None | {"Expr": 2, "With": 1} | 4 | 10 | 4 | ["gui_context_for_cli", "DeadlineConfigDialog.configure_settings", "cli_config.command", "click.option"] | 0 | [] | The function (config_gui) is defined within the public class called public. The function starts at line 72 and ends at 81. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (install_gui) and does not return any value. It declares 4 functions, and it has 4 functions called inside, which are ["gui_context_for_cli", "DeadlineConfigDialog.configure_settings", "cli_config.command", "click.option"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | config_set | def config_set(setting_name, value):"""Sets a workstation configuration setting.For example `deadline config set defaults.farm_id <farm-id>`.Run `deadline config --help` to show available settings."""config_file.set_setting(setting_name, value) | 1 | 2 | 2 | 16 | 0 | 88 | 95 | 88 | setting_name,value | [] | None | {"Expr": 2} | 4 | 8 | 4 | ["config_file.set_setting", "cli_config.command", "click.argument", "click.argument"] | 0 | [] | The function (config_set) is defined within the public class called public. The function starts at line 88 and ends at 95. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (setting_name, value) and does not return any value. It declares 4 functions, and it has 4 functions called inside, which are ["config_file.set_setting", "cli_config.command", "click.argument", "click.argument"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | config_clear | def config_clear(setting_name):"""Sets a workstation configuration setting back to the default value.For example `deadline config clear defaults.farm_id`.Run `deadline config --help` to show available settings."""config_file.clear_setting(setting_name) | 1 | 2 | 1 | 12 | 0 | 101 | 108 | 101 | setting_name | [] | None | {"Expr": 2} | 3 | 8 | 3 | ["config_file.clear_setting", "cli_config.command", "click.argument"] | 0 | [] | The function (config_clear) is defined within the public class called public. The function starts at line 101 and ends at 108. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (setting_name) and does not return any value. It declares 3 functions, and it has 3 functions called inside, which are ["config_file.clear_setting", "cli_config.command", "click.argument"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | config_get | def config_get(setting_name):"""Gets a workstation configuration setting.For example `deadline config get defaults.farm_id`.Run `deadline config --help` to show available settings."""click.echo(config_file.get_setting(setting_name)) | 1 | 2 | 1 | 17 | 0 | 114 | 121 | 114 | setting_name | [] | None | {"Expr": 2} | 4 | 8 | 4 | ["click.echo", "config_file.get_setting", "cli_config.command", "click.argument"] | 0 | [] | The function (config_get) is defined within the public class called public. The function starts at line 114 and ends at 121. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (setting_name) and does not return any value. It declares 4 functions, and it has 4 functions called inside, which are ["click.echo", "config_file.get_setting", "cli_config.command", "click.argument"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | cli_farm | def cli_farm():"""Commands to work with farms.""" | 1 | 1 | 0 | 5 | 0 | 19 | 22 | 19 | [] | None | {"Expr": 1} | 1 | 4 | 1 | ["main.group"] | 0 | [] | The function (cli_farm) is defined within the public class called public. The function starts at line 19 and ends at 22. It contains 1 line of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 1 function, and it has 1 function called inside, which is ["main.group"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | farm_list | def farm_list(**args):"""Lists the available farms."""# Get a temporary config object with the standard options handledconfig = _apply_cli_options_to_config(**args)try:response = api.list_farms(config=config)except ClientError as exc:raise DeadlineOperationError(f"Failed to get Farms from Deadline:\n{exc}") from exc# Select which fields to print and in which orderstructured_farm_list = [{field: farm[field] for field in ["farmId", "displayName"]} for farm in response["farms"]]click.echo(_cli_object_repr(structured_farm_list)) | 4 | 10 | 1 | 75 | 3 | 28 | 45 | 28 | **args | ['config', 'structured_farm_list', 'response'] | None | {"Assign": 3, "Expr": 2, "Try": 1} | 7 | 18 | 7 | ["_apply_cli_options_to_config", "api.list_farms", "DeadlineOperationError", "click.echo", "_cli_object_repr", "cli_farm.command", "click.option"] | 0 | [] | The function (farm_list) is defined within the public class called public. The function starts at line 28 and ends at 45. It contains 10 lines of code and has a cyclomatic complexity of 4. It takes 1 parameter (**args) and does not return any value. It declares 7 functions, and it has 7 functions called inside, which are ["_apply_cli_options_to_config", "api.list_farms", "DeadlineOperationError", "click.echo", "_cli_object_repr", "cli_farm.command", "click.option"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | farm_get | def farm_get(**args):"""Get the details of a farm.If farm ID is not provided, returns the configured default farm."""# Get a temporary config object with the standard options handledconfig = _apply_cli_options_to_config(required_options={"farm_id"}, **args)farm_id = config_file.get_setting("defaults.farm_id", config=config)deadline = api.get_boto3_client("deadline", config=config)response = deadline.get_farm(farmId=farm_id)response.pop("ResponseMetadata", None)click.echo(_cli_object_repr(response)) | 1 | 7 | 1 | 71 | 4 | 52 | 67 | 52 | **args | ['farm_id', 'config', 'deadline', 'response'] | None | {"Assign": 4, "Expr": 3} | 10 | 16 | 10 | ["_apply_cli_options_to_config", "config_file.get_setting", "api.get_boto3_client", "deadline.get_farm", "response.pop", "click.echo", "_cli_object_repr", "cli_farm.command", "click.option", "click.option"] | 0 | [] | The function (farm_get) is defined within the public class called public. The function starts at line 52 and ends at 67. It contains 7 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (**args) and does not return any value. It declares 10 functions, and it has 10 functions called inside, which are ["_apply_cli_options_to_config", "config_file.get_setting", "api.get_boto3_client", "deadline.get_farm", "response.pop", "click.echo", "_cli_object_repr", "cli_farm.command", "click.option", "click.option"]. |