project_name string | class_name string | class_modifiers string | class_implements int64 | class_extends int64 | function_name string | function_body string | cyclomatic_complexity int64 | NLOC int64 | num_parameter int64 | num_token int64 | num_variable int64 | start_line int64 | end_line int64 | function_index int64 | function_params string | function_variable string | function_return_type string | function_body_line_type string | function_num_functions int64 | function_num_lines int64 | outgoing_function_count int64 | outgoing_function_names string | incoming_function_count int64 | incoming_function_names string | lexical_representation string |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
aws-deadline_deadline-cloud | InstallBuilderSelection | public | 0 | 0 | from_s3 | def from_s3(bucket_name: str, dest_path: Path, key: Optional[str] = None):if key is None:if platform.system() in _INSTALL_BUILDER_ARCHIVE_FILENAME:resolved_key = (f"install_builder/{_INSTALL_BUILDER_ARCHIVE_FILENAME[platform.system()]}")else:raise UnsupportedOSError(f"Unsupported OS: {platform.system()}")else:resolved_key = keyreturn InstallBuilderSelection(_InstallBuilderS3Selection(bucket_name, resolved_key, dest_path)) | 3 | 13 | 3 | 65 | 0 | 69 | 81 | 69 | bucket_name,dest_path,key | [] | Returns | {"Assign": 2, "If": 2, "Return": 1} | 6 | 13 | 6 | ["platform.system", "platform.system", "UnsupportedOSError", "platform.system", "InstallBuilderSelection", "_InstallBuilderS3Selection"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.setup_install_builder"] | The function (from_s3) is defined within the public class called InstallBuilderSelection. The function starts at line 69 and ends at line 81. It contains 13 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters (bucket_name, dest_path and key) and returns a value. It makes 6 function calls, which are ["platform.system", "platform.system", "UnsupportedOSError", "platform.system", "InstallBuilderSelection", "_InstallBuilderS3Selection"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.setup_install_builder"]. |
aws-deadline_deadline-cloud | InstallBuilderSelection | public | 0 | 0 | from_path | def from_path(path: Path):return InstallBuilderSelection(_InstallBuilderPathSelection(path)) | 1 | 2 | 1 | 15 | 0 | 84 | 85 | 84 | path | [] | Returns | {"Return": 1} | 2 | 2 | 2 | ["InstallBuilderSelection", "_InstallBuilderPathSelection"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.setup_install_builder"] | The function (from_path) is defined within the public class called InstallBuilderSelection. The function starts at line 84 and ends at line 85. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (path) and returns a value. It makes 2 function calls, which are ["InstallBuilderSelection", "_InstallBuilderPathSelection"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.setup_install_builder"]. |
aws-deadline_deadline-cloud | InstallBuilderSelection | public | 0 | 0 | from_search | def from_search():return InstallBuilderSelection(None) | 1 | 2 | 0 | 9 | 0 | 88 | 89 | 88 |  | [] | Returns | {"Return": 1} | 1 | 2 | 1 | ["InstallBuilderSelection"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.setup_install_builder"] | The function (from_search) is defined within the public class called InstallBuilderSelection. The function starts at line 88 and ends at line 89. It contains 2 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and returns a value. It makes 1 function call, which is ["InstallBuilderSelection"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.setup_install_builder"]. |
aws-deadline_deadline-cloud | _InstallBuilderSearchConfig | protected | 0 | 0 | get_install_directory_regex | def get_install_directory_regex(self) -> re.Pattern:return re.compile(self.install_directory_name_regex) | 1 | 2 | 1 | 18 | 0 | 97 | 98 | 97 | self | [] | re.Pattern | {"Return": 1} | 1 | 2 | 1 | ["re.compile"] | 0 | [] | The function (get_install_directory_regex) is defined within the protected class called _InstallBuilderSearchConfig. The function starts at line 97 and ends at line 98. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a re.Pattern. It makes 1 function call, which is ["re.compile"]. It has 0 functions calling it. |
aws-deadline_deadline-cloud | _InstallBuilderEdition | protected | 0 | 1 | from_str | def from_str(edition: str):if edition == "Professional":return _InstallBuilderEdition.PROFESSIONALelif edition == "Enterprise":return _InstallBuilderEdition.ENTERPRISEelif edition == "For Windows":return _InstallBuilderEdition.WINDOWSelif edition == "For OS X":return _InstallBuilderEdition.OSXelif edition == "":return _InstallBuilderEdition.LINUXelse:raise ValueError(f"Unknown edition: {edition}") | 6 | 13 | 1 | 60 | 0 | 122 | 134 | 122 | edition | [] | Returns | {"If": 5, "Return": 5} | 1 | 13 | 1 | ["ValueError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.find_installbuilder_py._get_default_installbuilder_location"] | The function (from_str) is defined within the protected class called _InstallBuilderEdition, which inherits from another class. The function starts at line 122 and ends at line 134. It contains 13 lines of code and has a cyclomatic complexity of 6. It takes 1 parameter (edition) and returns a value. It makes 1 function call, which is ["ValueError"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.find_installbuilder_py._get_default_installbuilder_location"]. |
aws-deadline_deadline-cloud | _InstallBuilderInstallation | protected | 0 | 0 | _sort_key | def _sort_key(install: "_InstallBuilderInstallation") -> tuple[int, int, int, int]:# Prioritize Enterprise, then Professional, then any other editionedition_priority = {_InstallBuilderEdition.ENTERPRISE: 0,_InstallBuilderEdition.PROFESSIONAL: 1,}edition_value = edition_priority.get(install.edition, 2)return (-install.version.major,-install.version.minor,-install.version.patch,edition_value,) | 1 | 12 | 1 | 72 | 0 | 151 | 163 | 151 | install | [] | tuple[int, int, int, int] | {"Assign": 2, "Return": 1} | 1 | 13 | 1 | ["edition_priority.get"] | 0 | [] | The function (_sort_key) is defined within the protected class called _InstallBuilderInstallation. The function starts at line 151 and ends at line 163. It contains 12 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (install) and returns a tuple[int, int, int, int]. It makes 1 function call, which is ["edition_priority.get"]. It has 0 functions calling it. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_default_installbuilder_location | def _get_default_installbuilder_location() -> Path:"""Returns the default location where InstallBuilder Professional will be installed depending on the platform."""if platform.system() not in _INSTALL_BUILDER_SEARCH_CONFIGS:raise UnsupportedOSError(f"Unsupported OS for building installer: {platform.system()}")config = _INSTALL_BUILDER_SEARCH_CONFIGS[platform.system()]candidates = []for install_dir in config.parent_path.iterdir():if install_dir.is_dir():match = config.get_install_directory_regex().match(install_dir.name)if match and (install_dir / "bin" / "builder").is_file():if platform.system() == "Linux":version_offset = 0else:version_offset = 1installation = _InstallBuilderInstallation(install_dir,_Semver(int(match.group(1 + version_offset)),int(match.group(2 + version_offset)),int(match.group(3 + version_offset)),),_InstallBuilderEdition.from_str(match.group(1) if platform.system() != "Linux" else ""),)candidates.append(installation)candidates.sort(key=_InstallBuilderInstallation._sort_key)if not candidates:raise FileNotFoundError("Could not find a default InstallBuilder path.")return candidates[0].path | 9 | 29 | 0 | 202 | 5 | 166 | 197 | 166 |  | ['config', 'version_offset', 'installation', 'candidates', 'match'] | Path | {"Assign": 6, "Expr": 3, "For": 1, "If": 5, "Return": 1} | 24 | 32 | 24 | ["platform.system", "UnsupportedOSError", "platform.system", "platform.system", "config.parent_path.iterdir", "install_dir.is_dir", "match", "config.get_install_directory_regex", "is_file", "platform.system", "_InstallBuilderInstallation", "_Semver", "int", "match.group", "int", "match.group", "int", "match.group", "_InstallBuilderEdition.from_str", "platform.system", "match.group", "candidates.append", "candidates.sort", "FileNotFoundError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.find_installbuilder_py.InstallBuilderSelection.resolve_install_builder_installation"] | The function (_get_default_installbuilder_location) is defined at the module level (recorded with class name public). The function starts at line 166 and ends at line 197. It contains 29 lines of code and has a cyclomatic complexity of 9. The function does not take any parameters and returns a Path. It makes 24 function calls, which are ["platform.system", "UnsupportedOSError", "platform.system", "platform.system", "config.parent_path.iterdir", "install_dir.is_dir", "match", "config.get_install_directory_regex", "is_file", "platform.system", "_InstallBuilderInstallation", "_Semver", "int", "match.group", "int", "match.group", "int", "match.group", "_InstallBuilderEdition.from_str", "platform.system", "match.group", "candidates.append", "candidates.sort", "FileNotFoundError"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.find_installbuilder_py.InstallBuilderSelection.resolve_install_builder_installation"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | main | def main(archive_path: Path, output_path: Path) -> None:"""Extracts the archive file to the output directory."""shutil.unpack_archive(archive_path, output_path) | 1 | 2 | 2 | 22 | 0 | 22 | 24 | 22 | archive_path,output_path | [] | None | {"Expr": 2} | 4 | 3 | 4 | ["shutil.unpack_archive", "click.command", "click.option", "click.option"] | 134 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"] | The function (main) is defined at the module level (recorded with class name public). The function starts at line 22 and ends at line 24. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (archive_path and output_path) and does not return a value. 
It makes 4 function calls, which are ["shutil.unpack_archive", "click.command", "click.option", "click.option"]. It has 134 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_desired_python_version | def _get_desired_python_version() -> str:if platform.system() == "Darwin":return "3.12"elif platform.system() == "Windows":return "3.10"elif platform.system() == "Linux":return "3.9"raise RuntimeError("Platform not supported") | 4 | 8 | 0 | 44 | 0 | 151 | 158 | 151 |  | [] | str | {"If": 3, "Return": 3} | 4 | 8 | 4 | ["platform.system", "platform.system", "platform.system", "RuntimeError"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._PackageLicenseInfo._get_dist_info_path", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._PackageLicenseInfo.check_against_attributions_allow_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py.generate_attributions_document"] | The function (_get_desired_python_version) is defined at the module level (recorded with class name public). The function starts at line 151 and ends at line 158. It contains 8 lines of code and has a cyclomatic complexity of 4. The function does not take any parameters and returns a str. It makes 4 function calls, which are ["platform.system", "platform.system", "platform.system", "RuntimeError"]. It has 3 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._PackageLicenseInfo._get_dist_info_path", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._PackageLicenseInfo.check_against_attributions_allow_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py.generate_attributions_document"]. |
aws-deadline_deadline-cloud | PythonInstall | public | 0 | 0 | __init__ | def __init__(self, arg: str, version: str, dev: bool):"""Create a python installation based in the passed in --python argumentIf the argument was "mise", query mise for an installed python interpreter of the desired version (only allowed in dev mode)If the argument was "uv", let uv install the desired python version (only allowed in dev mode)If the argument is anything else, check to see if it is a path to a file, if it is, assume this is the path to a Python interpreter"""if arg == "mise":interpreter_path = PythonInstall._get_mise_interpreter_path(version, dev)elif arg == "uv":if not dev:raise RuntimeError("Cannot use uv for Python interpreter outside of dev mode")interpreter_path = Noneelse:interpreter_path = Path(arg)if interpreter_path is not None:if not interpreter_path.is_file():raise RuntimeError("Specified python interpreter path either doesn't exist or is not a file")python_version_output = subprocess.check_output([interpreter_path, "--version"], text=True)version_match = _PYTHON_VERSION_REGEX.match(python_version_output)if version_match is None:raise RuntimeError(f"Python interpreter candidate at {interpreter_path} is not a Python interpreter")interpreter_version = f"{version_match.group(1)}.{version_match.group(2)}"if interpreter_version != version:raise RuntimeError(f"Python interpreter candidate at {interpreter_path} has version {interpreter_version} which does not match specified version {version}")self._interpreter_path = interpreter_pathself._version = versionself._dev = dev | 8 | 30 | 4 | 142 | 0 | 166 | 203 | 166 | self,arg,version,dev | [] | None | {"Assign": 9, "Expr": 1, "If": 7} | 11 | 38 | 11 | ["PythonInstall._get_mise_interpreter_path", "RuntimeError", "Path", "interpreter_path.is_file", "RuntimeError", "subprocess.check_output", "_PYTHON_VERSION_REGEX.match", "RuntimeError", "version_match.group", "version_match.group", "RuntimeError"] | 4993 | 
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) is defined within the public class called PythonInstall. The function starts at line 166 and ends at line 203. It contains 30 lines of code and has a cyclomatic complexity of 8. It takes 4 parameters (self, arg, version and dev) and does not return a value. It makes 11 function calls, which are ["PythonInstall._get_mise_interpreter_path", "RuntimeError", "Path", "interpreter_path.is_file", "RuntimeError", "subprocess.check_output", "_PYTHON_VERSION_REGEX.match", "RuntimeError", "version_match.group", "version_match.group", "RuntimeError"]. It has 4993 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | PythonInstall | public | 0 | 0 | _get_mise_interpreter_path | def _get_mise_interpreter_path(version: str, dev: bool) -> Path:if not dev:raise RuntimeError("Cannot use mise for Python interpreter outside of dev mode")python_install_path = Path(subprocess.check_output(["mise", "where", f"python@{version}"], text=True).strip())if not python_install_path.is_dir():raise RuntimeError(f"mise where python@{version} returned {python_install_path} which is not a directory.")if platform.system() == "Windows":python_exe_name = "python.exe"else:python_exe_name = "python"return python_install_path / "bin" / python_exe_name | 4 | 15 | 2 | 85 | 0 | 206 | 221 | 206 | version,dev | [] | Path | {"Assign": 3, "If": 3, "Return": 1} | 7 | 16 | 7 | ["RuntimeError", "Path", "strip", "subprocess.check_output", "python_install_path.is_dir", "RuntimeError", "platform.system"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py.PythonInstall.__init__"] | The function (_get_mise_interpreter_path) defined within the public class called PythonInstall.The function start at line 206 and ends at 221. It contains 15 lines of code and it has a cyclomatic complexity of 4. It takes 2 parameters, represented as [206.0], and this function returns a value. It declares 7.0 functions, It has 7.0 functions called inside which are ["RuntimeError", "Path", "strip", "subprocess.check_output", "python_install_path.is_dir", "RuntimeError", "platform.system"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py.PythonInstall.__init__"]. |
aws-deadline_deadline-cloud | PythonInstall | public | 0 | 0 | get_uv_venv_python_args | def get_uv_venv_python_args(self) -> list[str]:if self._dev:if self._interpreter_path is None:return ["--python", self._version]else:return ["--python",str(self._interpreter_path),"--python-preference","only-system","--no-python-downloads",]else:return ["--python",str(self._interpreter_path),"--python-preference","only-system","--no-python-downloads",] | 3 | 20 | 1 | 70 | 0 | 223 | 242 | 223 | self | [] | list[str] | {"If": 2, "Return": 3} | 2 | 20 | 2 | ["str", "str"] | 0 | [] | The function (get_uv_venv_python_args) defined within the public class called PythonInstall.The function start at line 223 and ends at 242. It contains 20 lines of code and it has a cyclomatic complexity of 3. It takes 1 parameter, represented as [223.0], and this function returns a value. It declares 2.0 functions, and It has 2.0 functions called inside which are ["str", "str"]. |
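The flattened body of `get_uv_venv_python_args` above can be reconstructed as a standalone sketch. Turning the `PythonInstall` attributes (`_dev`, `_version`, `_interpreter_path`) into plain parameters is an assumption made here for illustration; the branching matches the row's body: in dev mode with no pinned interpreter, uv resolves the version by name, otherwise uv is pinned to a concrete system interpreter with downloads disabled.

```python
from pathlib import Path
from typing import Optional

def uv_venv_python_args(dev: bool, version: str, interpreter_path: Optional[Path]) -> list[str]:
    # Dev mode without a pinned interpreter: let uv pick the interpreter by version.
    if dev and interpreter_path is None:
        return ["--python", version]
    # Otherwise pin uv to a concrete system interpreter and forbid downloads.
    return [
        "--python", str(interpreter_path),
        "--python-preference", "only-system",
        "--no-python-downloads",
    ]
```

Note the original's `else` branch and the non-dev branch return identical lists; the sketch collapses them into one return without changing behavior.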
aws-deadline_deadline-cloud | public | public | 0 | 0 | uv_pip | def uv_pip(args: list[str], venv: Path) -> None:"""Convenience function that calls `uv pip [args]` against the virtual environment at a given Path"""subprocess.check_call(["uv", "pip", *args], env={**os.environ, "VIRTUAL_ENV": str(venv)}) | 1 | 2 | 2 | 46 | 0 | 245 | 249 | 245 | args,venv | [] | None | {"Expr": 2} | 2 | 5 | 2 | ["subprocess.check_call", "str"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._get_license_info"] | The function (uv_pip) defined within the public class called public.The function start at line 245 and ends at 249. It contains 2 lines of code and it has a cyclomatic complexity of 1. It takes 2 parameters, represented as [245.0] and does not return any value. It declares 2.0 functions, It has 2.0 functions called inside which are ["subprocess.check_call", "str"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._get_license_info"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_sha256 | def _get_sha256(text: str) -> str:return hashlib.sha256(text.encode("utf8")).hexdigest() | 1 | 2 | 1 | 25 | 0 | 252 | 253 | 252 | text | [] | str | {"Return": 1} | 3 | 2 | 3 | ["hexdigest", "hashlib.sha256", "text.encode"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._PackageLicenseInfo.get_license_sha256", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._PackageLicenseInfo.get_notice_sha256"] | The function (_get_sha256) defined within the public class called public.The function start at line 252 and ends at 253. It contains 2 lines of code and it has a cyclomatic complexity of 1. It takes 1 parameter, represented as [252.0], and this function returns a value. It declares 3.0 functions, It has 3.0 functions called inside which are ["hexdigest", "hashlib.sha256", "text.encode"], It has 2.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._PackageLicenseInfo.get_license_sha256", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py._PackageLicenseInfo.get_notice_sha256"]. |
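The `_get_sha256` helper in the row above is small enough to reproduce runnable as-is (only the leading underscore is dropped here). It is the hashing primitive both `get_license_sha256` and `get_notice_sha256` delegate to when comparing license text against the allow list.

```python
import hashlib

def get_sha256(text: str) -> str:
    # Hash the UTF-8 bytes of the text and return the 64-character hex digest.
    return hashlib.sha256(text.encode("utf8")).hexdigest()
```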
aws-deadline_deadline-cloud | PythonInstall | public | 0 | 0 | __init__ | def __init__(self, venv: Path, pip_license_info: dict[str, str]):name = pip_license_info["Name"]version = pip_license_info["Version"]if name == "UNKNOWN":raise RuntimeError("Package missing name")self.name = nameif version == "UNKNOWN":raise RuntimeError(f"Package {name} missing version")self.version = versiondiscovered_license_text = pip_license_info["LicenseText"]discovered_notice_text = pip_license_info["NoticeText"]license_text_override = self._get_license_text_override(venv)notice_text_override = self._get_notice_text_override(venv)if license_text_override is not None:self.license_text = license_text_overrideelif discovered_license_text == "UNKNOWN":raise RuntimeError(f"Package {name} missing license text")else:self.license_text = discovered_license_textif notice_text_override is not None:self.notice_text = notice_text_overrideelif discovered_notice_text == "UNKNOWN":self.notice_text = Noneelse:self.notice_text = discovered_notice_text | 7 | 25 | 3 | 146 | 0 | 263 | 293 | 263 | self,arg,version,dev | [] | None | {"Assign": 9, "Expr": 1, "If": 7} | 11 | 38 | 11 | ["PythonInstall._get_mise_interpreter_path", "RuntimeError", "Path", "interpreter_path.is_file", "RuntimeError", "subprocess.check_output", "_PYTHON_VERSION_REGEX.match", "RuntimeError", "version_match.group", "version_match.group", "RuntimeError"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called PythonInstall.The function start at 
line 263 and ends at 293. It contains 25 lines of code and it has a cyclomatic complexity of 7. It takes 3 parameters, represented as [263.0] and does not return any value. It declares 11.0 functions, It has 11.0 functions called inside which are ["PythonInstall._get_mise_interpreter_path", "RuntimeError", "Path", "interpreter_path.is_file", "RuntimeError", "subprocess.check_output", "_PYTHON_VERSION_REGEX.match", "RuntimeError", "version_match.group", "version_match.group", "RuntimeError"], It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | _PackageLicenseInfo | protected | 0 | 0 | _get_license_text_override | def _get_license_text_override(self, venv: Path) -> Optional[str]:if self.name not in _LICENSE_PATH_OVERRIDES:return Nonedist_info_path = self._get_dist_info_path(venv)if not dist_info_path.is_dir():raise RuntimeError(f".dist-info path for {self.name} does not exist")with open(dist_info_path / _LICENSE_PATH_OVERRIDES[self.name], "r", encoding="utf8") as f:return f.read() | 3 | 8 | 2 | 73 | 0 | 295 | 302 | 295 | self,venv | [] | Optional[str] | {"Assign": 1, "If": 2, "Return": 2, "With": 1} | 5 | 8 | 5 | ["self._get_dist_info_path", "dist_info_path.is_dir", "RuntimeError", "open", "f.read"] | 0 | [] | The function (_get_license_text_override) defined within the protected class called _PackageLicenseInfo.The function start at line 295 and ends at 302. It contains 8 lines of code and it has a cyclomatic complexity of 3. It takes 2 parameters, represented as [295.0], and this function returns a value. It declares 5.0 functions, and It has 5.0 functions called inside which are ["self._get_dist_info_path", "dist_info_path.is_dir", "RuntimeError", "open", "f.read"]. |
aws-deadline_deadline-cloud | _PackageLicenseInfo | protected | 0 | 0 | _get_notice_text_override | def _get_notice_text_override(self, venv: Path) -> Optional[str]:if self.name not in _NOTICE_PATH_OVERRIDES:return Nonedist_info_path = self._get_dist_info_path(venv)if not dist_info_path.is_dir():raise RuntimeError(f".dist-info path for {self.name} does not exist")with open(dist_info_path / _NOTICE_PATH_OVERRIDES[self.name], "r", encoding="utf8") as f:return f.read() | 3 | 8 | 2 | 73 | 0 | 304 | 311 | 304 | self,venv | [] | Optional[str] | {"Assign": 1, "If": 2, "Return": 2, "With": 1} | 5 | 8 | 5 | ["self._get_dist_info_path", "dist_info_path.is_dir", "RuntimeError", "open", "f.read"] | 0 | [] | The function (_get_notice_text_override) defined within the protected class called _PackageLicenseInfo.The function start at line 304 and ends at 311. It contains 8 lines of code and it has a cyclomatic complexity of 3. It takes 2 parameters, represented as [304.0], and this function returns a value. It declares 5.0 functions, and It has 5.0 functions called inside which are ["self._get_dist_info_path", "dist_info_path.is_dir", "RuntimeError", "open", "f.read"]. |
aws-deadline_deadline-cloud | _PackageLicenseInfo | protected | 0 | 0 | check_against_attributions_allow_list | def check_against_attributions_allow_list(self) -> None:if self.name not in _ATTRIBUTIONS_ALLOW_LIST:raise RuntimeError(f"Package {self.name} is not in the allow list for the attributions document")license_sha256 = self.get_license_sha256()license_sha256_version_key = f"license_sha256_{_get_desired_python_version()}"if license_sha256_version_key in _ATTRIBUTIONS_ALLOW_LIST[self.name]:expected_sha256 = _ATTRIBUTIONS_ALLOW_LIST[self.name][license_sha256_version_key]else:expected_sha256 = _ATTRIBUTIONS_ALLOW_LIST[self.name]["license_sha256"]if license_sha256 != expected_sha256:raise RuntimeError(f"Package {self.name} has had a change to its license text since added to the allow list. Computed sha256 is {license_sha256}")notice_sha256 = self.get_notice_sha256()if notice_sha256 is None and "notice_sha256" not in _ATTRIBUTIONS_ALLOW_LIST[self.name]:returnif notice_sha256 is None and "notice_sha256" in _ATTRIBUTIONS_ALLOW_LIST[self.name]:raise RuntimeError(f"No notice file found for package {self.name} but the allow list has a sha256 for a notice file for {self.name}")if notice_sha256 is not None and "notice_sha256" not in _ATTRIBUTIONS_ALLOW_LIST[self.name]:raise RuntimeError(f"Found notice file with sha256 {notice_sha256} for package {self.name}, but {self.name} does not have a notice file sha256 in the allow list.")if notice_sha256 != _ATTRIBUTIONS_ALLOW_LIST[self.name]["notice_sha256"]:raise RuntimeError(f"Package {self.name} has had a change to its notice text since added to the allow list") | 11 | 30 | 1 | 161 | 0 | 313 | 344 | 313 | self | [] | None | {"Assign": 5, "If": 7, "Return": 1} | 8 | 32 | 8 | ["RuntimeError", "self.get_license_sha256", "_get_desired_python_version", "RuntimeError", "self.get_notice_sha256", "RuntimeError", "RuntimeError", "RuntimeError"] | 0 | [] | The function (check_against_attributions_allow_list) defined within the protected class called _PackageLicenseInfo.The function start at line 313 and ends at 344. It contains 30 lines of code and it has a cyclomatic complexity of 11. It takes 1 parameter, represented as [313.0], and does not return any value. It declares 8.0 functions, and It has 8.0 functions called inside which are ["RuntimeError", "self.get_license_sha256", "_get_desired_python_version", "RuntimeError", "self.get_notice_sha256", "RuntimeError", "RuntimeError", "RuntimeError"]. |
aws-deadline_deadline-cloud | _PackageLicenseInfo | protected | 0 | 0 | get_attribution_text | def get_attribution_text(self) -> str:if self.notice_text is None:notice_text = "\n"else:notice_text = f"\n{self.notice_text}\n"return f"{self.name}\n\n{self.license_text}{notice_text}" | 2 | 6 | 1 | 26 | 0 | 346 | 352 | 346 | self | [] | str | {"Assign": 2, "If": 1, "Return": 1} | 0 | 7 | 0 | [] | 0 | [] | The function (get_attribution_text) defined within the protected class called _PackageLicenseInfo.The function start at line 346 and ends at 352. It contains 6 lines of code and it has a cyclomatic complexity of 2. It takes 1 parameter, represented as [346.0], and this function returns a value. |
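The formatting done by `get_attribution_text` above can be sketched as a free function; replacing the instance attributes (`name`, `license_text`, `notice_text`) with parameters is an assumption for illustration. The notice, when present, is inserted between newlines after the license text.

```python
from typing import Optional

def get_attribution_text(name: str, license_text: str, notice_text: Optional[str]) -> str:
    # A missing notice contributes only a trailing newline; a present notice
    # is wrapped in newlines and appended after the license text.
    tail = "\n" if notice_text is None else f"\n{notice_text}\n"
    return f"{name}\n\n{license_text}{tail}"
```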
aws-deadline_deadline-cloud | _PackageLicenseInfo | protected | 0 | 0 | get_license_sha256 | def get_license_sha256(self) -> str:return _get_sha256(self.license_text) | 1 | 2 | 1 | 14 | 0 | 354 | 355 | 354 | self | [] | str | {"Return": 1} | 1 | 2 | 1 | ["_get_sha256"] | 0 | [] | The function (get_license_sha256) defined within the protected class called _PackageLicenseInfo.The function start at line 354 and ends at 355. It contains 2 lines of code and it has a cyclomatic complexity of 1. It takes 1 parameter, represented as [354.0], and this function returns a value. It declares 1.0 function, and It has 1.0 function called inside which is ["_get_sha256"]. |
aws-deadline_deadline-cloud | _PackageLicenseInfo | protected | 0 | 0 | get_notice_sha256 | def get_notice_sha256(self) -> Optional[str]:if self.notice_text is not None:return _get_sha256(self.notice_text)return None | 2 | 4 | 1 | 27 | 0 | 357 | 360 | 357 | self | [] | Optional[str] | {"If": 1, "Return": 2} | 1 | 4 | 1 | ["_get_sha256"] | 0 | [] | The function (get_notice_sha256) defined within the protected class called _PackageLicenseInfo.The function start at line 357 and ends at 360. It contains 4 lines of code and it has a cyclomatic complexity of 2. It takes 1 parameter, represented as [357.0], and this function returns a value. It declares 1.0 function, and It has 1.0 function called inside which is ["_get_sha256"]. |
aws-deadline_deadline-cloud | _PackageLicenseInfo | protected | 0 | 0 | _get_dist_info_path | def _get_dist_info_path(self, venv: Path) -> Path:if platform.system() == "Windows":return venv / "Lib" / "site-packages" / f"{self.name}-{self.version}.dist-info"else:return (venv/ "lib"/ f"python{_get_desired_python_version()}"/ "site-packages"/ f"{self.name}-{self.version}.dist-info") | 2 | 11 | 2 | 45 | 0 | 362 | 372 | 362 | self,venv | [] | Path | {"If": 1, "Return": 2} | 2 | 11 | 2 | ["platform.system", "_get_desired_python_version"] | 0 | [] | The function (_get_dist_info_path) defined within the protected class called _PackageLicenseInfo.The function start at line 362 and ends at 372. It contains 11 lines of code and it has a cyclomatic complexity of 2. It takes 2 parameters, represented as [362.0], and this function returns a value. It declares 2.0 functions, and It has 2.0 functions called inside which are ["platform.system", "_get_desired_python_version"]. |
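The platform branch in `_get_dist_info_path` above can be reconstructed as a standalone sketch. Passing `name`, `version`, and the desired Python version as parameters (the last standing in for the `_get_desired_python_version()` helper) is an assumption for illustration: Windows venvs keep `site-packages` directly under `Lib/`, while POSIX venvs nest it under `lib/pythonX.Y/`.

```python
import platform
from pathlib import Path

def dist_info_path(venv: Path, name: str, version: str, python_version: str) -> Path:
    # Windows venv layout: <venv>/Lib/site-packages/<name>-<version>.dist-info
    if platform.system() == "Windows":
        return venv / "Lib" / "site-packages" / f"{name}-{version}.dist-info"
    # POSIX venv layout: <venv>/lib/pythonX.Y/site-packages/<name>-<version>.dist-info
    return venv / "lib" / f"python{python_version}" / "site-packages" / f"{name}-{version}.dist-info"
```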
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_license_info | def _get_license_info(python_interpreter: PythonInstall) -> list[_PackageLicenseInfo]:repository_root = Path(__file__).parent.parent.parentwith tempfile.TemporaryDirectory() as td:temp = Path(td)venv = temp / ".venv"python_args = python_interpreter.get_uv_venv_python_args()uv_venv_args = ["uv", "venv", venv, *python_args]subprocess.check_call(uv_venv_args)uv_pip(["install", repository_root], venv)if platform.system() == "Windows":python_path = venv / "Scripts" / "python.exe"else:python_path = venv / "bin" / "python"pip_licenses_output = subprocess.check_output(["pip-licenses","--from=meta","--with-url","--with-license-file","--with-notice-file","--format=json",f"--python={python_path}",])pip_licenses_parsed = json.loads(pip_licenses_output)for package in pip_licenses_parsed:name = package["Name"]license_text = package["LicenseText"]notice_text = package["NoticeText"]if name in _EXPECTED_MISSING_LICENSE and license_text != "UNKNOWN":raise RuntimeError(f"Expected pip-licenses to not find a license for {name} but one was found.")if license_text == "UNKNOWN" and notice_text != "UNKNOWN":raise RuntimeError(f"pip-licenses found a notices file for {name} but no license file. This case is not handled.")if license_text == "UNKNOWN" and name not in _EXPECTED_MISSING_LICENSE:raise RuntimeError(f"pip-licenses did not find a license file for {name} but it was expected to.")pip_licenses_parsed = [packagefor package in pip_licenses_parsedif package["Name"] not in _EXPECTED_MISSING_LICENSE]return [_PackageLicenseInfo(venv,pip_license_info,)for pip_license_info in pip_licenses_parsedif pip_license_info["Name"] != "deadline"] | 13 | 54 | 1 | 243 | 11 | 375 | 431 | 375 | python_interpreter | ['temp', 'uv_venv_args', 'license_text', 'repository_root', 'venv', 'pip_licenses_parsed', 'name', 'python_args', 'notice_text', 'pip_licenses_output', 'python_path'] | list[_PackageLicenseInfo] | {"Assign": 13, "Expr": 2, "For": 1, "If": 4, "Return": 1, "With": 1} | 13 | 57 | 13 | ["Path", "tempfile.TemporaryDirectory", "Path", "python_interpreter.get_uv_venv_python_args", "subprocess.check_call", "uv_pip", "platform.system", "subprocess.check_output", "json.loads", "RuntimeError", "RuntimeError", "RuntimeError", "_PackageLicenseInfo"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py.generate_attributions_document"] | The function (_get_license_info) defined within the public class called public.The function start at line 375 and ends at 431. It contains 54 lines of code and it has a cyclomatic complexity of 13. It takes 1 parameter, represented as [375.0], and this function returns a value. It declares 13.0 functions, It has 13.0 functions called inside which are ["Path", "tempfile.TemporaryDirectory", "Path", "python_interpreter.get_uv_venv_python_args", "subprocess.check_call", "uv_pip", "platform.system", "subprocess.check_output", "json.loads", "RuntimeError", "RuntimeError", "RuntimeError", "_PackageLicenseInfo"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.attributions.cli_py.generate_attributions_document"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | generate_attributions_document | def generate_attributions_document(out_file: Path, python_arg: Optional[str], dev: bool) -> None:"""Generate an attributions document for this package and write it to `out_file`"""desired_python_version = _get_desired_python_version()python_install = PythonInstall(python_arg, desired_python_version, dev)license_info = _get_license_info(python_install)attributions = []for package in license_info:package.check_against_attributions_allow_list()attributions.append(package.get_attribution_text())additional_attributions_path = Path(__file__).parent / "additional"for attribution in _ADDITIONAL_ATTRIBUTIONS:with open(additional_attributions_path / attribution["attribution_path"], "r", encoding="utf8") as f:attributions.append(f"{attribution['name']}\n\n{f.read()}\n")attributions = "".join(attributions)with open(out_file, "w", encoding="utf8") as f:f.write(attributions) | 3 | 17 | 3 | 135 | 5 | 434 | 456 | 434 | out_file,python_arg,dev | ['python_install', 'attributions', 'license_info', 'additional_attributions_path', 'desired_python_version'] | None | {"Assign": 6, "Expr": 5, "For": 2, "With": 2} | 13 | 23 | 13 | ["_get_desired_python_version", "PythonInstall", "_get_license_info", "package.check_against_attributions_allow_list", "attributions.append", "package.get_attribution_text", "Path", "open", "attributions.append", "f.read", "join", "open", "f.write"] | 0 | [] | The function (generate_attributions_document) defined within the public class called public.The function start at line 434 and ends at 456. It contains 17 lines of code and it has a cyclomatic complexity of 3. It takes 3 parameters, represented as [434.0] and does not return any value. 
It declares 13.0 functions, and It has 13.0 functions called inside which are ["_get_desired_python_version", "PythonInstall", "_get_license_info", "package.check_against_attributions_allow_list", "attributions.append", "package.get_attribution_text", "Path", "open", "attributions.append", "f.read", "join", "open", "f.write"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | make_exe | def make_exe(exe_zipfile: Path, cleanup=True) -> None:clean_pyinstaller_build_dirs()# Create Deadline CLI distpyinstaller(str(DEADLINE_CLI_SPEC_PATH))# Create Deadline CLI wrapper distos.environ["PYINSTALLER_DEADLINE_CLI_DIST_PATH"] = str(DEADLINE_CLI_DIST_PATH)pyinstaller(str(DEADLINE_SPEC_PATH))# Sometimes the files output by pyinstaller have a last modified# date of the unix epoch. This causes make_archive to fail.# Touch each file in the directory we are archiving to make# sure they all have non-epoch modified dates.for dirpath, _, filenames in os.walk(DEADLINE_DIST_PATH):for filename in filenames:(Path(dirpath) / filename).touch(exist_ok=True)# Zip up the Deadline CLI wrapper to the final output pathshutil.make_archive(exe_zipfile.with_suffix(""), "zip", DEADLINE_DIST_PATH)if cleanup:clean_pyinstaller_build_dirs()print(f"Exe build is available at: {str(exe_zipfile)}") | 4 | 12 | 2 | 101 | 0 | 40 | 64 | 40 | exe_zipfile,cleanup | [] | None | {"Assign": 1, "Expr": 7, "For": 2, "If": 1} | 14 | 25 | 14 | ["clean_pyinstaller_build_dirs", "pyinstaller", "str", "str", "pyinstaller", "str", "os.walk", "touch", "Path", "shutil.make_archive", "exe_zipfile.with_suffix", "clean_pyinstaller_build_dirs", "print", "str"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.make_exe_py.main"] | The function (make_exe) defined within the public class called public.The function start at line 40 and ends at 64. It contains 12 lines of code and it has a cyclomatic complexity of 4. It takes 2 parameters, represented as [40.0] and does not return any value. 
It makes 14 function calls: ["clean_pyinstaller_build_dirs", "pyinstaller", "str", "str", "pyinstaller", "str", "os.walk", "touch", "Path", "shutil.make_archive", "exe_zipfile.with_suffix", "clean_pyinstaller_build_dirs", "print", "str"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.make_exe_py.main"]. |
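The make_exe row above works around PyInstaller sometimes emitting files with a Unix-epoch last-modified time, which `shutil.make_archive` rejects because the ZIP format cannot store dates before 1980. A minimal runnable sketch of that touch-then-zip step; `zip_dist` is a hypothetical helper name, not part of the repository:

```python
import os
import shutil
import tempfile
from pathlib import Path

def zip_dist(dist_dir: Path, zip_base: Path) -> Path:
    # Touch every file first so none carries a pre-1980 mtime that
    # would make the underlying zipfile writer fail.
    for dirpath, _, filenames in os.walk(dist_dir):
        for filename in filenames:
            (Path(dirpath) / filename).touch(exist_ok=True)
    # make_archive appends the ".zip" suffix to the base name itself.
    return Path(shutil.make_archive(str(zip_base), "zip", dist_dir))
```

Passing the base name without a suffix mirrors the original's `exe_zipfile.with_suffix("")` call.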
aws-deadline_deadline-cloud | public | public | 0 | 0 | pyinstaller | def pyinstaller(*args: tuple):if "--onefile" in args or "-F" in args:raise Exception("Cannot use --onefile/-F option for PyInstaller due to libreadline being licensed under GPL")################################# WARNING ################################### Do not change this to use one-file mode (do not add `--onefile` / `-F` ## to the command-line arguments).#### Doing so causes pyinstaller to bundle libreadline which is licensed## under GPL. ################################## WARNING ##################################subprocess.run(["pyinstaller", *args], cwd=PYINSTALLER_DIR, check=True) | 3 | 6 | 1 | 41 | 0 | 67 | 80 | 67 | *args | [] | None | {"Expr": 1, "If": 1} | 2 | 14 | 2 | ["Exception", "subprocess.run"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.make_exe_py.make_exe"] | The function (pyinstaller) is defined within the public class called public. The function starts at line 67 and ends at line 80. It contains 6 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (*args) and does not return a value. It makes 2 function calls: ["Exception", "subprocess.run"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.make_exe_py.make_exe"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | clean_pyinstaller_build_dirs | def clean_pyinstaller_build_dirs():for location in [PYINSTALLER_BUILD_DIR,PYINSTALLER_DIST_DIR,]:shutil.rmtree(location, ignore_errors=True)print(f"Deleted build directory: {str(location)}") | 2 | 7 | 0 | 29 | 0 | 83 | 89 | 83 | [] | None | {"Expr": 2, "For": 1} | 3 | 7 | 3 | ["shutil.rmtree", "print", "str"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.make_exe_py.make_exe"] | The function (clean_pyinstaller_build_dirs) is defined within the public class called public. The function starts at line 83 and ends at line 89. It contains 7 lines of code and has a cyclomatic complexity of 2. It takes no parameters and does not return a value. It makes 3 function calls: ["shutil.rmtree", "print", "str"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.make_exe_py.make_exe"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | main | def main() -> None:parser = argparse.ArgumentParser(description=__doc__)parser.add_argument("--output",default=str(ROOT / "dist" / DEFAULT_OUTPUT_ZIP),help=("The name of the file to save the exe zip. By default, "f"this will be saved in 'dist/{DEFAULT_OUTPUT_ZIP}' directory in the root of the ""DeadlineClient."),)parser.add_argument("--no-cleanup",dest="cleanup",action="store_false",help=("Leave the build folder produced by pyinstaller. This can be useful for debugging."),)args = parser.parse_args()output = Path(args.output).absolute()make_exe(output, cleanup=args.cleanup) | 1 | 20 | 0 | 93 | 3 | 92 | 113 | 92 | ['parser', 'args', 'output'] | None | {"Assign": 3, "Expr": 3} | 8 | 22 | 8 | ["argparse.ArgumentParser", "parser.add_argument", "str", "parser.add_argument", "parser.parse_args", "absolute", "Path", "make_exe"] | 134 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"] | The function (main) defined within the public class called public.The function start at line 92 and ends at 113. It contains 20 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. 
It makes 8 function calls: ["argparse.ArgumentParser", "parser.add_argument", "str", "parser.add_argument", "parser.parse_args", "absolute", "Path", "make_exe"]. It has 134 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | temp_zip_contents | def temp_zip_contents(zip_path: Path) -> Generator[Path, None, None]:"""Context manager that temporarily extracts the contents of a zipfile to a temporary directory, yielding the path to the temporarydirectory"""with tempfile.TemporaryDirectory() as td:temp = Path(td)with zipfile.ZipFile(zip_path, "r") as zip:zip.extractall(temp)yield temp | 1 | 6 | 1 | 52 | 1 | 26 | 36 | 26 | zip_path | ['temp'] | Generator[Path, None, None] | {"Assign": 1, "Expr": 3, "With": 2} | 4 | 11 | 4 | ["tempfile.TemporaryDirectory", "Path", "zipfile.ZipFile", "zip.extractall"] | 0 | [] | The function (temp_zip_contents) is defined within the public class called public. The function starts at line 26 and ends at line 36. It contains 6 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (zip_path) and returns a Generator[Path, None, None]. It makes 4 function calls: ["tempfile.TemporaryDirectory", "Path", "zipfile.ZipFile", "zip.extractall"]. |
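The flattened temp_zip_contents cell above is the standard pattern for extracting a zip into a self-cleaning temporary directory. Re-expanded as a runnable sketch; the `@contextmanager` decorator is an assumption inferred from the `Generator[Path, None, None]` return type:

```python
import tempfile
import zipfile
from contextlib import contextmanager
from pathlib import Path
from typing import Generator

@contextmanager
def temp_zip_contents(zip_path: Path) -> Generator[Path, None, None]:
    # Extract the archive into a temporary directory that is deleted
    # automatically when the caller's with-block exits.
    with tempfile.TemporaryDirectory() as td:
        temp = Path(td)
        with zipfile.ZipFile(zip_path, "r") as zf:
            zf.extractall(temp)
        yield temp
```

Because the `yield` sits inside the `TemporaryDirectory` block, cleanup happens only after the caller is done with the extracted files.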
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_pyinstaller_contents_info | def _get_pyinstaller_contents_info(archive_path: Path) -> list[str]:"""Uses pyi-archive_viewer to list the contents of a pyi-archive_viewer compatiblearchive such as a pyinstaller generated executable or .pyz file"""archive_list = subprocess.check_output(["pyi-archive_viewer", "--list", "--brief", str(archive_path)], text=True)# Output will look something like:## Options in 'deadline' (PKG/CArchive):#pyi-contents-directory _internal# Contents of 'deadline' (PKG/CArchive):#struct#pyimod01_archive#pyimod02_importers#pyimod03_ctypes#pyiboot01_bootstrap#deadline#PYZ.pyzstate = _PYIParseState.Startitems = []for line in archive_list.splitlines(keepends=False):if state == _PYIParseState.Start:if _PYI_CONTENTS_HEADER_REGEX.match(line) is not None:state = _PYIParseState.ContentsSectionelif state == _PYIParseState.ContentsSection:if item := line.strip():items.append(item)else:state = _PYIParseState.Endelif state == _PYIParseState.End:if line.strip():raise RuntimeError((f"Further data found after contents list for {archive_path}. ""This may mean the format has changed between pyinstaller versions. 
"f"Use `pyi-archive_viewer -l -b {archive_path}` to check the output to check the output."))else:raise RuntimeError(f"Non-valid parse state {str(state)} reached while parsing contents of {archive_path}")return items | 8 | 29 | 1 | 144 | 3 | 45 | 91 | 45 | archive_path | ['state', 'archive_list', 'items'] | list[str] | {"Assign": 5, "Expr": 2, "For": 1, "If": 6, "Return": 1} | 10 | 47 | 10 | ["subprocess.check_output", "str", "archive_list.splitlines", "_PYI_CONTENTS_HEADER_REGEX.match", "line.strip", "items.append", "line.strip", "RuntimeError", "RuntimeError", "str"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.temp_pyinstaller_archive_contents"] | The function (_get_pyinstaller_contents_info) defined within the public class called public.The function start at line 45 and ends at 91. It contains 29 lines of code and it has a cyclomatic complexity of 8. The function does not take any parameters and does not return any value. It declares 10.0 functions, It has 10.0 functions called inside which are ["subprocess.check_output", "str", "archive_list.splitlines", "_PYI_CONTENTS_HEADER_REGEX.match", "line.strip", "items.append", "line.strip", "RuntimeError", "RuntimeError", "str"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.temp_pyinstaller_archive_contents"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | temp_pyinstaller_archive_contents | def temp_pyinstaller_archive_contents(archive_path: Path) -> Generator[Path, None, None]:"""Context manager to temporarily extract the contents of a pyi-archive_viewer compatiblearchive (such as a pyinstaller generated executable or .pyz file) to a temporary directory"""archive_contents = _get_pyinstaller_contents_info(archive_path)with tempfile.TemporaryDirectory() as td:temp = Path(td)# We have to batch the extractions into multiple# invocations of pyi-archive_viewer because if we try# to do too many at once on Windows it freezes.for batch in itertools.batched(archive_contents, 16):# pyi-archive_viewer doesn't seem to have arguments# for extracting from the archive, so we need to# start it in interactive mode and pipe commands# to stdinprocess = subprocess.Popen(["pyi-archive_viewer", archive_path],stdin=subprocess.PIPE,stdout=subprocess.PIPE,stderr=subprocess.PIPE,text=True,)for item in batch:# X - Extractprocess.stdin.write(f"X {item}\n")process.stdin.flush()# Input the path to extract toprocess.stdin.write(f"{str(temp / item)}\n")process.stdin.flush()# Quitprocess.stdin.write("Q\n")process.stdin.flush()stdout, stderr = process.communicate()if process.returncode != 0:print(stdout)print(stderr)raise RuntimeError(f"Failed to extract {archive_path}")# Verify that all files were extractedfor item in batch:if not (temp / item).is_file():raise RuntimeError(f"Failed to extract {item.name} from {archive_path}")yield temp | 6 | 28 | 1 | 192 | 3 | 95 | 137 | 95 | archive_path | ['temp', 'archive_contents', 'process'] | Generator[Path, None, None] | {"Assign": 4, "Expr": 10, "For": 3, "If": 2, "With": 1} | 18 | 43 | 18 | ["_get_pyinstaller_contents_info", "tempfile.TemporaryDirectory", "Path", "itertools.batched", "subprocess.Popen", "process.stdin.write", "process.stdin.flush", "process.stdin.write", "str", "process.stdin.flush", "process.stdin.write", "process.stdin.flush", 
"process.communicate", "print", "print", "RuntimeError", "is_file", "RuntimeError"] | 0 | [] | The function (temp_pyinstaller_archive_contents) defined within the public class called public.The function start at line 95 and ends at 137. It contains 28 lines of code and it has a cyclomatic complexity of 6. The function does not take any parameters and does not return any value. It declares 18.0 functions, and It has 18.0 functions called inside which are ["_get_pyinstaller_contents_info", "tempfile.TemporaryDirectory", "Path", "itertools.batched", "subprocess.Popen", "process.stdin.write", "process.stdin.flush", "process.stdin.write", "str", "process.stdin.flush", "process.stdin.write", "process.stdin.flush", "process.communicate", "print", "print", "RuntimeError", "is_file", "RuntimeError"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | calculate_sha256 | def calculate_sha256(filepath: Path) -> str:"""Calculates the sha256 of a file at a given path"""sha256 = hashlib.sha256()with open(filepath, "rb") as f:while True:data = f.read(_BLOCK_SIZE)if not data:breaksha256.update(data)return sha256.hexdigest() | 3 | 9 | 1 | 55 | 2 | 140 | 151 | 140 | filepath | ['sha256', 'data'] | str | {"Assign": 2, "Expr": 2, "If": 1, "Return": 1, "While": 1, "With": 1} | 5 | 12 | 5 | ["hashlib.sha256", "open", "f.read", "sha256.update", "sha256.hexdigest"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.apply_allowlist"] | The function (calculate_sha256) is defined within the public class called public. The function starts at line 140 and ends at line 151. It contains 9 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (filepath) and returns a str. It makes 5 function calls: ["hashlib.sha256", "open", "f.read", "sha256.update", "sha256.hexdigest"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.apply_allowlist"]. |
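The calculate_sha256 cell above streams a file through the hash in fixed-size blocks so large artifacts never need to fit in memory. A self-contained sketch; the `_BLOCK_SIZE` value is an assumption, since the real constant is defined elsewhere in the script:

```python
import hashlib
import tempfile
from pathlib import Path

_BLOCK_SIZE = 65536  # assumed chunk size; the repository defines its own

def calculate_sha256(filepath: Path) -> str:
    # Read the file in _BLOCK_SIZE chunks, feeding each into the
    # running digest, and return the final hex string.
    sha256 = hashlib.sha256()
    with open(filepath, "rb") as f:
        while True:
            data = f.read(_BLOCK_SIZE)
            if not data:
                break
            sha256.update(data)
    return sha256.hexdigest()
```

The result is identical to hashing the whole file at once, regardless of block size.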
aws-deadline_deadline-cloud | public | public | 0 | 0 | rglob_files_only | def rglob_files_only(basepath: Path, glob: str) -> list[Path]:"""call rglob on a given path, but filter out anything that isn't a file"""return [filepath for filepath in basepath.rglob(glob) if filepath.is_file()] | 3 | 2 | 2 | 36 | 0 | 154 | 158 | 154 | basepath,glob | [] | list[Path] | {"Expr": 1, "Return": 1} | 2 | 5 | 2 | ["basepath.rglob", "filepath.is_file"] | 0 | [] | The function (rglob_files_only) is defined within the public class called public. The function starts at line 154 and ends at line 158. It contains 2 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters (basepath, glob) and returns a list[Path]. It makes 2 function calls: ["basepath.rglob", "filepath.is_file"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _re_raise | def _re_raise(e: Exception) -> None:raise e | 1 | 2 | 1 | 11 | 0 | 161 | 162 | 161 | e | [] | None | {} | 0 | 2 | 0 | [] | 0 | [] | The function (_re_raise) is defined within the public class called public. The function starts at line 161 and ends at line 162. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (e) and does not return a value. It makes no function calls and has no callers. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | enumerate_directory | def enumerate_directory(dirpath: Path) -> list[Path]:result = []for root, dirs, files in dirpath.walk(follow_symlinks=False, on_error=_re_raise):for name in files:result.append((root / name).relative_to(dirpath))return result | 3 | 6 | 1 | 58 | 1 | 165 | 170 | 165 | dirpath | ['result'] | list[Path] | {"Assign": 1, "Expr": 1, "For": 2, "Return": 1} | 3 | 6 | 3 | ["dirpath.walk", "result.append", "relative_to"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.Allowlist.get_allowed_files"] | The function (enumerate_directory) is defined within the public class called public. The function starts at line 165 and ends at line 170. It contains 6 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (dirpath) and returns a list[Path]. It makes 3 function calls: ["dirpath.walk", "result.append", "relative_to"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.Allowlist.get_allowed_files"]. |
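The enumerate_directory cell above relies on `Path.walk`, which is only available on Python 3.12+. A portable sketch of the same recursive listing using `os.walk`; note the original re-raises walk errors via its `on_error` callback, which this simplified version omits:

```python
import os
import tempfile
from pathlib import Path

def enumerate_directory(dirpath: Path) -> list[Path]:
    # Collect every file under dirpath as a path relative to dirpath,
    # without following symlinked directories.
    result: list[Path] = []
    for root, _dirs, files in os.walk(dirpath, followlinks=False):
        for name in files:
            result.append((Path(root) / name).relative_to(dirpath))
    return result
```

Returning paths relative to the root is what lets the allowlist compare entries regardless of where the archive was extracted.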
aws-deadline_deadline-cloud | public | public | 0 | 0 | rglob_file_list | def rglob_file_list(files: list[Path], glob: str) -> list[Path]:result = []for filepath in files:if fnmatch.fnmatch(str(filepath), glob):result.append(filepath)return result | 3 | 6 | 2 | 49 | 1 | 173 | 178 | 173 | files,glob | ['result'] | list[Path] | {"Assign": 1, "Expr": 1, "For": 1, "If": 1, "Return": 1} | 3 | 6 | 3 | ["fnmatch.fnmatch", "str", "result.append"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.Allowlist.get_allowed_files"] | The function (rglob_file_list) is defined within the public class called public. The function starts at line 173 and ends at line 178. It contains 6 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters (files, glob) and returns a list[Path]. It makes 3 function calls: ["fnmatch.fnmatch", "str", "result.append"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.Allowlist.get_allowed_files"]. |
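The rglob_file_list cell above applies a glob pattern to an in-memory list of paths instead of re-walking the filesystem. A compact runnable sketch; note that `fnmatch` treats `*` as matching any characters, including path separators, which is looser than `Path.rglob` semantics:

```python
import fnmatch
from pathlib import Path

def rglob_file_list(files: list[Path], glob: str) -> list[Path]:
    # Keep the paths whose string form matches the fnmatch-style
    # pattern; order is preserved from the input list.
    return [fp for fp in files if fnmatch.fnmatch(str(fp), glob)]
```

This looser `*` behaviour is why `"*.py"` below also matches a file nested one directory deep.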
aws-deadline_deadline-cloud | Allowlist | public | 0 | 0 | from_dict | def from_dict(raw: dict[str, Any]) -> "Allowlist":files = []globs = []for dep in [*DEPENDENCIES, *STANDARD_LIBRARY_MODULES]:files.extend([Path(dep),Path(f"{dep}.pyc"),])globs.extend([f"{dep}.*",f"{dep}/*",f"**/{dep}.pyd",f"**/_{dep}.pyd",f"_internal/cli/_internal/{dep}-*.dist-info/**/*",f"_internal/cli/_internal/{dep}-*.dist-info/*",f"_internal/cli/_internal/{dep}/**/*",f"_internal/cli/_internal/{dep}/*",])files.extend([Path(file) for file in raw.get("files", [])])globs.extend(raw.get("globs", []))conditions = ({Path(filepath): AllowlistCondition(sha256=condition.get("sha256", None),archive_contents=(Allowlist.from_dict(condition["archive_contents"])if "archive_contents" in conditionelse None),)for filepath, condition in raw["conditions"].items()}if "conditions" in rawelse {})return Allowlist(files,globs,conditions,) | 6 | 44 | 1 | 194 | 0 | 218 | 262 | 218 | raw | [] | 'Allowlist' | {"Assign": 3, "Expr": 4, "For": 1, "Return": 1} | 15 | 45 | 15 | ["files.extend", "Path", "Path", "globs.extend", "files.extend", "Path", "raw.get", "globs.extend", "raw.get", "Path", "AllowlistCondition", "condition.get", "Allowlist.from_dict", "items", "Allowlist"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.Allowlist.from_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95188692_yurujaja_pangaea_bench.pangaea.datasets.pastis_py.prepare_dates"] | The function (from_dict) is defined within the public class called Allowlist. The function starts at line 218 and ends at line 262. It contains 44 lines of code and has a cyclomatic complexity of 6. It takes 1 parameter (raw) and returns an 'Allowlist'. It makes 15 function calls: ["files.extend", "Path", "Path", "globs.extend", "files.extend", "Path", "raw.get", "globs.extend", "raw.get", "Path", "AllowlistCondition", "condition.get", "Allowlist.from_dict", "items", "Allowlist"]. It has 3 functions calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.Allowlist.from_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95188692_yurujaja_pangaea_bench.pangaea.datasets.pastis_py.prepare_dates"]. |
aws-deadline_deadline-cloud | Allowlist | public | 0 | 0 | get_allowed_files | def get_allowed_files(self, root: Path) -> AllowlistReport:all_files = enumerate_directory(root)allowed_files = []remaining_files = set(all_files)for file in all_files:if file in self.files:allowed_files.append(file)remaining_files.remove(file)for glob in self.globs:globbed = rglob_file_list(list(remaining_files), glob)for file in globbed:remaining_files.remove(file)allowed_files.extend(globbed)if len(remaining_files) == 0:breakreturn AllowlistReport(allowed_files,list(remaining_files),) | 6 | 19 | 2 | 106 | 0 | 264 | 285 | 264 | self,root | [] | AllowlistReport | {"Assign": 4, "Expr": 4, "For": 3, "If": 2, "Return": 1} | 11 | 22 | 11 | ["enumerate_directory", "set", "allowed_files.append", "remaining_files.remove", "rglob_file_list", "list", "remaining_files.remove", "allowed_files.extend", "len", "AllowlistReport", "list"] | 0 | [] | The function (get_allowed_files) is defined within the public class called Allowlist. The function starts at line 264 and ends at line 285. It contains 19 lines of code and has a cyclomatic complexity of 6. It takes 2 parameters (self, root) and returns an AllowlistReport. It makes 11 function calls: ["enumerate_directory", "set", "allowed_files.append", "remaining_files.remove", "rglob_file_list", "list", "remaining_files.remove", "allowed_files.extend", "len", "AllowlistReport", "list"]. |
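The get_allowed_files cell above partitions a directory listing in two passes: exact-path matches first, then glob patterns over whatever remains, stopping early once nothing is left. A simplified standalone sketch of that two-stage check; `partition_allowed` is a hypothetical name, and the plain list/set types stand in for the repository's Allowlist/AllowlistReport classes:

```python
import fnmatch
from pathlib import Path

def partition_allowed(
    all_files: list[Path], exact: set[Path], globs: list[str]
) -> tuple[list[Path], list[Path]]:
    # Pass 1: exact-path allowlist entries.
    allowed = [f for f in all_files if f in exact]
    remaining = [f for f in all_files if f not in exact]
    # Pass 2: glob patterns, applied only to files not yet matched.
    for pattern in globs:
        matched = [f for f in remaining if fnmatch.fnmatch(str(f), pattern)]
        allowed.extend(matched)
        remaining = [f for f in remaining if f not in matched]
        if not remaining:
            break
    return allowed, remaining
```

Anything left in `remaining` is a disallowed file, which the validator reports as a failure.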
aws-deadline_deadline-cloud | public | public | 0 | 0 | _prepend_if_not_none | def _prepend_if_not_none(context: Optional[Path], suffix: Path) -> Path:if context is None:return suffixreturn context / suffix | 2 | 4 | 2 | 27 | 0 | 288 | 291 | 288 | context,suffix | [] | Path | {"If": 1, "Return": 2} | 0 | 4 | 0 | [] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py._get_extraction_manager", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.apply_allowlist"] | The function (_prepend_if_not_none) is defined within the public class called public. The function starts at line 288 and ends at line 291. It contains 4 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (context, suffix) and returns a Path. It makes no function calls. It has 2 functions calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py._get_extraction_manager", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.apply_allowlist"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_extraction_manager | def _get_extraction_manager(archive_path: Path, context: Optional[Path]) -> Callable[[Path], Generator[Path, None, None]]:extension = archive_path.suffixif extension == ".zip":return temp_zip_contentselif extension == ".pyz" or extension == ".exe" or not extension:return temp_pyinstaller_archive_contentselse:raise RuntimeError(f"Cannot extract from archive of unknown type: {_prepend_if_not_none(context, archive_path.name)}") | 5 | 12 | 2 | 64 | 1 | 294 | 305 | 294 | archive_path,context | ['extension'] | Callable[[Path], Generator[Path, None, None]] | {"Assign": 1, "If": 2, "Return": 2} | 2 | 12 | 2 | ["RuntimeError", "_prepend_if_not_none"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.apply_allowlist"] | The function (_get_extraction_manager) is defined within the public class called public. The function starts at line 294 and ends at line 305. It contains 12 lines of code and has a cyclomatic complexity of 5. It takes 2 parameters (archive_path, context) and returns a Callable[[Path], Generator[Path, None, None]]. It makes 2 function calls: ["RuntimeError", "_prepend_if_not_none"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.apply_allowlist"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | apply_allowlist | def apply_allowlist(archive_path: Path, allowlist: Allowlist, context: Optional[Path]) -> list[str]:print(f"Extracting: {archive_path}")with _get_extraction_manager(archive_path, context)(archive_path) as temp:allowlist_report = allowlist.get_allowed_files(temp)failures = [f"Found file {_prepend_if_not_none(context, dissallowed)} which was not included in the allowlist"for dissallowed in allowlist_report.dissallowed_files]for relative in allowlist_report.allowed_files:if relative in allowlist.conditions:condition = allowlist.conditions[relative]if condition.sha256 is not None:calculated = calculate_sha256(temp / relative)if calculated != condition.sha256:failures.append(f"File {_prepend_if_not_none(context, relative)} has sha256 {calculated} which does not match the expected {condition.sha256}")if condition.archive_contents is not None:failures.extend(apply_allowlist(temp / relative,condition.archive_contents,_prepend_if_not_none(context, relative),))return failures | 7 | 26 | 3 | 146 | 4 | 308 | 334 | 308 | archive_path,allowlist,context | ['calculated', 'failures', 'allowlist_report', 'condition'] | list[str] | {"Assign": 4, "Expr": 3, "For": 1, "If": 4, "Return": 1, "With": 1} | 10 | 27 | 10 | ["print", "_get_extraction_manager", "allowlist.get_allowed_files", "_prepend_if_not_none", "calculate_sha256", "failures.append", "_prepend_if_not_none", "failures.extend", "apply_allowlist", "_prepend_if_not_none"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.apply_allowlist", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.pyinstaller.validate_py.cli"] | The function (apply_allowlist) defined within the public class called public.The function start at line 308 and ends at 334. 
It contains 26 lines of code and has a cyclomatic complexity of 7. It takes 3 parameters (archive_path, allowlist, context) and returns a list[str]. It makes the 10 calls listed in outgoing_function_names, including a recursive call to apply_allowlist for nested archive contents, and it is called by the 2 functions listed in incoming_function_names. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | cli | def cli() -> None:parser = argparse.ArgumentParser()parser.add_argument("--zip-path",type=Path,required=False,help="The path to where the pyinstaller output zip can be found.",)args = parser.parse_args()repository_root = Path(__file__).parent.parent.parentif args.zip_path is None:zip_path = repository_root / "dist" / "deadline-client-exe.zip"else:zip_path = args.zip_pathallowlist = Allowlist.from_dict(ALLOWLIST)failures = apply_allowlist(zip_path, allowlist, None)for failure in failures:print(failure)if len(failures) != 0:exit(1)print("Validation passed") | 4 | 21 | 0 | 115 | 6 | 337 | 362 | 337 | ['parser', 'args', 'zip_path', 'repository_root', 'allowlist', 'failures'] | None | {"Assign": 7, "Expr": 4, "For": 1, "If": 2} | 10 | 26 | 10 | ["argparse.ArgumentParser", "parser.add_argument", "parser.parse_args", "Path", "Allowlist.from_dict", "apply_allowlist", "print", "len", "exit", "print"] | 154 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_resources", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_keys_py.test_change_host_key", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.change_one_namespace_lb_group", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.create_namespaces", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.test_change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.try_change_one_namespace_lb_group_no_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_ns_visibility_py.test_change_namespace_visibility", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_force_tls_py.TestForceTls.test_force_tls", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_all_hosts_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_discovery_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_invalid_nqn", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_subsys_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_no_access", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_subsystem_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_wrong_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_junk_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_nsid", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_uuid"] | The function (cli) defined within the public class called public.The function start at line 337 and ends at 362. It contains 21 lines of code and it has a cyclomatic complexity of 4. The function does not take any parameters and does not return any value. 
It declares 10.0 functions, It has 10.0 functions called inside which are ["argparse.ArgumentParser", "parser.add_argument", "parser.parse_args", "Path", "Allowlist.from_dict", "apply_allowlist", "print", "len", "exit", "print"], It has 154.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_resources", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_keys_py.test_change_host_key", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.change_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.create_namespaces", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.test_change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.try_change_one_namespace_lb_group_no_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.verify_one_namespace_lb_group", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_ns_visibility_py.test_change_namespace_visibility", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_force_tls_py.TestForceTls.test_force_tls", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_all_hosts_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_discovery_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_invalid_nqn", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_subsys_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_no_access", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_subsystem_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_wrong_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_junk_host_to_namespace", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_nsid", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_uuid"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_tool_definition | def get_tool_definition(tool_name: str) -> ToolDefinition:"""Get the definition for a specific tool."""if tool_name not in TOOL_REGISTRY:raise ValueError(f"Tool '{tool_name}' not found in registry")return TOOL_REGISTRY[tool_name] | 2 | 4 | 1 | 27 | 0 | 18 | 22 | 18 | tool_name | [] | ToolDefinition | {"Expr": 1, "If": 1, "Return": 1} | 1 | 5 | 1 | ["ValueError"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline._mcp.utils_py.register_api_tools", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_mcp.test_mcp_server_integration_py.test_mcp_server_integration"] | The function (get_tool_definition) is defined at module level (the class_name column holds only the access modifier "public"). It starts at line 18 and ends at line 22, contains 4 lines of code, and has a cyclomatic complexity of 2. It takes 1 parameter (tool_name) and returns a ToolDefinition. It makes 1 call (ValueError), and it is called by the 2 functions listed in incoming_function_names. |
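A self-contained sketch of the registry lookup, with a hypothetical TOOL_REGISTRY entry and a Dict alias standing in for the real module-level registry and ToolDefinition type:

```python
from typing import Any, Dict

# Hypothetical stand-ins for the real ToolDefinition / TOOL_REGISTRY.
ToolDefinition = Dict[str, Any]
TOOL_REGISTRY: Dict[str, ToolDefinition] = {
    "list_farms": {"func": print, "param_names": None},
}


def get_tool_definition(tool_name: str) -> ToolDefinition:
    """Get the definition for a specific tool."""
    if tool_name not in TOOL_REGISTRY:
        raise ValueError(f"Tool '{tool_name}' not found in registry")
    return TOOL_REGISTRY[tool_name]
```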
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_all_tool_names | def get_all_tool_names() -> List[str]:"""Get all registered tool names."""return list(TOOL_REGISTRY.keys()) | 1 | 2 | 0 | 19 | 0 | 25 | 27 | 25 | [] | List[str] | {"Expr": 1, "Return": 1} | 2 | 3 | 2 | ["list", "TOOL_REGISTRY.keys"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline._mcp.utils_py.register_api_tools"] | The function (get_all_tool_names) is defined at module level (the class_name column holds only the access modifier "public"). It starts at line 25 and ends at line 27, contains 2 lines of code, and has a cyclomatic complexity of 1. It takes no parameters and returns a List[str]. It makes the 2 calls listed in outgoing_function_names (list, TOOL_REGISTRY.keys), and it is called by the 1 function listed in incoming_function_names. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | main | def main():app.run() | 1 | 2 | 0 | 9 | 0 | 12 | 13 | 12 | [] | None | {"Expr": 1} | 1 | 2 | 1 | ["app.run"] | 134 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"] | The function (main) defined within the public class called public.The function start at line 12 and ends at 13. It contains 2 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. 
It declare 1.0 function, It has 1.0 function called inside which is ["app.run"], It has 134.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"]. | |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _default_serializer | def _default_serializer(obj: Any) -> Any:"""Default serializer for API responses."""if hasattr(obj, "__dict__"):return obj.__dict__elif hasattr(obj, "to_dict"):return obj.to_dict()return str(obj) | 3 | 6 | 1 | 41 | 0 | 19 | 25 | 19 | obj | [] | Any | {"Expr": 1, "If": 2, "Return": 3} | 4 | 7 | 4 | ["hasattr", "hasattr", "obj.to_dict", "str"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestUtilityFunctions.test_default_serializer"] | The function (_default_serializer) is defined at module level (the class_name column holds only the access modifier "public"). It starts at line 19 and ends at line 25, contains 6 lines of code, and has a cyclomatic complexity of 3. It takes 1 parameter (obj) and returns Any. It makes the 4 calls listed in outgoing_function_names, and it is called by the 1 function listed in incoming_function_names. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _default_error_handler | def _default_error_handler(e: Exception) -> Dict:"""Default error handler for API calls."""error_info = {"error": str(e), "type": type(e).__name__}if hasattr(e, "response") and hasattr(e.response, "status_code"):error_info["status_code"] = e.response.status_codelogger.error(f"API tool error: {error_info}", exc_info=True)return error_info | 3 | 6 | 1 | 69 | 1 | 28 | 34 | 28 | e | ['error_info'] | Dict | {"Assign": 2, "Expr": 2, "If": 1, "Return": 1} | 5 | 7 | 5 | ["str", "type", "hasattr", "hasattr", "logger.error"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestUtilityFunctions.test_default_error_handler"] | The function (_default_error_handler) is defined at module level (the class_name column holds only the access modifier "public"). It starts at line 28 and ends at line 34, contains 6 lines of code, and has a cyclomatic complexity of 3. It takes 1 parameter (e) and returns a Dict. It makes the 5 calls listed in outgoing_function_names, and it is called by the 1 function listed in incoming_function_names. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _create_wrapper.wrapper | def wrapper(**kwargs) -> dict:start_t = time.perf_counter_ns()success = Trueerror_type = Nonetry:# Filter out empty/null values that MCP clients might sendfiltered_kwargs = {k: v for k, v in kwargs.items() if v not in (None, "", "null")}# Type conversion now handled by preserving original function annotations# Add config parameter if function accepts itif has_config_param:filtered_kwargs["config"] = Noneresult = json.loads(json.dumps(func(**filtered_kwargs), default=serializer))except Exception as e:success = Falseerror_type = type(e).__name__result = error_handler(e)# Record telemetry datatry:telemetry_client = get_deadline_cloud_library_telemetry_client()latency = time.perf_counter_ns() - start_ttelemetry_client.record_event(event_type="com.amazon.rum.deadline.mcp.latency",event_details={"latency": latency,"tool_name": wrapper.__name__,"usage_mode": "MCP",},)telemetry_client.record_event(event_type="com.amazon.rum.deadline.mcp.usage",event_details={"tool_name": wrapper.__name__,"is_success": success,"error_type": error_type,"usage_mode": "MCP",},)except Exception as telemetry_error:logger.debug(f"Failed to record telemetry for MCP tool {func.__name__}: {telemetry_error}")return result | 6 | 38 | 1 | 193 | 0 | 68 | 117 | 68 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_create_wrapper.wrapper) defined within the public class called public.The function start at line 68 and ends at 117. It contains 38 lines of code and it has a cyclomatic complexity of 6. The function does not take any parameters and does not return any value.. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _create_wrapper | def _create_wrapper(config: ToolDefinition, serializer: Callable, error_handler: Callable) -> Callable:"""Create a wrapper function based on the tool configuration."""func = config["func"]param_names = config["param_names"]# Inspect signature once and reusefunc_sig = inspect.signature(func)if param_names is None:param_names = [p for p in func_sig.parameters.keys() if p != "config"]# Check if function accepts 'config' parameterhas_config_param = "config" in func_sig.parameterssignature = inspect.Signature([inspect.Parameter(name,inspect.Parameter.KEYWORD_ONLY,default=None,annotation=func_sig.parameters[name].annotationif name in func_sig.parameterselse str,)for name in param_names],return_annotation=dict,)def wrapper(**kwargs) -> dict:start_t = time.perf_counter_ns()success = Trueerror_type = Nonetry:# Filter out empty/null values that MCP clients might sendfiltered_kwargs = {k: v for k, v in kwargs.items() if v not in (None, "", "null")}# Type conversion now handled by preserving original function annotations# Add config parameter if function accepts itif has_config_param:filtered_kwargs["config"] = Noneresult = json.loads(json.dumps(func(**filtered_kwargs), default=serializer))except Exception as e:success = Falseerror_type = type(e).__name__result = error_handler(e)# Record telemetry datatry:telemetry_client = get_deadline_cloud_library_telemetry_client()latency = time.perf_counter_ns() - start_ttelemetry_client.record_event(event_type="com.amazon.rum.deadline.mcp.latency",event_details={"latency": latency,"tool_name": wrapper.__name__,"usage_mode": "MCP",},)telemetry_client.record_event(event_type="com.amazon.rum.deadline.mcp.usage",event_details={"tool_name": wrapper.__name__,"is_success": success,"error_type": error_type,"usage_mode": "MCP",},)except Exception as telemetry_error:logger.debug(f"Failed to record telemetry for MCP tool {func.__name__}: {telemetry_error}")return 
resultwrapper.__signature__ = signature# type: ignore[attr-defined]return wrapper | 6 | 26 | 3 | 132 | 12 | 37 | 120 | 37 | config,serializer,error_handler | ['success', 'has_config_param', 'result', 'telemetry_client', 'signature', 'func_sig', 'param_names', 'func', 'filtered_kwargs', 'latency', 'error_type', 'start_t'] | Callable | {"Assign": 18, "Expr": 4, "If": 2, "Return": 2, "Try": 2} | 16 | 84 | 16 | ["inspect.signature", "func_sig.parameters.keys", "inspect.Signature", "inspect.Parameter", "time.perf_counter_ns", "kwargs.items", "json.loads", "json.dumps", "func", "type", "error_handler", "get_deadline_cloud_library_telemetry_client", "time.perf_counter_ns", "telemetry_client.record_event", "telemetry_client.record_event", "logger.debug"] | 12 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline._mcp.utils_py.register_api_tools", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestParameterTypeConversion.test_signature_preserves_original_annotations", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestUtilityFunctions.test_create_wrapper", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_aws_error_handling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_network_error_handling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_parameter_filtering", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_client_initialization_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_error_handling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_event_structure", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_failure", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_queue_integration", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_success"] | The function (_create_wrapper) defined within the public class called public.The function start at line 37 and ends at 120. It contains 26 lines of code and it has a cyclomatic complexity of 6. It takes 3 parameters, represented as [37.0] and does not return any value. 
It declares 16.0 functions, It has 16.0 functions called inside which are ["inspect.signature", "func_sig.parameters.keys", "inspect.Signature", "inspect.Parameter", "time.perf_counter_ns", "kwargs.items", "json.loads", "json.dumps", "func", "type", "error_handler", "get_deadline_cloud_library_telemetry_client", "time.perf_counter_ns", "telemetry_client.record_event", "telemetry_client.record_event", "logger.debug"], It has 12.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline._mcp.utils_py.register_api_tools", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestParameterTypeConversion.test_signature_preserves_original_annotations", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestUtilityFunctions.test_create_wrapper", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_aws_error_handling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_network_error_handling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_parameter_filtering", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_client_initialization_error", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_error_handling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_event_structure", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_failure", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_queue_integration", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_telemetry_py.test_mcp_tool_telemetry_success"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | register_api_tools | def register_api_tools(app: FastMCP,tools: Optional[List[Callable]] = None,prefix: str = "",error_handler: Optional[Callable] = None,serializer: Optional[Callable] = None,) -> None:"""Register API tools with the MCP server.Args:app: FastMCP application instancetools: Optional list of specific functions to register. If None, registers all configured tools.prefix: Prefix to add to tool nameserror_handler: Optional custom error handlerserializer: Optional custom serializer"""error_handler = error_handler or _default_error_handlerserializer = serializer or _default_serializerif tools is not None:# Register only the specified toolsfor func in tools:if not callable(func):raise ValueError(f"Tool {func} is not callable")if hasattr(func, "_mcp_tool_registered"):continue# Already registered, skip# Find the tool name for this function in the registrytool_name = Nonefor name in get_all_tool_names():config = get_tool_definition(name)if config["func"] == func:tool_name = namebreakif tool_name is None:raise ValueError(f"Function {func.__name__} not found in tool registry")config = get_tool_definition(tool_name)wrapper = _create_wrapper(config, serializer, error_handler)wrapper.__name__ = tool_namedescription = func.__doc__wrapper.__doc__ = descriptionapp.tool(name=f"{prefix}{tool_name}", description=description)(wrapper)func._mcp_tool_registered = True# type: ignore[attr-defined]else:# Register all configured toolsfor tool_name in get_all_tool_names():config = get_tool_definition(tool_name)func = config["func"]if not callable(func) or hasattr(func, "_mcp_tool_registered"):continuewrapper = _create_wrapper(config, serializer, error_handler)wrapper.__name__ = tool_namedescription = func.__doc__wrapper.__doc__ = descriptionapp.tool(name=f"{prefix}{tool_name}", description=description)(wrapper)func._mcp_tool_registered = True# type: ignore[attr-defined] | 13 | 42 | 5 | 263 | 7 | 123 | 187 | 123 | 
app,tools,prefix,error_handler,serializer | ['config', 'wrapper', 'description', 'tool_name', 'func', 'serializer', 'error_handler'] | None | {"Assign": 18, "Expr": 3, "For": 3, "If": 6} | 15 | 65 | 15 | ["callable", "ValueError", "hasattr", "get_all_tool_names", "get_tool_definition", "ValueError", "get_tool_definition", "_create_wrapper", "app.tool", "get_all_tool_names", "get_tool_definition", "callable", "hasattr", "_create_wrapper", "app.tool"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestRegisterAPITools.test_basic_registration", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestRegisterAPITools.test_invalid_tools_raise_exception", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestRegisterAPITools.test_unregistered_function_raises_exception"] | The function (register_api_tools) is defined within the public class called public. The function starts at line 123 and ends at line 187. It contains 42 lines of code and has a cyclomatic complexity of 13. It takes 5 parameters, represented as [123.0], and does not return any value. 
It calls 15 functions inside, which are ["callable", "ValueError", "hasattr", "get_all_tool_names", "get_tool_definition", "ValueError", "get_tool_definition", "_create_wrapper", "app.tool", "get_all_tool_names", "get_tool_definition", "callable", "hasattr", "_create_wrapper", "app.tool"], and it has 3 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestRegisterAPITools.test_basic_registration", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestRegisterAPITools.test_invalid_tools_raise_exception", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_mcp.test_mcp_py.TestRegisterAPITools.test_unregistered_function_raises_exception"]. |
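The register_api_tools row above validates that each tool is callable, looks the function up in a name-to-function registry, and marks it with an `_mcp_tool_registered` attribute so repeated registration is a no-op. A runnable sketch of that control flow; `_REGISTRY`, `register_tools`, and `list_farms` are hypothetical stand-ins for the module's registry helpers:

```python
from typing import Callable, Dict

# Hypothetical stand-in for the module's name -> function tool registry.
_REGISTRY: Dict[str, Callable] = {}


def register_tools(app_tools: dict, tools=None, prefix=""):
    """Sketch of the registration flow: validate callables, require each
    explicit tool to exist in the registry, and skip already-marked ones."""
    if tools is not None:
        for func in tools:
            if not callable(func):
                raise ValueError(f"Tool {func} is not callable")
            if func not in _REGISTRY.values():
                raise ValueError(f"Function {func.__name__} not found in tool registry")
    for name, func in _REGISTRY.items():
        if tools is not None and func not in tools:
            continue
        if getattr(func, "_mcp_tool_registered", False):
            continue  # already registered, skip
        app_tools[f"{prefix}{name}"] = func
        func._mcp_tool_registered = True


def list_farms():
    """List farms in the account."""
    return []


_REGISTRY["list_farms"] = list_farms
registered = {}
register_tools(registered, prefix="deadline_")
register_tools(registered, prefix="deadline_")  # second call is a no-op
```

The attribute marker on the function object itself is what makes double registration idempotent, matching the `hasattr(func, "_mcp_tool_registered")` check in the original.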
aws-deadline_deadline-cloud | public | public | 0 | 0 | submit_job | def submit_job(job_bundle_dir: str,job_parameters: Optional[str] = None,name: Optional[str] = None,farm_id: Optional[str] = None,queue_id: Optional[str] = None,storage_profile_id: Optional[str] = None,priority: Optional[int] = 50,max_failed_tasks_count: Optional[int] = None,max_retries_per_task: Optional[int] = None,max_worker_count: Optional[int] = None,job_attachments_file_system: Optional[str] = None,require_paths_exist: bool = False,submitter_name: Optional[str] = None,known_asset_paths: Optional[str] = None,) -> Dict[str, Any]:"""Submit an Open Job Description job bundle to AWS Deadline Cloud.Args:job_bundle_dir: Path to the job bundle directory containing template.json/yamljob_parameters: JSON string of job parameters in format [{"name": "param_name", "value": "param_value"}]name: Job name to override the one in the bundlefarm_id: Farm ID to submit to (uses default if not provided)queue_id: Queue ID to submit to (uses default if not provided)storage_profile_id: Storage profile ID to usepriority: Job priority (1-100, default 50)max_failed_tasks_count: Maximum failed tasks before job failsmax_retries_per_task: Maximum retries per taskmax_worker_count: Maximum worker count for the jobjob_attachments_file_system: File system type (COPIED or VIRTUAL)require_paths_exist: Return error if input files are missingsubmitter_name: Name of the submitting applicationknown_asset_paths: JSON array of paths that shouldn't generate warningsReturns:Dictionary containing job_id and submission status"""start_time = time.time()if not os.path.exists(job_bundle_dir):raise ValueError(f"Job bundle directory does not exist: {job_bundle_dir}")if not os.path.isdir(job_bundle_dir):raise ValueError(f"Path is not a directory: {job_bundle_dir}")# Parse job parametersparsed_job_parameters = []if job_parameters:parsed_job_parameters = json.loads(job_parameters)if not isinstance(parsed_job_parameters, list):raise 
ValueError("job_parameters must be a JSON array of objects with 'name' and 'value' keys")# Parse known asset pathsparsed_known_asset_paths = []if known_asset_paths:parsed_known_asset_paths = json.loads(known_asset_paths)if not isinstance(parsed_known_asset_paths, list):raise ValueError("known_asset_paths must be a JSON array of strings")# Read config and use parameter values if provided, otherwise fall back to config defaultsconfig = config_file.read_config()farm_id = farm_id or config.get("defaults", "farm_id", fallback=None)queue_id = queue_id or config.get("defaults", "queue_id", fallback=None)storage_profile_id = storage_profile_id or config.get("defaults", "storage_profile_id", fallback=None)if not farm_id:raise ValueError("farm_id is required")if not queue_id:raise ValueError("queue_id is required")config.set("defaults", "farm_id", farm_id)config.set("defaults", "queue_id", queue_id)if storage_profile_id:config.set("defaults", "storage_profile_id", storage_profile_id)# Submit the jobjob_id = create_job_from_job_bundle(job_bundle_dir=job_bundle_dir,job_parameters=parsed_job_parameters,name=name,config=config,priority=priority,max_failed_tasks_count=max_failed_tasks_count,max_retries_per_task=max_retries_per_task,max_worker_count=max_worker_count,job_attachments_file_system=job_attachments_file_system,require_paths_exist=require_paths_exist,submitter_name=submitter_name or "MCP",known_asset_paths=parsed_known_asset_paths,)total_time = time.time() - start_timereturn {"status": "success","job_id": job_id,"message": f"Successfully submitted job bundle from {job_bundle_dir}","total_time_seconds": round(total_time, 1),} | 14 | 68 | 14 | 424 | 9 | 21 | 125 | 21 | job_bundle_dir,job_parameters,name,farm_id,queue_id,storage_profile_id,priority,max_failed_tasks_count,max_retries_per_task,max_worker_count,job_attachments_file_system,require_paths_exist,submitter_name,known_asset_paths | ['config', 'farm_id', 'storage_profile_id', 'total_time', 'job_id', 
'parsed_job_parameters', 'start_time', 'queue_id', 'parsed_known_asset_paths'] | Dict[str, Any] | {"Assign": 11, "Expr": 4, "If": 9, "Return": 1} | 23 | 105 | 23 | ["time.time", "os.path.exists", "ValueError", "os.path.isdir", "ValueError", "json.loads", "isinstance", "ValueError", "json.loads", "isinstance", "ValueError", "config_file.read_config", "config.get", "config.get", "config.get", "ValueError", "ValueError", "config.set", "config.set", "config.set", "create_job_from_job_bundle", "time.time", "round"] | 0 | [] | The function (submit_job) is defined within the public class called public. The function starts at line 21 and ends at line 125. It contains 68 lines of code and has a cyclomatic complexity of 14. It takes 14 parameters, represented as [21.0], and returns a Dict[str, Any]. It calls 23 functions inside, which are ["time.time", "os.path.exists", "ValueError", "os.path.isdir", "ValueError", "json.loads", "isinstance", "ValueError", "json.loads", "isinstance", "ValueError", "config_file.read_config", "config.get", "config.get", "config.get", "ValueError", "ValueError", "config.set", "config.set", "config.set", "create_job_from_job_bundle", "time.time", "round"]. |
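Per the docstring in the submit_job row above, `job_parameters` arrives as a JSON string in the shape `[{"name": ..., "value": ...}]` and is rejected unless it decodes to a list. A small runnable sketch of that parsing step; `parse_job_parameters` is an illustrative helper name, not part of the library:

```python
import json
from typing import List, Optional


def parse_job_parameters(job_parameters: Optional[str]) -> List[dict]:
    """Sketch of submit_job's parameter handling: decode a JSON array of
    {"name": ..., "value": ...} objects, or return [] when omitted."""
    if not job_parameters:
        return []
    parsed = json.loads(job_parameters)
    if not isinstance(parsed, list):
        raise ValueError(
            "job_parameters must be a JSON array of objects with 'name' and 'value' keys"
        )
    return parsed


params = parse_job_parameters('[{"name": "Frames", "value": "1-10"}]')
empty = parse_job_parameters(None)
```

The same decode-then-`isinstance` pattern is applied to `known_asset_paths` in the original, which must be a JSON array of strings.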
aws-deadline_deadline-cloud | public | public | 0 | 0 | download_job_output | def download_job_output(farm_id: Optional[str] = None,queue_id: Optional[str] = None,job_id: Optional[str] = None,step_id: Optional[str] = None,task_id: Optional[str] = None,conflict_resolution: Optional[str] = None,) -> Dict[str, Any]:"""Download job output files from AWS Deadline Cloud.Args:farm_id: Farm ID (uses default if not provided)queue_id: Queue ID (uses default if not provided)job_id: Job ID to download output fromstep_id: Optional step ID to download output from specific steptask_id: Optional task ID to download output from specific task (requires step_id)conflict_resolution: How to handle file conflicts - SKIP, OVERWRITE, or CREATE_COPY (default)Returns:Dictionary containing download status and summary"""start_time = time.time()if task_id and not step_id:raise ValueError("step_id is required when task_id is provided")if not job_id:raise ValueError("job_id is required")if conflict_resolution and conflict_resolution.upper() not in ["SKIP","OVERWRITE","CREATE_COPY",]:raise ValueError(f"Invalid conflict_resolution: {conflict_resolution}. 
Must be SKIP, OVERWRITE, or CREATE_COPY")config = config_file.read_config()config.set("defaults", "farm_id", farm_id or config.get("defaults", "farm_id", fallback=""))config.set("defaults", "queue_id", queue_id or config.get("defaults", "queue_id", fallback=""))config.set("defaults", "job_id", job_id)if not config.has_section("settings"):config.add_section("settings")config.set("settings", "auto_accept", "true")if conflict_resolution:config.set("settings", "conflict_resolution", conflict_resolution.upper())captured_output = io.StringIO()with redirect_stdout(captured_output):_download_job_output(config,config.get("defaults", "farm_id"),config.get("defaults", "queue_id"),job_id,step_id,task_id,is_json_format=False,)output_text = captured_output.getvalue()total_time = time.time() - start_timereturn {"status": "success","job_id": job_id,"step_id": step_id,"task_id": task_id,"total_time_seconds": round(total_time, 1),"output": output_text.strip(),} | 10 | 51 | 6 | 321 | 5 | 128 | 198 | 128 | farm_id,queue_id,job_id,step_id,task_id,conflict_resolution | ['config', 'captured_output', 'output_text', 'total_time', 'start_time'] | Dict[str, Any] | {"Assign": 5, "Expr": 8, "If": 5, "Return": 1, "With": 1} | 25 | 71 | 25 | ["time.time", "ValueError", "ValueError", "conflict_resolution.upper", "ValueError", "config_file.read_config", "config.set", "config.get", "config.set", "config.get", "config.set", "config.has_section", "config.add_section", "config.set", "config.set", "conflict_resolution.upper", "io.StringIO", "redirect_stdout", "_download_job_output", "config.get", "config.get", "captured_output.getvalue", "time.time", "round", "output_text.strip"] | 0 | [] | The function (download_job_output) is defined within the public class called public. The function starts at line 128 and ends at line 198. It contains 51 lines of code and has a cyclomatic complexity of 10. It takes 6 parameters, represented as [128.0], and returns a Dict[str, Any]. 
It calls 25 functions inside, which are ["time.time", "ValueError", "ValueError", "conflict_resolution.upper", "ValueError", "config_file.read_config", "config.set", "config.get", "config.set", "config.get", "config.set", "config.has_section", "config.add_section", "config.set", "config.set", "conflict_resolution.upper", "io.StringIO", "redirect_stdout", "_download_job_output", "config.get", "config.get", "captured_output.getvalue", "time.time", "round", "output_text.strip"]. |
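The download_job_output row above runs a CLI-style helper inside `redirect_stdout` so its printed progress can be returned as a string in the result dict. A self-contained sketch of that capture pattern; `run_and_capture` and `fake_download` are illustrative stand-ins for the original's use of `_download_job_output`:

```python
import io
from contextlib import redirect_stdout


def run_and_capture(fn, *args):
    """Sketch of download_job_output's output handling: run a helper that
    prints progress to stdout and return what it printed, stripped."""
    captured = io.StringIO()
    with redirect_stdout(captured):
        fn(*args)
    return captured.getvalue().strip()


def fake_download(job_id):
    # Stand-in for the real _download_job_output, which prints progress.
    print(f"Downloading output for {job_id}")
    print("Done.")


output = run_and_capture(fake_download, "job-123")
```

Capturing stdout this way lets a print-oriented CLI routine be reused behind a structured API without modifying it, at the cost of also swallowing any unrelated prints inside the `with` block.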
aws-deadline_deadline-cloud | DeadlineOperationCanceled | public | 0 | 1 | __init__ | def __init__(self, message: str = "Operation canceled"):super().__init__(message) | 1 | 2 | 2 | 19 | 0 | 15 | 16 | 15 | self,message | [] | None | {"Expr": 1} | 2 | 2 | 2 | ["__init__", "super"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called DeadlineOperationCanceled, that inherit another class.The function start at line 15 and ends at 16. It contains 2 lines of code and it has a cyclomatic complexity of 1. It takes 2 parameters, represented as [15.0] and does not return any value. 
It calls 2 functions inside, which are ["__init__", "super"], and it has 4993 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | DeadlineOperationCanceled | public | 0 | 1 | __init__ | def __init__(self, message: str = "Operation canceled"):super().__init__(message) | 1 | 2 | 2 | 19 | 0 | 22 | 23 | 15 | self,message | [] | None | {"Expr": 1} | 2 | 2 | 2 | ["__init__", "super"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called DeadlineOperationCanceled, that inherit another class.The function start at line 22 and ends at 23. It contains 2 lines of code and it has a cyclomatic complexity of 1. It takes 2 parameters, represented as [15.0] and does not return any value. 
It calls 2 functions inside, which are ["__init__", "super"], and it has 4993 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | DeadlineOperationCanceled | public | 0 | 1 | __init__ | def __init__(self, message: str = "Operation canceled"):super().__init__(message) | 1 | 2 | 2 | 19 | 0 | 29 | 30 | 15 | self,message | [] | None | {"Expr": 1} | 2 | 2 | 2 | ["__init__", "super"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called DeadlineOperationCanceled, that inherit another class.The function start at line 29 and ends at 30. It contains 2 lines of code and it has a cyclomatic complexity of 1. It takes 2 parameters, represented as [15.0] and does not return any value. 
It declares 2.0 functions, It has 2.0 functions called inside which are ["__init__", "super"], It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | DeadlineOperationCanceled | public | 0 | 1 | __init__ | def __init__(self, message: str = "Operation canceled"):super().__init__(message) | 1 | 2 | 2 | 19 | 0 | 36 | 37 | 15 | self,message | [] | None | {"Expr": 1} | 2 | 2 | 2 | ["__init__", "super"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called DeadlineOperationCanceled, that inherit another class.The function start at line 36 and ends at 37. It contains 2 lines of code and it has a cyclomatic complexity of 1. It takes 2 parameters, represented as [15.0] and does not return any value. 
It declares 2.0 functions, It has 2.0 functions called inside which are ["__init__", "super"], It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | check_deadline_api_available | def check_deadline_api_available(config: Optional[ConfigParser] = None) -> bool:"""Returns True if AWS Deadline Cloud APIs are authorized in the session,False otherwise. This only checks the deadline:ListFarms API by performingone call with just one result.Args:config (ConfigParser, optional): The AWS Deadline Cloud configurationobject to use instead of the config file."""import loggingfrom ._session import _modified_logging_levelwith _modified_logging_level(logging.getLogger("botocore.credentials"), logging.ERROR):try:list_farm_params: Dict[str, Any] = {"maxResults": 1}user_id, _ = get_user_and_identity_store_id(config=config)if user_id:list_farm_params["principalId"] = str(user_id)deadline = get_boto3_client("deadline", config=config)deadline.list_farms(**list_farm_params)return Trueexcept Exception:logger.exception("Error invoking ListFarms")return False | 3 | 15 | 1 | 105 | 1 | 98 | 124 | 98 | config | ['deadline'] | bool | {"AnnAssign": 1, "Assign": 3, "Expr": 3, "If": 1, "Return": 2, "Try": 1, "With": 1} | 7 | 27 | 7 | ["_modified_logging_level", "logging.getLogger", "get_user_and_identity_store_id", "str", "get_boto3_client", "deadline.list_farms", "logger.exception"] | 0 | [] | The function (check_deadline_api_available) is a module-level function (no enclosing class). It starts at line 98 and ends at line 124. It contains 15 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (config) and returns a bool. It calls 7 functions inside, which are ["_modified_logging_level", "logging.getLogger", "get_user_and_identity_store_id", "str", "get_boto3_client", "deadline.list_farms", "logger.exception"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_storage_profile_for_queue | def get_storage_profile_for_queue(farm_id: str,queue_id: str,storage_profile_id: str,deadline: Optional[BaseClient] = None,config: Optional[ConfigParser] = None,) -> StorageProfile:if deadline is None:deadline = get_boto3_client("deadline", config=config)storage_profile_response = deadline.get_storage_profile_for_queue(farmId=farm_id, queueId=queue_id, storageProfileId=storage_profile_id)return StorageProfile(storageProfileId=storage_profile_response["storageProfileId"],displayName=storage_profile_response["displayName"],osFamily=StorageProfileOperatingSystemFamily(storage_profile_response["osFamily"]),fileSystemLocations=[FileSystemLocation(name=file_system_location["name"],path=file_system_location["path"],type=FileSystemLocationType(file_system_location["type"]),)for file_system_location in storage_profile_response.get("fileSystemLocations", [])],) | 3 | 25 | 5 | 141 | 2 | 21 | 46 | 21 | farm_id,queue_id,storage_profile_id,deadline,config | ['storage_profile_response', 'deadline'] | StorageProfile | {"Assign": 2, "If": 1, "Return": 1} | 8 | 26 | 8 | ["get_boto3_client", "deadline.get_storage_profile_for_queue", "StorageProfile", "StorageProfileOperatingSystemFamily", "FileSystemLocation", "FileSystemLocationType", "storage_profile_response.get", "api.record_function_latency_telemetry_event"] | 0 | [] | The function (get_storage_profile_for_queue) is a module-level function (no enclosing class). It starts at line 21 and ends at line 46. It contains 25 lines of code and has a cyclomatic complexity of 3. It takes 5 parameters (farm_id, queue_id, storage_profile_id, deadline, config) and returns a StorageProfile. It calls 8 functions inside, which are ["get_boto3_client", "deadline.get_storage_profile_for_queue", "StorageProfile", "StorageProfileOperatingSystemFamily", "FileSystemLocation", "FileSystemLocationType", "storage_profile_response.get", "api.record_function_latency_telemetry_event"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _hash_attachments._default_update_hash_progress | def _default_update_hash_progress(hashing_metadata: ProgressReportMetadata) -> bool:return True | 1 | 2 | 1 | 11 | 0 | 29 | 30 | 29 | hashing_metadata | [] | bool | null | 0 | 0 | 0 | null | 0 | null | The function (_hash_attachments._default_update_hash_progress) is a helper defined inside the module-level function _hash_attachments. It starts at line 29 and ends at line 30. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (hashing_metadata) and always returns True (a bool). |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _hash_attachments | def _hash_attachments(asset_manager: S3AssetManager,asset_groups: List[AssetRootGroup],total_input_files: int,total_input_bytes: int,print_function_callback: Callable = lambda msg: None,hashing_progress_callback: Optional[Callable] = None,config: Optional[ConfigParser] = None,) -> Tuple[SummaryStatistics, List[AssetRootManifest]]:"""Starts the job attachments hashing and handles the progress reportingcallback. Returns a list of the asset manifests of the hashed files."""def _default_update_hash_progress(hashing_metadata: ProgressReportMetadata) -> bool:return Trueif not hashing_progress_callback:hashing_progress_callback = _default_update_hash_progresshashing_summary, manifests = asset_manager.hash_assets_and_create_manifest(asset_groups=asset_groups,total_input_files=total_input_files,total_input_bytes=total_input_bytes,hash_cache_dir=config_file.get_cache_directory(),on_preparing_to_submit=hashing_progress_callback,)api.get_deadline_cloud_library_telemetry_client(config=config).record_hashing_summary(hashing_summary)if hashing_summary.total_files > 0:print_function_callback("Hashing Summary:")print_function_callback(textwrap.indent(str(hashing_summary), ""))else:# Ensure to call the callback once if no files were processedhashing_progress_callback(ProgressReportMetadata(status=ProgressStatus.PREPARING_IN_PROGRESS,progress=100,transferRate=0,progressMessage="No files to hash",))return hashing_summary, manifests | 3 | 35 | 7 | 171 | 1 | 15 | 59 | 15 | asset_manager,asset_groups,total_input_files,total_input_bytes,print_function_callback,hashing_progress_callback,config | ['hashing_progress_callback'] | Tuple[SummaryStatistics, List[AssetRootManifest]] | {"Assign": 2, "Expr": 5, "If": 2, "Return": 2} | 10 | 45 | 10 | ["asset_manager.hash_assets_and_create_manifest", "config_file.get_cache_directory", "record_hashing_summary", "api.get_deadline_cloud_library_telemetry_client", 
"print_function_callback", "print_function_callback", "textwrap.indent", "str", "hashing_progress_callback", "ProgressReportMetadata"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_manifests._create_manifest_py._create_manifest_for_single_root"] | The function (_hash_attachments) defined within the public class called public.The function start at line 15 and ends at 59. It contains 35 lines of code and it has a cyclomatic complexity of 3. It takes 7 parameters, represented as [15.0] and does not return any value. It declares 10.0 functions, It has 10.0 functions called inside which are ["asset_manager.hash_assets_and_create_manifest", "config_file.get_cache_directory", "record_hashing_summary", "api.get_deadline_cloud_library_telemetry_client", "print_function_callback", "print_function_callback", "textwrap.indent", "str", "hashing_progress_callback", "ProgressReportMetadata"], It has 2.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_manifests._create_manifest_py._create_manifest_for_single_root"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | wait_for_job_completion | def wait_for_job_completion(farm_id: str,queue_id: str,job_id: str,max_poll_interval: int = 120,timeout: int = 0,config: Optional[ConfigParser] = None,status_callback: Optional[Callable] = None,) -> JobCompletionResult:"""Wait for a job to complete and return information about its status and any failed tasks.This function blocks until the job's taskRunStatus reaches a terminal state(SUCCEEDED, FAILED, CANCELED, SUSPENDED, or NOT_COMPATIBLE), then returns a JobCompletionResultobject containing the final status and any failed tasks.The function uses exponential backoff for polling, starting at 0.5 seconds and doublingthe interval after each check until it reaches the maximum polling interval.Args:farm_id: The ID of the farm containing the job.queue_id: The ID of the queue containing the job.job_id: The ID of the job to wait for.max_poll_interval: Maximum time in seconds between status checks (default: 120).timeout: Maximum time in seconds to wait (0 for no timeout).config: Optional configuration object.status_callback: Optional callback function that receives the current status during polling.Returns:A JobCompletionResult object containing the job's final status and any failed tasks.Raises:DeadlineOperationError: If the timeout is reached or there's an error retrieving job information."""deadline = get_boto3_client("deadline", config=config)start_time = datetime.datetime.now()terminal_states = ["SUCCEEDED", "FAILED", "CANCELED", "SUSPENDED", "NOT_COMPATIBLE"]status = ""# Initial polling interval of 0.5 secondscurrent_interval = 0.5while True:# Check timeoutif timeout > 0:elapsed = (datetime.datetime.now() - start_time).total_seconds()if elapsed > timeout:raise DeadlineOperationTimedOut(f"Timeout waiting for job {job_id} to complete after {elapsed:.1f} seconds")# Get job statustry:job = deadline.get_job(farmId=farm_id, queueId=queue_id, jobId=job_id)status = job.get("taskRunStatus", "")# 
Call the status callback if provided with elapsed time and timeout infoif status_callback:elapsed = (datetime.datetime.now() - start_time).total_seconds()status_callback(status, elapsed, timeout)except ClientError as exc:raise DeadlineOperationError(f"Failed to get job status: {exc}") from excif status in terminal_states:break# Sleep using current intervaltime.sleep(current_interval)# Exponential backoff with a maximum intervalcurrent_interval = min(current_interval * 2, max_poll_interval)elapsed_time = (datetime.datetime.now() - start_time).total_seconds()# If job failed, collect failed tasksfailed_tasks = []if status != "SUCCEEDED":try:# Get all steps with paginationpaginator = deadline.get_paginator("list_steps")for page in paginator.paginate(farmId=farm_id, queueId=queue_id, jobId=job_id):# For each step, get tasks and filter for failed ones client-sidefor step in page["steps"]:step_id = step["stepId"]step_name = step.get("name", "")# Only query for tasks if the step has any failed tasksif step.get("taskRunStatusCounts", {}).get("FAILED", 0) > 0:# Get all tasks with paginationtask_paginator = deadline.get_paginator("list_tasks")for tasks_page in task_paginator.paginate(farmId=farm_id, queueId=queue_id, jobId=job_id, stepId=step_id):# Filter failed tasks client-sidefor task in tasks_page["tasks"]:if task.get("runStatus") == "FAILED":# Extract session ID from latestSessionActionIdsession_id = Nonelatest_session_action_id = task.get("latestSessionActionId")if latest_session_action_id:# Format is typically "sessionaction-{session_id}-{action_number}"# Extract the session ID partparts = latest_session_action_id.split("-")if len(parts) >= 3 and parts[0] == "sessionaction":session_id = f"session-{parts[1]}"failed_tasks.append(FailedTask(step_id=step_id,task_id=task["taskId"],step_name=step_name,parameters=task.get("parameters", {}),session_id=session_id,))except ClientError as exc:raise DeadlineOperationError(f"Failed to retrieve failed tasks: {exc}") from excreturn 
JobCompletionResult(status=status, failed_tasks=failed_tasks, elapsed_time=elapsed_time) | 18 | 67 | 7 | 476 | 16 | 73 | 193 | 73 | farm_id,queue_id,job_id,max_poll_interval,timeout,config,status_callback | ['job', 'paginator', 'terminal_states', 'elapsed_time', 'deadline', 'failed_tasks', 'current_interval', 'status', 'session_id', 'step_name', 'task_paginator', 'start_time', 'parts', 'step_id', 'elapsed', 'latest_session_action_id'] | JobCompletionResult | {"Assign": 20, "Expr": 4, "For": 4, "If": 9, "Return": 1, "Try": 2, "While": 1} | 31 | 121 | 31 | ["get_boto3_client", "datetime.datetime.now", "total_seconds", "datetime.datetime.now", "DeadlineOperationTimedOut", "deadline.get_job", "job.get", "total_seconds", "datetime.datetime.now", "status_callback", "DeadlineOperationError", "time.sleep", "min", "total_seconds", "datetime.datetime.now", "deadline.get_paginator", "paginator.paginate", "step.get", "get", "step.get", "deadline.get_paginator", "task_paginator.paginate", "task.get", "task.get", "latest_session_action_id.split", "len", "failed_tasks.append", "FailedTask", "task.get", "DeadlineOperationError", "JobCompletionResult"] | 7 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py.IncrementalDownloadTest.wait_for_job_completion", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_exponential_backoff", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_failure", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_max_interval_cap", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_success", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_timeout", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_with_pagination"] | The function (wait_for_job_completion) is defined within the public class called public. The function starts at line 73 and ends at line 193. It contains 67 lines of code and has a cyclomatic complexity of 18. It takes 7 parameters and returns a JobCompletionResult value.
It makes 31 function calls, which are ["get_boto3_client", "datetime.datetime.now", "total_seconds", "datetime.datetime.now", "DeadlineOperationTimedOut", "deadline.get_job", "job.get", "total_seconds", "datetime.datetime.now", "status_callback", "DeadlineOperationError", "time.sleep", "min", "total_seconds", "datetime.datetime.now", "deadline.get_paginator", "paginator.paginate", "step.get", "get", "step.get", "deadline.get_paginator", "task_paginator.paginate", "task.get", "task.get", "latest_session_action_id.split", "len", "failed_tasks.append", "FailedTask", "task.get", "DeadlineOperationError", "JobCompletionResult"]. It is called by 7 functions, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.test_cli_incremental_download_py.IncrementalDownloadTest.wait_for_job_completion", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_exponential_backoff", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_failure", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_max_interval_cap", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_success", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_timeout", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_wait_for_job_completion_with_pagination"]. |
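The exponential backoff schedule described for wait_for_job_completion (start at 0.5 seconds, double after each check, cap at max_poll_interval) can be sketched on its own, without any AWS calls. This is an illustrative model of the polling schedule only; `backoff_intervals` is not a name from the repository.

```python
import itertools


def backoff_intervals(max_poll_interval: float = 120.0, start: float = 0.5):
    """Yield the sleep intervals used between status checks:
    start, 2*start, 4*start, ... capped at max_poll_interval."""
    interval = start
    while True:
        yield interval
        interval = min(interval * 2, max_poll_interval)


# First ten intervals with the defaults from wait_for_job_completion.
first_ten = list(itertools.islice(backoff_intervals(), 10))
```

With the default 120-second cap the interval doubles until it reaches the cap and then stays there, which bounds both the number of early API calls and the worst-case staleness of the reported status.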
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_session_logs | def get_session_logs(farm_id: str,queue_id: str,session_id: str,limit: int = 100,start_time: Optional[datetime.datetime] = None,end_time: Optional[datetime.datetime] = None,next_token: Optional[str] = None,config: Optional[ConfigParser] = None,) -> SessionLogResult:"""Get CloudWatch logs for a specific session.This function retrieves logs from CloudWatch for the specified session ID.By default, it returns the most recent 100 log lines, but this can beadjusted using the limit parameter.Args:farm_id: The ID of the farm containing the session.queue_id: The ID of the queue containing the session.session_id: The ID of the session to get logs for.limit: Maximum number of log lines to return.start_time: Optional start time for logs as a datetime object.end_time: Optional end time for logs as a datetime object.next_token: Optional token for pagination of results.config: Optional configuration object.Returns:A SessionLogResult object containing the log events and metadata.Raises:DeadlineOperationError: If there's an error retrieving the logs."""# Get the Deadline client to use for getting queue credentialsdeadline = get_boto3_client("deadline", config=config)# Check if we have user and identity store ID (from Deadline Cloud monitor)user_id, identity_store_id = get_user_and_identity_store_id(config=config)# Create logs client - either with queue credentials or directlyif user_id and identity_store_id:# Get a session with queue user credentialstry:queue_session = get_queue_user_boto3_session(deadline=deadline, config=config, farm_id=farm_id, queue_id=queue_id)logs_client = queue_session.client("logs")except Exception as e:raise DeadlineOperationError(f"Failed to get queue credentials: {e}")else:# Use the same boto session as for deadlinelogs_client = get_boto3_client("logs", config=config)# Construct the log group namelog_group_name = f"/aws/deadline/{farm_id}/{queue_id}"# Prepare parameters for 
GetLogEventsparams = {"logGroupName": log_group_name,"logStreamName": session_id,"limit": limit,"startFromHead": False,# Get the most recent logs first}# Add next_token if providedif next_token:params["nextToken"] = next_token# Add optional time parameters if providedif start_time:try:# Convert datetime to milliseconds since epochstart_timestamp = int(start_time.timestamp() * 1000)params["startTime"] = start_timestampexcept (ValueError, AttributeError) as e:raise DeadlineOperationError(f"Invalid start time: {e}")if end_time:try:# Convert datetime to milliseconds since epochend_timestamp = int(end_time.timestamp() * 1000)params["endTime"] = end_timestampexcept (ValueError, AttributeError) as e:raise DeadlineOperationError(f"Invalid end time: {e}")try:response = logs_client.get_log_events(**params)# Convert to strongly typed objectsevents = []for event in response.get("events", []):events.append(LogEvent(timestamp=datetime.datetime.fromtimestamp(event["timestamp"] / 1000, tz=datetime.timezone.utc),message=event["message"].rstrip(),ingestion_time=(datetime.datetime.fromtimestamp(event["ingestionTime"] / 1000)if "ingestionTime" in eventelse None),event_id=event.get("eventId"),))return SessionLogResult(events=events,next_token=response.get("nextForwardToken"),log_group=log_group_name,log_stream=session_id,count=len(events),)except logs_client.exceptions.ResourceNotFoundException:# Return an empty result if the log group or stream doesn't existreturn SessionLogResult(events=[],next_token=None,log_group=log_group_name,log_stream=session_id,count=0,)except Exception as e:raise DeadlineOperationError(f"Failed to retrieve logs: {e}") | 13 | 78 | 8 | 431 | 9 | 196 | 320 | 196 | farm_id,queue_id,session_id,limit,start_time,end_time,next_token,config | ['events', 'log_group_name', 'deadline', 'start_timestamp', 'logs_client', 'end_timestamp', 'response', 'params', 'queue_session'] | SessionLogResult | {"Assign": 14, "Expr": 2, "For": 1, "If": 4, "Return": 2, "Try": 4} | 25 | 
125 | 25 | ["get_boto3_client", "get_user_and_identity_store_id", "get_queue_user_boto3_session", "queue_session.client", "DeadlineOperationError", "get_boto3_client", "int", "start_time.timestamp", "DeadlineOperationError", "int", "end_time.timestamp", "DeadlineOperationError", "logs_client.get_log_events", "response.get", "events.append", "LogEvent", "datetime.datetime.fromtimestamp", "rstrip", "datetime.datetime.fromtimestamp", "event.get", "SessionLogResult", "response.get", "len", "SessionLogResult", "DeadlineOperationError"] | 6 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_basic", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_invalid_datetime", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_resource_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_with_datetime_params", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_with_monitor_credentials", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_with_next_token"] | The function (get_session_logs) is defined within the public class called public. The function starts at line 196 and ends at line 320. It contains 78 lines of code and has a cyclomatic complexity of 13.
It takes 8 parameters and returns a SessionLogResult value. It makes 25 function calls, which are ["get_boto3_client", "get_user_and_identity_store_id", "get_queue_user_boto3_session", "queue_session.client", "DeadlineOperationError", "get_boto3_client", "int", "start_time.timestamp", "DeadlineOperationError", "int", "end_time.timestamp", "DeadlineOperationError", "logs_client.get_log_events", "response.get", "events.append", "LogEvent", "datetime.datetime.fromtimestamp", "rstrip", "datetime.datetime.fromtimestamp", "event.get", "SessionLogResult", "response.get", "len", "SessionLogResult", "DeadlineOperationError"]. It is called by 6 functions, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_basic", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_invalid_datetime", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_resource_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_with_datetime_params", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_with_monitor_credentials", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_job_monitoring_py.test_get_session_logs_with_next_token"]. |
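get_session_logs converts its startTime/endTime datetimes to milliseconds since the epoch for CloudWatch, and converts event timestamps back the other way. That conversion can be checked in isolation; the helper names below are illustrative, not part of the client API.

```python
import datetime


def to_cloudwatch_millis(dt: datetime.datetime) -> int:
    # CloudWatch GetLogEvents expects startTime/endTime as ms since epoch.
    return int(dt.timestamp() * 1000)


def from_cloudwatch_millis(ms: int) -> datetime.datetime:
    # Reverse conversion, as used when building LogEvent timestamps.
    return datetime.datetime.fromtimestamp(ms / 1000, tz=datetime.timezone.utc)


epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
minute_one = to_cloudwatch_millis(epoch + datetime.timedelta(minutes=1))
round_trip = from_cloudwatch_millis(minute_one)
```

Note that `datetime.timestamp()` on a naive datetime interprets it in local time, which is why the function wraps the conversion in a try/except and surfaces a DeadlineOperationError for invalid inputs.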
aws-deadline_deadline-cloud | public | public | 0 | 0 | _call_paginated_deadline_list_api | def _call_paginated_deadline_list_api(list_api, list_property_name, **kwargs):"""Calls a deadline:List* API repeatedly to concatenate all pages.Example:deadline = get_boto3_client("deadline")return _call_paginated_deadline_list_api(deadline.list_farms, "farms", **kwargs)Args:list_api (callable): The List* API function to call, from the boto3 client.list_property_name (str): The name of the property in the response that containsthe list."""response = list_api(**kwargs)result = {list_property_name: response[list_property_name]}while "nextToken" in response:response = list_api(nextToken=response["nextToken"], **kwargs)result[list_property_name].extend(response[list_property_name])return result | 2 | 7 | 3 | 61 | 2 | 10 | 30 | 10 | list_api,list_property_name,**kwargs | ['result', 'response'] | Returns | {"Assign": 3, "Expr": 2, "Return": 1, "While": 1} | 3 | 21 | 3 | ["list_api", "list_api", "extend"] | 6 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_farms", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_fleets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_queues", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_storage_profiles_for_queue", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._queue_parameters_py.get_queue_parameter_definitions"] | The function (_call_paginated_deadline_list_api) is defined within the public class called public. The function starts at line 10 and ends at line 30. It contains 7 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters and returns a value. It makes 3 function calls, which are ["list_api", "list_api", "extend"]. It is called by 6 functions, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_farms", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_fleets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_queues", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_storage_profiles_for_queue", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._queue_parameters_py.get_queue_parameter_definitions"]. |
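The nextToken loop in _call_paginated_deadline_list_api can be exercised against a stub in place of the boto3 client. The stub below (`fake_list_farms`, its page data) is hypothetical, but the call-then-follow-nextToken concatenation logic mirrors the helper's.

```python
def call_paginated_list_api(list_api, list_property_name, **kwargs):
    # Mirrors the helper: call once, then follow nextToken until absent,
    # concatenating each page's list into the first response.
    response = list_api(**kwargs)
    result = {list_property_name: response[list_property_name]}
    while "nextToken" in response:
        response = list_api(nextToken=response["nextToken"], **kwargs)
        result[list_property_name].extend(response[list_property_name])
    return result


# Hypothetical two-page ListFarms response.
_pages = [
    {"farms": [{"farmId": "farm-1"}], "nextToken": "t1"},
    {"farms": [{"farmId": "farm-2"}]},
]


def fake_list_farms(nextToken=None, **kwargs):
    return _pages[0] if nextToken is None else _pages[1]


all_farms = call_paginated_list_api(fake_list_farms, "farms")
```

The same pattern backs list_farms, list_queues, list_jobs, list_fleets, and list_storage_profiles_for_queue below, each passing a different bound client method and list property name.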
aws-deadline_deadline-cloud | public | public | 0 | 0 | list_farms | def list_farms(config=None, **kwargs):"""Calls the deadline:ListFarms API call, applying the filter for user membershipdepending on the configuration. If the response is paginated, it repeatedcalls the API to get all the farms."""if "principalId" not in kwargs:user_id, _ = get_user_and_identity_store_id(config=config)if user_id:kwargs["principalId"] = user_iddeadline = get_boto3_client("deadline", config=config)return _call_paginated_deadline_list_api(deadline.list_farms, "farms", **kwargs) | 3 | 7 | 2 | 58 | 1 | 34 | 46 | 34 | config,**kwargs | ['deadline'] | Returns | {"Assign": 3, "Expr": 1, "If": 2, "Return": 1} | 4 | 13 | 4 | ["get_user_and_identity_store_id", "get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"] | 0 | [] | The function (list_farms) is defined within the public class called public. The function starts at line 34 and ends at line 46. It contains 7 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters and returns a value. It makes 4 function calls, which are ["get_user_and_identity_store_id", "get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | list_queues | def list_queues(config=None, **kwargs):"""Calls the deadline:ListQueues API call, applying the filter for user membershipdepending on the configuration. If the response is paginated, it repeatedcalls the API to get all the queues."""if "principalId" not in kwargs:user_id, _ = get_user_and_identity_store_id(config=config)if user_id:kwargs["principalId"] = user_iddeadline = get_boto3_client("deadline", config=config)return _call_paginated_deadline_list_api(deadline.list_queues, "queues", **kwargs) | 3 | 7 | 2 | 58 | 1 | 50 | 62 | 50 | config,**kwargs | ['deadline'] | Returns | {"Assign": 3, "Expr": 1, "If": 2, "Return": 1} | 4 | 13 | 4 | ["get_user_and_identity_store_id", "get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"] | 0 | [] | The function (list_queues) is defined within the public class called public. The function starts at line 50 and ends at line 62. It contains 7 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters and returns a value. It makes 4 function calls, which are ["get_user_and_identity_store_id", "get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | list_jobs | def list_jobs(config=None, **kwargs):"""Calls the deadline:ListJobs API call, applying the filter for user membershipdepending on the configuration. If the response is paginated, it repeatedcalls the API to get all the jobs."""if "principalId" not in kwargs:user_id, _ = get_user_and_identity_store_id(config=config)if user_id:kwargs["principalId"] = user_iddeadline = get_boto3_client("deadline", config=config)return _call_paginated_deadline_list_api(deadline.list_jobs, "jobs", **kwargs) | 3 | 7 | 2 | 58 | 1 | 66 | 78 | 66 | config,**kwargs | ['deadline'] | Returns | {"Assign": 3, "Expr": 1, "If": 2, "Return": 1} | 4 | 13 | 4 | ["get_user_and_identity_store_id", "get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"] | 0 | [] | The function (list_jobs) is defined within the public class called public. The function starts at line 66 and ends at line 78. It contains 7 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters and returns a value. It makes 4 function calls, which are ["get_user_and_identity_store_id", "get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | list_fleets | def list_fleets(config=None, **kwargs):"""Calls the deadline:ListFleets API call, applying the filter for user membershipdepending on the configuration. If the response is paginated, it repeatedcalls the API to get all the fleets."""if "principalId" not in kwargs:user_id, _ = get_user_and_identity_store_id(config=config)if user_id:kwargs["principalId"] = user_iddeadline = get_boto3_client("deadline", config=config)return _call_paginated_deadline_list_api(deadline.list_fleets, "fleets", **kwargs) | 3 | 7 | 2 | 58 | 1 | 82 | 94 | 82 | config,**kwargs | ['deadline'] | Returns | {"Assign": 3, "Expr": 1, "If": 2, "Return": 1} | 4 | 13 | 4 | ["get_user_and_identity_store_id", "get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"] | 0 | [] | The function (list_fleets) is defined within the public class called public. The function starts at line 82 and ends at line 94. It contains 7 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters and returns a value. It makes 4 function calls, which are ["get_user_and_identity_store_id", "get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | list_storage_profiles_for_queue | def list_storage_profiles_for_queue(config=None, **kwargs):"""Calls the deadline:ListStorageProfilesForQueue API call, applying the filter for user membershipdepending on the configuration. If the response is paginated, it repeatedcalls the API to get all the storage profiles."""deadline = get_boto3_client("deadline", config=config)return _call_paginated_deadline_list_api(deadline.list_storage_profiles_for_queue, "storageProfiles", **kwargs) | 1 | 5 | 2 | 33 | 1 | 98 | 108 | 98 | config,**kwargs | ['deadline'] | Returns | {"Assign": 1, "Expr": 1, "Return": 1} | 3 | 11 | 3 | ["get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"] | 0 | [] | The function (list_storage_profiles_for_queue) is defined within the public class called public. The function starts at line 98 and ends at line 108. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and returns a value. It makes 3 function calls, which are ["get_boto3_client", "_call_paginated_deadline_list_api", "api.record_function_latency_telemetry_event"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _list_jobs_by_filter_expression | def _list_jobs_by_filter_expression(boto3_session: boto3.Session,farm_id: str,queue_id: str,filter_expression: dict[str, Any],) -> list[dict[str, Any]]:"""This function retrieves all jobs in the queue that satisfy a provided filter expression, except potentiallysome jobs updated recently due to eventual consistency. The value is the same as the boto3 filterExpression tohttps://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/deadline/client/search_jobs.html,except it may not contain a nested groupFilter.CAUTION:Eventual consistency in the deadline:SearchJobs API means that the result set can be missing jobswith a timestamp close to the current time.TODO: This is an experimental function under development, and is exposed under the internal-only namedeadline.client.api._list_jobs_by_filter_expression._list_jobs_by_filter_expression.If it proves useful, deadline.client.api.list_jobs_active is the likely public function.NOTE:There's an edge case where 100 jobs with identical createdAt timestamps will cause the function to raisea JobFetchFailure exception. Because this would require all 100 jobs be created/updated withidentical timestamp recorded at millisecond precision, we do not expect this to occur in practice.Example:boto3_session = boto3.Session()farm_id = ...queue_id = ...saved_timestamp = ...# Get the jobs created in the last 5 minutesall_active_jobs = _list_jobs_by_filter_expression(boto3_session,farm_id,queue_id,)Args:boto3_session (boto3.Session): The boto3 Session for AWS API access.farm_id (str): The Farm ID.queue_id (str): The Queue ID.filter_expressions (dict[str, Any]): The filter expression to apply to jobs. This is nested one level in afilter expression provided to deadline:SearchJobs, so cannot include a groupFilter.Returns:The list of all jobs in the queue that satisfy the provided filter expression. 
Each job is as returned by the deadline:SearchJobs API."""# This function uses deadline:SearchJobs as a primitive to union subsets of up to 100 jobs at a time,# designed to guarantee we end up with the full correct set. Ordering and thresholding by the createdAt# timestamp ensures no jobs get missed as might occur if we incremented itemOffset instead.## Let J = {j ∈ queue | filter_expression(j)} be the set we want# J' = union {J_i | i ∈ range(search_count)} be the set the algorithm produces,#where J_0 = {j ∈ queue | filter_expression(j)} [limit 100, ordered by createdAt asc]#J_i = {j ∈ queue | filter_expression(j) && j["createdAt"] >= timestamp_i} [when i > 0, limit 100, ordered by createdAt asc]#timestamp_i = max {j["timestamp_field_name"]: j in J_{i-1}} [when i >= 1]## To prove these sets are equal, we show they are each a subset of the other.## J' is a subset of J, because each J_i is from a deadline:SearchJobs that filters to a subset of J.## Consider a job j ∈ J. The createdAt timestamp of a job is always available, and never changes. By construction,# the set J_0 is equivalent to {j ∈ queue | filter_expression(j) && j["createdAt"] <= timestamp_1}, and each# subsequent J_i is equivalent to {j ∈ queue | filter_expression(j) && timestamp_i <= j["createdAt"] <= timestamp_[i+1]},# except for when there are more than 100 jobs with identical createdAt timestamp, in which case the algorithm# raises an exception. All of the createdAt timestamps in J are within the range of timestamps covered by these sets,# so therefore J is also a subset of J'.## The value of filter_expression(j) may change between subsequent calls to deadline:SearchJob. If it does, the job# is within the eventual consistency window.# Do some basic parameter validation on filter_expression. 
The rest is left up to the deadline:SearchJobs API handler.if not isinstance(filter_expression, dict):raise ValueError("The provided filter expression must be a dict")if sorted(filter_expression.keys()) != ["filters", "operator"]:raise ValueError(f"The provided filter expression must contain 'filters' and 'operator', got {sorted(filter_expression.keys())}")# This holds {job_id: job_from_search_jobs_call, ...}result_jobs = {}deadline = get_session_client(boto3_session, "deadline")# Sort jobs in ascending order of the timestamp fieldsort_expressions = [{"fieldSort": {"name": "CREATED_AT", "sortOrder": "ASCENDING"}}]# Filter for any of the active statuses READY, ASSIGNED, STARTING, SCHEDULED, or RUNNINGprovided_filter = {"groupFilter": filter_expression,}# The first time we call deadline.search_jobs, there is no timestamp filter so it# will return the earliest jobs satisfying the filter, ordered by createdAt.query_filter_expressions = {"filters": [provided_filter],"operator": "AND",}# Continue until we've processed all jobswhile True:try:# The pageSize defaults to its maximum, 100, so we leave it out of the call.response = deadline.search_jobs(farmId=farm_id,queueIds=[queue_id],itemOffset=0,filterExpressions=query_filter_expressions,sortExpressions=sort_expressions,)except ClientError as exc:raise DeadlineOperationError(f"Failed to get Jobs from Deadline:\n{exc}") from exc# This is up to the first 100 of jobs that satisfy the queryjobs = response.get("jobs", [])# This is the total number of jobs that satisfied the querytotal_results = response.get("totalResults", 0)result_jobs.update({job["jobId"]: job for job in jobs})if len(jobs) == total_results:# If the jobs we got are the total results, result_jobs is now the full setbreakelif jobs[0]["createdAt"] == jobs[-1]["createdAt"]:# Rare edge case where the timestamp field is the same for all 100 jobs in the page, that# we expect to never see in practice. 
The timestamp value is stored with# millisecond precision, and jobs are scheduled independently from each other,# with updates of running jobs generally being multiple seconds apart.raise JobFetchFailure("Failure fetching jobs based on the createdAt field as more then 100 jobs have the exact same timestamp value.")else:# Continue processing from the largest timestamp value we saw so farthreshold_timestamp = jobs[-1]["createdAt"]# Update jobs to have job["createdAt"] >= threshold_timestampquery_filter_expressions = {"filters": [provided_filter,{"dateTimeFilter": {"name": "CREATED_AT","dateTime": threshold_timestamp,"operator": "GREATER_THAN_EQUAL_TO",}},],"operator": "AND",}return list(result_jobs.values()) | 8 | 58 | 4 | 297 | 9 | 21 | 173 | 21 | boto3_session,farm_id,queue_id,filter_expression | ['threshold_timestamp', 'provided_filter', 'deadline', 'sort_expressions', 'query_filter_expressions', 'total_results', 'response', 'result_jobs', 'jobs'] | list[dict[str, Any]] | {"Assign": 10, "Expr": 2, "If": 4, "Return": 1, "Try": 1, "While": 1} | 17 | 153 | 17 | ["isinstance", "ValueError", "sorted", "filter_expression.keys", "ValueError", "sorted", "filter_expression.keys", "get_session_client", "deadline.search_jobs", "DeadlineOperationError", "response.get", "response.get", "result_jobs.update", "len", "JobFetchFailure", "list", "result_jobs.values"] | 6 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._get_download_candidate_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_by_filter_expression", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_by_filter_with_incorrect_filter_expression", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_recent_edge_case_many_equal_timestamps", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_recent_jobs_timestamp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_recent_jobs_timestamp_partial_jobs_return"] | The function (_list_jobs_by_filter_expression) is defined within the public class called public. The function starts at line 21 and ends at line 173. It contains 58 lines of code and has a cyclomatic complexity of 8. It takes 4 parameters and returns a list[dict[str, Any]] value.
It makes 17 function calls, which are ["isinstance", "ValueError", "sorted", "filter_expression.keys", "ValueError", "sorted", "filter_expression.keys", "get_session_client", "deadline.search_jobs", "DeadlineOperationError", "response.get", "response.get", "result_jobs.update", "len", "JobFetchFailure", "list", "result_jobs.values"]. It is called by 6 functions, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._get_download_candidate_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_by_filter_expression", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_by_filter_with_incorrect_filter_expression", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_recent_edge_case_many_equal_timestamps", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_recent_jobs_timestamp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_list_jobs_by_filter_expression_py.test_list_jobs_recent_jobs_timestamp_partial_jobs_return"]. |
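The createdAt-threshold pagination that _list_jobs_by_filter_expression uses in place of itemOffset can be modeled against an in-memory stand-in for deadline:SearchJobs. Everything below (`fake_search_page`, the job records) is a hypothetical sketch, but the union/threshold/edge-case logic mirrors the function's strategy.

```python
def search_all_jobs(search_page):
    """Union pages of matching jobs, advancing a createdAt threshold
    instead of an item offset so that no job is skipped."""
    result, threshold = {}, None
    while True:
        jobs, total = search_page(threshold)
        result.update({job["jobId"]: job for job in jobs})
        if len(jobs) == total:
            # This page held every remaining match; the union is complete.
            return list(result.values())
        if jobs[0]["createdAt"] == jobs[-1]["createdAt"]:
            # A full page sharing one timestamp would loop forever
            # (the JobFetchFailure edge case described above).
            raise RuntimeError("over a full page of jobs share one createdAt")
        threshold = jobs[-1]["createdAt"]


# Hypothetical queue of 250 jobs with distinct createdAt timestamps.
ALL_JOBS = [{"jobId": f"job-{i}", "createdAt": i} for i in range(250)]


def fake_search_page(threshold, page_size=100):
    # First page of matches with createdAt >= threshold, ascending,
    # plus the total count (mimics jobs/totalResults from SearchJobs).
    matches = [j for j in ALL_JOBS if threshold is None or j["createdAt"] >= threshold]
    return matches[:page_size], len(matches)


jobs = search_all_jobs(fake_search_page)
```

Because each threshold query is inclusive (GREATER_THAN_EQUAL_TO), the boundary job appears in two consecutive pages; keying the accumulator by jobId deduplicates it, matching the result_jobs dict in the real function.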
aws-deadline_deadline-cloud | public | public | 0 | 0 | _login_deadline_cloud_monitor | def _login_deadline_cloud_monitor(on_pending_authorization: Optional[Callable],on_cancellation_check: Optional[Callable],config: Optional[ConfigParser] = None,):# Deadline Cloud monitor writes the absolute path to itself to the config filedeadline_cloud_monitor_path = get_setting("deadline-cloud-monitor.path", config=config)profile_name = get_setting("defaults.aws_profile_name", config=config)args = [deadline_cloud_monitor_path, "login", "--profile", profile_name]# Open Deadline Cloud monitor, non-blocking the user will keep Deadline Cloud monitor running in the background.try:if sys.platform.startswith("win"):# We don't hookup to stdin but do this to avoid issues on windows# See https://docs.python.org/3/library/subprocess.html#subprocess.STARTUPINFO.lpAttributeListp = subprocess.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL, stdin=subprocess.PIPE)else:p = subprocess.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)except FileNotFoundError:raise DeadlineOperationError(f"Could not find Deadline Cloud monitor at {deadline_cloud_monitor_path}. 
"f"Please ensure Deadline Cloud monitor is installed correctly and set up the {profile_name} profile again.")if on_pending_authorization:on_pending_authorization(credentials_source=AwsCredentialsSource.DEADLINE_CLOUD_MONITOR_LOGIN)# And wait for the user to complete loginwhile True:# Deadline Cloud monitor is a GUI app that will keep on running# So we sit here and test that profile for validity until it worksif check_authentication_status(config) == AwsAuthenticationStatus.AUTHENTICATED:return f"Deadline Cloud monitor profile: {profile_name}"if on_cancellation_check:# Check if the UI has signaled a cancelif on_cancellation_check():p.kill()raise Exception()if p.poll():# Deadline Cloud monitor has stopped, we assume it returned us an error on one line on stderr# but let's be specific about Deadline Cloud monitor failing incase the error is non-obvious# and let's tack on stdout incase it helpserr_prefix = (f"Deadline Cloud monitor was not able to log into the {profile_name} profile:")out = p.stdout.read().decode("utf-8") if p.stdout else ""raise DeadlineOperationError(f"{err_prefix}\n{out}")time.sleep(0.5) | 10 | 40 | 3 | 226 | 6 | 33 | 85 | 33 | on_pending_authorization,on_cancellation_check,config | ['out', 'profile_name', 'args', 'deadline_cloud_monitor_path', 'p', 'err_prefix'] | Returns | {"Assign": 7, "Expr": 3, "If": 6, "Return": 1, "Try": 1, "While": 1} | 16 | 53 | 16 | ["get_setting", "get_setting", "sys.platform.startswith", "subprocess.Popen", "subprocess.Popen", "DeadlineOperationError", "on_pending_authorization", "check_authentication_status", "on_cancellation_check", "p.kill", "Exception", "p.poll", "decode", "p.stdout.read", "DeadlineOperationError", "time.sleep"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._loginout_py.login"] | The function (_login_deadline_cloud_monitor) defined within the public class called public.The function start at line 33 and 
ends at line 85. It contains 40 lines of code and has a cyclomatic complexity of 10. It takes 3 parameters and returns a value. It calls 16 functions: ["get_setting", "get_setting", "sys.platform.startswith", "subprocess.Popen", "subprocess.Popen", "DeadlineOperationError", "on_pending_authorization", "check_authentication_status", "on_cancellation_check", "p.kill", "Exception", "p.poll", "decode", "p.stdout.read", "DeadlineOperationError", "time.sleep"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._loginout_py.login"]. |
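The `_login_deadline_cloud_monitor` row above describes a poll loop: repeatedly test authentication status, honoring an optional cancellation callback, until login succeeds or the monitor process exits. A stripped-down sketch of that loop pattern — the callable names are illustrative stand-ins, not the library's real API, and the subprocess handling is omitted:

```python
import time

def wait_for_login(is_authenticated, on_cancellation_check=None,
                   poll_seconds=0.5, max_polls=120):
    """Poll until is_authenticated() returns True, raising if the
    optional cancellation callback signals a cancel. Returns False
    if max_polls is exhausted without authenticating."""
    for _ in range(max_polls):
        if is_authenticated():
            return True
        if on_cancellation_check is not None and on_cancellation_check():
            raise RuntimeError("login cancelled")
        time.sleep(poll_seconds)
    return False
```

The real function additionally watches the monitor subprocess with `p.poll()` and surfaces its stdout on failure.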
aws-deadline_deadline-cloud | public | public | 0 | 0 | login | def login(on_pending_authorization: Optional[Callable],on_cancellation_check: Optional[Callable],config: Optional[ConfigParser] = None,) -> str:"""For AWS profiles created by Deadline Cloud monitor, logs in to provide access to Deadline Cloud.Args:on_pending_authorization (Callable): A callback that receives method-specific information to continue login.All methods: 'credentials_source' parameter of type AwsCredentialsSourceFor Deadline Cloud monitor: No additional parameterson_cancellation_check (Callable): A callback that allows the operation to cancel before login completesconfig (ConfigParser, optional): The AWS Deadline Cloud configurationobject to use instead of the config file."""credentials_source = get_credentials_source(config)if credentials_source == AwsCredentialsSource.DEADLINE_CLOUD_MONITOR_LOGIN:return _login_deadline_cloud_monitor(on_pending_authorization, on_cancellation_check, config)raise UnsupportedProfileTypeForLoginLogout("Logging in is only supported for AWS Profiles created by Deadline Cloud monitor.") | 2 | 13 | 3 | 57 | 1 | 89 | 112 | 89 | on_pending_authorization,on_cancellation_check,config | ['credentials_source'] | str | {"Assign": 1, "Expr": 1, "If": 1, "Return": 1} | 4 | 24 | 4 | ["get_credentials_source", "_login_deadline_cloud_monitor", "UnsupportedProfileTypeForLoginLogout", "api.record_function_latency_telemetry_event"] | 11 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_client_is_independent_of_pykube_incluster", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_client_is_independent_of_pykube_viaconfig", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_client_login_works_incluster", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_client_login_works_viaconfig", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_monkeypatched_get_pykube_cfg_overrides_pykube", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_pykube_is_independent_of_client_incluster", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_pykube_is_independent_of_client_viaconfig", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_pykube_login_works_incluster", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_pykube_login_works_viaconfig", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930358_mozilla_mozilla_django_oidc.tests.test_middleware_py.ClientWithUser.login", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95251290_Correia_jpv_fucking_awesome_aws.awesome.lib.github_py.GitHub._login"] | The function (login) defined within the public class called public.The function start at line 89 and ends at 112. It contains 13 lines of code and it has a cyclomatic complexity of 2. It takes 3 parameters, represented as [89.0] and does not return any value. 
It declares 4.0 functions, It has 4.0 functions called inside which are ["get_credentials_source", "_login_deadline_cloud_monitor", "UnsupportedProfileTypeForLoginLogout", "api.record_function_latency_telemetry_event"], It has 11.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_client_is_independent_of_pykube_incluster", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_client_is_independent_of_pykube_viaconfig", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_client_login_works_incluster", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_client_login_works_viaconfig", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_monkeypatched_get_pykube_cfg_overrides_pykube", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_pykube_is_independent_of_client_incluster", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_pykube_is_independent_of_client_viaconfig", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_pykube_login_works_incluster", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3688162_zalando_incubator_kopf.tests.authentication.test_login_py.test_pykube_login_works_viaconfig", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930358_mozilla_mozilla_django_oidc.tests.test_middleware_py.ClientWithUser.login", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95251290_Correia_jpv_fucking_awesome_aws.awesome.lib.github_py.GitHub._login"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | logout | def logout(config: Optional[ConfigParser] = None) -> str:"""For AWS profiles created by Deadline Cloud monitor, logs out of Deadline Cloud. Args:config (ConfigParser, optional): The AWS Deadline Cloud configurationobject to use instead of the config file."""credentials_source = get_credentials_source(config)if credentials_source == AwsCredentialsSource.DEADLINE_CLOUD_MONITOR_LOGIN:# Deadline Cloud monitor writes the absolute path to itself to the config filedeadline_cloud_monitor_path = get_setting("deadline-cloud-monitor.path", config=config)profile_name = get_setting("defaults.aws_profile_name", config=config)args = [deadline_cloud_monitor_path, "logout", "--profile", profile_name]# Open Deadline Cloud monitor, blocking# Unlike login, that opens the regular Deadline Cloud monitor GUI, logout is a CLI command that clears the profile# This makes it easier as we can execute and look at the return cdoetry:output = subprocess.check_output(args)except FileNotFoundError:raise DeadlineOperationError(f"Could not find Deadline Cloud monitor at {deadline_cloud_monitor_path}. 
"f"Please ensure Deadline Cloud monitor is installed correctly and set up the {profile_name} profile again.")except subprocess.CalledProcessError as e:raise DeadlineOperationError(f"Deadline Cloud monitor was unable to log out the profile {profile_name}."f"Return code {e.returncode}: {e.output}")# Force a refresh of the cached boto3 Session_session.invalidate_boto3_session_cache()return output.decode("utf8")raise UnsupportedProfileTypeForLoginLogout("Logging out is only supported for AWS Profiles created by Deadline Cloud monitor.") | 4 | 23 | 1 | 112 | 5 | 116 | 152 | 116 | config | ['profile_name', 'output', 'args', 'credentials_source', 'deadline_cloud_monitor_path'] | str | {"Assign": 5, "Expr": 2, "If": 1, "Return": 1, "Try": 1} | 10 | 37 | 10 | ["get_credentials_source", "get_setting", "get_setting", "subprocess.check_output", "DeadlineOperationError", "DeadlineOperationError", "_session.invalidate_boto3_session_cache", "output.decode", "UnsupportedProfileTypeForLoginLogout", "api.record_function_latency_telemetry_event"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.79438799_jazzband_django_oauth_toolkit.oauth2_provider.views.oidc_py.RPInitiatedLogoutView.do_logout"] | The function (logout) defined within the public class called public.The function start at line 116 and ends at 152. It contains 23 lines of code and it has a cyclomatic complexity of 4. The function does not take any parameters and does not return any value. 
It declares 10.0 functions, It has 10.0 functions called inside which are ["get_credentials_source", "get_setting", "get_setting", "subprocess.check_output", "DeadlineOperationError", "DeadlineOperationError", "_session.invalidate_boto3_session_cache", "output.decode", "UnsupportedProfileTypeForLoginLogout", "api.record_function_latency_telemetry_event"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.79438799_jazzband_django_oauth_toolkit.oauth2_provider.views.oidc_py.RPInitiatedLogoutView.do_logout"]. |
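Unlike login, the `logout` row above runs the monitor as a blocking CLI command via `subprocess.check_output`, mapping both a missing binary (`FileNotFoundError`) and a nonzero exit (`CalledProcessError`) to one operational error type. A hedged sketch of that pattern, with `OperationError` as a hypothetical stand-in for `DeadlineOperationError`:

```python
import subprocess
import sys

class OperationError(Exception):
    """Hypothetical stand-in for DeadlineOperationError."""

def run_logout(args):
    """Run a blocking CLI command and return its decoded stdout,
    converting the two common subprocess failures into OperationError."""
    try:
        return subprocess.check_output(args).decode("utf8")
    except FileNotFoundError:
        raise OperationError(f"Could not find executable: {args[0]}")
    except subprocess.CalledProcessError as e:
        raise OperationError(f"Return code {e.returncode}: {e.output}")
```

The real function also invalidates the cached boto3 session before returning, so subsequent API calls pick up the logged-out state.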
aws-deadline_deadline-cloud | public | public | 0 | 0 | assume_queue_role_for_user | def assume_queue_role_for_user(farmId: str, queueId: str, *, config: Optional[ConfigParser] = None) -> Dict[str, Any]:"""Assumes the user role for a queue and returns temporary credentials.These credentials can be used to perform user-level operations on the queue,such as submitting jobs and monitoring job status.Args:farmId: The ID of the farm containing the queue.queueId: The ID of the queue to assume the role for.config: Optional configuration to use. If not provided, the default configuration is used.Returns:A dictionary containing the temporary credentials in the following format:{"credentials": {"accessKeyId": str,"secretAccessKey": str,"sessionToken": str,"expiration": datetime}}Raises:ClientError: If there is an error assuming the role."""client = _session.get_boto3_client("deadline", config=config)response = client.assume_queue_role_for_user(farmId=farmId, queueId=queueId)return response | 1 | 6 | 3 | 58 | 2 | 14 | 44 | 14 | farmId,queueId,config | ['client', 'response'] | Dict[str, Any] | {"Assign": 2, "Expr": 1, "Return": 1} | 3 | 31 | 3 | ["_session.get_boto3_client", "client.assume_queue_role_for_user", "_telemetry.record_function_latency_telemetry_event"] | 0 | [] | The function (assume_queue_role_for_user) defined within the public class called public. The function starts at line 14 and ends at line 44. It contains 6 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters and returns a value of type Dict[str, Any]. It calls 3 functions: ["_session.get_boto3_client", "client.assume_queue_role_for_user", "_telemetry.record_function_latency_telemetry_event"]. No other functions call it. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | assume_queue_role_for_read | def assume_queue_role_for_read(farmId: str, queueId: str, *, config: Optional[ConfigParser] = None) -> Dict[str, Any]:"""Assumes the read role for a queue and returns temporary credentials.These credentials can be used to perform read-only operations on the queue,such as viewing job status and queue information.Args:farmId: The ID of the farm containing the queue.queueId: The ID of the queue to assume the role for.config: Optional configuration to use. If not provided, the default configuration is used.Returns:A dictionary containing the temporary credentials in the following format:{"credentials": {"accessKeyId": str,"secretAccessKey": str,"sessionToken": str,"expiration": datetime}}Raises:ClientError: If there is an error assuming the role."""client = _session.get_boto3_client("deadline", config=config)response = client.assume_queue_role_for_read(farmId=farmId, queueId=queueId)return response | 1 | 6 | 3 | 58 | 2 | 48 | 78 | 48 | farmId,queueId,config | ['client', 'response'] | Dict[str, Any] | {"Assign": 2, "Expr": 1, "Return": 1} | 3 | 31 | 3 | ["_session.get_boto3_client", "client.assume_queue_role_for_read", "_telemetry.record_function_latency_telemetry_event"] | 0 | [] | The function (assume_queue_role_for_read) defined within the public class called public. The function starts at line 48 and ends at line 78. It contains 6 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters and returns a value of type Dict[str, Any]. It calls 3 functions: ["_session.get_boto3_client", "client.assume_queue_role_for_read", "_telemetry.record_function_latency_telemetry_event"]. No other functions call it. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_queue_parameter_definitions | def get_queue_parameter_definitions(*, farmId: str, queueId: str, config=None) -> list[JobParameter]:"""This gets all the queue parameters definitions from the specified Queue. It does soby getting all the full templates for queue environments, and then combiningthem equivalently to the Deadline Cloud service logic."""deadline = get_boto3_client("deadline", config=config)response = _call_paginated_deadline_list_api(deadline.list_queue_environments,"environments",farmId=farmId,queueId=queueId,)queue_environments = sorted((deadline.get_queue_environment(farmId=farmId,queueId=queueId,queueEnvironmentId=queue_env["queueEnvironmentId"],)for queue_env in response["environments"]),key=lambda queue_env: queue_env["priority"],)queue_environment_templates = [yaml.safe_load(queue_env["template"]) for queue_env in queue_environments]queue_parameters_definitions: dict[str, JobParameter] = {}for template in queue_environment_templates:for parameter in template.get("parameterDefinitions", []):parameter = validate_job_parameter(parameter, type_required=True, default_required=True)# If there is no group label, set it to the name of the Queue Environmentif not parameter.get("userInterface", {}).get("groupLabel"):if "userInterface" not in parameter:parameter["userInterface"] = {"control": get_ui_control_for_parameter_definition(parameter)}parameter["userInterface"]["groupLabel"] = (f"Queue Environment: {template['environment']['name']}")existing_parameter = queue_parameters_definitions.get(parameter["name"])if existing_parameter:differences = parameter_definition_difference(existing_parameter, parameter)if differences:raise DeadlineOperationError(f"Job template parameter {parameter['name']} is duplicated across queue environments with mismatched fields:\n"+ " ".join(differences))else:queue_parameters_definitions[parameter["name"]] = parameterreturn list(queue_parameters_definitions.values()) | 9 
| 47 | 3 | 263 | 7 | 21 | 76 | 21 | farmId,queueId,config | ['queue_environment_templates', 'parameter', 'existing_parameter', 'deadline', 'differences', 'response', 'queue_environments'] | list[JobParameter] | {"AnnAssign": 1, "Assign": 10, "Expr": 1, "For": 2, "If": 4, "Return": 1} | 17 | 56 | 17 | ["get_boto3_client", "_call_paginated_deadline_list_api", "sorted", "deadline.get_queue_environment", "yaml.safe_load", "template.get", "validate_job_parameter", "get", "parameter.get", "get_ui_control_for_parameter_definition", "queue_parameters_definitions.get", "parameter_definition_difference", "DeadlineOperationError", "join", "list", "queue_parameters_definitions.values", "api.record_function_latency_telemetry_event"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.ui.widgets.shared_job_settings_tab_py.SharedJobSettingsWidget._load_queue_parameters_thread_function"] | The function (get_queue_parameter_definitions) defined within the public class called public.The function start at line 21 and ends at 76. It contains 47 lines of code and it has a cyclomatic complexity of 9. It takes 3 parameters, represented as [21.0] and does not return any value. 
It calls 17 functions: ["get_boto3_client", "_call_paginated_deadline_list_api", "sorted", "deadline.get_queue_environment", "yaml.safe_load", "template.get", "validate_job_parameter", "get", "parameter.get", "get_ui_control_for_parameter_definition", "queue_parameters_definitions.get", "parameter_definition_difference", "DeadlineOperationError", "join", "list", "queue_parameters_definitions.values", "api.record_function_latency_telemetry_event"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.ui.widgets.shared_job_settings_tab_py.SharedJobSettingsWidget._load_queue_parameters_thread_function"]. |
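The `get_queue_parameter_definitions` row above merges `parameterDefinitions` across several queue environment templates, raising when the same parameter name appears with mismatched definitions. A simplified sketch of that merge-with-conflict-detection step (the real function additionally fills in `userInterface` group labels and reports the specific differing fields):

```python
def merge_parameter_definitions(templates):
    """Collect parameterDefinitions across templates into one list,
    rejecting duplicates whose definitions disagree."""
    merged = {}
    for template in templates:
        for param in template.get("parameterDefinitions", []):
            existing = merged.get(param["name"])
            if existing is not None and existing != param:
                raise ValueError(
                    f"parameter {param['name']} is duplicated across "
                    "templates with mismatched fields"
                )
            merged[param["name"]] = param
    return list(merged.values())
```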
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_boto3_session | def get_boto3_session(force_refresh: bool = False, config: Optional[ConfigParser] = None) -> boto3.Session:"""Gets a boto3 session for the configured AWS Deadline Cloud aws profile. This mayeither use a named profile or the default credentials provider chain.This implementation caches the session object for use across the CLI code,so that we can use the following code pattern without repeated calls toan external credentials provider process, for example.Args:force_refresh (bool, optional): If set to True, forces a cache refresh.config (ConfigParser, optional): If provided, the AWS Deadline Cloud config to use."""profile_name: Optional[str] = get_setting("defaults.aws_profile_name", config)# If the default AWS profile name is either not set, or set to "default",# use the default credentials provider chain instead of a named profile.if profile_name in ("(default)", "default", ""):profile_name = Noneif force_refresh:invalidate_boto3_session_cache()return _get_boto3_session_for_profile(profile_name) | 3 | 9 | 2 | 61 | 1 | 53 | 79 | 53 | force_refresh,config | ['profile_name'] | boto3.Session | {"AnnAssign": 1, "Assign": 1, "Expr": 2, "If": 2, "Return": 1} | 3 | 27 | 3 | ["get_setting", "invalidate_boto3_session_cache", "_get_boto3_session_for_profile"] | 12 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.check_authentication_status", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_boto3_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_credentials_source", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_monitor_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_queue_user_boto3_session", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_user_and_identity_store_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.TelemetryClient.record_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._aws.aws_clients_py.get_deadline_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._aws.aws_clients_py.get_s3_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._aws.aws_clients_py.get_sts_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader.__init__"] | The function (get_boto3_session) defined within the public class called public.The function start at line 53 and ends at 79. It contains 9 lines of code and it has a cyclomatic complexity of 3. It takes 2 parameters, represented as [53.0] and does not return any value. 
It declares 3.0 functions, It has 3.0 functions called inside which are ["get_setting", "invalidate_boto3_session_cache", "_get_boto3_session_for_profile"], It has 12.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.check_authentication_status", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_boto3_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_credentials_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_monitor_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_queue_user_boto3_session", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_user_and_identity_store_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.TelemetryClient.record_event", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._aws.aws_clients_py.get_deadline_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._aws.aws_clients_py.get_s3_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._aws.aws_clients_py.get_sts_client", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader.__init__"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_boto3_session_for_profile | def _get_boto3_session_for_profile(profile_name: str):session = boto3.Session(profile_name=profile_name)# By default, DCM returns creds that expire after 15 minutes, and boto3's RefreshableCredentials# class refreshes creds that are within 15 minutes of expiring, so credentials would never be reused.# Also DCM credentials currently take several seconds to refresh. Lower the refresh timeouts so creds# are reused between API calls to save time.# See https://github.com/boto/botocore/blob/develop/botocore/credentials.py#L342-L362try:credentials = session.get_credentials()if (isinstance(credentials, RefreshableCredentials)and credentials.method == "custom-process"):credentials._advisory_refresh_timeout = 5 * 60# 5 minutescredentials._mandatory_refresh_timeout = 2.5 * 60# 2.5 minutesexcept:# noqa: E722# Attempt to patch the timeouts but ignore any errors. These patched properties are internal and could change# without notice. Creds are functional without patching timeouts.passreturn session | 4 | 13 | 1 | 63 | 2 | 83 | 105 | 83 | profile_name | ['credentials', 'session'] | Returns | {"Assign": 4, "If": 1, "Return": 1, "Try": 1} | 3 | 23 | 3 | ["boto3.Session", "session.get_credentials", "isinstance"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_boto3_session"] | The function (_get_boto3_session_for_profile) defined within the public class called public. The function starts at line 83 and ends at line 105. It contains 13 lines of code and has a cyclomatic complexity of 4. It takes 1 parameter and returns a value.
It calls 3 functions: ["boto3.Session", "session.get_credentials", "isinstance"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_boto3_session"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | invalidate_boto3_session_cache | def invalidate_boto3_session_cache() -> None:_get_boto3_session_for_profile.cache_clear()_get_queue_user_boto3_session.cache_clear() | 1 | 3 | 0 | 16 | 0 | 108 | 110 | 108 | [] | None | {"Expr": 2} | 2 | 3 | 2 | ["_get_boto3_session_for_profile.cache_clear", "_get_queue_user_boto3_session.cache_clear"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_boto3_session", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.conftest_py.fresh_deadline_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.conftest_py.fresh_deadline_config"] | The function (invalidate_boto3_session_cache) defined within the public class called public.The function start at line 108 and ends at 110. It contains 3 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 2.0 functions, It has 2.0 functions called inside which are ["_get_boto3_session_for_profile.cache_clear", "_get_queue_user_boto3_session.cache_clear"], It has 3.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_boto3_session", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.cli.conftest_py.fresh_deadline_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.conftest_py.fresh_deadline_config"]. | |
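The `invalidate_boto3_session_cache` row above simply calls `cache_clear()` on the memoized session factories, the standard way to invalidate a `functools.lru_cache`. A self-contained sketch of that caching-plus-invalidation pattern, using a counter and `object()` in place of a real `boto3.Session`:

```python
from functools import lru_cache

# counts how many times a "session" is actually constructed
calls = {"n": 0}

@lru_cache(maxsize=None)
def get_session_for_profile(profile_name):
    """Stand-in for the cached boto3.Session factory: repeated calls
    with the same profile name return the same cached object."""
    calls["n"] += 1
    return object()

def invalidate_session_cache():
    """Mirror of invalidate_boto3_session_cache: clear every memoized
    factory so the next call rebuilds a fresh session."""
    get_session_for_profile.cache_clear()
```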
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_default_client_config | def get_default_client_config(**kwargs) -> botocore.config.Config:"""Gets the default botocore Config object to use with `boto3 clients`.This method adds user agent version and submitter context into botocore calls.Additional arguments are forwarded to the Config constructor."""user_agent_extra = f"app/deadline-client#{version}"if session_context.get("submitter-name"):user_agent_extra += f" submitter/{session_context['submitter-name']}"if session_context.get("cli-command-name"):user_agent_extra += f" cli-command/{session_context['cli-command-name']}"client_config = botocore.config.Config(user_agent_extra=user_agent_extra, **kwargs)return client_config | 3 | 8 | 1 | 58 | 2 | 113 | 125 | 113 | **kwargs | ['client_config', 'user_agent_extra'] | botocore.config.Config | {"Assign": 2, "AugAssign": 2, "Expr": 1, "If": 2, "Return": 1} | 3 | 13 | 3 | ["session_context.get", "session_context.get", "botocore.config.Config"] | 4 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_session_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_requeue_tasks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_py.test_context_tracking_command_sets_boto_user_agent_extra"] | The function (get_default_client_config) is defined at module level. It starts at line 113 and ends at line 125. It contains 8 lines of code and has a cyclomatic complexity of 3.
It takes 1 parameter (**kwargs) and returns a value of type botocore.config.Config. It makes 3 function calls inside, which are ["session_context.get", "session_context.get", "botocore.config.Config"]. It has 4 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_session_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py.job_requeue_tasks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_py.test_context_tracking_command_sets_boto_user_agent_extra"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_session_client | def get_session_client(session: boto3.Session, service_name: str):"""Create and cache a boto3 client for the given session and service name.This function is decorated with @lru_cache to ensure that repeated callswith the same session and service name return the cached client to avoidrepeating initialization where possible.Args:session: The boto3 Session to use for creating the clientservice_name: The name of the AWS service (e.g., 's3', 'sts', 'ec2')Returns:A boto3 client for the specified service"""return session.client(service_name, config=get_default_client_config()) | 1 | 2 | 2 | 27 | 0 | 129 | 144 | 129 | session,service_name | [] | Returns | {"Expr": 1, "Return": 1} | 2 | 16 | 2 | ["session.client", "get_default_client_config"] | 7 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_jobs_by_filter_expression_py._list_jobs_by_filter_expression", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_boto3_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.sync_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._categorize_jobs_in_checkpoint", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._get_job_sessions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_session_py.test_get_session_client_caching"] | The function (get_session_client) is defined at module level. It starts at line 129 and ends at line 144. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (session, service_name) and returns a value. It makes 2 function calls inside, which are ["session.client", "get_default_client_config"]. It has 7 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_jobs_by_filter_expression_py._list_jobs_by_filter_expression", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_boto3_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.queue_group_py.sync_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._categorize_jobs_in_checkpoint", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._get_job_sessions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._incremental_download_py._incremental_output_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_session_py.test_get_session_client_caching"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_boto3_client | def get_boto3_client(service_name: str, config: Optional[ConfigParser] = None) -> BaseClient:"""Gets a client from the boto3 session returned by `get_boto3_session`.If the client requested is `deadline`, it uses the AWS_ENDPOINT_URL_DEADLINEdeadline endpoint url.Args:service_name (str): The AWS service to get the client for, e.g. "deadline".config (ConfigParser, optional): If provided, the AWS Deadline Cloud config to use."""session = get_boto3_session(config=config)return get_session_client(session=session, service_name=service_name) | 1 | 3 | 2 | 38 | 1 | 147 | 159 | 147 | service_name,config | ['session'] | BaseClient | {"Assign": 1, "Expr": 1, "Return": 1} | 2 | 13 | 2 | ["get_boto3_session", "get_session_client"] | 12 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api.__init___py.check_deadline_api_available", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._get_storage_profile_for_queue_py.get_storage_profile_for_queue", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_monitoring_py.get_session_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_monitoring_py.wait_for_job_completion", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_farms", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_fleets", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_queues", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_storage_profiles_for_queue", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._queue_parameters_py.get_queue_parameter_definitions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.precache_clients", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.get_deadline_endpoint_url"] | The function (get_boto3_client) is defined at module level. It starts at line 147 and ends at line 159. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (service_name, config) and returns a value of type BaseClient.
It declares 2.0 functions, It has 2.0 functions called inside which are ["get_boto3_session", "get_session_client"], It has 12.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api.__init___py.check_deadline_api_available", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._get_storage_profile_for_queue_py.get_storage_profile_for_queue", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_monitoring_py.get_session_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_monitoring_py.wait_for_job_completion", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_farms", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_fleets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_queues", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_storage_profiles_for_queue", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._queue_parameters_py.get_queue_parameter_definitions", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.precache_clients", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.get_deadline_endpoint_url"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_credentials_source | def get_credentials_source(config: Optional[ConfigParser] = None) -> AwsCredentialsSource:"""Returns DEADLINE_CLOUD_MONITOR_LOGIN if Deadline Cloud monitor wrote the credentials, HOST_PROVIDED otherwise.Args:config (ConfigParser, optional): The AWS Deadline Cloud configurationobject to use instead of the config file."""try:session = get_boto3_session(config=config)profile_config = session._session.get_scoped_config()except ProfileNotFound:return AwsCredentialsSource.NOT_VALIDif "monitor_id" in profile_config:# Deadline Cloud monitor Desktop adds the "monitor_id" keyreturn AwsCredentialsSource.DEADLINE_CLOUD_MONITOR_LOGINelse:return AwsCredentialsSource.HOST_PROVIDED | 3 | 10 | 1 | 56 | 2 | 162 | 179 | 162 | config | ['profile_config', 'session'] | AwsCredentialsSource | {"Assign": 2, "Expr": 1, "If": 1, "Return": 3, "Try": 1} | 2 | 18 | 2 | ["get_boto3_session", "session._session.get_scoped_config"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._loginout_py.login", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._loginout_py.logout", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.check_authentication_status"] | The function (get_credentials_source) is defined at module level. It starts at line 162 and ends at line 179. It contains 10 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (config) and returns a value of type AwsCredentialsSource.
It makes 2 function calls inside, which are ["get_boto3_session", "session._session.get_scoped_config"]. It has 3 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._loginout_py.login", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._loginout_py.logout", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.check_authentication_status"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_user_and_identity_store_id | def get_user_and_identity_store_id(config: Optional[ConfigParser] = None,) -> tuple[Optional[str], Optional[str]]:"""If logged in with Deadline Cloud monitor Desktop, returns a tuple(user_id, identity_store_id), otherwise returns None."""session = get_boto3_session(config=config)profile_config = session._session.get_scoped_config()if "monitor_id" in profile_config:return (profile_config["user_id"], profile_config["identity_store_id"])else:return None, None | 2 | 9 | 1 | 67 | 2 | 182 | 195 | 182 | config | ['profile_config', 'session'] | tuple[Optional[str], Optional[str]] | {"Assign": 2, "Expr": 1, "If": 1, "Return": 2} | 2 | 14 | 2 | ["get_boto3_session", "session._session.get_scoped_config"] | 7 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api.__init___py.check_deadline_api_available", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_monitoring_py.get_session_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_farms", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_fleets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_queues", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.TelemetryClient.initialize"] | The function 
(get_user_and_identity_store_id) is defined at module level. It starts at line 182 and ends at line 195. It contains 9 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (config) and returns a value of type tuple[Optional[str], Optional[str]]. It makes 2 function calls inside, which are ["get_boto3_session", "session._session.get_scoped_config"]. It has 7 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api.__init___py.check_deadline_api_available", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_monitoring_py.get_session_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_farms", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_fleets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._list_apis_py.list_queues", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.TelemetryClient.initialize"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_monitor_id | def get_monitor_id(config: Optional[ConfigParser] = None,) -> Optional[str]:"""If logged in with Deadline Cloud Monitor to a Deadline Monitor, returns Monitor Id, otherwise returns None."""session = get_boto3_session(config=config)profile_config = session._session.get_scoped_config()return profile_config.get("monitor_id", None) | 1 | 6 | 1 | 45 | 2 | 198 | 207 | 198 | config | ['profile_config', 'session'] | Optional[str] | {"Assign": 2, "Expr": 1, "Return": 1} | 3 | 10 | 3 | ["get_boto3_session", "session._session.get_scoped_config", "profile_config.get"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.TelemetryClient.initialize"] | The function (get_monitor_id) is defined at module level. It starts at line 198 and ends at line 207. It contains 6 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (config) and returns a value of type Optional[str]. It makes 3 function calls inside, which are ["get_boto3_session", "session._session.get_scoped_config", "profile_config.get"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._telemetry_py.TelemetryClient.initialize"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_queue_user_boto3_session | def get_queue_user_boto3_session(deadline: BaseClient,config: Optional[ConfigParser] = None,farm_id: Optional[str] = None,queue_id: Optional[str] = None,queue_display_name: Optional[str] = None,force_refresh: bool = False,) -> boto3.Session:"""Calls the AssumeQueueRoleForUser API to obtain the role configured in a Queue,and then creates and returns a boto3 session with those credentials.Args:deadline (BaseClient): A Deadline client.config (ConfigParser, optional): If provided, the AWS Deadline Cloud config to use.farm_id (str, optional): The ID of the farm to use.queue_id (str, optional): The ID of the queue to use.queue_display_name (str, optional): The display name of the queue.force_refresh (bool, optional): If True, forces a cache refresh."""base_session = get_boto3_session(config=config, force_refresh=force_refresh)if farm_id is None:farm_id = get_setting("defaults.farm_id")if queue_id is None:queue_id = get_setting("defaults.queue_id")return _get_queue_user_boto3_session(deadline, base_session, farm_id, queue_id, queue_display_name) | 3 | 16 | 6 | 102 | 3 | 210 | 240 | 210 | deadline,config,farm_id,queue_id,queue_display_name,force_refresh | ['farm_id', 'base_session', 'queue_id'] | boto3.Session | {"Assign": 3, "Expr": 1, "If": 2, "Return": 1} | 4 | 31 | 4 | ["get_boto3_session", "get_setting", "get_setting", "_get_queue_user_boto3_session"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_monitoring_py.get_session_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.precache_clients"] | The function (get_queue_user_boto3_session) defined within the public class called public.The function start at line 210 and ends at 240. 
It contains 16 lines of code and has a cyclomatic complexity of 3. It takes 6 parameters (deadline, config, farm_id, queue_id, queue_display_name, force_refresh) and returns a value of type boto3.Session. It makes 4 function calls inside, which are ["get_boto3_session", "get_setting", "get_setting", "_get_queue_user_boto3_session"]. It has 2 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._job_monitoring_py.get_session_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.precache_clients"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_queue_user_boto3_session | def _get_queue_user_boto3_session(deadline: BaseClient,base_session: boto3.Session,farm_id: str,queue_id: str,queue_display_name: Optional[str] = None,):queue_credential_provider = QueueUserCredentialProvider(deadline,farm_id,queue_id,queue_display_name,)botocore_session = get_botocore_session()credential_provider = botocore_session.get_component("credential_provider")credential_provider.insert_before("env", queue_credential_provider)aws_profile_name: Optional[str] = Noneif base_session.profile_name != "default":aws_profile_name = base_session.profile_namereturn boto3.Session(botocore_session=botocore_session,profile_name=aws_profile_name,region_name=base_session.region_name,) | 2 | 24 | 5 | 105 | 4 | 244 | 269 | 244 | deadline,base_session,farm_id,queue_id,queue_display_name | ['credential_provider', 'botocore_session', 'queue_credential_provider', 'aws_profile_name'] | Returns | {"AnnAssign": 1, "Assign": 4, "Expr": 1, "If": 1, "Return": 1} | 5 | 26 | 5 | ["QueueUserCredentialProvider", "get_botocore_session", "botocore_session.get_component", "credential_provider.insert_before", "boto3.Session"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_queue_user_boto3_session", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download"] | The function (_get_queue_user_boto3_session) is defined at module level. It starts at line 244 and ends at line 269. It contains 24 lines of code and has a cyclomatic complexity of 2. It takes 5 parameters (deadline, base_session, farm_id, queue_id, queue_display_name) and returns a value.
It makes 5 function calls inside, which are ["QueueUserCredentialProvider", "get_botocore_session", "botocore_session.get_component", "credential_provider.insert_before", "boto3.Session"]. It has 2 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.get_queue_user_boto3_session", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _modified_logging_level | def _modified_logging_level(logger, level):old_level = logger.getEffectiveLevel()logger.setLevel(level)try:yieldfinally:logger.setLevel(old_level) | 2 | 7 | 2 | 31 | 1 | 273 | 279 | 273 | logger,level | ['old_level'] | None | {"Assign": 1, "Expr": 3, "Try": 1} | 3 | 7 | 3 | ["logger.getEffectiveLevel", "logger.setLevel", "logger.setLevel"] | 4 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api.__init___py.check_deadline_api_available", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.check_authentication_status", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.auth_group_py.auth_status", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py._download_job_output"] | The function (_modified_logging_level) is defined at module level. It starts at line 273 and ends at line 279. It contains 7 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (logger, level) and does not return a value.
It makes 3 function calls inside, which are ["logger.getEffectiveLevel", "logger.setLevel", "logger.setLevel"]. It has 4 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api.__init___py.check_deadline_api_available", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._session_py.check_authentication_status", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.auth_group_py.auth_status", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py._download_job_output"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | check_authentication_status | def check_authentication_status(config: Optional[ConfigParser] = None) -> AwsAuthenticationStatus:"""Checks the status of the provided session, bycalling the sts::GetCallerIdentity API.Args:config (ConfigParser, optional): The AWS Deadline Cloud configurationobject to use instead of the config file.Returns AwsAuthenticationStatus enum value:- CONFIGURATION_ERROR if there is an unexpected error accessing credentials- AUTHENTICATED if they are fine- NEEDS_LOGIN if a Deadline Cloud monitor login is required."""with _modified_logging_level(logging.getLogger("botocore.credentials"), logging.ERROR):try:get_boto3_session(config=config).client("sts").get_caller_identity()return AwsAuthenticationStatus.AUTHENTICATEDexcept Exception:# We assume that the presence of a Deadline Cloud monitor profile# means we will know everything necessary to start it and login.if get_credentials_source(config) == AwsCredentialsSource.DEADLINE_CLOUD_MONITOR_LOGIN:return AwsAuthenticationStatus.NEEDS_LOGINreturn AwsAuthenticationStatus.CONFIGURATION_ERROR | 3 | 9 | 1 | 72 | 0 | 282 | 307 | 282 | config | [] | AwsAuthenticationStatus | {"Expr": 2, "If": 1, "Return": 3, "Try": 1, "With": 1} | 6 | 26 | 6 | ["_modified_logging_level", "logging.getLogger", "get_caller_identity", "client", "get_boto3_session", "get_credentials_source"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._loginout_py._login_deadline_cloud_monitor"] | The function (check_authentication_status) is defined at module level. It starts at line 282 and ends at line 307. It contains 9 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (config) and returns a value of type AwsAuthenticationStatus.
It makes 6 function calls inside, which are ["_modified_logging_level", "logging.getLogger", "get_caller_identity", "client", "get_boto3_session", "get_credentials_source"]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._loginout_py._login_deadline_cloud_monitor"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | precache_clients | def precache_clients(deadline: BaseClient = None,config: Optional[ConfigParser] = None,farm_id: Optional[str] = None,queue_id: Optional[str] = None,queue_display_name: Optional[str] = None,) -> Tuple[BaseClient, BaseClient]:"""Initialize an S3 client (and optionally a Deadline client) with queue user credentialsto pre-warm the client cache.This function creates an S3 client using queue user credentials, which triggersthe expensive service discovery process once. Subsequent client creations usingthe same session object should then use the cached client, improving performance.This function is designed to be called in a background thread at application startup.Args:deadline: An existing deadline client. If None, one will be created.config: Optional configuration parser. If None, the default configuration will be used.farm_id: The farm ID. If None, it will be retrieved from settings.queue_id: The queue ID. If None, it will be retrieved from settings.queue_display_name: The queue display name. 
If None, it will be retrieved from the queue.Returns:Created (or current) s3 client for the given queue_role_sessionExample:# Fire and forget initialization in a background threadimport threadingthreading.Thread(target=initialize_queue_user_s3_client,daemon=True,name="S3ClientInit").start()"""if not deadline:deadline = get_boto3_client("deadline", config=config)if not queue_id:queue_id = get_setting("defaults.queue_id", config=config)if not farm_id:farm_id = get_setting("defaults.farm_id", config=config)if not queue_display_name:queue = deadline.get_queue(farmId=farm_id,queueId=queue_id,)queue_display_name = queue["displayName"]queue_role_session = get_queue_user_boto3_session(deadline=deadline,config=config,farm_id=farm_id,queue_id=queue_id,queue_display_name=queue_display_name,)# Initialize the S3 client to populate the cachereturn deadline, get_s3_client(queue_role_session) | 5 | 27 | 5 | 153 | 6 | 310 | 367 | 310 | deadline,config,farm_id,queue_id,queue_display_name | ['queue', 'farm_id', 'deadline', 'queue_display_name', 'queue_role_session', 'queue_id'] | Tuple[BaseClient, BaseClient] | {"Assign": 6, "Expr": 1, "If": 4, "Return": 1} | 6 | 58 | 6 | ["get_boto3_client", "get_setting", "get_setting", "deadline.get_queue", "get_queue_user_boto3_session", "get_s3_client"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_session_py.test_precache_clients", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_session_py.test_precache_clients_warms_asset_uploader_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_session_py.test_precache_clients_with_params"] | The function (precache_clients) defined within the public class called public.The function start 
at line 310 and ends at line 367. It contains 27 lines of code and has a cyclomatic complexity of 5. It takes 5 parameters (deadline, config, farm_id, queue_id, queue_display_name) and returns a value of type Tuple[BaseClient, BaseClient]. It makes 6 function calls inside, which are ["get_boto3_client", "get_setting", "get_setting", "deadline.get_queue", "get_queue_user_boto3_session", "get_s3_client"]. It has 3 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_session_py.test_precache_clients", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_session_py.test_precache_clients_warms_asset_uploader_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.api.test_api_session_py.test_precache_clients_with_params"]. |
aws-deadline_deadline-cloud | QueueUserCredentialProvider | public | 0 | 1 | __init__ | def __init__(self,deadline: BaseClient,farm_id: str,queue_id: str,queue_display_name: Optional[str] = None,):self.deadline = deadlineself.farm_id = farm_idself.queue_id = queue_idself.queue_display_name_or_id = queue_display_name or queue_id | 2 | 11 | 5 | 49 | 0 | 389 | 399 | 389 | self,deadline,farm_id,queue_id,queue_display_name | [] | None | {"Assign": 4} | 0 | 11 | 0 | [] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) is defined within the public class QueueUserCredentialProvider, which inherits from another class. The function starts at line 389 and ends at 399. It contains 11 lines of code and has a cyclomatic complexity of 2. It takes 5 parameters and does not return any value. 
It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | QueueUserCredentialProvider | public | 0 | 1 | load | def load(self):credentials = self._get_queue_credentials()return RefreshableCredentials.create_from_metadata(metadata=credentials,refresh_using=self._get_queue_credentials,method=self.METHOD,) | 1 | 7 | 1 | 34 | 0 | 401 | 407 | 401 | self | [] | Returns | {"Assign": 1, "Return": 1} | 2 | 7 | 2 | ["self._get_queue_credentials", "RefreshableCredentials.create_from_metadata"] | 149 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3550345_napalm_automation_napalm_junos.napalm_junos.utils.junos_views_py._loadyaml_bypass", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_options_py.TestInflection.test_load_bulk_id_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestCompoundDocuments.test_include_data_load", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestCompoundDocuments.test_include_data_load_null", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestCompoundDocuments.test_include_data_load_without_schema_loads_only_ids", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestErrorFormatting.test_errors_in_strict_mode", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestErrorFormatting.test_load", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestErrorFormatting.test_no_type_raises_error", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestErrorFormatting.test_validate_no_data_raises_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestMeta.test_load_many", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestMeta.test_load_single", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestRelationshipLoading.test_deserializing_missing_required_relationship", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestRelationshipLoading.test_deserializing_nested_relationship_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestRelationshipLoading.test_deserializing_relationship_errors", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestRelationshipLoading.test_deserializing_relationship_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3960208_marshmallow_code_marshmallow_jsonapi.tests.test_schema_py.TestRelationshipLoading.test_deserializing_relationship_with_missing_param", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964858_zephyrproject_rtos_west.src.west.configuration_py.Configuration._copy_to_configparser", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967064_executablebooks_myst_parser.myst_parser.inventory_py.fetch_inventory", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967064_executablebooks_myst_parser.myst_parser.inventory_py.inventory_cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967064_executablebooks_myst_parser.tests.test_inventory_py.test_convert_roundtrip", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967064_executablebooks_myst_parser.tests.test_inventory_py.test_inv_filter", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967064_executablebooks_myst_parser.tests.test_inventory_py.test_inv_filter_wildcard", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3985002_pypa_bandersnatch.src.bandersnatch.delete_py.delete_packages", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.cern_monit3_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.cern_monit_py.run"] | The function (load) is defined within the public class QueueUserCredentialProvider, which inherits from another class. The function starts at line 401 and ends at 407. It contains 7 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value. 
It calls 2 functions: ["self._get_queue_credentials", "RefreshableCredentials.create_from_metadata"]. It is called by 149 functions; a partial list of them appears in the incoming_function_names column of this row. |
aws-deadline_deadline-cloud | QueueUserCredentialProvider | public | 0 | 1 | _get_queue_credentials | def _get_queue_credentials(self):"""Fetches or refreshes the credentials using the AssumeQueueRoleForUser APIfor the specified Farm ID and Queue ID."""try:queue_credentials = self.deadline.assume_queue_role_for_user(farmId=self.farm_id, queueId=self.queue_id).get("credentials", None)except ClientError as exc:code = exc.response.get("Error", {}).get("Code", None)if code == "ThrottlingException":raise DeadlineOperationError(f"Throttled while attempting to assume Queue role for user on Queue '{self.queue_display_name_or_id}': {exc}\n""Please retry the operation later, or contact your administrator to increase the API's rate limit.") from excelif code == "InternalServerException":raise DeadlineOperationError(f"An internal server error occurred while attempting to assume Queue role for user on "f"Queue '{self.queue_display_name_or_id}': {exc}\n") from excelse:raise DeadlineOperationError(f"Failed to assume Queue role for user on Queue '{self.queue_display_name_or_id}': {exc}\nPlease contact your ""administrator to ensure a Queue role exists and that you have permissions to access this Queue.") from excif not queue_credentials:raise DeadlineOperationError(f"Failed to get credentials for '{self.queue_display_name_or_id}': Empty credentials received.")return {"access_key": queue_credentials["accessKeyId"],"secret_key": queue_credentials["secretAccessKey"],"token": queue_credentials["sessionToken"],"expiry_time": queue_credentials["expiration"].isoformat(),} | 5 | 32 | 1 | 145 | 0 | 409 | 444 | 409 | self | [] | Returns | {"Assign": 2, "Expr": 1, "If": 3, "Return": 1, "Try": 1} | 9 | 36 | 9 | ["get", "self.deadline.assume_queue_role_for_user", "get", "exc.response.get", "DeadlineOperationError", "DeadlineOperationError", "DeadlineOperationError", "DeadlineOperationError", "isoformat"] | 0 | [] | The function (_get_queue_credentials) defined within the public class called 
QueueUserCredentialProvider, which inherits from another class. The function starts at line 409 and ends at 444. It contains 32 lines of code and has a cyclomatic complexity of 5. It takes 1 parameter (self) and returns a value. It calls 9 functions: ["get", "self.deadline.assume_queue_role_for_user", "get", "exc.response.get", "DeadlineOperationError", "DeadlineOperationError", "DeadlineOperationError", "DeadlineOperationError", "isoformat"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _summarize_asset_paths | def _summarize_asset_paths(input_paths: Collection[Path | str], output_paths: Collection[Path | str], name: str) -> list[str]:result = []if input_paths:result.append(f"{name} for upload:\n")result.append(textwrap.indent(summarize_path_list(input_paths), ""))if output_paths:result.append(f"{name} to collect job outputs for download:\n")# We expect the list of output paths to be small, but truncate it to an arbitrary limit just in case for the summarysummary_entry_count = 4if len(output_paths) == summary_entry_count + 1:summary_entry_count += 1result.extend(f"{path}\n" for path in sorted(output_paths)[:summary_entry_count])if len(output_paths) > summary_entry_count:result.append(f"\n... and {len(output_paths) - summary_entry_count} more\n")return result | 6 | 16 | 3 | 121 | 2 | 64 | 80 | 64 | input_paths,output_paths,name | ['summary_entry_count', 'result'] | list[str] | {"Assign": 2, "AugAssign": 1, "Expr": 5, "If": 4, "Return": 1} | 11 | 17 | 11 | ["result.append", "result.append", "textwrap.indent", "summarize_path_list", "result.append", "len", "result.extend", "sorted", "len", "result.append", "len"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py._generate_message_for_asset_paths"] | The function (_summarize_asset_paths) is defined at module level. The function starts at line 64 and ends at 80. It contains 16 lines of code and has a cyclomatic complexity of 6. It takes 3 parameters and returns a value. 
It calls 11 functions: ["result.append", "result.append", "textwrap.indent", "summarize_path_list", "result.append", "len", "result.extend", "sorted", "len", "result.append", "len"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py._generate_message_for_asset_paths"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _generate_message_for_asset_paths | def _generate_message_for_asset_paths(upload_group: AssetUploadGroup,storage_profile: Optional[StorageProfile],known_asset_paths: Iterable[str],) -> tuple[str, bool]:"""Generate a message about asset uploads and along with a flag indicating if there are warnings."""# Collect all the input and output pathsall_input_paths: set[Path | str] = set()all_output_paths: set[Path | str] = set()for group in upload_group.asset_groups:all_input_paths.update(path for path in group.inputs)all_output_paths.update(path for path in group.outputs)# Filter to get the unknown pathsif known_asset_paths:known_path_regex = re.compile(f"{'|'.join(re.escape(path) for path in known_asset_paths)}.*")unknown_input_paths = {path for path in all_input_paths if not known_path_regex.match(str(path))}unknown_output_paths = {path for path in all_output_paths if not known_path_regex.match(str(path))}else:unknown_input_paths = all_input_pathsunknown_output_paths = all_output_pathsunknown_path_warnings = _summarize_asset_paths(unknown_input_paths, unknown_output_paths, "Unknown locations")warning_messages = []default_prompt_response = not unknown_path_warningsif unknown_path_warnings:warning_messages.append("\nWARNING: Files were specified outside of known asset paths.\n\n")warning_messages.extend([f"Job submission contains {upload_group.total_input_files} input files "f"totaling {human_readable_file_size(upload_group.total_input_bytes)}. 
""All input files will be uploaded to S3 if they are not already present in the job attachments bucket.\n\n"])warning_messages.extend(_summarize_asset_paths(all_input_paths, all_output_paths, "Locations"))if unknown_path_warnings:warning_messages.append("\n---\n\n")warning_messages.append("The list of known asset prefixes for this submission are:\n")if known_asset_paths:warning_messages.extend(f"{path}\n" for path in sorted(set(known_asset_paths)))else:warning_messages.append("(empty list)\n")warning_messages.append("\n")warning_messages.extend(unknown_path_warnings)warning_messages.append("\nTo enable submission without user input, add directory locations containing the unknown paths to either \n"+ "1. The list of known asset paths in the local Deadline Cloud configuration. \n")if storage_profile:warning_messages.append(f"2. The Storage Profile '{storage_profile.displayName}' as LOCAL file system locations, from the AWS Deadline Cloud management console.\n")else:warning_messages.append("2. 
In a Storage Profile as LOCAL file system locations created from the AWS Deadline Cloud management console, and then configured on your workstation.\n")return "".join(warning_messages), default_prompt_response | 14 | 60 | 3 | 287 | 6 | 83 | 151 | 83 | upload_group,storage_profile,known_asset_paths | ['unknown_path_warnings', 'warning_messages', 'unknown_input_paths', 'unknown_output_paths', 'default_prompt_response', 'known_path_regex'] | tuple[str, bool] | {"AnnAssign": 2, "Assign": 8, "Expr": 15, "For": 1, "If": 5, "Return": 1} | 29 | 69 | 29 | ["set", "set", "all_input_paths.update", "all_output_paths.update", "re.compile", "join", "re.escape", "known_path_regex.match", "str", "known_path_regex.match", "str", "_summarize_asset_paths", "warning_messages.append", "warning_messages.extend", "human_readable_file_size", "warning_messages.extend", "_summarize_asset_paths", "warning_messages.append", "warning_messages.append", "warning_messages.extend", "sorted", "set", "warning_messages.append", "warning_messages.append", "warning_messages.extend", "warning_messages.append", "warning_messages.append", "warning_messages.append", "join"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"] | The function (_generate_message_for_asset_paths) is defined at module level. The function starts at line 83 and ends at 151. It contains 60 lines of code and has a cyclomatic complexity of 14. It takes 3 parameters and returns a value. 
It calls 29 functions; the full list appears in the outgoing_function_names column of this row. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _upload_attachments._default_update_upload_progress | def _default_update_upload_progress(upload_metadata: ProgressReportMetadata) -> bool:return True | 1 | 2 | 1 | 11 | 0 | 168 | 169 | 168 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_upload_attachments._default_update_upload_progress) is nested inside _upload_attachments. The function starts at line 168 and ends at 169. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter and returns a value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _upload_attachments | def _upload_attachments(asset_manager: S3AssetManager,manifests: List[AssetRootManifest],print_function_callback: Callable,upload_progress_callback: Optional[Callable],config: Optional[ConfigParser] = None,from_gui: bool = False,) -> Dict[str, Any]:"""Starts the job attachments upload and handles the progress reporting callback.Returns the attachment settings from the upload."""def _default_update_upload_progress(upload_metadata: ProgressReportMetadata) -> bool:return Trueif not upload_progress_callback:upload_progress_callback = _default_update_upload_progressupload_summary, attachment_settings = asset_manager.upload_assets(manifests=manifests,on_uploading_assets=upload_progress_callback,s3_check_cache_dir=config_file.get_cache_directory(),)api.get_deadline_cloud_library_telemetry_client(config=config).record_upload_summary(upload_summary,from_gui=from_gui,)if upload_summary.total_files > 0:print_function_callback("Upload Summary:")print_function_callback(textwrap.indent(str(upload_summary), ""))else:# Ensure to call the callback once if no files were processedupload_progress_callback(ProgressReportMetadata(status=ProgressStatus.UPLOAD_IN_PROGRESS,progress=100,transferRate=0,progressMessage="No files to upload",))return attachment_settings.to_dict() | 3 | 33 | 6 | 158 | 1 | 155 | 198 | 155 | asset_manager,manifests,print_function_callback,upload_progress_callback,config,from_gui | ['upload_progress_callback'] | Dict[str, Any] | {"Assign": 2, "Expr": 5, "If": 2, "Return": 2} | 12 | 44 | 12 | ["asset_manager.upload_assets", "config_file.get_cache_directory", "record_upload_summary", "api.get_deadline_cloud_library_telemetry_client", "print_function_callback", "print_function_callback", "textwrap.indent", "str", "upload_progress_callback", "ProgressReportMetadata", "attachment_settings.to_dict", "api.record_success_fail_telemetry_event"] | 1 | 
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"] | The function (_upload_attachments) is defined at module level. The function starts at line 155 and ends at 198. It contains 33 lines of code and has a cyclomatic complexity of 3. It takes 6 parameters and returns a value. It calls 12 functions: ["asset_manager.upload_assets", "config_file.get_cache_directory", "record_upload_summary", "api.get_deadline_cloud_library_telemetry_client", "print_function_callback", "print_function_callback", "textwrap.indent", "str", "upload_progress_callback", "ProgressReportMetadata", "attachment_settings.to_dict", "api.record_success_fail_telemetry_event"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _snapshot_attachments._default_update_snapshot_progress | def _default_update_snapshot_progress(upload_metadata: ProgressReportMetadata) -> bool:return True | 1 | 2 | 1 | 11 | 0 | 216 | 217 | 216 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_snapshot_attachments._default_update_snapshot_progress) is nested inside _snapshot_attachments. The function starts at line 216 and ends at 217. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter and returns a value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _snapshot_attachments | def _snapshot_attachments(snapshot_dir: str,asset_manager: S3AssetManager,manifests: List[AssetRootManifest],print_function_callback: Callable,snapshot_progress_callback: Optional[Callable],config: Optional[ConfigParser] = None,from_gui: bool = False,) -> Dict[str, Any]:"""Starts the job attachments upload and handles the progress reporting callback.Returns the attachment settings from the upload."""def _default_update_snapshot_progress(upload_metadata: ProgressReportMetadata) -> bool:return Trueif not snapshot_progress_callback:snapshot_progress_callback = _default_update_snapshot_progressupload_summary, attachment_settings = asset_manager.snapshot_assets(snapshot_dir=snapshot_dir,manifests=manifests,on_snapshotting_assets=snapshot_progress_callback,)if upload_summary.total_files > 0:print_function_callback("Snapshot Summary:")print_function_callback(textwrap.indent(str(upload_summary), ""))else:# Ensure to call the callback once if no files were processedsnapshot_progress_callback(ProgressReportMetadata(status=ProgressStatus.UPLOAD_IN_PROGRESS,progress=100,transferRate=0,progressMessage="No files to upload",))return attachment_settings.to_dict() | 3 | 30 | 7 | 140 | 1 | 202 | 242 | 202 | snapshot_dir,asset_manager,manifests,print_function_callback,snapshot_progress_callback,config,from_gui | ['snapshot_progress_callback'] | Dict[str, Any] | {"Assign": 2, "Expr": 4, "If": 2, "Return": 2} | 9 | 41 | 9 | ["asset_manager.snapshot_assets", "print_function_callback", "print_function_callback", "textwrap.indent", "str", "snapshot_progress_callback", "ProgressReportMetadata", "attachment_settings.to_dict", "api.record_success_fail_telemetry_event"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"] | The function (_snapshot_attachments) defined within 
the public class called public. The function starts at line 202 and ends at 242. It contains 30 lines of code and has a cyclomatic complexity of 3. It takes 7 parameters and returns a value of type Dict[str, Any]. It makes 9 function calls inside its body, namely ["asset_manager.snapshot_assets", "print_function_callback", "print_function_callback", "textwrap.indent", "str", "snapshot_progress_callback", "ProgressReportMetadata", "attachment_settings.to_dict", "api.record_success_fail_telemetry_event"], and it has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _filter_redundant_known_paths | def _filter_redundant_known_paths(known_asset_paths: Iterable[str]) -> list[str]:"""Filters out redundant paths from the known asset paths list.This algorithm identifies any paths that have a different path as a prefix,and removes them from the list. Pseudo-code is:1. Sort the paths from shortest to longest, so any prefix of a path has to happen before that path.2. For each path, split it into parts (i.e. '/mnt/prod/project' becomes ['/', 'mnt', 'prod', 'project']), and then insert it part by part into a nested dict called dir_tree organized as a TRIE. The value True in the TRIE indicates that a path with that as its final part is in the list.3. While inserting a path into the TRIE, detect whether another path already had a prefix of the parts, and filter out the path when that occurs."""# This directory tree gets filled with the known asset paths, with# a True value as a marker for the last part of already seen paths.dir_tree: dict[str, Any] = {}filtered_paths: list[str] = []# Process the paths from shortest to longest, so that prefixes are always seen firstfor path in sorted(known_asset_paths, key=len):parts = Path(path).partscurrent: Optional[dict[str, Any]] = dir_treefor part in parts[:-1]:# If we see a True value, another path is a prefix so we can skip it.if current.get(part) is True:# type: ignorecurrent = Nonebreakcurrent = current.setdefault(part, {})# type: ignore# If we didn't find a prefix or equal path, add this one and mark it in dir_treeif current is not None and current.get(parts[-1]) is not True:filtered_paths.append(path)current[parts[-1]] = Truereturn filtered_paths | 6 | 15 | 1 | 142 | 2 | 245 | 279 | 245 | known_asset_paths | ['current', 'parts'] | list[str] | {"AnnAssign": 3, "Assign": 4, "Expr": 2, "For": 2, "If": 2, "Return": 1} | 6 | 35 | 6 | ["sorted", "Path", "current.get", "current.setdefault", "current.get", "filtered_paths.append"] | 2 | 
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_bundle_submit_known_paths_py.test_filter_redundant_known_paths"] | The function (_filter_redundant_known_paths) defined within the public class called public.The function start at line 245 and ends at 279. It contains 15 lines of code and it has a cyclomatic complexity of 6. The function does not take any parameters and does not return any value. It declares 6.0 functions, It has 6.0 functions called inside which are ["sorted", "Path", "current.get", "current.setdefault", "current.get", "filtered_paths.append"], It has 2.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.api._submit_job_bundle_py.create_job_from_job_bundle", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_bundle_submit_known_paths_py.test_filter_redundant_known_paths"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _save_debug_snapshot.write_commands | def write_commands(write_line: Callable, continuation: str):if "attachments" in create_job_args:for subdir in ("Data", "Manifests"):write_line(f"aws s3 cp {continuation}")write_line(f"--recursive {continuation}")write_line(f"./{subdir} {continuation}")write_line(f"s3://{asset_manager.job_attachment_settings.s3BucketName}/{asset_manager.job_attachment_settings.rootPrefix}/{subdir}"# type: ignore)write_line()write_line(f"aws deadline create-job {continuation}")for param_opts in cli_args[:-1]:write_line(f"{shlex.join(param_opts)} {continuation}")write_line(f"{shlex.join(cli_args[-1])}") | 4 | 14 | 2 | 73 | 0 | 314 | 327 | 314 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_save_debug_snapshot.write_commands) is defined within the public class called public. The function starts at line 314 and ends at 327. It contains 14 lines of code and has a cyclomatic complexity of 4. It takes 2 parameters and does not return any value. |