project_name string | class_name string | class_modifiers string | class_implements int64 | class_extends int64 | function_name string | function_body string | cyclomatic_complexity int64 | NLOC int64 | num_parameter int64 | num_token int64 | num_variable int64 | start_line int64 | end_line int64 | function_index int64 | function_params string | function_variable string | function_return_type string | function_body_line_type string | function_num_functions int64 | function_num_lines int64 | outgoing_function_count int64 | outgoing_function_names string | incoming_function_count int64 | incoming_function_names string | lexical_representation string |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | _upload_output_manifest_to_s3 | def _upload_output_manifest_to_s3(self,s3_settings: JobAttachmentS3Settings,output_manifest: BaseAssetManifest,full_output_prefix: str,root_path: str,file_system_location_name: Optional[str] = None,) -> None:"""Uploads the given output manifest to the given S3 bucket."""hash_alg = output_manifest.get_default_hash_alg()manifest_bytes = output_manifest.encode().encode("utf-8")manifest_name_prefix = hash_data(f"{file_system_location_name or ''}{root_path}".encode(), hash_alg)manifest_path = _join_s3_paths(full_output_prefix,f"{manifest_name_prefix}_output",)metadata = {"Metadata": {"asset-root": json.dumps(root_path, ensure_ascii=True)}}# S3 metadata must be ASCII, so use either 'asset-root' or 'asset-root-json' depending# on whether the value is ASCII.try:# Add the 'asset-root' metadata if the path is ASCIIroot_path.encode(encoding="ascii")metadata["Metadata"]["asset-root"] = root_pathexcept UnicodeEncodeError:# Add the 'asset-root-json' metadata encoded to ASCII as a JSON stringmetadata["Metadata"]["asset-root-json"] = json.dumps(root_path, ensure_ascii=True)if file_system_location_name:metadata["Metadata"]["file-system-location-name"] = file_system_location_nameself.logger.info(f"Uploading output manifest to {manifest_path}")self.s3_uploader.upload_bytes_to_s3(BytesIO(manifest_bytes),s3_settings.s3BucketName,manifest_path,extra_args=metadata,) | 3 | 32 | 6 | 179 | 0 | 525 | 563 | 525 | self,s3_settings,output_manifest,full_output_prefix,root_path,file_system_location_name | [] | None | {"Assign": 8, "Expr": 4, "If": 1, "Try": 1} | 12 | 39 | 12 | ["output_manifest.get_default_hash_alg", "encode", "output_manifest.encode", "hash_data", "encode", "_join_s3_paths", "json.dumps", "root_path.encode", "json.dumps", "self.logger.info", "self.s3_uploader.upload_bytes_to_s3", "BytesIO"] | 0 | [] | The function (_upload_output_manifest_to_s3) defined within the public class called 
AssetSync. The function starts at line 525 and ends at line 563. It contains 32 lines of code and has a cyclomatic complexity of 3. It takes 6 parameters (self, s3_settings, output_manifest, full_output_prefix, root_path, file_system_location_name) and does not return any value. It calls 12 functions inside its body: ["output_manifest.get_default_hash_alg", "encode", "output_manifest.encode", "hash_data", "encode", "_join_s3_paths", "json.dumps", "root_path.encode", "json.dumps", "self.logger.info", "self.s3_uploader.upload_bytes_to_s3", "BytesIO"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | _generate_output_manifest | def _generate_output_manifest(self, outputs: List[OutputFile]) -> BaseAssetManifest:paths: list[RelativeFilePath] = []for output in outputs:path_args: dict[str, Any] = {"hash": output.file_hash,"path": output.rel_path,}path_args["size"] = output.file_size# stat().st_mtime_ns returns an int that represents the time in nanoseconds since the epoch.# The asset manifest spec requires the mtime to be represented as an integer in microseconds.path_args["mtime"] = trunc(Path(output.full_path).stat().st_mtime_ns // 1000)paths.append(self.manifest_model.Path(**path_args))asset_manifest_args: dict[str, Any] = {"paths": paths,"hash_alg": self.hash_alg,}asset_manifest_args["total_size"] = sum([output.file_size for output in outputs])return self.manifest_model.AssetManifest(**asset_manifest_args)# type: ignore[call-arg] | 3 | 16 | 2 | 141 | 0 | 565 | 584 | 565 | self,outputs | [] | BaseAssetManifest | {"AnnAssign": 3, "Assign": 3, "Expr": 1, "For": 1, "Return": 1} | 7 | 20 | 7 | ["trunc", "stat", "Path", "paths.append", "self.manifest_model.Path", "sum", "self.manifest_model.AssetManifest"] | 0 | [] | The function (_generate_output_manifest) is defined within the public class AssetSync. It starts at line 565 and ends at line 584. It contains 16 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters (self, outputs) and returns a BaseAssetManifest. It calls 7 functions inside its body: ["trunc", "stat", "Path", "paths.append", "self.manifest_model.Path", "sum", "self.manifest_model.AssetManifest"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | _get_output_files | def _get_output_files(self,manifest_properties: ManifestProperties,s3_settings: JobAttachmentS3Settings,local_root: Path,session_dir: Path,) -> List[OutputFile]:"""Walks the output directories for this asset root for any output files that have been created or modifiedsince the start time provided. Hashes and checks if the output files already exist in the CAS."""output_files: List[OutputFile] = []source_path_format = manifest_properties.rootPathFormatcurrent_path_format = PathFormat.get_host_path_format()for output_dir in manifest_properties.outputRelativeDirectories or []:if source_path_format != current_path_format:if source_path_format == PathFormat.WINDOWS:output_dir = output_dir.replace("\\", "/")elif source_path_format == PathFormat.POSIX:output_dir = output_dir.replace("/", "\\")output_root: Path = local_root / output_dirtotal_file_count = 0total_file_size = 0# Don't fail if output dir hasn't been created yet; another task might be working on itif not output_root.is_dir():self.logger.info(f"Found 0 files (Output directory {output_root} does not exist.)")continue# Get all files in this directory (includes sub-directories)for file_path in output_root.glob("**/*"):# Files that are new or have been modified since the last sync will be added to the output list.mtime_when_synced = self.synced_assets_mtime.get(str(file_path), None)file_mtime = file_path.stat().st_mtime_nsis_modified = Falseif mtime_when_synced:if file_mtime > int(mtime_when_synced):# This file has been modified during this session action.is_modified = Trueelse:# This is a new file created during this session action.self.synced_assets_mtime[str(file_path)] = int(file_mtime)is_modified = True# Resolve the real path to prevent time-of-check/time-of-use vulnerabilityfile_real_path = file_path.resolve()# validate that the file resolves inside of the session working directory.is_file_path_under_session_dir = 
self._is_file_within_directory(file_real_path, session_dir)if is_file_path_under_session_dir is False:self.logger.info(f"Skipping file '{file_path}' as its resolved path '{file_real_path}' is"f" outside the session directory '{session_dir}'")continueif (not file_real_path.is_dir()and file_real_path.exists()and is_modifiedand is_file_path_under_session_dir):file_size = file_real_path.resolve().lstat().st_sizefile_hash = hash_file(str(file_real_path), self.hash_alg)s3_key = f"{file_hash}.{self.hash_alg.value}"if s3_settings.full_cas_prefix():s3_key = _join_s3_paths(s3_settings.full_cas_prefix(), s3_key)in_s3 = self.s3_uploader.file_already_uploaded(s3_settings.s3BucketName, s3_key)total_file_count += 1total_file_size += file_sizeoutput_files.append(OutputFile(file_size=file_size,file_hash=file_hash,rel_path=str(PurePosixPath(*file_path.relative_to(local_root).parts)),full_path=str(file_real_path),s3_key=s3_key,in_s3=in_s3,base_dir=str(session_dir),))self.logger.info(f"Found {total_file_count} file{'' if total_file_count == 1 else 's'}"f" totaling {human_readable_file_size(total_file_size)}"f" in output directory: {str(output_root)}")return output_files | 16 | 73 | 5 | 393 | 0 | 586 | 682 | 586 | self,manifest_properties,s3_settings,local_root,session_dir | [] | List[OutputFile] | {"AnnAssign": 2, "Assign": 19, "AugAssign": 2, "Expr": 5, "For": 2, "If": 9, "Return": 1} | 35 | 97 | 35 | ["PathFormat.get_host_path_format", "output_dir.replace", "output_dir.replace", "output_root.is_dir", "self.logger.info", "output_root.glob", "self.synced_assets_mtime.get", "str", "file_path.stat", "int", "str", "int", "file_path.resolve", "self._is_file_within_directory", "self.logger.info", "file_real_path.is_dir", "file_real_path.exists", "lstat", "file_real_path.resolve", "hash_file", "str", "s3_settings.full_cas_prefix", "_join_s3_paths", "s3_settings.full_cas_prefix", "self.s3_uploader.file_already_uploaded", "output_files.append", "OutputFile", "str", "PurePosixPath", 
"file_path.relative_to", "str", "str", "self.logger.info", "human_readable_file_size", "str"] | 0 | [] | The function (_get_output_files) is defined within the public class AssetSync. It starts at line 586 and ends at line 682. It contains 73 lines of code and has a cyclomatic complexity of 16. It takes 5 parameters (self, manifest_properties, s3_settings, local_root, session_dir) and returns a List[OutputFile]. It calls 35 functions inside its body: ["PathFormat.get_host_path_format", "output_dir.replace", "output_dir.replace", "output_root.is_dir", "self.logger.info", "output_root.glob", "self.synced_assets_mtime.get", "str", "file_path.stat", "int", "str", "int", "file_path.resolve", "self._is_file_within_directory", "self.logger.info", "file_real_path.is_dir", "file_real_path.exists", "lstat", "file_real_path.resolve", "hash_file", "str", "s3_settings.full_cas_prefix", "_join_s3_paths", "s3_settings.full_cas_prefix", "self.s3_uploader.file_already_uploaded", "output_files.append", "OutputFile", "str", "PurePosixPath", "file_path.relative_to", "str", "str", "self.logger.info", "human_readable_file_size", "str"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | _is_file_within_directory | def _is_file_within_directory(self, file_path: Path, directory_path: Path) -> bool:"""Checks if the given file path is within the given directory path."""real_file_path = file_path.resolve()real_directory_path = directory_path.resolve()common_path = os.path.commonpath([real_file_path, real_directory_path])return common_path.startswith(str(real_directory_path)) | 1 | 5 | 3 | 54 | 0 | 684 | 691 | 684 | self,file_path,directory_path | [] | bool | {"Assign": 3, "Expr": 1, "Return": 1} | 5 | 8 | 5 | ["file_path.resolve", "directory_path.resolve", "os.path.commonpath", "common_path.startswith", "str"] | 0 | [] | The function (_is_file_within_directory) is defined within the public class AssetSync. It starts at line 684 and ends at line 691. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (self, file_path, directory_path) and returns a bool. It calls 5 functions inside its body: ["file_path.resolve", "directory_path.resolve", "os.path.commonpath", "common_path.startswith", "str"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | get_s3_settings | def get_s3_settings(self, farm_id: str, queue_id: str) -> Optional[JobAttachmentS3Settings]:"""Gets Job Attachment S3 settings by calling the Deadline GetQueue API."""queue = get_queue(farm_id=farm_id,queue_id=queue_id,session=self.session,deadline_endpoint_url=self.deadline_endpoint_url,)return queue.jobAttachmentSettings if queue and queue.jobAttachmentSettings else None | 3 | 8 | 3 | 56 | 0 | 693 | 703 | 693 | self,farm_id,queue_id | [] | Optional[JobAttachmentS3Settings] | {"Assign": 1, "Expr": 1, "Return": 1} | 1 | 11 | 1 | ["get_queue"] | 0 | [] | The function (get_s3_settings) is defined within the public class AssetSync. It starts at line 693 and ends at line 703. It contains 8 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters (self, farm_id, queue_id) and returns an Optional[JobAttachmentS3Settings]. It calls 1 function inside its body: ["get_queue"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | get_attachments | def get_attachments(self, farm_id: str, queue_id: str, job_id: str) -> Optional[Attachments]:"""Gets Job Attachment settings by calling the Deadline GetJob API."""job = get_job(farm_id=farm_id,queue_id=queue_id,job_id=job_id,session=self.session,deadline_endpoint_url=self.deadline_endpoint_url,)return job.attachments if job and job.attachments else None | 3 | 9 | 4 | 64 | 0 | 705 | 716 | 705 | self,farm_id,queue_id,job_id | [] | Optional[Attachments] | {"Assign": 1, "Expr": 1, "Return": 1} | 1 | 12 | 1 | ["get_job"] | 0 | [] | The function (get_attachments) is defined within the public class AssetSync. It starts at line 705 and ends at line 716. It contains 9 lines of code and has a cyclomatic complexity of 3. It takes 4 parameters (self, farm_id, queue_id, job_id) and returns an Optional[Attachments]. It calls 1 function inside its body: ["get_job"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | _record_attachment_mtimes | def _record_attachment_mtimes(self, merged_manifests_by_root: dict[str, BaseAssetManifest]) -> None:# Record the mapping of downloaded files' absolute paths to their last modification time# (in microseconds). This is used to later determine which files have been modified or# newly created during the session and need to be uploaded as output.for local_root, merged_manifest in merged_manifests_by_root.items():for manifest_path in merged_manifest.paths:abs_path = str(Path(local_root) / manifest_path.path)self.synced_assets_mtime[abs_path] = Path(abs_path).stat().st_mtime_ns | 3 | 7 | 2 | 64 | 0 | 718 | 727 | 718 | self,merged_manifests_by_root | [] | None | {"Assign": 2, "For": 2} | 5 | 10 | 5 | ["merged_manifests_by_root.items", "str", "Path", "stat", "Path"] | 0 | [] | The function (_record_attachment_mtimes) is defined within the public class AssetSync. It starts at line 718 and ends at line 727. It contains 7 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters (self, merged_manifests_by_root) and does not return any value. It calls 5 functions inside its body: ["merged_manifests_by_root.items", "str", "Path", "stat", "Path"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | _ensure_disk_capacity | def _ensure_disk_capacity(self, session_dir: Path, total_input_bytes: int) -> None:"""Raises an AssetSyncError if the given input bytes is larger than the available disk space."""disk_free: int = shutil.disk_usage(session_dir).freeif total_input_bytes > disk_free:input_size_readable = human_readable_file_size(total_input_bytes)disk_free_readable = human_readable_file_size(disk_free)raise AssetSyncError("Error occurred while attempting to sync input files: "f"Total file size required for download ({input_size_readable}) is larger than available disk space ({disk_free_readable})") | 2 | 9 | 3 | 52 | 0 | 729 | 740 | 729 | self,session_dir,total_input_bytes | [] | None | {"AnnAssign": 1, "Assign": 2, "Expr": 1, "If": 1} | 4 | 12 | 4 | ["shutil.disk_usage", "human_readable_file_size", "human_readable_file_size", "AssetSyncError"] | 0 | [] | The function (_ensure_disk_capacity) is defined within the public class AssetSync. It starts at line 729 and ends at line 740. It contains 9 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters (self, session_dir, total_input_bytes) and does not return any value. It calls 4 functions inside its body: ["shutil.disk_usage", "human_readable_file_size", "human_readable_file_size", "AssetSyncError"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | sync_inputs | def sync_inputs(self,s3_settings: Optional[JobAttachmentS3Settings],attachments: Optional[Attachments],queue_id: str,job_id: str,session_dir: Path,fs_permission_settings: Optional[FileSystemPermissionSettings] = None,storage_profiles_path_mapping_rules: dict[str, str] = {},step_dependencies: Optional[list[str]] = None,on_downloading_files: Optional[Callable[[ProgressReportMetadata], bool]] = None,os_env_vars: Dict[str, str] | None = None,) -> Tuple[SummaryStatistics, List[Dict[str, str]]]:"""Depending on the fileSystem in the Attachments this will perform twodifferent behaviors:COPIED / None : downloads a manifest file and corresponding input files, if found.VIRTUAL: downloads a manifest file and mounts a Virtual File System at the specified asset root corresponding to the manifest contentsArgs:s3_settings: S3-specific Job Attachment settings.attachments: an object that holds all input assets for the job.queue_id: the ID of the queue.job_id: the ID of the job.session_dir: the directory that the session is going to use.fs_permission_settings: An instance defining group ownership and permission modesto be set on the downloaded (synchronized) input files and directories.storage_profiles_path_mapping_rules: A dict of source path -> destination path mappings.If this dict is not empty, it means that the Storage Profile set in the job isdifferent from the one configured in the Fleet performing the input-syncing.step_dependencies: the list of Step IDs whose output should be downloaded over the inputjob attachments.on_downloading_files: a function that will be called with a ProgressReportMetadata objectfor each file being downloaded. If the function returns False, the download will becancelled. 
If it returns True, the download will continue.os_env_vars: environment variables to set for launched subprocessesReturns:COPIED / None : a tuple of (1) final summary statistics for file downloads, and (2) a list of local roots for each asset root, used for path mapping.VIRTUAL: same as COPIED, but the summary statistics will be empty since the download hasn't started yet."""if not s3_settings:self.logger.info(f"No Job Attachment settings configured for Queue {queue_id}, no inputs to sync.")return (SummaryStatistics(), [])if not attachments:self.logger.info(f"No attachments configured for Job {job_id}, no inputs to sync.")return (SummaryStatistics(), [])grouped_manifests_by_root: DefaultDict[str, list[BaseAssetManifest]] = DefaultDict(list)pathmapping_rules: Dict[str, Dict[str, str]] = {}storage_profiles_source_paths = list(storage_profiles_path_mapping_rules.keys())for manifest_properties in attachments.manifests:local_root: str = ""if (len(storage_profiles_path_mapping_rules) > 0and manifest_properties.fileSystemLocationName):if manifest_properties.rootPath in storage_profiles_source_paths:local_root = storage_profiles_path_mapping_rules[manifest_properties.rootPath]else:raise AssetSyncError("Error occurred while attempting to sync input files: "f"No path mapping rule found for the source path {manifest_properties.rootPath}")else:dir_name: str = _get_unique_dest_dir_name(manifest_properties.rootPath)local_root = str(session_dir.joinpath(dir_name))pathmapping_rules[dir_name] = {"source_path_format": manifest_properties.rootPathFormat.value,"source_path": manifest_properties.rootPath,"destination_path": local_root,}if manifest_properties.inputManifestPath:manifest_s3_key = s3_settings.add_root_and_manifest_folder_prefix(manifest_properties.inputManifestPath)manifest = get_manifest_from_s3(manifest_key=manifest_s3_key,s3_bucket=s3_settings.s3BucketName,session=self.session,)grouped_manifests_by_root[local_root].append(manifest)# Handle step-step dependencies.if 
step_dependencies:for step_id in step_dependencies:manifests_by_root = get_output_manifests_by_asset_root(s3_settings,self.farm_id,queue_id,job_id,step_id=step_id,session=self.session,)for root, manifests in manifests_by_root.items():dir_name = _get_unique_dest_dir_name(root)local_root = str(session_dir.joinpath(dir_name))grouped_manifests_by_root[local_root].extend(manifests)# Merge the manifests in each root into a single manifestmerged_manifests_by_root: dict[str, BaseAssetManifest] = dict()total_input_size: int = 0for root, manifests in grouped_manifests_by_root.items():merged_manifest = merge_asset_manifests(manifests)if merged_manifest:merged_manifests_by_root[root] = merged_manifesttotal_input_size += merged_manifest.totalSize# type: ignore[attr-defined]# Download# Virtual Download Flowif (attachments.fileSystem == JobAttachmentsFileSystem.VIRTUAL.valueand sys.platform != "win32"and fs_permission_settings is not Noneand os_env_vars is not Noneand "AWS_PROFILE" in os_env_varsand isinstance(fs_permission_settings, PosixFileSystemPermissionSettings)):try:VFSProcessManager.find_vfs()mount_vfs_from_manifests(s3_bucket=s3_settings.s3BucketName,manifests_by_root=merged_manifests_by_root,boto3_session=self.session,session_dir=session_dir,fs_permission_settings=fs_permission_settings,# type: ignore[arg-type]os_env_vars=os_env_vars,# type: ignore[arg-type]cas_prefix=s3_settings.full_cas_prefix(),)summary_statistics = SummaryStatistics()self._record_attachment_mtimes(merged_manifests_by_root)return (summary_statistics, list(pathmapping_rules.values()))except VFSExecutableMissingError:logger.error(f"Virtual File System not found, falling back to {JobAttachmentsFileSystem.COPIED} for JobAttachmentsFileSystem.")# Copied Download flowself._ensure_disk_capacity(session_dir, total_input_size)try:download_summary_statistics = 
download_files_from_manifests(s3_bucket=s3_settings.s3BucketName,manifests_by_root=merged_manifests_by_root,cas_prefix=s3_settings.full_cas_prefix(),fs_permission_settings=fs_permission_settings,session=self.session,on_downloading_files=on_downloading_files,logger=self.logger,)except JobAttachmentsS3ClientError as exc:if exc.status_code == 404:raise JobAttachmentsS3ClientError(action=exc.action,status_code=exc.status_code,bucket_name=exc.bucket_name,key_or_prefix=exc.key_or_prefix,message=("This can happen if the S3 check cache on the submitting machine is out of date. ""Please delete the cache file from the submitting machine, usually located in the ""home directory (~/.deadline/cache/s3_check_cache.db) and try submitting again."),) from excelse:raiseself._record_attachment_mtimes(merged_manifests_by_root)return (download_summary_statistics.convert_to_summary_statistics(),list(pathmapping_rules.values()),) | 22 | 133 | 12 | 707 | 0 | 743 | 923 | 743 | self,s3_settings,attachments,queue_id,job_id,session_dir,fs_permission_settings,storage_profiles_path_mapping_rules,step_dependencies,on_downloading_files,os_env_vars | [] | Tuple[SummaryStatistics, List[Dict[str, str]]] | {"AnnAssign": 6, "Assign": 13, "AugAssign": 1, "Expr": 11, "For": 4, "If": 9, "Return": 4, "Try": 2} | 41 | 181 | 41 | ["self.logger.info", "SummaryStatistics", "self.logger.info", "SummaryStatistics", "DefaultDict", "list", "storage_profiles_path_mapping_rules.keys", "len", "AssetSyncError", "_get_unique_dest_dir_name", "str", "session_dir.joinpath", "s3_settings.add_root_and_manifest_folder_prefix", "get_manifest_from_s3", "append", "get_output_manifests_by_asset_root", "manifests_by_root.items", "_get_unique_dest_dir_name", "str", "session_dir.joinpath", "extend", "dict", "grouped_manifests_by_root.items", "merge_asset_manifests", "isinstance", "VFSProcessManager.find_vfs", "mount_vfs_from_manifests", "s3_settings.full_cas_prefix", "SummaryStatistics", "self._record_attachment_mtimes", "list", 
"pathmapping_rules.values", "logger.error", "self._ensure_disk_capacity", "download_files_from_manifests", "s3_settings.full_cas_prefix", "JobAttachmentsS3ClientError", "self._record_attachment_mtimes", "download_summary_statistics.convert_to_summary_statistics", "list", "pathmapping_rules.values"] | 0 | [] | The function (sync_inputs) is defined within the public class AssetSync. It starts at line 743 and ends at line 923. It contains 133 lines of code and has a cyclomatic complexity of 22. It takes 12 parameters and returns a Tuple[SummaryStatistics, List[Dict[str, str]]]. It calls 41 functions inside its body: ["self.logger.info", "SummaryStatistics", "self.logger.info", "SummaryStatistics", "DefaultDict", "list", "storage_profiles_path_mapping_rules.keys", "len", "AssetSyncError", "_get_unique_dest_dir_name", "str", "session_dir.joinpath", "s3_settings.add_root_and_manifest_folder_prefix", "get_manifest_from_s3", "append", "get_output_manifests_by_asset_root", "manifests_by_root.items", "_get_unique_dest_dir_name", "str", "session_dir.joinpath", "extend", "dict", "grouped_manifests_by_root.items", "merge_asset_manifests", "isinstance", "VFSProcessManager.find_vfs", "mount_vfs_from_manifests", "s3_settings.full_cas_prefix", "SummaryStatistics", "self._record_attachment_mtimes", "list", "pathmapping_rules.values", "logger.error", "self._ensure_disk_capacity", "download_files_from_manifests", "s3_settings.full_cas_prefix", "JobAttachmentsS3ClientError", "self._record_attachment_mtimes", "download_summary_statistics.convert_to_summary_statistics", "list", "pathmapping_rules.values"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | sync_outputs | def sync_outputs(self,s3_settings: Optional[JobAttachmentS3Settings],attachments: Optional[Attachments],queue_id: str,job_id: str,step_id: str,task_id: str,session_action_id: str,start_time: float,session_dir: Path,storage_profiles_path_mapping_rules: dict[str, str] = {},on_uploading_files: Optional[Callable[[ProgressReportMetadata], bool]] = None,) -> SummaryStatistics:"""Uploads any output files specified in the manifest, if found."""if not s3_settings:self.logger.info(f"No Job Attachment settings configured for Queue {queue_id}, no outputs to sync.")return SummaryStatistics()if not attachments:self.logger.info(f"No attachments configured for Job {job_id}, no outputs to sync.")return SummaryStatistics()all_output_files: List[OutputFile] = []storage_profiles_source_paths = list(storage_profiles_path_mapping_rules.keys())for manifest_properties in attachments.manifests:session_root = session_dirlocal_root: Path = Path()if (len(storage_profiles_path_mapping_rules) > 0and manifest_properties.fileSystemLocationName):if manifest_properties.rootPath in storage_profiles_source_paths:local_root = Path(storage_profiles_path_mapping_rules[manifest_properties.rootPath])# We use session_root to filter out any files resolved to a location outside# of that directory. 
If storage profile's path mapping rules are available,# we can consider the session_root to be the mapped-storage profile path.session_root = local_rootelse:raise AssetSyncError("Error occurred while attempting to sync output files: "f"No path mapping rule found for the source path {manifest_properties.rootPath}")else:dir_name: str = _get_unique_dest_dir_name(manifest_properties.rootPath)local_root = session_dir.joinpath(dir_name)output_files: List[OutputFile] = self._get_output_files(manifest_properties,s3_settings,local_root,session_root,)if output_files:output_manifest = self._generate_output_manifest(output_files)session_action_id_with_time_stamp = (f"{_float_to_iso_datetime_string(start_time)}_{session_action_id}")full_output_prefix = s3_settings.full_output_prefix(farm_id=self.farm_id,queue_id=queue_id,job_id=job_id,step_id=step_id,task_id=task_id,session_action_id=session_action_id_with_time_stamp,)self._upload_output_manifest_to_s3(s3_settings=s3_settings,output_manifest=output_manifest,full_output_prefix=full_output_prefix,root_path=manifest_properties.rootPath,file_system_location_name=manifest_properties.fileSystemLocationName,)all_output_files.extend(output_files)if all_output_files:num_output_files = len(all_output_files)self.logger.info(f"Uploading {num_output_files} output file{'' if num_output_files == 1 else 's'}"f" to S3: {s3_settings.s3BucketName}/{s3_settings.full_cas_prefix()}")summary_stats: SummaryStatistics = self._upload_output_files_to_s3(s3_settings, all_output_files, on_uploading_files)else:summary_stats = SummaryStatistics()return summary_stats | 9 | 83 | 13 | 361 | 0 | 925 | 1,016 | 925 | self,s3_settings,attachments,queue_id,job_id,step_id,task_id,session_action_id,start_time,session_dir,storage_profiles_path_mapping_rules,on_uploading_files | [] | SummaryStatistics | {"AnnAssign": 5, "Assign": 10, "Expr": 6, "For": 1, "If": 6, "Return": 3} | 23 | 92 | 23 | ["self.logger.info", "SummaryStatistics", "self.logger.info", 
"SummaryStatistics", "list", "storage_profiles_path_mapping_rules.keys", "Path", "len", "Path", "AssetSyncError", "_get_unique_dest_dir_name", "session_dir.joinpath", "self._get_output_files", "self._generate_output_manifest", "_float_to_iso_datetime_string", "s3_settings.full_output_prefix", "self._upload_output_manifest_to_s3", "all_output_files.extend", "len", "self.logger.info", "s3_settings.full_cas_prefix", "self._upload_output_files_to_s3", "SummaryStatistics"] | 0 | [] | The function (sync_outputs) is defined within the public class AssetSync. It starts at line 925 and ends at line 1016. It contains 83 lines of code and has a cyclomatic complexity of 9. It takes 13 parameters and returns a SummaryStatistics. It calls 23 functions inside its body: ["self.logger.info", "SummaryStatistics", "self.logger.info", "SummaryStatistics", "list", "storage_profiles_path_mapping_rules.keys", "Path", "len", "Path", "AssetSyncError", "_get_unique_dest_dir_name", "session_dir.joinpath", "self._get_output_files", "self._generate_output_manifest", "_float_to_iso_datetime_string", "s3_settings.full_output_prefix", "self._upload_output_manifest_to_s3", "all_output_files.extend", "len", "self.logger.info", "s3_settings.full_cas_prefix", "self._upload_output_files_to_s3", "SummaryStatistics"]. |
aws-deadline_deadline-cloud | AssetSync | public | 0 | 0 | cleanup_session | def cleanup_session(self,session_dir: Path,file_system: JobAttachmentsFileSystem,os_user: Optional[str] = None,):if file_system == JobAttachmentsFileSystem.COPIED.value:returnif not os_user:raise VFSOSUserNotSetError("No os user set - can't clean up vfs session")try:VFSProcessManager.find_vfs()# Shutdown all running Deadline VFS processes since session is completeVFSProcessManager.kill_all_processes(session_dir=session_dir, os_user=os_user)except VFSExecutableMissingError:logger.error("Virtual File System not found, no processes to kill.") | 4 | 15 | 4 | 70 | 0 | 1018 | 1033 | 1018 | self,session_dir,file_system,os_user | [] | None | {"Expr": 3, "If": 2, "Return": 1, "Try": 1} | 4 | 16 | 4 | ["VFSOSUserNotSetError", "VFSProcessManager.find_vfs", "VFSProcessManager.kill_all_processes", "logger.error"] | 0 | [] | The function (cleanup_session) is defined within the public class AssetSync. It starts at line 1018 and ends at line 1033. It contains 15 lines of code and has a cyclomatic complexity of 4. It takes 4 parameters (self, session_dir, file_system, os_user) and does not return any value. It calls 4 functions inside its body: ["VFSOSUserNotSetError", "VFSProcessManager.find_vfs", "VFSProcessManager.kill_all_processes", "logger.error"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_manifest_from_s3 | def get_manifest_from_s3(manifest_key: str, s3_bucket: str, session: Optional[boto3.Session] = None) -> BaseAssetManifest:_, manifest = get_asset_root_and_manifest_from_s3(manifest_key, s3_bucket, session)return manifest | 1 | 5 | 3 | 38 | 0 | 85 | 89 | 85 | manifest_key,s3_bucket,session | [] | BaseAssetManifest | {"Assign": 1, "Return": 1} | 1 | 5 | 1 | ["get_asset_root_and_manifest_from_s3"] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._aggregate_asset_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_manifest_from_s3_error_message_on_access_denied", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_manifest_from_s3_error_message_on_timeout"] | The function (get_manifest_from_s3) is defined at module level (its class field is recorded as public). It starts at line 85 and ends at line 89. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (manifest_key, s3_bucket, session) and returns a BaseAssetManifest. It calls 1 function inside its body (["get_asset_root_and_manifest_from_s3"]) and has 5 functions calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._aggregate_asset_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_manifest_from_s3_error_message_on_access_denied", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_manifest_from_s3_error_message_on_timeout"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_asset_root_and_manifest_from_s3 | def get_asset_root_and_manifest_from_s3(manifest_key: str, s3_bucket: str, session: Optional[boto3.Session] = None) -> Tuple[Optional[str], BaseAssetManifest]:asset_root, _, asset_manifest = _get_asset_root_and_manifest_from_s3_with_last_modified(manifest_key, s3_bucket, session)return (asset_root, asset_manifest) | 1 | 7 | 3 | 52 | 0 | 92 | 98 | 92 | manifest_key,s3_bucket,session | [] | Tuple[Optional[str], BaseAssetManifest] | {"Assign": 1, "Return": 1} | 1 | 7 | 1 | ["_get_asset_root_and_manifest_from_s3_with_last_modified"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_paths_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_manifest_from_s3"] | The function (get_asset_root_and_manifest_from_s3) is defined within the public class called public. The function starts at line 92 and ends at line 98. It contains 7 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters and returns a Tuple[Optional[str], BaseAssetManifest]. It declares 1 function; it has 1 function called inside, which is ["_get_asset_root_and_manifest_from_s3_with_last_modified"]; and it has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_paths_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_manifest_from_s3"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_asset_root_and_manifest_from_s3_with_last_modified | def _get_asset_root_and_manifest_from_s3_with_last_modified(manifest_key: str, s3_bucket: str, session: Optional[boto3.Session] = None) -> Tuple[Optional[str], datetime, BaseAssetManifest]:"""Gets manifest with its asset root and last modified from s3 using the manifest key in s3:param manifest_key: key for searching in s3:param s3_bucket: s3 bucket:param session: boto3 session:return: Returns Tuple of asset root, manifest's last modified time and the manifest"""s3_client = get_s3_client(session=session)try:# Assumption: the manifest is less than 5GB. S3 objects larger than 5GB will be truncated.# Using the assumption because it simplifies the code. A large manifest might be:# 1 million files * 256 bytes per file path = 256MB so this assumption is safe.res = s3_client.get_object(Bucket=s3_bucket,Key=manifest_key,ExpectedBucketOwner=get_account_id(session=session),)asset_root = _get_asset_root_from_metadata(metadata=res["Metadata"])contents = res["Body"].read().decode("utf-8")asset_manifest = decode_manifest(contents)last_modified = res["LastModified"]return (asset_root, last_modified, asset_manifest)except ClientError as exc:status_code = int(exc.response["ResponseMetadata"]["HTTPStatusCode"])status_code_guidance = {**COMMON_ERROR_GUIDANCE_FOR_S3,403: (("Forbidden or Access denied. Please check your AWS credentials, and ensure that ""your AWS IAM Role or User has the 's3:GetObject' permission for this bucket. ")if "kms:" not in str(exc)else ("Forbidden or Access denied. Please check your AWS credentials and Job Attachments S3 bucket ""encryption settings. If a customer-managed KMS key is set, confirm that your AWS IAM Role or ""User has the 'kms:Decrypt' and 'kms:DescribeKey' permissions for the key used to encrypt the bucket.")),404: "Not found. Please check your bucket name and object key, and ensure that they exist in the AWS account.",}raise JobAttachmentsS3ClientError(action="downloading binary file",status_code=status_code,bucket_name=s3_bucket,key_or_prefix=manifest_key,message=f"{status_code_guidance.get(status_code, '')} {str(exc)}",) from excexcept BotoCoreError as bce:raise JobAttachmentS3BotoCoreError(action="downloading binary file",error_details=str(bce),) from bceexcept Exception as e:raise AssetSyncError(e) from e | 5 | 47 | 3 | 229 | 8 | 101 | 158 | 101 | manifest_key,s3_bucket,session | ['asset_root', 'res', 'asset_manifest', 'last_modified', 's3_client', 'status_code', 'status_code_guidance', 'contents'] | Tuple[Optional[str], datetime, BaseAssetManifest] | {"Assign": 8, "Expr": 1, "Return": 1, "Try": 1} | 15 | 58 | 15 | ["get_s3_client", "s3_client.get_object", "get_account_id", "_get_asset_root_from_metadata", "decode", "read", "decode_manifest", "int", "str", "JobAttachmentsS3ClientError", "status_code_guidance.get", "str", "JobAttachmentS3BotoCoreError", "str", "AssetSyncError"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_manifest_and_make_paths_absolute", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_asset_root_and_manifest_from_s3"] | The function (_get_asset_root_and_manifest_from_s3_with_last_modified) is defined within the public class called public. The function starts at line 101 and ends at line 158. It contains 47 lines of code and has a cyclomatic complexity of 5. It takes 3 parameters and returns a Tuple[Optional[str], datetime, BaseAssetManifest]. It declares 15 functions; it has 15 functions called inside, which are ["get_s3_client", "s3_client.get_object", "get_account_id", "_get_asset_root_from_metadata", "decode", "read", "decode_manifest", "int", "str", "JobAttachmentsS3ClientError", "status_code_guidance.get", "str", "JobAttachmentS3BotoCoreError", "str", "AssetSyncError"]; and it has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_manifest_and_make_paths_absolute", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_asset_root_and_manifest_from_s3"]. |
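The row above records that the 403 branch in _get_asset_root_and_manifest_from_s3_with_last_modified chooses different guidance depending on whether the underlying error mentions KMS. As an illustrative annotation, here is a minimal standalone sketch of that selection logic; the message strings are abbreviated and `COMMON_ERROR_GUIDANCE_FOR_S3` is stubbed (both are assumptions for illustration, not the project's real values):

```python
# Sketch of the status-code guidance selection used when a ClientError is
# raised while fetching a manifest. The real guidance table lives in the
# deadline-cloud source; a stand-in is used here.
COMMON_ERROR_GUIDANCE_FOR_S3 = {500: "Internal server error."}  # stand-in


def guidance_for(status_code: int, error_text: str) -> str:
    guidance = {
        **COMMON_ERROR_GUIDANCE_FOR_S3,
        403: (
            # Plain access problem vs. KMS-encrypted-bucket problem
            "Forbidden or Access denied. Check 's3:GetObject' permissions."
            if "kms:" not in error_text
            else "Forbidden or Access denied. Check 'kms:Decrypt'/'kms:DescribeKey' permissions."
        ),
        404: "Not found. Check your bucket name and object key.",
    }
    # Unknown status codes fall back to an empty guidance string.
    return guidance.get(status_code, "")
```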
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_asset_root_from_metadata | def _get_asset_root_from_metadata(metadata: dict[str, str]) -> Optional[str]:if "asset-root-json" in metadata:return json.loads(metadata["asset-root-json"])else:return metadata.get("asset-root", None) | 2 | 5 | 1 | 43 | 0 | 161 | 165 | 161 | metadata | [] | Optional[str] | {"If": 1, "Return": 2} | 2 | 5 | 2 | ["json.loads", "metadata.get"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._get_asset_root_and_manifest_from_s3_with_last_modified", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_asset_root_from_metadata_returns_none_if_not_found"] | The function (_get_asset_root_from_metadata) is defined within the public class called public. The function starts at line 161 and ends at line 165. It contains 5 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter and returns an Optional[str]. It declares 2 functions; it has 2 functions called inside, which are ["json.loads", "metadata.get"]; and it has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._get_asset_root_and_manifest_from_s3_with_last_modified", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_asset_root_from_metadata_returns_none_if_not_found"]. |
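The _get_asset_root_from_metadata row above is the read side of the ASCII workaround in _upload_output_manifest_to_s3 (see the header row): S3 user-defined metadata must be ASCII, so a non-ASCII root path is stored JSON-escaped under 'asset-root-json'. A minimal runnable sketch of the round trip, with the two sides reduced to pure functions (an illustration, not the project's API):

```python
import json


def encode_asset_root(root_path: str) -> dict:
    # Mirrors _upload_output_manifest_to_s3: if the path is ASCII, store it
    # directly as 'asset-root'; otherwise JSON-escape it to pure ASCII.
    try:
        root_path.encode("ascii")
        return {"asset-root": root_path}
    except UnicodeEncodeError:
        return {"asset-root-json": json.dumps(root_path, ensure_ascii=True)}


def decode_asset_root(metadata: dict):
    # Mirrors _get_asset_root_from_metadata: prefer the JSON variant,
    # fall back to the plain key, else None.
    if "asset-root-json" in metadata:
        return json.loads(metadata["asset-root-json"])
    return metadata.get("asset-root", None)
```

Both branches round-trip the original path, and the encoded metadata values are ASCII either way.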
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_output_manifest_prefix | def _get_output_manifest_prefix(s3_settings: JobAttachmentS3Settings,farm_id: str,queue_id: str,job_id: str,step_id: Optional[str] = None,task_id: Optional[str] = None,session_action_id: Optional[str] = None,) -> str:"""Get full prefix for output manifest with given farm id, queue id, job id, step id and task id"""manifest_prefix: strif session_action_id:if not task_id or not step_id:raise JobAttachmentsError("Session Action ID specified, but no Task ID or Step ID. Job, Step, and Task ID are required to retrieve task outputs.")manifest_prefix = s3_settings.full_output_prefix(farm_id, queue_id, job_id, step_id, task_id, session_action_id)if task_id:if not step_id:raise JobAttachmentsError("Task ID specified, but no Step ID. Job, Step, and Task ID are required to retrieve task outputs.")manifest_prefix = s3_settings.full_task_output_prefix(farm_id, queue_id, job_id, step_id, task_id)elif step_id:manifest_prefix = s3_settings.full_step_output_prefix(farm_id, queue_id, job_id, step_id)else:manifest_prefix = s3_settings.full_job_output_prefix(farm_id, queue_id, job_id)# Previous functions don't terminate the prefix with a '/'. So we'll do it here.return f"{manifest_prefix}/" | 7 | 31 | 7 | 148 | 1 | 168 | 203 | 168 | s3_settings,farm_id,queue_id,job_id,step_id,task_id,session_action_id | ['manifest_prefix'] | str | {"AnnAssign": 1, "Assign": 4, "Expr": 1, "If": 5, "Return": 1} | 6 | 36 | 6 | ["JobAttachmentsError", "s3_settings.full_output_prefix", "JobAttachmentsError", "s3_settings.full_task_output_prefix", "s3_settings.full_step_output_prefix", "s3_settings.full_job_output_prefix"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._add_output_manifests_from_s3", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_output_manifests_by_asset_root"] | The function (_get_output_manifest_prefix) is defined within the public class called public. The function starts at line 168 and ends at line 203. It contains 31 lines of code and has a cyclomatic complexity of 7. It takes 7 parameters and returns a str. It declares 6 functions; it has 6 functions called inside, which are ["JobAttachmentsError", "s3_settings.full_output_prefix", "JobAttachmentsError", "s3_settings.full_task_output_prefix", "s3_settings.full_step_output_prefix", "s3_settings.full_job_output_prefix"]; and it has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._add_output_manifests_from_s3", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_output_manifests_by_asset_root"]. |
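The _get_output_manifest_prefix row above encodes a precedence: session-action output, then task output, then step output, then job output, with validation that a task requires a step. An illustrative sketch of that precedence, with the JobAttachmentS3Settings.full_*_output_prefix helpers reduced to plain string joins and the sequential if blocks of the original collapsed into one if/elif chain (both simplifications are assumptions for illustration):

```python
from typing import Optional


def output_manifest_prefix(
    farm_id: str,
    queue_id: str,
    job_id: str,
    step_id: Optional[str] = None,
    task_id: Optional[str] = None,
    session_action_id: Optional[str] = None,
) -> str:
    base = f"{farm_id}/{queue_id}/{job_id}"
    if session_action_id:
        if not task_id or not step_id:
            raise ValueError("Session Action ID specified, but no Task ID or Step ID.")
        prefix = f"{base}/{step_id}/{task_id}/{session_action_id}"
    elif task_id:
        if not step_id:
            raise ValueError("Task ID specified, but no Step ID.")
        prefix = f"{base}/{step_id}/{task_id}"
    elif step_id:
        prefix = f"{base}/{step_id}"
    else:
        prefix = base
    # As in the original: the prefix helpers don't terminate with '/', so add it here.
    return f"{prefix}/"
```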
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_tasks_manifests_keys_from_s3 | def _get_tasks_manifests_keys_from_s3(manifest_prefix: str,s3_bucket: str,session: Optional[boto3.Session] = None,*,select_latest_per_task=True,) -> List[str]:"""Returns the keys of all output manifests from the given s3 prefix.(Only the manifests that end with the prefix pattern task-*/*_output)"""manifests_keys: List[str] = []s3_client = get_s3_client(session=session)try:paginator = s3_client.get_paginator("list_objects_v2")page_iterator = paginator.paginate(Bucket=s3_bucket,Prefix=manifest_prefix,)# 1. Find all files that match the pattern: task-{any}/{any}/{any}output{any}task_prefixes = defaultdict(list)for page in page_iterator:contents = page.get("Contents", None)if contents is None:raise JobAttachmentsError(f"Unable to find asset manifest in s3://{s3_bucket}/{manifest_prefix}")for content in contents:if re.search(r"task-.*/.*/.*output.*", content["Key"]):parts = content["Key"].split("/")for i, part in enumerate(parts):if "task-" in part:task_folder = "/".join(parts[: i + 1])task_prefixes[task_folder].append(content["Key"])except ClientError as exc:status_code = int(exc.response["ResponseMetadata"]["HTTPStatusCode"])status_code_guidance = {**COMMON_ERROR_GUIDANCE_FOR_S3,403: ("Forbidden or Access denied. Please check your AWS credentials, and ensure that ""your AWS IAM Role or User has the 's3:ListBucket' permission for this bucket."),404: "Not found. Please ensure that the bucket and key/prefix exists.",}raise JobAttachmentsS3ClientError(action="listing bucket contents",status_code=status_code,bucket_name=s3_bucket,key_or_prefix=manifest_prefix,message=f"{status_code_guidance.get(status_code, '')} {str(exc)}",) from excexcept BotoCoreError as bce:raise JobAttachmentS3BotoCoreError(action="listing bucket contents",error_details=str(bce),) from bceexcept JobAttachmentsError:raise# pass along JobAttachmentsErrors if we get themexcept Exception as e:raise AssetSyncError(e) from eif select_latest_per_task:# 2. Select all files in the last subfolder (alphabetically) under each "task-{any}" folder.for task_folder, files in task_prefixes.items():last_subfolder = sorted(set(f.split("/")[len(task_folder.split("/"))] for f in files), reverse=True)[0]manifests_keys += [f for f in files if f.startswith(f"{task_folder}/{last_subfolder}/")]else:# Include all the keys, not just the latest per taskmanifests_keys = [f for _, files in task_prefixes.items() for f in files]# Now `manifests_keys` is a list of the keys of files in the last folder (alphabetically) under each "task-" folder.return manifests_keys | 18 | 64 | 4 | 372 | 11 | 206 | 281 | 206 | manifest_prefix,s3_bucket,session,select_latest_per_task | ['paginator', 'manifests_keys', 'page_iterator', 's3_client', 'task_folder', 'status_code', 'last_subfolder', 'status_code_guidance', 'parts', 'contents', 'task_prefixes'] | List[str] | {"AnnAssign": 1, "Assign": 11, "AugAssign": 1, "Expr": 2, "For": 4, "If": 4, "Return": 1, "Try": 1} | 26 | 76 | 26 | ["get_s3_client", "s3_client.get_paginator", "paginator.paginate", "defaultdict", "page.get", "JobAttachmentsError", "re.search", "split", "enumerate", "join", "append", "int", "JobAttachmentsS3ClientError", "status_code_guidance.get", "str", "JobAttachmentS3BotoCoreError", "str", "AssetSyncError", "task_prefixes.items", "sorted", "set", "f.split", "len", "task_folder.split", "f.startswith", "task_prefixes.items"] | 4 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._add_output_manifests_from_s3", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_output_manifests_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_tasks_manifests_keys_from_s3_error_message_on_access_denied", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_tasks_manifests_keys_from_s3_error_message_on_timeout"] | The function (_get_tasks_manifests_keys_from_s3) is defined within the public class called public. The function starts at line 206 and ends at line 281. It contains 64 lines of code and has a cyclomatic complexity of 18. It takes 4 parameters and returns a List[str]. It declares 26 functions; it has 26 functions called inside, which are ["get_s3_client", "s3_client.get_paginator", "paginator.paginate", "defaultdict", "page.get", "JobAttachmentsError", "re.search", "split", "enumerate", "join", "append", "int", "JobAttachmentsS3ClientError", "status_code_guidance.get", "str", "JobAttachmentS3BotoCoreError", "str", "AssetSyncError", "task_prefixes.items", "sorted", "set", "f.split", "len", "task_folder.split", "f.startswith", "task_prefixes.items"]; and it has 4 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._add_output_manifests_from_s3", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_output_manifests_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_tasks_manifests_keys_from_s3_error_message_on_access_denied", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_get_tasks_manifests_keys_from_s3_error_message_on_timeout"]. |
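The "select latest per task" step recorded in the row above is pure string manipulation once the S3 listing is in hand: group keys by their task-* folder, then keep only the files under the alphabetically last subfolder of each group. A self-contained sketch of just that step (no S3 access; key names are made up for illustration):

```python
import re
from collections import defaultdict


def latest_manifest_keys(keys: list[str]) -> list[str]:
    # 1. Group keys matching task-{any}/{any}/{any}output{any} by task folder.
    task_prefixes: dict[str, list[str]] = defaultdict(list)
    for key in keys:
        if re.search(r"task-.*/.*/.*output.*", key):
            parts = key.split("/")
            for i, part in enumerate(parts):
                if "task-" in part:
                    task_folder = "/".join(parts[: i + 1])
                    task_prefixes[task_folder].append(key)
    # 2. For each task folder, keep files under the alphabetically last subfolder.
    result: list[str] = []
    for task_folder, files in task_prefixes.items():
        last_subfolder = sorted(
            {f.split("/")[len(task_folder.split("/"))] for f in files},
            reverse=True,
        )[0]
        result += [f for f in files if f.startswith(f"{task_folder}/{last_subfolder}/")]
    return result
```

Because subfolders sort lexicographically, timestamp-named subfolders naturally yield the most recent attempt per task.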
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_job_input_paths_by_asset_root | def get_job_input_paths_by_asset_root(s3_settings: JobAttachmentS3Settings,attachments: Attachments,session: Optional[boto3.Session] = None,) -> dict[str, ManifestPathGroup]:"""Gets dict of grouped paths of all input files of a given job.The grouped paths are separated by asset root.Returns a dict of ManifestPathGroups, with the root path as the key."""inputs: dict[str, ManifestPathGroup] = {}for manifest_properties in attachments.manifests:if manifest_properties.inputManifestPath:key = _join_s3_paths(manifest_properties.inputManifestPath)_, asset_manifest = get_asset_root_and_manifest_from_s3(manifest_key=key,s3_bucket=s3_settings.s3BucketName,session=session,)root_path = manifest_properties.rootPathif root_path not in inputs:inputs[root_path] = ManifestPathGroup()inputs[root_path].add_manifest_to_group(asset_manifest)return inputs | 4 | 19 | 3 | 113 | 2 | 284 | 310 | 284 | s3_settings,attachments,session | ['root_path', 'key'] | dict[str, ManifestPathGroup] | {"AnnAssign": 1, "Assign": 4, "Expr": 2, "For": 1, "If": 2, "Return": 1} | 4 | 27 | 4 | ["_join_s3_paths", "get_asset_root_and_manifest_from_s3", "ManifestPathGroup", "add_manifest_to_group"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_output_paths_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_get_job_input_paths_by_asset_root"] | The function (get_job_input_paths_by_asset_root) is defined within the public class called public. The function starts at line 284 and ends at line 310. It contains 19 lines of code and has a cyclomatic complexity of 4. It takes 3 parameters and returns a dict[str, ManifestPathGroup]. It declares 4 functions; it has 4 functions called inside, which are ["_join_s3_paths", "get_asset_root_and_manifest_from_s3", "ManifestPathGroup", "add_manifest_to_group"]; and it has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_output_paths_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_get_job_input_paths_by_asset_root"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_job_input_output_paths_by_asset_root | def get_job_input_output_paths_by_asset_root(s3_settings: JobAttachmentS3Settings,attachments: Attachments,farm_id: str,queue_id: str,job_id: str,step_id: Optional[str] = None,task_id: Optional[str] = None,session_action_id: Optional[str] = None,session: Optional[boto3.Session] = None,) -> dict[str, ManifestPathGroup]:"""With given IDs, gets the paths of all input and output filesof this job. The grouped paths are separated by asset root.Returns a dict of ManifestPathGroups, with the root path as the key."""input_files = get_job_input_paths_by_asset_root(s3_settings=s3_settings,attachments=attachments,session=session,)output_files = get_job_output_paths_by_asset_root(s3_settings=s3_settings,farm_id=farm_id,queue_id=queue_id,job_id=job_id,step_id=step_id,task_id=task_id,session_action_id=session_action_id,session=session,)combined_path_groups: dict[str, ManifestPathGroup] = {}for asset_root, path_group in chain(input_files.items(), output_files.items()):if asset_root not in combined_path_groups:combined_path_groups[asset_root] = path_groupelse:combined_path_groups[asset_root].combine_with_group(path_group)return combined_path_groups | 3 | 33 | 9 | 180 | 2 | 313 | 352 | 313 | s3_settings,attachments,farm_id,queue_id,job_id,step_id,task_id,session_action_id,session | ['output_files', 'input_files'] | dict[str, ManifestPathGroup] | {"AnnAssign": 1, "Assign": 3, "Expr": 2, "For": 1, "If": 1, "Return": 1} | 6 | 40 | 6 | ["get_job_input_paths_by_asset_root", "get_job_output_paths_by_asset_root", "chain", "input_files.items", "output_files.items", "combine_with_group"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_in_directory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_get_job_input_output_paths_by_asset_root"] | The function (get_job_input_output_paths_by_asset_root) is defined within the public class called public. The function starts at line 313 and ends at line 352. It contains 33 lines of code and has a cyclomatic complexity of 3. It takes 9 parameters and returns a dict[str, ManifestPathGroup]. It declares 6 functions; it has 6 functions called inside, which are ["get_job_input_paths_by_asset_root", "get_job_output_paths_by_asset_root", "chain", "input_files.items", "output_files.items", "combine_with_group"]; and it has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_in_directory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_get_job_input_output_paths_by_asset_root"]. |
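The combine step in the row above iterates `chain(input_files.items(), output_files.items())` and merges groups that share an asset root. A minimal sketch with ManifestPathGroup reduced to a plain list of paths (an assumption for illustration; the real class exposes combine_with_group):

```python
from itertools import chain


def combine_by_root(inputs: dict, outputs: dict) -> dict:
    # Merge two {asset_root: [paths]} dicts, concatenating lists when the
    # same root appears on both the input and output side.
    combined: dict[str, list[str]] = {}
    for asset_root, paths in chain(inputs.items(), outputs.items()):
        if asset_root not in combined:
            combined[asset_root] = list(paths)
        else:
            combined[asset_root].extend(paths)
    return combined
```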
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_new_copy_file_path | def _get_new_copy_file_path(local_file_name: Path,collision_lock: Lock,collision_file_dict: DefaultDict[str, int],) -> Path:with collision_lock:file_str: str = str(local_file_name)num: int = collision_file_dict[file_str]new_file_name = local_file_name# Iterate until we find a number we don't conflict withwhile True:try:# Handle multi-process locks with creating and/or opening file to verify if it existswith open(new_file_name, "x"):break# If file exists we go here and increment num to find a unique pathexcept FileExistsError:num += 1new_file_name = local_file_name.parent.joinpath(f"{local_file_name.stem} ({num}){local_file_name.suffix}")collision_file_dict[file_str] = numlocal_file_name = new_file_namereturn local_file_name | 3 | 21 | 3 | 87 | 2 | 355 | 380 | 355 | local_file_name,collision_lock,collision_file_dict | ['new_file_name', 'local_file_name'] | Path | {"AnnAssign": 2, "Assign": 4, "AugAssign": 1, "Return": 1, "Try": 1, "While": 1, "With": 2} | 3 | 26 | 3 | ["str", "open", "local_file_name.parent.joinpath"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_get_new_copy_file_path_file_collisions"] | The function (_get_new_copy_file_path) is defined within the public class called public. The function starts at line 355 and ends at line 380. It contains 21 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters and returns a Path. It declares 3 functions; it has 3 functions called inside, which are ["str", "open", "local_file_name.parent.joinpath"]; and it has 3 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_get_new_copy_file_path_file_collisions"]. |
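The collision-naming scheme recorded above relies on open's "x" (exclusive-creation) mode: creating the file, rather than merely checking its existence, also guards against races with other processes. A runnable sketch of the same loop, with the thread lock dropped for brevity (an assumption for illustration only):

```python
from pathlib import Path


def new_copy_file_path(local_file_name: Path, collision_file_dict: dict) -> Path:
    # collision_file_dict remembers the last suffix number used per path,
    # so repeated collisions don't rescan from 1 every time.
    file_str = str(local_file_name)
    num = collision_file_dict[file_str]
    new_file_name = local_file_name
    while True:
        try:
            # Exclusive creation: fails if the file already exists, and
            # atomically claims the name if it doesn't.
            with open(new_file_name, "x"):
                break
        except FileExistsError:
            num += 1
            new_file_name = local_file_name.parent.joinpath(
                f"{local_file_name.stem} ({num}){local_file_name.suffix}"
            )
    collision_file_dict[file_str] = num
    return new_file_name
```

Colliding downloads of `render.exr` thus land at `render (1).exr`, `render (2).exr`, and so on.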
aws-deadline_deadline-cloud | public | public | 0 | 0 | download_files_in_directory | def download_files_in_directory(s3_settings: JobAttachmentS3Settings,attachments: Attachments,farm_id: str,queue_id: str,job_id: str,directory_path: str,local_download_dir: str,session: Optional[boto3.Session] = None,on_downloading_files: Optional[Callable[[ProgressReportMetadata], bool]] = None,) -> DownloadSummaryStatistics:"""From a given job's input and output files, downloads all files inthe given directory path.(example of `directory_path`: "inputs/subdirectory1")(example of `local_download_dir`: "/home/username")"""all_grouped_paths = get_job_input_output_paths_by_asset_root(s3_settings=s3_settings,attachments=attachments,farm_id=farm_id,queue_id=queue_id,job_id=job_id,session=session,)# Group by hash algorithm all the files that fall under the directoryfiles_to_download: DefaultDict[HashAlgorithm, list[RelativeFilePath]] = DefaultDict(list)total_bytes = 0total_files = 0for path_group in all_grouped_paths.values():for hash_alg, path_list in path_group.files_by_hash_alg.items():files_list = [file for file in path_list if file.path.startswith(directory_path + "/")]files_size = sum([file.size for file in files_list])total_bytes += files_sizetotal_files += len(files_list)files_to_download[hash_alg].extend(files_list)# Sets up progress tracker to report download progress back to the caller.progress_tracker = ProgressTracker(status=ProgressStatus.DOWNLOAD_IN_PROGRESS,total_files=total_files,total_bytes=total_bytes,on_progress_callback=on_downloading_files,)num_download_workers = _get_num_download_workers()start_time = time.perf_counter()for hash_alg, file_paths in files_to_download.items():downloaded_files_paths = _download_files_parallel(file_paths,hash_alg,num_download_workers,local_download_dir,s3_settings.s3BucketName,s3_settings.full_cas_prefix(),progress_tracker=progress_tracker,)progress_tracker.total_time = time.perf_counter() - start_timereturn progress_tracker.get_download_summary_statistics({local_download_dir: downloaded_files_paths}) | 7 | 51 | 10 | 282 | 9 | 383 | 448 | 383 | s3_settings,attachments,farm_id,queue_id,job_id,directory_path,local_download_dir,session,on_downloading_files | ['total_files', 'downloaded_files_paths', 'total_bytes', 'files_size', 'all_grouped_paths', 'num_download_workers', 'progress_tracker', 'files_list', 'start_time'] | DownloadSummaryStatistics | {"AnnAssign": 1, "Assign": 10, "AugAssign": 2, "Expr": 2, "For": 3, "Return": 1} | 16 | 66 | 16 | ["get_job_input_output_paths_by_asset_root", "DefaultDict", "all_grouped_paths.values", "path_group.files_by_hash_alg.items", "file.path.startswith", "sum", "len", "extend", "ProgressTracker", "_get_num_download_workers", "time.perf_counter", "files_to_download.items", "_download_files_parallel", "s3_settings.full_cas_prefix", "time.perf_counter", "progress_tracker.get_download_summary_statistics"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_download_files_in_directory"] | The function (download_files_in_directory) is defined within the public class called public. The function starts at line 383 and ends at line 448. It contains 51 lines of code and has a cyclomatic complexity of 7. It takes 10 parameters and returns a DownloadSummaryStatistics. It declares 16 functions; it has 16 functions called inside, which are ["get_job_input_output_paths_by_asset_root", "DefaultDict", "all_grouped_paths.values", "path_group.files_by_hash_alg.items", "file.path.startswith", "sum", "len", "extend", "ProgressTracker", "_get_num_download_workers", "time.perf_counter", "files_to_download.items", "_download_files_parallel", "s3_settings.full_cas_prefix", "time.perf_counter", "progress_tracker.get_download_summary_statistics"]; and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_download_files_in_directory"]. |
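The filtering step in download_files_in_directory keeps only files whose manifest-relative path falls under `directory_path` (note the appended "/" that prevents "inputs/sub" from matching "inputs/subdirectory1"), and tallies the file/byte totals used to seed the progress tracker. A small sketch of just that step, with RelativeFilePath reduced to a two-field stand-in (an assumption for illustration):

```python
from dataclasses import dataclass


@dataclass
class RelFile:  # stand-in for RelativeFilePath
    path: str
    size: int


def files_under(directory_path: str, files: list[RelFile]):
    # Appending "/" makes this a directory-boundary match, not a plain prefix match.
    selected = [f for f in files if f.path.startswith(directory_path + "/")]
    total_bytes = sum(f.size for f in selected)
    return selected, len(selected), total_bytes
```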
aws-deadline_deadline-cloud | public | public | 0 | 0 | download_file.handler | def handler(bytes_downloaded):nonlocal progress_trackernonlocal futureif progress_tracker:should_continue = progress_tracker.track_progress_callback(bytes_downloaded)if not should_continue:future.cancel() | 3 | 7 | 1 | 29 | 0 | 516 | 523 | 516 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (download_file.handler) is defined within the public class called public. The function starts at line 516 and ends at line 523. It contains 7 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter and does not return any value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | download_file.process_client_error | def process_client_error(exc: ClientError, status_code: int):status_code_guidance = {**COMMON_ERROR_GUIDANCE_FOR_S3,403: (("Forbidden or Access denied. Please check your AWS credentials, and ensure that ""your AWS IAM Role or User has the 's3:GetObject' permission for this bucket. ")if "kms:" not in str(exc)else ("Forbidden or Access denied. Please check your AWS credentials and Job Attachments S3 bucket ""encryption settings. If a customer-managed KMS key is set, confirm that your AWS IAM Role or ""User has the 'kms:Decrypt' and 'kms:DescribeKey' permissions for the key used to encrypt the bucket.")),404: ("Not found. Please check your bucket name and object key, and ensure that they exist in the AWS account."),}raise JobAttachmentsS3ClientError(action="downloading file",status_code=status_code,bucket_name=s3_bucket,key_or_prefix=s3_key,message=f"{status_code_guidance.get(status_code, '')} {str(exc)} (Failed to download the file to {str(local_file_path)})",) from exc | 2 | 26 | 2 | 74 | 0 | 544 | 569 | 544 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (download_file.process_client_error) is defined within the public class called public. The function starts at line 544 and ends at line 569. It contains 26 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters and does not return any value. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | download_file | def download_file(file: RelativeFilePath,hash_algorithm: HashAlgorithm,local_download_dir: str,collision_lock: Lock,collision_file_dict: DefaultDict[str, int],s3_bucket: str,cas_prefix: Optional[str],s3_client: Optional[BaseClient] = None,session: Optional[boto3.Session] = None,modified_time_override: Optional[float] = None,progress_tracker: Optional[ProgressTracker] = None,file_conflict_resolution: Optional[FileConflictResolution] = FileConflictResolution.CREATE_COPY,) -> Tuple[int, Optional[Path]]:"""Downloads a file from the S3 bucket to the local directory. `modified_time_override` is ignored if the manifestversion used supports timestamps.Returns a tuple of (size in bytes, filename) of the downloaded file.- The file size of 0 means that this file comes from a manifest version that does not provide file sizes.- The filename of None indicates that this file has been skipped or has not been downloaded."""if not s3_client:s3_client = get_s3_client(session=session)transfer_manager = get_s3_transfer_manager(s3_client=s3_client)# The modified time in the manifest is in microseconds, but utime requires the time be expressed in seconds.modified_time_override = file.mtime / 1000000# type: ignore[attr-defined]file_bytes = file.size# Python will handle the path separator '/' correctly on every platform.local_file_path: Path = _get_long_path_compatible_path(Path(local_download_dir).joinpath(file.path))s3_key = (f"{cas_prefix}/{file.hash}.{hash_algorithm.value}"if cas_prefixelse f"{file.hash}.{hash_algorithm.value}")# If the file name already exists, resolve the conflict based on the file_conflict_resolutionif local_file_path.is_file():if file_conflict_resolution == FileConflictResolution.SKIP:return (file_bytes, None)elif file_conflict_resolution == FileConflictResolution.OVERWRITE:passelif file_conflict_resolution == FileConflictResolution.CREATE_COPY:copy_local_file_path = 
_get_new_copy_file_path(local_file_path, collision_lock, collision_file_dict)# Re-run _get_long_path_compatible_path for updated file name after file conflict resolution# _get_long_path_compatible_path is idempotent, so it doesn't re-process an existing long pathlocal_file_path = _get_long_path_compatible_path(copy_local_file_path)else:raise ValueError(f"Unknown choice for file conflict resolution: {file_conflict_resolution}")local_file_path.parent.mkdir(parents=True, exist_ok=True)future: concurrent.futures.Futuredef handler(bytes_downloaded):nonlocal progress_trackernonlocal futureif progress_tracker:should_continue = progress_tracker.track_progress_callback(bytes_downloaded)if not should_continue:future.cancel()subscribers = [ProgressCallbackInvoker(handler)]future = transfer_manager.download(bucket=s3_bucket,key=s3_key,fileobj=str(local_file_path),extra_args={"ExpectedBucketOwner": get_account_id(session=session)},subscribers=subscribers,)try:future.result()except concurrent.futures.CancelledError as ce:if progress_tracker and progress_tracker.continue_reporting is False:raise AssetSyncCancelledError("File download cancelled.")else:raise AssetSyncError("File download failed.", ce) from ceexcept ClientError as exc:def process_client_error(exc: ClientError, status_code: int):status_code_guidance = {**COMMON_ERROR_GUIDANCE_FOR_S3,403: (("Forbidden or Access denied. Please check your AWS credentials, and ensure that ""your AWS IAM Role or User has the 's3:GetObject' permission for this bucket. ")if "kms:" not in str(exc)else ("Forbidden or Access denied. Please check your AWS credentials and Job Attachments S3 bucket ""encryption settings. If a customer-managed KMS key is set, confirm that your AWS IAM Role or ""User has the 'kms:Decrypt' and 'kms:DescribeKey' permissions for the key used to encrypt the bucket.")),404: ("Not found. 
Please check your bucket name and object key, and ensure that they exist in the AWS account."),}raise JobAttachmentsS3ClientError(action="downloading file",status_code=status_code,bucket_name=s3_bucket,key_or_prefix=s3_key,message=f"{status_code_guidance.get(status_code, '')} {str(exc)} (Failed to download the file to {str(local_file_path)})",) from exc# TODO: Temporary to prevent breaking backwards-compatibility; if file not found, try again without hash alg postfixstatus_code = int(exc.response["ResponseMetadata"]["HTTPStatusCode"])if status_code == 404:s3_key = s3_key.rsplit(".", 1)[0]future = transfer_manager.download(bucket=s3_bucket,key=s3_key,fileobj=str(local_file_path),extra_args={"ExpectedBucketOwner": get_account_id(session=session)},subscribers=subscribers,)try:future.result()except concurrent.futures.CancelledError as ce:if progress_tracker and progress_tracker.continue_reporting is False:raise AssetSyncCancelledError("File download cancelled.")else:raise AssetSyncError("File download failed.", ce) from ceexcept ClientError as secondExc:status_code = int(exc.response["ResponseMetadata"]["HTTPStatusCode"])process_client_error(secondExc, status_code)else:process_client_error(exc, status_code)except BotoCoreError as bce:raise JobAttachmentS3BotoCoreError(action="downloading file",error_details=str(bce),) from bceexcept Exception as e:raise AssetSyncError(e) from edownload_logger.debug(f"Downloaded {file.path} to {str(local_file_path)}")os.utime(local_file_path, (modified_time_override, modified_time_override))# type: ignore[arg-type]return (file_bytes, local_file_path) | 18 | 93 | 12 | 542 | 12 | 451 | 605 | 451 | file,hash_algorithm,local_download_dir,collision_lock,collision_file_dict,s3_bucket,cas_prefix,s3_client,session,modified_time_override,progress_tracker,file_conflict_resolution | ['s3_key', 'future', 'subscribers', 's3_client', 'status_code', 'modified_time_override', 'status_code_guidance', 'transfer_manager', 'file_bytes', 'should_continue', 
'local_file_path', 'copy_local_file_path'] | Tuple[int, Optional[Path]] | {"AnnAssign": 2, "Assign": 15, "Expr": 9, "If": 10, "Return": 2, "Try": 2} | 41 | 155 | 41 | ["get_s3_client", "get_s3_transfer_manager", "_get_long_path_compatible_path", "joinpath", "Path", "local_file_path.is_file", "_get_new_copy_file_path", "_get_long_path_compatible_path", "ValueError", "local_file_path.parent.mkdir", "progress_tracker.track_progress_callback", "future.cancel", "ProgressCallbackInvoker", "transfer_manager.download", "str", "get_account_id", "future.result", "AssetSyncCancelledError", "AssetSyncError", "str", "JobAttachmentsS3ClientError", "status_code_guidance.get", "str", "str", "int", "s3_key.rsplit", "transfer_manager.download", "str", "get_account_id", "future.result", "AssetSyncCancelledError", "AssetSyncError", "int", "process_client_error", "process_client_error", "JobAttachmentS3BotoCoreError", "str", "AssetSyncError", "download_logger.debug", "str", "os.utime"] | 14 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917825_cytomine_cytomine_python_client.cytomine.models._utilities.dump_py.generic_image_dump", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917825_cytomine_cytomine_python_client.cytomine.models.image_py.ImageInstance.window", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917825_cytomine_cytomine_python_client.cytomine.models.image_py.SliceInstance.window", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917825_cytomine_cytomine_python_client.cytomine.models.property_py.AttachedFile.download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.splitgraph.commandline.cloud_py.download_c", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.splitgraph.ingestion.sqlite.__init___py.minio_file", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.src.hyp3_sdk.jobs_py.Job.download_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload._test_create_copy_long_path_scenario", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.download_file_and_check_exception", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_file_error_message_on_access_denied", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_file_error_message_on_timeout", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_windows_long_path_UNC_notation_WindowsOS", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_windows_long_path_UNC_notation_and_registry_WindowsOS"] | The function (download_file) defined within the public class called public.The function start at line 451 and ends at 605. It contains 93 lines of code and it has a cyclomatic complexity of 18. It takes 12 parameters, represented as [451.0] and does not return any value. 
It calls 41 functions inside, which are ["get_s3_client", "get_s3_transfer_manager", "_get_long_path_compatible_path", "joinpath", "Path", "local_file_path.is_file", "_get_new_copy_file_path", "_get_long_path_compatible_path", "ValueError", "local_file_path.parent.mkdir", "progress_tracker.track_progress_callback", "future.cancel", "ProgressCallbackInvoker", "transfer_manager.download", "str", "get_account_id", "future.result", "AssetSyncCancelledError", "AssetSyncError", "str", "JobAttachmentsS3ClientError", "status_code_guidance.get", "str", "str", "int", "s3_key.rsplit", "transfer_manager.download", "str", "get_account_id", "future.result", "AssetSyncCancelledError", "AssetSyncError", "int", "process_client_error", "process_client_error", "JobAttachmentS3BotoCoreError", "str", "AssetSyncError", "download_logger.debug", "str", "os.utime"], and has 14 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917825_cytomine_cytomine_python_client.cytomine.models._utilities.dump_py.generic_image_dump", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917825_cytomine_cytomine_python_client.cytomine.models.image_py.ImageInstance.window", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917825_cytomine_cytomine_python_client.cytomine.models.image_py.SliceInstance.window", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917825_cytomine_cytomine_python_client.cytomine.models.property_py.AttachedFile.download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.splitgraph.commandline.cloud_py.download_c", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69993433_splitgraph_sgr.splitgraph.ingestion.sqlite.__init___py.minio_file", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.src.hyp3_sdk.jobs_py.Job.download_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_file", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload._test_create_copy_long_path_scenario", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.download_file_and_check_exception", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_file_error_message_on_access_denied", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_file_error_message_on_timeout", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_windows_long_path_UNC_notation_WindowsOS", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_windows_long_path_UNC_notation_and_registry_WindowsOS"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _download_files_parallel | def _download_files_parallel(files: List[RelativeFilePath],hash_algorithm: HashAlgorithm,num_download_workers: int,local_download_dir: str,s3_bucket: str,cas_prefix: Optional[str],s3_client: Optional[BaseClient] = None,session: Optional[boto3.Session] = None,file_mod_time: Optional[float] = None,progress_tracker: Optional[ProgressTracker] = None,file_conflict_resolution: Optional[FileConflictResolution] = FileConflictResolution.CREATE_COPY,) -> list[str]:"""Downloads files in parallel using thread pool.Returns a list of local paths of downloaded files."""downloaded_file_names: list[str] = []collision_lock: Lock = Lock()collision_file_dict: DefaultDict[str, int] = DefaultDict(int)with concurrent.futures.ThreadPoolExecutor(max_workers=num_download_workers) as executor:futures = {executor.submit(download_file,file,hash_algorithm,local_download_dir,collision_lock,collision_file_dict,s3_bucket,cas_prefix,s3_client,session,file_mod_time,progress_tracker,file_conflict_resolution,): filefor file in files}# surfaces any exceptions in the threadfor future in concurrent.futures.as_completed(futures):(file_bytes, local_file_name) = future.result()if local_file_name:downloaded_file_names.append(str(local_file_name.resolve()))if progress_tracker:progress_tracker.increase_processed(1, 0)progress_tracker.report_progress()else:if progress_tracker:progress_tracker.increase_skipped(1, file_bytes)progress_tracker.report_progress()# to report progress 100% at the endif progress_tracker:progress_tracker.report_progress()return downloaded_file_names | 7 | 49 | 11 | 256 | 1 | 608 | 665 | 608 | files,hash_algorithm,num_download_workers,local_download_dir,s3_bucket,cas_prefix,s3_client,session,file_mod_time,progress_tracker,file_conflict_resolution | ['futures'] | list[str] | {"AnnAssign": 3, "Assign": 2, "Expr": 7, "For": 1, "If": 4, "Return": 1, "With": 1} | 14 | 58 | 14 | ["Lock", "DefaultDict", 
"concurrent.futures.ThreadPoolExecutor", "executor.submit", "concurrent.futures.as_completed", "future.result", "downloaded_file_names.append", "str", "local_file_name.resolve", "progress_tracker.increase_processed", "progress_tracker.report_progress", "progress_tracker.increase_skipped", "progress_tracker.report_progress", "progress_tracker.report_progress"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_in_directory"] | The function (_download_files_parallel) defined within the public class called public.The function start at line 608 and ends at 665. It contains 49 lines of code and it has a cyclomatic complexity of 7. It takes 11 parameters, represented as [608.0] and does not return any value. 
It calls 14 functions inside, which are ["Lock", "DefaultDict", "concurrent.futures.ThreadPoolExecutor", "executor.submit", "concurrent.futures.as_completed", "future.result", "downloaded_file_names.append", "str", "local_file_name.resolve", "progress_tracker.increase_processed", "progress_tracker.report_progress", "progress_tracker.increase_skipped", "progress_tracker.report_progress", "progress_tracker.report_progress"], and has 3 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_in_directory"]. |
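The _download_files_parallel row above follows the standard thread-pool fan-out: submit one task per file, then drain results with `as_completed`, which re-raises any worker exception in the caller, and count a file as skipped when the worker returns no local path. A runnable sketch of that structure; `fake_download` is a hypothetical stand-in for the per-file `download_file` call:

```python
import concurrent.futures

def fake_download(name):
    # Stand-in for download_file: returns (size, local_path),
    # with path None when the file was skipped.
    if name.endswith(".skip"):
        return (10, None)
    return (10, f"/tmp/{name}")

def download_parallel(files, workers=4):
    downloaded = []
    skipped = 0
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as executor:
        futures = {executor.submit(fake_download, f): f for f in files}
        # as_completed surfaces any exception raised in a worker thread here.
        for future in concurrent.futures.as_completed(futures):
            size, path = future.result()
            if path:
                downloaded.append(path)
            else:
                skipped += 1
    return downloaded, skipped

paths, skipped = download_parallel(["a.png", "b.skip", "c.png"])
print(sorted(paths), skipped)  # ['/tmp/a.png', '/tmp/c.png'] 1
```

Note that `as_completed` yields futures in completion order, not submission order, which is why the results are sorted before printing.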
aws-deadline_deadline-cloud | public | public | 0 | 0 | download_files | def download_files(files: list[RelativeFilePath],hash_algorithm: HashAlgorithm,local_download_dir: str,s3_settings: JobAttachmentS3Settings,session: Optional[boto3.Session] = None,progress_tracker: Optional[ProgressTracker] = None,file_conflict_resolution: Optional[FileConflictResolution] = FileConflictResolution.CREATE_COPY,) -> list[str]:"""Downloads all files from the S3 bucket in the Job Attachment settings to the specified directory.Returns a list of local paths of downloaded files."""s3_client = get_s3_client(session=session)num_download_workers = _get_num_download_workers()file_mod_time: float = datetime.now().timestamp()return _download_files_parallel(files,hash_algorithm,num_download_workers,local_download_dir,s3_settings.s3BucketName,s3_settings.full_cas_prefix(),s3_client,session,file_mod_time,progress_tracker,file_conflict_resolution,) | 1 | 25 | 7 | 118 | 2 | 668 | 698 | 668 | files,hash_algorithm,local_download_dir,s3_settings,session,progress_tracker,file_conflict_resolution | ['s3_client', 'num_download_workers'] | list[str] | {"AnnAssign": 1, "Assign": 2, "Expr": 1, "Return": 1} | 6 | 31 | 6 | ["get_s3_client", "_get_num_download_workers", "timestamp", "datetime.now", "_download_files_parallel", "s3_settings.full_cas_prefix"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.download_job_output"] | The function (download_files), defined within the public class called public, starts at line 668 and ends at line 698. It contains 25 lines of code and has a cyclomatic complexity of 1. It takes 7 parameters and returns a list[str].
It calls 6 functions inside, which are ["get_s3_client", "_get_num_download_workers", "timestamp", "datetime.now", "_download_files_parallel", "s3_settings.full_cas_prefix"], and has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.download_job_output"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_job_output_paths_by_asset_root | def get_job_output_paths_by_asset_root(s3_settings: JobAttachmentS3Settings,farm_id: str,queue_id: str,job_id: str,step_id: Optional[str] = None,task_id: Optional[str] = None,session_action_id: Optional[str] = None,session: Optional[boto3.Session] = None,) -> dict[str, ManifestPathGroup]:"""Gets dict of grouped paths of all output files of a given job.The grouped paths are separated by asset root.Returns a dict of ManifestPathGroups, with the root path as the key."""output_manifests_by_root = get_output_manifests_by_asset_root(s3_settings, farm_id, queue_id, job_id, step_id, task_id, session_action_id, session=session)outputs: dict[str, ManifestPathGroup] = {}for root, manifests in output_manifests_by_root.items():for manifest in manifests:if root not in outputs:outputs[root] = ManifestPathGroup()outputs[root].add_manifest_to_group(manifest)return outputs | 4 | 20 | 8 | 140 | 1 | 701 | 727 | 701 | s3_settings,farm_id,queue_id,job_id,step_id,task_id,session_action_id,session | ['output_manifests_by_root'] | dict[str, ManifestPathGroup] | {"AnnAssign": 1, "Assign": 2, "Expr": 2, "For": 2, "If": 1, "Return": 1} | 4 | 27 | 4 | ["get_output_manifests_by_asset_root", "output_manifests_by_root.items", "ManifestPathGroup", "add_manifest_to_group"] | 4 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_output_paths_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_get_job_output_paths_by_asset_root", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_get_job_output_paths_by_asset_root_when_no_asset_root_throws_error"] | The function (get_job_output_paths_by_asset_root) defined within the public class called public.The function start at line 701 and ends at 727. It contains 20 lines of code and it has a cyclomatic complexity of 4. It takes 8 parameters, represented as [701.0] and does not return any value. It declares 4.0 functions, It has 4.0 functions called inside which are ["get_output_manifests_by_asset_root", "output_manifests_by_root.items", "ManifestPathGroup", "add_manifest_to_group"], It has 4.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_output_paths_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_get_job_output_paths_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.assert_get_job_output_paths_by_asset_root_when_no_asset_root_throws_error"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | get_output_manifests_by_asset_root | def get_output_manifests_by_asset_root(s3_settings: JobAttachmentS3Settings,farm_id: str,queue_id: str,job_id: str,step_id: Optional[str] = None,task_id: Optional[str] = None,session_action_id: Optional[str] = None,session: Optional[boto3.Session] = None,) -> dict[str, list[BaseAssetManifest]]:"""For a given job/step/task, gets a map from each root path to a corresponding list ofoutput manifests."""outputs: DefaultDict[str, list[BaseAssetManifest]] = DefaultDict(list)manifest_prefix: str = _get_output_manifest_prefix(s3_settings, farm_id, queue_id, job_id, step_id, task_id, session_action_id)try:manifests_keys: list[str] = _get_tasks_manifests_keys_from_s3(manifest_prefix, s3_settings.s3BucketName, session=session)except JobAttachmentsError:return outputswith concurrent.futures.ThreadPoolExecutor(max_workers=S3_DOWNLOAD_MAX_CONCURRENCY) as executor:futures = [executor.submit(get_asset_root_and_manifest_from_s3, key, s3_settings.s3BucketName, session)for key in manifests_keys]for key, future in zip(manifests_keys, futures):asset_root, asset_manifest = future.result()if not asset_root:raise MissingAssetRootError(f"Failed to get asset root from metadata of output manifest: {key}")outputs[asset_root].append(asset_manifest)return outputs | 5 | 35 | 8 | 209 | 1 | 730 | 770 | 730 | s3_settings,farm_id,queue_id,job_id,step_id,task_id,session_action_id,session | ['futures'] | dict[str, list[BaseAssetManifest]] | {"AnnAssign": 3, "Assign": 2, "Expr": 2, "For": 1, "If": 1, "Return": 2, "Try": 1, "With": 1} | 9 | 41 | 9 | ["DefaultDict", "_get_output_manifest_prefix", "_get_tasks_manifests_keys_from_s3", "concurrent.futures.ThreadPoolExecutor", "executor.submit", "zip", "future.result", "MissingAssetRootError", "append"] | 4 | 
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._aggregate_asset_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_output_paths_by_asset_root"] | The function (get_output_manifests_by_asset_root) defined within the public class called public.The function start at line 730 and ends at 770. It contains 35 lines of code and it has a cyclomatic complexity of 5. It takes 8 parameters, represented as [730.0] and does not return any value. 
It calls 9 functions inside, which are ["DefaultDict", "_get_output_manifest_prefix", "_get_tasks_manifests_keys_from_s3", "concurrent.futures.ThreadPoolExecutor", "executor.submit", "zip", "future.result", "MissingAssetRootError", "append"], and has 4 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._aggregate_asset_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_output_paths_by_asset_root"]. |
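The get_output_manifests_by_asset_root row above submits one fetch per manifest key and then pairs each key with its own future via `zip(manifests_keys, futures)`, so a failure can be reported against the exact key that produced it. A sketch of that pairing; `fetch` is a hypothetical stand-in for the real S3 manifest read:

```python
import concurrent.futures

def fetch(key):
    # Stand-in for the S3 call: returns (asset_root, manifest) for a key.
    return (f"/root/{key.split('_')[0]}", f"manifest-for-{key}")

def fetch_manifests(keys):
    outputs = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as executor:
        futures = [executor.submit(fetch, key) for key in keys]
        # zip keeps each key aligned with its own future, so the error
        # message below can name the key that failed.
        for key, future in zip(keys, futures):
            asset_root, manifest = future.result()
            if not asset_root:
                raise RuntimeError(f"Failed to get asset root for: {key}")
            outputs.setdefault(asset_root, []).append(manifest)
    return outputs

print(fetch_manifests(["k1_output", "k2_output"]))
```

Unlike `as_completed`, this consumes results in submission order, which is what makes the key/future pairing valid.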
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_output_manifest_files_by_asset_root_with_last_modified | def _get_output_manifest_files_by_asset_root_with_last_modified(s3_settings: JobAttachmentS3Settings,output_manifest_paths: List[str],session: Optional[boto3.Session] = None,) -> list[Tuple[str, datetime, BaseAssetManifest]]:"""For a given list of output manifest paths, returns a list of tuples containing(asset_root, last_modified, manifest) that exactly mirrors the provided output_manifest_paths.Returns:A list of tuples containing (asset_root, last_modified, manifest) in the same order asthe provided output_manifest_paths."""outputs: List[Tuple[str, datetime, BaseAssetManifest]] = [None] * len(output_manifest_paths)# type: ignore[list-item]with concurrent.futures.ThreadPoolExecutor(max_workers=S3_DOWNLOAD_MAX_CONCURRENCY) as executor:# Submit all tasks and store futures in a list that preserves the original orderfutures = []for key in output_manifest_paths:future = executor.submit(_get_asset_root_and_manifest_from_s3_with_last_modified,key,s3_settings.s3BucketName,session,)futures.append(future)# Process results using explicit index-based iteration to ensure order preservationfor index in range(len(output_manifest_paths)):asset_root, last_modified, asset_manifest = futures[index].result()if not asset_root:raise MissingAssetRootError(f"Failed to get asset root from metadata of output manifest: {output_manifest_paths[index]}")outputs[index] = (asset_root, last_modified, asset_manifest)return outputs | 4 | 24 | 3 | 156 | 2 | 773 | 809 | 773 | s3_settings,output_manifest_paths,session | ['futures', 'future'] | list[Tuple[str, datetime, BaseAssetManifest]] | {"AnnAssign": 1, "Assign": 4, "Expr": 2, "For": 2, "If": 1, "Return": 1, "With": 1} | 8 | 37 | 8 | ["len", "concurrent.futures.ThreadPoolExecutor", "executor.submit", "futures.append", "range", "len", "result", "MissingAssetRootError"] | 0 | [] | The function 
(_get_output_manifest_files_by_asset_root_with_last_modified), defined within the public class called public, starts at line 773 and ends at line 809. It contains 24 lines of code and has a cyclomatic complexity of 4. It takes 3 parameters (s3_settings, output_manifest_paths, session) and returns a list[Tuple[str, datetime, BaseAssetManifest]]. It calls 8 functions inside, which are ["len", "concurrent.futures.ThreadPoolExecutor", "executor.submit", "futures.append", "range", "len", "result", "MissingAssetRootError"]. |
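The row above preserves input order under concurrency: it preallocates the output list and writes each future's result back at its input index, so the output exactly mirrors `output_manifest_paths` even when tasks finish out of order. A runnable sketch of the pattern; `fetch` is a hypothetical stand-in for the real S3 call:

```python
import concurrent.futures

def fetch(key):
    # Stand-in for the per-key S3 fetch.
    return key.upper()

def fetch_all_in_order(keys):
    # Preallocate so results can be written back by input index.
    outputs = [None] * len(keys)
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
        futures = [executor.submit(fetch, key) for key in keys]
        # Explicit index-based iteration guarantees order preservation.
        for index in range(len(keys)):
            outputs[index] = futures[index].result()
    return outputs

print(fetch_all_in_order(["b", "a", "c"]))  # ['B', 'A', 'C']
```

Waiting on `futures[0]` first costs nothing in total latency: all tasks run concurrently regardless of the order results are collected.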
aws-deadline_deadline-cloud | public | public | 0 | 0 | download_files_from_manifests | def download_files_from_manifests(s3_bucket: str,manifests_by_root: dict[str, BaseAssetManifest],cas_prefix: Optional[str] = None,fs_permission_settings: Optional[FileSystemPermissionSettings] = None,session: Optional[boto3.Session] = None,on_downloading_files: Optional[Callable[[ProgressReportMetadata], bool]] = None,logger: Optional[Union[Logger, LoggerAdapter]] = None,conflict_resolution: FileConflictResolution = FileConflictResolution.CREATE_COPY,) -> DownloadSummaryStatistics:"""Given manifests, downloads all files from a CAS in each manifest.Args:s3_bucket: The name of the S3 bucket.manifests_by_root: a map from each local root path to a corresponding list of tuples of manifest contents and their path.cas_prefix: The CAS prefix of the files.session: The boto3 session to use.on_downloading_files: a callback to be called to periodically report progress to the caller.The callback returns True if the operation should continue as normal, or False to cancel.Returns:The download summary statistics."""s3_client = get_s3_client(session=session)num_download_workers = _get_num_download_workers()file_mod_time = datetime.now().timestamp()# Sets up progress tracker to report download progress back to the caller.total_size = 0total_files = 0for manifest in manifests_by_root.values():total_files += len(manifest.paths)total_size += manifest.totalSize# type: ignore[attr-defined]progress_tracker = ProgressTracker(status=ProgressStatus.DOWNLOAD_IN_PROGRESS,total_files=total_files,total_bytes=total_size,on_progress_callback=on_downloading_files,logger=logger,)start_time = time.perf_counter()downloaded_files_paths_by_root: DefaultDict[str, list[str]] = DefaultDict(list)for local_download_dir, manifest in manifests_by_root.items():downloaded_files_paths = 
_download_files_parallel(manifest.paths,manifest.hashAlg,num_download_workers,local_download_dir,s3_bucket,cas_prefix,s3_client,session,file_mod_time,progress_tracker=progress_tracker,file_conflict_resolution=conflict_resolution,)if fs_permission_settings is not None:_set_fs_group(file_paths=downloaded_files_paths,local_root=local_download_dir,fs_permission_settings=fs_permission_settings,)downloaded_files_paths_by_root[local_download_dir].extend(downloaded_files_paths)progress_tracker.total_time = time.perf_counter() - start_timereturn progress_tracker.get_download_summary_statistics(downloaded_files_paths_by_root) | 4 | 50 | 9 | 283 | 8 | 812 | 882 | 812 | s3_bucket,manifests_by_root,cas_prefix,fs_permission_settings,session,on_downloading_files,logger,conflict_resolution | ['total_files', 'downloaded_files_paths', 's3_client', 'file_mod_time', 'total_size', 'num_download_workers', 'progress_tracker', 'start_time'] | DownloadSummaryStatistics | {"AnnAssign": 1, "Assign": 9, "AugAssign": 2, "Expr": 3, "For": 2, "If": 1, "Return": 1} | 15 | 71 | 15 | ["get_s3_client", "_get_num_download_workers", "timestamp", "datetime.now", "manifests_by_root.values", "len", "ProgressTracker", "time.perf_counter", "DefaultDict", "manifests_by_root.items", "_download_files_parallel", "_set_fs_group", "extend", "time.perf_counter", "progress_tracker.get_download_summary_statistics"] | 8 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.attachment_py._attachment_download_with_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.copied_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_files_from_manifests_have_correct_group_posix", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_files_from_manifests_have_correct_group_windows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_files_from_manifests_with_fs_permission_settings_posix", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_files_from_manifests_with_fs_permission_settings_windows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_download_files_from_manifests"] | The function (download_files_from_manifests) is a public function. It starts at line 812 and ends at line 882. It contains 50 lines of code and has a cyclomatic complexity of 4. It takes 9 parameters and returns a DownloadSummaryStatistics. 
It calls 15 functions: ["get_s3_client", "_get_num_download_workers", "timestamp", "datetime.now", "manifests_by_root.values", "len", "ProgressTracker", "time.perf_counter", "DefaultDict", "manifests_by_root.items", "_download_files_parallel", "_set_fs_group", "extend", "time.perf_counter", "progress_tracker.get_download_summary_statistics"], and it is called by 8 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.attachment_py._attachment_download_with_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.copied_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_files_from_manifests_have_correct_group_posix", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_files_from_manifests_have_correct_group_windows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_files_from_manifests_with_fs_permission_settings_posix", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.TestFullDownload.test_download_files_from_manifests_with_fs_permission_settings_windows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_download_files_from_manifests"]. |
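The up-front totals loop in download_files_from_manifests (summing file counts and byte sizes across all manifests before any download starts) can be sketched as follows; the FakeManifest class and the sample values are illustrative stand-ins for BaseAssetManifest, not the library's actual types.

```python
from dataclasses import dataclass

# Hypothetical stand-in exposing only the two fields the totals loop reads.
@dataclass
class FakeManifest:
    paths: list
    totalSize: int

manifests_by_root = {
    "/tmp/rootA": FakeManifest(paths=["a.txt", "b.txt"], totalSize=300),
    "/tmp/rootB": FakeManifest(paths=["c.txt"], totalSize=100),
}

# Accumulate overall file and byte counts up front, as a progress tracker
# needs the grand totals before the first download begins.
total_files = sum(len(m.paths) for m in manifests_by_root.values())
total_size = sum(m.totalSize for m in manifests_by_root.values())

print(total_files, total_size)  # → 3 400
```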
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_num_download_workers | def _get_num_download_workers() -> int:"""Determines the max number of thread workers for downloading multiple files in parallel,based on the allowed S3 max pool connections size. If the max worker count is calculatedto be 0 due to a small pool connections size limit, it returns 1."""num_download_workers = int(get_s3_max_pool_connections() / S3_DOWNLOAD_MAX_CONCURRENCY)if num_download_workers <= 0:# This can result in triggering "Connection pool is full" warning messages during downloads.num_download_workers = 1return num_download_workers | 2 | 5 | 0 | 27 | 1 | 885 | 895 | 885 | ['num_download_workers'] | int | {"Assign": 2, "Expr": 1, "If": 1, "Return": 1} | 2 | 11 | 2 | ["int", "get_s3_max_pool_connections"] | 4 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_manifest_paths", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_in_directory"] | The function (_get_num_download_workers) is a public function. It starts at line 885 and ends at line 895. It contains 5 lines of code and has a cyclomatic complexity of 2. It takes no parameters and returns an int. 
It calls 2 functions: ["int", "get_s3_max_pool_connections"], and it is called by 4 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments._incremental_downloads._manifest_s3_downloads_py._download_manifest_paths", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_in_directory"]. | |
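The clamp in _get_num_download_workers can be sketched as below; the pool size is passed in explicitly and the S3_DOWNLOAD_MAX_CONCURRENCY value of 10 is an assumed stand-in, not necessarily the library's real constant.

```python
# Assumed stand-in for the library's S3_DOWNLOAD_MAX_CONCURRENCY constant.
S3_DOWNLOAD_MAX_CONCURRENCY = 10

def get_num_download_workers(max_pool_connections: int) -> int:
    # Integer division can yield 0 for small pool limits; clamp to 1 so at
    # least one download worker runs (at the cost of possible
    # "Connection pool is full" warnings).
    return max(1, max_pool_connections // S3_DOWNLOAD_MAX_CONCURRENCY)

print(get_num_download_workers(50))  # → 5
print(get_num_download_workers(4))   # → 1
```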
aws-deadline_deadline-cloud | public | public | 0 | 0 | _set_fs_group | def _set_fs_group(file_paths: list[str],local_root: str,fs_permission_settings: FileSystemPermissionSettings,) -> None:"""Sets file system group ownership and permissions for all files and directoriesin the given paths, starting from root. It is expected that all `file_paths`point to files, not directories.Raises:TypeError: If the `fs_permission_settings` are not specific to the underlying OS."""if os.name == "posix":if not isinstance(fs_permission_settings, PosixFileSystemPermissionSettings):raise TypeError("The file system permission settings must be specific to Posix-based system.")_set_fs_group_for_posix(file_paths=file_paths,local_root=local_root,fs_permission_settings=fs_permission_settings,)else:# if os.name is not "posix"if not isinstance(fs_permission_settings, WindowsFileSystemPermissionSettings):raise TypeError("The file system permission settings must be specific to Windows.")_set_fs_permission_for_windows(file_paths=file_paths,local_root=local_root,fs_permission_settings=fs_permission_settings,) | 4 | 23 | 3 | 89 | 0 | 898 | 928 | 898 | file_paths,local_root,fs_permission_settings | [] | None | {"Expr": 3, "If": 3} | 6 | 31 | 6 | ["isinstance", "TypeError", "_set_fs_group_for_posix", "isinstance", "TypeError", "_set_fs_permission_for_windows"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.mount_vfs_from_manifests"] | The function (_set_fs_group) is a public function. It starts at line 898 and ends at line 928. It contains 23 lines of code and has a cyclomatic complexity of 4. It takes 3 parameters and does not return a value. 
It calls 6 functions: ["isinstance", "TypeError", "_set_fs_group_for_posix", "isinstance", "TypeError", "_set_fs_permission_for_windows"], and it is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.download_files_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.mount_vfs_from_manifests"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | merge_asset_manifests | def merge_asset_manifests(manifests: list[BaseAssetManifest]) -> BaseAssetManifest | None:"""Merge files from multiple manifests into a single list, ensuring that each filenameis unique by keeping the one from the last encountered manifest. (Thus, the steps'outputs are downloaded over the input job attachments.)Args:manifests (list[AssetManifest]): A list of manifests to be merged.Raises:NotImplementedError: When two manifests have different hash algorithms.All manifests must use the same hash algorithm.Returns:AssetManifest | None: A single manifest containing the merged paths of all provided manifests or None if no manifests were provided"""if len(manifests) == 0:return Noneelif len(manifests) == 1:return manifests[0]first_manifest = manifests[0]hash_alg: HashAlgorithm = first_manifest.hashAlgmerged_paths: dict[str, RelativeFilePath] = dict()total_size: int = 0# Loop each manifestfor manifest in manifests:if manifest.hashAlg != hash_alg:raise NotImplementedError(f"Merging manifests with different hash algorithms is not supported.{manifest.hashAlg.value} does not match {hash_alg.value}")for path in manifest.paths:merged_paths[path.path] = pathmanifest_args: dict[str, Any] = {"hash_alg": hash_alg,"paths": list(merged_paths.values()),}total_size = sum([path.size for path in merged_paths.values()])# type: ignoremanifest_args["total_size"] = total_sizeoutput_manifest: BaseAssetManifest = first_manifest.__class__(**manifest_args)return output_manifest | 7 | 24 | 1 | 164 | 2 | 931 | 976 | 931 | manifests | ['total_size', 'first_manifest'] | BaseAssetManifest | None | {"AnnAssign": 5, "Assign": 4, "Expr": 1, "For": 2, "If": 3, "Return": 3} | 9 | 46 | 9 | ["len", "len", "dict", "NotImplementedError", "list", "merged_paths.values", "sum", "merged_paths.values", "first_manifest.__class__"] | 10 | 
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_merge", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._aggregate_asset_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._merge_asset_manifests_sorted_asc_by_last_modified", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.handle_existing_vfs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_download_files_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_merge_asset_manifest_single", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_merge_asset_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_merge_asset_manifests_empty"] | The function (merge_asset_manifests) is a public function. It starts at line 931 and ends at line 976. It contains 24 lines of code and has a cyclomatic complexity of 7. It takes 1 parameter and returns a BaseAssetManifest or None. It calls 9 functions: ["len", "len", "dict", "NotImplementedError", "list", "merged_paths.values", "sum", "merged_paths.values", "first_manifest.__class__"], and it is called by 10 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_download", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_merge", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._aggregate_asset_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._merge_asset_manifests_sorted_asc_by_last_modified", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.handle_existing_vfs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_download_files_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_merge_asset_manifest_single", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_merge_asset_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_merge_asset_manifests_empty"]. |
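The "last manifest wins" merge described in the merge_asset_manifests row can be sketched with a plain dict keyed by path; the dict entries here are illustrative stand-ins for RelativeFilePath objects, and merge_paths is a hypothetical name.

```python
# Minimal sketch of the last-wins merge, assuming each manifest is a list of
# {"path": ..., "size": ...} entries standing in for RelativeFilePath.
def merge_paths(manifests: list) -> list:
    merged: dict = {}
    for manifest in manifests:
        for entry in manifest:
            # A later manifest's entry replaces an earlier one with the same path.
            merged[entry["path"]] = entry
    return list(merged.values())

inputs = [{"path": "scene.ma", "size": 10}, {"path": "tex.png", "size": 5}]
outputs = [{"path": "scene.ma", "size": 12}]  # a step's output overwrites the input
merged = merge_paths([inputs, outputs])
print(sorted((e["path"], e["size"]) for e in merged))  # → [('scene.ma', 12), ('tex.png', 5)]
```

The same structure explains why merge order matters: placing the outputs manifest last ensures step outputs are downloaded over the input job attachments.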
aws-deadline_deadline-cloud | public | public | 0 | 0 | _merge_asset_manifests_sorted_asc_by_last_modified | def _merge_asset_manifests_sorted_asc_by_last_modified(manifests_with_last_modified_timestamps: list[Tuple[datetime, BaseAssetManifest]],) -> BaseAssetManifest | None:"""Merge files from multiple manifests into a single list, sorting them by last modified timestamp asc.This function first sorts the manifests by their timestamps (oldest first) and then merges them,ensuring that newer files overwrite older ones with the same path.Args:manifests_with_last_modified_timestamps (list[Tuple[datetime, BaseAssetManifest]]): A list of tuples containing(timestamp, manifest) to be sorted and merged.Raises:NotImplementedError: When two manifests have different hash algorithms.All manifests must use the same hash algorithm.Returns:BaseAssetManifest | None: A single manifest containing the merged paths of all provided manifestsor None if no manifests were provided"""if not manifests_with_last_modified_timestamps:return None# Sort manifests by timestamp (oldest first)sorted_manifests_with_timestamps = sorted(manifests_with_last_modified_timestamps)# Extract just the manifests in the sorted ordersorted_manifests = [manifest for _, manifest in sorted_manifests_with_timestamps]# Use the existing merge function with the sorted manifestsreturn merge_asset_manifests(sorted_manifests) | 3 | 8 | 1 | 49 | 2 | 979 | 1,008 | 979 | manifests_with_last_modified_timestamps | ['sorted_manifests', 'sorted_manifests_with_timestamps'] | BaseAssetManifest | None | {"Assign": 2, "Expr": 1, "If": 1, "Return": 2} | 2 | 30 | 2 | ["sorted", "merge_asset_manifests"] | 0 | [] | The function (_merge_asset_manifests_sorted_asc_by_last_modified) is a public function. It starts at line 979 and ends at line 1008. It contains 8 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter and returns a BaseAssetManifest or None. 
It calls 2 functions: ["sorted", "merge_asset_manifests"]. |
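The sort-then-merge ordering in the row above (oldest manifests first, so newer entries win the last-wins merge) can be sketched as follows; the string "manifests" and dates are illustrative data only.

```python
from datetime import datetime

# (timestamp, manifest) pairs, deliberately out of order.
pairs = [
    (datetime(2024, 5, 2), "manifest_new"),
    (datetime(2024, 5, 1), "manifest_old"),
]

# Sorting on the timestamp key alone keeps the ordering well-defined even if
# two manifests share a timestamp (a bare tuple sort would then try to
# compare the manifest objects themselves).
ordered = [m for _, m in sorted(pairs, key=lambda p: p[0])]
print(ordered)  # → ['manifest_old', 'manifest_new']
```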
aws-deadline_deadline-cloud | public | public | 0 | 0 | _write_manifest_to_temp_file | def _write_manifest_to_temp_file(manifest: BaseAssetManifest, dir: Path) -> str:with NamedTemporaryFile(suffix=".json", prefix="deadline-merged-manifest-", delete=False, mode="w", dir=dir) as file:file.write(manifest.encode())return file.name | 1 | 6 | 2 | 53 | 0 | 1,011 | 1,016 | 1,011 | manifest,dir | [] | str | {"Expr": 1, "Return": 1, "With": 1} | 3 | 6 | 3 | ["NamedTemporaryFile", "file.write", "manifest.encode"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.mount_vfs_from_manifests"] | The function (_write_manifest_to_temp_file) is a public function. It starts at line 1011 and ends at line 1016. It contains 6 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and returns a str. It calls 3 functions: ["NamedTemporaryFile", "file.write", "manifest.encode"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.mount_vfs_from_manifests"].
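The delete=False temp-file pattern in _write_manifest_to_temp_file can be sketched as below; a plain dict serialized with json.dumps stands in for manifest.encode(), and write_manifest_to_temp_file is an illustrative name, not the library's function.

```python
import json
import tempfile
from pathlib import Path
from tempfile import NamedTemporaryFile

def write_manifest_to_temp_file(manifest: dict, dir: Path) -> str:
    # delete=False keeps the file on disk after the `with` block closes it,
    # because the returned file name is handed off for later use
    # (in the row above, to a VFS mount).
    with NamedTemporaryFile(
        suffix=".json", prefix="deadline-merged-manifest-", delete=False, mode="w", dir=dir
    ) as file:
        file.write(json.dumps(manifest))
    return file.name

out = write_manifest_to_temp_file({"hashAlg": "xxh128", "paths": []}, Path(tempfile.gettempdir()))
print(Path(out).read_text())  # → {"hashAlg": "xxh128", "paths": []}
```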
aws-deadline_deadline-cloud | public | public | 0 | 0 | _read_manifest_file | def _read_manifest_file(input_manifest_path: Path):"""Given a manifest path, open the file at that location and decodeArgs:input_manifest_path: Path to manifestReturns:BaseAssetManifest : Single decoded manifest"""with open(input_manifest_path) as input_manifest_file:return decode_manifest(input_manifest_file.read()) | 1 | 3 | 1 | 25 | 0 | 1,019 | 1,028 | 1,019 | input_manifest_path | [] | Returns | {"Expr": 1, "Return": 1, "With": 1} | 3 | 10 | 3 | ["open", "decode_manifest", "input_manifest_file.read"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.handle_existing_vfs"] | The function (_read_manifest_file) is a public function. It starts at line 1019 and ends at line 1028. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter and returns a value (the decoded manifest). It calls 3 functions: ["open", "decode_manifest", "input_manifest_file.read"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.handle_existing_vfs"].
aws-deadline_deadline-cloud | public | public | 0 | 0 | handle_existing_vfs | def handle_existing_vfs(manifest: BaseAssetManifest, session_dir: Path, mount_point: str, os_user: str) -> BaseAssetManifest:"""Combines provided manifest with the input manifest of the running VFS at thegiven mount_point if it exists. Then kills the running process at that mount soit can be replacedArgs:manifests (BaseAssetManifest): The manifest for the new inputs to be mountedmount_point (str): The local directory where the manifest is to be mountedos_user: the user running the job.Returns:BaseAssetManifest : A single manifest containing the merged paths or the original manifest"""if not VFSProcessManager.is_mount(mount_point):return manifestinput_manifest_path: Optional[Path] = VFSProcessManager.get_manifest_path_for_mount(session_dir=session_dir, mount_point=mount_point)if input_manifest_path is not None:input_manifest = _read_manifest_file(input_manifest_path)merged_input_manifest: Optional[BaseAssetManifest] = merge_asset_manifests([input_manifest, manifest])manifest = merged_input_manifest if merged_input_manifest is not None else manifestelse:download_logger.error(f"input manifest not found for mount at {mount_point}")return manifestVFSProcessManager.kill_process_at_mount(session_dir=session_dir, mount_point=mount_point, os_user=os_user)return manifest | 4 | 21 | 4 | 118 | 2 | 1,031 | 1,067 | 1,031 | manifest,session_dir,mount_point,os_user | ['manifest', 'input_manifest'] | BaseAssetManifest | {"AnnAssign": 2, "Assign": 2, "Expr": 3, "If": 2, "Return": 3} | 6 | 37 | 6 | ["VFSProcessManager.is_mount", "VFSProcessManager.get_manifest_path_for_mount", "_read_manifest_file", "merge_asset_manifests", "download_logger.error", "VFSProcessManager.kill_process_at_mount"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.mount_vfs_from_manifests", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_handle_existing_vfs_no_mount_returns", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_handle_existing_vfs_success"] | The function (handle_existing_vfs) is a public function. It starts at line 1031 and ends at line 1067. It contains 21 lines of code and has a cyclomatic complexity of 4. It takes 4 parameters and returns a BaseAssetManifest. It calls 6 functions: ["VFSProcessManager.is_mount", "VFSProcessManager.get_manifest_path_for_mount", "_read_manifest_file", "merge_asset_manifests", "download_logger.error", "VFSProcessManager.kill_process_at_mount"], and it is called by 3 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.mount_vfs_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_handle_existing_vfs_no_mount_returns", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_handle_existing_vfs_success"].
aws-deadline_deadline-cloud | public | public | 0 | 0 | mount_vfs_from_manifests | def mount_vfs_from_manifests(s3_bucket: str,manifests_by_root: dict[str, BaseAssetManifest],boto3_session: boto3.Session,session_dir: Path,os_env_vars: dict[str, str],fs_permission_settings: FileSystemPermissionSettings,cas_prefix: Optional[str] = None,) -> None:"""Given manifests, downloads all files from a CAS in those manifests.Args:s3_bucket: The name of the S3 bucket.manifests_by_root: a map from each local root path to a corresponding list of tuples of manifest contents and their path.boto3_session: The boto3 session to use.session_dir: the directory that the session is going to use.os_user: the user running the job.os_group: the group of the user running the jobos_env_vars: environment variables to set for launched subprocessescas_prefix: The CAS prefix of the files.Returns:None"""if not isinstance(fs_permission_settings, PosixFileSystemPermissionSettings):raise TypeError("VFS can only be mounted from manifests on posix file systems.")vfs_cache_dir: Path = session_dir / VFS_CACHE_REL_PATH_IN_SESSIONasset_cache_hash_path: Path = vfs_cache_dirif cas_prefix is not None:asset_cache_hash_path = vfs_cache_dir / cas_prefix_ensure_paths_within_directory(str(vfs_cache_dir), [str(asset_cache_hash_path)])asset_cache_hash_path.mkdir(parents=True, exist_ok=True)_set_fs_group([str(asset_cache_hash_path)], str(vfs_cache_dir), fs_permission_settings)manifest_dir: Path = session_dir / VFS_MANIFEST_FOLDER_IN_SESSIONmanifest_dir.mkdir(parents=True, exist_ok=True)manifest_dir_permissions = VFS_MANIFEST_FOLDER_PERMISSIONSmanifest_dir_permissions.os_user = fs_permission_settings.os_usermanifest_dir_permissions.os_group = fs_permission_settings.os_group_set_fs_group([str(manifest_dir)], str(manifest_dir), manifest_dir_permissions)vfs_logs_dir: Path = session_dir / VFS_LOGS_FOLDER_IN_SESSIONvfs_logs_dir.mkdir(parents=True, exist_ok=True)_set_fs_group([str(vfs_logs_dir)], str(vfs_logs_dir), 
fs_permission_settings)for mount_point, manifest in manifests_by_root.items():# Validate the file paths to see if they are under the given download directory._ensure_paths_within_directory(mount_point,[path.path for path in manifest.paths],# type: ignore)final_manifest: BaseAssetManifest = handle_existing_vfs(manifest=manifest,session_dir=session_dir,mount_point=mount_point,os_user=fs_permission_settings.os_user,)# Write out a temporary file with the contents of the newly merged manifestmanifest_path: str = _write_manifest_to_temp_file(final_manifest, dir=manifest_dir)vfs_manager: VFSProcessManager = VFSProcessManager(s3_bucket,boto3_session.region_name,manifest_path,mount_point,fs_permission_settings.os_user,os_env_vars,getattr(fs_permission_settings, "os_group", ""),cas_prefix,str(vfs_cache_dir),)vfs_manager.start(session_dir=session_dir) | 5 | 51 | 7 | 330 | 2 | 1,070 | 1,147 | 1,070 | s3_bucket,manifests_by_root,boto3_session,session_dir,os_env_vars,fs_permission_settings,cas_prefix | ['asset_cache_hash_path', 'manifest_dir_permissions'] | None | {"AnnAssign": 7, "Assign": 4, "Expr": 10, "For": 1, "If": 2} | 25 | 78 | 25 | ["isinstance", "TypeError", "_ensure_paths_within_directory", "str", "str", "asset_cache_hash_path.mkdir", "_set_fs_group", "str", "str", "manifest_dir.mkdir", "_set_fs_group", "str", "str", "vfs_logs_dir.mkdir", "_set_fs_group", "str", "str", "manifests_by_root.items", "_ensure_paths_within_directory", "handle_existing_vfs", "_write_manifest_to_temp_file", "VFSProcessManager", "getattr", "str", "vfs_manager.start"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._launch_vfs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_mount_vfs_from_manifests"] | The function (mount_vfs_from_manifests) is a public function. It starts at line 1070 and ends at line 1147. It contains 51 lines of code and has a cyclomatic complexity of 5. It takes 7 parameters and does not return a value. It calls 25 functions: ["isinstance", "TypeError", "_ensure_paths_within_directory", "str", "str", "asset_cache_hash_path.mkdir", "_set_fs_group", "str", "str", "manifest_dir.mkdir", "_set_fs_group", "str", "str", "vfs_logs_dir.mkdir", "_set_fs_group", "str", "str", "manifests_by_root.items", "_ensure_paths_within_directory", "handle_existing_vfs", "_write_manifest_to_temp_file", "VFSProcessManager", "getattr", "str", "vfs_manager.start"], and it is called by 3 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._launch_vfs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.sync_inputs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_mount_vfs_from_manifests"].
aws-deadline_deadline-cloud | public | public | 0 | 0 | _ensure_paths_within_directory | def _ensure_paths_within_directory(root_path: str, paths_relative_to_root: list[str]) -> None:"""Validates the given paths to ensure that they are within the given root path.If the root path is not an absolute path, raises a ValueError.If any path is not under the root directory, raises an PathOutsideDirectoryError."""if not Path(root_path).is_absolute():raise ValueError(f"The provided root path is not an absolute path: {root_path}")for path in paths_relative_to_root:resolved_path = Path(root_path, path).resolve()if not _is_relative_to(resolved_path, Path(root_path).resolve()):raise PathOutsideDirectoryError(f"The provided path is not under the root directory: {path}")return | 4 | 10 | 2 | 74 | 1 | 1,150 | 1,165 | 1,150 | root_path,paths_relative_to_root | ['resolved_path'] | None | {"Assign": 1, "Expr": 1, "For": 1, "If": 2, "Return": 1} | 9 | 16 | 9 | ["is_absolute", "Path", "ValueError", "resolve", "Path", "_is_relative_to", "resolve", "Path", "PathOutsideDirectoryError"] | 6 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.download_job_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.mount_vfs_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_ensure_paths_within_directory_posix_no_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_ensure_paths_within_directory_posix_raises_error", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_ensure_paths_within_directory_windows_no_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_ensure_paths_within_directory_windows_raises_error"] | The function (_ensure_paths_within_directory) is a public function. It starts at line 1150 and ends at line 1165. It contains 10 lines of code and has a cyclomatic complexity of 4. It takes 2 parameters and does not return a value. It calls 9 functions: ["is_absolute", "Path", "ValueError", "resolve", "Path", "_is_relative_to", "resolve", "Path", "PathOutsideDirectoryError"], and it is called by 6 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.download_job_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.mount_vfs_from_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_ensure_paths_within_directory_posix_no_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_ensure_paths_within_directory_posix_raises_error", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_ensure_paths_within_directory_windows_no_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_download_py.test_ensure_paths_within_directory_windows_raises_error"]. |
aws-deadline_deadline-cloud | OutputDownloader | public | 0 | 0 | __init__ | def __init__(self,s3_settings: JobAttachmentS3Settings,farm_id: str,queue_id: str,job_id: str,step_id: Optional[str] = None,task_id: Optional[str] = None,session_action_id: Optional[str] = None,session: Optional[boto3.Session] = None,) -> None:self.s3_settings = s3_settingsself.session = sessionself.outputs_by_root = get_job_output_paths_by_asset_root(s3_settings=s3_settings,farm_id=farm_id,queue_id=queue_id,job_id=job_id,step_id=step_id,task_id=task_id,session_action_id=session_action_id,session=session,) | 1 | 23 | 9 | 111 | 0 | 1,180 | 1,202 | 1,180 | self,s3_settings,farm_id,queue_id,job_id,step_id,task_id,session_action_id,session | [] | None | {"Assign": 3} | 1 | 23 | 1 | ["get_job_output_paths_by_asset_root"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__), defined within the public class OutputDownloader, starts at line 1180 and ends at line 1202. It contains 23 lines of code and has a cyclomatic complexity of 1. It takes 9 parameters (self, s3_settings, farm_id, queue_id, job_id, step_id, task_id, session_action_id, session) and does not return any value. It calls 1 function: ["get_job_output_paths_by_asset_root"]. The 4,993 functions recorded as calling it are listed verbatim in the incoming_function_names column. |
aws-deadline_deadline-cloud | OutputDownloader | public | 0 | 0 | get_output_paths_by_root | def get_output_paths_by_root(self) -> dict[str, list[str]]:"""Returns a dict of asset root paths to lists of output paths."""output_paths_by_root: dict[str, list[str]] = {}for root, path_group in self.outputs_by_root.items():output_paths_by_root[root] = path_group.get_all_paths()return output_paths_by_root | 2 | 5 | 1 | 55 | 0 | 1,204 | 1,212 | 1,204 | self | [] | dict[str, list[str]] | {"AnnAssign": 1, "Assign": 1, "Expr": 1, "For": 1, "Return": 1} | 2 | 9 | 2 | ["self.outputs_by_root.items", "path_group.get_all_paths"] | 0 | [] | The function (get_output_paths_by_root), defined within the public class OutputDownloader, starts at line 1204 and ends at line 1212. It contains 5 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (self) and returns dict[str, list[str]]. It calls 2 functions: ["self.outputs_by_root.items", "path_group.get_all_paths"]. No functions are recorded as calling this function. |
aws-deadline_deadline-cloud | OutputDownloader | public | 0 | 0 | set_root_path | def set_root_path(self, original_root: str, new_root: str) -> None:"""Changes the root path for downloading output files, (which is the root pathsaved in the S3 metadata for the output manifest by default,) with a custom path.(It will store the new root path as an absolute path.)"""# Need to use absolute to not resolve symlinks, but need normpath to get rid of relative paths, i.e. '..'new_root = str(os.path.normpath(Path(new_root).absolute()))if original_root not in self.outputs_by_root:raise ValueError(f"The root path {original_root} was not found in output manifests {self.outputs_by_root}.")if new_root == original_root:returnif new_root in self.outputs_by_root:# If the new_root already exists, and the file path in the original_root already exists# among the file paths of the new_root, then prefix the file path with the original_root path.# This is to avoid duplicate file paths in the new_root.paths_in_new_root = self.outputs_by_root[new_root].get_all_paths()for manifest_paths in self.outputs_by_root[original_root].files_by_hash_alg.values():for manifest_path in manifest_paths:if manifest_path.path in paths_in_new_root:new_name_prefix = (original_root.replace("/", "_").replace("\\", "_").replace(":", "_"))manifest_path.path = str(Path(manifest_path.path).with_name(f"{new_name_prefix}_{manifest_path.path}"))self.outputs_by_root[new_root].combine_with_group(self.outputs_by_root[original_root])del self.outputs_by_root[original_root]else:self.outputs_by_root = {key if key != original_root else new_root: valuefor key, value in self.outputs_by_root.items()} | 9 | 28 | 3 | 200 | 0 | 1,214 | 1,253 | 1,214 | self,original_root,new_root | [] | None | {"Assign": 5, "Expr": 2, "For": 2, "If": 4, "Return": 1} | 15 | 40 | 15 | ["str", "os.path.normpath", "absolute", "Path", "ValueError", "get_all_paths", "files_by_hash_alg.values", "replace", "replace", "original_root.replace", "str", "with_name", 
"Path", "combine_with_group", "self.outputs_by_root.items"] | 0 | [] | The function (set_root_path), defined within the public class OutputDownloader, starts at line 1214 and ends at line 1253. It contains 28 lines of code and has a cyclomatic complexity of 9. It takes 3 parameters (self, original_root, new_root) and does not return any value. It calls 15 functions: ["str", "os.path.normpath", "absolute", "Path", "ValueError", "get_all_paths", "files_by_hash_alg.values", "replace", "replace", "original_root.replace", "str", "with_name", "Path", "combine_with_group", "self.outputs_by_root.items"]. No functions are recorded as calling this function. |
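The set_root_path row above remaps an asset-root key and, when the new root already exists, renames colliding file paths with a sanitized prefix built from the old root. A simplified sketch of that remapping, assuming path groups are plain lists of relative paths (the real code walks manifest entries in files_by_hash_alg), prefixing only the file name rather than the full relative path, and omitting the normpath/absolute normalization of new_root:

```python
from pathlib import Path


def remap_root(outputs_by_root: dict[str, list[str]], original_root: str, new_root: str) -> dict[str, list[str]]:
    """Rename the original_root key to new_root, merging path lists on collision."""
    if original_root not in outputs_by_root:
        raise ValueError(f"The root path {original_root} was not found in output manifests.")
    if new_root == original_root:
        return outputs_by_root
    if new_root not in outputs_by_root:
        # Simple case: rename the key while preserving dict order.
        return {(new_root if k == original_root else k): v for k, v in outputs_by_root.items()}
    # Collision case: sanitize original_root into a file-name prefix to avoid duplicate paths.
    prefix = original_root.replace("/", "_").replace("\\", "_").replace(":", "_")
    merged = list(outputs_by_root[new_root])
    existing = set(merged)
    for p in outputs_by_root[original_root]:
        if p in existing:
            p = str(Path(p).with_name(f"{prefix}_{Path(p).name}"))
        merged.append(p)
    result = {k: v for k, v in outputs_by_root.items() if k not in (original_root, new_root)}
    result[new_root] = merged
    return result
```

The prefix sanitization mirrors the row above: path separators and drive colons become underscores, so "/job/a" yields file names like "_job_a_x.txt".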
aws-deadline_deadline-cloud | OutputDownloader | public | 0 | 0 | download_job_output | def download_job_output(self,file_conflict_resolution: Optional[FileConflictResolution] = FileConflictResolution.CREATE_COPY,on_downloading_files: Optional[Callable[[ProgressReportMetadata], bool]] = None,) -> DownloadSummaryStatistics:"""Downloads outputs files from S3 bucket to the asset root(s).Args:file_conflict_resolution: resolution method for file conflicts.on_downloading_files: a callback to be called to periodically report progress to the caller.The callback returns True if the operation should continue as normal, or False to cancel.Returns:The download summary statistics"""# Sets up progress tracker to report download progress back to the caller.total_bytes: int = 0total_files: int = 0for path_group in self.outputs_by_root.values():total_bytes += path_group.total_bytestotal_files += len(path_group.get_all_paths())progress_tracker = ProgressTracker(status=ProgressStatus.DOWNLOAD_IN_PROGRESS,total_files=total_files,total_bytes=total_bytes,on_progress_callback=on_downloading_files,)start_time = time.perf_counter()downloaded_files_paths_by_root: DefaultDict[str, list[str]] = DefaultDict(list)try:for root, output_path_group in self.outputs_by_root.items():for hash_alg, path_list in output_path_group.files_by_hash_alg.items():# Validate the file paths to see if they are under the given download directory._ensure_paths_within_directory(root, [file.path for file in path_list])downloaded_files_paths = download_files(files=path_list,hash_algorithm=hash_alg,local_download_dir=root,s3_settings=self.s3_settings,session=self.session,progress_tracker=progress_tracker,file_conflict_resolution=file_conflict_resolution,)downloaded_files_paths_by_root[root].extend(downloaded_files_paths)except AssetSyncCancelledError:downloaded_files = progress_tracker.processed_filesraise AssetSyncCancelledError("Download cancelled. 
"f"(Downloaded {downloaded_files} file{'' if downloaded_files == 1 else 's'} before cancellation.)")progress_tracker.total_time = time.perf_counter() - start_timereturn progress_tracker.get_download_summary_statistics(downloaded_files_paths_by_root) | 6 | 42 | 4 | 239 | 0 | 1,255 | 1,315 | 1,255 | self,file_conflict_resolution,on_downloading_files | [] | DownloadSummaryStatistics | {"AnnAssign": 3, "Assign": 5, "AugAssign": 2, "Expr": 3, "For": 3, "Return": 1, "Try": 1} | 14 | 61 | 14 | ["self.outputs_by_root.values", "len", "path_group.get_all_paths", "ProgressTracker", "time.perf_counter", "DefaultDict", "self.outputs_by_root.items", "output_path_group.files_by_hash_alg.items", "_ensure_paths_within_directory", "download_files", "extend", "AssetSyncCancelledError", "time.perf_counter", "progress_tracker.get_download_summary_statistics"] | 0 | [] | The function (download_job_output), defined within the public class OutputDownloader, starts at line 1255 and ends at line 1315. It contains 42 lines of code and has a cyclomatic complexity of 6. It takes 3 parameters (self, file_conflict_resolution, on_downloading_files) and returns DownloadSummaryStatistics. It calls 14 functions: ["self.outputs_by_root.values", "len", "path_group.get_all_paths", "ProgressTracker", "time.perf_counter", "DefaultDict", "self.outputs_by_root.items", "output_path_group.files_by_hash_alg.items", "_ensure_paths_within_directory", "download_files", "extend", "AssetSyncCancelledError", "time.perf_counter", "progress_tracker.get_download_summary_statistics"]. No functions are recorded as calling this function. |
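Before downloading anything, download_job_output in the row above sums the byte and file counts across all path groups to seed its ProgressTracker. A small sketch of just that aggregation step, using a hypothetical stand-in PathGroup class with the total_bytes attribute and get_all_paths method assumed from the surrounding rows:

```python
class PathGroup:
    """Stand-in for the output path groups stored in outputs_by_root."""

    def __init__(self, total_bytes: int, paths: list[str]) -> None:
        self.total_bytes = total_bytes
        self._paths = paths

    def get_all_paths(self) -> list[str]:
        return self._paths


def download_totals(outputs_by_root: dict[str, PathGroup]) -> tuple[int, int]:
    # Aggregate totals across every asset root, as download_job_output does
    # before constructing its ProgressTracker.
    total_bytes = 0
    total_files = 0
    for path_group in outputs_by_root.values():
        total_bytes += path_group.total_bytes
        total_files += len(path_group.get_all_paths())
    return total_files, total_bytes
```

The resulting (total_files, total_bytes) pair lets the progress callback report percentages against a fixed denominator even when files span several asset roots.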
aws-deadline_deadline-cloud | JobAttachmentsS3ClientError | public | 0 | 1 | __init__ | def __init__(self,action,status_code,bucket_name: str,key_or_prefix: str,message: Optional[str] = None,) -> None:self.action = actionself.status_code = status_codeself.bucket_name = bucket_nameself.key_or_prefix = key_or_prefixmessage_parts = [f"Error {action} in bucket '{bucket_name}', Target key or prefix: '{key_or_prefix}'",f"HTTP Status Code: {status_code}",]if message:message_parts.append(message)super().__init__(", ".join(message_parts)) | 2 | 19 | 6 | 81 | 0 | 34 | 54 | 34 | self,action,status_code,bucket_name,key_or_prefix,message | [] | None | {"Assign": 5, "Expr": 2, "If": 1} | 4 | 21 | 4 | ["message_parts.append", "__init__", "super", "join"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__), defined within the public class JobAttachmentsS3ClientError, which inherits from another class, starts at line 34 and ends at line 54. It contains 19 lines of code and has a cyclomatic complexity of 2. It takes 6 parameters (self, action, status_code, bucket_name, key_or_prefix, message) and does not return any value. It calls 4 functions: ["message_parts.append", "__init__", "super", "join"]. The 4,993 functions recorded as calling it are listed verbatim in the incoming_function_names column. |
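The __init__ row above assembles the JobAttachmentsS3ClientError message from its fields, appending the optional detail message only when present. A minimal reconstruction of just that message-building logic, assuming the class derives directly from Exception (the table records only that it inherits from another class):

```python
from typing import Optional


class JobAttachmentsS3ClientError(Exception):
    """Reconstruction of the error class in the row above; base class assumed."""

    def __init__(
        self,
        action: str,
        status_code: int,
        bucket_name: str,
        key_or_prefix: str,
        message: Optional[str] = None,
    ) -> None:
        self.action = action
        self.status_code = status_code
        self.bucket_name = bucket_name
        self.key_or_prefix = key_or_prefix
        # Build the message parts, then append the optional detail message.
        message_parts = [
            f"Error {action} in bucket '{bucket_name}', Target key or prefix: '{key_or_prefix}'",
            f"HTTP Status Code: {status_code}",
        ]
        if message:
            message_parts.append(message)
        super().__init__(", ".join(message_parts))
```

Keeping the fields as attributes alongside the joined message lets callers branch on status_code without parsing the string.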
aws-deadline_deadline-cloud | JobAttachmentsS3ClientError | public | 0 | 1 | __init__ | def __init__(self, action: str, error_details: str) -> None:self.action = actionmessage = (f"An issue occurred with AWS service request while {action}: "f"{error_details}\n""This could be due to temporary issues with AWS, internet connection, or your AWS credentials. ""Please verify your credentials and network connection. If the problem persists, try again later"" or contact support for further assistance.")super().__init__(message) | 1 | 10 | 3 | 39 | 0 | 62 | 71 | 62 | self,action,error_details | [] | None | {"Assign": 5, "Expr": 2, "If": 1} | 4 | 21 | 4 | ["message_parts.append", "__init__", "super", "join"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__), defined within the public class JobAttachmentsS3ClientError, which inherits from another class, starts at line 62 and ends at line 71. It contains 10 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (self, action, error_details) and does not return any value. 
It declares 4 functions. The 4 functions called inside are ["message_parts.append", "__init__", "super", "join"]. There are 4,993 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | JobAttachmentsS3ClientError | public | 0 | 1 | __init__ | def __init__(self, message, summary_statistics=None):super().__init__(message)self.summary_statistics = summary_statistics | 1 | 3 | 3 | 24 | 0 | 140 | 142 | 140 | self,action,status_code,bucket_name,key_or_prefix,message | [] | None | {"Assign": 5, "Expr": 2, "If": 1} | 4 | 21 | 4 | ["message_parts.append", "__init__", "super", "join"] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called JobAttachmentsS3ClientError, which inherits from another class. The function starts at line 140 and ends at line 142. It contains 3 lines of code and it has a cyclomatic complexity of 1. It takes 3 parameters, represented as [140.0], and does not return any value. 
It declares 4 functions. The 4 functions called inside are ["message_parts.append", "__init__", "super", "join"]. There are 4,993 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
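The __init__ row above can be sketched as a minimal exception subclass: the message is forwarded to Exception and an extra attribute is kept for callers to inspect. This is a hedged sketch that mirrors only the row's signature; the real deadline-cloud class assembles a richer message from parts (note the "message_parts.append" call in its function list), which is omitted here.

```python
# Sketch of the pattern in the row above: an exception subclass that forwards
# the message to Exception and stores an extra attribute. The class and
# parameter names follow the row; the message-building logic of the real
# class is not reproduced.
class JobAttachmentsS3ClientError(Exception):
    def __init__(self, message, summary_statistics=None):
        super().__init__(message)
        self.summary_statistics = summary_statistics


# str(err) yields the message passed to Exception; the extra attribute
# survives alongside it.
err = JobAttachmentsS3ClientError("upload failed", summary_statistics={"files": 3})
```

Storing the statistics on the exception (rather than folding them into the message) lets a caller catch the error and still process the structured data programmatically.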
aws-deadline_deadline-cloud | ManifestPathGroup | public | 0 | 0 | add_manifest_to_group | def add_manifest_to_group(self, manifest: BaseAssetManifest) -> None:if manifest.hashAlg not in self.files_by_hash_alg:self.files_by_hash_alg[manifest.hashAlg] = manifest.pathselse:self.files_by_hash_alg[manifest.hashAlg].extend(manifest.paths)self.total_bytes += manifest.totalSize# type: ignore[attr-defined] | 2 | 6 | 2 | 57 | 0 | 78 | 83 | 78 | self,manifest | [] | None | {"Assign": 1, "AugAssign": 1, "Expr": 1, "If": 1} | 1 | 6 | 1 | ["extend"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_paths_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_output_paths_by_asset_root"] | The function (add_manifest_to_group) defined within the public class called ManifestPathGroup. The function starts at line 78 and ends at line 83. It contains 6 lines of code and it has a cyclomatic complexity of 2. It takes 2 parameters, represented as [78.0], and does not return any value. It declares 1 function; the function called inside is ["extend"]. There are 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_paths_by_asset_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_output_paths_by_asset_root"]. |
aws-deadline_deadline-cloud | ManifestPathGroup | public | 0 | 0 | combine_with_group | def combine_with_group(self, group: ManifestPathGroup) -> None:"""Adds the content of the given ManifestPathGroup to this ManifestPathGroup"""for hash_alg, paths in group.files_by_hash_alg.items():if hash_alg not in self.files_by_hash_alg:self.files_by_hash_alg[hash_alg] = pathselse:self.files_by_hash_alg[hash_alg].extend(paths)self.total_bytes += group.total_bytes | 3 | 7 | 2 | 61 | 0 | 85 | 92 | 85 | self,group | [] | None | {"Assign": 1, "AugAssign": 1, "Expr": 2, "For": 1, "If": 1} | 2 | 8 | 2 | ["group.files_by_hash_alg.items", "extend"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.set_root_path", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_output_paths_by_asset_root"] | The function (combine_with_group) defined within the public class called ManifestPathGroup. The function starts at line 85 and ends at line 92. It contains 7 lines of code and it has a cyclomatic complexity of 3. It takes 2 parameters, represented as [85.0], and does not return any value. It declares 2 functions. The 2 functions called inside are ["group.files_by_hash_alg.items", "extend"]. There are 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.set_root_path", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.get_job_input_output_paths_by_asset_root"]. |
aws-deadline_deadline-cloud | ManifestPathGroup | public | 0 | 0 | get_all_paths | def get_all_paths(self) -> list[str]:"""Get all paths in this group, regardless of hashing algorithm.Note that this may include duplicates if the same path exists for multiple hashing algorithms.Returns a sorted list of paths represented as strings."""path_list: List[str] = []for paths in self.files_by_hash_alg.values():path_list.extend([path.path for path in paths])return sorted(path_list) | 3 | 5 | 1 | 50 | 0 | 94 | 104 | 94 | self | [] | list[str] | {"AnnAssign": 1, "Expr": 2, "For": 1, "Return": 1} | 3 | 11 | 3 | ["self.files_by_hash_alg.values", "path_list.extend", "sorted"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.set_root_path"] | The function (get_all_paths) defined within the public class called ManifestPathGroup. The function starts at line 94 and ends at line 104. It contains 5 lines of code and it has a cyclomatic complexity of 3. The function does not take any parameters and returns a value. It declares 3 functions. The 3 functions called inside are ["self.files_by_hash_alg.values", "path_list.extend", "sorted"]. There is 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py.OutputDownloader.set_root_path"]. |
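Taken together, the three ManifestPathGroup rows above describe a dict-of-lists keyed by hashing algorithm: add_manifest_to_group extends a bucket, combine_with_group merges two groups, and get_all_paths flattens and sorts. A minimal self-contained sketch of that shape follows; the class and method names here (PathGroup, add, combine) are illustrative, and paths are simplified to plain strings, whereas the real code stores manifest path objects and reads a .path attribute.

```python
from dataclasses import dataclass, field

@dataclass
class PathGroup:
    # bucket of paths per hashing algorithm, plus a running byte total
    files_by_hash_alg: dict = field(default_factory=dict)
    total_bytes: int = 0

    def add(self, hash_alg: str, paths: list, size: int) -> None:
        # start a new bucket for an unseen algorithm, otherwise extend it
        if hash_alg not in self.files_by_hash_alg:
            self.files_by_hash_alg[hash_alg] = list(paths)
        else:
            self.files_by_hash_alg[hash_alg].extend(paths)
        self.total_bytes += size

    def combine(self, other: "PathGroup") -> None:
        # merge the other group's buckets, then its byte total
        for hash_alg, paths in other.files_by_hash_alg.items():
            self.add(hash_alg, paths, 0)
        self.total_bytes += other.total_bytes

    def get_all_paths(self) -> list:
        # may contain duplicates if a path exists under several algorithms
        out = []
        for paths in self.files_by_hash_alg.values():
            out.extend(paths)
        return sorted(out)

g = PathGroup()
g.add("xxh128", ["b.txt", "a.txt"], 10)
h = PathGroup()
h.add("xxh128", ["c.txt"], 5)
g.combine(h)
```

Keying by algorithm keeps paths hashed with different algorithms separate until the caller explicitly flattens them, which is why get_all_paths warns about duplicates.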
aws-deadline_deadline-cloud | StorageProfileOperatingSystemFamily | public | 0 | 1 | _missing_ | def _missing_(cls, value):value = value.lower()for member in cls:if member == value:return memberreturn None | 3 | 6 | 2 | 28 | 0 | 131 | 136 | 131 | cls,value | [] | Returns | {"Assign": 1, "For": 1, "If": 1, "Return": 2} | 1 | 6 | 1 | ["value.lower"] | 0 | [] | The function (_missing_) defined within the public class called StorageProfileOperatingSystemFamily, which inherits from another class. The function starts at line 131 and ends at line 136. It contains 6 lines of code and it has a cyclomatic complexity of 3. It takes 2 parameters, represented as [131.0], and this function returns a value. It declares 1 function; the function called inside is ["value.lower"]. |
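The _missing_ row above implements case-insensitive enum lookup: Python's Enum calls _missing_ when a value is not found among the members, and the hook retries with the lowercased value. A runnable sketch under assumed member names (the OsFamily name and its values are illustrative, not taken from the dataset):

```python
from enum import Enum

class OsFamily(str, Enum):
    WINDOWS = "windows"
    LINUX = "linux"
    MACOS = "macos"

    @classmethod
    def _missing_(cls, value):
        # called by Enum when OsFamily(value) finds no exact match;
        # retry with the lowercased value
        value = value.lower()
        for member in cls:
            # str-based Enum members compare equal to their string value
            if member == value:
                return member
        return None  # None -> Enum raises ValueError as usual

# without _missing_, OsFamily("LINUX") would raise ValueError
f = OsFamily("LINUX")
```

Subclassing str is what makes `member == value` work directly; a plain Enum would need `member.value == value`.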
aws-deadline_deadline-cloud | StorageProfileOperatingSystemFamily | public | 0 | 1 | get_host_os_family | def get_host_os_family(cls) -> StorageProfileOperatingSystemFamily:"""Get the current path format."""if sys.platform.startswith("win"):return cls.WINDOWSif sys.platform.startswith("darwin"):return cls.MACOSif sys.platform.startswith("linux"):return cls.LINUXelse:raise NotImplementedError(f"Operating system {sys.platform} is not supported.") | 4 | 9 | 1 | 58 | 0 | 139 | 148 | 139 | cls | [] | StorageProfileOperatingSystemFamily | {"Expr": 1, "If": 3, "Return": 3} | 4 | 10 | 4 | ["sys.platform.startswith", "sys.platform.startswith", "sys.platform.startswith", "NotImplementedError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_queue_incremental_download_py.test_incremental_output_download_storage_profile_path_mapping"] | The function (get_host_os_family) defined within the public class called StorageProfileOperatingSystemFamily, which inherits from another class. The function starts at line 139 and ends at line 148. It contains 9 lines of code and it has a cyclomatic complexity of 4. The function does not take any parameters and returns a value. It declares 4 functions. The 4 functions called inside are ["sys.platform.startswith", "sys.platform.startswith", "sys.platform.startswith", "NotImplementedError"]. There is 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_queue_incremental_download_py.test_incremental_output_download_storage_profile_path_mapping"]. |
aws-deadline_deadline-cloud | PathFormat | public | 0 | 1 | get_host_path_format | def get_host_path_format(cls) -> PathFormat:"""Get the current path format."""if sys.platform.startswith("win"):return cls.WINDOWSif sys.platform.startswith("darwin") or sys.platform.startswith("linux"):return cls.POSIXelse:raise NotImplementedError(f"Operating system {sys.platform} is not supported.") | 4 | 7 | 1 | 53 | 0 | 162 | 169 | 162 | cls | [] | PathFormat | {"Expr": 1, "If": 2, "Return": 2} | 4 | 8 | 4 | ["sys.platform.startswith", "sys.platform.startswith", "sys.platform.startswith", "NotImplementedError"] | 9 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.common.path_utils_py.summarize_path_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.common.path_utils_py.summarize_paths_by_nested_directory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.common.path_utils_py.summarize_paths_by_sequence", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._get_output_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetManager.snapshot_assets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetManager.upload_assets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_output_stdout_with_json_format", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_output_stdout_with_mismatching_path_format", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_output_stdout_with_only_required_input"] | The function (get_host_path_format) defined within the public class called PathFormat, which inherits from another class. The function starts at line 162 and ends at line 169. It contains 7 lines of code and it has a cyclomatic complexity of 4. The function does not take any parameters and returns a value. It declares 4 functions. The 4 functions called inside are ["sys.platform.startswith", "sys.platform.startswith", "sys.platform.startswith", "NotImplementedError"]. There are 9 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.common.path_utils_py.summarize_path_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.common.path_utils_py.summarize_paths_by_nested_directory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.common.path_utils_py.summarize_paths_by_sequence", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync._get_output_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetManager.snapshot_assets", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetManager.upload_assets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_output_stdout_with_json_format", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_output_stdout_with_mismatching_path_format", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_output_stdout_with_only_required_input"]. |
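The get_host_os_family and get_host_path_format rows above both use the same sys.platform dispatch: startswith("win") covers "win32", while "darwin" and "linux" share POSIX path semantics, and anything else raises NotImplementedError. A minimal sketch of that branching, with the function name and return strings assumed for illustration:

```python
import sys

def host_path_format() -> str:
    # startswith("win") matches sys.platform == "win32"
    if sys.platform.startswith("win"):
        return "windows"
    # macOS and Linux both use POSIX path conventions
    if sys.platform.startswith("darwin") or sys.platform.startswith("linux"):
        return "posix"
    # fail loudly on unsupported platforms rather than guessing
    raise NotImplementedError(f"Operating system {sys.platform} is not supported.")
```

Raising on the fallthrough case (instead of defaulting to "posix") surfaces new platforms immediately, which matters when the returned string drives path mapping between machines.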
aws-deadline_deadline-cloud | PathFormat | public | 0 | 1 | get_host_path_format_string | def get_host_path_format_string(cls) -> str:"""Get a string of the current path format."""return cls.get_host_path_format().value | 1 | 2 | 1 | 16 | 0 | 172 | 174 | 172 | cls | [] | str | {"Expr": 1, "Return": 1} | 1 | 3 | 1 | ["cls.get_host_path_format"] | 8 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py._download_job_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_upload_input_files_no_download_paths", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.upload_input_files_no_input_paths", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_cli_handle_web_url_download_output_only_required_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_cli_handle_web_url_download_output_with_optional_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_no_output_stdout", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_output_handle_web_url_with_optional_input", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestModels.test_get_host_path_format_string"] | The function (get_host_path_format_string) defined within the public class called PathFormat, which inherits from another class. The function starts at line 172 and ends at line 174. It contains 2 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and returns a value. It declares 1 function; the function called inside is ["cls.get_host_path_format"]. There are 8 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.job_group_py._download_job_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.test_upload_input_files_no_download_paths", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.integ.deadline_job_attachments.test_job_attachments_py.upload_input_files_no_input_paths", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_cli_handle_web_url_download_output_only_required_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_handle_web_url_py.test_cli_handle_web_url_download_output_with_optional_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_no_output_stdout", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_output_handle_web_url_with_optional_input", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_client.cli.test_cli_job_py.test_cli_job_download_output_handle_web_url_with_optional_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestModels.test_get_host_path_format_string"]. |
aws-deadline_deadline-cloud | ManifestProperties | public | 0 | 0 | to_dict | def to_dict(self) -> dict[str, Any]:result: dict[str, Any] = {"rootPath": self.rootPath}if self.fileSystemLocationName:result["fileSystemLocationName"] = self.fileSystemLocationNameresult["rootPathFormat"] = self.rootPathFormat.valueif self.inputManifestPath:result["inputManifestPath"] = self.inputManifestPathif self.inputManifestHash:result["inputManifestHash"] = self.inputManifestHashif self.outputRelativeDirectories:result["outputRelativeDirectories"] = self.outputRelativeDirectoriesreturn result | 5 | 12 | 1 | 92 | 0 | 205 | 216 | 205 | self | [] | dict[str, Any] | {"AnnAssign": 1, "Assign": 5, "If": 4, "Return": 1} | 0 | 12 | 0 | [] | 35 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.omit_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.pick_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.ds_client_py.DSClient.get_token", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.model.base_py.ProperDictMixin.to_proper_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.utils_py.init_cls_from_base", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.models_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.tasks_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.59624560_openwpm_openwpm.openwpm.browser_manager_py.BrowserManagerHandle._unpack_pickled_error", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_collection_export_results", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_export_results", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69737800_eprbell_dali_rp2.src.dali.plugin.pair_converter.coinbase_advanced_py.PairConverterPlugin.get_historic_bar_from_native_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy.pl._ligrec_py.ligrec", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_change_to_dict_return_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.loaders.__init___py.write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.tests.test_base_py.test_dotted_set", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_paging", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_user_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.dataset.arrow.dec_py.ArrowDecoder.decode_batch", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.ext.rotbaum._model_py.QRX.fit", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.daf.tslib.engine.hyperopt_py.HyperOptManager.load_records", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m1_py.generate_m1_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m3_py.generate_m3_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m4_py.generate_m4_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.surrogate.transformers.config_py.DatasetCatch22Encoder.fit"] | The function (to_dict) is defined within the public class ManifestProperties. It starts at line 205 and ends at line 216. It contains 12 lines of code and has a cyclomatic complexity of 5. It takes 1 parameter (self) and returns dict[str, Any]. 
It has 35.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.omit_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.pick_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.ds_client_py.DSClient.get_token", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.model.base_py.ProperDictMixin.to_proper_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.utils_py.init_cls_from_base", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.models_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.tasks_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.59624560_openwpm_openwpm.openwpm.browser_manager_py.BrowserManagerHandle._unpack_pickled_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_collection_export_results", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_export_results", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69737800_eprbell_dali_rp2.src.dali.plugin.pair_converter.coinbase_advanced_py.PairConverterPlugin.get_historic_bar_from_native_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy.pl._ligrec_py.ligrec", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_change_to_dict_return_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.loaders.__init___py.write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.tests.test_base_py.test_dotted_set", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_paging", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_user_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.dataset.arrow.dec_py.ArrowDecoder.decode_batch", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.ext.rotbaum._model_py.QRX.fit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.daf.tslib.engine.hyperopt_py.HyperOptManager.load_records", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m1_py.generate_m1_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m3_py.generate_m3_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m4_py.generate_m4_dataset", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.surrogate.transformers.config_py.DatasetCatch22Encoder.fit"]. |
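The to_dict rows above all follow one serialization pattern: emit required fields unconditionally and optional fields only when they are set. A minimal standalone sketch of that pattern, using a hypothetical dataclass that mirrors two of ManifestProperties' fields (the real class has more fields and an enum-valued rootPathFormat):

```python
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class Props:
    # Hypothetical subset of ManifestProperties' fields, for illustration only.
    rootPath: str
    fileSystemLocationName: Optional[str] = None

    def to_dict(self) -> dict[str, Any]:
        # Required field is always present in the output dict.
        result: dict[str, Any] = {"rootPath": self.rootPath}
        # Optional field is included only when it is truthy, matching the
        # `if self.fileSystemLocationName:` guards in the rows above.
        if self.fileSystemLocationName:
            result["fileSystemLocationName"] = self.fileSystemLocationName
        return result
```

This keeps the serialized dict free of null entries, so consumers can test key presence instead of checking for None values.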
aws-deadline_deadline-cloud | ManifestProperties | public | 0 | 0 | to_dict | def to_dict(self) -> dict[str, Any]:return {"manifests": [manifest.to_dict() for manifest in self.manifests],"fileSystem": self.fileSystem,} | 2 | 5 | 1 | 37 | 0 | 228 | 232 | 228 | self | [] | dict[str, Any] | {"AnnAssign": 1, "Assign": 5, "If": 4, "Return": 1} | 0 | 12 | 0 | [] | 35 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.omit_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.pick_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.ds_client_py.DSClient.get_token", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.model.base_py.ProperDictMixin.to_proper_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.utils_py.init_cls_from_base", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.models_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.tasks_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.59624560_openwpm_openwpm.openwpm.browser_manager_py.BrowserManagerHandle._unpack_pickled_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_collection_export_results", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_export_results", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69737800_eprbell_dali_rp2.src.dali.plugin.pair_converter.coinbase_advanced_py.PairConverterPlugin.get_historic_bar_from_native_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy.pl._ligrec_py.ligrec", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_change_to_dict_return_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.loaders.__init___py.write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.tests.test_base_py.test_dotted_set", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_paging", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_user_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.dataset.arrow.dec_py.ArrowDecoder.decode_batch", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.ext.rotbaum._model_py.QRX.fit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.daf.tslib.engine.hyperopt_py.HyperOptManager.load_records", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m1_py.generate_m1_dataset", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m3_py.generate_m3_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m4_py.generate_m4_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.surrogate.transformers.config_py.DatasetCatch22Encoder.fit"] | The function (to_dict) is defined within the public class ManifestProperties. It starts at line 228 and ends at line 232. It contains 5 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (self) and returns dict[str, Any]. It is called by 35 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.omit_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.pick_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.ds_client_py.DSClient.get_token", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.model.base_py.ProperDictMixin.to_proper_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.utils_py.init_cls_from_base", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.models_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.tasks_py.edit", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.59624560_openwpm_openwpm.openwpm.browser_manager_py.BrowserManagerHandle._unpack_pickled_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_collection_export_results", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_export_results", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69737800_eprbell_dali_rp2.src.dali.plugin.pair_converter.coinbase_advanced_py.PairConverterPlugin.get_historic_bar_from_native_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy.pl._ligrec_py.ligrec", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_change_to_dict_return_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.loaders.__init___py.write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.tests.test_base_py.test_dotted_set", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_paging", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_user_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.dataset.arrow.dec_py.ArrowDecoder.decode_batch", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.ext.rotbaum._model_py.QRX.fit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.daf.tslib.engine.hyperopt_py.HyperOptManager.load_records", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m1_py.generate_m1_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m3_py.generate_m3_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m4_py.generate_m4_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.surrogate.transformers.config_py.DatasetCatch22Encoder.fit"]. |
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | from_root_path | def from_root_path(root_path: str) -> JobAttachmentS3Settings:path_split: list = root_path.split("/")if len(path_split) < 2:raise MalformedAttachmentSettingError("Invalid root path format, should be s3BucketName/rootPrefix.")return JobAttachmentS3Settings(path_split[0], "/".join(path_split[1:])) | 2 | 7 | 1 | 51 | 0 | 245 | 253 | 245 | root_path | [] | JobAttachmentS3Settings | {"AnnAssign": 1, "If": 1, "Return": 1} | 5 | 9 | 5 | ["root_path.split", "len", "MalformedAttachmentSettingError", "JobAttachmentS3Settings", "join"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_setting_from_path_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_setting_root_path"] | The function (from_root_path) is defined within the public class JobAttachmentS3Settings. It starts at line 245 and ends at line 253. It contains 7 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (root_path) and returns JobAttachmentS3Settings. 
It calls 5 functions: ["root_path.split", "len", "MalformedAttachmentSettingError", "JobAttachmentS3Settings", "join"]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_setting_from_path_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_setting_root_path"]. |
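The from_root_path row above parses a "s3BucketName/rootPrefix" string by splitting on the first slash. A standalone sketch of that parsing logic (the real static method returns a JobAttachmentS3Settings instance and raises MalformedAttachmentSettingError; this hypothetical version returns a tuple and raises ValueError):

```python
def parse_root_path(root_path: str) -> tuple[str, str]:
    # Split into bucket name and everything after the first "/".
    path_split = root_path.split("/")
    if len(path_split) < 2:
        # Mirrors the validation message shown in the row above.
        raise ValueError("Invalid root path format, should be s3BucketName/rootPrefix.")
    # Rejoin the remainder so prefixes containing "/" survive intact.
    return path_split[0], "/".join(path_split[1:])
```

For example, `parse_root_path("my-bucket/jobs/prefix")` yields `("my-bucket", "jobs/prefix")`; the rejoin is what keeps multi-segment prefixes from being truncated.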
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | from_s3_root_uri | def from_s3_root_uri(uri: str) -> JobAttachmentS3Settings:res = urlparse(uri)if not res.netloc or not res.path[1:] or res.scheme != "s3":raise MalformedAttachmentSettingError("Invalid root uri format, should be s3://s3BucketName/rootPrefix.")return JobAttachmentS3Settings(res.netloc, res.path[1:]) | 4 | 7 | 1 | 56 | 0 | 256 | 264 | 256 | uri | [] | JobAttachmentS3Settings | {"Assign": 1, "If": 1, "Return": 1} | 3 | 9 | 3 | ["urlparse", "MalformedAttachmentSettingError", "JobAttachmentS3Settings"] | 7 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.manifest_group_py.manifest_upload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.attachment_py._attachment_download_with_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.attachment_py._attachment_upload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.api.test_attachment_py.TestAttachmentUpload.test_upload_single_from_mapped", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.api.test_attachment_py.TestAttachmentUpload.test_upload_single_map_from_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_setting_from_s3_root_uri_error", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_setting_root_uri"] | The function (from_s3_root_uri) is defined within the public class JobAttachmentS3Settings. It starts at line 256 and ends at line 264. It contains 7 lines of code and has a cyclomatic complexity of 4. It takes 1 parameter (uri) and returns JobAttachmentS3Settings. It calls 3 functions: ["urlparse", "MalformedAttachmentSettingError", "JobAttachmentS3Settings"]. It is called by 7 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.client.cli._groups.manifest_group_py.manifest_upload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.attachment_py._attachment_download_with_root_manifests", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.attachment_py._attachment_upload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.api.test_attachment_py.TestAttachmentUpload.test_upload_single_from_mapped", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.api.test_attachment_py.TestAttachmentUpload.test_upload_single_map_from_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_setting_from_s3_root_uri_error", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_setting_root_uri"]. |
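The from_s3_root_uri row above validates an "s3://bucket/prefix" URI with urlparse before splitting it. A standalone sketch of that check (the real method constructs JobAttachmentS3Settings and raises MalformedAttachmentSettingError; this hypothetical version returns a tuple and raises ValueError):

```python
from urllib.parse import urlparse


def parse_s3_root_uri(uri: str) -> tuple[str, str]:
    res = urlparse(uri)
    # res.netloc is the bucket; res.path carries a leading "/", so
    # res.path[1:] is the root prefix. All three checks mirror the row above.
    if not res.netloc or not res.path[1:] or res.scheme != "s3":
        raise ValueError("Invalid root uri format, should be s3://s3BucketName/rootPrefix.")
    return res.netloc, res.path[1:]
```

Note that the `res.path[1:]` guard rejects bare bucket URIs such as `s3://my-bucket` as well as URIs with the wrong scheme.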
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | to_root_path | def to_root_path(self) -> str:return _join_s3_paths(self.s3BucketName, self.rootPrefix) | 1 | 2 | 1 | 18 | 0 | 266 | 267 | 266 | self | [] | str | {"Return": 1} | 1 | 2 | 1 | ["_join_s3_paths"] | 0 | [] | The function (to_root_path) is defined within the public class JobAttachmentS3Settings. It starts at line 266 and ends at line 267. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns str. It calls 1 function: ["_join_s3_paths"]. |
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | to_s3_root_uri | def to_s3_root_uri(self) -> str:return f"s3://{self.to_root_path()}" | 1 | 2 | 1 | 10 | 0 | 269 | 270 | 269 | self | [] | str | {"Return": 1} | 1 | 2 | 1 | ["self.to_root_path"] | 0 | [] | The function (to_s3_root_uri) is defined within the public class JobAttachmentS3Settings. It starts at line 269 and ends at line 270. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns str. It calls 1 function: ["self.to_root_path"]. |
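The to_root_path and to_s3_root_uri rows are the inverses of the two parsers above: one joins bucket and prefix, the other prepends the "s3://" scheme. A hypothetical free-function sketch of the same composition (the real methods read `self.s3BucketName` and `self.rootPrefix`):

```python
def to_root_path(bucket: str, root_prefix: str) -> str:
    # Join bucket and prefix with "/", as _join_s3_paths is used for above.
    return f"{bucket}/{root_prefix}"


def to_s3_root_uri(bucket: str, root_prefix: str) -> str:
    # Prepend the scheme onto the joined root path.
    return f"s3://{to_root_path(bucket, root_prefix)}"
```

So a settings object built via from_s3_root_uri should round-trip: parsing `s3://bucket/prefix` and re-serializing yields the same URI.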
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | full_cas_prefix | def full_cas_prefix(self) -> str:self._validate_root_prefix()return _join_s3_paths(self.rootPrefix, S3_DATA_FOLDER_NAME) | 1 | 3 | 1 | 21 | 0 | 272 | 274 | 272 | self | [] | str | {"Expr": 1, "Return": 1} | 2 | 3 | 2 | ["self._validate_root_prefix", "_join_s3_paths"] | 0 | [] | The function (full_cas_prefix) is defined within the public class JobAttachmentS3Settings. It starts at line 272 and ends at line 274. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns str. It calls 2 functions: ["self._validate_root_prefix", "_join_s3_paths"]. |
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | full_job_output_prefix | def full_job_output_prefix(self, farm_id, queue_id, job_id) -> str:self._validate_root_prefix()return _join_s3_paths(self.rootPrefix, S3_MANIFEST_FOLDER_NAME, farm_id, queue_id, job_id) | 1 | 3 | 4 | 33 | 0 | 276 | 278 | 276 | self,farm_id,queue_id,job_id | [] | str | {"Expr": 1, "Return": 1} | 2 | 3 | 2 | ["self._validate_root_prefix", "_join_s3_paths"] | 0 | [] | The function (full_job_output_prefix) is defined within the public class JobAttachmentS3Settings. It starts at line 276 and ends at line 278. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (self, farm_id, queue_id, job_id) and returns str. It calls 2 functions: ["self._validate_root_prefix", "_join_s3_paths"]. |
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | full_step_output_prefix | def full_step_output_prefix(self, farm_id, queue_id, job_id, step_id) -> str:self._validate_root_prefix()return _join_s3_paths(self.rootPrefix, S3_MANIFEST_FOLDER_NAME, farm_id, queue_id, job_id, step_id) | 1 | 5 | 5 | 37 | 0 | 280 | 284 | 280 | self,farm_id,queue_id,job_id,step_id | [] | str | {"Expr": 1, "Return": 1} | 2 | 5 | 2 | ["self._validate_root_prefix", "_join_s3_paths"] | 0 | [] | The function (full_step_output_prefix) is defined within the public class JobAttachmentS3Settings. It starts at line 280 and ends at line 284. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 5 parameters (self, farm_id, queue_id, job_id, step_id) and returns str. It calls 2 functions: ["self._validate_root_prefix", "_join_s3_paths"]. |
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | full_task_output_prefix | def full_task_output_prefix(self, farm_id, queue_id, job_id, step_id, task_id) -> str:self._validate_root_prefix()return _join_s3_paths(self.rootPrefix, S3_MANIFEST_FOLDER_NAME, farm_id, queue_id, job_id, step_id, task_id) | 1 | 5 | 6 | 41 | 0 | 286 | 290 | 286 | self,farm_id,queue_id,job_id,step_id,task_id | [] | str | {"Expr": 1, "Return": 1} | 2 | 5 | 2 | ["self._validate_root_prefix", "_join_s3_paths"] | 0 | [] | The function (full_task_output_prefix) is defined within the public class JobAttachmentS3Settings. It starts at line 286 and ends at line 290. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 6 parameters (self, farm_id, queue_id, job_id, step_id, task_id) and returns str. It calls 2 functions: ["self._validate_root_prefix", "_join_s3_paths"]. |
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | full_output_prefix | def full_output_prefix(self, farm_id, queue_id, job_id, step_id, task_id, session_action_id) -> str:self._validate_root_prefix()return _join_s3_paths(self.rootPrefix,S3_MANIFEST_FOLDER_NAME,farm_id,queue_id,job_id,step_id,task_id,session_action_id,) | 1 | 14 | 7 | 46 | 0 | 292 | 305 | 292 | self,farm_id,queue_id,job_id,step_id,task_id,session_action_id | [] | str | {"Expr": 1, "Return": 1} | 2 | 14 | 2 | ["self._validate_root_prefix", "_join_s3_paths"] | 0 | [] | The function (full_output_prefix) is defined within the public class JobAttachmentS3Settings. It starts at line 292 and ends at line 305. It contains 14 lines of code and has a cyclomatic complexity of 1. It takes 7 parameters (self, farm_id, queue_id, job_id, step_id, task_id, session_action_id) and returns str. It calls 2 functions: ["self._validate_root_prefix", "_join_s3_paths"]. |
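The full_*_output_prefix rows all build the same hierarchy under the manifests folder, each one layering in one more ID (farm, queue, job, step, task, session action). A sketch of the deepest variant; _join_s3_paths is not shown in these rows, so a simple "/".join stand-in is assumed, and the value of S3_MANIFEST_FOLDER_NAME is an assumed placeholder:

```python
def join_s3_paths(*parts: str) -> str:
    # Hypothetical stand-in for _join_s3_paths; assumed to join with "/".
    return "/".join(parts)


S3_MANIFEST_FOLDER_NAME = "Manifests"  # assumed constant value


def full_output_prefix(root_prefix, farm_id, queue_id, job_id,
                       step_id, task_id, session_action_id) -> str:
    # Mirrors the layering in the rows above: root prefix, manifests
    # folder, then each scoping ID from farm down to session action.
    return join_s3_paths(root_prefix, S3_MANIFEST_FOLDER_NAME, farm_id,
                         queue_id, job_id, step_id, task_id, session_action_id)
```

This layering means a prefix listing at any level (e.g. job) enumerates all output manifests beneath it (steps, tasks, session actions).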
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | partial_session_action_manifest_prefix | def partial_session_action_manifest_prefix(farm_id: str,queue_id: str,job_id: str,step_id: str,task_id: str,session_action_id: str,time: float,) -> str:"""Constructs the partial S3 prefix for storing session action output manifests.This method creates a hierarchical path structure for organizing output manifests in S3,following the pattern: farm_id/queue_id/job_id/step_id/task_id/timestamp_session_action_id.The timestamp is converted from a float to an ISO datetime string format."""return _join_s3_paths(farm_id,queue_id,job_id,step_id,task_id,f"{_float_to_iso_datetime_string(time)}_{session_action_id}",) | 1 | 17 | 7 | 52 | 0 | 308 | 331 | 308 | farm_id,queue_id,job_id,step_id,task_id,session_action_id,time | [] | str | {"Expr": 1, "Return": 1} | 2 | 24 | 2 | ["_join_s3_paths", "_float_to_iso_datetime_string"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_s3_settings_partial_session_action_manifest_prefix"] | The function (partial_session_action_manifest_prefix) is defined within the public class JobAttachmentS3Settings. It starts at line 308 and ends at line 331. It contains 17 lines of code and has a cyclomatic complexity of 1. It takes 7 parameters (farm_id, queue_id, job_id, step_id, task_id, session_action_id, time) and returns str. It calls 2 functions: ["_join_s3_paths", "_float_to_iso_datetime_string"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_models_py.TestJobAttachmentS3SettingsModel.test_job_attachment_s3_settings_partial_session_action_manifest_prefix"]. |
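The partial_session_action_manifest_prefix row builds its leaf segment as "timestamp_session_action_id", where the timestamp comes from _float_to_iso_datetime_string. That helper's body is not shown in these rows; the sketch below assumes it converts an epoch float to an ISO-8601 UTC string, which matches the docstring but is a guess at the exact format:

```python
from datetime import datetime, timezone


def float_to_iso(ts: float) -> str:
    # Assumed behavior of _float_to_iso_datetime_string: epoch seconds
    # to an ISO-8601 UTC timestamp (exact format is a guess).
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")


def manifest_leaf(time: float, session_action_id: str) -> str:
    # The leaf segment pattern from the row above: timestamp, underscore, ID.
    return f"{float_to_iso(time)}_{session_action_id}"
```

Leading with the timestamp makes sibling keys sort chronologically under an S3 prefix listing, which is a plausible reason for this ordering.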
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | partial_manifest_prefix | def partial_manifest_prefix(self, farm_id, queue_id) -> str:guid = _generate_random_guid()return _join_s3_paths(farm_id,queue_id,S3_INPUT_MANIFEST_FOLDER_NAME,guid,) | 1 | 8 | 3 | 28 | 0 | 333 | 340 | 333 | self,farm_id,queue_id | [] | str | {"Assign": 1, "Return": 1} | 2 | 8 | 2 | ["_generate_random_guid", "_join_s3_paths"] | 0 | [] | The function (partial_manifest_prefix) is defined within the public class JobAttachmentS3Settings. It starts at line 333 and ends at line 340. It contains 8 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (self, farm_id, queue_id) and returns str. It calls 2 functions: ["_generate_random_guid", "_join_s3_paths"]. |
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | add_root_and_manifest_folder_prefix | def add_root_and_manifest_folder_prefix(self, path: str) -> str:"""Adds "{self.rootPrefix}/{S3_MANIFEST_FOLDER_NAME}/" to the beginningof the path and returns it."""self._validate_root_prefix()return _join_s3_paths(self.rootPrefix, S3_MANIFEST_FOLDER_NAME, path) | 1 | 3 | 2 | 28 | 0 | 342 | 348 | 342 | self,path | [] | str | {"Expr": 2, "Return": 1} | 2 | 7 | 2 | ["self._validate_root_prefix", "_join_s3_paths"] | 0 | [] | The function (add_root_and_manifest_folder_prefix) is defined within the public class JobAttachmentS3Settings. It starts at line 342 and ends at line 348. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, path) and returns str. It calls 2 functions: ["self._validate_root_prefix", "_join_s3_paths"]. |
aws-deadline_deadline-cloud | JobAttachmentS3Settings | public | 0 | 0 | _validate_root_prefix | def _validate_root_prefix(self) -> None:if not self.rootPrefix:raise MissingS3RootPrefixError("Missing S3 root prefix") | 2 | 3 | 1 | 18 | 0 | 350 | 352 | 350 | self | [] | None | {"If": 1} | 1 | 3 | 1 | ["MissingS3RootPrefixError"] | 0 | [] | The function (_validate_root_prefix) is defined within the public class JobAttachmentS3Settings. It starts at line 350 and ends at line 352. It contains 3 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (self) and returns None. It calls 1 function: ["MissingS3RootPrefixError"]. |
aws-deadline_deadline-cloud | ManifestProperties | public | 0 | 0 | to_dict | def to_dict(self) -> dict[str, Any]:return {"storageProfileId": self.storageProfileId,"displayName": self.displayName,"osFamily": self.osFamily.value,"fileSystemLocations": [item.to_dict() for item in self.fileSystemLocations],} | 2 | 7 | 1 | 51 | 0 | 392 | 398 | 392 | self | [] | dict[str, Any] | {"AnnAssign": 1, "Assign": 5, "If": 4, "Return": 1} | 0 | 12 | 0 | [] | 35 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.omit_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.pick_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.ds_client_py.DSClient.get_token", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.model.base_py.ProperDictMixin.to_proper_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.utils_py.init_cls_from_base", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.models_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.tasks_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.59624560_openwpm_openwpm.openwpm.browser_manager_py.BrowserManagerHandle._unpack_pickled_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_collection_export_results", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_export_results", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69737800_eprbell_dali_rp2.src.dali.plugin.pair_converter.coinbase_advanced_py.PairConverterPlugin.get_historic_bar_from_native_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy.pl._ligrec_py.ligrec", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_change_to_dict_return_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.loaders.__init___py.write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.tests.test_base_py.test_dotted_set", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_paging", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_user_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.dataset.arrow.dec_py.ArrowDecoder.decode_batch", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.ext.rotbaum._model_py.QRX.fit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.daf.tslib.engine.hyperopt_py.HyperOptManager.load_records", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m1_py.generate_m1_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m3_py.generate_m3_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m4_py.generate_m4_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.surrogate.transformers.config_py.DatasetCatch22Encoder.fit"] | The function (to_dict) is defined within the public class ManifestProperties. It starts at line 392 and ends at line 398, contains 7 lines of code, and has a cyclomatic complexity of 2. It takes 1 parameter (self) and returns a dict[str, Any]. It is called by the 35 functions listed in the incoming_function_names field. |
aws-deadline_deadline-cloud | ManifestProperties | public | 0 | 0 | to_dict | def to_dict(self) -> dict[str, Any]:return {"name": self.name, "path": self.path, "type": self.type.value} | 1 | 2 | 1 | 34 | 0 | 409 | 410 | 409 | self | [] | dict[str, Any] | {"AnnAssign": 1, "Assign": 5, "If": 4, "Return": 1} | 0 | 12 | 0 | [] | 35 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.omit_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957978_dgilland_pydash.src.pydash.objects_py.pick_by", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967770_docusign_code_examples_python.app.docusign.ds_client_py.DSClient.get_token", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.model.base_py.ProperDictMixin.to_proper_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.utils_py.init_cls_from_base", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.models_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.tasks_py.edit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.59624560_openwpm_openwpm.openwpm.browser_manager_py.BrowserManagerHandle._unpack_pickled_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_collection_export_results", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69272947_misp_misp_stix.tests._test_stix_export_py.TestCollectionSTIX1Export._check_stix1_export_results", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69737800_eprbell_dali_rp2.src.dali.plugin.pair_converter.coinbase_advanced_py.PairConverterPlugin.get_historic_bar_from_native_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69848748_scverse_squidpy.src.squidpy.pl._ligrec_py.ligrec", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_change_to_dict_return_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.dynaconf.loaders.__init___py.write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70258989_dynaconf_dynaconf.tests.test_base_py.test_dotted_set", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_paging", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70538610_asfhyp3_hyp3_sdk.tests.test_hyp3_py.test_find_jobs_user_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.dataset.arrow.dec_py.ArrowDecoder.decode_batch", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.ext.rotbaum._model_py.QRX.fit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.daf.tslib.engine.hyperopt_py.HyperOptManager.load_records", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m1_py.generate_m1_dataset", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m3_py.generate_m3_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.few_shot_prediction.src.meta.datasets.m4_py.generate_m4_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.surrogate.transformers.config_py.DatasetCatch22Encoder.fit"] | The function (to_dict) is defined within the public class ManifestProperties. It starts at line 409 and ends at line 410, contains 2 lines of code, and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a dict[str, Any]. It is called by the 35 functions listed in the incoming_function_names field. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | default_glob_all | def default_glob_all() -> List[str]:return ["**/*"] | 1 | 2 | 0 | 13 | 0 | 425 | 426 | 425 | [] | List[str] | {"Return": 1} | 0 | 2 | 0 | [] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._glob_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_snapshot"] | The function (default_glob_all) is defined at module level (the class field is "public"). It starts at line 425 and ends at line 426, contains 2 lines of code, and has a cyclomatic complexity of 1. It takes no parameters and returns a List[str]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._glob_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.api.manifest_py._manifest_snapshot"]. | |
aws-deadline_deadline-cloud | PathMappingRule | public | 0 | 0 | get_hashed_source_path | def get_hashed_source_path(self, hash_alg: HashAlgorithm) -> str:return hash_data(self.source_path.encode("utf-8"), hash_alg) | 1 | 2 | 2 | 25 | 0 | 491 | 492 | 491 | self,hash_alg | [] | str | {"Return": 1} | 2 | 2 | 2 | ["hash_data", "self.source_path.encode"] | 0 | [] | The function (get_hashed_source_path) is defined within the public class PathMappingRule. It starts at line 491 and ends at line 492, contains 2 lines of code, and has a cyclomatic complexity of 1. It takes 2 parameters (self, hash_alg) and returns a str. It calls 2 functions: ["hash_data", "self.source_path.encode"]. |
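The get_hashed_source_path row above hashes the UTF-8 bytes of a source path with a caller-chosen algorithm via `hash_data`. That helper's body is not in this dataset; the sketch below is a plausible stand-in built on `hashlib`, assuming `hash_alg` names an algorithm hashlib supports (the real code passes a HashAlgorithm enum).

```python
import hashlib

# Hypothetical stand-in for the `hash_data` helper called by
# get_hashed_source_path above. Assumption: hash_alg is a string name
# accepted by hashlib.new (the actual project uses a HashAlgorithm type).
def hash_data(data: bytes, hash_alg: str = "sha256") -> str:
    h = hashlib.new(hash_alg)  # raises ValueError for unknown algorithms
    h.update(data)
    return h.hexdigest()

# Usage mirroring the row: hash a path's UTF-8 bytes.
print(hash_data("/projects/scene".encode("utf-8")))
```

Hashing the encoded path rather than the str keeps the digest stable across platforms and Python versions.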
aws-deadline_deadline-cloud | public | public | 0 | 0 | _set_fs_group_for_posix | def _set_fs_group_for_posix(file_paths: List[str],local_root: str,fs_permission_settings: PosixFileSystemPermissionSettings,) -> None:os_group = fs_permission_settings.os_groupdir_mode = fs_permission_settings.dir_modefile_mode = fs_permission_settings.file_mode# A set that stores the unique directory paths where permissions need to be changed.dir_paths_to_change_fs_group: Set[Path] = set()# 1. Set group ownership and permissions for each file.for file_path_str in file_paths:# The file path must be relative to the root path (ie. local_root).if not _is_relative_to(file_path_str, local_root):raise PathOutsideDirectoryError(f"The provided path '{file_path_str}' is not under the root directory: {local_root}")_change_permission_for_posix(file_path_str, os_group, file_mode)# Add the parent directories of each file to the set of directories whose# group ownership and permissions will be changed.path_components = Path(file_path_str).relative_to(local_root).parentsfor path_component in path_components:path_to_change = Path(local_root).joinpath(path_component)dir_paths_to_change_fs_group.add(path_to_change)# 2. 
Set group ownership and permissions for the directories in the path starting from root.for dir_path in dir_paths_to_change_fs_group:_change_permission_for_posix(str(dir_path), os_group, dir_mode) | 5 | 21 | 3 | 125 | 5 | 72 | 103 | 72 | file_paths,local_root,fs_permission_settings | ['file_mode', 'os_group', 'path_to_change', 'path_components', 'dir_mode'] | None | {"AnnAssign": 1, "Assign": 5, "Expr": 3, "For": 3, "If": 1} | 11 | 32 | 11 | ["set", "_is_relative_to", "PathOutsideDirectoryError", "_change_permission_for_posix", "relative_to", "Path", "joinpath", "Path", "dir_paths_to_change_fs_group.add", "_change_permission_for_posix", "str"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._set_fs_group"] | The function (_set_fs_group_for_posix) is defined at module level (the class field is "public"). It starts at line 72 and ends at line 103, contains 21 lines of code, and has a cyclomatic complexity of 5. It takes 3 parameters (file_paths, local_root, fs_permission_settings) and returns None. It calls 11 functions: ["set", "_is_relative_to", "PathOutsideDirectoryError", "_change_permission_for_posix", "relative_to", "Path", "joinpath", "Path", "dir_paths_to_change_fs_group.add", "_change_permission_for_posix", "str"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._set_fs_group"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _set_fs_permission_for_windows | def _set_fs_permission_for_windows(file_paths: List[str],local_root: str,fs_permission_settings: WindowsFileSystemPermissionSettings,) -> None:os_user = fs_permission_settings.os_userdir_mode = fs_permission_settings.dir_modefile_mode = fs_permission_settings.file_mode# A set that stores the unique directory paths where permissions need to be changed.dir_paths_to_change_fs_group: Set[Path] = set()# 1. Set permissions for each file.for file_path_str in file_paths:# The file path must be relative to the root path (ie. local_root).if not _is_relative_to(file_path_str, local_root):raise PathOutsideDirectoryError(f"The provided path '{file_path_str}' is not under the root directory: {local_root}")_change_permission_for_windows(file_path_str, os_user, file_mode)# Add the parent directories of each file to the set of directories whose# permissions will be changed.path_components = (_normalize_windows_path(file_path_str).relative_to(_normalize_windows_path(local_root)).parents)for path_component in path_components:path_to_change = Path(local_root).joinpath(path_component)dir_paths_to_change_fs_group.add(path_to_change)# 2. 
Set permissions for the directories in the path starting from root.for dir_path in dir_paths_to_change_fs_group:_change_permission_for_windows(str(dir_path), os_user, dir_mode) | 5 | 25 | 3 | 130 | 5 | 106 | 141 | 106 | file_paths,local_root,fs_permission_settings | ['file_mode', 'os_user', 'path_to_change', 'path_components', 'dir_mode'] | None | {"AnnAssign": 1, "Assign": 5, "Expr": 3, "For": 3, "If": 1} | 12 | 36 | 12 | ["set", "_is_relative_to", "PathOutsideDirectoryError", "_change_permission_for_windows", "relative_to", "_normalize_windows_path", "_normalize_windows_path", "joinpath", "Path", "dir_paths_to_change_fs_group.add", "_change_permission_for_windows", "str"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.set_file_permission_for_windows_py.run_test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._set_fs_group"] | The function (_set_fs_permission_for_windows) is defined at module level (the class field is "public"). It starts at line 106 and ends at line 141, contains 25 lines of code, and has a cyclomatic complexity of 5. It takes 3 parameters (file_paths, local_root, fs_permission_settings) and returns None. It calls 12 functions: ["set", "_is_relative_to", "PathOutsideDirectoryError", "_change_permission_for_windows", "relative_to", "_normalize_windows_path", "_normalize_windows_path", "joinpath", "Path", "dir_paths_to_change_fs_group.add", "_change_permission_for_windows", "str"], and it is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.set_file_permission_for_windows_py.run_test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.download_py._set_fs_group"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _change_permission_for_posix | def _change_permission_for_posix(path_str: str,os_group: str,mode: int,) -> None:if sys.platform == "win32":raise EnvironmentError("This function can only be executed on POSIX systems.")path = Path(path_str)shutil.chown(path, group=os_group)os.chmod(path, path.stat().st_mode | mode) | 2 | 10 | 3 | 62 | 1 | 144 | 154 | 144 | path_str,os_group,mode | ['path'] | None | {"Assign": 1, "Expr": 2, "If": 1} | 5 | 11 | 5 | ["EnvironmentError", "Path", "shutil.chown", "os.chmod", "path.stat"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.os_file_permission_py._set_fs_group_for_posix"] | The function (_change_permission_for_posix) is defined at module level (the class field is "public"). It starts at line 144 and ends at line 154, contains 10 lines of code, and has a cyclomatic complexity of 2. It takes 3 parameters (path_str, os_group, mode) and returns None. It calls 5 functions: ["EnvironmentError", "Path", "shutil.chown", "os.chmod", "path.stat"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.os_file_permission_py._set_fs_group_for_posix"]. |
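The key detail in the _change_permission_for_posix row is `os.chmod(path, path.stat().st_mode | mode)`: the requested bits are OR'd into the existing mode, so permissions are only ever added, never removed. The snippet below demonstrates just that bitwise arithmetic with `stat` constants (the example mode values are illustrative, not taken from the project).

```python
import stat

# Demonstrates the permission arithmetic used by _change_permission_for_posix
# above: the new mode is the existing st_mode OR'd with the requested bits,
# so existing rights are preserved. Example values are illustrative.
existing_mode = 0o640                      # rw-r----- (owner rw, group r)
added_bits = stat.S_IRGRP | stat.S_IWGRP   # grant group read + write (0o060)
new_mode = existing_mode | added_bits
print(oct(new_mode))                       # 0o660 → rw-rw----
```

Using OR rather than assignment is why the function cannot accidentally strip an owner's execute or write bit while widening group access.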
aws-deadline_deadline-cloud | public | public | 0 | 0 | _change_permission_for_windows | def _change_permission_for_windows(path: str,os_user: str,mode: WindowsPermissionEnum,) -> None:if sys.platform != "win32":raise EnvironmentError("This function can only be executed on Windows systems.")import win32securitytry:con_mode = _get_ntsecuritycon_mode(mode)# Lookup the user's SID (Security Identifier)user_sid = win32security.LookupAccountName(None, os_user)[0]# Get existing DACL (Discretionary Access Control List). If dacl is none, create a new one.sd = win32security.GetFileSecurity(path, win32security.DACL_SECURITY_INFORMATION)dacl = sd.GetSecurityDescriptorDacl()if dacl is None:dacl = win32security.ACL()# Add new ACE (Access Control Entry)dacl.AddAccessAllowedAce(win32security.ACL_REVISION, con_mode, user_sid)# Set the modified DACL to the security descriptorsd.SetSecurityDescriptorDacl(1, dacl, 0)win32security.SetFileSecurity(path, win32security.DACL_SECURITY_INFORMATION, sd)except win32security.error as e:raise AssetSyncError(f"Failed to set permissions for file or directory ({path}): {e}") from e | 4 | 22 | 3 | 133 | 4 | 157 | 184 | 157 | path,os_user,mode | ['dacl', 'sd', 'con_mode', 'user_sid'] | None | {"Assign": 5, "Expr": 3, "If": 2, "Try": 1} | 10 | 28 | 10 | ["EnvironmentError", "_get_ntsecuritycon_mode", "win32security.LookupAccountName", "win32security.GetFileSecurity", "sd.GetSecurityDescriptorDacl", "win32security.ACL", "dacl.AddAccessAllowedAce", "sd.SetSecurityDescriptorDacl", "win32security.SetFileSecurity", "AssetSyncError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.os_file_permission_py._set_fs_permission_for_windows"] | The function (_change_permission_for_windows) defined within the public class called public.The function start at line 157 and ends at 184. It contains 22 lines of code and it has a cyclomatic complexity of 4. 
It takes 3 parameters (path, os_user, mode) and returns None. It calls 10 functions: ["EnvironmentError", "_get_ntsecuritycon_mode", "win32security.LookupAccountName", "win32security.GetFileSecurity", "sd.GetSecurityDescriptorDacl", "win32security.ACL", "dacl.AddAccessAllowedAce", "sd.SetSecurityDescriptorDacl", "win32security.SetFileSecurity", "AssetSyncError"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.os_file_permission_py._set_fs_permission_for_windows"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | _get_ntsecuritycon_mode | def _get_ntsecuritycon_mode(mode: WindowsPermissionEnum) -> int:"""Get the NTSecurityCon mode for a WindowsPermissionEnum."""if sys.platform != "win32":raise EnvironmentError("This function can only be executed on Windows systems.")import ntsecuritycon as conpermission_mapping = {WindowsPermissionEnum.READ.value: con.FILE_GENERIC_READ,WindowsPermissionEnum.WRITE.value: con.FILE_GENERIC_WRITE,WindowsPermissionEnum.EXECUTE.value: con.FILE_GENERIC_EXECUTE,WindowsPermissionEnum.READ_WRITE.value: con.FILE_GENERIC_READ | con.FILE_GENERIC_WRITE,WindowsPermissionEnum.FULL_CONTROL.value: con.FILE_ALL_ACCESS,}return permission_mapping[mode.value] | 2 | 12 | 1 | 91 | 1 | 187 | 203 | 187 | mode | ['permission_mapping'] | int | {"Assign": 1, "Expr": 1, "If": 1, "Return": 1} | 1 | 17 | 1 | ["EnvironmentError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.os_file_permission_py._change_permission_for_windows"] | The function (_get_ntsecuritycon_mode) is defined at module level (the class field is "public"). It starts at line 187 and ends at line 203, contains 12 lines of code, and has a cyclomatic complexity of 2. It takes 1 parameter (mode) and returns an int. It calls 1 function: ["EnvironmentError"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.os_file_permission_py._change_permission_for_windows"]. |
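The _get_ntsecuritycon_mode row above maps an enum's `.value` to Windows access-mask flags, OR-combining generic rights for composite entries like READ_WRITE. The platform-neutral sketch below shows the same lookup pattern; the enum, names, and flag values here are illustrative placeholders, not the real ntsecuritycon constants.

```python
from enum import Enum

# Platform-neutral sketch of the dict-lookup pattern in
# _get_ntsecuritycon_mode above. The flag values are illustrative
# placeholders, not real ntsecuritycon constants.
class Perm(Enum):
    READ = "READ"
    WRITE = "WRITE"
    READ_WRITE = "READ_WRITE"

FILE_GENERIC_READ = 0x1   # placeholder flag
FILE_GENERIC_WRITE = 0x2  # placeholder flag

_MAPPING = {
    Perm.READ.value: FILE_GENERIC_READ,
    Perm.WRITE.value: FILE_GENERIC_WRITE,
    # Composite rights are OR-combined, mirroring READ_WRITE in the source.
    Perm.READ_WRITE.value: FILE_GENERIC_READ | FILE_GENERIC_WRITE,
}

def get_mode(mode: Perm) -> int:
    # Keying on mode.value (a str) rather than the enum member itself
    # matches the source's permission_mapping[mode.value] lookup.
    return _MAPPING[mode.value]

print(get_mode(Perm.READ_WRITE))  # 3
```

Keying the dict on `.value` also makes the lookup work if the caller holds an equivalent enum defined in another module, as long as the string values match.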
aws-deadline_deadline-cloud | SummaryStatistics | public | 0 | 0 | aggregate | def aggregate(self, other: SummaryStatistics) -> SummaryStatistics:"""Aggregates other object of SummaryStatistics to this."""if not isinstance(other, self.__class__):raise TypeError("Only instances of the same type can be aggregated.")self.total_time += other.total_timeself.total_files += other.total_filesself.total_bytes += other.total_bytesself.processed_files += other.processed_filesself.processed_bytes += other.processed_bytesself.skipped_files += other.skipped_filesself.skipped_bytes += other.skipped_bytesself.transfer_rate = self.processed_bytes / self.total_time if self.total_time else 0.0return self | 3 | 12 | 2 | 98 | 0 | 42 | 57 | 42 | self,other | [] | SummaryStatistics | {"Assign": 1, "AugAssign": 7, "Expr": 1, "If": 1, "Return": 1} | 2 | 16 | 2 | ["isinstance", "TypeError"] | 28 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.manager_py.TranslationQueryset.aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.query_py.AggregateTests.test_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3696868_turbogears_ming.ming.session_py.Session.aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923647_code4romania_covid_19_ro_help.ro_help.hub.models_py.NGO.get_total_funded", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.reports.aggregate_campaigns_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_types_groupby_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_frame_py.test_types_groupby", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_frame_py.test_types_groupby_agg", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_engine", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_ewm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_expanding", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_resample", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_rolling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_ewm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_expanding", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_resample", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_rolling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_frame_combinations", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_series", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_series_combinations", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_ewm_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_ewm_aggregate_series", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_expanding_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_expanding_aggregate_series"] | The function (aggregate) defined within the public class called SummaryStatistics.The function start at line 42 and ends at 57. It contains 12 lines of code and it has a cyclomatic complexity of 3. It takes 2 parameters, represented as [42.0] and does not return any value. 
It declares 2.0 functions, It has 2.0 functions called inside which are ["isinstance", "TypeError"], It has 28.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.manager_py.TranslationQueryset.aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.query_py.AggregateTests.test_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3696868_turbogears_ming.ming.session_py.Session.aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923647_code4romania_covid_19_ro_help.ro_help.hub.models_py.NGO.get_total_funded", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.reports.aggregate_campaigns_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_types_groupby_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_frame_py.test_types_groupby", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_frame_py.test_types_groupby_agg", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_engine", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_ewm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_expanding", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_resample", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_rolling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_ewm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_expanding", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_resample", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_rolling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_frame_combinations", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_series", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_series_combinations", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_ewm_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_ewm_aggregate_series", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_expanding_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_expanding_aggregate_series"]. |
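The `SummaryStatistics.aggregate` row above type-checks its argument, sums counters in place with augmented assignment, and recomputes the transfer rate with a zero-division guard. A minimal sketch of that logic, keeping only a few of the fields recorded in the row:

```python
from dataclasses import dataclass


@dataclass
class SummaryStatistics:
    total_time: float = 0.0
    processed_files: int = 0
    processed_bytes: int = 0
    transfer_rate: float = 0.0

    def aggregate(self, other: "SummaryStatistics") -> "SummaryStatistics":
        """Fold another statistics object into this one, in place."""
        if not isinstance(other, self.__class__):
            raise TypeError("Only instances of the same type can be aggregated.")
        self.processed_files += other.processed_files
        self.processed_bytes += other.processed_bytes
        self.total_time += other.total_time
        # Guard against division by zero when no time has elapsed yet.
        self.transfer_rate = (
            self.processed_bytes / self.total_time if self.total_time else 0.0
        )
        return self
```

Returning `self` lets callers chain several `aggregate` calls when folding a list of per-transfer statistics into one total.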
aws-deadline_deadline-cloud | SummaryStatistics | public | 0 | 0 | __str__ | def __str__(self):return (f"Processed {self.processed_files} file{'' if self.processed_files == 1 else 's'}"+ f" totaling {human_readable_file_size(self.processed_bytes)}.\n"+ f"Skipped re-processing {self.skipped_files} files totaling"+ f" {human_readable_file_size(self.skipped_bytes)}.\n"+ f"Total processing time of {round(self.total_time, ndigits=5)} seconds"+ f" at {human_readable_file_size(int(self.transfer_rate))}/s.\n") | 1 | 9 | 1 | 25 | 0 | 59 | 67 | 59 | self | [] | Returns | {"Return": 1} | 5 | 9 | 5 | ["human_readable_file_size", "human_readable_file_size", "round", "human_readable_file_size", "int"] | 8 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.dockertest.output.validate_py.OutputGoodBase.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.test_utils.project.app.models_py.NormalProxy.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.test_utils.project.app.models_py.NormalProxyProxy.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3704508_kronenthaler_mod_pbxproj.pbxproj.pbxsections.PBXContainerItemProxy_py.PBXContainerItemProxy.__getitem__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923647_code4romania_covid_19_ro_help.ro_help.mobilpay.mobilpay.payment.request.card_py.Card.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3956605_inducer_pymbolic.pymbolic.imperative.statement_py.ConditionalStatement.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963126_numpy_numpydoc.numpydoc.docscrape_py.FunctionDoc.__str__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94530662_scikit_build_scikit_build_core.src.scikit_build_core.ast.ast_py.Block.__str__"] | The function (__str__) defined within the public class called SummaryStatistics.The function start at line 59 and ends at 67. It contains 9 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters, and this function return a value. It declares 5.0 functions, It has 5.0 functions called inside which are ["human_readable_file_size", "human_readable_file_size", "round", "human_readable_file_size", "int"], It has 8.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.dockertest.output.validate_py.OutputGoodBase.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.test_utils.project.app.models_py.NormalProxy.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.test_utils.project.app.models_py.NormalProxyProxy.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3704508_kronenthaler_mod_pbxproj.pbxproj.pbxsections.PBXContainerItemProxy_py.PBXContainerItemProxy.__getitem__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923647_code4romania_covid_19_ro_help.ro_help.mobilpay.mobilpay.payment.request.card_py.Card.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3956605_inducer_pymbolic.pymbolic.imperative.statement_py.ConditionalStatement.__str__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963126_numpy_numpydoc.numpydoc.docscrape_py.FunctionDoc.__str__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94530662_scikit_build_scikit_build_core.src.scikit_build_core.ast.ast_py.Block.__str__"]. |
aws-deadline_deadline-cloud | SummaryStatistics | public | 0 | 0 | aggregate | def aggregate(self, other: SummaryStatistics) -> SummaryStatistics:"""Aggregates other object of DownloadSummaryStatistics to this."""super().aggregate(other)if not hasattr(other, "file_counts_by_root_directory"):raise TypeError(f"{other.__class__.__name__} does not have a file_counts_by_root_directory field.")else:self.file_counts_by_root_directory = dict(Counter(self.file_counts_by_root_directory)+ Counter(other.file_counts_by_root_directory))return self | 2 | 12 | 2 | 59 | 0 | 80 | 95 | 80 | self,other | [] | SummaryStatistics | {"Assign": 1, "AugAssign": 7, "Expr": 1, "If": 1, "Return": 1} | 2 | 16 | 2 | ["isinstance", "TypeError"] | 28 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.manager_py.TranslationQueryset.aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.query_py.AggregateTests.test_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3696868_turbogears_ming.ming.session_py.Session.aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923647_code4romania_covid_19_ro_help.ro_help.hub.models_py.NGO.get_total_funded", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.reports.aggregate_campaigns_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_types_groupby_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_frame_py.test_types_groupby", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_frame_py.test_types_groupby_agg", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_engine", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_ewm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_expanding", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_resample", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_rolling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_ewm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_expanding", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_resample", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_rolling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_frame_combinations", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_series", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_series_combinations", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_ewm_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_ewm_aggregate_series", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_expanding_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_expanding_aggregate_series"] | The function (aggregate) defined within the public class called SummaryStatistics.The function start at line 80 and ends at 95. It contains 12 lines of code and it has a cyclomatic complexity of 2. It takes 2 parameters, represented as [80.0] and does not return any value. 
It declares 2.0 functions, It has 2.0 functions called inside which are ["isinstance", "TypeError"], It has 28.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.manager_py.TranslationQueryset.aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.query_py.AggregateTests.test_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3696868_turbogears_ming.ming.session_py.Session.aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923647_code4romania_covid_19_ro_help.ro_help.hub.models_py.NGO.get_total_funded", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.56769247_dmwm_cmsspark.src.python.CMSSpark.reports.aggregate_campaigns_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.series.test_series_py.test_types_groupby_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_frame_py.test_types_groupby", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_frame_py.test_types_groupby_agg", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_engine", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_ewm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_expanding", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_resample", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_frame_groupby_rolling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_ewm", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_expanding", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_resample", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_groupby_py.test_series_groupby_rolling", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_frame_combinations", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_series", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_resampler_py.test_aggregate_series_combinations", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_ewm_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_ewm_aggregate_series", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_expanding_aggregate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69936553_pandas_dev_pandas_stubs.tests.test_windowing_py.test_expanding_aggregate_series"]. |
aws-deadline_deadline-cloud | DownloadSummaryStatistics | public | 0 | 1 | convert_to_summary_statistics | def convert_to_summary_statistics(self) -> SummaryStatistics:"""Converts this DownloadSummaryStatistics to a SummaryStatistics."""download_summary_statistics_dict = asdict(self)del download_summary_statistics_dict["file_counts_by_root_directory"]return SummaryStatistics(**download_summary_statistics_dict) | 1 | 4 | 1 | 25 | 0 | 97 | 103 | 97 | self | [] | SummaryStatistics | {"Assign": 1, "Expr": 1, "Return": 1} | 2 | 7 | 2 | ["asdict", "SummaryStatistics"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.copied_download"] | The function (convert_to_summary_statistics) is defined within the public class DownloadSummaryStatistics, which inherits from another class. It starts at line 97 and ends at 103. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a SummaryStatistics. It has 2 functions called inside, which are ["asdict", "SummaryStatistics"], and has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.asset_sync_py.AssetSync.copied_download"]. |
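The `convert_to_summary_statistics` row above narrows a subclass back to its base dataclass: it serializes the instance with `dataclasses.asdict`, deletes the subclass-only key, and rebuilds the base class from the remaining fields. A minimal reproduction with stand-in field names:

```python
from dataclasses import asdict, dataclass, field


@dataclass
class SummaryStatistics:
    processed_files: int = 0
    processed_bytes: int = 0


@dataclass
class DownloadSummaryStatistics(SummaryStatistics):
    file_counts_by_root_directory: dict = field(default_factory=dict)

    def convert_to_summary_statistics(self) -> SummaryStatistics:
        """Drop the subclass-only field and rebuild the base dataclass."""
        d = asdict(self)
        # asdict includes inherited and subclass fields alike, so the
        # subclass-only key must be removed before calling the base ctor.
        del d["file_counts_by_root_directory"]
        return SummaryStatistics(**d)
```

This works because `asdict` returns field names that match the base class's constructor keywords exactly once the extra key is removed.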
aws-deadline_deadline-cloud | ProgressStatus | public | 0 | 1 | __init__ | def __init__(self, title, verb_in_message):self.title = titleself.verb_in_message = verb_in_message | 1 | 3 | 3 | 19 | 0 | 126 | 128 | 126 | self,title,verb_in_message | [] | None | {"Assign": 2} | 0 | 3 | 0 | [] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) is defined within the public class ProgressStatus, which inherits from another class. It starts at line 126 and ends at 128. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (self, title, verb_in_message) and does not return a value. 
It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | __init__.do_nothing | def do_nothing(*args, **kwargs) -> bool:return True | 1 | 2 | 2 | 13 | 0 | 165 | 166 | 165 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (__init__.do_nothing) is a nested function defined inside __init__. It starts at line 165 and ends at 166. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes variadic parameters (*args, **kwargs) and always returns the bool True. |
aws-deadline_deadline-cloud | public | public | 0 | 0 | __init__.track_progress | def track_progress(bytes_amount: int, current_file_done: Optional[bool] = False) -> bool:"""When uploading or downloading files using boto3, pass this to the `Callback` argumentso that the progress can be updated with the amount of bytes processed."""with self._lock:self._initialize_timestamps_if_none()self.processed_bytes += bytes_amountif current_file_done:self.processed_files += 1self.completed_files_in_chunk += 1# Logs progress message to the logger (if exists)self._log_progress_message()# Invokes the callback with current progress datareturn self._report_progress() | 2 | 9 | 2 | 58 | 0 | 200 | 214 | 200 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (__init__.track_progress) is a nested function defined inside __init__. It starts at line 200 and ends at 214. It contains 9 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (bytes_amount, current_file_done) and returns a bool. |
aws-deadline_deadline-cloud | ProgressStatus | public | 0 | 1 | __init__ | def __init__(self,status: ProgressStatus,total_files: int,total_bytes: int,on_progress_callback: Optional[Callable[[ProgressReportMetadata], bool]] = None,callback_interval: int = CALLBACK_INTERVAL,max_files_in_chunk: int = MAX_FILES_IN_CHUNK,logger: Optional[Union[Logger, LoggerAdapter]] = None,log_interval: int = LOG_INTERVAL,log_percentage_threshold: int = LOG_PERCENTAGE_THRESHOLD,) -> None:def do_nothing(*args, **kwargs) -> bool:return Trueif not on_progress_callback:on_progress_callback = do_nothingself.on_progress_callback = on_progress_callbackself.continue_reporting = Trueself.reporting_interval = callback_intervalself.reporting_files_per_chunk = 1self.max_files_in_chunk = max_files_in_chunkself.completed_files_in_chunk = 0self.last_report_time: Optional[float] = Noneself.last_report_processed_bytes: int = 0self.logger = loggerself.log_interval = log_intervalself.log_percentage_threshold = log_percentage_thresholdself.last_logged_time: Optional[float] = Noneself.last_logged_completed_bytes: int = 0self.status = statusself.total_files = total_filesself.total_bytes = total_bytesif self.total_files >= self.max_files_in_chunk:self.reporting_files_per_chunk = self.max_files_in_chunkself.processed_files = 0self.processed_bytes = 0self.skipped_files = 0self.skipped_bytes = 0self.total_time = 0.0# total time (in fractional seconds) taken for the processself._lock = Lock()def track_progress(bytes_amount: int, current_file_done: Optional[bool] = False) -> bool:"""When uploading or downloading files using boto3, pass this to the `Callback` argumentso that the progress can be updated with the amount of bytes processed."""with self._lock:self._initialize_timestamps_if_none()self.processed_bytes += bytes_amountif current_file_done:self.processed_files += 1self.completed_files_in_chunk += 1# Logs progress message to the logger (if exists)self._log_progress_message()# Invokes the callback with 
current progress datareturn self._report_progress()self.track_progress_callback = track_progress | 3 | 41 | 11 | 234 | 0 | 153 | 216 | 153 | self,title,verb_in_message | [] | None | {"Assign": 2} | 0 | 3 | 0 | [] | 4,993 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called ProgressStatus, that inherit another class.The function start at line 153 and ends at 216. It contains 41 lines of code and it has a cyclomatic complexity of 3. It takes 11 parameters, represented as [153.0] and does not return any value. It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | set_total_files | def set_total_files(self, total_files, total_bytes) -> None:"""Stores the number and size of files to be processed."""self.total_files = total_filesself.total_bytes = total_bytesif self.total_files >= self.max_files_in_chunk:self.reporting_files_per_chunk = self.max_files_in_chunk | 2 | 5 | 3 | 38 | 0 | 218 | 225 | 218 | self,total_files,total_bytes | [] | None | {"Assign": 3, "Expr": 1, "If": 1} | 0 | 8 | 0 | [] | 0 | [] | The function (set_total_files) is defined within the public class called ProgressTracker. The function starts at line 218 and ends at 225. It contains 5 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters, represented as [218.0], and does not return any value. |
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | _initialize_timestamps_if_none | def _initialize_timestamps_if_none(self) -> None:"""This is to initialize the `last_report_time` and `last_logged_time` to thecurrent time in alignment with the start of process (i.e., hashing, uploading,or downloading.) These are used to calculate the current hashing or transferrate for the first progress report, which is the first callback invocation orthe first logging."""current_time = time.perf_counter()if self.last_report_time is None:self.last_report_time = current_timeif self.last_logged_time is None:self.last_logged_time = current_time | 3 | 6 | 1 | 39 | 0 | 227 | 239 | 227 | self | [] | None | {"Assign": 3, "Expr": 1, "If": 2} | 1 | 13 | 1 | ["time.perf_counter"] | 0 | [] | The function (_initialize_timestamps_if_none) is defined within the public class called ProgressTracker. The function starts at line 227 and ends at 239. It contains 6 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (self) and does not return any value. It makes 1 function call: ["time.perf_counter"]. |
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | increase_processed | def increase_processed(self, num_files: int = 1, file_bytes: int = 0) -> None:"""Adds the number and size of processed files."""with self._lock:self._initialize_timestamps_if_none()self.processed_files += num_filesself.completed_files_in_chunk += num_filesself.processed_bytes += file_bytes | 1 | 6 | 3 | 45 | 0 | 241 | 249 | 241 | self,num_files,file_bytes | [] | None | {"AugAssign": 3, "Expr": 2, "With": 1} | 1 | 9 | 1 | ["self._initialize_timestamps_if_none"] | 0 | [] | The function (increase_processed) is defined within the public class called ProgressTracker. The function starts at line 241 and ends at 249. It contains 6 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters, represented as [241.0], and does not return any value. It makes 1 function call: ["self._initialize_timestamps_if_none"]. |
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | increase_skipped | def increase_skipped(self, num_files: int = 1, file_bytes: int = 0) -> None:"""Adds the number and size of skipped files."""with self._lock:self.skipped_files += num_filesself.completed_files_in_chunk += num_filesself.skipped_bytes += file_bytes | 1 | 5 | 3 | 40 | 0 | 251 | 258 | 251 | self,num_files,file_bytes | [] | None | {"AugAssign": 3, "Expr": 1, "With": 1} | 0 | 8 | 0 | [] | 0 | [] | The function (increase_skipped) is defined within the public class called ProgressTracker. The function starts at line 251 and ends at 258. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters, represented as [251.0], and does not return any value. |
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | _report_progress | def _report_progress(self) -> bool:"""Invokes the callback with current progress metadata in one of the following cases:1. when a specific time interval has passed since the last call, (or since the process started,) or2. when a specific number of files (a chunk) has been processed, or3. when the progress is 100%, (including when all files were skipped during the process.)Sets the flag `continue_reporting` True if the operation should continue as normal,or False to cancel, and returns the flag."""if not self.continue_reporting:return Falsecurrent_time = time.perf_counter()if (self.last_report_time is Noneor current_time - self.last_report_time >= self.reporting_intervalor self.completed_files_in_chunk >= self.reporting_files_per_chunkor self.processed_files + self.skipped_files == self.total_files):self.continue_reporting = self.on_progress_callback(self._get_progress_report_metadata())self.last_report_processed_bytes = self.processed_bytesself.last_report_time = current_timeself.completed_files_in_chunk = 0return self.continue_reporting | 6 | 17 | 1 | 97 | 0 | 260 | 287 | 260 | self | [] | bool | {"Assign": 5, "Expr": 1, "If": 2, "Return": 2} | 3 | 28 | 3 | ["time.perf_counter", "self.on_progress_callback", "self._get_progress_report_metadata"] | 0 | [] | The function (_report_progress) is defined within the public class called ProgressTracker. The function starts at line 260 and ends at 287. It contains 17 lines of code and has a cyclomatic complexity of 6. It takes 1 parameter (self) and returns a bool. It makes 3 function calls: ["time.perf_counter", "self.on_progress_callback", "self._get_progress_report_metadata"]. |
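The `_report_progress` row above describes a throttling pattern: invoke the callback only when an interval has elapsed, a chunk of files has completed, or the work is done, and let the callback's boolean return value cancel further reporting. A minimal, self-contained sketch of that pattern (class and attribute names here are my own, not the library's):

```python
import time
from typing import Callable, Optional


class ThrottledReporter:
    """Sketch of interval/chunk-throttled progress reporting.

    The callback fires when (1) `interval` seconds have passed since the last
    report, (2) a full chunk of files has completed, or (3) all files are done.
    A callback returning False cancels all further reporting.
    """

    def __init__(self, callback: Callable[[int], bool],
                 interval: float = 1.0, files_per_chunk: int = 20,
                 total_files: int = 0) -> None:
        self.callback = callback
        self.interval = interval
        self.files_per_chunk = files_per_chunk
        self.total_files = total_files
        self.processed_files = 0
        self.completed_in_chunk = 0
        self.continue_reporting = True
        self.last_report_time: Optional[float] = None

    def file_done(self) -> bool:
        # Record one completed file, then decide whether to report.
        self.processed_files += 1
        self.completed_in_chunk += 1
        return self._report()

    def _report(self) -> bool:
        if not self.continue_reporting:
            return False
        now = time.perf_counter()
        if (self.last_report_time is None
                or now - self.last_report_time >= self.interval
                or self.completed_in_chunk >= self.files_per_chunk
                or self.processed_files == self.total_files):
            # The callback decides whether the operation should continue.
            self.continue_reporting = self.callback(self.processed_files)
            self.last_report_time = now
            self.completed_in_chunk = 0
        return self.continue_reporting
```

Note that the first call always reports (`last_report_time` is still `None`), and completion always reports, so short jobs still produce at least one progress update.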
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | report_progress | def report_progress(self) -> bool:with self._lock:return self._report_progress() | 1 | 3 | 1 | 18 | 0 | 289 | 291 | 289 | self | [] | bool | {"Return": 1, "With": 1} | 1 | 3 | 1 | ["self._report_progress"] | 0 | [] | The function (report_progress) is defined within the public class called ProgressTracker. The function starts at line 289 and ends at 291. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a bool. It makes 1 function call: ["self._report_progress"]. |
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | _get_progress_report_metadata | def _get_progress_report_metadata(self) -> ProgressReportMetadata:completed_bytes = self.processed_bytes + self.skipped_bytespercentage = round(completed_bytes / self.total_bytes * 100 if self.total_bytes > 0 else 0, 1)seconds_since_last_report = round(time.perf_counter() - self.last_report_time if self.last_report_time else 0, 2)transfer_rate = ((self.processed_bytes - self.last_report_processed_bytes) / seconds_since_last_reportif seconds_since_last_report > 0else 0)transfer_rate_name = "Transfer rate"if self.status == ProgressStatus.PREPARING_IN_PROGRESS:transfer_rate_name = "Hashing speed"progress_message = (f"{self.status.verb_in_message}"f" {human_readable_file_size(completed_bytes)} / {human_readable_file_size(self.total_bytes)}"f" of {self.total_files} file{'' if self.total_files == 1 else 's'}"f" ({transfer_rate_name}: {human_readable_file_size(int(transfer_rate))}/s)") | 5 | 28 | 1 | 130 | 0 | 293 | 322 | 293 | self | [] | ProgressReportMetadata | {"Assign": 7, "If": 1, "Return": 1} | 8 | 30 | 8 | ["round", "round", "time.perf_counter", "human_readable_file_size", "human_readable_file_size", "human_readable_file_size", "int", "ProgressReportMetadata"] | 0 | [] | The function (_get_progress_report_metadata) is defined within the public class called ProgressTracker. The function starts at line 293 and ends at 322. It contains 28 lines of code and has a cyclomatic complexity of 5. It takes 1 parameter (self) and returns a ProgressReportMetadata. It makes 8 function calls: ["round", "round", "time.perf_counter", "human_readable_file_size", "human_readable_file_size", "human_readable_file_size", "int", "ProgressReportMetadata"]. |
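The `_get_progress_report_metadata` row above guards two divisions: percentage against an empty job (`total_bytes == 0`) and transfer rate against zero elapsed time on the first report. A small sketch of those two guards (helper names are mine, not the library's):

```python
def progress_percentage(completed_bytes: int, total_bytes: int) -> float:
    # Guard against an empty job: with no bytes to process, report 0%.
    return round(completed_bytes / total_bytes * 100, 1) if total_bytes > 0 else 0.0


def transfer_rate(bytes_since_last: int, seconds_since_last: float) -> float:
    # Zero (or unset) elapsed time would divide by zero on the first report.
    return bytes_since_last / seconds_since_last if seconds_since_last > 0 else 0.0
```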
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | get_summary_statistics | def get_summary_statistics(self) -> SummaryStatistics:"""Returns the summary statistics of hashing or upload operation."""transfer_rate = self.processed_bytes / self.total_time if self.total_time else 0.0return SummaryStatistics(total_time=self.total_time,total_files=self.total_files,total_bytes=self.total_bytes,processed_files=self.processed_files,processed_bytes=self.processed_bytes,skipped_files=self.skipped_files,skipped_bytes=self.skipped_bytes,transfer_rate=transfer_rate,) | 2 | 12 | 1 | 75 | 0 | 324 | 339 | 324 | self | [] | SummaryStatistics | {"Assign": 1, "Expr": 1, "Return": 1} | 1 | 16 | 1 | ["SummaryStatistics"] | 0 | [] | The function (get_summary_statistics) is defined within the public class called ProgressTracker. The function starts at line 324 and ends at 339. It contains 12 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (self) and returns a SummaryStatistics. It makes 1 function call: ["SummaryStatistics"]. |
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | get_download_summary_statistics | def get_download_summary_statistics(self,downloaded_files_paths_by_root: dict[str, list[str]],) -> DownloadSummaryStatistics:"""Returns the summary statistics of download operation."""summary_statistics_dict = asdict(self.get_summary_statistics())summary_statistics_dict["file_counts_by_root_directory"] = {root: len(paths) for root, paths in downloaded_files_paths_by_root.items()}return DownloadSummaryStatistics(**summary_statistics_dict) | 2 | 9 | 2 | 60 | 0 | 341 | 352 | 341 | self,downloaded_files_paths_by_root | [] | DownloadSummaryStatistics | {"Assign": 2, "Expr": 1, "Return": 1} | 5 | 12 | 5 | ["asdict", "self.get_summary_statistics", "len", "downloaded_files_paths_by_root.items", "DownloadSummaryStatistics"] | 0 | [] | The function (get_download_summary_statistics) is defined within the public class called ProgressTracker. The function starts at line 341 and ends at 352. It contains 9 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters, represented as [341.0], and returns a DownloadSummaryStatistics. It makes 5 function calls: ["asdict", "self.get_summary_statistics", "len", "downloaded_files_paths_by_root.items", "DownloadSummaryStatistics"]. |
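The `get_download_summary_statistics` row above extends a base statistics object into a subclass by round-tripping through `dataclasses.asdict` and `**`-expansion. A self-contained sketch of the same pattern with simplified stand-in dataclasses (field names other than `file_counts_by_root_directory` are illustrative):

```python
from dataclasses import asdict, dataclass, field


@dataclass
class SummaryStats:
    # Simplified stand-in for the library's SummaryStatistics.
    total_files: int = 0
    total_bytes: int = 0


@dataclass
class DownloadSummaryStats(SummaryStats):
    # Subclass adds one extra field on top of the base statistics.
    file_counts_by_root_directory: dict = field(default_factory=dict)


def to_download_summary(stats: SummaryStats,
                        files_by_root: dict) -> DownloadSummaryStats:
    # Flatten the base dataclass to a dict, add the per-root file counts,
    # then expand the dict into the subclass constructor.
    d = asdict(stats)
    d["file_counts_by_root_directory"] = {
        root: len(paths) for root, paths in files_by_root.items()
    }
    return DownloadSummaryStats(**d)
```

This works because every base-class field name matches a constructor parameter of the subclass, so the expanded dict maps cleanly onto `DownloadSummaryStats(**d)`.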
aws-deadline_deadline-cloud | ProgressTracker | public | 0 | 0 | _log_progress_message | def _log_progress_message(self) -> None:"""Logs progress message to the logger (if exists) on specific conditions:1. when the `log_interval` time has passed since the last call, (or since the process started,) or,2. when the tracked progress percentage difference from the last log exceeds the `log_percentage_threshold`, or3. when the process is fully completed (progress percentage reaches 100%)."""if self.logger is None:returncurrent_time = time.perf_counter()current_completed_bytes = self.processed_bytes + self.skipped_bytesprogress_difference = ((current_completed_bytes - self.last_logged_completed_bytes) / self.total_bytes * 100)if (self.last_logged_time is Noneor current_time - self.last_logged_time >= self.log_intervalor progress_difference >= self.log_percentage_thresholdor self.processed_files + self.skipped_files == self.total_files):self.logger.info(self._get_progress_report_metadata().progressMessage)self.last_logged_completed_bytes = current_completed_bytesself.last_logged_time = current_time | 6 | 17 | 1 | 110 | 0 | 354 | 380 | 354 | self | [] | None | {"Assign": 5, "Expr": 2, "If": 2, "Return": 1} | 3 | 27 | 3 | ["time.perf_counter", "self.logger.info", "self._get_progress_report_metadata"] | 0 | [] | The function (_log_progress_message) is defined within the public class called ProgressTracker. The function starts at line 354 and ends at 380. It contains 17 lines of code and has a cyclomatic complexity of 6. It takes 1 parameter (self) and does not return any value. It makes 3 function calls: ["time.perf_counter", "self.logger.info", "self._get_progress_report_metadata"]. |
aws-deadline_deadline-cloud | S3AssetUploader | public | 0 | 0 | __init__ | def __init__(self,session: Optional[boto3.Session] = None,) -> None:if session is None:self._session = get_boto3_session()else:self._session = sessiontry:# The small file threshold is the chunk size multiplied by the small file threshold multiplier.small_file_threshold_multiplier = int(config_file.get_setting("settings.small_file_threshold_multiplier"))self.small_file_threshold = (S3_MULTIPART_UPLOAD_CHUNK_SIZE * small_file_threshold_multiplier)s3_max_pool_connections = int(config_file.get_setting("settings.s3_max_pool_connections"))self.num_upload_workers = int(s3_max_pool_connections/ min(small_file_threshold_multiplier, S3_UPLOAD_MAX_CONCURRENCY))if self.num_upload_workers <= 0:# This can result in triggering "Connection pool is full" warning messages during uploads.self.num_upload_workers = 1except ValueError as ve:raise AssetSyncError("Failed to parse configuration settings. Please ensure that the following settings in the config file are integers: ""'s3_max_pool_connections', 'small_file_threshold_multiplier'") from veself._s3 = get_s3_client(self._session)# pylint: disable=invalid-name# Confirm that the settings values are all positive.error_msg = ""if small_file_threshold_multiplier <= 0:error_msg = f"'small_file_threshold_multiplier' ({small_file_threshold_multiplier}) must be positive integer."elif s3_max_pool_connections <= 0:error_msg = (f"'s3_max_pool_connections' ({s3_max_pool_connections}) must be positive integer.")if error_msg:raise AssetSyncError("Nonvalid value for configuration setting: " + error_msg) | 7 | 39 | 2 | 154 | 0 | 100 | 145 | 100 | self,session | [] | None | {"Assign": 11, "If": 5, "Try": 1} | 10 | 46 | 10 | ["get_boto3_session", "int", "config_file.get_setting", "int", "config_file.get_setting", "int", "min", "AssetSyncError", "get_s3_client", "AssetSyncError"] | 4,993 | 
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"] | The function (__init__) defined within the public class called S3AssetUploader.The function start at line 100 and ends at 145. It contains 39 lines of code and it has a cyclomatic complexity of 7. It takes 2 parameters, represented as [100.0] and does not return any value. It declares 10.0 functions, It has 10.0 functions called inside which are ["get_boto3_session", "int", "config_file.get_setting", "int", "config_file.get_setting", "int", "min", "AssetSyncError", "get_s3_client", "AssetSyncError"], It has 4993.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.logging_py.LoggerHandler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16379211_lorcalhost_btb_manager_telegram.btb_manager_telegram.schedule_py.TgScheduler.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.AmountMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.CalculatedAmountDiscrepancyError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ExchangeRateMissingError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.InvalidTransactionError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.ParsingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.PriceMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.QuantityNotPositiveError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.SymbolMissingError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedColumnCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.exceptions_py.UnexpectedRowCountError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.raw_py.RawTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_equity_award_json_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.schwab_py.SchwabTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.trading212_py.Trading212Transaction.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.parsers.vanguard_py.VanguardTransaction.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.ExporterError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordFileDoesNotExistError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.config.exceptions_py.FritzPasswordTooLongError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.DeviceInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HomeAutomation.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostInfo.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.HostNumberOfEntries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.fritzexporter.fritzcapabilities_py.LanInterfaceConfig.__init__"]. |
aws-deadline_deadline-cloud | S3AssetUploader | public | 0 | 0 | upload_assets | def upload_assets(self,job_attachment_settings: JobAttachmentS3Settings,manifest: BaseAssetManifest,source_root: Path,partial_manifest_prefix: Optional[str] = None,file_system_location_name: Optional[str] = None,progress_tracker: Optional[ProgressTracker] = None,s3_check_cache_dir: Optional[str] = None,manifest_write_dir: Optional[str] = None,manifest_name_suffix: str = "input",manifest_metadata: dict[str, dict[str, str]] = dict(),manifest_file_name: Optional[str] = None,asset_root: Optional[Path] = None,) -> tuple[str, str]:"""Uploads assets based off of an asset manifest, uploads the asset manifest.Args:manifest: The asset manifest to upload.partial_manifest_prefix: The (partial) key prefix to use for uploading the manifest to S3, excluding the initial section "<root-prefix>/Manifest/".e.g. "farm-1234/queue-1234/Inputs/<some-guid>"source_root: The local root path of the assets.job_attachment_settings: The settings for the job attachment configured in Queue.progress_tracker: Optional progress tracker to track progress.manifest_name_suffix: Suffix for given manifest naming.manifest_metadata: File metadata for given manifest to be uploaded.manifest_file_name: Optional file name for given manifest to be uploaded, otherwise use default name.asset_root: The root in which asset actually in to facilitate path mapping.Returns:A tuple of (the partial key for the manifest on S3, the hash of input manifest)."""# Upload asset manifest(hash_alg, manifest_bytes, manifest_name) = S3AssetUploader._gather_upload_metadata(manifest=manifest,source_root=source_root,file_system_location_name=file_system_location_name,manifest_name_suffix=manifest_name_suffix,)manifest_name = manifest_file_name if manifest_file_name else manifest_nameif partial_manifest_prefix:partial_manifest_key = _join_s3_paths(partial_manifest_prefix, manifest_name)else:partial_manifest_key = manifest_namefull_manifest_key = 
job_attachment_settings.add_root_and_manifest_folder_prefix(partial_manifest_key)if manifest_write_dir:self._write_local_manifest(manifest_write_dir,manifest_name,full_manifest_key,manifest,)if partial_manifest_prefix:self.upload_bytes_to_s3(bytes=BytesIO(manifest_bytes),bucket=job_attachment_settings.s3BucketName,key=full_manifest_key,extra_args=manifest_metadata,)# Verify S3 hash cache integrity, and reset cache if cached files are missingif not self.verify_hash_cache_integrity(s3_check_cache_dir,manifest,job_attachment_settings.full_cas_prefix(),job_attachment_settings.s3BucketName,):self.reset_s3_check_cache(s3_check_cache_dir)# Upload assetsself.upload_input_files(manifest=manifest,s3_bucket=job_attachment_settings.s3BucketName,source_root=asset_root if asset_root else source_root,s3_cas_prefix=job_attachment_settings.full_cas_prefix(),progress_tracker=progress_tracker,s3_check_cache_dir=s3_check_cache_dir,)return (partial_manifest_key, hash_data(manifest_bytes, hash_alg)) | 7 | 59 | 11 | 296 | 0 | 147 | 235 | 147 | self,job_attachment_settings,manifest,source_root,partial_manifest_prefix,file_system_location_name,progress_tracker,s3_check_cache_dir,manifest_write_dir,manifest_name_suffix,manifest_metadata,manifest_file_name,asset_root | [] | tuple[str, str] | {"Assign": 5, "Expr": 5, "If": 4, "Return": 1} | 13 | 89 | 13 | ["dict", "S3AssetUploader._gather_upload_metadata", "_join_s3_paths", "job_attachment_settings.add_root_and_manifest_folder_prefix", "self._write_local_manifest", "self.upload_bytes_to_s3", "BytesIO", "self.verify_hash_cache_integrity", "job_attachment_settings.full_cas_prefix", "self.reset_s3_check_cache", "self.upload_input_files", "job_attachment_settings.full_cas_prefix", "hash_data"] | 0 | [] | The function (upload_assets) defined within the public class called S3AssetUploader.The function start at line 147 and ends at 235. It contains 59 lines of code and it has a cyclomatic complexity of 7. 
It takes 11 parameters, represented as [147.0], and returns a tuple[str, str]. It calls 13 functions, which are ["dict", "S3AssetUploader._gather_upload_metadata", "_join_s3_paths", "job_attachment_settings.add_root_and_manifest_folder_prefix", "self._write_local_manifest", "self.upload_bytes_to_s3", "BytesIO", "self.verify_hash_cache_integrity", "job_attachment_settings.full_cas_prefix", "self.reset_s3_check_cache", "self.upload_input_files", "job_attachment_settings.full_cas_prefix", "hash_data"]. |
aws-deadline_deadline-cloud | S3AssetUploader | public | 0 | 0 | _snapshot_assets | def _snapshot_assets(self,snapshot_dir: Path,manifest: BaseAssetManifest,source_root: Path,partial_manifest_prefix: Optional[str] = None,file_system_location_name: Optional[str] = None,progress_tracker: Optional[ProgressTracker] = None,manifest_name_suffix: str = "input",manifest_file_name: Optional[str] = None,asset_root: Optional[Path] = None,) -> tuple[str, str]:"""Snapshots assets based off of an asset manifest, snapshots the asset manifest. The resultis a directory structure in snapshot_dir that matches the job attachments prefix layout.Args:snapshot_dir: The directory in which to place the data and manifest snapshots.manifest: The asset manifest to upload.partial_manifest_prefix: The (partial) key prefix to use for uploading the manifest to S3, excluding the initial section "<root-prefix>/Manifest/".e.g. "farm-1234/queue-1234/Inputs/<some-guid>"source_root: The local root path of the assets.progress_tracker: Optional progress tracker to track progress.manifest_name_suffix: Suffix for given manifest naming.manifest_metadata: File metadata for given manifest to be uploaded.manifest_file_name: Optional file name for given manifest to be uploaded, otherwise use default name.asset_root: The root in which asset actually in to facilitate path mapping.Returns:A tuple of (the partial key for the manifest in the snapshot, the hash of input manifest)."""# Snapshot asset manifest(hash_alg, manifest_bytes, manifest_name) = S3AssetUploader._gather_upload_metadata(manifest=manifest,source_root=source_root,file_system_location_name=file_system_location_name,manifest_name_suffix=manifest_name_suffix,)manifest_name = manifest_file_name if manifest_file_name else manifest_nameif partial_manifest_prefix:partial_manifest_key = _join_s3_paths(partial_manifest_prefix, manifest_name)else:partial_manifest_key = manifest_namemanifest_file_path = snapshot_dir / S3_MANIFEST_FOLDER_NAME / 
partial_manifest_keyos.makedirs(_get_long_path_compatible_path(manifest_file_path.parent), exist_ok=True)with open(_get_long_path_compatible_path(manifest_file_path), "wb") as fh:fh.write(manifest_bytes)# Snapshot assetsself._snapshot_input_files(snapshot_dir=snapshot_dir,manifest=manifest,source_root=asset_root if asset_root else source_root,progress_tracker=progress_tracker,)return (partial_manifest_key, hash_data(manifest_bytes, hash_alg)) | 4 | 34 | 10 | 206 | 0 | 237 | 297 | 237 | self,snapshot_dir,manifest,source_root,partial_manifest_prefix,file_system_location_name,progress_tracker,manifest_name_suffix,manifest_file_name,asset_root | [] | tuple[str, str] | {"Assign": 5, "Expr": 4, "If": 1, "Return": 1, "With": 1} | 9 | 61 | 9 | ["S3AssetUploader._gather_upload_metadata", "_join_s3_paths", "os.makedirs", "_get_long_path_compatible_path", "open", "_get_long_path_compatible_path", "fh.write", "self._snapshot_input_files", "hash_data"] | 0 | [] | The function (_snapshot_assets) is defined within the public class called S3AssetUploader. The function starts at line 237 and ends at line 297. It contains 34 lines of code and has a cyclomatic complexity of 4. It takes 10 parameters, represented as [237.0], and returns a tuple[str, str]. It calls 9 functions, which are ["S3AssetUploader._gather_upload_metadata", "_join_s3_paths", "os.makedirs", "_get_long_path_compatible_path", "open", "_get_long_path_compatible_path", "fh.write", "self._snapshot_input_files", "hash_data"]. |
aws-deadline_deadline-cloud | S3AssetUploader | public | 0 | 0 | _gather_upload_metadata | def _gather_upload_metadata(manifest: BaseAssetManifest,source_root: Path,manifest_name_suffix: str,# TODO - remove file_system_location_name after ASSET_SYNC_JOB_USER_FEATURE completionfile_system_location_name: Optional[str] = None,) -> tuple[HashAlgorithm, bytes, str]:"""Gathers metadata information of manifest to be used for writing the local manifest"""hash_alg = manifest.get_default_hash_alg()manifest_bytes = manifest.encode().encode("utf-8")# Converting Path to str uses OS-specific separators (\ vs /), which can produce different hashes across OSmanifest_name_prefix = hash_data(str(source_root).encode(), hash_alg)manifest_name = f"{manifest_name_prefix}_{manifest_name_suffix}"return (hash_alg, manifest_bytes, manifest_name) | 1 | 11 | 4 | 81 | 0 | 300 | 316 | 300 | manifest,source_root,manifest_name_suffix,file_system_location_name | [] | tuple[HashAlgorithm, bytes, str] | {"Assign": 4, "Expr": 1, "Return": 1} | 6 | 17 | 6 | ["manifest.get_default_hash_alg", "encode", "manifest.encode", "hash_data", "encode", "str"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader._snapshot_assets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader.upload_assets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_upload_py.TestUpload.test_gather_upload_metadata"] | The function (_gather_upload_metadata) is defined within the public class called S3AssetUploader. The function starts at line 300 and ends at line 316. It contains 11 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters, represented as [300.0], and returns a tuple[HashAlgorithm, bytes, str].
It calls 6 functions, which are ["manifest.get_default_hash_alg", "encode", "manifest.encode", "hash_data", "encode", "str"], and has 3 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader._snapshot_assets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.src.deadline.job_attachments.upload_py.S3AssetUploader.upload_assets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.test.unit.deadline_job_attachments.test_upload_py.TestUpload.test_gather_upload_metadata"]. |
aws-deadline_deadline-cloud | S3AssetUploader | public | 0 | 0 | _write_local_manifest | def _write_local_manifest(self,manifest_write_dir: str,manifest_name: str,full_manifest_key: str,manifest: BaseAssetManifest,root_dir_name: Optional[str] = None,) -> None:"""Writes a manifest file locally in a 'manifests' sub-directory.Also creates/appends to a file mapping the local manifest name to the full S3 key in the same directory."""self._write_local_input_manifest(manifest_write_dir, manifest_name, manifest, root_dir_name)self._write_local_manifest_s3_mapping(manifest_write_dir, manifest_name, full_manifest_key) | 1 | 10 | 6 | 56 | 0 | 318 | 332 | 318 | self,manifest_write_dir,manifest_name,full_manifest_key,manifest,root_dir_name | [] | None | {"Expr": 3} | 2 | 15 | 2 | ["self._write_local_input_manifest", "self._write_local_manifest_s3_mapping"] | 0 | [] | The function (_write_local_manifest) is defined within the public class called S3AssetUploader. The function starts at line 318 and ends at line 332. It contains 10 lines of code and has a cyclomatic complexity of 1. It takes 6 parameters, represented as [318.0], and does not return any value. It calls 2 functions, which are ["self._write_local_input_manifest", "self._write_local_manifest_s3_mapping"]. |
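Several rows above (upload_assets, _snapshot_assets, _gather_upload_metadata) share the same manifest-naming and key-joining logic: hash the stringified source root to build a manifest name, then optionally prepend a partial S3 prefix. A minimal sketch of that logic, assuming sha256 as a stand-in for the manifest's default hash algorithm (the real code asks the manifest via get_default_hash_alg) and assuming _join_s3_paths simply joins key components with "/":

```python
import hashlib
from typing import Optional


def manifest_name_for(source_root: str, suffix: str = "input") -> str:
    # Assumption: sha256 stands in for the manifest's default hash algorithm.
    # Note from the source row: converting Path to str uses OS-specific
    # separators, so the same tree can hash differently across OSes.
    prefix = hashlib.sha256(source_root.encode()).hexdigest()
    return f"{prefix}_{suffix}"


def join_s3_paths(*parts: str) -> str:
    # Assumption: S3 key components are joined with forward slashes.
    return "/".join(parts)


def partial_manifest_key(manifest_name: str, prefix: Optional[str]) -> str:
    # Mirrors the branch in upload_assets/_snapshot_assets: only prepend
    # the partial prefix when one was supplied.
    return join_s3_paths(prefix, manifest_name) if prefix else manifest_name
```

For example, `partial_manifest_key(manifest_name_for("/proj/assets"), "farm-1234/queue-1234/Inputs/some-guid")` yields a key of the form `farm-1234/queue-1234/Inputs/some-guid/<sha256>_input`.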
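The _snapshot_assets row shows the manifest being written under snapshot_dir / S3_MANIFEST_FOLDER_NAME / partial_manifest_key so the snapshot mirrors the job-attachments prefix layout. A hedged sketch of just that directory-layout step; the value of S3_MANIFEST_FOLDER_NAME is an assumption here, and the Windows long-path wrapper _get_long_path_compatible_path from the source is omitted:

```python
import os
from pathlib import Path

# Assumption: the real constant's value may differ from "Manifests".
S3_MANIFEST_FOLDER_NAME = "Manifests"


def snapshot_manifest(snapshot_dir: Path, partial_key: str, manifest_bytes: bytes) -> Path:
    """Write manifest bytes into the snapshot tree, creating parent dirs as needed."""
    path = snapshot_dir / S3_MANIFEST_FOLDER_NAME / partial_key
    # exist_ok=True matches the os.makedirs call in the source row.
    os.makedirs(path.parent, exist_ok=True)
    path.write_bytes(manifest_bytes)
    return path
```

The original opens the file in "wb" mode and writes the UTF-8 encoded manifest; `Path.write_bytes` is an equivalent shorthand.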