project_name
string
class_name
string
class_modifiers
string
class_implements
int64
class_extends
int64
function_name
string
function_body
string
cyclomatic_complexity
int64
NLOC
int64
num_parameter
int64
num_token
int64
num_variable
int64
start_line
int64
end_line
int64
function_index
int64
function_params
string
function_variable
string
function_return_type
string
function_body_line_type
string
function_num_functions
int64
function_num_lines
int64
outgoing_function_count
int64
outgoing_function_names
string
incoming_function_count
int64
incoming_function_names
string
lexical_representation
string
unifyai_unify
BaseCache
public
1
1
set_cache_name
def set_cache_name(cls, name: str) -> None:"""Set the cache identifier/name."""
1
1
2
12
0
75
76
75
cls,name
[]
None
{"Expr": 1}
0
2
0
[]
0
[]
The function (set_cache_name) is defined within the public class BaseCache, which implements an interface and inherits from another class. The function starts at line 75 and ends at line 76. It contains 1 line of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, name) and does not return any value.
unifyai_unify
BaseCache
public
1
1
get_cache_name
def get_cache_name(cls) -> str:"""Get the current cache identifier/name."""
1
1
1
8
0
80
81
80
cls
[]
str
{"Expr": 1}
0
2
0
[]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.proxy_py.ProxyTests.test_proxy"]
The function (get_cache_name) is defined within the public class BaseCache, which implements an interface and inherits from another class. The function starts at line 80 and ends at line 81. It contains 1 line of code and has a cyclomatic complexity of 1. It takes 1 parameter (cls) and returns a str. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.proxy_py.ProxyTests.test_proxy"].
unifyai_unify
BaseCache
public
1
1
store_entry
def store_entry(cls,*,key: str,value: Any,res_types: Optional[Dict[str, Any]] = None,) -> None:"""Store a key-value pair in the cache."""
1
7
4
33
0
85
92
85
cls,key,value,res_types
[]
None
{"Expr": 1}
0
8
0
[]
0
[]
The function (store_entry) is defined within the public class BaseCache, which implements an interface and inherits from another class. The function starts at line 85 and ends at line 92. It contains 7 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (cls, key, value, res_types) and does not return any value.
unifyai_unify
BaseCache
public
1
1
initialize_cache
def initialize_cache(cls, name: str = None) -> None:"""Initialize or load the cache from storage."""
1
1
2
14
0
96
97
96
cls,name
[]
None
{"Expr": 1}
0
2
0
[]
0
[]
The function (initialize_cache) is defined within the public class BaseCache, which implements an interface and inherits from another class. The function starts at line 96 and ends at line 97. It contains 1 line of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, name) and does not return any value.
unifyai_unify
BaseCache
public
1
1
list_keys
def list_keys(cls) -> List[str]:"""Get a list of all cache keys."""
1
1
1
11
0
101
102
101
cls
[]
List[str]
{"Expr": 1}
0
2
0
[]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._s3_py.delete_objects"]
The function (list_keys) is defined within the public class BaseCache, which implements an interface and inherits from another class. The function starts at line 101 and ends at line 102. It contains 1 line of code and has a cyclomatic complexity of 1. It takes 1 parameter (cls) and returns a List[str]. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._s3_py.delete_objects"].
unifyai_unify
BaseCache
public
1
1
retrieve_entry
def retrieve_entry(cls, key: str) -> tuple[Optional[Any], Optional[Dict[str, Any]]]:
    """Retrieve a value from the cache.

    Returns:
        Tuple of (value, res_types) or (None, None) if not found
    """
1
1
2
28
0
106
112
106
cls,key
[]
tuple[Optional[Any], Optional[Dict[str, Any]]]
{"Expr": 1}
0
7
0
[]
0
[]
The function (retrieve_entry) is defined within the public class BaseCache, which implements an interface and inherits from another class. The function starts at line 106 and ends at line 112. It contains 1 line of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, key) and returns a tuple[Optional[Any], Optional[Dict[str, Any]]].
unifyai_unify
BaseCache
public
1
1
has_key
def has_key(cls, key: str) -> bool:"""Check if a key exists in the cache."""
1
1
2
12
0
116
117
116
cls,key
[]
bool
{"Expr": 1}
0
2
0
[]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.handle_tag", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.o", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.previousIndex"]
The function (has_key) is defined within the public class BaseCache, which implements an interface and inherits from another class. The function starts at line 116 and ends at line 117. It contains 1 line of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, key) and returns a bool. It has 3 functions calling it, which are listed in the incoming_function_names field above.
unifyai_unify
BaseCache
public
1
1
remove_entry
def remove_entry(cls, key: str) -> None:"""Remove an entry from the cache."""
1
1
2
12
0
121
122
121
cls,key
[]
None
{"Expr": 1}
0
2
0
[]
0
[]
The function (remove_entry) is defined within the public class BaseCache, which implements an interface and inherits from another class. The function starts at line 121 and ends at line 122. It contains 1 line of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, key) and does not return any value.
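Taken together, the BaseCache records above describe a classmethod interface: set_cache_name, get_cache_name, store_entry, initialize_cache, list_keys, retrieve_entry, has_key, and remove_entry. A minimal in-memory subclass is sketched below; the class name InMemoryCache and its dict-backed storage are hypothetical illustrations, not part of the unify source.

```python
from typing import Any, Dict, List, Optional, Tuple


class InMemoryCache:
    """Hypothetical in-memory implementation of the BaseCache interface."""

    _name: str = "default"
    _cache: Dict[str, Dict[str, Any]] = {}

    @classmethod
    def set_cache_name(cls, name: str) -> None:
        cls._name = name

    @classmethod
    def get_cache_name(cls) -> str:
        return cls._name

    @classmethod
    def store_entry(
        cls, *, key: str, value: Any,
        res_types: Optional[Dict[str, Any]] = None,
    ) -> None:
        # Store the value alongside its optional type registry.
        cls._cache[key] = {"value": value, "res_types": res_types}

    @classmethod
    def retrieve_entry(
        cls, key: str,
    ) -> Tuple[Optional[Any], Optional[Dict[str, Any]]]:
        entry = cls._cache.get(key)
        if entry is None:
            return None, None
        return entry["value"], entry["res_types"]

    @classmethod
    def list_keys(cls) -> List[str]:
        return list(cls._cache.keys())

    @classmethod
    def has_key(cls, key: str) -> bool:
        return key in cls._cache

    @classmethod
    def remove_entry(cls, key: str) -> None:
        cls._cache.pop(key, None)
```

Because every method is a classmethod, the cache is shared process-wide without instantiating the class, matching the cls-based signatures recorded above.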
unifyai_unify
CacheStats
public
0
0
get_percentage_of_cache_hits
def get_percentage_of_cache_hits(self) -> float:
    if self.reads == 0:
        return 0.0
    return self.hits / self.reads * 100
2
4
1
28
0
14
17
14
self
[]
float
{"If": 1, "Return": 2}
0
4
0
[]
0
[]
The function (get_percentage_of_cache_hits) is defined within the public class CacheStats. The function starts at line 14 and ends at line 17. It contains 4 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (self) and returns a float.
unifyai_unify
CacheStats
public
0
0
get_percentage_of_cache_misses
def get_percentage_of_cache_misses(self) -> float:
    if self.reads == 0:
        return 0.0
    return self.misses / self.reads * 100
2
4
1
28
0
19
22
19
self
[]
float
{"If": 1, "Return": 2}
0
4
0
[]
0
[]
The function (get_percentage_of_cache_misses) is defined within the public class CacheStats. The function starts at line 19 and ends at line 22. It contains 4 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (self) and returns a float.
unifyai_unify
CacheStats
public
0
0
__repr__
def __repr__(self) -> str:return f"CacheStats(hits={self.hits} ({self.get_percentage_of_cache_hits():.1f}%), misses={self.misses} ({self.get_percentage_of_cache_misses():.1f}%), reads={self.reads}, writes={self.writes})"
1
2
1
10
0
24
25
24
self
[]
str
{"Return": 1}
2
2
2
["self.get_percentage_of_cache_hits", "self.get_percentage_of_cache_misses"]
22
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3678342_kieranjol_ifiscripts.Objects_py.OtherNSElementList.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3701004_elfi_dev_elfi.elfi.model.elfi_model_py.RandomVariable.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3916301_pythonanywhere_helper_scripts.tests.test_files_py.TestPAPathInit.test_repr_returns_url_property_value", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3916301_pythonanywhere_helper_scripts.tests.test_files_py.TestPAPathInit.test_url_property_contains_correct_pythonanywhere_resource_url_for_instantiated_path", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3954802_petertodd_python_bitcoinlib.bitcoin.core.key_py.CPubKey.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957920_mahmoud_boltons.boltons.cacheutils_py.LRI.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957920_mahmoud_boltons.boltons.excutils_py.ExceptionCauseMixin._get_trace_str", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957920_mahmoud_boltons.boltons.excutils_py._TBItem.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957920_mahmoud_boltons.boltons.tbutils_py.Callpoint.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964685_pallets_markupsafe.src.markupsafe.__init___py.Markup.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967182_asarnow_pyem.pyem.util.util_py.chimera_xform2str", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3969490_pyro_ppl_funsor.funsor.cnf_py.Contraction.__repr__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69753201_searxng_searx_instances.searxinstances.model_py.AdditionalUrlList.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69906343_ome_omero_py.src.omero.gateway.__init___py.BlitzObjectWrapper.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69906343_ome_omero_py.src.omero.gateway.utils_py.ServiceOptsDict.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69906343_ome_omero_py.src.omero.util.concurrency_py.AtExitEvent.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69906343_ome_omero_py.src.omero_ext.path_py.path.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70555013_embeddings_benchmark_mteb.mteb.overview_py.MTEBTasks.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.77620428_lqhuang_mode_ng.src.mode.utils.locks_py.Event.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.78313907_openzim_python_scraperlib.tests.i18n.test_i18n_py.test_lang_repr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94562214_home_assistant_core.homeassistant.components.bond.utils_py.BondDevice.__repr__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95079799_mozilla_mozci.mozci.push_py.Push.__repr__"]
The function (__repr__) is defined within the public class CacheStats. The function starts at line 24 and ends at line 25. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a str. It calls 2 functions inside, which are ["self.get_percentage_of_cache_hits", "self.get_percentage_of_cache_misses"]. It has 22 functions calling it, which are listed in the incoming_function_names field above.
unifyai_unify
CacheStats
public
0
0
__add__
def __add__(self, other: "CacheStats") -> "CacheStats":return CacheStats(hits=self.hits + other.hits,misses=self.misses + other.misses,reads=self.reads + other.reads,writes=self.writes + other.writes,)
1
7
2
55
0
27
33
27
self,other
[]
'CacheStats'
{"Return": 1}
1
7
1
["CacheStats"]
7
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3694013_python_quantities_python_quantities.quantities.quantity_py.Quantity.__add__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3694013_python_quantities_python_quantities.quantities.uncertainquantity_py.UncertainQuantity.__add__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3694013_python_quantities_python_quantities.quantities.unitquantity_py.UnitQuantity.__add__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3954802_petertodd_python_bitcoinlib.bitcoin.core.script_py.CScript.__add__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964685_pallets_markupsafe.src.markupsafe.__init___py.Markup.__add__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964685_pallets_markupsafe.src.markupsafe.__init___py.Markup.__radd__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69906343_ome_omero_py.src.omero_ext.path_py.path.__add__"]
The function (__add__) is defined within the public class CacheStats. The function starts at line 27 and ends at line 33. It contains 7 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, other) and returns a 'CacheStats'. It calls 1 function inside, which is ["CacheStats"]. It has 7 functions calling it, which are listed in the incoming_function_names field above.
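The CacheStats records above describe a small counter object with percentage helpers and field-wise merging. A self-contained sketch consistent with the bodies shown (the dataclass fields hits, misses, reads, and writes are taken from the code; the dataclass decorator itself is an assumption, since the class definition is not in this section):

```python
from dataclasses import dataclass


@dataclass
class CacheStats:
    hits: int = 0
    misses: int = 0
    reads: int = 0
    writes: int = 0

    def get_percentage_of_cache_hits(self) -> float:
        # Guard against division by zero when nothing has been read yet.
        if self.reads == 0:
            return 0.0
        return self.hits / self.reads * 100

    def get_percentage_of_cache_misses(self) -> float:
        if self.reads == 0:
            return 0.0
        return self.misses / self.reads * 100

    def __add__(self, other: "CacheStats") -> "CacheStats":
        # Merge two counters field by field.
        return CacheStats(
            hits=self.hits + other.hits,
            misses=self.misses + other.misses,
            reads=self.reads + other.reads,
            writes=self.writes + other.writes,
        )
```

The field-wise __add__ lets per-run statistics be summed into an aggregate with plain `+`, e.g. `total = run1_stats + run2_stats`.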
unifyai_unify
public
public
0
0
_is_cache_benchmark_enabled
def _is_cache_benchmark_enabled() -> bool:return os.environ.get("UNIFY_CACHE_BENCHMARK", "false") == "true"
1
2
0
19
0
39
40
39
[]
bool
{"Return": 1}
1
2
1
["os.environ.get"]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.cache_benchmark_py.get_cache_stats", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.cache_benchmark_py.record_get_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.cache_benchmark_py.record_write_to_cache"]
The function (_is_cache_benchmark_enabled) is defined at module level (its class field is recorded as the placeholder "public"). The function starts at line 39 and ends at line 40. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters and returns a bool. It calls 1 function inside, which is ["os.environ.get"]. It has 3 functions calling it, which are listed in the incoming_function_names field above.
unifyai_unify
public
public
0
0
get_cache_stats
def get_cache_stats() -> CacheStats:
    if not _is_cache_benchmark_enabled():
        warnings.warn(
            "Cache benchmark is not enabled, set UNIFY_CACHE_BENCHMARK=true to enable it, must be set before importing unify.",
        )
    return CURRENT_CACHE_STATS
2
6
0
21
0
43
48
43
[]
CacheStats
{"Expr": 1, "If": 1, "Return": 1}
2
6
2
["_is_cache_benchmark_enabled", "warnings.warn"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.cache_benchmark_py.record_get_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.cache_benchmark_py.record_write_to_cache"]
The function (get_cache_stats) is defined at module level (its class field is recorded as the placeholder "public"). The function starts at line 43 and ends at line 48. It contains 6 lines of code and has a cyclomatic complexity of 2. It takes no parameters and returns a CacheStats. It calls 2 functions inside, which are ["_is_cache_benchmark_enabled", "warnings.warn"]. It has 2 functions calling it, which are listed in the incoming_function_names field above.
unifyai_unify
public
public
0
0
reset_cache_stats
def reset_cache_stats() -> None:
    global CURRENT_CACHE_STATS
    CURRENT_CACHE_STATS = CacheStats()
1
3
0
13
1
51
53
51
['CURRENT_CACHE_STATS']
None
{"Assign": 1}
1
3
1
["CacheStats"]
0
[]
The function (reset_cache_stats) is defined at module level (its class field is recorded as the placeholder "public"). The function starts at line 51 and ends at line 53. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes no parameters and does not return any value. It calls 1 function inside, which is ["CacheStats"].
unifyai_unify
public
public
0
0
record_get_cache.wrapper
def wrapper(*args, **kwargs):
    benchmark = get_cache_stats()
    benchmark.reads += 1
    ret = fn(*args, **kwargs)
    if ret is None:
        benchmark.misses += 1
    else:
        benchmark.hits += 1
    return ret
2
9
2
48
0
62
70
62
null
[]
None
null
0
0
0
null
0
null
The function (record_get_cache.wrapper) is a nested function defined inside the module-level record_get_cache (its class field is recorded as the placeholder "public"). The function starts at line 62 and ends at line 70. It contains 9 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (*args, **kwargs) and returns the wrapped function's result.
unifyai_unify
public
public
0
0
record_get_cache
def record_get_cache(fn):
    if not _is_cache_benchmark_enabled():
        return fn
    else:
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            benchmark = get_cache_stats()
            benchmark.reads += 1
            ret = fn(*args, **kwargs)
            if ret is None:
                benchmark.misses += 1
            else:
                benchmark.hits += 1
            return ret
        return wrapper
2
7
1
26
2
56
72
56
fn
['ret', 'benchmark']
Returns
{"Assign": 2, "AugAssign": 3, "If": 2, "Return": 3}
4
17
4
["_is_cache_benchmark_enabled", "get_cache_stats", "fn", "functools.wraps"]
0
[]
The function (record_get_cache) is defined at module level (its class field is recorded as the placeholder "public"). The function starts at line 56 and ends at line 72. It contains 7 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (fn) and returns a value. It declares 1 nested function (wrapper) and calls 4 functions inside, which are ["_is_cache_benchmark_enabled", "get_cache_stats", "fn", "functools.wraps"].
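The record_get_cache body wraps a cache-read function, counts every call as a read, and classifies a None result as a miss and anything else as a hit. The sketch below shows the same decorator pattern in isolation; the Stats class and the lookup example function are hypothetical stand-ins for the module's env-gated CacheStats global, and the real decorator additionally returns fn unchanged when benchmarking is disabled.

```python
import functools


class Stats:
    """Simplified stand-in for the module's CacheStats global."""
    reads = hits = misses = 0


def record_get_cache(fn):
    # Count a read on every call, then classify the result:
    # None means a cache miss, anything else a cache hit.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        Stats.reads += 1
        ret = fn(*args, **kwargs)
        if ret is None:
            Stats.misses += 1
        else:
            Stats.hits += 1
        return ret
    return wrapper


TABLE = {"a": 1}  # hypothetical backing store


@record_get_cache
def lookup(key):
    return TABLE.get(key)
```

functools.wraps preserves the wrapped function's name and docstring, so instrumented cache functions remain introspectable.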
unifyai_unify
public
public
0
0
record_write_to_cache.wrapper
def wrapper(*args, **kwargs):
    benchmark = get_cache_stats()
    benchmark.writes += 1
    return fn(*args, **kwargs)
1
4
2
28
0
81
84
81
null
[]
None
null
0
0
0
null
0
null
The function (record_write_to_cache.wrapper) is a nested function defined inside the module-level record_write_to_cache (its class field is recorded as the placeholder "public"). The function starts at line 81 and ends at line 84. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (*args, **kwargs) and returns the wrapped function's result.
unifyai_unify
public
public
0
0
record_write_to_cache
def record_write_to_cache(fn):
    if not _is_cache_benchmark_enabled():
        return fn
    else:
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            benchmark = get_cache_stats()
            benchmark.writes += 1
            return fn(*args, **kwargs)
        return wrapper
2
7
1
26
1
75
86
75
fn
['benchmark']
Returns
{"Assign": 1, "AugAssign": 1, "If": 1, "Return": 3}
4
12
4
["_is_cache_benchmark_enabled", "get_cache_stats", "fn", "functools.wraps"]
0
[]
The function (record_write_to_cache) is defined at module level (its class field is recorded as the placeholder "public"). The function starts at line 75 and ends at line 86. It contains 7 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (fn) and returns a value. It declares 1 nested function (wrapper) and calls 4 functions inside, which are ["_is_cache_benchmark_enabled", "get_cache_stats", "fn", "functools.wraps"].
unifyai_unify
LocalCache
public
0
1
set_cache_name
def set_cache_name(cls, name: str) -> None:
    """Set the cache filename and reset the in-memory cache."""
    cls._cache_filename = name
    cls._cache = None  # Force reload on next access
1
3
2
22
0
28
31
28
cls,name
[]
None
{"Assign": 2, "Expr": 1}
0
4
0
[]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.helpers_py._CacheHandler.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.helpers_py._CacheHandler.__exit__"]
The function (set_cache_name) is defined within the public class LocalCache, which inherits from another class. The function starts at line 28 and ends at line 31. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, name) and does not return any value. It has 2 functions calling it, which are listed in the incoming_function_names field above.
unifyai_unify
LocalCache
public
0
1
get_cache_name
def get_cache_name(cls) -> str:
    """Get the current cache filename."""
    return cls._cache_filename
1
2
1
12
0
34
36
34
cls
[]
str
{"Expr": 1, "Return": 1}
0
3
0
[]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.proxy_py.ProxyTests.test_proxy"]
The function (get_cache_name) is defined within the public class LocalCache, which inherits from another class. The function starts at line 34 and ends at line 36. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (cls) and returns a str. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.proxy_py.ProxyTests.test_proxy"].
unifyai_unify
LocalCache
public
0
1
get_cache_filepath
def get_cache_filepath(cls, name: str = None) -> str:
    """Get the full filepath for the cache file."""
    if name is None:
        name = cls.get_cache_name()
    return os.path.join(cls._cache_dir, name)
2
4
2
39
0
39
43
39
cls,name
[]
str
{"Assign": 1, "Expr": 1, "If": 1, "Return": 1}
2
5
2
["cls.get_cache_name", "os.path.join"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.helpers_py._CacheHandler.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.helpers_py._CacheHandler.__init__"]
The function (get_cache_filepath) is defined within the public class LocalCache, which inherits from another class. The function starts at line 39 and ends at line 43. It contains 4 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (cls, name) and returns a str. It calls 2 functions inside, which are ["cls.get_cache_name", "os.path.join"]. It has 2 functions calling it, which are listed in the incoming_function_names field above.
unifyai_unify
LocalCache
public
0
1
is_enabled
def is_enabled(cls) -> bool:
    """Check if the cache is enabled."""
    return cls._enabled
1
2
1
12
0
46
48
46
cls
[]
bool
{"Expr": 1, "Return": 1}
0
3
0
[]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3663914_mozilla_releng_build_cloud_tools.cloudtools.slavealloc_py.get_classified_slaves", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3958470_robinostlund_homeassistant_volkswagencarnet.custom_components.volkswagencarnet.__init___py.async_setup_entry"]
The function (is_enabled) is defined within the public class LocalCache, which inherits from another class. The function starts at line 46 and ends at line 48. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (cls) and returns a bool. It has 2 functions calling it, which are listed in the incoming_function_names field above.
unifyai_unify
LocalCache
public
0
1
store_entry
def store_entry(
    cls,
    *,
    key: str,
    value: Any,
    res_types: Optional[Dict[str, Any]] = None,
) -> None:
    """Store a key-value pair in the cache."""
    cls._cache[key] = {"value": value, "res_types": res_types}
    with open(cls.get_cache_filepath(), "a") as f:
        _write_to_ndjson_cache(f, key, value, res_types)
1
10
4
73
0
51
61
51
cls,key,value,res_types
[]
None
{"Assign": 1, "Expr": 2, "With": 1}
3
11
3
["open", "cls.get_cache_filepath", "_write_to_ndjson_cache"]
0
[]
The function (store_entry) is defined within the public class LocalCache, which inherits from another class. The function starts at line 51 and ends at line 61. It contains 10 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (cls, key, value, res_types) and does not return any value. It calls 3 functions inside, which are ["open", "cls.get_cache_filepath", "_write_to_ndjson_cache"].
unifyai_unify
LocalCache
public
0
1
initialize_cache
def initialize_cache(cls, name: str = None) -> None:
    """Initialize or load the cache from disk."""
    cache_filepath = cls.get_cache_filepath(name)
    if cls._cache is None:
        try:
            if not os.path.exists(cache_filepath):
                with open(cache_filepath, "w") as f:
                    f.write("")
            with open(cache_filepath, "r") as f:
                cls._cache = _load_ndjson_cache(f)
        except IOError:
            # File does not exist or can't be read, reinitialize
            warnings.warn(
                f"Cache file {cache_filepath} can't be read, reinitializing",
            )
            cls._cache = {}
            with open(cache_filepath, "w") as f:
                f.write("")
4
16
2
109
0
64
83
64
cls,name
[]
None
{"Assign": 3, "Expr": 4, "If": 2, "Try": 1, "With": 3}
9
20
9
["cls.get_cache_filepath", "os.path.exists", "open", "f.write", "open", "_load_ndjson_cache", "warnings.warn", "open", "f.write"]
0
[]
The function (initialize_cache) is defined within the public class LocalCache, which inherits from another class. The function starts at line 64 and ends at line 83. It contains 16 lines of code and has a cyclomatic complexity of 4. It takes 2 parameters (cls, name) and does not return any value. It makes 9 calls inside, which are listed in the outgoing_function_names field above.
unifyai_unify
LocalCache
public
0
1
list_keys
def list_keys(cls) -> List[str]:return list(cls._cache.keys())
1
2
1
21
0
86
87
86
cls
[]
List[str]
{"Return": 1}
2
2
2
["list", "cls._cache.keys"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._s3_py.delete_objects"]
The function (list_keys) is defined within the public class LocalCache, which inherits another class. The function starts at line 86 and ends at line 87. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (cls) and returns a value of type List[str]. It makes 2 function calls inside, which are ["list", "cls._cache.keys"], and has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._s3_py.delete_objects"].
unifyai_unify
LocalCache
public
0
1
retrieve_entry
def retrieve_entry(cls, key: str) -> tuple[Optional[Any], Optional[Dict[str, Any]]]:"""Retrieve a value from the cache.Returns:Tuple of (value, type_registry) or (None, None) if not found"""if cls._cache is None:return None, Nonevalue = cls._cache.get(key)if value is None:return None, Nonedeserialized_value = json.loads(value["value"])return deserialized_value, value["res_types"]
3
8
2
76
0
90
105
90
cls,key
[]
tuple[Optional[Any], Optional[Dict[str, Any]]]
{"Assign": 2, "Expr": 1, "If": 2, "Return": 3}
2
16
2
["cls._cache.get", "json.loads"]
0
[]
The function (retrieve_entry) is defined within the public class LocalCache, which inherits another class. The function starts at line 90 and ends at line 105. It contains 8 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters (cls, key) and returns a value of type tuple[Optional[Any], Optional[Dict[str, Any]]]. It makes 2 function calls inside, which are ["cls._cache.get", "json.loads"].
unifyai_unify
LocalCache
public
0
1
has_key
def has_key(cls, key: str) -> bool:"""Check if a key exists in the cache."""return cls._cache is not None and key in cls._cache
2
2
2
25
0
108
110
108
cls,key
[]
bool
{"Expr": 1, "Return": 1}
0
3
0
[]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.handle_tag", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.o", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.previousIndex"]
The function (has_key) is defined within the public class LocalCache, which inherits another class. The function starts at line 108 and ends at line 110. It contains 2 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (cls, key) and returns a value of type bool. It has 3 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.handle_tag", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.o", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.previousIndex"].
unifyai_unify
LocalCache
public
0
1
remove_entry
def remove_entry(cls, key: str) -> None:"""Remove an entry and its res_types from the cache."""if cls._cache is not None:item = cls._cache.pop(key, None)if item is not None:with open(cls.get_cache_filepath(), "w") as f:for key, value in cls._cache.items():_write_to_ndjson_cache(f,key,value["value"],value["res_types"],)
4
12
2
82
0
113
125
113
cls,key
[]
None
{"Assign": 1, "Expr": 2, "For": 1, "If": 2, "With": 1}
5
13
5
["cls._cache.pop", "open", "cls.get_cache_filepath", "cls._cache.items", "_write_to_ndjson_cache"]
0
[]
The function (remove_entry) is defined within the public class LocalCache, which inherits another class. The function starts at line 113 and ends at line 125. It contains 12 lines of code and has a cyclomatic complexity of 4. It takes 2 parameters (cls, key) and does not return any value. It makes 5 function calls inside, which are ["cls._cache.pop", "open", "cls.get_cache_filepath", "cls._cache.items", "_write_to_ndjson_cache"].
unifyai_unify
LocalSeparateCache
public
0
1
set_cache_name
def set_cache_name(cls, name: str) -> None:"""Set the cache names and reset both caches."""cls._cache_name_read = f"{name}_read"cls._cache_name_write = f"{name}_write"cls._cache_read = None# Force reload of read cachecls._cache_write = None# Force reload of write cache
1
5
2
34
0
22
27
22
cls,name
[]
None
{"Assign": 4, "Expr": 1}
0
6
0
[]
0
[]
The function (set_cache_name) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 22 and ends at line 27. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, name) and does not return any value.
unifyai_unify
LocalSeparateCache
public
0
1
get_cache_name
def get_cache_name(cls) -> str:"""Get the current read cache name."""return cls._cache_name_read
1
2
1
12
0
30
32
30
cls
[]
str
{"Expr": 1, "Return": 1}
0
3
0
[]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.proxy_py.ProxyTests.test_proxy"]
The function (get_cache_name) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 30 and ends at line 32. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (cls) and returns a value of type str. It has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.proxy_py.ProxyTests.test_proxy"].
unifyai_unify
LocalSeparateCache
public
0
1
get_cache_filepath
def get_cache_filepath(cls, name: str = None) -> str:"""Get the full filepath for the cache file."""if name is None:name = cls.get_cache_name()return os.path.join(cls._cache_dir, name)
2
4
2
39
0
35
39
35
cls,name
[]
str
{"Assign": 1, "Expr": 1, "If": 1, "Return": 1}
2
5
2
["cls.get_cache_name", "os.path.join"]
0
[]
The function (get_cache_filepath) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 35 and ends at line 39. It contains 4 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (cls, name) and returns a value of type str. It makes 2 function calls inside, which are ["cls.get_cache_name", "os.path.join"].
unifyai_unify
LocalSeparateCache
public
0
1
is_enabled
def is_enabled(cls) -> bool:"""Check if the cache is enabled."""return cls._enabled
1
2
1
12
0
42
44
42
cls
[]
bool
{"Expr": 1, "Return": 1}
0
3
0
[]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3663914_mozilla_releng_build_cloud_tools.cloudtools.slavealloc_py.get_classified_slaves", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3958470_robinostlund_homeassistant_volkswagencarnet.custom_components.volkswagencarnet.__init___py.async_setup_entry"]
The function (is_enabled) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 42 and ends at line 44. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (cls) and returns a value of type bool. It has 2 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3663914_mozilla_releng_build_cloud_tools.cloudtools.slavealloc_py.get_classified_slaves", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3958470_robinostlund_homeassistant_volkswagencarnet.custom_components.volkswagencarnet.__init___py.async_setup_entry"].
unifyai_unify
LocalSeparateCache
public
0
1
store_entry
def store_entry(cls,*,key: str,value: Any,res_types: Optional[Dict[str, Any]] = None,) -> None:"""Store a key-value pair in the write cache."""cls._cache_write[key] = {"value": value, "res_types": res_types}with open(cls.get_cache_filepath(cls._cache_name_write), "a") as f:_write_to_ndjson_cache(f,key,value,res_types,)
1
15
4
77
0
47
62
47
cls,key,value,res_types
[]
None
{"Assign": 1, "Expr": 2, "With": 1}
3
16
3
["open", "cls.get_cache_filepath", "_write_to_ndjson_cache"]
0
[]
The function (store_entry) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 47 and ends at line 62. It contains 15 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (cls, key, value, res_types) and does not return any value. It makes 3 function calls inside, which are ["open", "cls.get_cache_filepath", "_write_to_ndjson_cache"].
unifyai_unify
LocalSeparateCache
public
0
1
initialize_cache
def initialize_cache(cls, name: str = None) -> None:"""Initialize both read and write caches."""# Always initialize the write cacheif cls._cache_write is None:cls._cache_write = {}# Initialize the read cacheif cls._cache_read is None:cls._cache_read = {}try:with open(cls.get_cache_filepath(cls._cache_name_read), "r") as f:cls._cache_read = _load_ndjson_cache(f)except IOError:# File can't be read, keep empty cachewarnings.warn(f"Cache file {cls.get_cache_filepath(cls._cache_name_read)} does not exist or can't be read.",)cls._cache_read = {}
4
13
2
84
0
65
82
65
cls,name
[]
None
{"Assign": 4, "Expr": 2, "If": 2, "Try": 1, "With": 1}
5
18
5
["open", "cls.get_cache_filepath", "_load_ndjson_cache", "warnings.warn", "cls.get_cache_filepath"]
0
[]
The function (initialize_cache) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 65 and ends at line 82. It contains 13 lines of code and has a cyclomatic complexity of 4. It takes 2 parameters (cls, name) and does not return any value. It makes 5 function calls inside, which are ["open", "cls.get_cache_filepath", "_load_ndjson_cache", "warnings.warn", "cls.get_cache_filepath"].
unifyai_unify
LocalSeparateCache
public
0
1
list_keys
def list_keys(cls) -> List[str]:return list(cls._cache_read.keys()) + list(cls._cache_write.keys())
1
2
1
32
0
85
86
85
cls
[]
List[str]
{"Return": 1}
4
2
4
["list", "cls._cache_read.keys", "list", "cls._cache_write.keys"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._s3_py.delete_objects"]
The function (list_keys) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 85 and ends at line 86. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (cls) and returns a value of type List[str]. It makes 4 function calls inside, which are ["list", "cls._cache_read.keys", "list", "cls._cache_write.keys"], and has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._s3_py.delete_objects"].
unifyai_unify
LocalSeparateCache
public
0
1
retrieve_entry
def retrieve_entry(cls, key: str) -> tuple[Optional[Any], Optional[Dict[str, Any]]]:"""Retrieve a value from the cache, checking write cache first.Returns:Tuple of (value, res_types) or (None, None) if not found"""# First check the write cacheif cls._cache_write and key in cls._cache_write:value = cls._cache_write[key]deserialized_value = json.loads(value["value"])return deserialized_value, value["res_types"]# If not found in write cache, check the read cacheif cls._cache_read and key in cls._cache_read:value = cls._cache_read[key]res_types = value["res_types"]deserialized_value = json.loads(value["value"])# Promote to write cache for faster future accesscls.store_entry(key=key,value=cls.serialize_object(deserialized_value),res_types=res_types,)return deserialized_value, res_typesreturn None, None
5
16
2
131
0
89
116
89
cls,key
[]
tuple[Optional[Any], Optional[Dict[str, Any]]]
{"Assign": 5, "Expr": 2, "If": 2, "Return": 3}
4
28
4
["json.loads", "json.loads", "cls.store_entry", "cls.serialize_object"]
0
[]
The function (retrieve_entry) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 89 and ends at line 116. It contains 16 lines of code and has a cyclomatic complexity of 5. It takes 2 parameters (cls, key) and returns a value of type tuple[Optional[Any], Optional[Dict[str, Any]]]. It makes 4 function calls inside, which are ["json.loads", "json.loads", "cls.store_entry", "cls.serialize_object"].
unifyai_unify
LocalSeparateCache
public
0
1
has_key
def has_key(cls, key: str) -> bool:"""Check if a key exists in either cache."""return (cls._cache_write is not None and key in cls._cache_write) or (cls._cache_read is not None and key in cls._cache_read)
4
4
2
42
0
119
123
119
cls,key
[]
bool
{"Expr": 1, "Return": 1}
0
5
0
[]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.handle_tag", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.o", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.previousIndex"]
The function (has_key) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 119 and ends at line 123. It contains 4 lines of code and has a cyclomatic complexity of 4. It takes 2 parameters (cls, key) and returns a value of type bool. It has 3 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.handle_tag", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.o", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.previousIndex"].
unifyai_unify
LocalSeparateCache
public
0
1
remove_entry
def remove_entry(cls, key: str) -> None:"""Remove an entry from both caches."""if cls._cache_write:item = cls._cache_write.pop(key, None)if item is not None:with open(cls.get_cache_filepath(cls._cache_name_write), "w") as f:for key, value in cls._cache_write.items():_write_to_ndjson_cache(f,key,value["value"],value["res_types"],)if cls._cache_read:cls._cache_read.pop(key, None)
5
14
2
97
0
126
141
126
cls,key
[]
None
{"Assign": 1, "Expr": 3, "For": 1, "If": 3, "With": 1}
6
16
6
["cls._cache_write.pop", "open", "cls.get_cache_filepath", "cls._cache_write.items", "_write_to_ndjson_cache", "cls._cache_read.pop"]
0
[]
The function (remove_entry) is defined within the public class LocalSeparateCache, which inherits another class. The function starts at line 126 and ends at line 141. It contains 14 lines of code and has a cyclomatic complexity of 5. It takes 2 parameters (cls, key) and does not return any value. It makes 6 function calls inside, which are ["cls._cache_write.pop", "open", "cls.get_cache_filepath", "cls._cache_write.items", "_write_to_ndjson_cache", "cls._cache_read.pop"].
unifyai_unify
public
public
0
0
_load_ndjson_cache
def _load_ndjson_cache(filehandler: TextIO):cache = {}for line_number, line in enumerate(filehandler, start=1):line = line.strip()if not line:continuetry:item = json.loads(line)cache[item["key"]] = {"value": item["value"],"res_types": item["res_types"],}except json.JSONDecodeError:warnings.warn(f"Cache file {filehandler.name} contains invalid cache entry, skipping line {line_number}: {line[:40]}...",)return cache
4
17
1
86
3
6
24
6
filehandler
['cache', 'item', 'line']
Returns
{"Assign": 4, "Expr": 1, "For": 1, "If": 1, "Return": 1, "Try": 1}
4
19
4
["enumerate", "line.strip", "json.loads", "warnings.warn"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_cache_py.LocalCache.initialize_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_separate_cache_py.LocalSeparateCache.initialize_cache"]
The function (_load_ndjson_cache) is defined at module level, outside any class. The function starts at line 6 and ends at line 24. It contains 17 lines of code and has a cyclomatic complexity of 4. It takes 1 parameter (filehandler) and returns a value. It makes 4 function calls inside, which are ["enumerate", "line.strip", "json.loads", "warnings.warn"], and has 2 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_cache_py.LocalCache.initialize_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_separate_cache_py.LocalSeparateCache.initialize_cache"].
unifyai_unify
public
public
0
0
_write_to_ndjson_cache
def _write_to_ndjson_cache(filehandler: TextIO,key: str,value: Any,res_types: Optional[List[str]] = None,):filehandler.write(json.dumps({"key": key,"value": value,"res_types": res_types,},)+ "\n",)
1
16
4
56
0
27
42
27
filehandler,key,value,res_types
[]
None
{"Expr": 1}
2
16
2
["filehandler.write", "json.dumps"]
4
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_cache_py.LocalCache.remove_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_cache_py.LocalCache.store_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_separate_cache_py.LocalSeparateCache.remove_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_separate_cache_py.LocalSeparateCache.store_entry"]
The function (_write_to_ndjson_cache) is defined at module level, outside any class. The function starts at line 27 and ends at line 42. It contains 16 lines of code and has a cyclomatic complexity of 1. It takes 4 parameters (filehandler, key, value, res_types) and does not return any value. It makes 2 function calls inside, which are ["filehandler.write", "json.dumps"], and has 4 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_cache_py.LocalCache.remove_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_cache_py.LocalCache.store_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_separate_cache_py.LocalSeparateCache.remove_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.local_separate_cache_py.LocalSeparateCache.store_entry"].
unifyai_unify
RemoteCache
public
0
1
_build_filter_expression
def _build_filter_expression(cache_key: str) -> str:"""Build a filter expression for querying logs."""return f"key == {json.dumps(cache_key)}"
1
2
1
13
0
19
21
19
cache_key
[]
str
{"Expr": 1, "Return": 1}
1
3
1
["json.dumps"]
0
[]
The function (_build_filter_expression) is defined within the public class RemoteCache, which inherits another class. The function starts at line 19 and ends at line 21. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (cache_key) and returns a value of type str. It makes 1 function call inside, which is ["json.dumps"].
unifyai_unify
RemoteCache
public
0
1
set_cache_name
def set_cache_name(cls, name: str) -> None:"""Set the remote context name for the cache."""cls._remote_context = name
1
2
2
17
0
24
26
24
cls,name
[]
None
{"Assign": 1, "Expr": 1}
0
3
0
[]
0
[]
The function (set_cache_name) is defined within the public class RemoteCache, which inherits another class. The function starts at line 24 and ends at line 26. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, name) and does not return any value.
unifyai_unify
RemoteCache
public
0
1
get_cache_name
def get_cache_name(cls) -> str:"""Get the current remote context name."""return cls._remote_context
1
2
1
12
0
29
31
29
cls
[]
str
{"Expr": 1, "Return": 1}
0
3
0
[]
8
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.proxy_py.ProxyTests.test_proxy", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_cached_decorator_upstream_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_cached_decorator_upstream_cache_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache_closest_match_on_exception", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache_read", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache_read_only", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache_write"]
The function (get_cache_name) is defined within the public class RemoteCache, which inherits another class. The function starts at line 29 and ends at line 31. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (cls) and returns a value of type str. It has 8 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.proxy_py.ProxyTests.test_proxy", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_cached_decorator_upstream_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_cached_decorator_upstream_cache_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache_closest_match_on_exception", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache_read", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache_read_only", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_utils.test_caching_py.test_upstream_cache_write"].
unifyai_unify
RemoteCache
public
0
1
store_entry
def store_entry(cls,*,key: str,value: Any,res_types: Optional[Dict[str, Any]] = None,) -> None:"""Store a key-value pair in the remote cache."""from unify import delete_logs, get_logs, log# Remove existing entries with the same keyexisting_logs = get_logs(context=cls._remote_context,filter=cls._build_filter_expression(key),return_ids_only=True,)if existing_logs:delete_logs(logs=existing_logs, context=cls._remote_context)# Create new log entryentries = {"value": value}if res_types:entries["res_types"] = json.dumps(res_types)log(key=key, context=cls._remote_context, **entries)
3
19
4
116
0
34
58
34
cls,key,value,res_types
[]
None
{"Assign": 3, "Expr": 3, "If": 2}
5
25
5
["get_logs", "cls._build_filter_expression", "delete_logs", "json.dumps", "log"]
0
[]
The function (store_entry) is defined within the public class RemoteCache, which inherits another class. The function starts at line 34 and ends at line 58. It contains 19 lines of code and has a cyclomatic complexity of 3. It takes 4 parameters (cls, key, value, res_types) and does not return any value. It makes 5 function calls inside, which are ["get_logs", "cls._build_filter_expression", "delete_logs", "json.dumps", "log"].
unifyai_unify
RemoteCache
public
0
1
write
def write(cls, filename: str = None) -> None:"""No-op for remote cache - data is persisted immediately."""
1
1
2
14
0
61
62
61
cls,filename
[]
None
{"Expr": 1}
0
2
0
[]
235
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_commit_normal", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_batch_all", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_batch_nonpositive_throws", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_batch_nonsense_throws", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_batch_one", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_infer_false", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_infer_true", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_client_base.setUp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_client_base.setUpClass", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.19337870_chia_network_clvm_tools.costs.generate_benchmark_py.print_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.process.plugins_py.PIDFile.start", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.test.logtest_py.LogCase.emptyLog", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.test.logtest_py.LogCase.markLog", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.test.test_static_py.StaticTest.setup_server", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.markdown.serializers_py._serialize_html", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.pygments.formatters.other_py.RawTokenFormatter.format", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3516508_rvagg_archived_pangyp.gyp.pylib.gyp.generator.eclipse_py.GenerateClasspathFile", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3535444_stormpath_stormpath_flask.tests.test_settings_py.TestCheckSettings.setUp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.output.__init___py.StringIO.write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3562726_candlepin_rho.test.clicommands_tests_py.CliCommandsTests.test_profile_add_hosts", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574492_gstreamer_gst_python.testsuite.old.test_event_py.EventFileSrcTest.setUp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3594492_mangroveorg_mangrove.mangrove.form_model.xform_py.Xform._to_string", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3598540_saltstack_salt_testing.salttesting.runtests_py.SaltRuntests.__transplant_configs__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.generator_py.Generator._write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643539_agateau_yokadi.yokadi.core.daemon_py.Daemon.daemonize"]
The function (write) is defined within the public class RemoteCache, which inherits another class. The function starts at line 61 and ends at line 62. It contains 1 line of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, filename) and does not return any value. It has 235 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_commit_normal", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_batch_all", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_batch_nonpositive_throws", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_batch_nonsense_throws", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_batch_one", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_infer_false", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_Transaction.test_query_infer_true", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_client_base.setUp", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18354993_vaticle_typedb_client_python.tests.integration.test_typedb_py.test_client_base.setUpClass", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.19337870_chia_network_clvm_tools.costs.generate_benchmark_py.print_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.process.plugins_py.PIDFile.start", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.test.logtest_py.LogCase.emptyLog", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.test.logtest_py.LogCase.markLog", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.test.test_static_py.StaticTest.setup_server", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.markdown.serializers_py._serialize_html", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.pygments.formatters.other_py.RawTokenFormatter.format", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3516508_rvagg_archived_pangyp.gyp.pylib.gyp.generator.eclipse_py.GenerateClasspathFile", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3535444_stormpath_stormpath_flask.tests.test_settings_py.TestCheckSettings.setUp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.output.__init___py.StringIO.write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3562726_candlepin_rho.test.clicommands_tests_py.CliCommandsTests.test_profile_add_hosts", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574492_gstreamer_gst_python.testsuite.old.test_event_py.EventFileSrcTest.setUp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3594492_mangroveorg_mangrove.mangrove.form_model.xform_py.Xform._to_string", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3598540_saltstack_salt_testing.salttesting.runtests_py.SaltRuntests.__transplant_configs__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.generator_py.Generator._write", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643539_agateau_yokadi.yokadi.core.daemon_py.Daemon.daemonize"].
unifyai_unify
RemoteCache
public
0
1
initialize_cache
def initialize_cache(cls, name: str = None) -> None:
    """Ensure the remote context exists."""
    from unify import create_context, get_contexts
    if cls._remote_context not in get_contexts():
        create_context(cls._remote_context)
2
4
2
36
0
65
70
65
cls,name
[]
None
{"Expr": 2, "If": 1}
2
6
2
["get_contexts", "create_context"]
0
[]
The function (initialize_cache) is defined within the public class RemoteCache, which inherits from another class. The function starts at line 65 and ends at line 70. It contains 4 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (cls, name) and does not return a value. It calls 2 functions: ["get_contexts", "create_context"].
unifyai_unify
RemoteCache
public
0
1
list_keys
def list_keys(cls) -> List[str]:
    """Get a list of all cache keys from the remote context."""
    from unify import get_logs
    logs = get_logs(context=cls._remote_context)
    return [log.entries["key"] for log in logs]
2
4
1
38
0
73
78
73
cls
[]
List[str]
{"Assign": 1, "Expr": 1, "Return": 1}
1
6
1
["get_logs"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._s3_py.delete_objects"]
The function (list_keys) is defined within the public class RemoteCache, which inherits from another class. The function starts at line 73 and ends at line 78. It contains 4 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (cls) and returns a value of type List[str]. It calls 1 function: ["get_logs"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._s3_py.delete_objects"].
unifyai_unify
RemoteCache
public
0
1
retrieve_entry
def retrieve_entry(cls, key: str) -> tuple[Optional[Any], Optional[Dict[str, Any]]]:
    """Retrieve a value from the remote cache.

    Returns:
        Tuple of (value, res_types) or (None, None) if not found
    """
    from unify import get_logs
    logs = get_logs(
        context=cls._remote_context,
        filter=cls._build_filter_expression(key),
    )
    if not logs:
        return None, None
    entry = logs[0].entries
    value = json.loads(entry["value"])
    res_types = None
    if "res_types" in entry:
        res_types = json.loads(entry["res_types"])
    return value, res_types
3
14
2
102
0
81
105
81
cls,key
[]
tuple[Optional[Any], Optional[Dict[str, Any]]]
{"Assign": 5, "Expr": 1, "If": 2, "Return": 2}
4
25
4
["get_logs", "cls._build_filter_expression", "json.loads", "json.loads"]
0
[]
The function (retrieve_entry) is defined within the public class RemoteCache, which inherits from another class. The function starts at line 81 and ends at line 105. It contains 14 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters (cls, key) and returns a value of type tuple[Optional[Any], Optional[Dict[str, Any]]]. It calls 4 functions: ["get_logs", "cls._build_filter_expression", "json.loads", "json.loads"].
unifyai_unify
RemoteCache
public
0
1
has_key
def has_key(cls, key: str) -> bool:
    """Check if a key exists in the remote cache."""
    from unify import get_logs
    logs = get_logs(
        context=cls._remote_context,
        filter=cls._build_filter_expression(key),
        return_ids_only=True,
    )
    return len(logs) > 0
1
8
2
47
0
108
117
108
cls,key
[]
bool
{"Assign": 1, "Expr": 1, "Return": 1}
3
10
3
["get_logs", "cls._build_filter_expression", "len"]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.handle_tag", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.o", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.previousIndex"]
The function (has_key) is defined within the public class RemoteCache, which inherits from another class. The function starts at line 108 and ends at line 117. It contains 8 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (cls, key) and returns a value of type bool. It calls 3 functions: ["get_logs", "cls._build_filter_expression", "len"]. It is called by 3 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.handle_tag", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.o", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.gitsome.lib.html2text.html2text_py.HTML2Text.previousIndex"].
unifyai_unify
RemoteCache
public
0
1
remove_entry
def remove_entry(cls, key: str) -> None:
    """Remove an entry from the remote cache."""
    from unify import delete_logs, get_logs
    logs = get_logs(
        context=cls._remote_context,
        filter=cls._build_filter_expression(key),
        return_ids_only=True,
    )
    if logs:
        delete_logs(context=cls._remote_context, logs=logs)
2
9
2
57
0
120
130
120
cls,key
[]
None
{"Assign": 1, "Expr": 2, "If": 1}
3
11
3
["get_logs", "cls._build_filter_expression", "delete_logs"]
0
[]
The function (remove_entry) is defined within the public class RemoteCache, which inherits from another class. The function starts at line 120 and ends at line 130. It contains 9 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (cls, key) and does not return a value. It calls 3 functions: ["get_logs", "cls._build_filter_expression", "delete_logs"].
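Taken together, the RemoteCache records above describe a class-level key/value store backed by a remote logging context. The following is a minimal runnable sketch of that same interface, with the `unify` client calls (`get_logs`, `delete_logs`, `create_context`) replaced by a plain in-memory dict purely for illustration; the `_remote_context`/`_store` names here are stand-ins, not the library's API:

```python
import json
from typing import Any, Dict, Optional, Tuple


class RemoteCache:
    # Stand-in state: the real class queries a remote context named by
    # _remote_context instead of this local dict.
    _remote_context = "cache"
    _store: Dict[str, str] = {}

    @classmethod
    def list_keys(cls) -> list:
        # Mirrors list_keys: one key per stored entry.
        return list(cls._store)

    @classmethod
    def retrieve_entry(cls, key: str) -> Tuple[Optional[Any], Optional[Any]]:
        # Mirrors the (value, res_types) / (None, None) contract above.
        if key not in cls._store:
            return None, None
        return json.loads(cls._store[key]), None

    @classmethod
    def has_key(cls, key: str) -> bool:
        # Mirrors has_key: existence check without deserializing.
        return key in cls._store

    @classmethod
    def remove_entry(cls, key: str) -> None:
        # Mirrors remove_entry: deleting a missing key is a no-op.
        cls._store.pop(key, None)


# Values are stored JSON-encoded, matching the json.loads in retrieve_entry.
RemoteCache._store["answer"] = json.dumps(42)
```

The JSON round-trip matches the `json.loads(entry["value"])` call in the retrieve_entry body; everything else is shape, not behaviour, of the remote-backed original.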
aws-deadline_deadline-cloud
HatchCustomBuildHook
public
0
1
_validate_config
def _validate_config(self):
    if sorted(self.config) != ["copy_version_py", "path"] or list(self.config["copy_version_py"]) != ["destinations"]:
        raise RuntimeError(
            "Configuration of the custom build hook must be like { 'copy_version_py': {'destinations': ['path1', ...]}}."
            + f" Received:\n{self.config}"
        )
3
8
1
41
0
18
25
18
self
[]
None
{"If": 1}
3
8
3
["sorted", "list", "RuntimeError"]
0
[]
The function (_validate_config) is defined within the public class HatchCustomBuildHook, which inherits from another class. The function starts at line 18 and ends at line 25. It contains 8 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (self) and does not return a value. It calls 3 functions: ["sorted", "list", "RuntimeError"].
aws-deadline_deadline-cloud
HatchCustomBuildHook
public
0
1
initialize
def initialize(self, version: str, build_data: dict[str, Any]) -> None:
    self._validate_config()
    for destination in self.config["copy_version_py"]["destinations"]:
        print(f"Copying _version.py to {destination}")
        shutil.copy(
            os.path.join(self.root, "_version.py"),
            os.path.join(self.root, destination),
        )
2
8
3
74
0
27
35
27
self,version,build_data
[]
None
{"Expr": 3, "For": 1}
5
9
5
["self._validate_config", "print", "shutil.copy", "os.path.join", "os.path.join"]
89
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.Base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.garbage_check.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.pretests.docker_test_images.docker_test_images_py.docker_test_images.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.attach_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.sig_proxy_off_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.simple_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildSubSubtest.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.expose_py.expose.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.commit.commit_py.commit_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.CpBase.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.cp_symlink.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.every_last.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.simple.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.volume_mount.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_py.create_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_remote_tag_py.create_remote_tag.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_signal_py.create_signal.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_tmpfs_py.create_tmpfs.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.deferred_deletion.deferred_deletion_py.deferred_deletion.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.diff.diff_py.diff_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.dockerhelp.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_class_factory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerimport.empty_py.empty.initialize"]
The function (initialize) is defined within the public class HatchCustomBuildHook, which inherits from another class. The function starts at line 27 and ends at line 35. It contains 8 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters (self, version, build_data) and does not return a value. It calls 5 functions: ["self._validate_config", "print", "shutil.copy", "os.path.join", "os.path.join"]. It is called by 89 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.Base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.intratests.garbage_check.garbage_check_py.garbage_check.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.pretests.docker_test_images.docker_test_images_py.docker_test_images.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.attach_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.sig_proxy_off_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.attach.attach_py.simple_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.build_py.BuildSubSubtest.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.build.expose_py.expose.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.commit.commit_py.commit_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.CpBase.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.cp_symlink.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.every_last.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.simple.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.cp.cp_py.volume_mount.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_py.create_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_remote_tag_py.create_remote_tag.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_signal_py.create_signal.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.create.create_tmpfs_py.create_tmpfs.initialize", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.deferred_deletion.deferred_deletion_py.deferred_deletion.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.diff.diff_py.diff_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.dockerhelp.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_base.initialize", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerhelp.dockerhelp_py.help_class_factory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3617659_autotest_autotest_docker.subtests.docker_cli.dockerimport.empty_py.empty.initialize"].
aws-deadline_deadline-cloud
HatchCustomBuildHook
public
0
1
clean
def clean(self, versions: list[str]) -> None:
    self._validate_config()
    cleaned_count = 0
    for destination in self.config["copy_version_py"]["destinations"]:
        print(f"Cleaning _version.py from {destination}")
        clean_path = os.path.join(self.root, destination, "_version.py")
        try:
            os.remove(clean_path)
            cleaned_count += 1
        except FileNotFoundError:
            pass
    print(f"Cleaned {cleaned_count} items")
3
12
2
76
0
37
49
37
self,versions
[]
None
{"Assign": 2, "AugAssign": 1, "Expr": 4, "For": 1, "Try": 1}
5
13
5
["self._validate_config", "print", "os.path.join", "os.remove", "print"]
22
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3504621_hackfmi_diaphanum.projects.forms_py.ProjectForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3504621_hackfmi_diaphanum.protocols.forms_py.ProtocolForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3619342_bread_and_pepper_django_userena.userena.contrib.umessages.fields_py.CommaSeparatedUserField.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.forms_py.BaseTranslatableModelForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.forms_py.BaseTranslationFormSet.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.forms_py.CustomLanguageNormalForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923647_code4romania_covid_19_ro_help.ro_help.hub.forms_py.NGOForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957905_thehamkercat_williambutcherbot.wbb.utils.inlinefuncs_py.urban_func", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963840_matthewwithanm_django_imagekit.imagekit.forms.fields_py.ProcessedImageField.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964638_python_restx_flask_restx.flask_restx.postman_py.Folder.as_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964638_python_restx_flask_restx.flask_restx.postman_py.PostmanCollectionV1.as_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964638_python_restx_flask_restx.flask_restx.postman_py.Request.as_dict", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94638225_rozari0_NezukoBot.nezuko.utils.inlinefuncs_py.urban_func", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.forms_py.QueryForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95140527_hngprojects_hng_boilerplate_python_fastapi_web.api.v1.schemas.profile_py.ProfileCreateUpdate.phone_number_validator", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.func_py.GroupedFunc.module_decl", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.struct_py.Field.c_to_mp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.struct_py.Field.mp_to_c", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.struct_py.Struct.definition", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.struct_py.Struct.module_decl", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.union_py.Union.definition", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.union_py.Union.module_decl"]
The function (clean) is defined within the public class HatchCustomBuildHook, which inherits from another class. The function starts at line 37 and ends at line 49. It contains 12 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters (self, versions) and does not return a value. It calls 5 functions: ["self._validate_config", "print", "os.path.join", "os.remove", "print"]. It is called by 22 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3504621_hackfmi_diaphanum.projects.forms_py.ProjectForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3504621_hackfmi_diaphanum.protocols.forms_py.ProtocolForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3619342_bread_and_pepper_django_userena.userena.contrib.umessages.fields_py.CommaSeparatedUserField.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.forms_py.BaseTranslatableModelForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.forms_py.BaseTranslationFormSet.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3646959_kristianoellegaard_django_hvad.hvad.tests.forms_py.CustomLanguageNormalForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923647_code4romania_covid_19_ro_help.ro_help.hub.forms_py.NGOForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3957905_thehamkercat_williambutcherbot.wbb.utils.inlinefuncs_py.urban_func", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963840_matthewwithanm_django_imagekit.imagekit.forms.fields_py.ProcessedImageField.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964638_python_restx_flask_restx.flask_restx.postman_py.Folder.as_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964638_python_restx_flask_restx.flask_restx.postman_py.PostmanCollectionV1.as_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964638_python_restx_flask_restx.flask_restx.postman_py.Request.as_dict", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964638_python_restx_flask_restx.flask_restx.postman_py.Folder.as_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964638_python_restx_flask_restx.flask_restx.postman_py.PostmanCollectionV1.as_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3964638_python_restx_flask_restx.flask_restx.postman_py.Request.as_dict", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94638225_rozari0_NezukoBot.nezuko.utils.inlinefuncs_py.urban_func", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95099308_explorerhq_django_sql_explorer.explorer.forms_py.QueryForm.clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95140527_hngprojects_hng_boilerplate_python_fastapi_web.api.v1.schemas.profile_py.ProfileCreateUpdate.phone_number_validator", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.func_py.GroupedFunc.module_decl", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.struct_py.Field.c_to_mp", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.struct_py.Field.mp_to_c", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.struct_py.Struct.definition", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.struct_py.Struct.module_decl", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.union_py.Union.definition", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95161836_lvgl_micropython_lvgl_micropython.gen.api_gen.json_reader.union_py.Union.module_decl"].
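The three HatchCustomBuildHook records above (_validate_config, initialize, clean) amount to: validate the hook configuration, copy `_version.py` into each configured destination at build time, and remove it again on clean. The following is a self-contained sketch of that copy/clean cycle against a temporary directory; the config layout mirrors the function bodies above, but nothing here touches Hatch itself:

```python
import os
import shutil
import tempfile

# Build a throwaway project layout: a root with _version.py and one destination.
root = tempfile.mkdtemp()
dest = os.path.join(root, "src")
os.makedirs(dest)
with open(os.path.join(root, "_version.py"), "w") as f:
    f.write("__version__ = '0.0.0'\n")

# The shape _validate_config enforces: exactly these two top-level keys,
# and exactly "destinations" under copy_version_py.
config = {"path": ".", "copy_version_py": {"destinations": ["src"]}}
assert sorted(config) == ["copy_version_py", "path"]
assert list(config["copy_version_py"]) == ["destinations"]

# initialize(): copy _version.py into every destination.
for destination in config["copy_version_py"]["destinations"]:
    shutil.copy(
        os.path.join(root, "_version.py"),
        os.path.join(root, destination),
    )
copied = os.path.exists(os.path.join(dest, "_version.py"))

# clean(): remove the copies again, tolerating already-missing files.
cleaned = 0
for destination in config["copy_version_py"]["destinations"]:
    try:
        os.remove(os.path.join(root, destination, "_version.py"))
        cleaned += 1
    except FileNotFoundError:
        pass
```

The try/except FileNotFoundError mirrors the clean body above, which makes repeated cleans idempotent rather than fatal.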
aws-deadline_deadline-cloud
public
public
0
0
process_job_attachments
def process_job_attachments(farm_id, queue_id, inputs, outputDir, deadline_client, session):
    """Uploads all of the input files to the Job Attachments S3 bucket associated with
    the Deadline Queue, returning Attachment Settings to be associated with a Deadline Job."""
    print("Getting queue information...")
    start = time.perf_counter()
    queue = deadline_client.get_queue(farmId=farm_id, queueId=queue_id)
    total = time.perf_counter() - start
    print(f"Finished getting queue information after {total} seconds.\n")
    print(f"Processing {len(inputs)} job attachments...")
    start = time.perf_counter()
    asset_manager = S3AssetManager(
        farm_id=farm_id,
        queue_id=queue_id,
        job_attachment_settings=JobAttachmentS3Settings(**queue["jobAttachmentSettings"]),
        session=session,
    )
    upload_group = asset_manager.prepare_paths_for_upload(
        inputs,
        [outputDir],
        [],
    )
    cache_directory = get_cache_directory()
    (_, manifests) = asset_manager.hash_assets_and_create_manifest(
        upload_group.asset_groups,
        upload_group.total_input_files,
        upload_group.total_input_bytes,
        cache_directory,
    )
    (_, attachments) = asset_manager.upload_assets(manifests, s3_check_cache_dir=cache_directory)
    attachments_dict = attachments.to_dict()
    total = time.perf_counter() - start
    print(f"Finished processing job attachments after {total} seconds.\n")
    print(f"Created these attachment settings: {attachments_dict}\n")
    return attachments_dict
1
32
6
185
7
26
64
26
farm_id,queue_id,inputs,outputDir,deadline_client,session
['queue', 'total', 'start', 'attachments_dict', 'upload_group', 'cache_directory', 'asset_manager']
Returns
{"Assign": 11, "Expr": 6, "Return": 1}
18
39
18
["print", "time.perf_counter", "deadline_client.get_queue", "time.perf_counter", "print", "print", "len", "time.perf_counter", "S3AssetManager", "JobAttachmentS3Settings", "asset_manager.prepare_paths_for_upload", "get_cache_directory", "asset_manager.hash_assets_and_create_manifest", "asset_manager.upload_assets", "attachments.to_dict", "time.perf_counter", "print", "print"]
0
[]
The function (process_job_attachments) is a module-level function (its class field is recorded as "public"). The function starts at line 26 and ends at line 64. It contains 32 lines of code and has a cyclomatic complexity of 1. It takes 6 parameters (farm_id, queue_id, inputs, outputDir, deadline_client, session) and returns a value. It calls 18 functions: ["print", "time.perf_counter", "deadline_client.get_queue", "time.perf_counter", "print", "print", "len", "time.perf_counter", "S3AssetManager", "JobAttachmentS3Settings", "asset_manager.prepare_paths_for_upload", "get_cache_directory", "asset_manager.hash_assets_and_create_manifest", "asset_manager.upload_assets", "attachments.to_dict", "time.perf_counter", "print", "print"].
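Both process_job_attachments above and submit_custom_job below wrap each step in the same pattern: `time.perf_counter()` before and after the call, then a printed total. As a hedged illustration only, that pattern can be factored into a small helper; the `timed` name is invented here and does not appear in the source:

```python
import time


def timed(label, fn, *args, **kwargs):
    # Same shape as the records above: perf_counter before and after the
    # wrapped call, with the elapsed total printed alongside a label.
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    total = time.perf_counter() - start
    print(f"Finished {label} after {total} seconds.")
    return result


# Any callable slots in; here a trivial stand-in for get_queue/create_job.
value = timed("summing the inputs", sum, [1, 2, 3])
```

The original functions inline this pattern rather than factoring it out, which keeps the timed span explicit at each call site.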
aws-deadline_deadline-cloud
public
public
0
0
submit_custom_job
def submit_custom_job(farm_id, queue_id, job_template, attachment_settings, parameters, deadline_client):
    """Submits a Job defined in the Job Template to the given Queue, adding the given Attachment Settings
    to the Job definition."""
    # Submit the Job
    print("Submitting the job...")
    start = time.perf_counter()
    response = deadline_client.create_job(
        farmId=farm_id,
        queueId=queue_id,
        template=job_template,
        templateType="YAML",
        attachments=attachment_settings if attachment_settings else None,
        parameters=parameters,
        priority=50,
    )
    total = time.perf_counter() - start
    print(f"Submitted Job Template after {total} seconds:")
    pprint.pprint(job_template.encode())
    print(f"Job ID: {response['jobId']}")
2
18
6
95
3
103
126
103
farm_id,queue_id,job_template,attachment_settings,parameters,deadline_client
['total', 'start', 'response']
None
{"Assign": 3, "Expr": 5}
8
24
8
["print", "time.perf_counter", "deadline_client.create_job", "time.perf_counter", "print", "pprint.pprint", "job_template.encode", "print"]
0
[]
The function (submit_custom_job) is a module-level function (its class field is recorded as "public"). The function starts at line 103 and ends at line 126. It contains 18 lines of code and has a cyclomatic complexity of 2. It takes 6 parameters (farm_id, queue_id, job_template, attachment_settings, parameters, deadline_client) and does not return a value. It calls 8 functions: ["print", "time.perf_counter", "deadline_client.create_job", "time.perf_counter", "print", "pprint.pprint", "job_template.encode", "print"].
aws-deadline_deadline-cloud
public
public
0
0
main
def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--max-entries", type=int, default=10, help="How many entries to limit the summary to.")
    parser.add_argument("--file-sizes", default=False, action="store_true", help="Include file sizes.")
    parser.add_argument(
        "--skip-dot-paths",
        default=False,
        action="store_true",
        help="Skip directories and files that start with '.'.",
    )
    parser.add_argument(
        "--exclude-totals",
        default=False,
        action="store_true",
        help="Exclude totals from the summary.",
    )
    parser.add_argument(
        "--follow-symlinks",
        default=False,
        action="store_true",
        help="Follows symlinks for directory traversal and file sizes.",
    )
    parser.add_argument("summary_dir", help="The directory to summarize.")
    args = parser.parse_args()
    if not os.path.exists(args.summary_dir) or not os.path.isdir(args.summary_dir):
        print(f"Directory not found: {args.summary_dir}")
    total_size_by_path: dict[str, int] = None
    if args.file_sizes:
        total_size_by_path = {}
    path_list = []
    dirs_to_visit = [args.summary_dir]
    while dirs_to_visit:
        dir = dirs_to_visit.pop()
        for entry in os.scandir(dir):
            if entry.is_dir(follow_symlinks=args.follow_symlinks):
                if not (args.skip_dot_paths and entry.name.startswith(".")):
                    dirs_to_visit.append(entry.path)
            elif entry.is_file(follow_symlinks=args.follow_symlinks):
                if not (args.skip_dot_paths and entry.name.startswith(".")):
                    path_list.append(entry.path)
                    if total_size_by_path is not None:
                        total_size_by_path[entry.path] = entry.stat(follow_symlinks=args.follow_symlinks).st_size
    if path_list:
        print(
            summarize_path_list(
                path_list,
                total_size_by_path=total_size_by_path,
                max_entries=args.max_entries,
                include_totals=not args.exclude_totals,
            )
        )
    else:
        print(f"No files found in {args.summary_dir}")
14
59
0
335
6
20
85
20
['parser', 'args', 'dirs_to_visit', 'total_size_by_path', 'dir', 'path_list']
None
{"AnnAssign": 1, "Assign": 7, "Expr": 11, "For": 1, "If": 8, "While": 1}
23
66
23
["argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.parse_args", "os.path.exists", "os.path.isdir", "print", "dirs_to_visit.pop", "os.scandir", "entry.is_dir", "entry.name.startswith", "dirs_to_visit.append", "entry.is_file", "entry.name.startswith", "path_list.append", "entry.stat", "print", "summarize_path_list", "print"]
134
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"]
The function (main) is defined within the public class called public. It starts at line 20 and ends at line 85, contains 59 lines of code, and has a cyclomatic complexity of 14. It takes no parameters and does not return a value. It makes 23 function calls, which are ["argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.parse_args", "os.path.exists", "os.path.isdir", "print", "dirs_to_visit.pop", "os.scandir", "entry.is_dir", "entry.name.startswith", "dirs_to_visit.append", "entry.is_file", "entry.name.startswith", "path_list.append", "entry.stat", "print", "summarize_path_list", "print"], and has 134 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"].
aws-deadline_deadline-cloud
public
public
0
0
run
def run():print(MESSAGE_HOW_TO_CANCEL)parser = argparse.ArgumentParser(description=MESSAGE_HOW_TO_CANCEL)parser.add_argument("test_to_run",choices=["sync_inputs", "download_outputs"],help="Test to run. ('sync_inputs' or 'download_outputs')",)parser.add_argument("-f", "--farm-id", type=str, help="Deadline Farm to download assets from.", required=True)parser.add_argument("-q", "--queue-id", type=str, help="Deadline Queue to download assets from.", required=True)parser.add_argument("-j", "--job-id", type=str, help="Deadline Job to download assets from.", required=True)args = parser.parse_args()test_to_run = args.test_to_runfarm_id = args.farm_idqueue_id = args.queue_idjob_id = args.job_idif test_to_run == "sync_inputs":test_sync_inputs(farm_id=farm_id, queue_id=queue_id, job_id=job_id)elif test_to_run == "download_outputs":test_download_outputs(farm_id=farm_id, queue_id=queue_id, job_id=job_id)
3
26
0
162
6
36
63
36
['parser', 'args', 'test_to_run', 'farm_id', 'job_id', 'queue_id']
None
{"Assign": 6, "Expr": 7, "If": 2}
9
28
9
["print", "argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.parse_args", "test_sync_inputs", "test_download_outputs"]
281
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.namespace_py.ListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.hardware_py.ShowHardwareCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.tasks_py.TaskListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.MainLoop.eval", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.state_py._init_vintageous", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ReloadVintageousSettings.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ResetVintageous.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.update", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_datetime_related_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_db_queries_py.TestCase.clear_bucket", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_relations_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_save_update_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_uniqueness_py.TestCase.prepare_testbed", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3632606_shaunduncan_helga.helga.bin.helga_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.examples.csv.csv_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.cli_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_down_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_up_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_the_status_of_multiple_services_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.fuser_py.it_shows_help_given_no_arguments", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.subprocess_py.assert_command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.test.integration.test_app_runner_py.TestAppRunner.test_app_runner", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.wampy.cli.run_py.main"]
The function (run) is defined within the public class called public. It starts at line 36 and ends at line 63, contains 26 lines of code, and has a cyclomatic complexity of 3. It takes no parameters and does not return a value. It makes 9 function calls, which are ["print", "argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.parse_args", "test_sync_inputs", "test_download_outputs"], and has 281 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.namespace_py.ListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.hardware_py.ShowHardwareCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.tasks_py.TaskListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.MainLoop.eval", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.state_py._init_vintageous", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ReloadVintageousSettings.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ResetVintageous.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.update", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_datetime_related_py.TestCase.prepare_testbed", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_db_queries_py.TestCase.clear_bucket", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_relations_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_save_update_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_uniqueness_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3632606_shaunduncan_helga.helga.bin.helga_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.examples.csv.csv_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.cli_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_down_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_up_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_the_status_of_multiple_services_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.fuser_py.it_shows_help_given_no_arguments", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.subprocess_py.assert_command", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.test.integration.test_app_runner_py.TestAppRunner.test_app_runner", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.wampy.cli.run_py.main"].
aws-deadline_deadline-cloud
public
public
0
0
test_sync_inputs
def test_sync_inputs(farm_id: str,queue_id: str,job_id: str,):"""Tests cancellation during execution of the `sync_inputs` function."""start_time = time.perf_counter()with TemporaryDirectory() as temp_root_dir:print(f"Created a temporary directory for the test: {temp_root_dir}")queue = get_queue(farm_id=farm_id, queue_id=queue_id)job = get_job(farm_id=farm_id, queue_id=queue_id, job_id=job_id)print("Starting test to sync inputs...")asset_sync = AssetSync(farm_id=farm_id)try:download_start = time.perf_counter()(summary_statistics, local_roots) = asset_sync.sync_inputs(s3_settings=queue.jobAttachmentSettings,attachments=job.attachments,queue_id=queue_id,job_id=job_id,session_dir=pathlib.Path(temp_root_dir),on_downloading_files=mock_on_downloading_files,)print(f"Download Summary Statistics:\n{summary_statistics}")print(f"Finished downloading after {time.perf_counter() - download_start} seconds, returned:\n{local_roots}")except AssetSyncCancelledError as asce:print(f"AssetSyncCancelledError: {asce}")print(f"payload: {asce.summary_statistics}")print(f"\nTotal test runtime: {time.perf_counter() - start_time}")print(f"Cleaned up the temporary directory: {temp_root_dir}")global main_terminatedmain_terminated = True
2
33
3
169
6
66
108
66
farm_id,queue_id,job_id
['queue', 'download_start', 'start_time', 'main_terminated', 'job', 'asset_sync']
None
{"Assign": 7, "Expr": 9, "Try": 1, "With": 1}
18
43
18
["time.perf_counter", "TemporaryDirectory", "print", "get_queue", "get_job", "print", "AssetSync", "time.perf_counter", "asset_sync.sync_inputs", "pathlib.Path", "print", "print", "time.perf_counter", "print", "print", "print", "time.perf_counter", "print"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.sync_inputs_with_step_deps_py.run"]
The function (test_sync_inputs) is defined within the public class called public. It starts at line 66 and ends at line 108, contains 33 lines of code, and has a cyclomatic complexity of 2. It takes 3 parameters (farm_id, queue_id, job_id) and does not return a value. It makes 18 function calls, which are ["time.perf_counter", "TemporaryDirectory", "print", "get_queue", "get_job", "print", "AssetSync", "time.perf_counter", "asset_sync.sync_inputs", "pathlib.Path", "print", "print", "time.perf_counter", "print", "print", "print", "time.perf_counter", "print"], and has 2 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.sync_inputs_with_step_deps_py.run"].
aws-deadline_deadline-cloud
public
public
0
0
test_download_outputs
def test_download_outputs(farm_id: str,queue_id: str,job_id: str,):"""Tests cancellation during execution of the `download_job_output` function."""start_time = time.perf_counter()queue = get_queue(farm_id=farm_id, queue_id=queue_id)print("Starting test to download outputs...")try:download_start = time.perf_counter()output_downloader = OutputDownloader(s3_settings=queue.jobAttachmentSettings,farm_id=farm_id,queue_id=queue_id,job_id=job_id,)summary_statistics = output_downloader.download_job_output(on_downloading_files=mock_on_downloading_files)print(f"Download Summary Statistics:\n{summary_statistics}")print(f"Finished downloading after {time.perf_counter() - download_start} seconds")except AssetSyncCancelledError as asce:print(f"AssetSyncCancelledError: {asce}")print(f"payload: {asce.summary_statistics}")print(f"\nTotal test runtime: {time.perf_counter() - start_time}")global main_terminatedmain_terminated = True
2
27
3
117
6
111
146
111
farm_id,queue_id,job_id
['queue', 'download_start', 'output_downloader', 'summary_statistics', 'start_time', 'main_terminated']
None
{"Assign": 6, "Expr": 7, "Try": 1}
13
36
13
["time.perf_counter", "get_queue", "print", "time.perf_counter", "OutputDownloader", "output_downloader.download_job_output", "print", "print", "time.perf_counter", "print", "print", "print", "time.perf_counter"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.run"]
The function (test_download_outputs) is defined within the public class called public. It starts at line 111 and ends at line 146, contains 27 lines of code, and has a cyclomatic complexity of 2. It takes 3 parameters (farm_id, queue_id, job_id) and does not return a value. It makes 13 function calls, which are ["time.perf_counter", "get_queue", "print", "time.perf_counter", "OutputDownloader", "output_downloader.download_job_output", "print", "print", "time.perf_counter", "print", "print", "print", "time.perf_counter"], and has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.run"].
aws-deadline_deadline-cloud
public
public
0
0
mock_on_downloading_files
def mock_on_downloading_files(metadata):print(metadata)return mock_on_cancellation_check()
1
3
1
13
0
149
151
149
metadata
[]
Returns
{"Expr": 1, "Return": 1}
2
3
2
["print", "mock_on_cancellation_check"]
0
[]
The function (mock_on_downloading_files) is defined within the public class called public. It starts at line 149 and ends at line 151, contains 3 lines of code, and has a cyclomatic complexity of 1. It takes 1 parameter (metadata) and returns a value. It makes 2 function calls, which are ["print", "mock_on_cancellation_check"].
aws-deadline_deadline-cloud
public
public
0
0
mock_on_cancellation_check
def mock_on_cancellation_check():return continue_reporting
1
2
0
6
0
154
155
154
[]
Returns
{"Return": 1}
0
2
0
[]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.mock_on_downloading_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.mock_on_preparing_to_submit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.mock_on_uploading_assets"]
The function (mock_on_cancellation_check) is defined within the public class called public. It starts at line 154 and ends at line 155, contains 2 lines of code, and has a cyclomatic complexity of 1. It takes no parameters and returns a value. It has 3 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.mock_on_downloading_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.mock_on_preparing_to_submit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.mock_on_uploading_assets"].
aws-deadline_deadline-cloud
public
public
0
0
wait_for_cancellation_input
def wait_for_cancellation_input():while not main_terminated:ch = input()if ch == "k":set_cancelled()break
3
6
0
22
1
158
163
158
['ch']
None
{"Assign": 1, "Expr": 1, "If": 1, "While": 1}
2
6
2
["input", "set_cancelled"]
0
[]
The function (wait_for_cancellation_input) is defined within the public class called public. It starts at line 158 and ends at line 163, contains 6 lines of code, and has a cyclomatic complexity of 3. It takes no parameters and does not return a value. It makes 2 function calls, which are ["input", "set_cancelled"].
aws-deadline_deadline-cloud
public
public
0
0
set_cancelled
def set_cancelled():global continue_reportingcontinue_reporting = Falseprint("Canceled the process.")
1
4
0
13
1
166
169
166
['continue_reporting']
None
{"Assign": 1, "Expr": 1}
1
4
1
["print"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.wait_for_cancellation_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.wait_for_cancellation_input"]
The function (set_cancelled) is defined within the public class called public. It starts at line 166 and ends at line 169, contains 4 lines of code, and has a cyclomatic complexity of 1. It takes no parameters and does not return a value. It makes 1 function call, which is ["print"], and has 2 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.wait_for_cancellation_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.wait_for_cancellation_input"].
aws-deadline_deadline-cloud
public
public
0
0
run_test
def run_test():parser = argparse.ArgumentParser()parser.add_argument("-n", "--num_files", type=int, required=True)parser.add_argument("-f", "--file_permission", required=False, type=str, default="FULL_CONTROL")parser.add_argument("-d", "--dir_permission", required=False, type=str, default="FULL_CONTROL")parser.add_argument("-u", "--target_user", required=True, type=str)parser.add_argument("-du", "--disjoint_user", required=True, type=str)parser.add_argument("-ex","--use_extended_paths",action="store_true",help="Use extended-length file paths (\\\\?\\)",)args = parser.parse_args()num_files = args.num_filesfile_permission = WindowsPermissionEnum(args.file_permission.upper())dir_permission = WindowsPermissionEnum(args.dir_permission.upper())with TemporaryDirectory() as temp_root_dir:print(f"Created a temporary directory for the test: {temp_root_dir}")print("Creating temporary files...")files = []for i in range(0, num_files):sub_dir = Path(temp_root_dir) / "sub_directory"sub_dir.mkdir(parents=True, exist_ok=True)if i < num_files / 2:file_path = Path(temp_root_dir) / f"test{i}.txt"else:file_path = sub_dir / f"test{i}.txt"if not os.path.exists(file_path):with file_path.open("w", encoding="utf-8") as f:f.write(f"test: {i}")file_path_str = (_to_extended_path(file_path) if args.use_extended_paths else str(file_path))files.append(str(file_path_str))print("Temporary files created.")print("Running test: Setting file permissions...")start_time = time.perf_counter()fs_permission_settings = WindowsFileSystemPermissionSettings(os_user=args.target_user,dir_mode=dir_permission,file_mode=file_permission,)_set_fs_permission_for_windows(file_paths=files,local_root=temp_root_dir,fs_permission_settings=fs_permission_settings,)print("File permissions set.")print(f"Total running time for {num_files} files: {time.perf_counter() - start_time}")print("Checking file permissions...")for path in files:assert check_file_permission(path, args.target_user) == (True, True)assert check_file_permission(path, args.disjoint_user) == (False, False)print("Verified that file permissions are correctly set.")print(f"Cleaned up the temporary directory: {temp_root_dir}")
6
56
0
390
11
61
125
61
['file_permission', 'parser', 'args', 'sub_dir', 'fs_permission_settings', 'file_path_str', 'dir_permission', 'files', 'num_files', 'file_path', 'start_time']
None
{"Assign": 12, "Expr": 19, "For": 2, "If": 2, "With": 2}
39
65
39
["argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.parse_args", "WindowsPermissionEnum", "args.file_permission.upper", "WindowsPermissionEnum", "args.dir_permission.upper", "TemporaryDirectory", "print", "print", "range", "Path", "sub_dir.mkdir", "Path", "os.path.exists", "file_path.open", "f.write", "_to_extended_path", "str", "files.append", "str", "print", "print", "time.perf_counter", "WindowsFileSystemPermissionSettings", "_set_fs_permission_for_windows", "print", "print", "time.perf_counter", "print", "check_file_permission", "check_file_permission", "print", "print"]
5
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3720482_torproject_chutney.lib.chutney.TorNet_py.runConfigFile", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982589_pysmt_pysmt.examples.efsmt_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.shell.train_py.run_train_and_test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94654328_simonsobs_sotodlib.tests.test_pmat_py.PmatTest.test_pmat_healpix", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94654328_simonsobs_sotodlib.tests.test_pmat_py.PmatTest.test_pmat_rectpix"]
The public function (run_test) starts at line 61 and ends at line 125. It contains 56 lines of code and has a cyclomatic complexity of 6. It takes no parameters and does not return a value. It makes 39 function calls: ["argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.parse_args", "WindowsPermissionEnum", "args.file_permission.upper", "WindowsPermissionEnum", "args.dir_permission.upper", "TemporaryDirectory", "print", "print", "range", "Path", "sub_dir.mkdir", "Path", "os.path.exists", "file_path.open", "f.write", "_to_extended_path", "str", "files.append", "str", "print", "print", "time.perf_counter", "WindowsFileSystemPermissionSettings", "_set_fs_permission_for_windows", "print", "print", "time.perf_counter", "print", "check_file_permission", "check_file_permission", "print", "print"]. It is called by 5 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3720482_torproject_chutney.lib.chutney.TorNet_py.runConfigFile", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982589_pysmt_pysmt.examples.efsmt_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.shell.train_py.run_train_and_test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94654328_simonsobs_sotodlib.tests.test_pmat_py.PmatTest.test_pmat_healpix", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94654328_simonsobs_sotodlib.tests.test_pmat_py.PmatTest.test_pmat_rectpix"].
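The run_test body above drives everything from argparse flags. A minimal sketch of the same flag layout (with hypothetical user names "alice" and "bob", and an explicit argv list so it runs without a real command line) shows how the required options and the `store_true` switch parse:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-n", "--num_files", type=int, required=True)
parser.add_argument("-u", "--target_user", required=True, type=str)
parser.add_argument("-du", "--disjoint_user", required=True, type=str)
parser.add_argument("-ex", "--use_extended_paths", action="store_true")

# parse_args accepts an explicit argv list, which makes the CLI testable
# without touching sys.argv.
args = parser.parse_args(["-n", "4", "-u", "alice", "-du", "bob", "-ex"])
print(args.num_files, args.use_extended_paths)  # 4 True
```

Passing an argv list to `parse_args` is also how the script's own flag handling could be unit-tested.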
aws-deadline_deadline-cloud
public
public
0
0
check_file_permission
def check_file_permission(file_path, username) -> Tuple[bool, bool]:
    # Get the file's security information
    sd = win32security.GetFileSecurity(file_path, win32security.DACL_SECURITY_INFORMATION)
    # Get the discretionary access control list (DACL)
    dacl = sd.GetSecurityDescriptorDacl()
    # Lookup the user's SID (Security Identifier)
    sid, _, _ = win32security.LookupAccountName("", username)
    # Trustee
    trustee = {
        "TrusteeForm": win32security.TRUSTEE_IS_SID,
        "TrusteeType": win32security.TRUSTEE_IS_USER,
        "Identifier": sid,
    }
    # Get effective rights
    result = dacl.GetEffectiveRightsFromAcl(trustee)
    # Return a tuple of (has read access, has write access)
    return (bool(result & con.FILE_GENERIC_READ), bool(result & con.FILE_GENERIC_WRITE))
1
11
2
95
4
128
149
128
file_path,username
['dacl', 'result', 'trustee', 'sd']
Tuple[bool, bool]
{"Assign": 5, "Return": 1}
6
22
6
["win32security.GetFileSecurity", "sd.GetSecurityDescriptorDacl", "win32security.LookupAccountName", "dacl.GetEffectiveRightsFromAcl", "bool", "bool"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.set_file_permission_for_windows_py.run_test"]
The public function (check_file_permission) starts at line 128 and ends at line 149. It contains 11 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (file_path, username) and returns a Tuple[bool, bool]. It makes 6 function calls: ["win32security.GetFileSecurity", "sd.GetSecurityDescriptorDacl", "win32security.LookupAccountName", "dacl.GetEffectiveRightsFromAcl", "bool", "bool"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.set_file_permission_for_windows_py.run_test"].
aws-deadline_deadline-cloud
public
public
0
0
_to_extended_path
def _to_extended_path(path: Path) -> str:
    # Convert to absolute and apply extended-length prefix
    return f"\\\\?\\{path.resolve()}"
1
2
1
12
0
152
154
152
path
[]
str
{"Return": 1}
1
3
1
["path.resolve"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.set_file_permission_for_windows_py.run_test"]
The public function (_to_extended_path) starts at line 152 and ends at line 154. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (path) and returns a str. It makes 1 function call: ["path.resolve"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.set_file_permission_for_windows_py.run_test"].
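As a quick illustration of the helper above, a minimal sketch (re-declaring the function standalone, since the row only stores its body) shows the Windows extended-length prefix `\\?\` applied to a resolved path; the prefix lifts the legacy 260-character MAX_PATH limit for Windows APIs:

```python
from pathlib import Path

def to_extended_path(path: Path) -> str:
    # Resolve to an absolute path, then prepend the extended-length
    # prefix \\?\ (written as \\\\?\\ in Python source escaping).
    return f"\\\\?\\{path.resolve()}"

result = to_extended_path(Path("."))
print(result.startswith("\\\\?\\"))  # prints True
```

The prefix is purely textual, so this sketch runs on any platform, though it only has an effect when the string is handed to Windows file APIs.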
aws-deadline_deadline-cloud
public
public
0
0
run
def run():
    parser = argparse.ArgumentParser()
    parser.add_argument("-f", "--farm-id", type=str, help="Deadline Farm to download assets from.", required=True)
    parser.add_argument("-q", "--queue-id", type=str, help="Deadline Queue to download assets from.", required=True)
    parser.add_argument("-j", "--job-id", type=str, help="Deadline Job to download assets from.", required=True)
    parser.add_argument(
        "-s",
        "--step-ids",
        nargs="+",
        type=str,
        help="IDs of steps to sync inputs from",
        required=False,
    )
    args = parser.parse_args()
    farm_id = args.farm_id
    queue_id = args.queue_id
    job_id = args.job_id
    step_ids = args.step_ids

    test_sync_inputs(farm_id, queue_id, job_id, step_ids)
1
25
0
133
6
29
55
29
['parser', 'args', 'farm_id', 'step_ids', 'job_id', 'queue_id']
None
{"Assign": 6, "Expr": 5}
7
27
7
["argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.parse_args", "test_sync_inputs"]
281
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.namespace_py.ListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.hardware_py.ShowHardwareCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.tasks_py.TaskListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.MainLoop.eval", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.state_py._init_vintageous", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ReloadVintageousSettings.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ResetVintageous.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.update", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_datetime_related_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_db_queries_py.TestCase.clear_bucket", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_relations_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_save_update_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_uniqueness_py.TestCase.prepare_testbed", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3632606_shaunduncan_helga.helga.bin.helga_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.examples.csv.csv_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.cli_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_down_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_up_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_the_status_of_multiple_services_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.fuser_py.it_shows_help_given_no_arguments", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.subprocess_py.assert_command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.test.integration.test_app_runner_py.TestAppRunner.test_app_runner", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.wampy.cli.run_py.main"]
The public function (run) starts at line 29 and ends at line 55. It contains 25 lines of code and has a cyclomatic complexity of 1. It takes no parameters and does not return a value. It makes 7 function calls: ["argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.add_argument", "parser.parse_args", "test_sync_inputs"]. It is called by 281 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.namespace_py.ListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.hardware_py.ShowHardwareCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.tasks_py.TaskListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.MainLoop.eval", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.state_py._init_vintageous", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ReloadVintageousSettings.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ResetVintageous.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.update", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_datetime_related_py.TestCase.prepare_testbed", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_db_queries_py.TestCase.clear_bucket", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_relations_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_save_update_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_uniqueness_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3632606_shaunduncan_helga.helga.bin.helga_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.examples.csv.csv_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.cli_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_down_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_up_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_the_status_of_multiple_services_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.fuser_py.it_shows_help_given_no_arguments", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.subprocess_py.assert_command", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.test.integration.test_app_runner_py.TestAppRunner.test_app_runner", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.wampy.cli.run_py.main"].
aws-deadline_deadline-cloud
public
public
0
0
test_sync_inputs
def test_sync_inputs(
    farm_id: str,
    queue_id: str,
    job_id: str,
    step_ids: list[str],
):
    """Downloads all inputs for a given job, and the outputs of the provided steps within the job."""
    with TemporaryDirectory() as temp_root_dir:
        print(f"Created a temporary directory for the test: {temp_root_dir}\n")
        queue = get_queue(farm_id=farm_id, queue_id=queue_id)
        job = get_job(farm_id=farm_id, queue_id=queue_id, job_id=job_id)

        print("Starting test to sync inputs...\n")
        asset_sync = AssetSync(farm_id=farm_id)
        download_start = time.perf_counter()
        (summary_statistics, local_roots) = asset_sync.sync_inputs(
            s3_settings=queue.jobAttachmentSettings,
            attachments=job.attachments,
            queue_id=queue_id,
            job_id=job_id,
            session_dir=pathlib.Path(temp_root_dir),
            step_dependencies=step_ids,
        )
        print(f"Download Summary Statistics:\n{summary_statistics}")
        print(f"Finished downloading after {time.perf_counter() - download_start} seconds, returned:")
        pprint(local_roots)

        print("\nListing files in the temporary directory:")
        for pathmapping in local_roots:
            all_files = _get_files_list_recursively(pathlib.Path(pathmapping["destination_path"]))
            for file in all_files:
                print(file)
    print(f"\nCleaned up the temporary directory: {temp_root_dir}")
3
32
4
178
5
58
99
58
farm_id,queue_id,job_id,step_ids
['queue', 'all_files', 'download_start', 'job', 'asset_sync']
None
{"Assign": 6, "Expr": 9, "For": 2, "With": 1}
18
42
18
["TemporaryDirectory", "print", "get_queue", "get_job", "print", "AssetSync", "time.perf_counter", "asset_sync.sync_inputs", "pathlib.Path", "print", "print", "time.perf_counter", "pprint", "print", "_get_files_list_recursively", "pathlib.Path", "print", "print"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.sync_inputs_with_step_deps_py.run"]
The public function (test_sync_inputs) starts at line 58 and ends at line 99. It contains 32 lines of code and has a cyclomatic complexity of 3. It takes 4 parameters (farm_id, queue_id, job_id, step_ids) and does not return a value. It makes 18 function calls: ["TemporaryDirectory", "print", "get_queue", "get_job", "print", "AssetSync", "time.perf_counter", "asset_sync.sync_inputs", "pathlib.Path", "print", "print", "time.perf_counter", "pprint", "print", "_get_files_list_recursively", "pathlib.Path", "print", "print"]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.sync_inputs_with_step_deps_py.run"].
aws-deadline_deadline-cloud
public
public
0
0
_get_files_list_recursively
def _get_files_list_recursively(directory: pathlib.Path):
    files_list = []
    for file in directory.iterdir():
        if file.is_file():
            files_list.append(file)
    for subdirectory in directory.iterdir():
        if subdirectory.is_dir():
            subdirectory_files = _get_files_list_recursively(subdirectory)
            files_list.extend(subdirectory_files)
    return files_list
5
10
1
65
2
102
114
102
directory
['files_list', 'subdirectory_files']
Returns
{"Assign": 2, "Expr": 2, "For": 2, "If": 2, "Return": 1}
7
13
7
["directory.iterdir", "file.is_file", "files_list.append", "directory.iterdir", "subdirectory.is_dir", "_get_files_list_recursively", "files_list.extend"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.sync_inputs_with_step_deps_py._get_files_list_recursively", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.sync_inputs_with_step_deps_py.test_sync_inputs"]
The public function (_get_files_list_recursively) starts at line 102 and ends at line 114. It contains 10 lines of code and has a cyclomatic complexity of 5. It takes 1 parameter (directory) and returns a value (the list of files found). It makes 7 function calls: ["directory.iterdir", "file.is_file", "files_list.append", "directory.iterdir", "subdirectory.is_dir", "_get_files_list_recursively", "files_list.extend"]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.sync_inputs_with_step_deps_py._get_files_list_recursively", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.sync_inputs_with_step_deps_py.test_sync_inputs"].
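The recursive traversal above can also be expressed with `pathlib.Path.rglob`. A small sketch (using hypothetical file names in a throwaway directory) illustrates the equivalent behavior:

```python
import pathlib
import tempfile

def get_files_list_recursively(directory: pathlib.Path):
    # Equivalent to the recursive helper: rglob("*") walks the whole
    # tree and is_file() filters out directories.
    return [p for p in directory.rglob("*") if p.is_file()]

with tempfile.TemporaryDirectory() as root:
    base = pathlib.Path(root)
    (base / "sub").mkdir()
    (base / "a.txt").write_text("a")
    (base / "sub" / "b.txt").write_text("b")
    names = sorted(p.name for p in get_files_list_recursively(base))
    print(names)  # prints ['a.txt', 'b.txt']
```

Unlike the explicit recursion, `rglob` handles the directory descent internally, which also avoids iterating each directory twice.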
aws-deadline_deadline-cloud
public
public
0
0
run
def run():
    print(MESSAGE_HOW_TO_CANCEL)
    start_time = time.perf_counter()
    parser = argparse.ArgumentParser()
    parser.add_argument("-f", "--farm-id", type=str, help="Deadline Farm you want to submit to.", required=True)
    parser.add_argument("-q", "--queue-id", type=str, help="Deadline Queue you want to submit to.", required=True)
    args = parser.parse_args()
    farm_id = args.farm_id
    queue_id = args.queue_id

    print("Setting up the test...")
    files = []
    root_path = pathlib.Path("/tmp/test_submit")
    root_path.mkdir(parents=True, exist_ok=True)

    if NUM_TINY_FILES > 0:
        for i in range(0, NUM_TINY_FILES):
            file_path = root_path / f"tiny_test{i}.txt"
            if not os.path.exists(file_path):
                with file_path.open("wb") as f:
                    f.write(os.urandom(2 * (1024**2)))  # 2 MB files
            files.append(str(file_path))

    # Make small files
    if NUM_SMALL_FILES > 0:
        for i in range(0, NUM_SMALL_FILES):
            file_path = root_path / f"small_test{i}.txt"
            if not os.path.exists(file_path):
                with file_path.open("wb") as f:
                    f.write(os.urandom(10 * (1024**2)))  # 10 MB files
            files.append(str(file_path))

    # Make medium-sized files
    if NUM_MEDIUM_FILES > 0:
        for i in range(0, NUM_MEDIUM_FILES):
            file_path = root_path / f"medium_test{i}.txt"
            if not os.path.exists(file_path):
                with file_path.open("wb") as f:
                    f.write(os.urandom(100 * (1024**2)))  # 100 MB files
            files.append(str(file_path))

    # Make large files
    if NUM_LARGE_FILES > 0:
        for i in range(0, NUM_LARGE_FILES):
            file_path = root_path / f"large_test{i}.txt"
            if not os.path.exists(file_path):
                with file_path.open("ab") as f:
                    _create_large_file_with_chunks(file_path, 20 * (1024**3), 10**9)
            files.append(str(file_path))

    queue = get_queue(farm_id=farm_id, queue_id=queue_id)
    asset_manager = S3AssetManager(farm_id=farm_id, queue_id=queue_id, job_attachment_settings=queue.jobAttachmentSettings)

    print("\nStarting test...")
    start = time.perf_counter()
    try:
        print("\nStart hashing...")
        upload_group = asset_manager.prepare_paths_for_upload(".", files, [root_path / "outputs"], [])
        (summary_statistics_hashing, manifests) = asset_manager.hash_assets_and_create_manifest(
            asset_groups=upload_group.asset_groups,
            total_input_files=upload_group.total_input_files,
            total_input_bytes=upload_group.total_input_bytes,
            on_preparing_to_submit=mock_on_preparing_to_submit,
        )
        print(f"Hashing Summary Statistics:\n{summary_statistics_hashing}")

        print("\nStart uploading...")
        (summary_statistics_upload, attachment_settings) = asset_manager.upload_assets(
            manifests, on_uploading_assets=mock_on_uploading_assets
        )
        print(f"Upload Summary Statistics:\n{summary_statistics_upload}")

        total = time.perf_counter() - start
        print(f"Finished uploading after {total} seconds, created these attachment settings:\n{attachment_settings.to_dict()}")
    except AssetSyncCancelledError as asce:
        print(f"AssetSyncCancelledError: {asce}")
        print(f"payload:\n{asce.summary_statistics}")

    print(f"\nTotal test runtime: {time.perf_counter() - start_time}")
    global main_terminated
    main_terminated = True
14
78
0
546
14
44
139
44
['queue', 'parser', 'args', 'asset_manager', 'start', 'total', 'root_path', 'farm_id', 'files', 'upload_group', 'file_path', 'start_time', 'main_terminated', 'queue_id']
None
{"Assign": 19, "Expr": 22, "For": 4, "If": 8, "Try": 1, "With": 4}
54
96
54
["print", "time.perf_counter", "argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.parse_args", "print", "pathlib.Path", "root_path.mkdir", "range", "os.path.exists", "file_path.open", "f.write", "os.urandom", "files.append", "str", "range", "os.path.exists", "file_path.open", "f.write", "os.urandom", "files.append", "str", "range", "os.path.exists", "file_path.open", "f.write", "os.urandom", "files.append", "str", "range", "os.path.exists", "file_path.open", "_create_large_file_with_chunks", "files.append", "str", "get_queue", "S3AssetManager", "print", "time.perf_counter", "print", "asset_manager.prepare_paths_for_upload", "asset_manager.hash_assets_and_create_manifest", "print", "print", "asset_manager.upload_assets", "print", "time.perf_counter", "print", "attachment_settings.to_dict", "print", "print", "print", "time.perf_counter"]
281
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.namespace_py.ListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.hardware_py.ShowHardwareCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.tasks_py.TaskListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.MainLoop.eval", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.state_py._init_vintageous", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ReloadVintageousSettings.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ResetVintageous.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.update", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_datetime_related_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_db_queries_py.TestCase.clear_bucket", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_relations_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_save_update_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_uniqueness_py.TestCase.prepare_testbed", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3632606_shaunduncan_helga.helga.bin.helga_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.examples.csv.csv_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.cli_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_down_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_up_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_the_status_of_multiple_services_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.fuser_py.it_shows_help_given_no_arguments", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.subprocess_py.assert_command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.test.integration.test_app_runner_py.TestAppRunner.test_app_runner", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.wampy.cli.run_py.main"]
The public function (run) starts at line 44 and ends at line 139. It contains 78 lines of code and has a cyclomatic complexity of 14. It takes no parameters and does not return a value. It makes 54 function calls: ["print", "time.perf_counter", "argparse.ArgumentParser", "parser.add_argument", "parser.add_argument", "parser.parse_args", "print", "pathlib.Path", "root_path.mkdir", "range", "os.path.exists", "file_path.open", "f.write", "os.urandom", "files.append", "str", "range", "os.path.exists", "file_path.open", "f.write", "os.urandom", "files.append", "str", "range", "os.path.exists", "file_path.open", "f.write", "os.urandom", "files.append", "str", "range", "os.path.exists", "file_path.open", "_create_large_file_with_chunks", "files.append", "str", "get_queue", "S3AssetManager", "print", "time.perf_counter", "print", "asset_manager.prepare_paths_for_upload", "asset_manager.hash_assets_and_create_manifest", "print", "print", "asset_manager.upload_assets", "print", "time.perf_counter", "print", "attachment_settings.to_dict", "print", "print", "print", "time.perf_counter"]. It is called by 281 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.namespace_py.ListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.hardware_py.ShowHardwareCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.tasks_py.TaskListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.MainLoop.eval", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.state_py._init_vintageous", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ReloadVintageousSettings.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ResetVintageous.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.update", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_datetime_related_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_db_queries_py.TestCase.clear_bucket", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_relations_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_save_update_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_uniqueness_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3632606_shaunduncan_helga.helga.bin.helga_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.examples.csv.csv_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.cli_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_down_json", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_up_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_the_status_of_multiple_services_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.fuser_py.it_shows_help_given_no_arguments", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.subprocess_py.assert_command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.test.integration.test_app_runner_py.TestAppRunner.test_app_runner", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.wampy.cli.run_py.main"].
aws-deadline_deadline-cloud
public
public
0
0
_create_large_file_with_chunks
def _create_large_file_with_chunks(file_path: str, total_size: int, chunk_size: int) -> None:
    """Creates a large file of a given total size by writing in chunks with random data.

    It prevents MemoryError by dividing the size into manageable chunks and writing
    each chunk sequentially.
    """
    with open(file_path, "wb") as f:
        num_chunks = total_size // chunk_size
        for _ in range(num_chunks):
            f.write(os.urandom(chunk_size))
        remaining = total_size % chunk_size
        if remaining > 0:
            f.write(os.urandom(remaining))
3
8
3
72
2
142
154
142
file_path,total_size,chunk_size
['remaining', 'num_chunks']
None
{"Assign": 2, "Expr": 3, "For": 1, "If": 1, "With": 1}
6
13
6
["open", "range", "f.write", "os.urandom", "f.write", "os.urandom"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.run"]
The function (_create_large_file_with_chunks) is defined within the public class called public. The function starts at line 142 and ends at line 154. It contains 8 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters (file_path, total_size, chunk_size) and does not return any value. It makes 6 function calls inside, which are ["open", "range", "f.write", "os.urandom", "f.write", "os.urandom"], and it has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.run"].
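The record above captures a chunked-write pattern: memory usage stays bounded by the chunk size rather than the total file size. A minimal standalone sketch of the same technique (the helper name and the 10 KiB/4 KiB sizes are illustrative, not from the dataset):

```python
import os
import tempfile

def create_large_file_with_chunks(file_path: str, total_size: int, chunk_size: int) -> None:
    """Write `total_size` random bytes to `file_path` in `chunk_size` pieces.

    Only one chunk is held in memory at a time, which is what avoids a
    MemoryError when `total_size` is very large.
    """
    with open(file_path, "wb") as f:
        for _ in range(total_size // chunk_size):
            f.write(os.urandom(chunk_size))
        remaining = total_size % chunk_size
        if remaining > 0:
            # Write the final partial chunk, if the size is not an exact multiple.
            f.write(os.urandom(remaining))

# Demonstration: a 10 KiB file written in 4 KiB chunks (2 full chunks + 2 KiB tail).
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "blob.bin")
    create_large_file_with_chunks(path, 10 * 1024, 4 * 1024)
    size = os.path.getsize(path)
```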
aws-deadline_deadline-cloud
public
public
0
0
mock_on_preparing_to_submit
def mock_on_preparing_to_submit(metadata):
    print(metadata)
    return mock_on_cancellation_check()
1
3
1
13
0
157
159
157
metadata
[]
Returns
{"Expr": 1, "Return": 1}
2
3
2
["print", "mock_on_cancellation_check"]
0
[]
The function (mock_on_preparing_to_submit) is defined within the public class called public. The function starts at line 157 and ends at line 159. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (metadata) and returns a value. It makes 2 function calls inside, which are ["print", "mock_on_cancellation_check"].
aws-deadline_deadline-cloud
public
public
0
0
mock_on_uploading_assets
def mock_on_uploading_assets(metadata):
    print(metadata)
    return mock_on_cancellation_check()
1
3
1
13
0
162
164
162
metadata
[]
Returns
{"Expr": 1, "Return": 1}
2
3
2
["print", "mock_on_cancellation_check"]
0
[]
The function (mock_on_uploading_assets) is defined within the public class called public. The function starts at line 162 and ends at line 164. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (metadata) and returns a value. It makes 2 function calls inside, which are ["print", "mock_on_cancellation_check"].
aws-deadline_deadline-cloud
public
public
0
0
mock_on_cancellation_check
def mock_on_cancellation_check():
    return continue_reporting
1
2
0
6
0
167
168
167
[]
Returns
{"Return": 1}
0
2
0
[]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.mock_on_downloading_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.mock_on_preparing_to_submit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.mock_on_uploading_assets"]
The function (mock_on_cancellation_check) is defined within the public class called public. The function starts at line 167 and ends at line 168. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters and returns a value. It has 3 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.mock_on_downloading_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.mock_on_preparing_to_submit", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.mock_on_uploading_assets"].
aws-deadline_deadline-cloud
public
public
0
0
wait_for_cancellation_input
def wait_for_cancellation_input():
    while not main_terminated:
        ch = input()
        if ch == "k":
            set_cancelled()
            break
3
6
0
22
1
171
176
171
['ch']
None
{"Assign": 1, "Expr": 1, "If": 1, "While": 1}
2
6
2
["input", "set_cancelled"]
0
[]
The function (wait_for_cancellation_input) is defined within the public class called public. The function starts at line 171 and ends at line 176. It contains 6 lines of code and has a cyclomatic complexity of 3. It takes no parameters and does not return any value. It makes 2 function calls inside, which are ["input", "set_cancelled"].
aws-deadline_deadline-cloud
public
public
0
0
set_cancelled
def set_cancelled():
    global continue_reporting
    continue_reporting = False
    print("Canceled the process.")
1
4
0
13
1
179
182
179
['continue_reporting']
None
{"Assign": 1, "Expr": 1}
1
4
1
["print"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.wait_for_cancellation_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.wait_for_cancellation_input"]
The function (set_cancelled) is defined within the public class called public. The function starts at line 179 and ends at line 182. It contains 4 lines of code and has a cyclomatic complexity of 1. It takes no parameters and does not return any value. It makes 1 function call inside, which is ["print"], and it has 2 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.download_cancel_test_py.wait_for_cancellation_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.upload_cancel_test_py.wait_for_cancellation_input"].
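The records for mock_on_cancellation_check, wait_for_cancellation_input, and set_cancelled together implement a shared-flag cancellation pattern: a module-level boolean is flipped by one function and polled by progress callbacks. A minimal self-contained sketch of that pattern (the interactive input() loop from wait_for_cancellation_input is omitted so the example runs non-interactively, and the print side effect is dropped):

```python
# Module-level flag polled by progress callbacks on every report.
continue_reporting = True

def set_cancelled() -> None:
    """Flip the shared flag; later cancellation checks report False."""
    global continue_reporting
    continue_reporting = False

def mock_on_cancellation_check() -> bool:
    # Callbacks return this value to tell the caller whether to keep going.
    return continue_reporting

# Before cancellation the check allows reporting to continue;
# after set_cancelled() it tells the caller to stop.
before = mock_on_cancellation_check()
set_cancelled()
after = mock_on_cancellation_check()
```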
aws-deadline_deadline-cloud
public
public
0
0
profile
def profile(
    name: str,
    farm_id: str,
    queue_id: str,
    job_bundle: Path,
    output_dir: Path,
    output_format: OutputFormat,
    parameters: Optional[dict[str, Union[int, float, str, Path]]] = None,
) -> None:
    expanded_params = [
        ["--parameter", f"{key}={str(value)}"]
        for key, value in (parameters.values() if parameters is not None else {})
    ]
    subprocess.run(
        [
            "pyinstrument",
            "--renderer",
            str(output_format.value).lower(),
            "--outfile",
            str(output_dir / f"{name}.{str(output_format.value).lower()}"),
            "--from-path",
            "deadline",
            "bundle",
            "submit",
            str(job_bundle),
            "--farm-id",
            farm_id,
            "--queue-id",
            queue_id,
            *expanded_params,
            "--yes",
        ],
        input="y\n",
        text=True,
        check=True,
    )
3
36
7
154
1
18
53
18
name,farm_id,queue_id,job_bundle,output_dir,output_format,parameters
['expanded_params']
None
{"Assign": 1, "Expr": 1}
9
36
9
["str", "parameters.values", "subprocess.run", "lower", "str", "str", "lower", "str", "str"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.86980582_aws_neuron_transformers_neuronx.src.transformers_neuronx.generation_demo_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.profiling.profiling_py.cli"]
The function (profile) is defined within the public class called public. The function starts at line 18 and ends at line 53. It contains 36 lines of code and has a cyclomatic complexity of 3. It takes 7 parameters (name, farm_id, queue_id, job_bundle, output_dir, output_format, parameters) and does not return any value. It makes 9 function calls inside, which are ["str", "parameters.values", "subprocess.run", "lower", "str", "str", "lower", "str", "str"], and it has 2 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.86980582_aws_neuron_transformers_neuronx.src.transformers_neuronx.generation_demo_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripted_tests.profiling.profiling_py.cli"].
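The profile record builds a subprocess argv list by expanding a parameters dict into repeated ["--parameter", "key=value"] pairs and splatting them in with `*expanded_params`. A small sketch of just that expansion step (note: the stored body iterates `parameters.values()` with two-variable unpacking, which would fail for a non-empty dict; `.items()` is presumably what was intended and is used here; the example dict keys are illustrative):

```python
from typing import Optional

def expand_parameters(parameters: Optional[dict]) -> list[str]:
    """Turn {"Frames": "1-10"} into ["--parameter", "Frames=1-10", ...]."""
    nested = [
        ["--parameter", f"{key}={value}"]
        for key, value in (parameters.items() if parameters is not None else [])
    ]
    # Flatten the pairs so they can be spliced into a subprocess argv list
    # with `*`, as the profile record does.
    return [arg for pair in nested for arg in pair]

args = expand_parameters({"Frames": "1-10", "Scene": "main"})
```

Keeping each flag and its value as separate argv entries (rather than one "--parameter key=value" string) is what lets subprocess pass them through without shell quoting.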
aws-deadline_deadline-cloud
public
public
0
0
cli
def cli(farm_id: str, queue_id: str, output_dir: Path, output_format: OutputFormat) -> None:
    job_bundle_dir = Path(__file__).parent.parent / "job_bundles"
    if not output_dir.is_dir():
        output_dir.mkdir(parents=True)
    profile(
        "minimal_job_bundle",
        farm_id,
        queue_id,
        job_bundle_dir / "minimal_job_bundle",
        output_dir,
        output_format,
    )
    profile(
        "with_job_attachments",
        farm_id,
        queue_id,
        job_bundle_dir / "with_job_attachments",
        output_dir,
        output_format,
    )
2
20
4
83
1
68
87
68
farm_id,queue_id,output_dir,output_format
['job_bundle_dir']
None
{"Assign": 1, "Expr": 3, "If": 1}
11
20
11
["Path", "output_dir.is_dir", "output_dir.mkdir", "profile", "profile", "click.command", "click.option", "click.option", "click.option", "click.option", "click.Choice"]
154
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_resources", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_keys_py.test_change_host_key", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.change_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.create_namespaces", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.test_change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.try_change_one_namespace_lb_group_no_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_ns_visibility_py.test_change_namespace_visibility", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_force_tls_py.TestForceTls.test_force_tls", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_all_hosts_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_discovery_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_invalid_nqn", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_subsys_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_no_access", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_subsystem_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_wrong_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_junk_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_nsid", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_uuid"]
The function (cli) defined within the public class called public.The function start at line 68 and ends at 87. It contains 20 lines of code and it has a cyclomatic complexity of 2. It takes 4 parameters, represented as [68.0] and does not return any value. It declares 11.0 functions, It has 11.0 functions called inside which are ["Path", "output_dir.is_dir", "output_dir.mkdir", "profile", "profile", "click.command", "click.option", "click.option", "click.option", "click.option", "click.Choice"], It has 154.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_resources", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_keys_py.test_change_host_key", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.change_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.create_namespaces", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.test_change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.try_change_one_namespace_lb_group_no_listeners", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_ns_visibility_py.test_change_namespace_visibility", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_force_tls_py.TestForceTls.test_force_tls", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_all_hosts_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_discovery_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_invalid_nqn", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_subsys_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_no_access", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_subsystem_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_wrong_namespace", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_junk_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_nsid", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_uuid"].
aws-deadline_deadline-cloud
public
public
0
0
setup_install_builder
def setup_install_builder(
    workdir: Path,
    install_builder_location: Optional[Path],
    license_file_path: Optional[Path],
    install_builder_s3_bucket: Optional[str] = None,
    install_builder_s3_key: Optional[str] = None,
) -> Path:
    """Ensure installbuilder is installed in some way and return the path
    to the installation directory.

    The method of installing/finding installbuilder is based on the inputs:
    - If `install_builder_location` is provided, look for an installbuilder installation at that path
    - Else if `install_builder_s3_bucket` is provided, attempt to download and unpack install builder from that bucket
    - Else search for it at the default installation path (handy for dev mode)
    """
    if install_builder_location is not None:
        selection = InstallBuilderSelection.from_path(install_builder_location)
    elif install_builder_s3_bucket is not None:
        selection = InstallBuilderSelection.from_s3(
            install_builder_s3_bucket, workdir, install_builder_s3_key
        )
    else:
        selection = InstallBuilderSelection.from_search()
    install_builder_path = selection.resolve_install_builder_installation(workdir)
    if platform.system() == "Windows":
        binary_name = "builder.exe"
    else:
        binary_name = "builder"
    if (
        not install_builder_path.is_dir()
        or not (install_builder_path / "bin" / binary_name).is_file()
    ):
        raise FileNotFoundError(
            f"InstallBuilder path '{install_builder_path}' must be a directory containing 'bin/{binary_name}'."
        )
    if license_file_path is not None:
        shutil.copy(license_file_path, install_builder_path / "license.xml")
    return install_builder_path
7
30
5
156
3
26
68
26
workdir,install_builder_location,license_file_path,install_builder_s3_bucket,install_builder_s3_key
['selection', 'install_builder_path', 'binary_name']
Path
{"Assign": 6, "Expr": 2, "If": 5, "Return": 1}
9
43
9
["InstallBuilderSelection.from_path", "InstallBuilderSelection.from_s3", "InstallBuilderSelection.from_search", "selection.resolve_install_builder_installation", "platform.system", "install_builder_path.is_dir", "is_file", "FileNotFoundError", "shutil.copy"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.main"]
The function (setup_install_builder) is defined within the public class called public. The function starts at line 26 and ends at line 68. It contains 30 lines of code and has a cyclomatic complexity of 7. It takes 5 parameters (workdir, install_builder_location, license_file_path, install_builder_s3_bucket, install_builder_s3_key) and returns a value of type Path. It makes 9 function calls inside, which are ["InstallBuilderSelection.from_path", "InstallBuilderSelection.from_s3", "InstallBuilderSelection.from_search", "selection.resolve_install_builder_installation", "platform.system", "install_builder_path.is_dir", "is_file", "FileNotFoundError", "shutil.copy"], and it has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.main"].
aws-deadline_deadline-cloud
public
public
0
0
build_installer
def build_installer(
    workdir: Path,
    component_file_path: Path,
    install_builder_location: Path,
    installer_platform: str,
    dev: bool,
) -> Path:
    """Actually build the installer"""
    if install_builder_location is None:
        raise FileNotFoundError(
            "Could not find a default InstallBuilder path. Please specify one with '--install-builder-location'."
        )
    if not install_builder_location.is_dir():
        raise FileNotFoundError(
            f"InstallBuilder path '{install_builder_location}' must be a directory containing 'bin/builder'."
        )
    if installer_platform == "Linux":
        installbuilder_platform = "linux-x64"
    elif installer_platform == "MacOS":
        installbuilder_platform = "osx"
    elif installer_platform == "Windows":
        installbuilder_platform = "windows-x64"
    else:
        raise ValueError(f"Unknown platform '{installer_platform}'")
    install_builder_cli = install_builder_location / "bin" / "builder"
    out_dir = workdir / "out"
    installer_version = os.getenv("INSTALLER_VERSION") if not dev else "00000000"
    if installer_version is None:
        raise ValueError("INSTALLER_VERSION environment variable must be set.")
    output = run(
        [
            install_builder_cli,
            "build",
            str(component_file_path),
            installbuilder_platform,
            "--setvars",
            f"project.outputDirectory={out_dir}",
            f"project.version={installer_version[:8]}-{datetime.today().date()}",
        ]
    )
    sys.stdout.write(
        f"{'-' * 30}\nBegin Install Builder Output\n{'-' * 30}\n"
        f"{output}\n"
        f"{'-' * 30}\nEnd Install Builder Output\n{'-' * 30}\n"
    )
    if EVALUATION_VERSION_STRING in output and not dev:
        raise EvaluationBuildError("InstallBuilder was detected using an evaluation version.")
    elif dev and EVALUATION_VERSION_STRING not in output:
        raise EvaluationBuildError(
            "InstallBuilder was not detected using an evaluation version when running a dev build. "
            "This could indicate that the error messaging when using an evaluation version has changed.\n"
            "Please check the InstallBuilder logs to confirm if the error messaging has changed from "
            f"'{EVALUATION_VERSION_STRING}' and update the build_installer.py script accordingly."
        )
    return out_dir
12
54
5
189
5
71
131
71
workdir,component_file_path,install_builder_location,installer_platform,dev
['output', 'installer_version', 'installbuilder_platform', 'out_dir', 'install_builder_cli']
Path
{"Assign": 7, "Expr": 2, "If": 8, "Return": 1}
13
61
13
["FileNotFoundError", "install_builder_location.is_dir", "FileNotFoundError", "ValueError", "os.getenv", "ValueError", "run", "str", "date", "datetime.today", "sys.stdout.write", "EvaluationBuildError", "EvaluationBuildError"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.main"]
The function (build_installer) is defined within the public class called public. The function starts at line 71 and ends at line 131. It contains 54 lines of code and has a cyclomatic complexity of 12. It takes 5 parameters (workdir, component_file_path, install_builder_location, installer_platform, dev) and returns a value of type Path. It makes 13 function calls inside, which are ["FileNotFoundError", "install_builder_location.is_dir", "FileNotFoundError", "ValueError", "os.getenv", "ValueError", "run", "str", "date", "datetime.today", "sys.stdout.write", "EvaluationBuildError", "EvaluationBuildError"], and it has 1 function calling it, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_py.main"].
aws-deadline_deadline-cloud
public
public
0
0
main
def main(
    dev: bool,
    install_builder_location: Optional[Path],
    install_builder_license_path: Optional[Path],
    install_builder_s3_bucket: Optional[str],
    install_builder_s3_key: Optional[str],
    output_dir: Optional[Path],
    cleanup: bool,
    installer_platform: str,
    installer_source_path: Path,
) -> None:
    with tempfile.TemporaryDirectory() as wd:
        workdir = Path(wd)
        print(f"cwd: {os.getcwd()}")
        print(f"working directory: {str(workdir)}")
        installer_folder = Path(__file__).absolute().parent.parent / "installer"
        components_dir = installer_folder / "components"
        try:
            installbuilder_path = setup_install_builder(
                workdir=workdir,
                install_builder_location=install_builder_location,
                license_file_path=install_builder_license_path,
                install_builder_s3_bucket=install_builder_s3_bucket,
                install_builder_s3_key=install_builder_s3_key,
            )
            installer_dir = build_installer(
                workdir=workdir,
                component_file_path=installer_source_path,
                install_builder_location=installbuilder_path,
                dev=dev,
                installer_platform=installer_platform,
            )
        except Exception:
            if cleanup:
                shutil.rmtree(components_dir)
            raise
        installer_filename = INSTALLER_FILENAMES[installer_platform]
        installer_path = installer_dir / installer_filename
        # The macOS .app installer will always be a directory, not a file.
        # Other OS installers will be files.
        if (
            not installer_path.is_dir()
            if installer_platform == "MacOS"
            else not installer_path.is_file()
        ):
            raise FileNotFoundError(
                f"Expected installer file {installer_filename} not found in {installer_dir}.\n"
                f"Found:\n\t{os.linesep.join([str(i) for i in installer_dir.iterdir()])}"
            )
        output_path = installer_filename
        if output_dir:
            output_dir.mkdir(exist_ok=True)
            output_path = output_dir / output_path
        shutil.move(installer_path, output_path)
        if cleanup:
            shutil.rmtree(components_dir)
            print(f"Deleted build directory: {components_dir}")
7
55
9
249
8
134
196
134
dev,install_builder_location,install_builder_license_path,install_builder_s3_bucket,install_builder_s3_key,output_dir,cleanup,installer_platform,installer_source_path
['output_path', 'workdir', 'components_dir', 'installer_filename', 'installer_dir', 'installer_path', 'installbuilder_path', 'installer_folder']
None
{"Assign": 9, "Expr": 7, "If": 4, "Try": 1, "With": 1}
21
63
21
["tempfile.TemporaryDirectory", "Path", "print", "os.getcwd", "print", "str", "absolute", "Path", "setup_install_builder", "build_installer", "shutil.rmtree", "installer_path.is_dir", "installer_path.is_file", "FileNotFoundError", "os.linesep.join", "str", "installer_dir.iterdir", "output_dir.mkdir", "shutil.move", "shutil.rmtree", "print"]
134
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"]
The function (main) defined within the public class called public.The function start at line 134 and ends at 196. It contains 55 lines of code and it has a cyclomatic complexity of 7. It takes 9 parameters, represented as [134.0] and does not return any value. It declares 21.0 functions, It has 21.0 functions called inside which are ["tempfile.TemporaryDirectory", "Path", "print", "os.getcwd", "print", "str", "absolute", "Path", "setup_install_builder", "build_installer", "shutil.rmtree", "installer_path.is_dir", "installer_path.is_file", "FileNotFoundError", "os.linesep.join", "str", "installer_dir.iterdir", "output_dir.mkdir", "shutil.move", "shutil.rmtree", "print"], It has 134.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16798231_kapji_capital_gains_calculator.cgt_calc.main_py.init", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_if_version_prints_version_and_stops", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_invalid_config_exits_with_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18106662_pdreker_fritz_exporter.tests.test_main_py.Test_Main.test_valid_args_run_clean", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.target_postgres.__init___py.cli", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_before_run_sql_is_executed_upon_construction", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_existing_new_rows", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_newer_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_deduplication_older_rows", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_full_table_replication", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__generative", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__nullable__missing_from_schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__column_type_change__pks__same_resulting_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__empty__enabled_config_with_messages_for_only_one_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__column_type_change__pks__nullable", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__configuration__schema", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__default_null_value__non_nullable_column", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid__table_name__stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3683022_datamill_co_target_postgres.tests.unit.test_postgres_py.test_loading__invalid_column_name"].
aws-deadline_deadline-cloud
public
public
0
0
_snake_to_kebab
def _snake_to_kebab(name: str) -> str:return name.replace("_", "-")
1
2
1
18
0
15
16
15
name
[]
str
{"Return": 1}
1
2
1
["name.replace"]
4
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py._dependency", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py._mutually_exclude", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py._require_if_all_false_or_unspecified", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py._require_if_false_or_unspecified"]
The function (_snake_to_kebab) is defined within the public class called public. The function starts at line 15 and ends at line 16. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (name) and returns a str. It has 1 function called inside, which is ["name.replace"], and it has 4 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py._dependency", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py._mutually_exclude", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py._require_if_all_false_or_unspecified", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py._require_if_false_or_unspecified"].
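A minimal stand-alone sketch of what _snake_to_kebab does (a copy for illustration only; the real helper lives in build_installer_cli.py and is used to turn snake_case parameter names into kebab-case CLI flag names):

```python
def snake_to_kebab(name: str) -> str:
    # Convert a snake_case identifier to kebab-case, e.g. for a --flag name.
    return name.replace("_", "-")

print(snake_to_kebab("install_builder_path"))  # install-builder-path
```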
aws-deadline_deadline-cloud
public
public
0
0
_combine_callbacks._combined_callback
def _combined_callback(ctx: click.Context, param: click.Option, value: Any):for callback in callbacks:value = callback(ctx, param, value)return value
2
4
3
36
0
26
29
26
null
[]
None
null
0
0
0
null
0
null
The function (_combine_callbacks._combined_callback) is defined within the public class called public. The function starts at line 26 and ends at line 29. It contains 4 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters, represented as [26.0], and returns a value.
aws-deadline_deadline-cloud
public
public
0
0
_combine_callbacks
def _combine_callbacks(*callbacks: Callable[[click.Context, click.Option, Any], Any],) -> Callable[[click.Context, click.Option, Any], Any]:"""Combine multiple click callbacks which will then subsequently be called in order"""def _combined_callback(ctx: click.Context, param: click.Option, value: Any):for callback in callbacks:value = callback(ctx, param, value)return valuereturn _combined_callback
1
5
1
46
1
19
31
19
*callbacks
['value']
Callable[[click.Context, click.Option, Any], Any]
{"Assign": 1, "Expr": 1, "For": 1, "Return": 2}
1
13
1
["callback"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"]
The function (_combine_callbacks) is defined within the public class called public. The function starts at line 19 and ends at line 31. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (*callbacks) and returns a Callable[[click.Context, click.Option, Any], Any]. It has 1 function called inside, which is ["callback"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"].
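A hedged sketch of the callback-combinator pattern above, without the click dependency: click hands each option callback a (ctx, param, value) triple, and combining callbacks simply threads the value through each one in order. The strip/lower callbacks here are hypothetical stand-ins, not from the source.

```python
from typing import Any, Callable

def combine_callbacks(*callbacks: Callable[[Any, Any, Any], Any]) -> Callable[[Any, Any, Any], Any]:
    # Return one callback that applies every given callback in order,
    # feeding each one the value produced by the previous one.
    def combined(ctx: Any, param: Any, value: Any) -> Any:
        for callback in callbacks:
            value = callback(ctx, param, value)
        return value
    return combined

# Hypothetical callbacks for illustration only.
strip_cb = lambda ctx, param, v: v.strip()
lower_cb = lambda ctx, param, v: v.lower()
```

With click itself, the combined callback would be passed as `callback=` to `click.option`, exactly as the cli function in this row's caller list does.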
aws-deadline_deadline-cloud
public
public
0
0
_mutually_exclude._callback
def _callback(ctx: click.Context, param: click.Option, value: Any) -> Any:if value:for other in others:if ctx.params.get(other):raise click.BadParameter(f"Cannot specify both --{param.name} and --{_snake_to_kebab(other)}")return value
4
8
3
49
0
39
46
39
null
[]
None
null
0
0
0
null
0
null
The function (_mutually_exclude._callback) is defined within the public class called public. The function starts at line 39 and ends at line 46. It contains 8 lines of code and has a cyclomatic complexity of 4. It takes 3 parameters, represented as [39.0], and returns a value.
aws-deadline_deadline-cloud
public
public
0
0
_mutually_exclude
def _mutually_exclude(others: Iterable[str]) -> Callable[[click.Context, click.Option, Any], Any]:"""Creates a callback which raises an error if this argument is specified and any of the other specified arguments are also specified"""def _callback(ctx: click.Context, param: click.Option, value: Any) -> Any:if value:for other in others:if ctx.params.get(other):raise click.BadParameter(f"Cannot specify both --{param.name} and --{_snake_to_kebab(other)}")return valuereturn _callback
1
3
1
32
0
34
48
34
others
[]
Callable[[click.Context, click.Option, Any], Any]
{"Expr": 1, "For": 1, "If": 2, "Return": 2}
3
15
3
["ctx.params.get", "click.BadParameter", "_snake_to_kebab"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"]
The function (_mutually_exclude) is defined within the public class called public. The function starts at line 34 and ends at line 48. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (others) and returns a Callable[[click.Context, click.Option, Any], Any]. It has 3 functions called inside, which are ["ctx.params.get", "click.BadParameter", "_snake_to_kebab"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"].
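A sketch of the mutual-exclusion check above under stated assumptions: a plain dict stands in for click.Context.params, and ValueError stands in for click.BadParameter, so the control flow can be shown without click installed.

```python
from typing import Any, Callable, Iterable

def mutually_exclude(others: Iterable[str]) -> Callable[[dict, str, Any], Any]:
    # Reject a flag when any of the listed conflicting flags is also set.
    others = list(others)
    def callback(params: dict, name: str, value: Any) -> Any:
        if value:
            for other in others:
                if params.get(other):
                    raise ValueError(
                        f"Cannot specify both --{name} and --{other.replace('_', '-')}"
                    )
        return value
    return callback
```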
aws-deadline_deadline-cloud
public
public
0
0
_dependency._callback
def _callback(ctx: click.Context, param: click.Option, value: Any) -> Any:if value:if not ctx.params.get(dependency):raise click.BadParameter(f"Must specify --{_snake_to_kebab(dependency)} when specifying --{param.name}")return value
3
7
3
45
0
56
62
56
null
[]
None
null
0
0
0
null
0
null
The function (_dependency._callback) is defined within the public class called public. The function starts at line 56 and ends at line 62. It contains 7 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters, represented as [56.0], and returns a value.
aws-deadline_deadline-cloud
public
public
0
0
_dependency
def _dependency(dependency: str) -> Callable[[click.Context, click.Option, Any], Any]:"""Creates a callback that raises an error if this argument is specified but a given argument it depends on is not"""def _callback(ctx: click.Context, param: click.Option, value: Any) -> Any:if value:if not ctx.params.get(dependency):raise click.BadParameter(f"Must specify --{_snake_to_kebab(dependency)} when specifying --{param.name}")return valuereturn _callback
1
3
1
29
0
51
64
51
dependency
[]
Callable[[click.Context, click.Option, Any], Any]
{"Expr": 1, "If": 2, "Return": 2}
3
14
3
["ctx.params.get", "click.BadParameter", "_snake_to_kebab"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"]
The function (_dependency) is defined within the public class called public. The function starts at line 51 and ends at line 64. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (dependency) and returns a Callable[[click.Context, click.Option, Any], Any]. It has 3 functions called inside, which are ["ctx.params.get", "click.BadParameter", "_snake_to_kebab"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"].
aws-deadline_deadline-cloud
public
public
0
0
_require_if_false_or_unspecified._callback
def _callback(ctx: click.Context, param: click.Option, value: Any) -> Any:if not ctx.params.get(other):if not value:raise click.BadParameter(f"Must specify --{param.name} when --{_snake_to_kebab(other)} is not specified")return value
3
7
3
46
0
74
80
74
null
[]
None
null
0
0
0
null
0
null
The function (_require_if_false_or_unspecified._callback) is defined within the public class called public. The function starts at line 74 and ends at line 80. It contains 7 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters, represented as [74.0], and returns a value.
aws-deadline_deadline-cloud
public
public
0
0
_require_if_false_or_unspecified
def _require_if_false_or_unspecified(other: str,) -> Callable[[click.Context, click.Option, Any], Any]:"""Creates a callback that raises an error if this argument is not specified and a given argument is either False or unspecified"""def _callback(ctx: click.Context, param: click.Option, value: Any) -> Any:if not ctx.params.get(other):if not value:raise click.BadParameter(f"Must specify --{param.name} when --{_snake_to_kebab(other)} is not specified")return valuereturn _callback
1
5
1
30
0
67
82
67
other
[]
Callable[[click.Context, click.Option, Any], Any]
{"Expr": 1, "If": 2, "Return": 2}
3
16
3
["ctx.params.get", "click.BadParameter", "_snake_to_kebab"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"]
The function (_require_if_false_or_unspecified) is defined within the public class called public. The function starts at line 67 and ends at line 82. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (other) and returns a Callable[[click.Context, click.Option, Any], Any]. It has 3 functions called inside, which are ["ctx.params.get", "click.BadParameter", "_snake_to_kebab"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"].
aws-deadline_deadline-cloud
public
public
0
0
_require_if_all_false_or_unspecified._callback
def _callback(ctx: click.Context, param: click.Option, value: Any) -> Any:if value is not None:return valuefor other in others:if ctx.params.get(other):return valueall_params = [f"--{_snake_to_kebab(other)}" for other in others]name = _snake_to_kebab(param.name if param.name else "")raise click.BadParameter(f"Must specify --{name} when none of {', '.join(all_params)} are specified")
6
11
3
78
0
92
102
92
null
[]
None
null
0
0
0
null
0
null
The function (_require_if_all_false_or_unspecified._callback) is defined within the public class called public. The function starts at line 92 and ends at line 102. It contains 11 lines of code and has a cyclomatic complexity of 6. It takes 3 parameters, represented as [92.0], and returns a value.
aws-deadline_deadline-cloud
public
public
0
0
_require_if_all_false_or_unspecified
def _require_if_all_false_or_unspecified(others: Iterable[str],) -> Callable[[click.Context, click.Option, Any], Any]:"""Creates a callback that raises an error if this argument is not specified and all of the given arguments are either False or unspecified"""def _callback(ctx: click.Context, param: click.Option, value: Any) -> Any:if value is not None:return valuefor other in others:if ctx.params.get(other):return valueall_params = [f"--{_snake_to_kebab(other)}" for other in others]name = _snake_to_kebab(param.name if param.name else "")raise click.BadParameter(f"Must specify --{name} when none of {', '.join(all_params)} are specified")return _callback
1
5
1
33
2
85
104
85
others
['all_params', 'name']
Callable[[click.Context, click.Option, Any], Any]
{"Assign": 2, "Expr": 1, "For": 1, "If": 2, "Return": 3}
5
20
5
["ctx.params.get", "_snake_to_kebab", "_snake_to_kebab", "click.BadParameter", "join"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"]
The function (_require_if_all_false_or_unspecified) is defined within the public class called public. The function starts at line 85 and ends at line 104. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (others) and returns a Callable[[click.Context, click.Option, Any], Any]. It has 5 functions called inside, which are ["ctx.params.get", "_snake_to_kebab", "_snake_to_kebab", "click.BadParameter", "join"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"].
aws-deadline_deadline-cloud
public
public
0
0
_current_platform_as_default
def _current_platform_as_default(_ctx: click.Context, _param: click.Option, value: Optional[str]) -> str:"""A callback that dynamically sets the default to the current platform"""if value is None:value = platform.system()if value == "Darwin":return "MacOS"return value
3
8
3
46
1
107
117
107
_ctx,_param,value
['value']
str
{"Assign": 1, "Expr": 1, "If": 2, "Return": 2}
1
11
1
["platform.system"]
0
[]
The function (_current_platform_as_default) is defined within the public class called public. The function starts at line 107 and ends at line 117. It contains 8 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters, represented as [107.0], and returns a str. It has 1 function called inside, which is ["platform.system"].
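A stand-alone sketch of the default-platform callback above, with the unused click ctx/param arguments dropped for clarity: when no value is given it falls back to platform.system(), and it normalizes Darwin to the installer's "MacOS" spelling.

```python
import platform
from typing import Optional

def current_platform_as_default(value: Optional[str]) -> str:
    # Default to the running platform when the user gave no value,
    # and map Python's "Darwin" to the "MacOS" spelling.
    if value is None:
        value = platform.system()
    if value == "Darwin":
        return "MacOS"
    return value
```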
aws-deadline_deadline-cloud
public
public
0
0
_not_allowed_if_env_var_set._callback
def _callback(_ctx: click.Context, param: click.Option, value: Any) -> Any:if value and os.environ.get(env_var_name) is not None:raise click.BadParameter(f"--{param.name} cannot be used when {env_var_name} is set.")return value
3
4
3
46
0
127
130
127
null
[]
None
null
0
0
0
null
0
null
The function (_not_allowed_if_env_var_set._callback) is defined within the public class called public. The function starts at line 127 and ends at line 130. It contains 4 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters, represented as [127.0], and returns a value.
aws-deadline_deadline-cloud
public
public
0
0
_not_allowed_if_env_var_set
def _not_allowed_if_env_var_set(env_var_name: str,) -> Callable[[click.Context, click.Option, Any], Any]:"""Creates a callback that raises an error if the argument was specified and a given environment variable is set"""def _callback(_ctx: click.Context, param: click.Option, value: Any) -> Any:if value and os.environ.get(env_var_name) is not None:raise click.BadParameter(f"--{param.name} cannot be used when {env_var_name} is set.")return valuereturn _callback
1
5
1
30
0
120
132
120
env_var_name
[]
Callable[[click.Context, click.Option, Any], Any]
{"Expr": 1, "If": 1, "Return": 2}
2
13
2
["os.environ.get", "click.BadParameter"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"]
The function (_not_allowed_if_env_var_set) is defined within the public class called public. The function starts at line 120 and ends at line 132. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (env_var_name) and returns a Callable[[click.Context, click.Option, Any], Any]. It has 2 functions called inside, which are ["os.environ.get", "click.BadParameter"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"].
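A sketch of the environment-variable guard above, with ValueError standing in for click.BadParameter and the env-var name "DEMO_VAR" used below purely as a hypothetical example:

```python
import os
from typing import Any, Callable

def not_allowed_if_env_var_set(env_var_name: str) -> Callable[[str, Any], Any]:
    # Reject a flag whenever the named environment variable is set.
    def callback(param_name: str, value: Any) -> Any:
        if value and os.environ.get(env_var_name) is not None:
            raise ValueError(f"--{param_name} cannot be used when {env_var_name} is set.")
        return value
    return callback
```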
aws-deadline_deadline-cloud
public
public
0
0
cli
def cli(install_builder_path: Optional[Path],install_builder_s3_bucket: Optional[str],install_builder_s3_key: Optional[str],install_builder_license_path: Optional[Path],dev: bool,local_dev: bool,platform: str,output_dir: Optional[Path],no_cleanup: bool,installer_source_path: Path,) -> None:cli_body(install_builder_path,install_builder_s3_bucket,install_builder_s3_key,install_builder_license_path,dev,local_dev,platform,output_dir,no_cleanup,installer_source_path,)
1
24
10
84
0
200
223
200
install_builder_path,install_builder_s3_bucket,install_builder_s3_key,install_builder_license_path,dev,local_dev,platform,output_dir,no_cleanup,installer_source_path
[]
None
{"Expr": 1}
26
24
26
["cli_body", "click.command", "click.option", "_mutually_exclude", "click.option", "_combine_callbacks", "_require_if_false_or_unspecified", "_mutually_exclude", "click.option", "_dependency", "click.option", "_require_if_all_false_or_unspecified", "click.option", "_mutually_exclude", "click.option", "_combine_callbacks", "_not_allowed_if_env_var_set", "_mutually_exclude", "click.option", "click.Choice", "_combine_callbacks", "_require_if_false_or_unspecified", "click.option", "_require_if_false_or_unspecified", "click.option", "click.option"]
154
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_resources", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_keys_py.test_change_host_key", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.change_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.create_namespaces", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.test_change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.try_change_one_namespace_lb_group_no_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_ns_visibility_py.test_change_namespace_visibility", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_force_tls_py.TestForceTls.test_force_tls", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_all_hosts_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_discovery_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_invalid_nqn", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_subsys_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_no_access", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_subsystem_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_wrong_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_junk_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_nsid", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_uuid"]
The function (cli) defined within the public class called public.The function start at line 200 and ends at 223. It contains 24 lines of code and it has a cyclomatic complexity of 1. It takes 10 parameters, represented as [200.0] and does not return any value. It declares 26.0 functions, It has 26.0 functions called inside which are ["cli_body", "click.command", "click.option", "_mutually_exclude", "click.option", "_combine_callbacks", "_require_if_false_or_unspecified", "_mutually_exclude", "click.option", "_dependency", "click.option", "_require_if_all_false_or_unspecified", "click.option", "_mutually_exclude", "click.option", "_combine_callbacks", "_not_allowed_if_env_var_set", "_mutually_exclude", "click.option", "click.Choice", "_combine_callbacks", "_require_if_false_or_unspecified", "click.option", "_require_if_false_or_unspecified", "click.option", "click.option"], It has 154.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.create_resources", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_big_omap_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_keys_py.test_change_host_key", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.change_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.create_namespaces", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.test_change_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.try_change_one_namespace_lb_group_no_listeners", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_lb_py.verify_one_namespace_lb_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_change_ns_visibility_py.test_change_namespace_visibility", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_force_tls_py.TestForceTls.test_force_tls", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_all_hosts_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_discovery_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_invalid_nqn", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_subsys_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_no_access", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_namespace_subsystem_not_found", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_host_to_wrong_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_junk_host_to_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_nsid", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69860788_ceph_ceph_nvmeof.tests.test_cli_py.TestCreate.test_add_namespace_double_uuid"].
aws-deadline_deadline-cloud
public
public
0
0
cli_body
def cli_body(install_builder_path: Optional[Path],install_builder_s3_bucket: Optional[str],install_builder_s3_key: Optional[str],install_builder_license_path: Optional[Path],dev: bool,local_dev: bool,platform: str,output_dir: Optional[Path],no_cleanup: bool,installer_source_path: Path,) -> None:"""Separate from the command function so we can mock the body outwhen testing the cli arguments"""main(dev or local_dev,install_builder_path,install_builder_license_path,install_builder_s3_bucket,install_builder_s3_key,output_dir,not no_cleanup,platform,installer_source_path,)
2
23
10
86
0
226
252
226
install_builder_path,install_builder_s3_bucket,install_builder_s3_key,install_builder_license_path,dev,local_dev,platform,output_dir,no_cleanup,installer_source_path
[]
None
{"Expr": 2}
1
27
1
["main"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"]
The function (cli_body) is defined within the public class called public. The function starts at line 226 and ends at line 252. It contains 23 lines of code and has a cyclomatic complexity of 2. It takes 10 parameters, represented as [226.0], and does not return any value. It has 1 function called inside, which is ["main"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94876780_aws_deadline_deadline_cloud.scripts.build_installer_cli_py.cli"].
aws-deadline_deadline-cloud
public
public
0
0
run
def run(cmd, cwd=None, env=None, echo=True):if echo:sys.stdout.write(f"Running cmd: {cmd}\n")kwargs = {"shell": True,"stdout": subprocess.PIPE,"stderr": subprocess.PIPE,}if isinstance(cmd, list):kwargs["shell"] = Falseif cwd is not None:kwargs["cwd"] = cwdif env is not None:kwargs["env"] = envp = subprocess.Popen(cmd, **kwargs)stdout, stderr = p.communicate()output = stdout.decode("utf-8") + stderr.decode("utf-8")if p.returncode != 0:raise BadExitCodeError(f"Process failed with exit code ({p.returncode}) for command '{cmd}': {output}")return output
6
22
4
137
3
18
39
18
cmd,cwd,env,echo
['kwargs', 'p', 'output']
Returns
{"Assign": 7, "Expr": 1, "If": 5, "Return": 1}
7
22
7
["sys.stdout.write", "isinstance", "subprocess.Popen", "p.communicate", "stdout.decode", "stderr.decode", "BadExitCodeError"]
281
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.namespace_py.ListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.hardware_py.ShowHardwareCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.tasks_py.TaskListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.MainLoop.eval", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.state_py._init_vintageous", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ReloadVintageousSettings.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ResetVintageous.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.update", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_datetime_related_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_db_queries_py.TestCase.clear_bucket", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_relations_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_save_update_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_uniqueness_py.TestCase.prepare_testbed", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3632606_shaunduncan_helga.helga.bin.helga_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.examples.csv.csv_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.cli_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_down_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_up_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_the_status_of_multiple_services_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.fuser_py.it_shows_help_given_no_arguments", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.subprocess_py.assert_command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.test.integration.test_app_runner_py.TestAppRunner.test_app_runner", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.wampy.cli.run_py.main"]
The function (run) is defined within a public class. The function starts at line 18 and ends at line 39. It contains 22 lines of code and has a cyclomatic complexity of 6. It takes 4 parameters, represented as [18.0], and returns a value. It makes 7 function calls, which are ["sys.stdout.write", "isinstance", "subprocess.Popen", "p.communicate", "stdout.decode", "stderr.decode", "BadExitCodeError"]. It has 281 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.namespace_py.ListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.hardware_py.ShowHardwareCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.plugins.tasks_py.TaskListCommand.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.MainLoop.eval", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.state_py._init_vintageous", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ReloadVintageousSettings.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574420_guillermooo_vintageous.xsupport_py.ResetVintageous.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.fabfile_py.update", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_datetime_related_py.TestCase.prepare_testbed",
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_db_queries_py.TestCase.clear_bucket", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_relations_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_model_save_update_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3581153_zetaops_pyoko.tests.test_uniqueness_py.TestCase.prepare_testbed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3596930_mengskysama_shadowsocks_rm.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3632606_shaunduncan_helga.helga.bin.helga_py.main", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.examples.csv.csv_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640866_pelagicore_qface.qface.cli_py.app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_down_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_correctly_when_the_service_is_up_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.examples_py.DescribeStatus.it_displays_the_status_of_multiple_services_json", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.spec.fuser_py.it_shows_help_given_no_arguments", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.testing.subprocess_py.assert_command", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.test.integration.test_app_runner_py.TestAppRunner.test_app_runner", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3682933_noisyboiler_wampy.wampy.cli.run_py.main"].
aws-deadline_deadline-cloud
InstallBuilderSelection
public
0
0
resolve_install_builder_installation
def resolve_install_builder_installation(self, workdir: Path) -> Path:
    if self.selection is None:
        return _get_default_installbuilder_location()
    elif isinstance(self.selection, _InstallBuilderPathSelection):
        return self.selection.path
    elif isinstance(self.selection, _InstallBuilderS3Selection):
        filename = self.selection.key.split("/")[-1]
        s3 = boto3.client("s3")
        dest = self.selection.dest_path / filename
        dest.parent.mkdir(parents=True, exist_ok=True)
        expected_bucket_owner = os.getenv("IB_SOURCE_BUCKET_OWNER")
        if not expected_bucket_owner:
            raise ValueError("IB_SOURCE_BUCKET_OWNER environment variable is required")
        if not (expected_bucket_owner.isdigit() and len(expected_bucket_owner) == 12):
            raise ValueError("IB_SOURCE_BUCKET_OWNER must be a 12-digit AWS Account ID")
        s3.download_file(
            self.selection.bucket,
            self.selection.key,
            str(dest),
            ExtraArgs={"ExpectedBucketOwner": expected_bucket_owner},
        )
        unpack_dest = workdir / filename.split(".tar.gz")[0]
        shutil.unpack_archive(dest, unpack_dest)
        dest.unlink()
        return unpack_dest
    else:
        raise ValueError(f"Unknown selection type: {type(self.selection)}")
7
27
2
200
0
39
66
39
self,workdir
[]
Path
{"Assign": 5, "Expr": 4, "If": 5, "Return": 3}
18
28
18
["_get_default_installbuilder_location", "isinstance", "isinstance", "self.selection.key.split", "boto3.client", "dest.parent.mkdir", "os.getenv", "ValueError", "expected_bucket_owner.isdigit", "len", "ValueError", "s3.download_file", "str", "filename.split", "shutil.unpack_archive", "dest.unlink", "ValueError", "type"]
0
[]
The function (resolve_install_builder_installation) is defined within the public class called InstallBuilderSelection. The function starts at line 39 and ends at line 66. It contains 27 lines of code and has a cyclomatic complexity of 7. It takes 2 parameters (self, workdir) and returns a value of type Path. It makes 18 function calls, which are ["_get_default_installbuilder_location", "isinstance", "isinstance", "self.selection.key.split", "boto3.client", "dest.parent.mkdir", "os.getenv", "ValueError", "expected_bucket_owner.isdigit", "len", "ValueError", "s3.download_file", "str", "filename.split", "shutil.unpack_archive", "dest.unlink", "ValueError", "type"].