project_name
string
class_name
string
class_modifiers
string
class_implements
int64
class_extends
int64
function_name
string
function_body
string
cyclomatic_complexity
int64
NLOC
int64
num_parameter
int64
num_token
int64
num_variable
int64
start_line
int64
end_line
int64
function_index
int64
function_params
string
function_variable
string
function_return_type
string
function_body_line_type
string
function_num_functions
int64
function_num_lines
int64
outgoing_function_count
int64
outgoing_function_names
string
incoming_function_count
int64
incoming_function_names
string
lexical_representation
string
unifyai_unify
public
public
0
0
_sync_log
def _sync_log(
    project: str,
    context: Optional[str],
    params: Dict[str, Any],
    entries: Dict[str, Any],
    api_key: str,
) -> unify.Log:
    """Synchronously create a log entry using direct HTTP request.

    This is a helper function used when async logging is disabled or unavailable.
    """
    headers = _create_request_header(api_key)
    body = {
        "project": project,
        "context": context,
        "params": params,
        "entries": entries,
    }
    response = http.post(BASE_URL + "/logs", headers=headers, json=body)
    resp_json = response.json()
    # Apply row_ids to entries using the centralized helper
    _apply_row_ids(resp_json.get("row_ids"), [entries])
    return unify.Log(
        id=resp_json["log_event_ids"][0],
        api_key=api_key,
        **entries,
        params=params,
        context=context,
    )
1
24
5
137
4
763
795
763
project,context,params,entries,api_key
['headers', 'body', 'resp_json', 'response']
unify.Log
{"Assign": 4, "Expr": 2, "Return": 1}
6
33
6
["_create_request_header", "http.post", "response.json", "_apply_row_ids", "resp_json.get", "unify.Log"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"]
The function (_sync_log) is defined within the public class called public. The function starts at line 763 and ends at line 795. It contains 24 lines of code and has a cyclomatic complexity of 1. It takes 5 parameters (project, context, params, entries, api_key) and returns a unify.Log value. It calls 6 functions internally, which are ["_create_request_header", "http.post", "response.json", "_apply_row_ids", "resp_json.get", "unify.Log"]. It has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"].
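The `function_body_line_type` field of this record ({"Assign": 4, "Expr": 2, "Return": 1}) is the kind of statement-type tally Python's `ast` module produces; a minimal sketch (the helper name `count_statement_types` and the trimmed sample body are ours, not part of the dataset pipeline):

```python
import ast
from collections import Counter


def count_statement_types(source: str) -> dict:
    """Count the statement node types inside the first function in `source`."""
    tree = ast.parse(source)
    func = tree.body[0]  # assume a single top-level function definition
    counts = Counter(
        type(node).__name__
        for node in ast.walk(func)
        if isinstance(node, ast.stmt) and not isinstance(node, ast.FunctionDef)
    )
    return dict(counts)


# Trimmed stand-in for _sync_log: 4 assignments, a docstring, a bare call, a return.
sample = '''
def _sync_log(project, context, params, entries, api_key):
    """Synchronously create a log entry."""
    headers = _create_request_header(api_key)
    body = {"project": project}
    response = http.post("/logs", json=body)
    resp_json = response.json()
    _apply_row_ids(resp_json.get("row_ids"), [entries])
    return Log(id=resp_json["log_event_ids"][0])
'''
print(count_statement_types(sample))
```

Note that the docstring is itself an `Expr` statement, which is why `_sync_log`'s record shows two `Expr` entries alongside a single bare call expression.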
unifyai_unify
public
public
0
0
_create_log
def _create_log(dct, params, context, api_key, context_entries=None):
    if context_entries is None:
        context_entries = {}
    return unify.Log(
        id=dct["id"],
        ts=dct["ts"],
        **dct["entries"],
        **dct["derived_entries"],
        **context_entries,
        params={
            param_name: (param_ver, params[param_name][param_ver])
            for param_name, param_ver in dct["params"].items()
        },
        context=context,
        api_key=api_key,
    )
3
16
5
98
1
798
813
798
dct,params,context,api_key,context_entries
['context_entries']
Returns
{"Assign": 1, "If": 1, "Return": 1}
2
16
2
["unify.Log", "items"]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._create_log_groups_nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._create_log_groups_not_nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.get_logs"]
The function (_create_log) is defined within the public class called public. The function starts at line 798 and ends at line 813. It contains 16 lines of code and has a cyclomatic complexity of 3. It takes 5 parameters (dct, params, context, api_key, context_entries) and returns a value. It calls 2 functions internally, which are ["unify.Log", "items"]. It has 3 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._create_log_groups_nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._create_log_groups_not_nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.get_logs"].
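The params-resolution comprehension inside `_create_log` maps each parameter name to a (version, value-at-that-version) pair; isolated as a standalone sketch (`resolve_params` and the sample data are illustrative, not part of the library):

```python
def resolve_params(dct_params: dict, params: dict) -> dict:
    """Pair each param name with its version and the value stored under
    that version, mirroring the dict comprehension inside _create_log."""
    return {
        name: (version, params[name][version])
        for name, version in dct_params.items()
    }


# A log references version "v2" of "temperature"; the params table keeps
# every version's value.
dct_params = {"temperature": "v2"}
params = {"temperature": {"v1": 0.5, "v2": 0.7}}
print(resolve_params(dct_params, params))  # {'temperature': ('v2', 0.7)}
```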
unifyai_unify
public
public
0
0
_create_log_groups_nested
def _create_log_groups_nested(
    params,
    context,
    api_key,
    node,
    context_entries,
    prev_key=None,
):
    if isinstance(node, dict) and "group" not in node:
        ret = unify.LogGroup(list(node.keys())[0])
        ret.value = _create_log_groups_nested(
            params,
            context,
            api_key,
            node[ret.field],
            context_entries,
            ret.field,
        )
        return ret
    else:
        if isinstance(node["group"][0]["value"], list):
            ret = {}
            for n in node["group"]:
                context_entries[prev_key] = n["key"]
                ret[n["key"]] = [
                    _create_log(
                        item,
                        item["params"],
                        context,
                        api_key,
                        context_entries,
                    )
                    for item in n["value"]
                ]
            return ret
        else:
            ret = {}
            for n in node["group"]:
                context_entries[prev_key] = n["key"]
                ret[n["key"]] = _create_log_groups_nested(
                    params,
                    context,
                    api_key,
                    n["value"],
                    context_entries,
                    n["key"],
                )
            return ret
7
48
6
206
1
816
863
816
params,context,api_key,node,context_entries,prev_key
['ret']
Returns
{"Assign": 8, "For": 2, "If": 2, "Return": 3}
8
48
8
["isinstance", "unify.LogGroup", "list", "node.keys", "_create_log_groups_nested", "isinstance", "_create_log", "_create_log_groups_nested"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._create_log_groups_nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.get_logs"]
The function (_create_log_groups_nested) is defined within the public class called public. The function starts at line 816 and ends at line 863. It contains 48 lines of code and has a cyclomatic complexity of 7. It takes 6 parameters (params, context, api_key, node, context_entries, prev_key) and returns a value. It calls 8 functions internally, which are ["isinstance", "unify.LogGroup", "list", "node.keys", "_create_log_groups_nested", "isinstance", "_create_log", "_create_log_groups_nested"]. It has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._create_log_groups_nested", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.get_logs"].
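The recursive descent in `_create_log_groups_nested` alternates between single-field wrapper nodes and `"group"` nodes; a simplified standalone sketch of the same traversal, yielding (path, leaf) pairs instead of building `unify.LogGroup` objects (`walk_groups` and the sample tree are ours):

```python
def walk_groups(node, path=()):
    """Recursively descend a nested group tree of the shape consumed by
    _create_log_groups_nested, yielding (path, raw_log_list) pairs."""
    if isinstance(node, dict) and "group" not in node:
        field = next(iter(node))  # the single grouping field at this level
        yield from walk_groups(node[field], path + (field,))
    else:
        for n in node["group"]:
            if isinstance(n["value"], list):
                # leaf: a list of raw log dicts
                yield path + (n["key"],), n["value"]
            else:
                # deeper grouping level
                yield from walk_groups(n["value"], path + (n["key"],))


tree = {"model": {"group": [
    {"key": "gpt-4", "value": [{"id": 1}]},
    {"key": "llama", "value": [{"id": 2}]},
]}}
print(list(walk_groups(tree)))
```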
unifyai_unify
public
public
0
0
_create_log_groups_not_nested
def _create_log_groups_not_nested(logs, groups, params, context, api_key):
    logs_mapping = {}
    for dct in logs:
        logs_mapping[dct["id"]] = _create_log(dct, params, context, api_key)
    ret = []
    for group_key, group_value in groups.items():
        if isinstance(group_value, dict):
            val = {}
            for k, v in group_value.items():
                if isinstance(v, list):
                    val[k] = [logs_mapping[log_id] for log_id in v]
            ret.append(unify.LogGroup(group_key, val))
    return ret
7
13
5
116
3
866
879
866
logs,groups,params,context,api_key
['val', 'logs_mapping', 'ret']
Returns
{"Assign": 5, "Expr": 1, "For": 3, "If": 2, "Return": 1}
7
14
7
["_create_log", "groups.items", "isinstance", "group_value.items", "isinstance", "ret.append", "unify.LogGroup"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.get_logs"]
The function (_create_log_groups_not_nested) is defined within the public class called public. The function starts at line 866 and ends at line 879. It contains 13 lines of code and has a cyclomatic complexity of 7. It takes 5 parameters (logs, groups, params, context, api_key) and returns a value. It calls 7 functions internally, which are ["_create_log", "groups.items", "isinstance", "group_value.items", "isinstance", "ret.append", "unify.LogGroup"]. It has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.get_logs"].
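The non-nested path first materializes an id → log mapping and then substitutes the id lists inside each group; a simplified sketch with plain dicts standing in for `unify.Log` / `unify.LogGroup` (`group_logs` and the sample data are ours):

```python
def group_logs(logs: list, groups: dict) -> list:
    """Replace log-id lists inside each group with the logs themselves,
    mirroring _create_log_groups_not_nested."""
    logs_mapping = {log["id"]: log for log in logs}
    ret = []
    for group_key, group_value in groups.items():
        if isinstance(group_value, dict):
            val = {
                k: [logs_mapping[log_id] for log_id in v]
                for k, v in group_value.items()
                if isinstance(v, list)
            }
            # plain dict stands in for unify.LogGroup(group_key, val)
            ret.append({"field": group_key, "value": val})
    return ret


logs = [{"id": 1, "score": 0.9}, {"id": 2, "score": 0.4}]
groups = {"model": {"gpt-4": [1], "llama": [2]}}
print(group_logs(logs, groups))
```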
unifyai_unify
public
public
0
0
create_logs
def create_logs(
    *,
    project: Optional[str] = None,
    context: Optional[str] = None,
    params: Optional[Union[List[Dict[str, Any]], Dict[str, Any]]] = None,
    entries: Optional[Union[List[Dict[str, Any]], Dict[str, Any]]] = None,
    mutable: Optional[Union[bool, Dict[str, bool]]] = True,
    batched: Optional[bool] = None,
    api_key: Optional[str] = None,
) -> List[int]:
    """Creates one or more logs associated to a project.

    Args:
        project: Name of the project the stored logs will be associated to.
        context: Context for the logs.
        entries: List of dictionaries with the entries to be logged. For contexts with
            nested unique IDs, parent ID values can be passed directly in the entries
            dictionaries. For example, if a context has unique IDs `["run_id", "step_id"]`,
            you can pass `{"run_id": 0, "data": "value"}` in entries to generate the next
            `step_id` for that particular run. The leftmost N-1 unique columns can be
            supplied as normal entry keys, and the rightmost column is always auto-incremented.
        params: List of dictionaries with the params to be logged.
        mutable: Either a boolean to apply uniform mutability for all fields, or a
            dictionary mapping field names to booleans for per-field control. Defaults to True.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.

    Returns:
        A list of the created logs.
    """
    api_key = _validate_api_key(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = _handle_context(context)
    headers = _create_request_header(api_key)
    # ToDo: add support for all of the context variables, as is done for `unify.log` above
    params = _handle_mutability(mutable, params)
    entries = _handle_mutability(mutable, entries)
    # ToDo remove the params/entries logic above once this [https://app.clickup.com/t/86c25g263] is done
    params = [{}] * len(entries) if params in [None, []] else params
    entries = [{}] * len(params) if entries in [None, []] else entries
    # end ToDo
    body = {
        "project": project,
        "context": context,
        "params": params,
        "entries": entries,
    }
    body_size = sys.getsizeof(json.dumps(body))
    if batched is None:
        batched = body_size < CHUNK_LIMIT
    if batched:
        if body_size < CHUNK_LIMIT:
            response = http.post(BASE_URL + "/logs", headers=headers, json=body)
        else:
            response = http.post(
                BASE_URL + "/logs",
                headers=headers,
                data=_json_chunker(body),
            )
        resp_json = response.json()
        # Apply row_ids to entries using the centralized helper
        _apply_row_ids(resp_json.get("row_ids"), entries)
        return [
            unify.Log(
                project=project,
                context=context["name"] if isinstance(context, dict) else context,
                **{k: v for k, v in e.items() if k != "explicit_types"},
                **p,
                id=i,
            )
            for e, p, i in zip(entries, params, resp_json["log_event_ids"])
        ]
    # Fallback for non-batched (iterative) logging
    pbar = tqdm(total=len(params), unit="logs", desc="Creating Logs")
    try:
        unify.initialize_async_logger()
        _async_logger.register_callback(lambda: pbar.update(1))
        ret = []
        for p, e in zip(params, entries):
            ret.append(
                log(
                    project=project,
                    context=context,
                    params=p,
                    new=True,
                    mutable=mutable,
                    api_key=api_key,
                    **e,
                ),
            )
    finally:
        unify.shutdown_async_logger()
        pbar.close()
    return ret
12
69
9
499
13
882
984
882
project,context,params,entries,mutable,batched,api_key
['entries', 'resp_json', 'pbar', 'body', 'ret', 'headers', 'context', 'api_key', 'project', 'batched', 'params', 'response', 'body_size']
List[int]
{"Assign": 16, "Expr": 7, "For": 1, "If": 3, "Return": 2, "Try": 1}
30
103
30
["_validate_api_key", "_get_and_maybe_create_project", "_handle_context", "_create_request_header", "_handle_mutability", "_handle_mutability", "len", "len", "sys.getsizeof", "json.dumps", "http.post", "http.post", "_json_chunker", "response.json", "_apply_row_ids", "resp_json.get", "unify.Log", "isinstance", "e.items", "zip", "tqdm", "len", "unify.initialize_async_logger", "_async_logger.register_callback", "pbar.update", "zip", "ret.append", "log", "unify.shutdown_async_logger", "pbar.close"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.add_dataset_entries", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.upload_dataset"]
The function (create_logs) is defined within the public class called public. The function starts at line 882 and ends at line 984. It contains 69 lines of code and has a cyclomatic complexity of 12. It takes 9 parameters, the named ones being project, context, params, entries, mutable, batched, api_key, and returns a List[int] value. It calls 30 functions internally, which are ["_validate_api_key", "_get_and_maybe_create_project", "_handle_context", "_create_request_header", "_handle_mutability", "_handle_mutability", "len", "len", "sys.getsizeof", "json.dumps", "http.post", "http.post", "_json_chunker", "response.json", "_apply_row_ids", "resp_json.get", "unify.Log", "isinstance", "e.items", "zip", "tqdm", "len", "unify.initialize_async_logger", "_async_logger.register_callback", "pbar.update", "zip", "ret.append", "log", "unify.shutdown_async_logger", "pbar.close"]. It has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.add_dataset_entries", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.upload_dataset"].
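`create_logs` picks its transfer mode by measuring the serialized body against `CHUNK_LIMIT`: under the limit it sends one request, over it streams chunks, and a `batched=None` default resolves to "batch if the body fits". That dispatch in isolation (the `CHUNK_LIMIT` value and the `choose_transfer_mode` name are illustrative assumptions, not the library's):

```python
import json
import sys

CHUNK_LIMIT = 10_000_000  # illustrative threshold, not the library's actual value


def choose_transfer_mode(body: dict, batched=None) -> str:
    """Mirror create_logs' dispatch: batched defaults to 'body fits under
    CHUNK_LIMIT'; oversized batched bodies fall back to chunked streaming."""
    body_size = sys.getsizeof(json.dumps(body))
    if batched is None:
        batched = body_size < CHUNK_LIMIT
    if not batched:
        return "iterative"  # per-log fallback path with the tqdm progress bar
    return "single-request" if body_size < CHUNK_LIMIT else "chunked-stream"


print(choose_transfer_mode({"entries": [{"x": 1}]}))  # small body
```

Note that `sys.getsizeof` measures the Python string object, not the exact wire length, so the threshold is approximate by design.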
unifyai_unify
public
public
0
0
_add_to_log
def _add_to_log(
    *,
    context: Optional[str] = None,
    logs: Optional[Union[int, unify.Log, List[Union[int, unify.Log]]]] = None,
    mode: str = None,
    overwrite: bool = False,
    mutable: Optional[Union[bool, Dict[str, bool]]] = True,
    api_key: Optional[str] = None,
    **data,
) -> Dict[str, str]:
    assert mode in ("params", "entries"), "mode must be one of 'params', 'entries'"
    data = _apply_col_context(**data)
    nest_level = {"params": PARAMS_NEST_LEVEL, "entries": ENTRIES_NEST_LEVEL}[mode]
    active = {"params": ACTIVE_PARAMS_WRITE, "entries": ACTIVE_ENTRIES_WRITE}[mode]
    api_key = _validate_api_key(api_key)
    context = _handle_context(context)
    data = _handle_special_types(data)
    data = _handle_mutability(mutable, data)
    if ASYNC_LOGGING and _async_logger is not None:
        # For simplicity, assume logs is a single unify.Log.
        if logs is None:
            log_obj = ACTIVE_LOG.get()[-1]
        elif isinstance(logs, unify.Log):
            log_obj = logs
        elif isinstance(logs, list) and logs and isinstance(logs[0], unify.Log):
            log_obj = logs[0]
        else:
            # If not a Log, resolve synchronously.
            log_id = _to_log_ids(logs)[0]
            lf = _async_logger._loop.create_future()
            lf.set_result(log_id)
            log_obj = unify.Log(id=log_id, _future=lf, api_key=api_key)
        # Prepare the future to pass (if the log is still pending, use its _future)
        if hasattr(log_obj, "_future") and log_obj._future is not None:
            lf = log_obj._future
        else:
            lf = _async_logger._loop.create_future()
            lf.set_result(log_obj.id)
        _async_logger.log_update(
            project=_get_and_maybe_create_project(None, api_key=api_key),
            context=context,
            future=lf,
            mode=mode,
            overwrite=overwrite,
            data=data,
        )
        return {"detail": "Update queued asynchronously"}
    else:
        # Fallback to synchronous update if async logging isn’t enabled.
        log_ids = _to_log_ids(logs)
        headers = _create_request_header(api_key)
        all_kwargs = []
        if nest_level.get() > 0:
            for log_id in log_ids:
                combined_kwargs = {
                    **data,
                    **{
                        k: v
                        for k, v in active.get().items()
                        if k not in LOGGED.get().get(log_id, {})
                    },
                }
                all_kwargs.append(combined_kwargs)
            assert all(
                kw == all_kwargs[0] for kw in all_kwargs
            ), "All logs must share the same context if they're all being updated at the same time."
            data = all_kwargs[0]
        body = {"logs": log_ids, mode: data, "overwrite": overwrite, "context": context}
        response = http.put(BASE_URL + "/logs", headers=headers, json=body)
        if nest_level.get() > 0:
            logged = LOGGED.get()
            new_logged = {}
            for log_id in log_ids:
                if log_id in logged:
                    new_logged[log_id] = logged[log_id] + list(data.keys())
                else:
                    new_logged[log_id] = list(data.keys())
            LOGGED.set({**logged, **new_logged})
        return response.json()
18
75
7
577
16
987
1065
987
context,logs,mode,overwrite,mutable,api_key,**data
['combined_kwargs', 'logged', 'new_logged', 'body', 'log_obj', 'lf', 'context', 'headers', 'active', 'log_ids', 'api_key', 'nest_level', 'log_id', 'all_kwargs', 'data', 'response']
Dict[str, str]
{"Assign": 26, "Expr": 5, "For": 2, "If": 8, "Return": 2}
36
79
36
["_apply_col_context", "_validate_api_key", "_handle_context", "_handle_special_types", "_handle_mutability", "ACTIVE_LOG.get", "isinstance", "isinstance", "isinstance", "_to_log_ids", "_async_logger._loop.create_future", "lf.set_result", "unify.Log", "hasattr", "_async_logger._loop.create_future", "lf.set_result", "_async_logger.log_update", "_get_and_maybe_create_project", "_to_log_ids", "_create_request_header", "nest_level.get", "items", "active.get", "get", "LOGGED.get", "all_kwargs.append", "all", "http.put", "nest_level.get", "LOGGED.get", "list", "data.keys", "list", "data.keys", "LOGGED.set", "response.json"]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.add_log_entries", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.add_log_params", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"]
The function (_add_to_log) is defined within the public class called public. The function starts at line 987 and ends at line 1065. It contains 75 lines of code and has a cyclomatic complexity of 18. It takes 7 parameters (context, logs, mode, overwrite, mutable, api_key, **data) and returns a Dict[str, str] value. It calls 36 functions internally, which are ["_apply_col_context", "_validate_api_key", "_handle_context", "_handle_special_types", "_handle_mutability", "ACTIVE_LOG.get", "isinstance", "isinstance", "isinstance", "_to_log_ids", "_async_logger._loop.create_future", "lf.set_result", "unify.Log", "hasattr", "_async_logger._loop.create_future", "lf.set_result", "_async_logger.log_update", "_get_and_maybe_create_project", "_to_log_ids", "_create_request_header", "nest_level.get", "items", "active.get", "get", "LOGGED.get", "all_kwargs.append", "all", "http.put", "nest_level.get", "LOGGED.get", "list", "data.keys", "list", "data.keys", "LOGGED.set", "response.json"]. It has 3 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.add_log_entries", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.add_log_params", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"].
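The synchronous branch of `_add_to_log` finishes by recording which field names have now been written for each log id, extending any existing list. That bookkeeping step in isolation (`merge_logged` and the sample data are ours; the real code stores the result back via `LOGGED.set`):

```python
def merge_logged(logged: dict, log_ids: list, data: dict) -> dict:
    """Mirror _add_to_log's bookkeeping: extend each log's list of
    already-logged field names with the newly written keys."""
    new_logged = {}
    for log_id in log_ids:
        if log_id in logged:
            new_logged[log_id] = logged[log_id] + list(data.keys())
        else:
            new_logged[log_id] = list(data.keys())
    # merge without mutating the original mapping
    return {**logged, **new_logged}


logged = {1: ["loss"]}
print(merge_logged(logged, [1, 2], {"accuracy": 0.9}))
```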
unifyai_unify
public
public
0
0
add_log_params
def add_log_params(
    *,
    logs: Optional[Union[int, unify.Log, List[Union[int, unify.Log]]]] = None,
    mutable: Optional[Union[bool, Dict[str, bool]]] = True,
    api_key: Optional[str] = None,
    **params,
) -> Dict[str, str]:
    """Add extra params into an existing log.

    Args:
        logs: The log(s) to update with extra params. Looks for the current active log if
            no id is provided.
        mutable: Either a boolean to apply uniform mutability for all parameters, or a
            dictionary mapping parameter names to booleans for per-field control.
            Defaults to True.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.
        params: Dictionary containing one or more key:value pairs that will be
            logged into the platform as params.

    Returns:
        A message indicating whether the logs were successfully updated.
    """
    ret = _add_to_log(
        logs=logs,
        mode="params",
        mutable=mutable,
        api_key=api_key,
        **params,
    )
    if USR_LOGGING:
        logger.info(
            f"Added Params {', '.join(list(params.keys()))} "
            f"to [Logs({', '.join([str(i) for i in _to_log_ids(logs)])})]",
        )
    return ret
2
20
4
112
1
1068
1105
1068
logs,mutable,api_key,**params
['ret']
Dict[str, str]
{"Assign": 1, "Expr": 2, "If": 1, "Return": 1}
8
38
8
["_add_to_log", "logger.info", "join", "list", "params.keys", "join", "str", "_to_log_ids"]
0
[]
The function (add_log_params) is defined within the public class called public. The function starts at line 1068 and ends at line 1105. It contains 20 lines of code and has a cyclomatic complexity of 2. It takes 4 parameters (logs, mutable, api_key, **params) and returns a Dict[str, str] value. It calls 8 functions internally, which are ["_add_to_log", "logger.info", "join", "list", "params.keys", "join", "str", "_to_log_ids"]. No functions call this function.
unifyai_unify
public
public
0
0
add_log_entries
def add_log_entries(
    *,
    logs: Optional[Union[int, unify.Log, List[Union[int, unify.Log]]]] = None,
    overwrite: bool = False,
    mutable: Optional[Union[bool, Dict[str, bool]]] = True,
    api_key: Optional[str] = None,
    context: Optional[str] = None,
    **entries,
) -> Dict[str, str]:
    """Add extra entries into an existing log.

    Args:
        logs: The log(s) to update with extra entries. Looks for the current active log if
            no id is provided.
        overwrite: Whether or not to overwrite an entry pre-existing with the same name.
        mutable: Either a boolean to apply uniform mutability for all entries, or a
            dictionary mapping entry names to booleans for per-field control.
            Defaults to True.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.
        entries: Dictionary containing one or more key:value pairs that will be logged
            into the platform as entries.

    Returns:
        A message indicating whether the logs were successfully updated.
    """
    ret = _add_to_log(
        logs=logs,
        mode="entries",
        overwrite=overwrite,
        mutable=mutable,
        api_key=api_key,
        context=context,
        **entries,
    )
    if USR_LOGGING:
        logger.info(
            f"Added Entries {', '.join(list(entries.keys()))} "
            f"to Logs({', '.join([str(i) for i in _to_log_ids(logs)])})",
        )
    return ret
2
24
6
135
1
1108
1152
1108
logs,overwrite,mutable,api_key,context,**entries
['ret']
Dict[str, str]
{"Assign": 1, "Expr": 2, "If": 1, "Return": 1}
8
45
8
["_add_to_log", "logger.info", "join", "list", "entries.keys", "join", "str", "_to_log_ids"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.add_entries"]
The function (add_log_entries) is defined within the public class called public. The function starts at line 1108 and ends at line 1152. It contains 24 lines of code and has a cyclomatic complexity of 2. It takes 6 parameters (logs, overwrite, mutable, api_key, context, **entries) and returns a Dict[str, str] value. It calls 8 functions internally, which are ["_add_to_log", "logger.info", "join", "list", "entries.keys", "join", "str", "_to_log_ids"]. It has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.add_entries"].
unifyai_unify
public
public
0
0
update_logs
def update_logs(
    *,
    logs: Optional[Union[int, unify.Log, List[Union[int, unify.Log]]]] = None,
    context: Optional[Union[str, List[str]]] = None,
    params: Optional[Union[Dict[str, Any], List[Dict[str, Any]]]] = None,
    entries: Optional[Union[Dict[str, Any], List[Dict[str, Any]]]] = None,
    overwrite: bool = False,
    api_key: Optional[str] = None,
) -> Dict[str, str]:
    """Updates existing logs."""
    if not logs and not params and not entries:
        return {"detail": "No logs to update."}
    headers = _create_request_header(api_key)
    log_ids = _to_log_ids(logs)
    body = {
        "logs": log_ids,
        "context": context,
        "overwrite": overwrite,
    }
    if entries is not None:
        body["entries"] = entries
    if params is not None:
        body["params"] = params
    response = http.put(BASE_URL + "/logs", headers=headers, json=body)
    return response.json()
6
24
8
220
4
1155
1181
1155
logs,context,params,entries,overwrite,api_key
['headers', 'body', 'response', 'log_ids']
Dict[str, str]
{"Assign": 6, "Expr": 1, "If": 3, "Return": 2}
4
27
4
["_create_request_header", "_to_log_ids", "http.put", "response.json"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.update_entries", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.upload_dataset"]
The function (update_logs) is defined within the public class called public. The function starts at line 1155 and ends at line 1181. It contains 24 lines of code and has a cyclomatic complexity of 6. It takes 8 parameters, the named ones being logs, context, params, entries, overwrite, api_key, and returns a Dict[str, str] value. It calls 4 functions internally, which are ["_create_request_header", "_to_log_ids", "http.put", "response.json"]. It has 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.update_entries", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.upload_dataset"].
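`update_logs` only attaches `params` / `entries` to the request body when they are explicitly provided, so omitted fields are left untouched server-side. The body construction in isolation (`build_update_body` and the sample values are ours, not part of the library):

```python
def build_update_body(log_ids, context, overwrite, params=None, entries=None):
    """Mirror update_logs' request construction: params/entries are only
    added to the body when explicitly provided."""
    body = {"logs": log_ids, "context": context, "overwrite": overwrite}
    if entries is not None:
        body["entries"] = entries
    if params is not None:
        body["params"] = params
    return body


# Only entries supplied, so no "params" key appears in the body.
print(build_update_body([7], "eval", False, entries={"score": 1.0}))
```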
unifyai_unify
public
public
0
0
delete_logs
def delete_logs(
    *,
    logs: Optional[Union[int, unify.Log, List[Union[int, unify.Log]]]] = None,
    project: Optional[str] = None,
    context: Optional[str] = None,
    delete_empty_logs: bool = False,
    source_type: str = "all",
    api_key: Optional[str] = None,
) -> Dict[str, str]:
    """Deletes logs from a project.

    Args:
        logs: log(s) to delete from a project.
        project: Name of the project to delete logs from.
        context: Context of the logs to delete. Logs will be removed from that context
            instead of being entirely deleted, unless it is the last context associated
            with the log.
        delete_empty_logs: Whether to delete logs that become empty after deleting the
            specified fields.
        source_type: Type of logs to delete. Can be "all", "derived", or "base".
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.

    Returns:
        A message indicating whether the logs were successfully deleted.
    """
    if logs is None:
        logs = get_logs(project=project, context=context, api_key=api_key)
    if not logs:
        return {"message": "No logs to delete"}
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_READ.get()
    log_ids = _to_log_ids(logs)
    headers = _create_request_header(api_key)
    body = {
        "project": project,
        "context": context,
        "ids_and_fields": [(log_ids, None)],
        "source_type": source_type,
    }
    params = {"delete_empty_logs": delete_empty_logs}
    response = http.delete(
        BASE_URL + "/logs",
        headers=headers,
        params=params,
        json=body,
    )
    if USR_LOGGING:
        logger.info(f"Deleted Logs({', '.join([str(i) for i in log_ids])})")
    return response.json()
5
33
6
217
8
1184
1237
1184
logs,project,context,delete_empty_logs,source_type,api_key
['logs', 'body', 'context', 'headers', 'project', 'log_ids', 'params', 'response']
Dict[str, str]
{"Assign": 8, "Expr": 2, "If": 3, "Return": 2}
10
54
10
["get_logs", "_get_and_maybe_create_project", "CONTEXT_READ.get", "_to_log_ids", "_create_request_header", "http.delete", "logger.info", "join", "str", "response.json"]
4
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.upload_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.remove_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.store_entry"]
The function (delete_logs) is defined within the public class called public. The function starts at line 1184 and ends at line 1237. It contains 33 lines of code and has a cyclomatic complexity of 5. It takes 6 parameters (logs, project, context, delete_empty_logs, source_type, api_key) and returns a Dict[str, str] value. It calls 10 functions internally, which are ["get_logs", "_get_and_maybe_create_project", "CONTEXT_READ.get", "_to_log_ids", "_create_request_header", "http.delete", "logger.info", "join", "str", "response.json"]. It has 4 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.upload_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.remove_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.store_entry"].
unifyai_unify
public
public
0
0
delete_log_fields
def delete_log_fields(
    *,
    field: str,
    logs: Optional[Union[int, unify.Log, List[Union[int, unify.Log]]]] = None,
    project: Optional[str] = None,
    context: Optional[str] = None,
    api_key: Optional[str] = None,
) -> Dict[str, str]:
    """Deletes an entry from a log.

    Args:
        field: Name of the field to delete from the given logs.
        logs: log(s) to delete entries from.
        project: Name of the project to delete logs from.
        context: Context of the logs to delete entries from.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.

    Returns:
        A message indicating whether the log entries were successfully deleted.
    """
    log_ids = _to_log_ids(logs)
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_READ.get()
    body = {
        "project": project,
        "context": context,
        "ids_and_fields": [(log_ids, field)],
    }
    response = http.delete(
        BASE_URL + "/logs",
        headers=headers,
        json=body,
    )
    if USR_LOGGING:
        logger.info(
            f"Deleted Field `{field}` from Logs({', '.join([str(i) for i in log_ids])})",
        )
    return response.json()
3
28
5
171
7
1240
1285
1240
field,logs,project,context,api_key
['body', 'headers', 'context', 'log_ids', 'api_key', 'project', 'response']
Dict[str, str]
{"Assign": 7, "Expr": 2, "If": 1, "Return": 1}
10
46
10
["_to_log_ids", "_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "http.delete", "logger.info", "join", "str", "response.json"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.delete_entries"]
The function (delete_log_fields) is defined within the public class called public. The function starts at line 1240 and ends at line 1285. It contains 28 lines of code and has a cyclomatic complexity of 3. It takes 5 parameters (field, logs, project, context, api_key) and returns a Dict[str, str] value. It calls 10 functions internally, which are ["_to_log_ids", "_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "http.delete", "logger.info", "join", "str", "response.json"]. It has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.delete_entries"].
unifyai_unify
public
public
0
0
get_logs
def get_logs(
    *,
    project: Optional[str] = None,
    context: Optional[str] = None,
    column_context: Optional[str] = None,
    filter: Optional[str] = None,
    limit: Optional[int] = None,
    offset: int = 0,
    return_versions: Optional[bool] = None,
    group_threshold: Optional[int] = None,
    value_limit: Optional[int] = None,
    sorting: Optional[Dict[str, Any]] = None,
    group_sorting: Optional[Dict[str, Any]] = None,
    from_ids: Optional[List[int]] = None,
    exclude_ids: Optional[List[int]] = None,
    from_fields: Optional[List[str]] = None,
    exclude_fields: Optional[List[str]] = None,
    group_by: Optional[List[str]] = None,
    group_limit: Optional[int] = None,
    group_offset: Optional[int] = 0,
    group_depth: Optional[int] = None,
    nested_groups: Optional[bool] = True,
    groups_only: Optional[bool] = None,
    return_timestamps: Optional[bool] = None,
    return_ids_only: bool = False,
    api_key: Optional[str] = None,
) -> Union[List[unify.Log], Dict[str, Any]]:
    """Returns a list of filtered logs from a project.

    Args:
        project: Name of the project to get logs from.
        context: Context of the logs to get.
        column_context: Column context of the logs to get.
        filter: Boolean string to filter logs, for example:
            "(temperature > 0.5 and (len(system_msg) < 100 or 'no' in usr_response))"
        limit: The maximum number of logs to return. Default is None (unlimited).
        offset: The starting index of the logs to return. Default is 0.
        return_versions: Whether to return all versions of logs.
        group_threshold: Entries that appear in at least this many logs will be grouped together.
        value_limit: Maximum number of characters to return for string values.
        sorting: A dictionary specifying the sorting order for the logs by field names.
        group_sorting: A dictionary specifying the sorting order for the groups relative
            to each other based on aggregated metrics.
        from_ids: A list of log IDs to include in the results.
        exclude_ids: A list of log IDs to exclude from the results.
        from_fields: A list of field names to include in the results.
        exclude_fields: A list of field names to exclude from the results.
        group_by: A list of field names to group the logs by.
        group_limit: The maximum number of groups to return at each level.
        group_offset: Number of groups to skip at each level.
        group_depth: Maximum depth of nested groups to return.
        nested_groups: Whether to return nested groups.
        groups_only: Whether to return only the groups.
        return_timestamps: Whether to return the timestamps of the logs.
        return_ids_only: Whether to return only the log ids.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.

    Returns:
        The list of logs for the project, after optionally applying filtering.
    """
    # ToDo: add support for all context handlers
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(
        project,
        api_key=api_key,
        create_if_missing=False,
    )
    context = context if context else CONTEXT_READ.get()
    column_context = column_context if column_context else COLUMN_CONTEXT_READ.get()
    merged_filters = ACTIVE_PARAMS_READ.get() | ACTIVE_ENTRIES_READ.get()
    if merged_filters:
        _filter = " and ".join(f"{k}=={repr(v)}" for k, v in merged_filters.items())
        if filter:
            filter = f"({filter}) and ({_filter})"
        else:
            filter = _filter
    params = {
        "project": project,
        "context": context,
        "filter_expr": filter,
        "limit": limit,
        "offset": offset,
        "return_ids_only": return_ids_only,
        "column_context": column_context,
        "return_versions": return_versions,
        "group_threshold": group_threshold,
        "value_limit": value_limit,
        "sorting": json.dumps(sorting) if sorting is not None else None,
        "group_sorting": (json.dumps(group_sorting) if group_sorting is not None else None),
        "from_ids": "&".join(map(str, from_ids)) if from_ids else None,
        "exclude_ids": "&".join(map(str, exclude_ids)) if exclude_ids else None,
        "from_fields": "&".join(from_fields) if from_fields else None,
        "exclude_fields": "&".join(exclude_fields) if exclude_fields else None,
        "group_by": group_by,
        "group_limit": group_limit,
        "group_offset": group_offset,
        "group_depth": group_depth,
        "nested_groups": nested_groups,
        "groups_only": groups_only,
        "return_timestamps": return_timestamps,
    }
    response = http.get(BASE_URL + "/logs", headers=headers, params=params)
    if not group_by:
        if return_ids_only:
            return response.json()
        params, logs, _ = response.json().values()
        return [_create_log(dct, params, context, api_key) for dct in logs]
    if nested_groups:
        params, logs, _ = response.json().values()
        return _create_log_groups_nested(params, context, api_key, logs, {})
    else:
        params, groups, logs, _ = response.json().values()
        return _create_log_groups_not_nested(logs, groups, params, context, api_key)
16
82
24
650
10
1289
1431
1289
project,context,column_context,filter,limit,offset,return_versions,group_threshold,value_limit,sorting,group_sorting,from_ids,exclude_ids,from_fields,exclude_fields,group_by,group_limit,group_offset,group_depth,nested_groups,groups_only,return_timestamps,return_ids_only,api_key
['merged_filters', 'filter', '_filter', 'headers', 'context', 'api_key', 'project', 'params', 'response', 'column_context']
Union[List[unify.Log], Dict[str, Any]]
{"Assign": 14, "Expr": 1, "If": 5, "Return": 4}
29
143
29
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "COLUMN_CONTEXT_READ.get", "ACTIVE_PARAMS_READ.get", "ACTIVE_ENTRIES_READ.get", "join", "repr", "merged_filters.items", "json.dumps", "json.dumps", "join", "map", "join", "map", "join", "join", "http.get", "response.json", "values", "response.json", "_create_log", "values", "response.json", "_create_log_groups_nested", "values", "response.json", "_create_log_groups_not_nested"]
11
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3961676_fandoghpaas_fandogh_cli.fandogh_cli.service_commands_py.service_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.compositions_py.get_param_by_value", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.compositions_py.get_param_by_version", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.download_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.upload_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.delete_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.has_key", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.list_keys", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.remove_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.retrieve_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.store_entry"]
The function (get_logs) is defined within the public class called public. The function starts at line 1289 and ends at line 1431. It contains 82 lines of code and has a cyclomatic complexity of 16. It takes 24 parameters and returns a value. It calls 29 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "COLUMN_CONTEXT_READ.get", "ACTIVE_PARAMS_READ.get", "ACTIVE_ENTRIES_READ.get", "join", "repr", "merged_filters.items", "json.dumps", "json.dumps", "join", "map", "join", "map", "join", "join", "http.get", "response.json", "values", "response.json", "_create_log", "values", "response.json", "_create_log_groups_nested", "values", "response.json", "_create_log_groups_not_nested"]. It is called by 11 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3961676_fandoghpaas_fandogh_cli.fandogh_cli.service_commands_py.service_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.compositions_py.get_param_by_value", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.compositions_py.get_param_by_version", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.download_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.upload_dataset", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.delete_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.has_key", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.list_keys", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.remove_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.retrieve_entry", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.store_entry"].
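Before querying, get_logs merges any active param/entry filters into the user-supplied filter expression by AND-ing together equality checks. A minimal standalone sketch of that merging step (the helper name `merge_filters` is introduced here for illustration; it is not part of the source):

```python
def merge_filters(filter, merged_filters):
    # Mirrors get_logs' filter-merging step: build equality checks for each
    # active key/value pair, then AND them with the user-supplied expression
    # if one was given, otherwise use them alone.
    if merged_filters:
        _filter = " and ".join(f"{k}=={repr(v)}" for k, v in merged_filters.items())
        if filter:
            filter = f"({filter}) and ({_filter})"
        else:
            filter = _filter
    return filter

print(merge_filters("temperature > 0.5", {"mode": "eval"}))
# (temperature > 0.5) and (mode=='eval')
```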
unifyai_unify
public
public
0
0
get_log_by_id
def get_log_by_id(
    id: int,
    project: Optional[str] = None,
    *,
    api_key: Optional[str] = None,
) -> unify.Log:
    """Returns the log associated with a given id.

    Args:
        id: IDs of the logs to fetch.
        project: Name of the project to get logs from.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.

    Returns:
        The full set of log data.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    response = http.get(
        BASE_URL + "/logs",
        params={"project": project, "from_ids": [id]},
        headers=headers,
    )
    params, lgs, count = response.json().values()
    if len(lgs) == 0:
        raise Exception(f"Log with id {id} does not exist")
    lg = lgs[0]
    return unify.Log(
        id=lg["id"],
        ts=lg["ts"],
        **lg["entries"],
        **lg["derived_entries"],
        params={k: (v, params[k][v]) for k, v in lg["params"].items()},
        api_key=api_key,
    )
3
26
3
186
5
1435
1474
1435
id,project,api_key
['headers', 'project', 'api_key', 'response', 'lg']
unify.Log
{"Assign": 6, "Expr": 1, "If": 1, "Return": 1}
10
40
10
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "http.get", "values", "response.json", "len", "Exception", "unify.Log", "items"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.download"]
The function (get_log_by_id) is defined within the public class called public. The function starts at line 1435 and ends at line 1474. It contains 26 lines of code and has a cyclomatic complexity of 3. It takes 3 parameters and returns a value. It calls 10 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "http.get", "values", "response.json", "len", "Exception", "unify.Log", "items"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Log.download"].
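The dict comprehension inside get_log_by_id resolves each of the log's param versions into a (version, value) tuple by looking the version up in the project-level params payload. A standalone sketch, using made-up payload shapes (the response structure shown in the comments is an assumption inferred from the comprehension, not documented here):

```python
# Assumed shapes: `project_params` maps each param name to {version: value},
# while the log's own "params" field maps the name to the version it was
# logged under.
project_params = {"system_msg": {"v1": "be brief", "v2": "be verbose"}}
log_params = {"system_msg": "v2"}

# Same expression as in get_log_by_id: resolve each name to (version, value).
resolved = {k: (v, project_params[k][v]) for k, v in log_params.items()}
print(resolved)  # {'system_msg': ('v2', 'be verbose')}
```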
unifyai_unify
public
public
0
0
get_logs_metric
def get_logs_metric(
    *,
    metric: str,
    key: str,
    filter: Optional[str] = None,
    project: Optional[str] = None,
    context: Optional[str] = None,
    from_ids: Optional[List[int]] = None,
    exclude_ids: Optional[List[int]] = None,
    api_key: Optional[str] = None,
) -> Union[float, int, bool]:
    """Retrieve a set of log metrics across a project, after applying the filtering.

    Args:
        metric: The reduction metric to compute for the specified key. Supported are:
            sum, mean, var, std, min, max, median, mode.
        key: The key to compute the reduction statistic for.
        filter: The filtering to apply to the various log values, expressed as a string,
            for example:
            "(temperature > 0.5 and (len(system_msg) < 100 or 'no' in usr_response))"
        project: The id of the project to retrieve the logs for.
        context: The context of the logs to retrieve the metrics for.
        from_ids: A list of log IDs to include in the results.
        exclude_ids: A list of log IDs to exclude from the results.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.

    Returns:
        The full set of reduced log metrics for the project, after optionally applying
        the optional filtering.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    params = {
        "project": project,
        "filter_expr": filter,
        "key": key,
        "from_ids": "&".join(map(str, from_ids)) if from_ids else None,
        "exclude_ids": "&".join(map(str, exclude_ids)) if exclude_ids else None,
        "context": context if context else CONTEXT_READ.get(),
    }
    response = http.get(
        BASE_URL + f"/logs/metric/{metric}",
        headers=headers,
        params=params,
    )
    return response.json()
4
28
8
196
5
1478
1533
1478
metric,key,filter,project,context,from_ids,exclude_ids,api_key
['headers', 'project', 'api_key', 'response', 'params']
Union[float, int, bool]
{"Assign": 5, "Expr": 1, "Return": 1}
10
56
10
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "join", "map", "join", "map", "CONTEXT_READ.get", "http.get", "response.json"]
0
[]
The function (get_logs_metric) is defined within the public class called public. The function starts at line 1478 and ends at line 1533. It contains 28 lines of code and has a cyclomatic complexity of 4. It takes 8 parameters and returns a value. It calls 10 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "join", "map", "join", "map", "CONTEXT_READ.get", "http.get", "response.json"].
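Several of these endpoints (get_logs, get_logs_metric, get_logs_latest_timestamp) serialize id lists into a single query-string value by joining with "&", falling back to None when the list is empty or absent. The helper below reproduces that expression; the name `serialize_ids` is introduced here for illustration only:

```python
def serialize_ids(ids):
    # Same expression used for from_ids/exclude_ids in the query params:
    # "&"-separated string of ids, or None for an empty/absent list.
    return "&".join(map(str, ids)) if ids else None

print(serialize_ids([3, 14, 159]))  # 3&14&159
```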
unifyai_unify
public
public
0
0
get_groups
def get_groups(
    *,
    key: str,
    project: Optional[str] = None,
    context: Optional[str] = None,
    filter: Optional[Dict[str, Any]] = None,
    from_ids: Optional[List[int]] = None,
    exclude_ids: Optional[List[int]] = None,
    api_key: Optional[str] = None,
) -> Dict[str, List[Dict[str, Any]]]:
    """Returns a list of the different version/values of one entry within a given project
    based on its key.

    Args:
        key: Name of the log entry to do equality matching for.
        project: Name of the project to get logs from.
        context: The context to get groups from.
        filter: Boolean string to filter logs, for example:
            "(temperature > 0.5 and (len(system_msg) < 100 or 'no' in usr_response))"
        from_ids: A list of log IDs to include in the results.
        exclude_ids: A list of log IDs to exclude from the results.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.

    Returns:
        A dict containing the grouped logs, with each key of the dict representing the
        version of the log key with equal values, and the value being the equal value.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_READ.get()
    params = {
        "project": project,
        "context": context,
        "key": key,
        "filter_expr": filter,
        "from_ids": from_ids,
        "exclude_ids": exclude_ids,
    }
    response = http.get(BASE_URL + "/logs/groups", headers=headers, params=params)
    return response.json()
2
24
7
176
6
1536
1584
1536
key,project,context,filter,from_ids,exclude_ids,api_key
['headers', 'context', 'api_key', 'project', 'params', 'response']
Dict[str, List[Dict[str, Any]]]
{"Assign": 6, "Expr": 1, "Return": 1}
6
49
6
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "http.get", "response.json"]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.compositions_py.get_experiment_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.compositions_py.get_experiment_version"]
The function (get_groups) is defined within the public class called public. The function starts at line 1536 and ends at line 1584. It contains 24 lines of code and has a cyclomatic complexity of 2. It takes 7 parameters and returns a value. It calls 6 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "http.get", "response.json"]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.compositions_py.get_experiment_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.compositions_py.get_experiment_version"].
unifyai_unify
public
public
0
0
get_logs_latest_timestamp
def get_logs_latest_timestamp(
    *,
    project: Optional[str] = None,
    context: Optional[str] = None,
    column_context: Optional[str] = None,
    filter: Optional[str] = None,
    sort_by: Optional[str] = None,
    from_ids: Optional[List[int]] = None,
    exclude_ids: Optional[List[int]] = None,
    limit: Optional[int] = None,
    offset: Optional[int] = None,
    api_key: Optional[str] = None,
) -> int:
    """Returns the update timestamp of the most recently updated log within the
    specified page and filter bounds."""
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_READ.get()
    column_context = column_context if column_context else COLUMN_CONTEXT_READ.get()
    params = {
        "project": project,
        "context": context,
        "column_context": column_context,
        "filter_expr": filter,
        "sort_by": sort_by,
        "from_ids": "&".join(map(str, from_ids)) if from_ids else None,
        "exclude_ids": "&".join(map(str, exclude_ids)) if exclude_ids else None,
        "limit": limit,
        "offset": offset,
    }
    response = http.get(
        BASE_URL + "/logs/latest_timestamp",
        headers=headers,
        params=params,
    )
    return response.json()
5
35
10
242
7
1587
1624
1587
project,context,column_context,filter,sort_by,from_ids,exclude_ids,limit,offset,api_key
['headers', 'context', 'api_key', 'project', 'params', 'response', 'column_context']
int
{"Assign": 7, "Expr": 1, "Return": 1}
11
38
11
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "COLUMN_CONTEXT_READ.get", "join", "map", "join", "map", "http.get", "response.json"]
0
[]
The function (get_logs_latest_timestamp) is defined within the public class called public. The function starts at line 1587 and ends at line 1624. It contains 35 lines of code and has a cyclomatic complexity of 5. It takes 10 parameters and returns a value. It calls 11 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "COLUMN_CONTEXT_READ.get", "join", "map", "join", "map", "http.get", "response.json"].
unifyai_unify
public
public
0
0
update_derived_log
def update_derived_log(
    *,
    target: Union[List[int], Dict[str, str]],
    key: Optional[str] = None,
    equation: Optional[str] = None,
    referenced_logs: Optional[List[int]] = None,
    project: Optional[str] = None,
    context: Optional[str] = None,
    api_key: Optional[str] = None,
) -> None:
    """Update the derived entries for a log.

    Args:
        target: The derived logs to update
        key: New key name for the derived entries
        equation: New equation for computing derived values
        referenced_logs: Optional new referenced logs to use for computation.
        project: The project to update the derived logs for
        context: The context to update the derived logs for
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.

    Returns:
        A message indicating whether the derived logs were successfully updated.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_WRITE.get()
    body = {
        "project": project,
        "context": context,
        "target_derived_logs": target,
        "key": key,
        "equation": equation,
        "referenced_logs": referenced_logs,
    }
    response = http.put(BASE_URL + "/logs/derived", headers=headers, json=body)
    return response.json()
2
24
7
168
6
1627
1672
1627
target,key,equation,referenced_logs,project,context,api_key
['body', 'headers', 'context', 'api_key', 'project', 'response']
None
{"Assign": 6, "Expr": 1, "Return": 1}
6
46
6
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "http.put", "response.json"]
0
[]
The function (update_derived_log) is defined within the public class called public. The function starts at line 1627 and ends at line 1672. It contains 24 lines of code and has a cyclomatic complexity of 2. It takes 7 parameters and returns a value (the response JSON, despite its None annotation). It calls 6 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "http.put", "response.json"].
unifyai_unify
public
public
0
0
create_derived_logs
def create_derived_logs(
    *,
    key: str,
    equation: str,
    referenced_logs,
    derived: Optional[bool] = None,
    project: Optional[str] = None,
    context: Optional[str] = None,
    api_key: Optional[str] = None,
) -> Dict[str, Any]:
    """Creates one or more entries based on equation and referenced_logs.

    Args:
        key: The name of the entry.
        equation: The equation for computing the value of each derived entry.
        referenced_logs: The logs to use for each newly created derived entry,
            either as a list of log ids or as a set of arguments for the get_logs endpoint.
        derived: Whether to create derived logs (True) or static entries in base logs (False).

    Returns:
        A message indicating whether the derived logs were successfully created.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_WRITE.get()
    body = {
        "project": project,
        "context": context,
        "key": key,
        "equation": equation,
        "referenced_logs": referenced_logs,
    }
    if derived is not None:
        body["derived"] = derived
    response = http.post(BASE_URL + "/logs/derived", headers=headers, json=body)
    return response.json()
3
25
7
153
6
1675
1712
1675
key,equation,referenced_logs,derived,project,context,api_key
['body', 'headers', 'context', 'api_key', 'project', 'response']
Dict[str, Any]
{"Assign": 7, "Expr": 1, "If": 1, "Return": 1}
6
38
6
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "http.post", "response.json"]
0
[]
The function (create_derived_logs) is defined within the public class called public. The function starts at line 1675 and ends at line 1712. It contains 25 lines of code and has a cyclomatic complexity of 3. It takes 7 parameters and returns a value. It calls 6 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "http.post", "response.json"].
unifyai_unify
public
public
0
0
join_logs
def join_logs(
    *,
    pair_of_args: Tuple[Dict[str, Any], Dict[str, Any]],
    join_expr: str,
    mode: str,
    new_context: str,
    copy: Optional[bool] = True,
    columns: Optional[List[str]] = None,
    project: Optional[str] = None,
    api_key: Optional[str] = None,
):
    """Join two sets of logs based on specified criteria and creates new logs with
    the joined data."""
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    body = {
        "project": project,
        "pair_of_args": pair_of_args,
        "join_expr": join_expr,
        "mode": mode,
        "new_context": new_context,
        "columns": columns,
        "copy": copy,
    }
    response = http.post(BASE_URL + "/logs/join", headers=headers, json=body)
    return response.json()
1
25
8
155
5
1715
1742
1715
pair_of_args,join_expr,mode,new_context,copy,columns,project,api_key
['body', 'headers', 'project', 'api_key', 'response']
Returns
{"Assign": 5, "Expr": 1, "Return": 1}
5
28
5
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "http.post", "response.json"]
0
[]
The function (join_logs) is defined within the public class called public. The function starts at line 1715 and ends at line 1742. It contains 25 lines of code and has a cyclomatic complexity of 1. It takes 8 parameters and returns a value. It calls 5 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "http.post", "response.json"].
unifyai_unify
public
public
0
0
create_fields
def create_fields(
    fields: Union[Dict[str, Any], List[str]],
    *,
    backfill_logs: Optional[bool] = None,
    project: Optional[str] = None,
    context: Optional[str] = None,
    api_key: Optional[str] = None,
):
    """Creates one or more fields in a project.

    Args:
        fields: Dictionary mapping field names to their types (or None if no explicit type).
        project: Name of the project to create fields in.
        context: The context to create fields in.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_WRITE.get()
    if isinstance(fields, list):
        fields = {field: None for field in fields}
    body = {
        "project": project,
        "context": context,
        "fields": fields,
    }
    if backfill_logs is not None:
        body["backfill_logs"] = backfill_logs
    response = http.post(BASE_URL + "/logs/fields", headers=headers, json=body)
    return response.json()
5
23
5
164
7
1745
1780
1745
fields,backfill_logs,project,context,api_key
['fields', 'body', 'headers', 'context', 'api_key', 'project', 'response']
Returns
{"Assign": 8, "Expr": 1, "If": 2, "Return": 1}
7
36
7
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "isinstance", "http.post", "response.json"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923967_klen_peewee_migrate.peewee_migrate.auto_py.diff_one"]
The function (create_fields) is defined within the public class called public. The function starts at line 1745 and ends at line 1780. It contains 23 lines of code and has a cyclomatic complexity of 5. It takes 5 parameters and returns a value. It calls 7 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "isinstance", "http.post", "response.json"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3923967_klen_peewee_migrate.peewee_migrate.auto_py.diff_one"].
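create_fields accepts either a dict of field name to type, or a bare list of names; a list is normalized to a dict with None types before the request body is built. The isolated normalization step (the helper name `normalize_fields` is introduced here for illustration):

```python
def normalize_fields(fields):
    # Mirrors create_fields' input normalization: a list of names becomes a
    # dict mapping each name to None (meaning "no explicit type").
    if isinstance(fields, list):
        fields = {field: None for field in fields}
    return fields

print(normalize_fields(["score", "latency"]))  # {'score': None, 'latency': None}
```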
unifyai_unify
public
public
0
0
rename_field
def rename_field(
    name: str,
    new_name: str,
    *,
    project: Optional[str] = None,
    context: Optional[str] = None,
    api_key: Optional[str] = None,
):
    """Rename a field in a project.

    Args:
        name: The name of the field to rename.
        new_name: The new name for the field.
        project: Name of the project to rename the field in.
        context: The context to rename the field in.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_WRITE.get()
    body = {
        "project": project,
        "context": context,
        "old_field_name": name,
        "new_field_name": new_name,
    }
    response = http.patch(
        BASE_URL + "/logs/rename_field",
        headers=headers,
        json=body,
    )
    return response.json()
2
24
5
120
6
1783
1821
1783
name,new_name,project,context,api_key
['body', 'headers', 'context', 'api_key', 'project', 'response']
Returns
{"Assign": 6, "Expr": 1, "Return": 1}
6
39
6
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "http.patch", "response.json"]
0
[]
The function (rename_field) is defined within the public class called public. The function starts at line 1783 and ends at line 1821. It contains 24 lines of code and has a cyclomatic complexity of 2. It takes 5 parameters and returns a value. It calls 6 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "http.patch", "response.json"].
unifyai_unify
public
public
0
0
get_fields
def get_fields(
    *,
    project: Optional[str] = None,
    context: Optional[str] = None,
    api_key: Optional[str] = None,
):
    """Get a dictionary of field names and their types.

    Args:
        project: Name of the project to get fields from.
        context: The context to get fields from.
        api_key: If specified, unify API key to be used. Defaults to the value in the
            `UNIFY_KEY` environment variable.

    Returns:
        A dictionary of field names and their types.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_READ.get()
    params = {
        "project": project,
        "context": context,
    }
    response = http.get(BASE_URL + "/logs/fields", headers=headers, params=params)
    return response.json()
2
16
3
103
6
1824
1853
1824
project,context,api_key
['headers', 'context', 'api_key', 'project', 'params', 'response']
Returns
{"Assign": 6, "Expr": 1, "Return": 1}
6
30
6
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "http.get", "response.json"]
7
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3656182_jhuapl_boss_boss.django.bosscore.serializers_py.ExperimentSerializer.get_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.props_py.PropsMixin._get_fields_with_attr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.props_py.PropsMixin.get_all_fields_with_instance", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.props_py.PropsMixin.get_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.props_py.PropsMixin.get_fields_with_instance", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.utils_py.filter_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.77580766_uninett_argus.src.argus.site.serializers_py.MetadataSerializer.get_fields"]
The function (get_fields) is defined within the public class called public. The function starts at line 1824 and ends at line 1853. It contains 16 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters and returns a value. It calls 6 functions: ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_READ.get", "http.get", "response.json"]. It is called by 7 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3656182_jhuapl_boss_boss.django.bosscore.serializers_py.ExperimentSerializer.get_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.props_py.PropsMixin._get_fields_with_attr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.props_py.PropsMixin.get_all_fields_with_instance", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.props_py.PropsMixin.get_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.props_py.PropsMixin.get_fields_with_instance", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.database.utils_py.filter_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.77580766_uninett_argus.src.argus.site.serializers_py.MetadataSerializer.get_fields"].
unifyai_unify
public
public
0
0
delete_fields
def delete_fields(
    fields: List[str],
    *,
    project: Optional[str] = None,
    context: Optional[str] = None,
    api_key: Optional[str] = None,
):
    """
    Delete one or more fields from a project.

    Args:
        fields: List of field names to delete.
        project: Name of the project to delete fields from.
        context: The context to delete fields from.
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    project = _get_and_maybe_create_project(project, api_key=api_key)
    context = context if context else CONTEXT_WRITE.get()
    body = {
        "project": project,
        "context": context,
        "fields": fields,
    }
    response = http.delete(
        BASE_URL + "/logs/fields",
        headers=headers,
        json=body,
    )
    return response.json()
2
22
4
115
6
1856
1890
1856
fields,project,context,api_key
['body', 'headers', 'context', 'api_key', 'project', 'response']
Returns
{"Assign": 6, "Expr": 1, "Return": 1}
6
35
6
["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "http.delete", "response.json"]
0
[]
The function (delete_fields) defined within the public class called public. The function starts at line 1856 and ends at line 1890. It contains 22 lines of code and has a cyclomatic complexity of 2. It takes 4 parameters, represented as [1856.0], and this function returns a value. It has 6 functions called inside, which are ["_validate_api_key", "_create_request_header", "_get_and_maybe_create_project", "CONTEXT_WRITE.get", "http.delete", "response.json"].
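delete_fields resolves its context as `context if context else CONTEXT_WRITE.get()`. This fallback rests on contextvars; the sketch below is an illustrative stand-in that assumes (not confirmed by this data) that CONTEXT_WRITE is a ContextVar with a None default.

```python
from contextvars import ContextVar
from typing import Optional

# Hypothetical stand-in for the module-level CONTEXT_WRITE variable.
CONTEXT_WRITE: ContextVar[Optional[str]] = ContextVar("context_write", default=None)

def resolve_context(context: Optional[str]) -> Optional[str]:
    # Same fallback as in delete_fields: an explicit argument wins,
    # otherwise the ambient context-variable value is used.
    return context if context else CONTEXT_WRITE.get()
```

Using a ContextVar (rather than a plain global) keeps the ambient context correct per thread and per asyncio task.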
unifyai_unify
public
public
0
0
set_user_logging
def set_user_logging(value: bool):
    global USR_LOGGING
    USR_LOGGING = value
1
3
1
12
1
1897
1899
1897
value
['USR_LOGGING']
None
{"Assign": 1}
0
3
0
[]
0
[]
The function (set_user_logging) defined within the public class called public. The function starts at line 1897 and ends at line 1899. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter, represented as [1897.0], and does not return any value.
unifyai_unify
public
public
0
0
create_project
def create_project(
    name: str,
    *,
    overwrite: Union[bool, str] = False,
    api_key: Optional[str] = None,
    is_versioned: bool = True,
) -> Dict[str, str]:
    """
    Creates a logging project and adds this to your account. This project will have
    a set of logs associated with it.

    Args:
        name: A unique, user-defined name used when referencing the project.
        overwrite: Controls how to handle existing projects with the same name.
        If False (default), raises an error if project exists.
        If True, deletes the entire existing project before creating new one.
        If "logs", only deletes the project's logs before creating.
        If "contexts", only deletes the project's contexts before creating.
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.
        is_versioned: Whether the project is tracked via version control.

    Returns:
        A message indicating whether the project was created successfully.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    body = {"name": name, "is_versioned": is_versioned}
    if overwrite:
        if name in list_projects(api_key=api_key):
            if overwrite == "logs":
                return delete_project_logs(name=name, api_key=api_key)
            elif overwrite == "contexts":
                return delete_project_contexts(name=name, api_key=api_key)
            else:
                delete_project(name=name, api_key=api_key)
    response = http.post(BASE_URL + "/project", headers=headers, json=body)
    return response.json()
5
20
4
148
4
12
52
12
name,overwrite,api_key,is_versioned
['headers', 'body', 'api_key', 'response']
Dict[str, str]
{"Assign": 4, "Expr": 2, "If": 4, "Return": 3}
8
41
8
["_validate_api_key", "_create_request_header", "list_projects", "delete_project_logs", "delete_project_contexts", "delete_project", "http.post", "response.json"]
21
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker._test_no_changes_made", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_first_release", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists_and_in_check", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists_and_staged", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists_but_not_in_check", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists_hidden", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_missing", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_get_default_compare_branch_missing", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_git_fails", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_ignored_files", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_invalid_fragment_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_issue_pattern", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_issue_pattern_invalid", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_issue_pattern_invalid_with_suffix", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_none_stdout_encoding_works", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_release_branch", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_hg_py.TestHg.test_complete_scenario", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_vcs_py.TestVCS.do_test_vcs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.Project.create", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.activate"]
The function (create_project) defined within the public class called public. The function starts at line 12 and ends at line 52. It contains 20 lines of code and has a cyclomatic complexity of 5. It takes 4 parameters, represented as [12.0], and this function returns a value. It has 8 functions called inside, which are ["_validate_api_key", "_create_request_header", "list_projects", "delete_project_logs", "delete_project_contexts", "delete_project", "http.post", "response.json"], and 21 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker._test_no_changes_made", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_first_release", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists_and_in_check", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists_and_staged", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists_but_not_in_check", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_exists_hidden", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_fragment_missing", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_get_default_compare_branch_missing", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_git_fails", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_ignored_files", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_invalid_fragment_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_issue_pattern", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_issue_pattern_invalid", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_issue_pattern_invalid_with_suffix", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_none_stdout_encoding_works", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_check_py.TestChecker.test_release_branch", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_hg_py.TestHg.test_complete_scenario", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963047_twisted_towncrier.src.towncrier.test.test_vcs_py.TestVCS.do_test_vcs", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.Project.create", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.activate"].
unifyai_unify
public
public
0
0
rename_project
def rename_project(
    name: str,
    new_name: str,
    *,
    api_key: Optional[str] = None,
) -> Dict[str, str]:
    """
    Renames a project from `name` to `new_name` in your account.

    Args:
        name: Name of the project to rename.
        new_name: A unique, user-defined name used when referencing the project.
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.

    Returns:
        A message indicating whether the project was successfully renamed.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    body = {"name": new_name}
    response = http.patch(
        BASE_URL + f"/project/{name}",
        headers=headers,
        json=body,
    )
    return response.json()
1
15
3
76
4
55
83
55
name,new_name,api_key
['headers', 'body', 'api_key', 'response']
Dict[str, str]
{"Assign": 4, "Expr": 1, "Return": 1}
4
29
4
["_validate_api_key", "_create_request_header", "http.patch", "response.json"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.Project.rename"]
The function (rename_project) defined within the public class called public. The function starts at line 55 and ends at line 83. It contains 15 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters, represented as [55.0], and this function returns a value. It has 4 functions called inside, which are ["_validate_api_key", "_create_request_header", "http.patch", "response.json"], and 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.Project.rename"].
unifyai_unify
public
public
0
0
delete_project
def delete_project(
    name: str,
    *,
    api_key: Optional[str] = None,
) -> str:
    """
    Deletes a project from your account.

    Args:
        name: Name of the project to delete.
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.

    Returns:
        Whether the project was successfully deleted.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    response = http.delete(BASE_URL + f"/project/{name}", headers=headers)
    return response.json()
1
9
2
55
3
86
106
86
name,api_key
['headers', 'api_key', 'response']
str
{"Assign": 3, "Expr": 1, "Return": 1}
4
21
4
["_validate_api_key", "_create_request_header", "http.delete", "response.json"]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.projects_py.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.Project.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.projects_py.create_project"]
The function (delete_project) defined within the public class called public. The function starts at line 86 and ends at line 106. It contains 9 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters, represented as [86.0], and this function returns a value. It has 4 functions called inside, which are ["_validate_api_key", "_create_request_header", "http.delete", "response.json"], and 3 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981965_allegroai_clearml_server.apiserver.services.projects_py.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.Project.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.projects_py.create_project"].
unifyai_unify
public
public
0
0
delete_project_logs
def delete_project_logs(
    name: str,
    *,
    api_key: Optional[str] = None,
) -> None:
    """
    Deletes all logs from a project.

    Args:
        name: Name of the project to delete logs from.
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    response = http.delete(BASE_URL + f"/project/{name}/logs", headers=headers)
    return response.json()
1
9
2
55
3
109
126
109
name,api_key
['headers', 'api_key', 'response']
None
{"Assign": 3, "Expr": 1, "Return": 1}
4
18
4
["_validate_api_key", "_create_request_header", "http.delete", "response.json"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.projects_py.create_project"]
The function (delete_project_logs) defined within the public class called public. The function starts at line 109 and ends at line 126. It contains 9 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters, represented as [109.0], and this function returns a value. It has 4 functions called inside, which are ["_validate_api_key", "_create_request_header", "http.delete", "response.json"], and 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.projects_py.create_project"].
unifyai_unify
public
public
0
0
delete_project_contexts
def delete_project_contexts(
    name: str,
    *,
    api_key: Optional[str] = None,
) -> None:
    """
    Deletes all contexts and their associated logs from a project.

    Args:
        name: Name of the project to delete contexts from.
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    response = http.delete(BASE_URL + f"/project/{name}/contexts", headers=headers)
    return response.json()
1
9
2
55
3
129
146
129
name,api_key
['headers', 'api_key', 'response']
None
{"Assign": 3, "Expr": 1, "Return": 1}
4
18
4
["_validate_api_key", "_create_request_header", "http.delete", "response.json"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.projects_py.create_project"]
The function (delete_project_contexts) defined within the public class called public. The function starts at line 129 and ends at line 146. It contains 9 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters, represented as [129.0], and this function returns a value. It has 4 functions called inside, which are ["_validate_api_key", "_create_request_header", "http.delete", "response.json"], and 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.projects_py.create_project"].
unifyai_unify
public
public
0
0
list_projects
def list_projects(
    *,
    api_key: Optional[str] = None,
) -> List[str]:
    """
    Returns the names of all projects stored in your account.

    Args:
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.

    Returns:
        List of all project names.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    response = http.get(BASE_URL + "/projects", headers=headers)
    return response.json()
1
8
1
53
3
149
166
149
api_key
['headers', 'api_key', 'response']
List[str]
{"Assign": 3, "Expr": 1, "Return": 1}
4
18
4
["_validate_api_key", "_create_request_header", "http.get", "response.json"]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.Project.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.activate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.projects_py.create_project"]
The function (list_projects) defined within the public class called public. The function starts at line 149 and ends at line 166. It contains 8 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter, represented as [149.0], and this function returns a value. It has 4 functions called inside, which are ["_validate_api_key", "_create_request_header", "http.get", "response.json"], and 3 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.Project.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.__init___py.activate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.projects_py.create_project"].
unifyai_unify
public
public
0
0
commit_project
def commit_project(
    name: str,
    commit_message: str,
    *,
    api_key: Optional[str] = None,
) -> Dict[str, str]:
    """
    Creates a commit for the entire project, saving a snapshot of all versioned contexts.

    Args:
        name: Name of the project to commit.
        commit_message: A description of the changes being saved.
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.

    Returns:
        A dictionary containing the new commit_hash.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    body = {"commit_message": commit_message}
    response = http.post(
        BASE_URL + f"/project/{name}/commit",
        headers=headers,
        json=body,
    )
    return response.json()
1
15
3
76
4
169
195
169
name,commit_message,api_key
['headers', 'body', 'api_key', 'response']
Dict[str, str]
{"Assign": 4, "Expr": 1, "Return": 1}
4
27
4
["_validate_api_key", "_create_request_header", "http.post", "response.json"]
0
[]
The function (commit_project) defined within the public class called public. The function starts at line 169 and ends at line 195. It contains 15 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters, represented as [169.0], and this function returns a value. It has 4 functions called inside, which are ["_validate_api_key", "_create_request_header", "http.post", "response.json"].
unifyai_unify
public
public
0
0
rollback_project
def rollback_project(
    name: str,
    commit_hash: str,
    *,
    api_key: Optional[str] = None,
) -> Dict[str, str]:
    """
    Rolls back the entire project to a specific commit.

    Args:
        name: Name of the project to roll back.
        commit_hash: The hash of the commit to restore.
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.

    Returns:
        A message indicating the success of the rollback operation.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    body = {"commit_hash": commit_hash}
    response = http.post(
        BASE_URL + f"/project/{name}/rollback",
        headers=headers,
        json=body,
    )
    return response.json()
1
15
3
76
4
198
224
198
name,commit_hash,api_key
['headers', 'body', 'api_key', 'response']
Dict[str, str]
{"Assign": 4, "Expr": 1, "Return": 1}
4
27
4
["_validate_api_key", "_create_request_header", "http.post", "response.json"]
0
[]
The function (rollback_project) defined within the public class called public. The function starts at line 198 and ends at line 224. It contains 15 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters, represented as [198.0], and this function returns a value. It has 4 functions called inside, which are ["_validate_api_key", "_create_request_header", "http.post", "response.json"].
unifyai_unify
public
public
0
0
get_project_commits
def get_project_commits(name: str, *, api_key: Optional[str] = None) -> List[Dict]:
    """
    Retrieves the commit history for a project.

    Args:
        name: Name of the project.
        api_key: If specified, unify API key to be used. Defaults to the value in the
        `UNIFY_KEY` environment variable.

    Returns:
        A list of dictionaries, each representing a commit.
    """
    api_key = _validate_api_key(api_key)
    headers = _create_request_header(api_key)
    response = http.get(BASE_URL + f"/project/{name}/commits", headers=headers)
    return response.json()
1
5
2
57
3
227
242
227
name,api_key
['headers', 'api_key', 'response']
List[Dict]
{"Assign": 3, "Expr": 1, "Return": 1}
4
16
4
["_validate_api_key", "_create_request_header", "http.get", "response.json"]
0
[]
The function (get_project_commits) defined within the public class called public. The function starts at line 227 and ends at line 242. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters, represented as [227.0], and this function returns a value. It has 4 functions called inside, which are ["_validate_api_key", "_create_request_header", "http.get", "response.json"].
unifyai_unify
TraceLoader
public
0
1
__init__
def __init__(self, original_loader, filter: Callable = None):
    self._original_loader = original_loader
    self.filter = filter
1
3
3
23
0
9
11
9
self,original_loader,filter
[]
None
{"Assign": 2}
0
3
0
[]
14,667
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]
The function (__init__) defined within the public class called TraceLoader, that inherits another class. The function starts at line 9 and ends at line 11. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters, represented as [9.0], and does not return any value. It has 14667 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"].
unifyai_unify
TraceLoader
public
0
1
create_module
def create_module(self, spec):
    return self._original_loader.create_module(spec)
1
2
2
16
0
13
14
13
self,spec
[]
Returns
{"Return": 1}
1
2
1
["self._original_loader.create_module"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3951091_pyjs_pyjs.pyjs.testing_py.module_from_src"]
The function (create_module) is defined within the public class called TraceLoader, which inherits from another class. It starts at line 13 and ends at line 14. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters, represented as [13.0], and returns a value. It calls 1 function, ["self._original_loader.create_module"], and has 1 function calling it, ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3951091_pyjs_pyjs.pyjs.testing_py.module_from_src"].
unifyai_unify
TraceLoader
public
0
1
exec_module
def exec_module(self, module):
    self._original_loader.exec_module(module)
    unify.traced(module, filter=self.filter)
1
3
2
27
0
16
18
16
self,module
[]
None
{"Expr": 2}
2
3
2
["self._original_loader.exec_module", "unify.traced"]
0
[]
The function (exec_module) is defined within the public class called TraceLoader, which inherits from another class. It starts at line 16 and ends at line 18. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters, represented as [16.0], and does not return a value. It calls 2 functions: ["self._original_loader.exec_module", "unify.traced"].
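The create_module and exec_module records above show pure delegation to a wrapped loader, with exec_module post-processing the freshly executed module via unify.traced. A self-contained sketch of the same delegation pattern (DelegatingLoader and the __wrapped_by__ tag are hypothetical stand-ins for TraceLoader and unify.traced, not the library's names):

```python
import importlib.util
import sys


class DelegatingLoader:
    """Forward create_module / exec_module to the original loader,
    then post-process the executed module (here: just tag it)."""

    def __init__(self, original_loader):
        self._original_loader = original_loader

    def create_module(self, spec):
        # Delegate; SourceFileLoader.create_module returns None,
        # which tells importlib to use default module creation.
        return self._original_loader.create_module(spec)

    def exec_module(self, module):
        self._original_loader.exec_module(module)
        module.__wrapped_by__ = "DelegatingLoader"  # stand-in for unify.traced


# Build a fresh spec for a small stdlib module and swap in the wrapper.
sys.modules.pop("colorsys", None)
spec = importlib.util.find_spec("colorsys")
spec.loader = DelegatingLoader(spec.loader)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
print(module.__wrapped_by__)  # DelegatingLoader
```

Because the wrapper preserves the loader protocol, the import machinery (or module_from_spec, as here) never notices the indirection.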
unifyai_unify
TraceLoader
public
0
1
__init__
def __init__(self, original_loader, filter: Callable = None):
    self._original_loader = original_loader
    self.filter = filter
1
3
3
28
0
22
24
22
self,original_loader,filter
[]
None
{"Assign": 2}
0
3
0
[]
14667
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]
The function (__init__) defined within the public class called TraceLoader, that inherit another class.The function start at line 22 and ends at 24. It contains 3 lines of code and it has a cyclomatic complexity of 1. It takes 3 parameters, represented as [22.0] and does not return any value. It has 14667.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"].
unifyai_unify
TraceFinder
public
0
1
find_spec
def find_spec(self, fullname, path, target=None):
    for target_module in self.targets:
        if not fullname.startswith(target_module):
            return None
    original_sys_meta_path = sys.meta_path[:]
    sys.meta_path = [finder for finder in sys.meta_path if not isinstance(finder, TraceFinder)]
    try:
        spec = importlib.util.find_spec(fullname, path)
        if spec is None:
            return None
    finally:
        sys.meta_path = original_sys_meta_path
    if spec.origin is None or not spec.origin.endswith(".py"):
        return None
    spec.loader = TraceLoader(spec.loader, filter=self.filter)
    return spec
9
18
4
125
0
26
46
26
self,fullname,path,target
[]
Returns
{"Assign": 5, "For": 1, "If": 3, "Return": 4, "Try": 1}
5
21
5
["fullname.startswith", "isinstance", "importlib.util.find_spec", "spec.origin.endswith", "TraceLoader"]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913638_scijava_scyjava.src.scyjava._convert_py._stock_py_converters", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3968768_linbit_linstor_client.linstor_client.argcomplete._check_module_py.find", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.73263466_fpgmaas_deptry.python.deptry.module_py.ModuleBuilder._is_package_installed"]
The function (find_spec) is defined within the public class called TraceFinder, which inherits from another class. It starts at line 26 and ends at line 46. It contains 18 lines of code and has a cyclomatic complexity of 9. It takes 4 parameters, represented as [26.0], and returns a value. It calls 5 functions: ["fullname.startswith", "isinstance", "importlib.util.find_spec", "spec.origin.endswith", "TraceLoader"]. It has 3 functions calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913638_scijava_scyjava.src.scyjava._convert_py._stock_py_converters", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3968768_linbit_linstor_client.linstor_client.argcomplete._check_module_py.find", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.73263466_fpgmaas_deptry.python.deptry.module_py.ModuleBuilder._is_package_installed"].
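The find_spec record above temporarily strips the finder from sys.meta_path before delegating to importlib.util.find_spec, so the lookup cannot recurse back into the same finder. A minimal runnable sketch of that self-removal pattern (LoggingFinder and its attribute names are illustrative, not from the library):

```python
import importlib
import importlib.util
import sys


class LoggingFinder:
    """Meta-path finder that records matching imports, then
    resolves the spec with itself hidden from sys.meta_path."""

    def __init__(self, prefixes):
        self.prefixes = prefixes
        self.seen = []

    def find_spec(self, fullname, path, target=None):
        if not any(fullname.startswith(p) for p in self.prefixes):
            return None
        self.seen.append(fullname)
        # Hide this finder so importlib.util.find_spec does not
        # call straight back into us (same trick as find_spec above).
        saved = sys.meta_path[:]
        sys.meta_path = [f for f in sys.meta_path if f is not self]
        try:
            return importlib.util.find_spec(fullname, path)
        finally:
            sys.meta_path = saved


finder = LoggingFinder(["colorsys"])
sys.meta_path.insert(0, finder)
try:
    sys.modules.pop("colorsys", None)  # force a fresh import
    importlib.import_module("colorsys")
finally:
    sys.meta_path.remove(finder)

print(finder.seen)  # ['colorsys']
```

Restoring sys.meta_path in a finally block, as the recorded code also does, keeps the hook installed even if spec resolution raises.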
unifyai_unify
public
public
0
0
install_tracing_hook
def install_tracing_hook(targets: List[str], filter: Callable = None):
    """Install an import hook that wraps imported modules with the traced decorator.

    This function adds a TraceFinder to sys.meta_path that will intercept module imports
    and wrap them with the traced decorator. The hook will only be installed if one
    doesn't already exist.

    Args:
        targets: List of module name prefixes to target for tracing. Only modules
            whose names start with these prefixes will be wrapped.
        filter: A filter function that is passed to the traced decorator.
    """
    if not any(isinstance(finder, TraceFinder) for finder in sys.meta_path):
        sys.meta_path.insert(0, TraceFinder(targets, filter))
3
3
2
50
0
49
64
49
targets,filter
[]
None
{"Expr": 2, "If": 1}
4
16
4
["any", "isinstance", "sys.meta_path.insert", "TraceFinder"]
0
[]
The function (install_tracing_hook) is defined within the public class called public. It starts at line 49 and ends at line 64. It contains 3 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters, represented as [49.0], and does not return a value. It calls 4 functions: ["any", "isinstance", "sys.meta_path.insert", "TraceFinder"].
unifyai_unify
public
public
0
0
disable_tracing_hook
def disable_tracing_hook():
    """Remove the tracing import hook from sys.meta_path.

    This function removes any TraceFinder instances from sys.meta_path, effectively
    disabling the tracing functionality for subsequent module imports.
    """
    # Iterate over a copy: removing from the live list while iterating it
    # would skip the element following each removal.
    for finder in sys.meta_path[:]:
        if isinstance(finder, TraceFinder):
            sys.meta_path.remove(finder)
3
4
0
28
0
67
76
67
[]
None
{"Expr": 2, "For": 1, "If": 1}
2
10
2
["isinstance", "sys.meta_path.remove"]
0
[]
The function (disable_tracing_hook) is defined within the public class called public. It starts at line 67 and ends at line 76. It contains 4 lines of code and has a cyclomatic complexity of 3. It takes no parameters and does not return a value. It calls 2 functions: ["isinstance", "sys.meta_path.remove"].
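Removing finders from sys.meta_path safely requires iterating over a copy; removing from the live list during iteration skips the element after each removal, leaving adjacent instances behind. A sketch with a hypothetical _Marker class:

```python
import sys


class _Marker:
    """Placeholder finder class for the removal demo."""


# Install two adjacent marker instances at the front of the meta path.
sys.meta_path[:0] = [_Marker(), _Marker()]

# Iterating a copy (sys.meta_path[:]) catches every instance, including
# adjacent ones that a live-list iteration would skip.
for finder in sys.meta_path[:]:
    if isinstance(finder, _Marker):
        sys.meta_path.remove(finder)

left = sum(isinstance(f, _Marker) for f in sys.meta_path)
print(left)  # 0
```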
unifyai_unify
public
public
0
0
_usr_msg_to_prompt
def _usr_msg_to_prompt(user_message: str) -> Prompt:
    return Prompt(user_message)
1
2
1
14
0
9
10
9
user_message
[]
Prompt
{"Return": 1}
1
2
1
["Prompt"]
0
[]
The function (_usr_msg_to_prompt) is defined within the public class called public. It starts at line 9 and ends at line 10. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter, represented as [9.0], and returns a value. It calls 1 function: ["Prompt"].
unifyai_unify
public
public
0
0
_bool_to_float
def _bool_to_float(boolean: bool) -> float:
    return float(boolean)
1
2
1
14
0
13
14
13
boolean
[]
float
{"Return": 1}
1
2
1
["float"]
0
[]
The function (_bool_to_float) is defined within the public class called public. It starts at line 13 and ends at line 14. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter, represented as [13.0], and returns a value. It calls 1 function: ["float"].
unifyai_unify
public
public
0
0
_prompt_to_usr_msg
def _prompt_to_usr_msg(prompt: Prompt) -> str:
    return prompt.messages[-1]["content"]
1
2
1
20
0
20
21
20
prompt
[]
str
{"Return": 1}
0
2
0
[]
0
[]
The function (_prompt_to_usr_msg) is defined within the public class called public. It starts at line 20 and ends at line 21. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter, represented as [20.0], and returns a value.
unifyai_unify
public
public
0
0
_chat_completion_to_assis_msg
def _chat_completion_to_assis_msg(chat_completion: ChatCompletion) -> str:
    return chat_completion.choices[0].message.content
1
2
1
20
0
24
25
24
chat_completion
[]
str
{"Return": 1}
0
2
0
[]
0
[]
The function (_chat_completion_to_assis_msg) is defined within the public class called public. It starts at line 24 and ends at line 25. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter, represented as [24.0], and returns a value.
unifyai_unify
public
public
0
0
_float_to_bool
def _float_to_bool(float_in: float) -> bool:
    return bool(float_in)
1
2
1
14
0
28
29
28
float_in
[]
bool
{"Return": 1}
1
2
1
["bool"]
0
[]
The function (_float_to_bool) is defined within the public class called public. It starts at line 28 and ends at line 29. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter, represented as [28.0], and returns a value. It calls 1 function: ["bool"].
unifyai_unify
public
public
0
0
_cast_from_selection
def _cast_from_selection(
    inp: Union[str, bool, float, Prompt, ChatCompletion],
    targets: List[Union[float, Prompt, ChatCompletion]],
) -> Union[str, bool, float, Prompt, ChatCompletion]:
    """Upcasts the input if possible, based on the permitted upcasting targets provided.

    Args:
        inp: The input to cast.
        targets: The set of permitted upcasting targets.

    Returns:
        The input after casting to the new type, if it was possible.
    """
    input_type = type(inp)
    assert input_type in _CAST_DICT, (
        "Cannot upcast input {} of type {}, because this type is not in the "
        "_CAST_DICT, meaning there are no functions for casting this type.".format(
            inp, input_type,
        )
    )
    cast_fns = _CAST_DICT[input_type]
    targets = [target for target in targets if target in cast_fns]
    assert len(targets) == 1, "There must be exactly one valid casting target."
    to_type = targets[0]
    return cast_fns[to_type](inp)
3
14
2
104
4
49
73
49
inp,targets
['input_type', 'cast_fns', 'to_type', 'targets']
Union[str, bool, float, Prompt, ChatCompletion]
{"Assign": 4, "Expr": 1, "Return": 1}
2
25
2
["type", "len"]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.universal_api.casting_py.cast"]
The function (_cast_from_selection) is defined within the public class called public. It starts at line 49 and ends at line 73. It contains 14 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters, represented as [49.0], and returns a value. It calls 2 functions: ["type", "len"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.universal_api.casting_py.cast"].
unifyai_unify
public
public
0
0
cast
def cast(
    inp: Union[str, bool, float, Prompt, ChatCompletion],
    to_type: Union[
        Type[Union[str, bool, float, Prompt, ChatCompletion]],
        List[Type[Union[str, bool, float, Prompt, ChatCompletion]]],
    ],
) -> Union[str, bool, float, Prompt, ChatCompletion]:
    """Cast the input to the specified type.

    Args:
        inp: The input to cast.
        to_type: The type to cast the input to.

    Returns:
        The input after casting to the new type.
    """
    if isinstance(to_type, list):
        return _cast_from_selection(inp, to_type)
    input_type = type(inp)
    if input_type is to_type:
        return inp
    return _CAST_DICT[input_type][to_type](inp)
3
13
2
113
1
79
102
79
inp,to_type
['input_type']
Union[str, bool, float, Prompt, ChatCompletion]
{"Assign": 1, "Expr": 1, "If": 2, "Return": 3}
3
24
3
["isinstance", "_cast_from_selection", "type"]
1,647
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group._default_group_class", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.add_command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.format_subcommand_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.get_current_context", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.Constraint.check", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.constraints.conftest_py.sample_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.example_command_py.make_example_command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_option_groups_py.test_option_group_with_constrained_subgroups", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.21593641_openforcefield_openff_fragmenter.versioneer_py.get_config_from_root", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3494570_keirf_greaseweazle.src.greaseweazle.image.hfe_py.hfev3_mk_track", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3494570_keirf_greaseweazle.src.greaseweazle.tools.read_py.read_and_normalise", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3494570_keirf_greaseweazle.src.greaseweazle.tools.write_py.write_from_image", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574492_gstreamer_gst_python.examples.plugins.python.audioplot_py.AudioPlotFilter.do_generate_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.client_py.Client.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.client_py.Client.call_action_future", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.client_py.Client.call_actions_parallel_future", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.client_py.Client.call_jobs_parallel_future", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.expander_py.ExpansionConverter.dict_to_trees", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.compatibility_py.ContextVar.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.compatibility_py.ContextVar.get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.logging_py.RecursivelyCensoredDictWrapper.__bytes__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.logging_py.RecursivelyCensoredDictWrapper.__unicode__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.transport.redis_gateway.core_py.RedisTransportCore._receive_message", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.transport.redis_gateway.core_py.RedisTransportCore.backend_layer"]
The function (cast) defined within the public class called public.The function start at line 79 and ends at 102. It contains 13 lines of code and it has a cyclomatic complexity of 3. It takes 2 parameters, represented as [79.0] and does not return any value. It declares 3.0 functions, It has 3.0 functions called inside which are ["isinstance", "_cast_from_selection", "type"], It has 1647.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group._default_group_class", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.add_command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.format_subcommand_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.get_current_context", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.Constraint.check", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.constraints.conftest_py.sample_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.example_command_py.make_example_command", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_option_groups_py.test_option_group_with_constrained_subgroups", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.21593641_openforcefield_openff_fragmenter.versioneer_py.get_config_from_root", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3494570_keirf_greaseweazle.src.greaseweazle.image.hfe_py.hfev3_mk_track", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3494570_keirf_greaseweazle.src.greaseweazle.tools.read_py.read_and_normalise", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3494570_keirf_greaseweazle.src.greaseweazle.tools.write_py.write_from_image", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3574492_gstreamer_gst_python.examples.plugins.python.audioplot_py.AudioPlotFilter.do_generate_output", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.client_py.Client.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.client_py.Client.call_action_future", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.client_py.Client.call_actions_parallel_future", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.client_py.Client.call_jobs_parallel_future", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.client.expander_py.ExpansionConverter.dict_to_trees", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.compatibility_py.ContextVar.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.compatibility_py.ContextVar.get", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.logging_py.RecursivelyCensoredDictWrapper.__bytes__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.logging_py.RecursivelyCensoredDictWrapper.__unicode__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.transport.redis_gateway.core_py.RedisTransportCore._receive_message", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.common.transport.redis_gateway.core_py.RedisTransportCore.backend_layer"].
unifyai_unify
public
public
0
0
try_cast
def try_cast(
    inp: Union[str, bool, float, Prompt, ChatCompletion],
    to_type: Union[
        Type[Union[str, bool, float, Prompt, ChatCompletion]],
        List[Type[Union[str, bool, float, Prompt, ChatCompletion]]],
    ],
) -> Union[str, bool, float, Prompt, ChatCompletion]:
    # noinspection PyBroadException
    try:
        return cast(inp, to_type)
    except Exception:
        return inp
2
11
2
86
0
105
116
105
inp,to_type
[]
Union[str, bool, float, Prompt, ChatCompletion]
{"Return": 2, "Try": 1}
1
12
1
["cast"]
0
[]
The function (try_cast) is defined within the public class called public. It starts at line 105 and ends at line 116. It contains 11 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters, represented as [105.0], and returns a value. It calls 1 function: ["cast"].
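The cast family above dispatches through a table of casting functions keyed by source type then target type, with the list form selecting the single valid target and try_cast falling back to the input on failure. A minimal sketch of that table-dispatch pattern using only builtins (the library's real _CAST_DICT also covers Prompt and ChatCompletion):

```python
# Illustrative casting table: source type -> {target type -> cast fn}.
_CAST_DICT = {
    bool: {float: float, str: str},
    float: {bool: bool, str: str},
    str: {},  # no outgoing casts from str in this sketch
}


def cast(inp, to_type):
    input_type = type(inp)
    if isinstance(to_type, list):
        # Selection form: exactly one listed target must be reachable.
        valid = [t for t in to_type if t in _CAST_DICT[input_type]]
        assert len(valid) == 1, "There must be exactly one valid casting target."
        to_type = valid[0]
    if input_type is to_type:
        return inp  # already the requested type
    return _CAST_DICT[input_type][to_type](inp)


def try_cast(inp, to_type):
    try:
        return cast(inp, to_type)
    except Exception:
        return inp  # fall back to the unmodified input


print(cast(True, float))           # 1.0
print(cast(0.0, [bool]))           # False (single valid target selected)
print(try_cast("hello", float))    # "hello" (no str->float entry, falls back)
```

The identity short-circuit (input_type is to_type) keeps the table small: no type needs a self-cast entry.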
unifyai_unify
ChatBot
public
0
0
__init__
def __init__(
    self,
    client: _Client,
) -> None:
    """Initializes the ChatBot object, wrapped around a client.

    Args:
        client: The Client instance to wrap the chatbot logic around.
    """
    self._paused = False
    assert not client.return_full_completion, (
        "ChatBot currently only supports clients which only generate the message "
        "content in the return"
    )
    self._client = client
    self.clear_chat_history()
1
11
2
38
0
12
28
12
self,client
[]
None
{"Assign": 2, "Expr": 2}
1
17
1
["self.clear_chat_history"]
14667
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]
The function (__init__) defined within the public class called ChatBot.The function start at line 12 and ends at 28. It contains 11 lines of code and it has a cyclomatic complexity of 1. It takes 2 parameters, represented as [12.0] and does not return any value. It declare 1.0 function, It has 1.0 function called inside which is ["self.clear_chat_history"], It has 14667.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"].
unifyai_unify
ChatBot
public
0
0
client
def client(self) -> _Client:
    """Get the client object.

    # noqa: DAR201.

    Returns:
        The client.
    """
    return self._client
1
2
1
12
0
31
38
31
self
[]
_Client
{"Expr": 1, "Return": 1}
0
8
0
[]
42
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3656182_jhuapl_boss_boss.django.boss.throttling_py.BossThrottle.error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3673636_bandwidth_python_bandwidth.tests.test_client_py.ClientTests.test_call_with_caching_client_class_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3673636_bandwidth_python_bandwidth.tests.test_client_py.ClientTests.test_call_with_supported_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3673636_bandwidth_python_bandwidth.tests.test_client_py.ClientTests.test_call_with_supported_client_different_case", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3673636_bandwidth_python_bandwidth.tests.test_client_py.ClientTests.test_call_with_unsupported_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3962390_keselekpermen69_userbutt.userbot.modules.pastebin_py.paste", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.astronomer.providers.amazon.aws.hooks.base_aws_py.AwsBaseHookAsync._refresh_credentials", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69906343_ome_omero_py.test.unit.scriptstest.test_parse_py.TestParse.testParse", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._service_utils_py.boto3_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70164402_redhatinsights_vulnerability_engine.common.logging_py.setup_cw_logging", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.cli.datasets.upload_py.upload", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.analytics_py.TrainingJob.artifact", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.analytics_py.TrainingJob.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.analytics_py.TrainingJob.move_to", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.analytics_py.TrainingJob.pull_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.session_py.account_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.92180700_osuripple_lets.helpers.s3_py.clientFactory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.core_api.routes.public.test_version_py.TestGetVersion.test_airflow_version_info", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_all_apps", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_catch_all_route_last", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_core_api_app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_execution_api_app", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_execution_api_app_lifespan", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_main_app_lifespan", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.amazon.src.airflow.providers.amazon.aws.hooks.base_aws_py.AwsGenericHook.account_id"]
The function (client) defined within the public class called ChatBot.The function start at line 31 and ends at 38. It contains 2 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It has 42.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3656182_jhuapl_boss_boss.django.boss.throttling_py.BossThrottle.error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3673636_bandwidth_python_bandwidth.tests.test_client_py.ClientTests.test_call_with_caching_client_class_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3673636_bandwidth_python_bandwidth.tests.test_client_py.ClientTests.test_call_with_supported_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3673636_bandwidth_python_bandwidth.tests.test_client_py.ClientTests.test_call_with_supported_client_different_case", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3673636_bandwidth_python_bandwidth.tests.test_client_py.ClientTests.test_call_with_unsupported_client", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3962390_keselekpermen69_userbutt.userbot.modules.pastebin_py.paste", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.astronomer.providers.amazon.aws.hooks.base_aws_py.AwsBaseHookAsync._refresh_credentials", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69906343_ome_omero_py.test.unit.scriptstest.test_parse_py.TestParse.testParse", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70085183_awslabs_seed_farmer.seedfarmer.services._service_utils_py.boto3_client", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70164402_redhatinsights_vulnerability_engine.common.logging_py.setup_cw_logging", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.cli.datasets.upload_py.upload", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.analytics_py.TrainingJob.artifact", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.analytics_py.TrainingJob.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.analytics_py.TrainingJob.move_to", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.analytics_py.TrainingJob.pull_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.nursery.tsbench.src.tsbench.evaluations.aws.session_py.account_id", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.92180700_osuripple_lets.helpers.s3_py.clientFactory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.core_api.routes.public.test_version_py.TestGetVersion.test_airflow_version_info", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_all_apps", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_catch_all_route_last", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_core_api_app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_execution_api_app", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_execution_api_app_lifespan", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.airflow_core.tests.unit.api_fastapi.test_app_py.test_main_app_lifespan", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.amazon.src.airflow.providers.amazon.aws.hooks.base_aws_py.AwsGenericHook.account_id"].
unifyai_unify
ChatBot
public
0
0
set_client
def set_client(self, value: client) -> None:
    """Set the client.

    # noqa: DAR101.

    Args:
        value: The unify client.
    """
    if isinstance(value, _Client):
        self._client = value
    else:
        raise Exception("Invalid client!")
2
5
2
32
0
40
50
40
self,value
[]
None
{"Assign": 1, "Expr": 1, "If": 1}
2
11
2
["isinstance", "Exception"]
0
[]
The function (set_client) is defined within the public class ChatBot. It starts at line 40 and ends at line 50, contains 5 lines of code, and has a cyclomatic complexity of 2. It takes 2 parameters (self, value) and returns None. It calls 2 functions: ["isinstance", "Exception"].
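The setter-with-validation pattern described above (accept only recognised client instances, otherwise raise) can be sketched in isolation. `StubClient` and `ChatBotSketch` below are hypothetical stand-ins, not the library's actual `_Client`/`ChatBot` classes.

```python
class StubClient:
    """Hypothetical stand-in for unify's _Client base class."""


class ChatBotSketch:
    """Minimal sketch of set_client's type-checked assignment."""

    def __init__(self):
        self._client = None

    def set_client(self, value) -> None:
        # Accept only recognised client instances; reject everything else.
        if isinstance(value, StubClient):
            self._client = value
        else:
            raise Exception("Invalid client!")


bot = ChatBotSketch()
bot.set_client(StubClient())  # accepted
try:
    bot.set_client("not a client")
except Exception as err:
    rejected = str(err)
```

Raising on assignment (rather than silently ignoring a bad value) means every later method can assume `self._client` is a valid client.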
unifyai_unify
ChatBot
public
0
0
_get_credits
def _get_credits(self) -> float:
    """Retrieves the current credit balance associated with the UNIFY account.

    Returns:
        Current credit balance.
    """
    return self._client.get_credit_balance()
1
2
1
16
0
52
59
52
self
[]
float
{"Expr": 1, "Return": 1}
1
8
1
["self._client.get_credit_balance"]
0
[]
The function (_get_credits) is defined within the public class ChatBot. It starts at line 52 and ends at line 59, contains 2 lines of code, and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a float. It calls 1 function: ["self._client.get_credit_balance"].
unifyai_unify
ChatBot
public
0
0
_update_message_history
def _update_message_history(
    self,
    role: str,
    content: Union[str, Dict[str, str]],
) -> None:
    """Updates message history with user input.

    Args:
        role: Either "assistant" or "user".
        content: User input message.
    """
    if isinstance(self._client, _UniClient):
        self._client.messages.append(
            {
                "role": role,
                "content": content,
            },
        )
    elif isinstance(self._client, _MultiClient):
        if isinstance(content, str):
            content = {endpoint: content for endpoint in self._client.endpoints}
        for endpoint, cont in content.items():
            self._client.messages[endpoint].append(
                {
                    "role": role,
                    "content": cont,
                },
            )
    else:
        raise Exception(
            "client must either be a UniClient or MultiClient instance.",
        )
6
26
3
132
0
61
93
61
self,role,content
[]
None
{"Assign": 1, "Expr": 3, "For": 1, "If": 3}
7
33
7
["isinstance", "self._client.messages.append", "isinstance", "isinstance", "content.items", "append", "Exception"]
0
[]
The function (_update_message_history) is defined within the public class ChatBot. It starts at line 61 and ends at line 93, contains 26 lines of code, and has a cyclomatic complexity of 6. It takes 3 parameters (self, role, content) and returns None. It calls 7 functions: ["isinstance", "self._client.messages.append", "isinstance", "isinstance", "content.items", "append", "Exception"].
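The multi-client branch of the method above broadcasts a plain string to every endpoint's message history before appending. A minimal sketch of that fan-out, using made-up endpoint names and a plain dict in place of the client's message store:

```python
from typing import Dict, Union

# Hypothetical per-endpoint message store, mirroring the _MultiClient branch.
endpoints = ["model-a@provider-x", "model-b@provider-y"]  # made-up names
messages: Dict[str, list] = {ep: [] for ep in endpoints}


def update_message_history(role: str, content: Union[str, Dict[str, str]]) -> None:
    # A plain string is broadcast to every endpoint before appending.
    if isinstance(content, str):
        content = {ep: content for ep in endpoints}
    for ep, cont in content.items():
        messages[ep].append({"role": role, "content": cont})


update_message_history("user", "hello")
update_message_history("assistant", {ep: f"reply from {ep}" for ep in endpoints})
```

Accepting `Union[str, Dict[str, str]]` lets callers send one shared user message while still recording per-endpoint assistant replies.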
unifyai_unify
ChatBot
public
0
0
clear_chat_history
def clear_chat_history(self) -> None:
    """Clears the chat history."""
    if isinstance(self._client, _UniClient):
        self._client.set_messages([])
    elif isinstance(self._client, _MultiClient):
        self._client.set_messages(
            {endpoint: [] for endpoint in self._client.endpoints},
        )
    else:
        raise Exception(
            "client must either be a UniClient or MultiClient instance.",
        )
4
11
1
67
0
95
106
95
self
[]
None
{"Expr": 3, "If": 2}
5
12
5
["isinstance", "self._client.set_messages", "isinstance", "self._client.set_messages", "Exception"]
0
[]
The function (clear_chat_history) is defined within the public class ChatBot. It starts at line 95 and ends at line 106, contains 11 lines of code, and has a cyclomatic complexity of 4. It takes 1 parameter (self) and returns None. It calls 5 functions: ["isinstance", "self._client.set_messages", "isinstance", "self._client.set_messages", "Exception"].
unifyai_unify
ChatBot
public
0
0
_stream_response
def _stream_response(response) -> str:
    words = ""
    for chunk in response:
        words += chunk
        sys.stdout.write(chunk)
        sys.stdout.flush()
    sys.stdout.write("\n")
    return words
2
8
1
43
0
109
116
109
response
[]
str
{"Assign": 1, "AugAssign": 1, "Expr": 3, "For": 1, "Return": 1}
3
8
3
["sys.stdout.write", "sys.stdout.flush", "sys.stdout.write"]
0
[]
The function (_stream_response) is defined within the public class ChatBot. It starts at line 109 and ends at line 116, contains 8 lines of code, and has a cyclomatic complexity of 2. It takes 1 parameter (response) and returns a str. It calls 3 functions: ["sys.stdout.write", "sys.stdout.flush", "sys.stdout.write"].
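The streaming helper above echoes each chunk as it arrives while accumulating the full reply. This can be demonstrated standalone with a plain iterator in place of the client's streamed response, capturing stdout to check the incremental echo:

```python
import io
import sys


def stream_response(response) -> str:
    # Accumulate chunks into one string while echoing each as it arrives.
    words = ""
    for chunk in response:
        words += chunk
        sys.stdout.write(chunk)
        sys.stdout.flush()
    sys.stdout.write("\n")
    return words


# Capture stdout so the echoed output can be inspected.
buf = io.StringIO()
old_stdout, sys.stdout = sys.stdout, buf
try:
    result = stream_response(iter(["Hel", "lo", "!"]))
finally:
    sys.stdout = old_stdout
```

Flushing after every chunk matters for interactive use: without it, buffered stdout would hold the partial reply until the newline.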
unifyai_unify
ChatBot
public
0
0
_handle_uni_llm_response
def _handle_uni_llm_response(
    self,
    response: str,
    endpoint: Union[bool, str],
) -> str:
    if endpoint:
        endpoint = self._client.endpoint if endpoint is True else endpoint
        sys.stdout.write(endpoint + ":\n")
    if self._client.stream:
        words = self._stream_response(response)
    else:
        words = response
        sys.stdout.write(words)
    sys.stdout.write("\n\n")
    return words
4
15
3
85
0
118
132
118
self,response,endpoint
[]
str
{"Assign": 3, "Expr": 3, "If": 2, "Return": 1}
4
15
4
["sys.stdout.write", "self._stream_response", "sys.stdout.write", "sys.stdout.write"]
0
[]
The function (_handle_uni_llm_response) is defined within the public class ChatBot. It starts at line 118 and ends at line 132, contains 15 lines of code, and has a cyclomatic complexity of 4. It takes 3 parameters (self, response, endpoint) and returns a str. It calls 4 functions: ["sys.stdout.write", "self._stream_response", "sys.stdout.write", "sys.stdout.write"].
unifyai_unify
ChatBot
public
0
0
_handle_multi_llm_response
def _handle_multi_llm_response(self, response: Dict[str, str]) -> Dict[str, str]:
    for endpoint, resp in response.items():
        self._handle_uni_llm_response(resp, endpoint)
    return response
2
4
2
42
0
134
137
134
self,response
[]
Dict[str, str]
{"Expr": 1, "For": 1, "Return": 1}
2
4
2
["response.items", "self._handle_uni_llm_response"]
0
[]
The function (_handle_multi_llm_response) is defined within the public class ChatBot. It starts at line 134 and ends at line 137, contains 4 lines of code, and has a cyclomatic complexity of 2. It takes 2 parameters (self, response) and returns Dict[str, str]. It calls 2 functions: ["response.items", "self._handle_uni_llm_response"].
unifyai_unify
ChatBot
public
0
0
_handle_response
def _handle_response(
    self,
    response: Union[str, Dict[str, str]],
    show_endpoint: bool,
) -> None:
    if isinstance(self._client, _UniClient):
        response = self._handle_uni_llm_response(response, show_endpoint)
    elif isinstance(self._client, _MultiClient):
        response = self._handle_multi_llm_response(response)
    else:
        raise Exception(
            "client must either be a UniClient or MultiClient instance.",
        )
    self._update_message_history(
        role="assistant",
        content=response,
    )
3
17
3
85
0
139
155
139
self,response,show_endpoint
[]
None
{"Assign": 2, "Expr": 1, "If": 2}
6
17
6
["isinstance", "self._handle_uni_llm_response", "isinstance", "self._handle_multi_llm_response", "Exception", "self._update_message_history"]
0
[]
The function (_handle_response) is defined within the public class ChatBot. It starts at line 139 and ends at line 155, contains 17 lines of code, and has a cyclomatic complexity of 3. It takes 3 parameters (self, response, show_endpoint) and returns None. It calls 6 functions: ["isinstance", "self._handle_uni_llm_response", "isinstance", "self._handle_multi_llm_response", "Exception", "self._update_message_history"].
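The method above dispatches on the client's concrete type with `isinstance` checks and raises on anything unrecognised. A minimal standalone sketch of that dispatch, with hypothetical stub classes in place of `_UniClient`/`_MultiClient`:

```python
class UniStub:
    """Hypothetical stand-in for a single-endpoint client."""


class MultiStub:
    """Hypothetical stand-in for a multi-endpoint client."""


def handle_response(client, response):
    # Dispatch on the client's concrete type, as _handle_response does.
    if isinstance(client, UniStub):
        return ("uni", response)
    elif isinstance(client, MultiStub):
        return ("multi", response)
    raise Exception("client must either be a UniClient or MultiClient instance.")


uni_result = handle_response(UniStub(), "hi")
multi_result = handle_response(MultiStub(), {"ep": "hi"})
```

The trailing `raise` gives a clear failure mode if a new client type is added without extending the dispatch.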
unifyai_unify
ChatBot
public
0
0
run
def run(self, show_credits: bool = False, show_endpoint: bool = False) -> None:
    """Starts the chat interaction loop.

    Args:
        show_credits: Whether to show credit consumption. Defaults to False.
        show_endpoint: Whether to show the endpoint used. Defaults to False.
    """
    if not self._paused:
        sys.stdout.write(
            "Let's have a chat. (Enter `pause` to pause and `quit` to exit)\n",
        )
        self.clear_chat_history()
    else:
        sys.stdout.write(
            "Welcome back! (Remember, enter `pause` to pause and `quit` to exit)\n",
        )
        self._paused = False
    while True:
        sys.stdout.write("> ")
        inp = input()
        if inp == "quit":
            self.clear_chat_history()
            break
        elif inp == "pause":
            self._paused = True
            break
        self._update_message_history(role="user", content=inp)
        initial_credit_balance = self._get_credits()
        if isinstance(self._client, unify.AsyncUnify):
            response = asyncio.run(self._client.generate())
        else:
            response = self._client.generate()
        self._handle_response(response, show_endpoint)
        final_credit_balance = self._get_credits()
        if show_credits:
            sys.stdout.write(
                "\n(spent {:.6f} credits)".format(
                    initial_credit_balance - final_credit_balance,
                ),
            )
7
34
3
185
0
157
197
157
self,show_credits,show_endpoint
[]
None
{"Assign": 7, "Expr": 9, "If": 5, "While": 1}
16
41
16
["sys.stdout.write", "self.clear_chat_history", "sys.stdout.write", "sys.stdout.write", "input", "self.clear_chat_history", "self._update_message_history", "self._get_credits", "isinstance", "asyncio.run", "self._client.generate", "self._client.generate", "self._handle_response", "self._get_credits", "sys.stdout.write", "format"]
864
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.app.handlers.package_versions_py.PackageVersions.index", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.test.benchmark_py.run_modpython", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3514474_ntfreedom_neverendshadowsocks.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.command_runners_py.DotDevassistantCommandRunner._dot_devassistant_dependencies", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.command_runners_py.PingPongCommandRunner._play_pingpong", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.command_runners_py.SetupProjectDirCommandRunner.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.Command.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.dependencies_section", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.eval_exec_section", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.evaluate_expression", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.expand_dependencies_section", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.package_managers_py.DNFPackageManager.resolve", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.path_runner_py.PathRunner._run_path_dependencies", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.path_runner_py.PathRunner._run_path_run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestAutoCompleteAction.test_aliases", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestAutoCompleteAction.test_bad_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestAutoCompleteAction.test_filenames", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestAutoCompleteAction.test_root_path", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestDocAction.test_displays_docs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestDocAction.test_lists_docs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestDocAction.test_no_docs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestPkgInstallAction.test_pkg_install", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestPkgInstallAction.test_pkg_install_fails", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestPkgSearchAction.test_raising_exceptions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestPkgUninstallAction.test_pkg_uninstall_dependent"]
The function (run) defined within the public class called ChatBot.The function start at line 157 and ends at 197. It contains 34 lines of code and it has a cyclomatic complexity of 7. It takes 3 parameters, represented as [157.0] and does not return any value. It declares 16.0 functions, It has 16.0 functions called inside which are ["sys.stdout.write", "self.clear_chat_history", "sys.stdout.write", "sys.stdout.write", "input", "self.clear_chat_history", "self._update_message_history", "self._get_credits", "isinstance", "asyncio.run", "self._client.generate", "self._client.generate", "self._handle_response", "self._get_credits", "sys.stdout.write", "format"], It has 864.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.app.handlers.package_versions_py.PackageVersions.index", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3511986_dart_archive_pub_dartlang.third_party.cherrypy.test.benchmark_py.run_modpython", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3514474_ntfreedom_neverendshadowsocks.shadowsocks.manager_py.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.command_runners_py.DotDevassistantCommandRunner._dot_devassistant_dependencies", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.command_runners_py.PingPongCommandRunner._play_pingpong", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.command_runners_py.SetupProjectDirCommandRunner.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.Command.run", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.dependencies_section", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.eval_exec_section", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.evaluate_expression", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.lang_py.expand_dependencies_section", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.package_managers_py.DNFPackageManager.resolve", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.path_runner_py.PathRunner._run_path_dependencies", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.devassistant.path_runner_py.PathRunner._run_path_run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestAutoCompleteAction.test_aliases", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestAutoCompleteAction.test_bad_input", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestAutoCompleteAction.test_filenames", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestAutoCompleteAction.test_root_path", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestDocAction.test_displays_docs", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestDocAction.test_lists_docs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestDocAction.test_no_docs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestPkgInstallAction.test_pkg_install", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestPkgInstallAction.test_pkg_install_fails", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestPkgSearchAction.test_raising_exceptions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3524351_devassistant_devassistant.test.test_actions_py.TestPkgUninstallAction.test_pkg_uninstall_dependent"].
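The `show_credits` branch of run() reports per-turn cost as the difference between the credit balance before and after the generate call. The formatting step can be shown with hypothetical balance values (the real values come from `_get_credits`):

```python
# Hypothetical credit balances around one generate() call, mirroring
# the show_credits branch of run() above.
initial_credit_balance = 2.5
final_credit_balance = 1.25

# Six decimal places, matching the "{:.6f}" format used by run().
spent = "\n(spent {:.6f} credits)".format(
    initial_credit_balance - final_credit_balance,
)
```

Sampling the balance on both sides of the call attributes the cost to that single turn, without the chatbot needing any per-request pricing data.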
unifyai_unify
public
public
0
0
with_logging.model_fn_w_logging
def model_fn_w_logging(
    *args,
    tags: Optional[List[str]] = None,
    timestamp: Optional[datetime.datetime] = None,
    log_query_body: bool = True,
    log_response_body: bool = True,
    **kwargs,
):
    if len(args) != 0:
        raise Exception(
            "When logging queries for a local model, all arguments to "
            "the model callable must be provided as keyword arguments. "
            "Positional arguments are not supported. This is so the "
            "query body dict can be fully populated with keys for each "
            "entry.",
        )
    query_body = kwargs
    response = model_fn(**query_body)
    if not isinstance(response, dict):
        response = {"response": response}
    kw = dict(
        endpoint=endpoint,
        query_body=query_body,
        response_body=response,
        tags=tags,
        timestamp=timestamp,
        api_key=api_key,
    )
    if log_query_body:
        if not log_response_body:
            del kw["response_body"]
        unify.log_query(**kw)
    return response
5
33
6
139
0
44
76
44
null
[]
None
null
0
0
0
null
0
null
The function (with_logging.model_fn_w_logging) is defined within the public class called public. The function starts at line 44 and ends at 76. It contains 33 lines of code and has a cyclomatic complexity of 5. It takes 6 parameters and returns a value.
unifyai_unify
public
public
0
0
with_logging
def with_logging(model_fn: Optional[callable] = None,*,endpoint: str,tags: Optional[List[str]] = None,timestamp: Optional[datetime.datetime] = None,log_query_body: bool = True,log_response_body: bool = True,api_key: Optional[str] = None,):"""Wrap a local model callable with logging of the queries.Args:model_fn: The model callable to wrap logging around.endpoint: The endpoint name to give to this local callable.tags: Tags for later filtering.timestamp: A timestamp (if not set, will be the time of sending).log_query_body: Whether or not to log the query body.log_response_body: Whether or not to log the response body.api_key: If specified, unify API key to be used. Defaults to the value in the `UNIFY_KEY` environment variable.Returns:A new callable, but with logging added every time the function is called.Raises:requests.HTTPError: If the API request fails."""_tags = tags_timestamp = timestamp_log_query_body = log_query_body_log_response_body = log_response_bodyapi_key = _validate_api_key(api_key)# noinspection PyShadowingNamesdef model_fn_w_logging(*args,tags: Optional[List[str]] = None,timestamp: Optional[datetime.datetime] = None,log_query_body: bool = True,log_response_body: bool = True,**kwargs,):if len(args) != 0:raise Exception("When logging queries for a local model, all arguments to ""the model callable must be provided as keyword arguments. ""Positional arguments are not supported. This is so the ""query body dict can be fully populated with keys for each ""entry.",)query_body = kwargsresponse = model_fn(**query_body)if not isinstance(response, dict):response = {"response": response}kw = dict(endpoint=endpoint,query_body=query_body,response_body=response,tags=tags,timestamp=timestamp,api_key=api_key,)if log_query_body:if not log_response_body:del kw["response_body"]unify.log_query(**kw)return responsereturn model_fn_w_logging
1
17
7
86
8
9
78
9
model_fn,endpoint,tags,timestamp,log_query_body,log_response_body,api_key
['query_body', '_log_response_body', '_log_query_body', 'kw', '_tags', 'api_key', 'response', '_timestamp']
Returns
{"Assign": 9, "Expr": 2, "If": 4, "Return": 2}
7
70
7
["_validate_api_key", "len", "Exception", "model_fn", "isinstance", "dict", "unify.log_query"]
0
[]
The function (with_logging) is defined within the public class called public. The function starts at line 9 and ends at 78. It contains 17 lines of code and has a cyclomatic complexity of 1. It takes 7 parameters and returns a value. It declares 1 nested function and calls 7 functions internally, which are ["_validate_api_key", "len", "Exception", "model_fn", "isinstance", "dict", "unify.log_query"].
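The two records above describe a closure-based logging wrapper: an outer function captures configuration and returns an inner callable that rejects positional arguments, normalises the response into a dict, and records the query/response bodies. A minimal self-contained sketch of the same pattern follows; `LOG_STORE` and the parameter names are illustrative stand-ins, not part of the unify API.

```python
from typing import Any, Callable, Dict, List

# Illustrative in-memory stand-in for a remote logging sink (hypothetical).
LOG_STORE: List[Dict[str, Any]] = []

def with_logging(
    model_fn: Callable[..., Any],
    *,
    endpoint: str,
    log_query_body: bool = True,
    log_response_body: bool = True,
) -> Callable[..., Dict[str, Any]]:
    """Wrap a model callable so each call records its query/response bodies."""

    def model_fn_w_logging(*args: Any, **kwargs: Any) -> Dict[str, Any]:
        # Positional arguments are rejected so the query-body dict has a
        # named key for every entry, mirroring the record above.
        if args:
            raise TypeError("all model arguments must be keyword arguments")
        response = model_fn(**kwargs)
        if not isinstance(response, dict):
            response = {"response": response}
        if log_query_body:
            entry: Dict[str, Any] = {"endpoint": endpoint, "query_body": kwargs}
            if log_response_body:
                entry["response_body"] = response
            LOG_STORE.append(entry)
        return response

    return model_fn_w_logging
```

Note that the configuration (`endpoint`, the two flags) lives only in the closure, so each wrapped model can log to a different endpoint without shared state.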
unifyai_unify
public
public
0
0
set_client_direct_mode
def set_client_direct_mode(value: bool) -> None:"""Set the direct mode for the client.Args:value: The value to set the direct mode to."""_Client._set_direct_mode(value)
1
2
1
16
0
31
38
31
value
[]
None
{"Expr": 2}
1
8
1
["_Client._set_direct_mode"]
0
[]
The function (set_client_direct_mode) is defined within the public class called public. The function starts at line 31 and ends at 38. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter and does not return a value. It calls 1 function internally, which is ["_Client._set_direct_mode"].
unifyai_unify
_Client
protected
1
1
__init__
def __init__(self,*,system_message: Optional[str],messages: Optional[Union[List[ChatCompletionMessageParam],Dict[str, List[ChatCompletionMessageParam]],]],frequency_penalty: Optional[float],logit_bias: Optional[Dict[str, int]],logprobs: Optional[bool],top_logprobs: Optional[int],max_completion_tokens: Optional[int],n: Optional[int],presence_penalty: Optional[float],response_format: Optional[Union[Type[BaseModel], Dict[str, str]]],seed: Optional[int],stop: Union[Optional[str], List[str]],stream: Optional[bool],stream_options: Optional[ChatCompletionStreamOptionsParam],temperature: Optional[float],top_p: Optional[float],service_tier: Optional[str],tools: Optional[Iterable[ChatCompletionToolParam]],tool_choice: Optional[ChatCompletionToolChoiceOptionParam],parallel_tool_calls: Optional[bool],reasoning_effort: Optional[str],# platform argumentsuse_custom_keys: bool,tags: Optional[List[str]],drop_params: Optional[bool],region: Optional[str] = None,log_query_body: Optional[bool],log_response_body: Optional[bool],api_key: Optional[str],openai_api_key: Optional[str],# python client argumentsstateful: bool,return_full_completion: bool,traced: bool,cache: Union[bool, str],cache_backend: str,# passthrough argumentsextra_headers: Optional[Headers],extra_query: Optional[Query],**kwargs,) -> None:# noqa: DAR101, DAR401# initial valuesself._api_key = _validate_api_key(api_key)self._openai_api_key = _validate_openai_api_key(_Client._DIRECT_OPENAI_MODE,openai_api_key,)self._system_message = Noneself._messages = Noneself._frequency_penalty = Noneself._logit_bias = Noneself._logprobs = Noneself._top_logprobs = Noneself._max_completion_tokens = Noneself._n = Noneself._presence_penalty = Noneself._response_format = Noneself._seed = Noneself._stop = Noneself._stream = Noneself._stream_options = Noneself._temperature = Noneself._top_p = Noneself._service_tier = Noneself._tools = Noneself._tool_choice = Noneself._parallel_tool_calls = Noneself._reasoning_effort = Noneself._use_custom_keys 
= Noneself._tags = Noneself._drop_params = Noneself._region = Noneself._log_query_body = Noneself._log_response_body = Noneself._stateful = Noneself._return_full_completion = Noneself._traced = Noneself._cache = Noneself._cache_backend = Noneself._extra_headers = Noneself._extra_query = Noneself._extra_body = None# set based on argumentsself.set_system_message(system_message)self.set_messages(messages)self.set_frequency_penalty(frequency_penalty)self.set_logit_bias(logit_bias)self.set_logprobs(logprobs)self.set_top_logprobs(top_logprobs)self.set_max_completion_tokens(max_completion_tokens)self.set_n(n)self.set_presence_penalty(presence_penalty)self.set_response_format(response_format)self.set_seed(seed)self.set_stop(stop)self.set_stream(stream)self.set_stream_options(stream_options)self.set_temperature(temperature)self.set_top_p(top_p)self.set_service_tier(service_tier)self.set_tools(tools)self.set_tool_choice(tool_choice)self.set_parallel_tool_calls(parallel_tool_calls)self.set_reasoning_effort(reasoning_effort)# platform argumentsself.set_use_custom_keys(use_custom_keys)self.set_tags(tags)self.set_drop_params(drop_params)self.set_region(region)self.set_log_query_body(log_query_body)self.set_log_response_body(log_response_body)# python client argumentsself.set_stateful(stateful)self.set_return_full_completion(return_full_completion)self.set_traced(traced)self.set_cache(cache)self.set_cache_backend(cache_backend)# passthrough argumentsself.set_extra_headers(extra_headers)self.set_extra_query(extra_query)self.set_extra_body(kwargs)# Store defaultsself._defaults = {"system_message": system_message,"messages": messages,"frequency_penalty": frequency_penalty,"logit_bias": logit_bias,"logprobs": logprobs,"top_logprobs": top_logprobs,"max_completion_tokens": max_completion_tokens,"n": n,"presence_penalty": presence_penalty,"response_format": response_format,"seed": seed,"stop": stop,"stream": stream,"stream_options": stream_options,"temperature": temperature,"top_p": 
top_p,"service_tier": service_tier,"tools": tools,"tool_choice": tool_choice,"parallel_tool_calls": parallel_tool_calls,"reasoning_effort": reasoning_effort,}
1
144
38
799
0
46
201
46
self,system_message,messages,frequency_penalty,logit_bias,logprobs,top_logprobs,max_completion_tokens,n,presence_penalty,response_format,seed,stop,stream,stream_options,temperature,top_p,service_tier,tools,tool_choice,parallel_tool_calls,reasoning_effort,use_custom_keys,tags,drop_params,region,log_query_body,log_response_body,api_key,openai_api_key,stateful,return_full_completion,traced,cache,cache_backend,extra_headers,extra_query,**kwargs
[]
None
{"Assign": 38, "Expr": 35}
37
156
37
["_validate_api_key", "_validate_openai_api_key", "self.set_system_message", "self.set_messages", "self.set_frequency_penalty", "self.set_logit_bias", "self.set_logprobs", "self.set_top_logprobs", "self.set_max_completion_tokens", "self.set_n", "self.set_presence_penalty", "self.set_response_format", "self.set_seed", "self.set_stop", "self.set_stream", "self.set_stream_options", "self.set_temperature", "self.set_top_p", "self.set_service_tier", "self.set_tools", "self.set_tool_choice", "self.set_parallel_tool_calls", "self.set_reasoning_effort", "self.set_use_custom_keys", "self.set_tags", "self.set_drop_params", "self.set_region", "self.set_log_query_body", "self.set_log_response_body", "self.set_stateful", "self.set_return_full_completion", "self.set_traced", "self.set_cache", "self.set_cache_backend", "self.set_extra_headers", "self.set_extra_query", "self.set_extra_body"]
14,667
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]
The function (__init__) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 46 and ends at 201. It contains 144 lines of code and has a cyclomatic complexity of 1. It takes 38 parameters and does not return a value. It calls 37 functions internally, which are ["_validate_api_key", "_validate_openai_api_key", "self.set_system_message", "self.set_messages", "self.set_frequency_penalty", "self.set_logit_bias", "self.set_logprobs", "self.set_top_logprobs", "self.set_max_completion_tokens", "self.set_n", "self.set_presence_penalty", "self.set_response_format", "self.set_seed", "self.set_stop", "self.set_stream", "self.set_stream_options", "self.set_temperature", "self.set_top_p", "self.set_service_tier", "self.set_tools", "self.set_tool_choice", "self.set_parallel_tool_calls", "self.set_reasoning_effort", "self.set_use_custom_keys", "self.set_tags", "self.set_drop_params", "self.set_region", "self.set_log_query_body", "self.set_log_response_body", "self.set_stateful", "self.set_return_full_completion", "self.set_traced", "self.set_cache", "self.set_cache_backend", "self.set_extra_headers", "self.set_extra_query", "self.set_extra_body"], and it is called by 14667 functions, which include ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"].
unifyai_unify
_Client
protected
1
1
_set_direct_mode
def _set_direct_mode(cls, value: bool) -> None:cls._DIRECT_OPENAI_MODE = value
1
2
2
16
0
204
205
204
cls,value
[]
None
{"Assign": 1}
0
2
0
[]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.universal_api.clients.base_py.set_client_direct_mode"]
The function (_set_direct_mode) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 204 and ends at 205. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and does not return a value. It is called by 1 function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.universal_api.clients.base_py.set_client_direct_mode"].
unifyai_unify
_Client
protected
1
1
_is_direct_mode_available
def _is_direct_mode_available(self) -> bool:return self._openai_api_key is not None and self._DIRECT_OPENAI_MODE
2
2
1
18
0
207
208
207
self
[]
bool
{"Return": 1}
0
2
0
[]
0
[]
The function (_is_direct_mode_available) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 207 and ends at 208. It contains 2 lines of code and has a cyclomatic complexity of 2. It takes no parameters besides self and returns a bool.
unifyai_unify
_Client
protected
1
1
system_message
def system_message(self) -> Optional[str]:"""Get the default system message, if set.Returns:The default system message."""return self._system_message
1
2
1
15
0
214
221
214
self
[]
Optional[str]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (system_message) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 214 and ends at 221. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[str].
unifyai_unify
_Client
protected
1
1
messages
def messages(self,) -> Optional[Union[List[ChatCompletionMessageParam],Dict[str, List[ChatCompletionMessageParam]],]]:"""Get the default messages, if set.Returns:The default messages."""return self._messages
1
7
1
26
0
224
230
224
self
[]
Optional[Union[List[ChatCompletionMessageParam], Dict[str, List[ChatCompletionMessageParam]]]]
{"Expr": 1, "Return": 1}
0
15
0
[]
2
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94529522_hyperledger_aries_mobile_test_harness.aries_mobile_tests.pageobjects.bc_wallet.issuer_get_authcode_interface.bc_vp_issuer_get_authcode_interface_gapi_py.BCVPIssuerGetAuthCodeInterface.get_auth_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.google.src.airflow.providers.google.cloud.hooks.dataflow_py._DataflowJobsController._fetch_list_job_messages_responses"]
The function (messages) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 224 and ends at 230. It contains 7 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[Union[List[ChatCompletionMessageParam], Dict[str, List[ChatCompletionMessageParam]]]]. It is called by 2 functions, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94529522_hyperledger_aries_mobile_test_harness.aries_mobile_tests.pageobjects.bc_wallet.issuer_get_authcode_interface.bc_vp_issuer_get_authcode_interface_gapi_py.BCVPIssuerGetAuthCodeInterface.get_auth_code", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.google.src.airflow.providers.google.cloud.hooks.dataflow_py._DataflowJobsController._fetch_list_job_messages_responses"].
unifyai_unify
_Client
protected
1
1
frequency_penalty
def frequency_penalty(self) -> Optional[float]:"""Get the default frequency penalty, if set.Returns:The default frequency penalty."""return self._frequency_penalty
1
2
1
15
0
241
248
241
self
[]
Optional[float]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (frequency_penalty) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 241 and ends at 248. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[float].
unifyai_unify
_Client
protected
1
1
logit_bias
def logit_bias(self) -> Optional[Dict[str, int]]:"""Get the default logit bias, if set.Returns:The default logit bias."""return self._logit_bias
1
2
1
20
0
251
258
251
self
[]
Optional[Dict[str, int]]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (logit_bias) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 251 and ends at 258. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[Dict[str, int]].
unifyai_unify
_Client
protected
1
1
logprobs
def logprobs(self) -> Optional[bool]:"""Get the default logprobs, if set.Returns:The default logprobs."""return self._logprobs
1
2
1
15
0
261
268
261
self
[]
Optional[bool]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (logprobs) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 261 and ends at 268. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[bool].
unifyai_unify
_Client
protected
1
1
top_logprobs
def top_logprobs(self) -> Optional[int]:"""Get the default top logprobs, if set.Returns:The default top logprobs."""return self._top_logprobs
1
2
1
15
0
271
278
271
self
[]
Optional[int]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (top_logprobs) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 271 and ends at 278. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[int].
unifyai_unify
_Client
protected
1
1
max_completion_tokens
def max_completion_tokens(self) -> Optional[int]:"""Get the default max tokens, if set.Returns:The default max tokens."""return self._max_completion_tokens
1
2
1
15
0
281
288
281
self
[]
Optional[int]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (max_completion_tokens) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 281 and ends at 288. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[int].
unifyai_unify
_Client
protected
1
1
n
def n(self) -> Optional[int]:"""Get the default n, if set.Returns:The default n value."""return self._n
1
2
1
15
0
291
298
291
self
[]
Optional[int]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (n) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 291 and ends at 298. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[int].
unifyai_unify
_Client
protected
1
1
presence_penalty
def presence_penalty(self) -> Optional[float]:"""Get the default presence penalty, if set.Returns:The default presence penalty."""return self._presence_penalty
1
2
1
15
0
301
308
301
self
[]
Optional[float]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (presence_penalty) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 301 and ends at 308. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[float].
unifyai_unify
_Client
protected
1
1
response_format
def response_format(self) -> Optional[Union[Type[BaseModel], Dict[str, str]]]:"""Get the default response format, if set.Returns:The default response format."""return self._response_format
1
2
1
28
0
311
318
311
self
[]
Optional[Union[Type[BaseModel], Dict[str, str]]]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (response_format) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 311 and ends at 318. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[Union[Type[BaseModel], Dict[str, str]]].
unifyai_unify
_Client
protected
1
1
seed
def seed(self) -> Optional[int]:"""Get the default seed value, if set.Returns:The default seed value."""return self._seed
1
2
1
15
0
321
328
321
self
[]
Optional[int]
{"Expr": 1, "Return": 1}
0
8
0
[]
3
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3635751_kinecosystem_blockchain_ops.tasks_py.derive_root_account_seed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3635751_kinecosystem_blockchain_ops.tests.python.helpers_py.derive_root_account", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3635751_kinecosystem_blockchain_ops.tests.python.helpers_py.root_account_seed"]
The function (seed) is defined within the protected class called _Client; it implements an interface and inherits from another class. The function starts at line 321 and ends at 328. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes no parameters besides self and returns an Optional[int]. It is called by 3 functions, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3635751_kinecosystem_blockchain_ops.tasks_py.derive_root_account_seed", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3635751_kinecosystem_blockchain_ops.tests.python.helpers_py.derive_root_account", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3635751_kinecosystem_blockchain_ops.tests.python.helpers_py.root_account_seed"].
unifyai_unify
_Client
protected
1
1
stop
def stop(self) -> Union[Optional[str], List[str]]:"""Get the default stop value, if set.Returns:The default stop value."""return self._stop
1
2
1
23
0
331
338
331
self
[]
Union[Optional[str], List[str]]
{"Expr": 1, "Return": 1}
0
8
0
[]
173
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.Context.start_entity_subscribers", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3595150_twisted_txaws.txaws.reactor_py.get_exitcode_reactor", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3630950_deliveryhero_lymph.lymph.patterns.serial_events_py.SerialEventHandler.stop_consuming", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3633612_rgs1_zk_shell.zk_shell.watcher_py.ChildWatcher.remove", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640381_floydhub_floyd_cli.floyd.cli.experiment_py.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.conftest_py.in_example_dir", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.test.plan.grammar.directives.mock_py.MockAssertCalledActionDirective.tear_down_test_case_action", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.test.plan.grammar.directives.stub_action_py._stop_stubbed_actions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.test.plan.grammar.directives.time_py.FreezeTimeMixin.stop_freeze", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3684026_glyph_automat.typical_example_happy_py.buildMachine", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3697546_ciena_afkak.afkak._group_py.ConsumerGroup.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3697546_ciena_afkak.afkak.test.int.test_group_integration_py.TestAfkakGroupIntegration.test_broker_restart", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3697546_ciena_afkak.afkak.test.int.test_group_integration_py.TestAfkakGroupIntegration.test_three_coordinator_join", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3702519_legobot_legobot.Test.test_Lego_py.TestLego.test_add_child", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3720326_openxc_openxc_python.openxc.sources.usb_py.UsbDataSource.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913804_knio_dominate.tests.test_svg_py.test_linear_gradient", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913804_knio_dominate.tests.test_svg_py.test_radial_gradient", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3925410_remyroy_cdda_game_launcher.cddagl.ui.views.main_py.ProgressCopyTree.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3925410_remyroy_cdda_game_launcher.cddagl.ui.views.main_py.ProgressRmTree.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.xonsh.jupyter_kernel_py.XonshKernel.shutdown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953273_beerfactory_hbmqtt.hbmqtt.mqtt.protocol.broker_handler_py.BrokerProtocolHandler.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953273_beerfactory_hbmqtt.hbmqtt.mqtt.protocol.client_handler_py.ClientProtocolHandler.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953273_beerfactory_hbmqtt.samples.client_publish_acl_py.test_coro", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953273_beerfactory_hbmqtt.samples.client_publish_py.test_coro2", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967002_jupyter_qtconsole.qtconsole.inprocess_py.QtInProcessChannel.stop"]
The function (stop) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 331 and ends at line 338. It contains 2 lines of code and has a cyclomatic complexity of 1. The function takes no parameters and does not return any value. It has 173 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3549150_freenas_cli.freenas.cli.repl_py.Context.start_entity_subscribers", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3595150_twisted_txaws.txaws.reactor_py.get_exitcode_reactor", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3630950_deliveryhero_lymph.lymph.patterns.serial_events_py.SerialEventHandler.stop_consuming", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3633612_rgs1_zk_shell.zk_shell.watcher_py.ChildWatcher.remove", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3640381_floydhub_floyd_cli.floyd.cli.experiment_py.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3643700_yelp_pgctl.tests.conftest_py.in_example_dir", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.test.plan.grammar.directives.mock_py.MockAssertCalledActionDirective.tear_down_test_case_action", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.test.plan.grammar.directives.stub_action_py._stop_stubbed_actions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3677969_eventbrite_pysoa.pysoa.test.plan.grammar.directives.time_py.FreezeTimeMixin.stop_freeze", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3684026_glyph_automat.typical_example_happy_py.buildMachine", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3697546_ciena_afkak.afkak._group_py.ConsumerGroup.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3697546_ciena_afkak.afkak.test.int.test_group_integration_py.TestAfkakGroupIntegration.test_broker_restart", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3697546_ciena_afkak.afkak.test.int.test_group_integration_py.TestAfkakGroupIntegration.test_three_coordinator_join", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3702519_legobot_legobot.Test.test_Lego_py.TestLego.test_add_child", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3720326_openxc_openxc_python.openxc.sources.usb_py.UsbDataSource.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913804_knio_dominate.tests.test_svg_py.test_linear_gradient", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913804_knio_dominate.tests.test_svg_py.test_radial_gradient", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3925410_remyroy_cdda_game_launcher.cddagl.ui.views.main_py.ProgressCopyTree.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3925410_remyroy_cdda_game_launcher.cddagl.ui.views.main_py.ProgressRmTree.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3928915_donnemartin_gitsome.xonsh.jupyter_kernel_py.XonshKernel.shutdown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953273_beerfactory_hbmqtt.hbmqtt.mqtt.protocol.broker_handler_py.BrokerProtocolHandler.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953273_beerfactory_hbmqtt.hbmqtt.mqtt.protocol.client_handler_py.ClientProtocolHandler.stop", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953273_beerfactory_hbmqtt.samples.client_publish_acl_py.test_coro", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3953273_beerfactory_hbmqtt.samples.client_publish_py.test_coro2", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3967002_jupyter_qtconsole.qtconsole.inprocess_py.QtInProcessChannel.stop"].
unifyai_unify
_Client
protected
1
1
stream
def stream(self) -> Optional[bool]:"""Get the default stream bool, if set.Returns:The default stream bool."""return self._stream
1
2
1
15
0
341
348
341
self
[]
Optional[bool]
{"Expr": 1, "Return": 1}
0
8
0
[]
15
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981967_kiwigrid_k8s_sidecar.src.resources_py._watch_resource_iterator", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.d_linear.estimator_py.DLinearEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.deepar.estimator_py.DeepAREstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.i_transformer.estimator_py.ITransformerEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.lag_tst.estimator_py.LagTSTEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.patch_tst.estimator_py.PatchTSTEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.simple_feedforward.estimator_py.SimpleFeedForwardEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.tft.estimator_py.TemporalFusionTransformerEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.tide.estimator_py.TiDEEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.wavenet.estimator_py.WaveNetEstimator.create_training_data_loader", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.cncf.kubernetes.src.airflow.providers.cncf.kubernetes.operators.pod_py.KubernetesPodOperator.kill_istio_sidecar", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94657204_JuanBindez_pytubefix.pytubefix.request_py.seq_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94705237_tenstorrent_tt_buda.pybuda.test.test_splice_py.stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94705237_tenstorrent_tt_buda.pybuda.test.test_splice_py.test_concat", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94871369_ResearchObject_ro_crate_py.rocrate.model.preview_py.Preview.stream"]
The function (stream) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 341 and ends at line 348. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[bool]. It has 15 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3981967_kiwigrid_k8s_sidecar.src.resources_py._watch_resource_iterator", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.d_linear.estimator_py.DLinearEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.deepar.estimator_py.DeepAREstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.i_transformer.estimator_py.ITransformerEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.lag_tst.estimator_py.LagTSTEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.patch_tst.estimator_py.PatchTSTEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.simple_feedforward.estimator_py.SimpleFeedForwardEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.tft.estimator_py.TemporalFusionTransformerEstimator.create_training_data_loader", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.tide.estimator_py.TiDEEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.torch.model.wavenet.estimator_py.WaveNetEstimator.create_training_data_loader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.cncf.kubernetes.src.airflow.providers.cncf.kubernetes.operators.pod_py.KubernetesPodOperator.kill_istio_sidecar", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94657204_JuanBindez_pytubefix.pytubefix.request_py.seq_stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94705237_tenstorrent_tt_buda.pybuda.test.test_splice_py.stream", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94705237_tenstorrent_tt_buda.pybuda.test.test_splice_py.test_concat", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94871369_ResearchObject_ro_crate_py.rocrate.model.preview_py.Preview.stream"].
unifyai_unify
_Client
protected
1
1
stream_options
def stream_options(self) -> Optional[ChatCompletionStreamOptionsParam]:"""Get the default stream options, if set.Returns:The default stream options."""return self._stream_options
1
2
1
15
0
351
358
351
self
[]
Optional[ChatCompletionStreamOptionsParam]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (stream_options) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 351 and ends at line 358. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[ChatCompletionStreamOptionsParam].
unifyai_unify
_Client
protected
1
1
temperature
def temperature(self) -> Optional[float]:"""Get the default temperature, if set.Returns:The default temperature."""return self._temperature
1
2
1
15
0
361
368
361
self
[]
Optional[float]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (temperature) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 361 and ends at line 368. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[float].
unifyai_unify
_Client
protected
1
1
top_p
def top_p(self) -> Optional[float]:"""Get the default top p value, if set.Returns:The default top p value."""return self._top_p
1
2
1
15
0
371
378
371
self
[]
Optional[float]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (top_p) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 371 and ends at line 378. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[float].
unifyai_unify
_Client
protected
1
1
service_tier
def service_tier(self) -> Optional[str]:"""Get the default service tier, if set.Returns:The default service tier."""return self._service_tier
1
2
1
15
0
381
388
381
self
[]
Optional[str]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (service_tier) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 381 and ends at line 388. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[str].
unifyai_unify
_Client
protected
1
1
tools
def tools(self) -> Optional[Iterable[ChatCompletionToolParam]]:"""Get the default tools, if set.Returns:The default tools."""return self._tools
1
2
1
18
0
391
398
391
self
[]
Optional[Iterable[ChatCompletionToolParam]]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (tools) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 391 and ends at line 398. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[Iterable[ChatCompletionToolParam]].
unifyai_unify
_Client
protected
1
1
tool_choice
def tool_choice(self) -> Optional[ChatCompletionToolChoiceOptionParam]:"""Get the default tool choice, if set.Returns:The default tool choice."""return self._tool_choice
1
2
1
15
0
401
408
401
self
[]
Optional[ChatCompletionToolChoiceOptionParam]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (tool_choice) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 401 and ends at line 408. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[ChatCompletionToolChoiceOptionParam].
unifyai_unify
_Client
protected
1
1
parallel_tool_calls
def parallel_tool_calls(self) -> Optional[bool]:"""Get the default parallel tool calls bool, if set.Returns:The default parallel tool calls bool."""return self._parallel_tool_calls
1
2
1
15
0
411
418
411
self
[]
Optional[bool]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (parallel_tool_calls) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 411 and ends at line 418. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[bool].
unifyai_unify
_Client
protected
1
1
reasoning_effort
def reasoning_effort(self) -> Optional[str]:"""Get the default reasoning, if set.Returns:The default reasoning."""return self._reasoning_effort
1
2
1
15
0
421
428
421
self
[]
Optional[str]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (reasoning_effort) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 421 and ends at line 428. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[str].
unifyai_unify
_Client
protected
1
1
use_custom_keys
def use_custom_keys(self) -> bool:"""Get the default use custom keys bool, if set.Returns:The default use custom keys bool."""return self._use_custom_keys
1
2
1
12
0
431
438
431
self
[]
bool
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (use_custom_keys) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 431 and ends at line 438. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type bool.
unifyai_unify
_Client
protected
1
1
tags
def tags(self) -> Optional[List[str]]:"""Get the default tags, if set.Returns:The default tags."""return self._tags
1
2
1
18
0
441
448
441
self
[]
Optional[List[str]]
{"Expr": 1, "Return": 1}
0
8
0
[]
1
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3522287_schacon_hg_git.hggit.hgrepo_py.generate_repo_subclass"]
The function (tags) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 441 and ends at line 448. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[List[str]]. It has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3522287_schacon_hg_git.hggit.hgrepo_py.generate_repo_subclass"].
unifyai_unify
_Client
protected
1
1
drop_params
def drop_params(self) -> Optional[bool]:"""Get the default drop_params bool, if set.Returns:The default drop_params bool."""return self._drop_params
1
2
1
15
0
451
458
451
self
[]
Optional[bool]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (drop_params) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 451 and ends at line 458. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[bool].
unifyai_unify
_Client
protected
1
1
region
def region(self) -> Optional[str]:"""Get the default region, if set.Returns:The default region."""return self._region
1
2
1
15
0
461
468
461
self
[]
Optional[str]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (region) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 461 and ends at line 468. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[str].
unifyai_unify
_Client
protected
1
1
log_query_body
def log_query_body(self) -> Optional[bool]:"""Get the default log query body bool, if set.Returns:The default log query body bool."""return self._log_query_body
1
2
1
15
0
471
478
471
self
[]
Optional[bool]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (log_query_body) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 471 and ends at line 478. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[bool].
unifyai_unify
_Client
protected
1
1
log_response_body
def log_response_body(self) -> Optional[bool]:"""Get the default log response body bool, if set.Returns:The default log response body bool."""return self._log_response_body
1
2
1
15
0
481
488
481
self
[]
Optional[bool]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (log_response_body) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 481 and ends at line 488. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[bool].
unifyai_unify
_Client
protected
1
1
stateful
def stateful(self) -> bool:"""Get the default stateful bool, if set.Returns:The default stateful bool."""return self._stateful
1
2
1
12
0
491
498
491
self
[]
bool
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (stateful) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 491 and ends at line 498. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type bool.
unifyai_unify
_Client
protected
1
1
return_full_completion
def return_full_completion(self) -> bool:"""Get the default return full completion bool.Returns:The default return full completion bool."""return self._return_full_completion
1
2
1
12
0
501
508
501
self
[]
bool
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (return_full_completion) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 501 and ends at line 508. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type bool.
unifyai_unify
_Client
protected
1
1
traced
def traced(self) -> bool:"""Get the default traced bool.Returns:The default traced bool."""return self._traced
1
2
1
12
0
511
518
511
self
[]
bool
{"Expr": 1, "Return": 1}
0
8
0
[]
4
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_class", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_instance", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_module", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"]
The function (traced) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 511 and ends at line 518. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type bool. It has 4 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_class", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_instance", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_module", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"].
unifyai_unify
_Client
protected
1
1
cache
def cache(self) -> bool:"""Get default the cache bool.Returns:The default cache bool."""return self._cache
1
2
1
12
0
521
528
521
self
[]
bool
{"Expr": 1, "Return": 1}
0
8
0
[]
23
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3656182_jhuapl_boss_boss.django.boss.authentication_py.TokenAuthentication.user_exist", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.CallbackSplitMixin.execute", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Mappings.task_info_or_vimwiki_follow_link", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Meta.inspect_presetheader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Meta.inspect_viewport", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Meta.integrate_tagbar", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Meta.set_proper_colors", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.done", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.grid", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.modify", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.sort", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.start", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.toggle", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Split.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Split._process_args", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SplitCalendar.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.dev.breeze.src.airflow_breeze.utils.functools_cache_py.clearable_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95085986_MLSysOps_MLE_agent.mle.workflow.baseline_py.baseline", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95085986_MLSysOps_MLE_agent.mle.workflow.chat_py.chat", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95085986_MLSysOps_MLE_agent.mle.workflow.kaggle_py.kaggle"]
The function (cache) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 521 and ends at line 528. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type bool. It has 23 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3656182_jhuapl_boss_boss.django.boss.authentication_py.TokenAuthentication.user_exist", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.CallbackSplitMixin.execute", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Mappings.task_info_or_vimwiki_follow_link", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Meta.inspect_presetheader", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Meta.inspect_viewport", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Meta.integrate_tagbar", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Meta.set_proper_colors", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.delete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.done", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.grid", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.modify", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.sort", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.start", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.stop", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SelectedTasks.toggle", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Split.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.Split._process_args", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3982575_tools_life_taskwiki.taskwiki.main_py.SplitCalendar.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.dev.breeze.src.airflow_breeze.utils.functools_cache_py.clearable_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95085986_MLSysOps_MLE_agent.mle.workflow.baseline_py.baseline", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95085986_MLSysOps_MLE_agent.mle.workflow.chat_py.chat", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95085986_MLSysOps_MLE_agent.mle.workflow.kaggle_py.kaggle"].
unifyai_unify
_Client
protected
1
1
extra_headers
def extra_headers(self) -> Optional[Headers]:"""Get the default extra headers, if set.Returns:The default extra headers."""return self._extra_headers
1
2
1
15
0
531
538
531
self
[]
Optional[Headers]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (extra_headers) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 531 and ends at line 538. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[Headers].
unifyai_unify
_Client
protected
1
1
extra_query
def extra_query(self) -> Optional[Query]:"""Get the default extra query, if set.Returns:The default extra query."""return self._extra_query
1
2
1
15
0
541
548
541
self
[]
Optional[Query]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (extra_query) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 541 and ends at line 548. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[Query].
unifyai_unify
_Client
protected
1
1
extra_body
def extra_body(self) -> Optional[Mapping[str, str]]:"""Get the default extra body, if set.Returns:The default extra body."""return self._extra_body
1
2
1
20
0
551
558
551
self
[]
Optional[Mapping[str, str]]
{"Expr": 1, "Return": 1}
0
8
0
[]
0
[]
The function (extra_body) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 551 and ends at line 558. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and returns a value of type Optional[Mapping[str, str]].
unifyai_unify
_Client
protected
1
1
set_system_message
def set_system_message(self, value: str) -> Self:"""Set the default system message.# noqa: DAR101.Args:value: The default system message.Returns:This client, useful for chaining inplace calls."""self._system_message = valueif self._messages is None or self._messages == []:self._messages = [{"role": "system","content": value,},]elif self._messages[0]["role"] != "system":self._messages = [{"role": "system","content": value,},] + self._messageselse:self._messages[0] = {"role": "system","content": value,}return self
4
22
2
103
0
563
593
563
self,value
[]
Self
{"Assign": 4, "Expr": 1, "If": 2, "Return": 1}
0
31
0
[]
0
[]
The function (set_system_message) is defined within the protected class called _Client, which implements an interface and inherits from another class. The function starts at line 563 and ends at line 593. It contains 22 lines of code and has a cyclomatic complexity of 4. It takes 2 parameters (self, value) and returns a value of type Self.