project_name string | class_name string | class_modifiers string | class_implements int64 | class_extends int64 | function_name string | function_body string | cyclomatic_complexity int64 | NLOC int64 | num_parameter int64 | num_token int64 | num_variable int64 | start_line int64 | end_line int64 | function_index int64 | function_params string | function_variable string | function_return_type string | function_body_line_type string | function_num_functions int64 | function_num_lines int64 | outgoing_function_count int64 | outgoing_function_names string | incoming_function_count int64 | incoming_function_names string | lexical_representation string |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
unifyai_unify | Log | public | 0 | 0 | __init__ | def __init__(self):self.read_vars = set() | 1 | 2 | 1 | 12 | 0 | 782 | 783 | 782 | self,id,_future,ts,project,context,api_key,params,**entries | [] | None | {"Assign": 8} | 1 | 20 | 1 | ["_validate_api_key"] | 14,667 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"] | The function (__init__) defined within the public class called Log.The function start at line 782 and ends at 783. It contains 2 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declare 1.0 function, It has 1.0 function called inside which is ["_validate_api_key"], It has 14667.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]. |
unifyai_unify | public | public | 0 | 0 | visit_Name | def visit_Name(self, node):self.read_vars.add(node.id) | 1 | 2 | 2 | 17 | 0 | 785 | 786 | 785 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (visit_Name) is defined within the public class called public. The function starts at line 785 and ends at 786. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, node) and does not return any value. |
unifyai_unify | public | public | 0 | 0 | visit_withitem | def visit_withitem(self, node):pass | 1 | 2 | 2 | 8 | 0 | 788 | 789 | 788 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (visit_withitem) is defined within the public class called public. The function starts at line 788 and ends at 789. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, node) and does not return any value. |
unifyai_unify | Log | public | 0 | 0 | __init__ | def __init__(self, lineno):self.lineno = linenoself.target_node = None | 1 | 3 | 2 | 17 | 0 | 792 | 794 | 792 | self,id,_future,ts,project,context,api_key,params,**entries | [] | None | {"Assign": 8} | 1 | 20 | 1 | ["_validate_api_key"] | 14,667 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"] | The function (__init__) defined within the public class called Log.The function start at line 792 and ends at 794. It contains 3 lines of code and it has a cyclomatic complexity of 1. It takes 2 parameters, represented as [792.0] and does not return any value. It declare 1.0 function, It has 1.0 function called inside which is ["_validate_api_key"], It has 14667.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]. |
unifyai_unify | public | public | 0 | 0 | visit_With | def visit_With(self, node):if node.lineno <= self.lineno <= node.end_lineno:self.target_node = node | 2 | 3 | 2 | 25 | 0 | 796 | 798 | 796 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (visit_With) is defined within the public class called public. The function starts at line 796 and ends at 798. It contains 3 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (self, node) and does not return any value. |
unifyai_unify | Traced | public | 0 | 0 | _extract_read_vars | def _extract_read_vars(self):if not self.code:return set()tree = ast.parse(textwrap.dedent(self.code))finder = self._WithBlockFinder(self.runtime_lineno - self.start_linenumber + 1)finder.visit(tree)if not finder.target_node:return set()reader = self._VarReadVisitor()reader.visit(finder.target_node)return reader.read_vars | 3 | 11 | 1 | 81 | 0 | 800 | 815 | 800 | self | [] | Returns | {"Assign": 3, "Expr": 2, "If": 2, "Return": 3} | 8 | 16 | 8 | ["set", "ast.parse", "textwrap.dedent", "self._WithBlockFinder", "finder.visit", "set", "self._VarReadVisitor", "reader.visit"] | 0 | [] | The function (_extract_read_vars) is defined within the public class called Traced. The function starts at line 800 and ends at 815. It contains 11 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (self), and it returns a value. It has 8 functions called inside, which are ["set", "ast.parse", "textwrap.dedent", "self._WithBlockFinder", "finder.visit", "set", "self._VarReadVisitor", "reader.visit"]. |
unifyai_unify | public | public | 0 | 0 | _print_tree_source_with_lineno | def _print_tree_source_with_lineno(fn_name, tree):source_unparsed = "\n".join(f"{i+1}: {line}" for i, line in enumerate(ast.unparse(tree).split("\n")))_traced_logger.debug(f"AST[{fn_name}]:\n{source_unparsed}") | 2 | 5 | 2 | 42 | 1 | 818 | 822 | 818 | fn_name,tree | ['source_unparsed'] | None | {"Assign": 1, "Expr": 1} | 5 | 5 | 5 | ["join", "enumerate", "split", "ast.unparse", "_traced_logger.debug"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_function"] | The function (_print_tree_source_with_lineno) is defined within the public class called public. The function starts at line 818 and ends at 822. It contains 5 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (fn_name, tree) and does not return any value. It has 5 functions called inside, which are ["join", "enumerate", "split", "ast.unparse", "_traced_logger.debug"], and 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_function"]. |
unifyai_unify | public | public | 0 | 0 | _nested_add | def _nested_add(a, b):if a is None and isinstance(b, dict):a = {k: None if isinstance(v, dict) else 0 for k, v in b.items()}elif b is None and isinstance(a, dict):b = {k: None if isinstance(v, dict) else 0 for k, v in a.items()}if isinstance(a, dict) and isinstance(b, dict):return {k: _nested_add(a[k], b[k]) for k in a if k in b}elif a is None and b is None:return Noneelif a is None:return belif b is None:return areturn a + b | 17 | 14 | 2 | 152 | 2 | 825 | 838 | 825 | a,b | ['b', 'a'] | Returns | {"Assign": 2, "If": 6, "Return": 5} | 9 | 14 | 9 | ["isinstance", "isinstance", "b.items", "isinstance", "isinstance", "a.items", "isinstance", "isinstance", "_nested_add"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._finalize_span", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._nested_add"] | The function (_nested_add) is defined within the public class called public. The function starts at line 825 and ends at 838. It contains 14 lines of code and has a cyclomatic complexity of 17. It takes 2 parameters (a, b), and it returns a value. It has 9 functions called inside, which are ["isinstance", "isinstance", "b.items", "isinstance", "isinstance", "a.items", "isinstance", "isinstance", "_nested_add"], and 2 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._finalize_span", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._nested_add"]. |
unifyai_unify | public | public | 0 | 0 | _create_span | def _create_span(fn, args, kwargs, span_type, name):exec_start_time = time.perf_counter()ts = datetime.now(timezone.utc).isoformat()if not SPAN.get():RUNNING_TIME.set(exec_start_time)signature = inspect.signature(fn)bound_args = signature.bind(*args, **kwargs)bound_args.apply_defaults()inputs = bound_args.argumentsinputs = inputs["kw"] if span_type == "llm-cached" else inputsinputs = _make_json_serializable(inputs)try:lines, start_line = inspect.getsourcelines(fn)code = textwrap.dedent("".join(lines))except:lines, start_line = None, Nonetry:code = textwrap.dedent(inspect.getsource(fn))except:code = Nonename_w_sub = nameif name_w_sub is not None:for k, v in inputs.items():substr = "{" + k + "}"if substr in name_w_sub:name_w_sub = name_w_sub.replace(substr, str(v))try:code_fpath = inspect.getsourcefile(fn)except Exception as e:code_fpath = Nonenew_span = {"id": str(uuid.uuid4()),"type": span_type,"parent_span_id": (None if not SPAN.get() else SPAN.get()["id"]),"span_name": fn.__name__ if name_w_sub is None else name_w_sub,"exec_time": None,"timestamp": ts,"offset": round(0.0 if not SPAN.get() else exec_start_time - RUNNING_TIME.get(),2,),"llm_usage": None,"llm_usage_inc_cache": None,"code": f"```python\n{code}\n```","code_fpath": code_fpath,"code_start_line": start_line,"inputs": inputs,"outputs": None,"errors": None,"child_spans": [],"completed": False,}if inspect.ismethod(fn) and hasattr(fn.__self__, "endpoint"):new_span["endpoint"] = fn.__self__.endpointif not GLOBAL_SPAN.get():global_token = GLOBAL_SPAN.set(new_span)local_token = SPAN.set(GLOBAL_SPAN.get())else:global_token = NoneSPAN.get()["child_spans"].append(new_span)local_token = SPAN.set(new_span)_get_trace_logger().update_trace(ACTIVE_TRACE_LOG.get()[0],copy.deepcopy(GLOBAL_SPAN.get()),)return new_span, exec_start_time, local_token, global_token | 15 | 66 | 5 | 459 | 12 | 841 | 908 | 841 | fn,args,kwargs,span_type,name | ['exec_start_time', 
'global_token', 'name_w_sub', 'code', 'new_span', 'signature', 'bound_args', 'ts', 'substr', 'local_token', 'inputs', 'code_fpath'] | Returns | {"Assign": 23, "Expr": 4, "For": 1, "If": 5, "Return": 1, "Try": 3} | 39 | 68 | 39 | ["time.perf_counter", "isoformat", "datetime.now", "SPAN.get", "RUNNING_TIME.set", "inspect.signature", "signature.bind", "bound_args.apply_defaults", "_make_json_serializable", "inspect.getsourcelines", "textwrap.dedent", "join", "textwrap.dedent", "inspect.getsource", "inputs.items", "name_w_sub.replace", "str", "inspect.getsourcefile", "str", "uuid.uuid4", "SPAN.get", "SPAN.get", "round", "SPAN.get", "RUNNING_TIME.get", "inspect.ismethod", "hasattr", "GLOBAL_SPAN.get", "GLOBAL_SPAN.set", "SPAN.set", "GLOBAL_SPAN.get", "append", "SPAN.get", "SPAN.set", "update_trace", "_get_trace_logger", "ACTIVE_TRACE_LOG.get", "copy.deepcopy", "GLOBAL_SPAN.get"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._Traced.__enter__"] | The function (_create_span) is defined within the public class called public. The function starts at line 841 and ends at 908. It contains 66 lines of code and has a cyclomatic complexity of 15. It takes 5 parameters (fn, args, kwargs, span_type, name), and it returns a value. 
It has 39 functions called inside, which are ["time.perf_counter", "isoformat", "datetime.now", "SPAN.get", "RUNNING_TIME.set", "inspect.signature", "signature.bind", "bound_args.apply_defaults", "_make_json_serializable", "inspect.getsourcelines", "textwrap.dedent", "join", "textwrap.dedent", "inspect.getsource", "inputs.items", "name_w_sub.replace", "str", "inspect.getsourcefile", "str", "uuid.uuid4", "SPAN.get", "SPAN.get", "round", "SPAN.get", "RUNNING_TIME.get", "inspect.ismethod", "hasattr", "GLOBAL_SPAN.get", "GLOBAL_SPAN.set", "SPAN.set", "GLOBAL_SPAN.get", "append", "SPAN.get", "SPAN.set", "update_trace", "_get_trace_logger", "ACTIVE_TRACE_LOG.get", "copy.deepcopy", "GLOBAL_SPAN.get"], and 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._Traced.__enter__"]. |
unifyai_unify | public | public | 0 | 0 | _finalize_span | def _finalize_span(new_span,local_token,outputs,exec_time,prune_empty,global_token,):SPAN.get()["exec_time"] = exec_timeSPAN.get()["outputs"] = outputsSPAN.get()["completed"] = Trueif SPAN.get()["type"] == "llm" and outputs is not None:SPAN.get()["llm_usage"] = outputs["usage"]if SPAN.get()["type"] in ("llm", "llm-cached") and outputs is not None:SPAN.get()["llm_usage_inc_cache"] = outputs["usage"]trace = SPAN.get()if prune_empty:trace = _prune_dict(trace)SPAN.set(trace)if global_token:GLOBAL_SPAN.set(trace)SPAN.reset(local_token)if local_token.old_value is not local_token.MISSING:SPAN.get()["llm_usage"] = _nested_add(SPAN.get()["llm_usage"],new_span["llm_usage"],)SPAN.get()["llm_usage_inc_cache"] = _nested_add(SPAN.get()["llm_usage_inc_cache"],new_span["llm_usage_inc_cache"],)_get_trace_logger().update_trace(ACTIVE_TRACE_LOG.get()[0],copy.deepcopy(GLOBAL_SPAN.get()),)if global_token:GLOBAL_SPAN.reset(global_token) | 9 | 37 | 6 | 245 | 1 | 911 | 947 | 911 | new_span,local_token,outputs,exec_time,prune_empty,global_token | ['trace'] | None | {"Assign": 9, "Expr": 5, "If": 6} | 24 | 37 | 24 | ["SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "_prune_dict", "SPAN.set", "GLOBAL_SPAN.set", "SPAN.reset", "SPAN.get", "_nested_add", "SPAN.get", "SPAN.get", "_nested_add", "SPAN.get", "update_trace", "_get_trace_logger", "ACTIVE_TRACE_LOG.get", "copy.deepcopy", "GLOBAL_SPAN.get", "GLOBAL_SPAN.reset"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._Traced.__exit__"] | The function (_finalize_span) is defined within the public class called public. The function starts at line 911 and ends at 947. It contains 37 lines of code and has a cyclomatic complexity of 9. It takes 6 parameters (new_span, local_token, outputs, exec_time, prune_empty, global_token) and does not return any value. 
It has 24 functions called inside, which are ["SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "SPAN.get", "_prune_dict", "SPAN.set", "GLOBAL_SPAN.set", "SPAN.reset", "SPAN.get", "_nested_add", "SPAN.get", "SPAN.get", "_nested_add", "SPAN.get", "update_trace", "_get_trace_logger", "ACTIVE_TRACE_LOG.get", "copy.deepcopy", "GLOBAL_SPAN.get", "GLOBAL_SPAN.reset"], and 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._Traced.__exit__"]. |
unifyai_unify | public | public | 0 | 0 | _default_trace_filter | def _default_trace_filter(obj, name):return not (name.startswith("__") and name.endswith("__")) | 2 | 2 | 2 | 24 | 0 | 950 | 951 | 950 | obj,name | [] | Returns | {"Return": 1} | 2 | 2 | 2 | ["name.startswith", "name.endswith"] | 0 | [] | The function (_default_trace_filter) is defined within the public class called public. The function starts at line 950 and ends at 951. It contains 2 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (obj, name), and it returns a value. It has 2 functions called inside, which are ["name.startswith", "name.endswith"]. |
unifyai_unify | public | public | 0 | 0 | _trace_class | def _trace_class(cls, prune_empty, span_type, name, filter):_obj_filter = (lambda obj: inspect.isfunction(obj)or inspect.isclass(obj)or inspect.ismethod(obj))for member_name, value in inspect.getmembers(cls, predicate=_obj_filter):if not filter(value, member_name):continue_name = f"{name if name is not None else cls.__name__}.{member_name}"try:setattr(cls,member_name,traced(value, prune_empty=prune_empty, span_type=span_type, name=_name),)except AttributeError:passreturn cls | 6 | 19 | 5 | 102 | 2 | 954 | 973 | 954 | cls,prune_empty,span_type,name,filter | ['_obj_filter', '_name'] | Returns | {"Assign": 2, "Expr": 1, "For": 1, "If": 1, "Return": 1, "Try": 1} | 7 | 20 | 7 | ["inspect.isfunction", "inspect.isclass", "inspect.ismethod", "inspect.getmembers", "filter", "setattr", "traced"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"] | The function (_trace_class) is defined within the public class called public. The function starts at line 954 and ends at 973. It contains 19 lines of code and has a cyclomatic complexity of 6. It takes 5 parameters (cls, prune_empty, span_type, name, filter), and it returns a value. It has 7 functions called inside, which are ["inspect.isfunction", "inspect.isclass", "inspect.ismethod", "inspect.getmembers", "filter", "setattr", "traced"], and 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"]. |
unifyai_unify | public | public | 0 | 0 | _trace_instance | def _trace_instance(inst, prune_empty, span_type, name, filter):"""Trace *only this instance* – the class itself is left untouched.Every callable attribute that passes `filter` is wrapped and reboundto the original instance so that `self` works exactly as before."""cls_name = type(inst).__name__obj_filter = lambda obj: callable(obj)for member_name, value in inspect.getmembers(inst, predicate=obj_filter):if not filter(value, member_name):continuespan_name = f"{name if name is not None else cls_name}.{member_name}"# Use the class-level (unbound) function if it exists – cleaner `@wraps`unbound = getattr(type(inst), member_name, value)traced_fn = traced(unbound,prune_empty=prune_empty,span_type=span_type,name=span_name,filter=filter,)# Determine the original attribute type (regular, staticmethod, classmethod)try:original_attr = inspect.getattr_static(type(inst), member_name)except AttributeError:original_attr = None# Re-bind based on attribute typetry:if isinstance(original_attr, staticmethod):# For staticmethods we do NOT bind to the instance – they behave like plain functionssetattr(inst, member_name, traced_fn)elif isinstance(original_attr, classmethod):# For classmethods we bind to the *class*, not the instancesetattr(inst, member_name, MethodType(traced_fn, type(inst)))else:# Regular instance methods get bound to the instance so that `self` is passed correctlysetattr(inst, member_name, MethodType(traced_fn, inst))except AttributeError:passreturn inst | 7 | 29 | 5 | 181 | 6 | 976 | 1,021 | 976 | inst,prune_empty,span_type,name,filter | ['traced_fn', 'original_attr', 'unbound', 'obj_filter', 'span_name', 'cls_name'] | Returns | {"Assign": 7, "Expr": 4, "For": 1, "If": 3, "Return": 1, "Try": 2} | 17 | 46 | 17 | ["type", "callable", "inspect.getmembers", "filter", "getattr", "type", "traced", "inspect.getattr_static", "type", "isinstance", "setattr", "isinstance", "setattr", "MethodType", "type", "setattr", 
"MethodType"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"] | The function (_trace_instance) defined within the public class called public.The function start at line 976 and ends at 1021. It contains 29 lines of code and it has a cyclomatic complexity of 7. It takes 5 parameters, represented as [976.0], and this function return a value. It declares 17.0 functions, It has 17.0 functions called inside which are ["type", "callable", "inspect.getmembers", "filter", "getattr", "type", "traced", "inspect.getattr_static", "type", "isinstance", "setattr", "isinstance", "setattr", "MethodType", "type", "setattr", "MethodType"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"]. |
unifyai_unify | public | public | 0 | 0 | _trace_module | def _trace_module(module, prune_empty, span_type, name, filter):_obj_filter = (lambda obj: inspect.isfunction(obj)or inspect.isclass(obj)or inspect.ismethod(obj))for member_name, value in inspect.getmembers(module, predicate=_obj_filter):if not filter(value, member_name):continue_name = f"{name if name is not None else module.__name__}.{member_name}"try:setattr(module,member_name,traced(value, prune_empty=prune_empty, span_type=span_type, name=_name),)except AttributeError:passreturn module | 6 | 19 | 5 | 102 | 2 | 1,024 | 1,043 | 1,024 | module,prune_empty,span_type,name,filter | ['_obj_filter', '_name'] | Returns | {"Assign": 2, "Expr": 1, "For": 1, "If": 1, "Return": 1, "Try": 1} | 7 | 20 | 7 | ["inspect.isfunction", "inspect.isclass", "inspect.ismethod", "inspect.getmembers", "filter", "setattr", "traced"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"] | The function (_trace_module) defined within the public class called public.The function start at line 1024 and ends at 1043. It contains 19 lines of code and it has a cyclomatic complexity of 6. It takes 5 parameters, represented as [1024.0], and this function return a value. It declares 7.0 functions, It has 7.0 functions called inside which are ["inspect.isfunction", "inspect.isclass", "inspect.ismethod", "inspect.getmembers", "filter", "setattr", "traced"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"]. |
unifyai_unify | public | public | 0 | 0 | _get_or_compile | def _get_or_compile(func, compiled_ast):if hasattr(func, "__cached_tracer"):transformed_func = getattr(func, "__cached_tracer")_traced_logger.debug(f"Using cached tracer for {func.__name__}")else:is_bound = hasattr(func, "__self__") and func.__self__ is not Noneorig_fn = func.__func__ if is_bound else funcinstance = func.__self__ if is_bound else Noneglobal_ns = func.__globals__.copy()# TODO: This is a hack to get the original function's closureif orig_fn.__closure__:for var, cell in zip(orig_fn.__code__.co_freevars, orig_fn.__closure__):global_ns[var] = cell.cell_contentsglobal_ns["traced"] = tracedlocal_ns = {}_traced_logger.debug(f"Executing compiled AST for {func.__name__}")exec(compiled_ast, global_ns, local_ns)transformed_func = local_ns[orig_fn.__name__]if is_bound:transformed_func = MethodType(transformed_func, instance)try:setattr(func, "__cached_tracer", transformed_func)except:passreturn transformed_func | 9 | 24 | 2 | 164 | 6 | 1,046 | 1,073 | 1,046 | func,compiled_ast | ['instance', 'transformed_func', 'is_bound', 'orig_fn', 'global_ns', 'local_ns'] | Returns | {"Assign": 10, "Expr": 4, "For": 1, "If": 3, "Return": 1, "Try": 1} | 10 | 28 | 10 | ["hasattr", "getattr", "_traced_logger.debug", "hasattr", "func.__globals__.copy", "zip", "_traced_logger.debug", "exec", "MethodType", "setattr"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_wrapper_factory"] | The function (_get_or_compile) defined within the public class called public.The function start at line 1046 and ends at 1073. It contains 24 lines of code and it has a cyclomatic complexity of 9. It takes 2 parameters, represented as [1046.0], and this function return a value. 
It declares 10.0 functions, It has 10.0 functions called inside which are ["hasattr", "getattr", "_traced_logger.debug", "hasattr", "func.__globals__.copy", "zip", "_traced_logger.debug", "exec", "MethodType", "setattr"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_wrapper_factory"]. |
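The `_get_or_compile` row above re-executes a compiled AST in a copy of the function's `__globals__`, splicing closure cells (`co_freevars` / `cell_contents`) into that namespace so free variables resolve, then caches the result on the function object. A self-contained sketch of the namespace trick — the real code obtains the source via `inspect.getsource`; here a literal string stands in so the example runs anywhere:

```python
import ast

def make_adder(k):
    def add(x):
        return x + k          # `k` is a closure free variable of `add`
    return add

add3 = make_adder(3)

# Stand-in for inspect.getsource(add3) + the upstream AST transform.
source = "def add(x):\n    return x + k\n"
compiled = compile(ast.parse(source), filename="<ast>", mode="exec")

global_ns = add3.__globals__.copy()
# Splice closure cells into the namespace: the free variable `k` becomes a
# plain global lookup in the re-exec'd copy.
for var, cell in zip(add3.__code__.co_freevars, add3.__closure__ or ()):
    global_ns[var] = cell.cell_contents

local_ns = {}
exec(compiled, global_ns, local_ns)
transformed = local_ns["add"]

# Cache the result on the original function, mirroring "__cached_tracer".
try:
    setattr(add3, "__cached_tracer", transformed)
except AttributeError:
    pass
```

The re-exec'd copy gets `global_ns` as its `__globals__`, so lookups of former closure variables succeed at call time.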
unifyai_unify | public | public | 0 | 0 | _trace_wrapper_factory.async_wrapped | async def async_wrapped(*args, **kwargs):token = _set_active_trace_parameters(prune_empty=prune_empty,span_type=span_type,name=name,filter=filter,fn_type=fn_type,recursive=recursive,depth=depth,skip_modules=skip_modules,skip_functions=skip_functions,)transformed_fn = _get_or_compile(fn, compiled_ast)with _Traced(fn, args, kwargs, span_type, name, prune_empty) as _t:result = await transformed_fn(*args, **kwargs)_t.result = result_reset_active_trace_parameters(token)return result | 1 | 18 | 2 | 98 | 0 | 1,095 | 1,112 | 1,095 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_trace_wrapper_factory.async_wrapped) is defined within the public class called public. It starts at line 1095 and ends at line 1112. It contains 18 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and has no annotated return type. |
unifyai_unify | public | public | 0 | 0 | _trace_wrapper_factory.async_wrapped | async def async_wrapped(*args, **kwargs):with _Traced(fn, args, kwargs, span_type, name, prune_empty) as _t:result = await fn(*args, **kwargs)_t.result = resultreturn result | 1 | 5 | 2 | 45 | 0 | 1,119 | 1,123 | 1,119 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_trace_wrapper_factory.async_wrapped) is defined within the public class called public. It starts at line 1119 and ends at line 1123. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and has no annotated return type. |
unifyai_unify | public | public | 0 | 0 | _trace_wrapper_factory.wrapped | def wrapped(*args, **kwargs):token = _set_active_trace_parameters(prune_empty=prune_empty,span_type=span_type,name=name,filter=filter,fn_type=fn_type,recursive=recursive,depth=depth,skip_modules=skip_modules,skip_functions=skip_functions,)transformed_fn = _get_or_compile(fn, compiled_ast)with _Traced(fn, args, kwargs, span_type, name, prune_empty) as _t:result = transformed_fn(*args, **kwargs)_t.result = result_reset_active_trace_parameters(token)return result | 1 | 18 | 2 | 97 | 0 | 1,131 | 1,148 | 1,131 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_trace_wrapper_factory.wrapped) is defined within the public class called public. It starts at line 1131 and ends at line 1148. It contains 18 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and has no annotated return type. |
unifyai_unify | public | public | 0 | 0 | _trace_wrapper_factory.wrapped | def wrapped(*args, **kwargs):with _Traced(fn, args, kwargs, span_type, name, prune_empty) as _t:result = fn(*args, **kwargs)_t.result = resultreturn result | 1 | 5 | 2 | 44 | 0 | 1,155 | 1,159 | 1,155 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_trace_wrapper_factory.wrapped) is defined within the public class called public. It starts at line 1155 and ends at line 1159. It contains 5 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters and has no annotated return type. |
unifyai_unify | public | public | 0 | 0 | _trace_wrapper_factory | def _trace_wrapper_factory(*,fn,fn_type,span_type,name,prune_empty,recursive,filter,depth,compiled_ast,skip_modules,skip_functions,):is_coroutine = inspect.iscoroutinefunction(inspect.unwrap(fn)) or fn_type == "async"if is_coroutine and recursive:@functools.wraps(fn)async def async_wrapped(*args, **kwargs):token = _set_active_trace_parameters(prune_empty=prune_empty,span_type=span_type,name=name,filter=filter,fn_type=fn_type,recursive=recursive,depth=depth,skip_modules=skip_modules,skip_functions=skip_functions,)transformed_fn = _get_or_compile(fn, compiled_ast)with _Traced(fn, args, kwargs, span_type, name, prune_empty) as _t:result = await transformed_fn(*args, **kwargs)_t.result = result_reset_active_trace_parameters(token)return resultreturn async_wrappedif is_coroutine and not recursive:@functools.wraps(fn)async def async_wrapped(*args, **kwargs):with _Traced(fn, args, kwargs, span_type, name, prune_empty) as _t:result = await fn(*args, **kwargs)_t.result = resultreturn resultreturn async_wrappedis_function = inspect.isfunction(fn) or inspect.ismethod(fn)if is_function and recursive:@functools.wraps(fn)def wrapped(*args, **kwargs):token = _set_active_trace_parameters(prune_empty=prune_empty,span_type=span_type,name=name,filter=filter,fn_type=fn_type,recursive=recursive,depth=depth,skip_modules=skip_modules,skip_functions=skip_functions,)transformed_fn = _get_or_compile(fn, compiled_ast)with _Traced(fn, args, kwargs, span_type, name, prune_empty) as _t:result = transformed_fn(*args, **kwargs)_t.result = result_reset_active_trace_parameters(token)return resultreturn wrappedif is_function and not recursive:@functools.wraps(fn)def wrapped(*args, **kwargs):with _Traced(fn, args, kwargs, span_type, name, prune_empty) as _t:result = fn(*args, **kwargs)_t.result = resultreturn resultreturn wrappedraise TypeError(f"Unsupported object type, should be function, coroutine or method: {fn_type}",) | 11 | 35 
| 11 | 135 | 5 | 1,076 | 1,165 | 1,076 | fn,fn_type,span_type,name,prune_empty,recursive,filter,depth,compiled_ast,skip_modules,skip_functions | ['is_function', 'is_coroutine', 'transformed_fn', 'result', 'token'] | Returns | {"Assign": 14, "Expr": 2, "If": 4, "Return": 8, "With": 4} | 23 | 90 | 23 | ["inspect.iscoroutinefunction", "inspect.unwrap", "_set_active_trace_parameters", "_get_or_compile", "_Traced", "transformed_fn", "_reset_active_trace_parameters", "functools.wraps", "_Traced", "fn", "functools.wraps", "inspect.isfunction", "inspect.ismethod", "_set_active_trace_parameters", "_get_or_compile", "_Traced", "transformed_fn", "_reset_active_trace_parameters", "functools.wraps", "_Traced", "fn", "functools.wraps", "TypeError"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_function"] | The function (_trace_wrapper_factory) defined within the public class called public.The function start at line 1076 and ends at 1165. It contains 35 lines of code and it has a cyclomatic complexity of 11. It takes 11 parameters, represented as [1076.0], and this function return a value. It declares 23.0 functions, It has 23.0 functions called inside which are ["inspect.iscoroutinefunction", "inspect.unwrap", "_set_active_trace_parameters", "_get_or_compile", "_Traced", "transformed_fn", "_reset_active_trace_parameters", "functools.wraps", "_Traced", "fn", "functools.wraps", "inspect.isfunction", "inspect.ismethod", "_set_active_trace_parameters", "_get_or_compile", "_Traced", "transformed_fn", "_reset_active_trace_parameters", "functools.wraps", "_Traced", "fn", "functools.wraps", "TypeError"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_function"]. |
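The `_trace_wrapper_factory` row above dispatches on `inspect.iscoroutinefunction(inspect.unwrap(fn))` to emit either a sync or an async wrapper, each running the call inside a context manager that records the result. A runnable sketch of that dispatch — `Traced`, `SPANS`, and `trace_wrapper_factory` are illustrative stand-ins for `_Traced` and the factory:

```python
import asyncio
import functools
import inspect

SPANS = []

class Traced:
    # Minimal stand-in for the `_Traced` context manager used above.
    def __init__(self, fn):
        self.fn, self.result = fn, None
    def __enter__(self):
        SPANS.append(("enter", self.fn.__name__))
        return self
    def __exit__(self, *exc):
        SPANS.append(("exit", self.fn.__name__, self.result))
        return False  # never swallow exceptions

def trace_wrapper_factory(fn):
    # Coroutine functions get an async wrapper; plain functions a sync one.
    if inspect.iscoroutinefunction(inspect.unwrap(fn)):
        @functools.wraps(fn)
        async def async_wrapped(*args, **kwargs):
            with Traced(fn) as t:
                result = await fn(*args, **kwargs)
                t.result = result
            return result
        return async_wrapped
    if inspect.isfunction(fn) or inspect.ismethod(fn):
        @functools.wraps(fn)
        def wrapped(*args, **kwargs):
            with Traced(fn) as t:
                result = fn(*args, **kwargs)
                t.result = result
            return result
        return wrapped
    raise TypeError(f"Unsupported object type: {fn!r}")

def double(x):
    return 2 * x

async def atriple(x):
    return 3 * x
```

`functools.wraps` preserves `__name__` and `__wrapped__`, which is also what lets `inspect.unwrap` see through already-wrapped functions.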
unifyai_unify | public | public | 0 | 0 | _trace_function | def _trace_function(fn,prune_empty,span_type,name,filter,fn_type,recursive,depth,skip_modules,skip_functions,):_traced_logger.debug(f"Applying trace decorator to function {fn.__name__}")if not recursive or depth <= 0:return _trace_wrapper_factory(fn=fn,fn_type=fn_type,span_type=span_type,name=name,prune_empty=prune_empty,recursive=False,filter=filter,depth=0,compiled_ast=None,skip_modules=skip_modules,skip_functions=skip_functions,)try:source = inspect.getsource(fn)source = textwrap.dedent(source)except Exception as e:_traced_logger.warning(f"Error getting source for {fn.__name__}: {e}")# Fallback to non-recursive tracingreturn _trace_wrapper_factory(fn=fn,fn_type=fn_type,span_type=span_type,name=name,prune_empty=prune_empty,recursive=False,filter=filter,depth=depth,compiled_ast=None,skip_modules=skip_modules,skip_functions=skip_functions,)parsed_ast = ast.parse(source)func_def = parsed_ast.body[0]if not isinstance(func_def, (ast.FunctionDef, ast.AsyncFunctionDef)):# Fallback to non-recursive tracingreturn _trace_wrapper_factory(fn=fn,fn_type=fn_type,span_type=span_type,name=name,prune_empty=prune_empty,recursive=False,filter=filter,depth=depth,compiled_ast=None,skip_modules=skip_modules,skip_functions=skip_functions,)# Remove decorators# TODO should only remove traced decoratorfunc_def.decorator_list = []collector = TracerCallCollector(func_def.name)collector.visit(func_def)transformer = TracerCallTransformer(collector.get_local_function_names(),collector.get_external_call_names(),)transformer.visit(parsed_ast)ast.fix_missing_locations(parsed_ast)_traced_logger.debug(f"Compiling AST for {fn.__name__}")if _traced_logger_enabled:_print_tree_source_with_lineno(fn.__name__, parsed_ast)compiled_ast = compile(parsed_ast, filename="<ast>", mode="exec")return _trace_wrapper_factory(fn=fn,fn_type=fn_type,span_type=span_type,name=name,prune_empty=prune_empty,recursive=True,filter=filter,depth=depth - 
1,compiled_ast=compiled_ast,skip_modules=skip_modules,skip_functions=skip_functions,) | 6 | 87 | 10 | 377 | 6 | 1,168 | 1,267 | 1,168 | fn,prune_empty,span_type,name,filter,fn_type,recursive,depth,skip_modules,skip_functions | ['func_def', 'collector', 'source', 'compiled_ast', 'parsed_ast', 'transformer'] | Returns | {"Assign": 8, "Expr": 7, "If": 3, "Return": 4, "Try": 1} | 20 | 100 | 20 | ["_traced_logger.debug", "_trace_wrapper_factory", "inspect.getsource", "textwrap.dedent", "_traced_logger.warning", "_trace_wrapper_factory", "ast.parse", "isinstance", "_trace_wrapper_factory", "TracerCallCollector", "collector.visit", "TracerCallTransformer", "collector.get_local_function_names", "collector.get_external_call_names", "transformer.visit", "ast.fix_missing_locations", "_traced_logger.debug", "_print_tree_source_with_lineno", "compile", "_trace_wrapper_factory"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"] | The function (_trace_function) defined within the public class called public.The function start at line 1168 and ends at 1267. It contains 87 lines of code and it has a cyclomatic complexity of 6. It takes 10 parameters, represented as [1168.0], and this function return a value. 
It declares 20.0 functions, It has 20.0 functions called inside which are ["_traced_logger.debug", "_trace_wrapper_factory", "inspect.getsource", "textwrap.dedent", "_traced_logger.warning", "_trace_wrapper_factory", "ast.parse", "isinstance", "_trace_wrapper_factory", "TracerCallCollector", "collector.visit", "TracerCallTransformer", "collector.get_local_function_names", "collector.get_external_call_names", "transformer.visit", "ast.fix_missing_locations", "_traced_logger.debug", "_print_tree_source_with_lineno", "compile", "_trace_wrapper_factory"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"]. |
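The `_trace_function` row above drives the recursive path: get the source, dedent, parse, strip the decorator list (a TODO in the source notes only the `traced` decorator should be removed), run collector/transformer passes, call `ast.fix_missing_locations`, and compile. A self-contained sketch of that pipeline with a toy transformer pass — the real code uses `inspect.getsource`; a literal string stands in here, and `AddOneMore` is illustrative, not the real `TracerCallTransformer`:

```python
import ast

source = """\
@some_decorator
def f(x):
    return x + 1
"""

tree = ast.parse(source)
func_def = tree.body[0]
assert isinstance(func_def, (ast.FunctionDef, ast.AsyncFunctionDef))
func_def.decorator_list = []  # strip decorators so re-exec needs no `some_decorator`

class AddOneMore(ast.NodeTransformer):
    # Toy pass standing in for TracerCallTransformer:
    # rewrite `return expr` into `return expr + 1`.
    def visit_Return(self, node):
        node.value = ast.BinOp(left=node.value, op=ast.Add(),
                               right=ast.Constant(value=1))
        return node

AddOneMore().visit(tree)
ast.fix_missing_locations(tree)  # synthesized nodes need positions before compile
code = compile(tree, filename="<ast>", mode="exec")

ns = {}
exec(code, ns)
f = ns["f"]
```

Stripping `decorator_list` before re-exec is what keeps the copy from re-entering the decorator (and from needing `some_decorator` to exist in the namespace at all).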
unifyai_unify | public | public | 0 | 0 | traced | def traced(obj: Union[Callable, ModuleType, Type[Any], Any] = None,*,prune_empty: bool = True,span_type: str = "function",name: Optional[str] = None,filter: Optional[Callable[[callable], bool]] = None,fn_type: Optional[str] = None,recursive: bool = False,# Only valid for Functions.depth: Optional[int] = None,skip_modules: Optional[List[ModuleType]] = None,skip_functions: Optional[List[Callable]] = None,):if not json.loads(os.environ.get("UNIFY_TRACED", "true")):return objinitialize_trace_logger()if obj is None:# Any changes to the arguments of traced should be reflected here.return lambda f: traced(f,prune_empty=prune_empty,span_type=span_type,name=name,filter=filter,fn_type=fn_type,recursive=recursive,depth=depth,skip_modules=skip_modules,skip_functions=skip_functions,)if ACTIVE_TRACE_PARAMETERS.get() is not None:args = ACTIVE_TRACE_PARAMETERS.get()prune_empty = args["prune_empty"]span_type = args["span_type"]name = args["name"]filter = args["filter"]fn_type = args["fn_type"]recursive = args["recursive"]depth = args["depth"]skip_modules = args["skip_modules"]skip_functions = args["skip_functions"]if hasattr(obj, "__unify_traced") or (skip_modules is not None and inspect.getmodule(obj) in skip_modules):return objret = Noneif inspect.isclass(obj):ret = _trace_class(obj,prune_empty,span_type,name,filter if filter else _default_trace_filter,)elif inspect.ismodule(obj):ret = _trace_module(obj,prune_empty,span_type,name,filter if filter else _default_trace_filter,)elif inspect.isfunction(obj) or inspect.ismethod(obj):if skip_functions is not None and obj in skip_functions:return objif depth is None:depth = float("inf")ret = _trace_function(obj,prune_empty,span_type,name,filter,fn_type,recursive,depth,skip_modules,skip_functions,)else:ret = _trace_instance(obj,prune_empty,span_type,name,filter if filter else _default_trace_filter,)if ret is not None:try:setattr(ret, "__unify_traced", True)except (AttributeError, 
TypeError):passreturn ret | 19 | 92 | 12 | 446 | 11 | 1,270 | 1,370 | 1,270 | obj,prune_empty,span_type,name,filter,fn_type,recursive,depth,skip_modules,skip_functions | ['skip_modules', 'args', 'filter', 'prune_empty', 'fn_type', 'skip_functions', 'ret', 'name', 'span_type', 'depth', 'recursive'] | Returns | {"Assign": 16, "Expr": 2, "If": 10, "Return": 5, "Try": 1} | 18 | 101 | 18 | ["json.loads", "os.environ.get", "initialize_trace_logger", "traced", "ACTIVE_TRACE_PARAMETERS.get", "ACTIVE_TRACE_PARAMETERS.get", "hasattr", "inspect.getmodule", "inspect.isclass", "_trace_class", "inspect.ismodule", "_trace_module", "inspect.isfunction", "inspect.ismethod", "float", "_trace_function", "_trace_instance", "setattr"] | 4 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_class", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_instance", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_module", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"] | The function (traced) defined within the public class called public.The function start at line 1270 and ends at 1370. It contains 92 lines of code and it has a cyclomatic complexity of 19. It takes 12 parameters, represented as [1270.0], and this function return a value. 
It declares 18.0 functions, It has 18.0 functions called inside which are ["json.loads", "os.environ.get", "initialize_trace_logger", "traced", "ACTIVE_TRACE_PARAMETERS.get", "ACTIVE_TRACE_PARAMETERS.get", "hasattr", "inspect.getmodule", "inspect.isclass", "_trace_class", "inspect.ismodule", "_trace_module", "inspect.isfunction", "inspect.ismethod", "float", "_trace_function", "_trace_instance", "setattr"], It has 4.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_class", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_instance", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_module", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"]. |
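The `traced` row above is the public entry point: it honors a `UNIFY_TRACED` environment kill switch, supports bare use (`@traced`) and parameterized use (`@traced(name=...)`) by returning a re-entrant lambda when `obj is None`, marks results with `__unify_traced` for idempotence, and dispatches on object type. A sketch of that dispatch shape for the function case only — the span-recording `spans` list is an illustrative stand-in for the real span machinery:

```python
import functools
import inspect
import json
import os

def traced(obj=None, *, name=None):
    # Kill switch mirroring the row above: UNIFY_TRACED="false" disables tracing.
    if not json.loads(os.environ.get("UNIFY_TRACED", "true")):
        return obj
    # Called with arguments only: return a decorator that re-enters `traced`.
    if obj is None:
        return lambda f: traced(f, name=name)
    # Idempotence guard, mirroring the `__unify_traced` marker.
    if getattr(obj, "__unify_traced", False):
        return obj
    if inspect.isfunction(obj) or inspect.ismethod(obj):
        @functools.wraps(obj)
        def wrapped(*args, **kwargs):
            wrapped.spans.append(name or obj.__name__)
            return obj(*args, **kwargs)
        wrapped.spans = []
        setattr(wrapped, "__unify_traced", True)
        return wrapped
    raise TypeError(f"Unsupported object: {obj!r}")

@traced
def double(x):
    return 2 * x

@traced(name="mult3")
def triple(x):
    return 3 * x
```

The `obj is None` re-entry is the standard trick for making one decorator work both bare and with keyword arguments.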
unifyai_unify | Log | public | 0 | 0 | __init__ | def __init__(self):super().__init__()self.param_names = []self.assigned_names = set()self._in_function = False | 1 | 5 | 1 | 30 | 0 | 1,374 | 1,378 | 1,374 | self,id,_future,ts,project,context,api_key,params,**entries | [] | None | {"Assign": 8} | 1 | 20 | 1 | ["_validate_api_key"] | 14,667 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"] | The function (__init__) defined within the public class called Log.The function start at line 1374 and ends at 1378. It contains 5 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declare 1.0 function, It has 1.0 function called inside which is ["_validate_api_key"], It has 14667.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]. |
unifyai_unify | TracerCallCollector | public | 0 | 1 | visit_FunctionDef | def visit_FunctionDef(self, node: ast.FunctionDef):self._in_function = True# Collect non-underscore paramsself.param_names = [arg.arg for arg in node.args.args if not arg.arg.startswith("_")]# Collect non-underscore kwonlyargsself.param_names += [arg.arg for arg in node.args.kwonlyargs if not arg.arg.startswith("_")]# Add **kwargs parameter if not already presentif not node.args.kwarg:node.args.kwarg = ast.arg(arg="kwargs")# TODO: this is a hack to ensure that the function always returns somethingif not isinstance(node.body[-1], ast.Return):node.body.append(ast.Return(value=ast.Constant(value=None)))node = self.generic_visit(node)self._in_function = Falseself.param_names = []self.assigned_names = set()return node | 7 | 17 | 2 | 159 | 0 | 1,380 | 1,403 | 1,380 | self,node | [] | Returns | {"Expr": 2, "Return": 1} | 2 | 4 | 2 | ["self.generic_visit", "self.local_function_names.add"] | 0 | [] | The function (visit_FunctionDef) is defined within the public class called TracerCallCollector, which inherits from another class. It starts at line 1380 and ends at line 1403. It contains 17 lines of code and has a cyclomatic complexity of 7. It takes 2 parameters and returns a value. It calls 2 functions: ["self.generic_visit", "self.local_function_names.add"]. |
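The `visit_FunctionDef` row above mutates each function definition in place: it guarantees a `**kwargs` slot and appends a trailing `return None` when the body does not already end in a `Return` (flagged as a hack in the source's TODO). A runnable sketch of those two mutations — `EnsureReturn` is an illustrative stand-in for the collector:

```python
import ast

class EnsureReturn(ast.NodeVisitor):
    # Mutating visitor in the spirit of visit_FunctionDef above: guarantee a
    # **kwargs slot and a trailing `return None` on every function definition.
    def visit_FunctionDef(self, node):
        if not node.args.kwarg:
            node.args.kwarg = ast.arg(arg="kwargs")
        if not isinstance(node.body[-1], ast.Return):
            node.body.append(ast.Return(value=ast.Constant(value=None)))
        self.generic_visit(node)

tree = ast.parse("def f(x):\n    y = x + 1\n")
EnsureReturn().visit(tree)
ast.fix_missing_locations(tree)  # synthesized nodes need positions to compile

ns = {}
exec(compile(tree, "<ast>", "exec"), ns)
f = ns["f"]
```

The injected `**kwargs` slot matters downstream: the `visit_Return` transformer further below emits a dict comprehension over `kwargs.items()`, which would raise `NameError` without it.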
unifyai_unify | TracerCallCollector | public | 0 | 1 | visit_AsyncFunctionDef | def visit_AsyncFunctionDef(self, node: ast.AsyncFunctionDef):self._in_function = Trueself.param_names = [arg.arg for arg in node.args.args if not arg.arg.startswith("_")]node = self.generic_visit(node)self._in_function = Falseself.param_names = []self.assigned_names = set()return node | 3 | 10 | 2 | 71 | 0 | 1,405 | 1,414 | 1,405 | self,node | [] | Returns | {"Expr": 2, "Return": 1} | 2 | 4 | 2 | ["self.generic_visit", "self.local_function_names.add"] | 0 | [] | The function (visit_AsyncFunctionDef) is defined within the public class called TracerCallCollector, which inherits from another class. It starts at line 1405 and ends at line 1414. It contains 10 lines of code and has a cyclomatic complexity of 3. It takes 2 parameters and returns a value. It calls 2 functions: ["self.generic_visit", "self.local_function_names.add"]. |
unifyai_unify | LogTransformer | public | 0 | 1 | visit_Assign | def visit_Assign(self, node: ast.Assign):if self._in_function:for target in node.targets:if isinstance(target, ast.Name) and not target.id.startswith("_"):# Remove from param_names if it's a reassigned parameterif target.id in self.param_names:self.param_names.remove(target.id)self.assigned_names.add(target.id)return node | 6 | 8 | 2 | 74 | 0 | 1,416 | 1,424 | 1,416 | self,node | [] | Returns | {"Expr": 2, "For": 1, "If": 3, "Return": 1} | 4 | 9 | 4 | ["isinstance", "target.id.startswith", "self.param_names.remove", "self.assigned_names.add"] | 0 | [] | The function (visit_Assign) is defined within the public class called LogTransformer, which inherits from another class. It starts at line 1416 and ends at line 1424. It contains 8 lines of code and has a cyclomatic complexity of 6. It takes 2 parameters and returns a value. It calls 4 functions: ["isinstance", "target.id.startswith", "self.param_names.remove", "self.assigned_names.add"]. |
unifyai_unify | LogTransformer | public | 0 | 1 | visit_Return | def visit_Return(self, node: ast.Return):if not self._in_function:return nodelog_keywords = []# Add regular parameters (that weren't reassigned)for p in self.param_names:log_keywords.append(ast.keyword(arg=p, value=ast.Name(id=p, ctx=ast.Load())),)# Add assigned variables (including reassigned parameters)for var_name in sorted(self.assigned_names):log_keywords.append(ast.keyword(arg=var_name, value=ast.Name(id=var_name, ctx=ast.Load())),)# Add filtered kwargs (non-underscore keys)kwargs_dict = ast.DictComp(key=ast.Name(id="k", ctx=ast.Load()),value=ast.Name(id="v", ctx=ast.Load()),generators=[ast.comprehension(target=ast.Tuple(elts=[ast.Name(id="k", ctx=ast.Store()),ast.Name(id="v", ctx=ast.Store()),],ctx=ast.Store(),),iter=ast.Call(func=ast.Attribute(value=ast.Name(id="kwargs", ctx=ast.Load()),attr="items",ctx=ast.Load(),),args=[],keywords=[],),ifs=[ast.UnaryOp(op=ast.Not(),operand=ast.Call(func=ast.Attribute(value=ast.Name(id="k", ctx=ast.Load()),attr="startswith",ctx=ast.Load(),),args=[ast.Constant(value="_")],keywords=[],),),],is_async=0,),],)log_keywords.append(ast.keyword(arg=None, value=kwargs_dict))return_value = (node.value if node.value is not None else ast.Constant(value=None))log_call = ast.Expr(value=ast.Call(func=ast.Name(id="unify_log", ctx=ast.Load()),args=[],keywords=log_keywords,),)return [log_call, ast.Return(value=return_value)] | 5 | 63 | 2 | 458 | 0 | 1,426 | 1,497 | 1,426 | self,node | [] | Returns | {"Assign": 4, "Expr": 3, "For": 2, "If": 1, "Return": 2} | 42 | 72 | 42 | ["log_keywords.append", "ast.keyword", "ast.Name", "ast.Load", "sorted", "log_keywords.append", "ast.keyword", "ast.Name", "ast.Load", "ast.DictComp", "ast.Name", "ast.Load", "ast.Name", "ast.Load", "ast.comprehension", "ast.Tuple", "ast.Name", "ast.Store", "ast.Name", "ast.Store", "ast.Store", "ast.Call", "ast.Attribute", "ast.Name", "ast.Load", "ast.Load", "ast.UnaryOp", "ast.Not", "ast.Call", 
"ast.Attribute", "ast.Name", "ast.Load", "ast.Load", "ast.Constant", "log_keywords.append", "ast.keyword", "ast.Constant", "ast.Expr", "ast.Call", "ast.Name", "ast.Load", "ast.Return"] | 0 | [] | The function (visit_Return) defined within the public class called LogTransformer, that inherit another class.The function start at line 1426 and ends at 1497. It contains 63 lines of code and it has a cyclomatic complexity of 5. It takes 2 parameters, represented as [1426.0], and this function return a value. It declares 42.0 functions, and It has 42.0 functions called inside which are ["log_keywords.append", "ast.keyword", "ast.Name", "ast.Load", "sorted", "log_keywords.append", "ast.keyword", "ast.Name", "ast.Load", "ast.DictComp", "ast.Name", "ast.Load", "ast.Name", "ast.Load", "ast.comprehension", "ast.Tuple", "ast.Name", "ast.Store", "ast.Name", "ast.Store", "ast.Store", "ast.Call", "ast.Attribute", "ast.Name", "ast.Load", "ast.Load", "ast.UnaryOp", "ast.Not", "ast.Call", "ast.Attribute", "ast.Name", "ast.Load", "ast.Load", "ast.Constant", "log_keywords.append", "ast.keyword", "ast.Constant", "ast.Expr", "ast.Call", "ast.Name", "ast.Load", "ast.Return"]. |
unifyai_unify | public | public | 0 | 0 | log_decorator.transformed_func | def transformed_func(*args, **kwargs):with unify.Log():return trans(*args, **kwargs) | 1 | 3 | 2 | 25 | 0 | 1,533 | 1,535 | 1,533 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (log_decorator.transformed_func) is defined within the public class called public. The function starts at line 1533 and ends at 1535. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters, represented as [1533.0], and does not return any value. |
unifyai_unify | public | public | 0 | 0 | log_decorator | def log_decorator(func):"""Decorator that rewrites the function's AST so that it logs non-underscoreparameters, and assigned variables."""# 1) Parse the source to an ASTsource = textwrap.dedent(inspect.getsource(func))# Remove the decorator line if presentsource_lines = source.split("\n")if source_lines[0].strip().startswith("@"):source = "\n".join(source_lines[1:])mod = ast.parse(source)# 2) Transform the ASTtransformer = LogTransformer()mod = transformer.visit(mod)ast.fix_missing_locations(mod)# 3) Compile the new ASTcode = compile(mod, filename="<ast>", mode="exec")# 4) Get the current module's globalsmodule = inspect.getmodule(func)func_globals = module.__dict__.copy() if module else globals().copy()func_globals["unify_log"] = unify_log# 5) Execute the compiled module code in that namespaceexec(code, func_globals)trans = func_globals[func.__name__]# 6 ) Add logging contextdef transformed_func(*args, **kwargs):with unify.Log():return trans(*args, **kwargs)# Copy necessary attributestransformed_func.__name__ = func.__name__transformed_func.__doc__ = func.__doc__transformed_func.__module__ = func.__module__transformed_func.__annotations__ = func.__annotations__# Copy closure and cell variables if they existif func.__closure__:transformed_func.__closure__ = func.__closure__return transformed_func | 4 | 23 | 1 | 186 | 8 | 1,500 | 1,547 | 1,500 | func | ['mod', 'func_globals', 'code', 'source', 'source_lines', 'module', 'transformer', 'trans'] | Returns | {"Assign": 16, "Expr": 3, "If": 2, "Return": 2, "With": 1} | 18 | 48 | 18 | ["textwrap.dedent", "inspect.getsource", "source.split", "startswith", "strip", "join", "ast.parse", "LogTransformer", "transformer.visit", "ast.fix_missing_locations", "compile", "inspect.getmodule", "module.__dict__.copy", "copy", "globals", "exec", "unify.Log", "trans"] | 1 | 
["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"] | The function (log_decorator) is defined within the public class called public. The function starts at line 1500 and ends at 1547. It contains 23 lines of code and has a cyclomatic complexity of 4. It takes 1 parameter (func), and returns a value. It declares 18 functions; it has 18 functions called inside, which are ["textwrap.dedent", "inspect.getsource", "source.split", "startswith", "strip", "join", "ast.parse", "LogTransformer", "transformer.visit", "ast.fix_missing_locations", "compile", "inspect.getmodule", "module.__dict__.copy", "copy", "globals", "exec", "unify.Log", "trans"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | __init__ | def __init__(self,*,name: str = "unknown",base_url: str = BASE_URL,api_key: str = os.getenv("UNIFY_KEY"),num_consumers: int = 256,max_queue_size: int = 10000,):self.name = f"UnifyAsyncLogger.{name}"self.loop = asyncio.new_event_loop()self.queue = Noneself.consumers: List[asyncio.Task] = []self.num_consumers = num_consumersself.start_flag = threading.Event()self.shutting_down = Falseself.thread_raised_exception = threading.Event()self.max_queue_size = max_queue_sizeself.logger = logging.getLogger(self.name)if ASYNC_LOGGER_DEBUG:self.logger.setLevel(logging.DEBUG)# Register shutdown handleratexit.register(self.stop_sync, immediate=False)headers = _create_request_header(api_key)url = base_url + "/"connector = aiohttp.TCPConnector(limit=num_consumers // 2, loop=self.loop)self.session = aiohttp.ClientSession(url,headers=headers,loop=self.loop,connector=connector,)self.thread = threading.Thread(target=self._run_loop,daemon=True,name=self.name,)self.thread.start()self.start_flag.wait()if self.thread_raised_exception.is_set():raise RuntimeError(f"{self.name} thread failed to start")self.callbacks = [] | 3 | 41 | 4 | 252 | 0 | 21 | 65 | 21 | self,name,base_url,api_key,num_consumers,max_queue_size | [] | None | {"AnnAssign": 1, "Assign": 15, "Expr": 4, "If": 2} | 15 | 45 | 15 | ["os.getenv", "asyncio.new_event_loop", "threading.Event", "threading.Event", "logging.getLogger", "self.logger.setLevel", "atexit.register", "_create_request_header", "aiohttp.TCPConnector", "aiohttp.ClientSession", "threading.Thread", "self.thread.start", "self.start_flag.wait", "self.thread_raised_exception.is_set", "RuntimeError"] | 14,667 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"] | The function (__init__) is defined within the public class called AsyncLoggerManager. The function starts at line 21 and ends at 65. It contains 41 lines of code and has a cyclomatic complexity of 3. It takes 4 parameters, represented as [21.0], and does not return any value.
It declares 15.0 functions, It has 15.0 functions called inside which are ["os.getenv", "asyncio.new_event_loop", "threading.Event", "threading.Event", "logging.getLogger", "self.logger.setLevel", "atexit.register", "_create_request_header", "aiohttp.TCPConnector", "aiohttp.ClientSession", "threading.Thread", "self.thread.start", "self.start_flag.wait", "self.thread_raised_exception.is_set", "RuntimeError"], It has 14667.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]. |
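The AsyncLoggerManager.__init__ row above shows a common pattern: start an asyncio event loop on a daemon thread and block the constructor on a `threading.Event` until the loop is actually running. A simplified, runnable sketch of just that pattern (the `LoopThread` class and its method names are illustrative; the HTTP session, consumers, and error handling from the row are omitted):

```python
import asyncio
import threading

class LoopThread:
    """Run an asyncio event loop on a daemon thread, signalling readiness
    through a threading.Event, as the __init__ row above describes."""

    def __init__(self, name="worker"):
        self.loop = asyncio.new_event_loop()
        self.start_flag = threading.Event()
        self.thread = threading.Thread(target=self._run, daemon=True, name=name)
        self.thread.start()
        # Block until the loop is actually running before accepting work.
        self.start_flag.wait()

    def _run(self):
        asyncio.set_event_loop(self.loop)
        # The callback fires once run_forever has started the loop.
        self.loop.call_soon(self.start_flag.set)
        self.loop.run_forever()
        self.loop.close()

    def submit(self, coro):
        # Hand a coroutine to the loop thread and wait for its result.
        return asyncio.run_coroutine_threadsafe(coro, self.loop).result()

    def stop(self):
        self.loop.call_soon_threadsafe(self.loop.stop)
        self.thread.join()

async def add(a, b):
    await asyncio.sleep(0)
    return a + b

lt = LoopThread()
print(lt.submit(add(2, 3)))  # 5
lt.stop()
```

Blocking on the start flag in `__init__` matters: `run_coroutine_threadsafe` needs a running loop, so submitting work before the flag is set would race with loop startup.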
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | register_callback | def register_callback(self, fn):self.callbacks.append(fn) | 1 | 2 | 2 | 15 | 0 | 67 | 68 | 67 | self,fn | [] | None | {"Expr": 1} | 1 | 2 | 1 | ["self.callbacks.append"] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930807_kane610_deconz.tests.test_gateway_py.test_alarmsystem_events", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930807_kane610_deconz.tests.test_gateway_py.test_group_events", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930807_kane610_deconz.tests.test_gateway_py.test_light_events", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930807_kane610_deconz.tests.test_gateway_py.test_sensor_events", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94562214_home_assistant_core.homeassistant.components.egardia.alarm_control_panel_py.EgardiaAlarm.async_added_to_hass"] | The function (register_callback) is defined within the public class called AsyncLoggerManager. The function starts at line 67 and ends at 68. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters, represented as [67.0], and does not return any value. It declares 1 function; it has 1 function called inside, which is ["self.callbacks.append"], and it has 5 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930807_kane610_deconz.tests.test_gateway_py.test_alarmsystem_events", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930807_kane610_deconz.tests.test_gateway_py.test_group_events", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930807_kane610_deconz.tests.test_gateway_py.test_light_events", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3930807_kane610_deconz.tests.test_gateway_py.test_sensor_events", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94562214_home_assistant_core.homeassistant.components.egardia.alarm_control_panel_py.EgardiaAlarm.async_added_to_hass"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | clear_callbacks | def clear_callbacks(self):self.callbacks = [] | 1 | 2 | 1 | 11 | 0 | 70 | 71 | 70 | self | [] | None | {"Assign": 1} | 0 | 2 | 0 | [] | 0 | [] | The function (clear_callbacks) is defined within the public class called AsyncLoggerManager. The function starts at line 70 and ends at 71. It contains 2 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | _notify_callbacks | def _notify_callbacks(self):for fn in self.callbacks:fn() | 2 | 3 | 1 | 15 | 0 | 73 | 75 | 73 | self | [] | None | {"Expr": 1, "For": 1} | 1 | 3 | 1 | ["fn"] | 0 | [] | The function (_notify_callbacks) is defined within the public class called AsyncLoggerManager. The function starts at line 73 and ends at 75. It contains 3 lines of code and has a cyclomatic complexity of 2. The function does not take any parameters and does not return any value. It declares 1 function, and it has 1 function called inside, which is ["fn"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | _join | async def _join(self):if self.queue is None:returnawait self.queue.join() | 2 | 4 | 1 | 21 | 0 | 77 | 80 | 77 | self | [] | None | {"Expr": 1, "If": 1, "Return": 1} | 1 | 4 | 1 | ["self.queue.join"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3955608_simplejson_simplejson.simplejson.decoder_py.py_scanstring"] | The function (_join) is defined within the public class called AsyncLoggerManager. The function starts at line 77 and ends at 80. It contains 4 lines of code and has a cyclomatic complexity of 2. The function does not take any parameters and does not return any value. It declares 1 function; it has 1 function called inside, which is ["self.queue.join"], and it has 1 function calling this function, which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3955608_simplejson_simplejson.simplejson.decoder_py.py_scanstring"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | join | def join(self):try:future = asyncio.run_coroutine_threadsafe(self._join(), self.loop)while True:try:future.result(timeout=0.5)breakexcept (asyncio.TimeoutError, TimeoutError):self.logger.debug(f"Join waiting for {self.queue._unfinished_tasks} tasks to complete",)continueexcept Exception as e:self.logger.error(f"Error in join: {e}")raise e | 4 | 15 | 1 | 75 | 0 | 82 | 96 | 82 | self | [] | None | {"Assign": 1, "Expr": 3, "Try": 2, "While": 1} | 5 | 15 | 5 | ["asyncio.run_coroutine_threadsafe", "self._join", "future.result", "self.logger.debug", "self.logger.error"] | 7,270 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.response_py.Record._diff", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.response_py.Record._endpoint_from_url", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.models.dcim_py.TraceableRecord._get_obj_class", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.config_py.Configuration.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.duckhunt_py.ducks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.duckhunt_py.make_duck", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.emojify_py.get_emoji_match", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.events_py.events_timer", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.expand_py.expand", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.feeds_py.feeds_timer", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.help_py.help", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.insult_py.insult", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.tweets_py.tweets_timer", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.format_subcommand_aliases", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.handle_bad_command_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py._process_unexpected_kwarg_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.make_repr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.common_py.format_param", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.common_py.format_param_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.conditions_py._And.negated_description", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.conditions_py._Or.negated_description", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.write_aliases", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.write_text", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.scripts.generate_git_example_py.generate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_formatting_py.test_write_text_with_styles"] | The function (join) is defined within the public class called AsyncLoggerManager. The function starts at line 82 and ends at 96. It contains 15 lines of code and has a cyclomatic complexity of 4. The function does not take any parameters and does not return any value. It declares 5 functions; it has 5 functions called inside, which are ["asyncio.run_coroutine_threadsafe", "self._join", "future.result", "self.logger.debug", "self.logger.error"], and it has 7270 functions calling this function, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.response_py.Record._diff", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.response_py.Record._endpoint_from_url", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.models.dcim_py.TraceableRecord._get_obj_class", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.config_py.Configuration.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.duckhunt_py.ducks", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.duckhunt_py.make_duck",
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.emojify_py.get_emoji_match", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.events_py.events_timer", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.expand_py.expand", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.feeds_py.feeds_timer", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.help_py.help", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.insult_py.insult", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.modules.tweets_py.tweets_timer", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.format_subcommand_aliases", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.handle_bad_command_name", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py._process_unexpected_kwarg_error", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.make_repr", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.common_py.format_param", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.common_py.format_param_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.conditions_py._And.negated_description", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.conditions_py._Or.negated_description", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.write_aliases", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.write_text", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.scripts.generate_git_example_py.generate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_formatting_py.test_write_text_with_styles"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | _main_loop | async def _main_loop(self):self.start_flag.set()self.logger.debug(f"Spawning {self.num_consumers} consumers")await asyncio.gather(*self.consumers, return_exceptions=True) | 1 | 4 | 1 | 35 | 0 | 98 | 101 | 98 | self | [] | None | {"Expr": 3} | 3 | 4 | 3 | ["self.start_flag.set", "self.logger.debug", "asyncio.gather"] | 0 | [] | The function (_main_loop) is defined within the public class called AsyncLoggerManager. The function starts at line 98 and ends at 101. It contains 4 lines of code and has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 3 functions, and it has 3 functions called inside, which are ["self.start_flag.set", "self.logger.debug", "asyncio.gather"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | _run_loop | def _run_loop(self):asyncio.set_event_loop(self.loop)self.queue = asyncio.Queue(maxsize=self.max_queue_size)for _ in range(self.num_consumers):self.consumers.append(self._log_consumer())try:self.loop.run_until_complete(self._main_loop())except Exception as e:self.shutting_down = Trueself.thread_raised_exception.set()self.start_flag.set()raise efinally:self.loop.close() | 4 | 14 | 1 | 98 | 0 | 103 | 118 | 103 | self | [] | None | {"Assign": 2, "Expr": 6, "For": 1, "Try": 1} | 10 | 16 | 10 | ["asyncio.set_event_loop", "asyncio.Queue", "range", "self.consumers.append", "self._log_consumer", "self.loop.run_until_complete", "self._main_loop", "self.thread_raised_exception.set", "self.start_flag.set", "self.loop.close"] | 0 | [] | The function (_run_loop) is defined within the public class called AsyncLoggerManager. The function starts at line 103 and ends at 118. It contains 14 lines of code and has a cyclomatic complexity of 4. The function does not take any parameters and does not return any value. It declares 10 functions, and it has 10 functions called inside, which are ["asyncio.set_event_loop", "asyncio.Queue", "range", "self.consumers.append", "self._log_consumer", "self.loop.run_until_complete", "self._main_loop", "self.thread_raised_exception.set", "self.start_flag.set", "self.loop.close"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | _consume_create | async def _consume_create(self, body, future, idx):async with self.session.post("logs", json=body) as res:if res.status != 200:txt = await res.text()self.logger.error(f"Failed to create log {idx} {res.status}: {txt}",)returnres_json = await res.json()self.logger.debug(f"Created log {res_json['log_event_ids'][0]} with status {res.status}",)future.set_result(res_json["log_event_ids"][0]) | 2 | 13 | 4 | 84 | 0 | 120 | 132 | 120 | self,body,future,idx | [] | None | {"Assign": 2, "Expr": 3, "If": 1, "Return": 1} | 6 | 13 | 6 | ["self.session.post", "res.text", "self.logger.error", "res.json", "self.logger.debug", "future.set_result"] | 0 | [] | The function (_consume_create) is defined within the public class called AsyncLoggerManager. The function starts at line 120 and ends at 132. It contains 13 lines of code and has a cyclomatic complexity of 2. It takes 4 parameters, represented as [120.0], and does not return any value. It declares 6 functions, and it has 6 functions called inside, which are ["self.session.post", "res.text", "self.logger.error", "res.json", "self.logger.debug", "future.set_result"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | _consume_update | async def _consume_update(self, body, future, idx):if not future.done():await futurebody["logs"] = [future.result()]async with self.session.put("logs", json=body) as res:if res.status != 200:txt = await res.text()self.logger.error(f"Failed to update log {idx} {body['logs'][0]} {res.status}: {txt}",)returnres_json = await res.json()self.logger.debug(f"Updated log {res_json['log_event_ids'][0]} with status {res.status}",) | 3 | 15 | 4 | 94 | 0 | 134 | 148 | 134 | self,body,future,idx | [] | None | {"Assign": 3, "Expr": 3, "If": 2, "Return": 1} | 7 | 15 | 7 | ["future.done", "future.result", "self.session.put", "res.text", "self.logger.error", "res.json", "self.logger.debug"] | 0 | [] | The function (_consume_update) is defined within the public class AsyncLoggerManager. It starts at line 134 and ends at line 148, contains 15 lines of code, and has a cyclomatic complexity of 3. It takes 4 parameters (self, body, future, idx) and does not return any value. It calls 7 functions: ["future.done", "future.result", "self.session.put", "res.text", "self.logger.error", "res.json", "self.logger.debug"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | _log_consumer | async def _log_consumer(self):while True:try:event = await self.queue.get()idx = self.queue.qsize() + 1self.logger.debug(f"'{event['type']}' processing {idx}")if event["type"] == "create":await self._consume_create(event["_data"], event["future"], idx)elif event["type"] == "update":await self._consume_update(event["_data"], event["future"], idx)else:raise Exception(f"Unknown event type: {event['type']}")except Exception as e:event["future"].set_exception(e)self.logger.error(f"Error in consumer: {e}")raise efinally:self.queue.task_done()self._notify_callbacks() | 6 | 19 | 1 | 137 | 0 | 150 | 168 | 150 | self | [] | None | {"Assign": 2, "Expr": 7, "If": 2, "Try": 1, "While": 1} | 10 | 19 | 10 | ["self.queue.get", "self.queue.qsize", "self.logger.debug", "self._consume_create", "self._consume_update", "Exception", "set_exception", "self.logger.error", "self.queue.task_done", "self._notify_callbacks"] | 0 | [] | The function (_log_consumer) is defined within the public class AsyncLoggerManager. It starts at line 150 and ends at line 168, contains 19 lines of code, and has a cyclomatic complexity of 6. It takes only self as a parameter and does not return any value. It calls 10 functions: ["self.queue.get", "self.queue.qsize", "self.logger.debug", "self._consume_create", "self._consume_update", "Exception", "set_exception", "self.logger.error", "self.queue.task_done", "self._notify_callbacks"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | log_create | def log_create(self,project: str,context: str,params: dict,entries: dict,) -> asyncio.Future:if self.shutting_down:self.logger.warning(f"Not running, skipping log create")return Nonefut = self.loop.create_future()event = {"_data": {"project": project,"context": context,"params": params,"entries": entries,},"type": "create","future": fut,}asyncio.run_coroutine_threadsafe(self.queue.put(event), self.loop).result()return fut | 2 | 23 | 5 | 107 | 0 | 170 | 192 | 170 | self,project,context,params,entries | [] | asyncio.Future | {"Assign": 2, "Expr": 2, "If": 1, "Return": 2} | 5 | 23 | 5 | ["self.logger.warning", "self.loop.create_future", "result", "asyncio.run_coroutine_threadsafe", "self.queue.put"] | 0 | [] | The function (log_create) is defined within the public class AsyncLoggerManager. It starts at line 170 and ends at line 192, contains 23 lines of code, and has a cyclomatic complexity of 2. It takes 5 parameters (self, project, context, params, entries) and returns an asyncio.Future. It calls 5 functions: ["self.logger.warning", "self.loop.create_future", "result", "asyncio.run_coroutine_threadsafe", "self.queue.put"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | log_update | def log_update(self,project: str,context: str,future: asyncio.Future,mode: str,overwrite: bool,data: dict,) -> None:if self.shutting_down:self.logger.warning(f"Not running, skipping log update")returnevent = {"_data": {mode: data,"project": project,"context": context,"overwrite": overwrite,},"type": "update","future": future,}asyncio.run_coroutine_threadsafe(self.queue.put(event), self.loop).result() | 2 | 23 | 7 | 103 | 0 | 194 | 216 | 194 | self,project,context,future,mode,overwrite,data | [] | None | {"Assign": 1, "Expr": 2, "If": 1, "Return": 1} | 4 | 23 | 4 | ["self.logger.warning", "result", "asyncio.run_coroutine_threadsafe", "self.queue.put"] | 0 | [] | The function (log_update) is defined within the public class AsyncLoggerManager. It starts at line 194 and ends at line 216, contains 23 lines of code, and has a cyclomatic complexity of 2. It takes 7 parameters (self, project, context, future, mode, overwrite, data) and does not return any value. It calls 4 functions: ["self.logger.warning", "result", "asyncio.run_coroutine_threadsafe", "self.queue.put"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | clear_queue | def clear_queue(self):if self.queue is None:returnorig_size = self.queue.qsize()while not self.queue.empty():self.queue.get_nowait()self.queue.task_done()self.logger.debug(f"{orig_size} log requests cleared") | 3 | 8 | 1 | 55 | 0 | 218 | 226 | 218 | self | [] | None | {"Assign": 1, "Expr": 3, "If": 1, "Return": 1, "While": 1} | 5 | 9 | 5 | ["self.queue.qsize", "self.queue.empty", "self.queue.get_nowait", "self.queue.task_done", "self.logger.debug"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.77620428_lqhuang_mode_ng.tests.unit.utils.test_queues_py.test_ThrowableQueue.test_clear__cancels_waiting_putter"] | The function (clear_queue) is defined within the public class AsyncLoggerManager. It starts at line 218 and ends at line 226, contains 8 lines of code, and has a cyclomatic complexity of 3. It takes only self as a parameter and does not return any value. It calls 5 functions: ["self.queue.qsize", "self.queue.empty", "self.queue.get_nowait", "self.queue.task_done", "self.logger.debug"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.77620428_lqhuang_mode_ng.tests.unit.utils.test_queues_py.test_ThrowableQueue.test_clear__cancels_waiting_putter"]. |
unifyai_unify | AsyncLoggerManager | public | 0 | 0 | stop_sync | def stop_sync(self, immediate=False):if self.shutting_down:self.logger.debug(f"Already shutting down, skipping stop")returnself.shutting_down = Trueif immediate:self.logger.debug(f"Shutting down immediately")self.loop.stop()else:self.logger.debug(f"Shutting down gracefully")self.join() | 3 | 11 | 2 | 64 | 0 | 228 | 239 | 228 | self,immediate | [] | None | {"Assign": 1, "Expr": 5, "If": 2, "Return": 1} | 5 | 12 | 5 | ["self.logger.debug", "self.logger.debug", "self.loop.stop", "self.logger.debug", "self.join"] | 0 | [] | The function (stop_sync) is defined within the public class AsyncLoggerManager. It starts at line 228 and ends at line 239, contains 11 lines of code, and has a cyclomatic complexity of 3. It takes 2 parameters (self, immediate) and does not return any value. It calls 5 functions: ["self.logger.debug", "self.logger.debug", "self.loop.stop", "self.logger.debug", "self.join"]. |
unifyai_unify | public | public | 0 | 0 | get_param_by_version | def get_param_by_version(field: str,version: Union[str, int],api_key: Optional[str] = None,) -> Any:"""Gets the parameter by version.Args:field: The field of the parameter to get.version: The version of the parameter to get.api_key: If specified, unify API key to be used. Defaults to the value in the `UNIFY_KEY` environment variable.Returns:The parameter by version."""version = str(version)filter_exp = f"version({field}) == {version}"return get_logs(filter=filter_exp, limit=1, api_key=api_key)[0].params[field][1] | 1 | 8 | 3 | 65 | 2 | 11 | 32 | 11 | field,version,api_key | ['version', 'filter_exp'] | Any | {"Assign": 2, "Expr": 1, "Return": 1} | 2 | 22 | 2 | ["str", "get_logs"] | 0 | [] | The function (get_param_by_version) is a public module-level function. It starts at line 11 and ends at line 32, contains 8 lines of code, and has a cyclomatic complexity of 1. It takes 3 parameters (field, version, api_key) and returns a value of type Any. It calls 2 functions: ["str", "get_logs"]. |
unifyai_unify | public | public | 0 | 0 | get_param_by_value | def get_param_by_value(field: str,value: Any,api_key: Optional[str] = None,) -> Any:"""Gets the parameter by value.Args:field: The field of the parameter to get.value: The value of the parameter to get.api_key: If specified, unify API key to be used. Defaults to the value in the `UNIFY_KEY` environment variable.Returns:The parameter by value."""filter_exp = f"{field} == {json.dumps(value)}"return get_logs(filter=filter_exp, limit=1, api_key=api_key)[0].params[field][0] | 1 | 7 | 3 | 54 | 1 | 35 | 55 | 35 | field,value,api_key | ['filter_exp'] | Any | {"Assign": 1, "Expr": 1, "Return": 1} | 2 | 21 | 2 | ["json.dumps", "get_logs"] | 0 | [] | The function (get_param_by_value) is a public module-level function. It starts at line 35 and ends at line 55, contains 7 lines of code, and has a cyclomatic complexity of 1. It takes 3 parameters (field, value, api_key) and returns a value of type Any. It calls 2 functions: ["json.dumps", "get_logs"]. |
unifyai_unify | public | public | 0 | 0 | get_source | def get_source() -> str:"""Extracts the source code for the file from where this function was called.Returns:The source code for the file, as a string."""frame = inspect.getouterframes(inspect.currentframe())[1]with open(frame.filename, "r") as file:source = file.read()return f"```python\n{source}\n```" | 1 | 5 | 0 | 44 | 2 | 58 | 68 | 58 | ['frame', 'source'] | str | {"Assign": 2, "Expr": 1, "Return": 1, "With": 1} | 4 | 11 | 4 | ["inspect.getouterframes", "inspect.currentframe", "open", "file.read"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3659819_nic0_tyrs.src.tyrs.widget_py.StatusWidget.get_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3931673_hyde_hyde.hyde.ext.templates.jinja_py.HydeLoader.get_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963857_anime_dl_anime_downloader.anime_downloader.extractors.rapidvideo_py.RapidVideo._get_data"] | The function (get_source) is a public module-level function. It starts at line 58 and ends at line 68, contains 5 lines of code, and has a cyclomatic complexity of 1. It takes no parameters and returns a str. It calls 4 functions: ["inspect.getouterframes", "inspect.currentframe", "open", "file.read"]. It has 3 functions calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3659819_nic0_tyrs.src.tyrs.widget_py.StatusWidget.get_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3931673_hyde_hyde.hyde.ext.templates.jinja_py.HydeLoader.get_source", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963857_anime_dl_anime_downloader.anime_downloader.extractors.rapidvideo_py.RapidVideo._get_data"]. | |
unifyai_unify | public | public | 0 | 0 | get_experiment_name | def get_experiment_name(version: int, api_key: Optional[str] = None) -> str:"""Gets the experiment name (by version).Args:version: The version of the experiment to get.api_key: If specified, unify API key to be used. Defaults to the value in the `UNIFY_KEY` environment variable.Returns:The experiment name with said version."""experiments = get_groups(key="experiment", api_key=api_key)if not experiments:return Noneelif version < 0:version = len(experiments) + versionif str(version) not in experiments:return Nonereturn experiments[str(version)] | 4 | 9 | 2 | 69 | 2 | 75 | 95 | 75 | version,api_key | ['experiments', 'version'] | str | {"Assign": 2, "Expr": 1, "If": 3, "Return": 3} | 4 | 21 | 4 | ["get_groups", "len", "str", "str"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Experiment.__init__"] | The function (get_experiment_name) is a public module-level function. It starts at line 75 and ends at line 95, contains 9 lines of code, and has a cyclomatic complexity of 4. It takes 2 parameters (version, api_key) and returns a str. It calls 4 functions: ["get_groups", "len", "str", "str"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Experiment.__init__"]. |
unifyai_unify | public | public | 0 | 0 | get_experiment_version | def get_experiment_version(name: str, api_key: Optional[str] = None) -> int:"""Gets the experiment version (by name).Args:name: The name of the experiment to get.api_key: If specified, unify API key to be used. Defaults to the value in the `UNIFY_KEY` environment variable.Returns:The experiment version with said name."""experiments = get_groups(key="experiment", api_key=api_key)if not experiments:return Noneexperiments = {v: k for k, v in experiments.items()}if name not in experiments:return Nonereturn int(experiments[name]) | 4 | 8 | 2 | 70 | 1 | 98 | 117 | 98 | name,api_key | ['experiments'] | int | {"Assign": 2, "Expr": 1, "If": 2, "Return": 3} | 3 | 20 | 3 | ["get_groups", "experiments.items", "int"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Experiment.__init__"] | The function (get_experiment_version) is a public module-level function. It starts at line 98 and ends at line 117, contains 8 lines of code, and has a cyclomatic complexity of 4. It takes 2 parameters (name, api_key) and returns an int. It calls 3 functions: ["get_groups", "experiments.items", "int"]. It has 1 function calling it: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Experiment.__init__"]. |
unifyai_unify | public | public | 0 | 0 | create_context | def create_context(name: str,description: str = None,is_versioned: bool = True,allow_duplicates: bool = True,unique_keys: Optional[Dict[str, str]] = None,auto_counting: Optional[Dict[str, Optional[str]]] = None,*,project: Optional[str] = None,api_key: Optional[str] = None,) -> None:"""Create a context.Args:name: Name of the context to create.description: Description of the context to create.is_versioned: Whether the context is tracked via version control.allow_duplicates: Whether to allow duplicates in the context.unique_keys: Unique key definition. Keys are column names, values are types('str', 'int', 'float', 'bool', 'datetime', 'time', 'date', 'timedelta', 'dict', 'list').Default is None.auto_counting: Auto-counting configuration. Keys are column names to auto-increment,values are parent counter names (None for independent counters). Default is None.project: Name of the project the context belongs to.api_key: If specified, unify API key to be used. 
Defaults to the value in the`UNIFY_KEY` environment variable.Returns:A message indicating whether the context was successfully created."""project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)headers = _create_request_header(api_key)body = {"name": name,"description": description,"is_versioned": is_versioned,"allow_duplicates": allow_duplicates,"unique_keys": unique_keys,"auto_counting": auto_counting,}response = http.post(BASE_URL + f"/project/{project}/contexts",headers=headers,json=body,)return response.json() | 1 | 31 | 8 | 155 | 4 | 13 | 70 | 13 | name,description,is_versioned,allow_duplicates,unique_keys,auto_counting,project,api_key | ['headers', 'body', 'response', 'project'] | None | {"Assign": 4, "Expr": 1, "Return": 1} | 4 | 58 | 4 | ["_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"] | 123 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.amazon.aws.operators.test_redshift_sql_py.TestRedshiftSQLOperatorAsync.test_redshiftsql_op_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.amazon.aws.operators.test_redshift_sql_py.TestRedshiftSQLOperatorAsync.test_redshiftsql_op_async_aborted_before_deferred", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.amazon.aws.operators.test_redshift_sql_py.TestRedshiftSQLOperatorAsync.test_redshiftsql_op_async_failed_before_deferred", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.amazon.aws.operators.test_redshift_sql_py.TestRedshiftSQLOperatorAsync.test_redshiftsql_op_async_finished_before_deferred", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.google.cloud.extractors.test_bigquery_py.TestBigQueryAsyncExtractor.test_extract_on_complete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.google.cloud.extractors.test_bigquery_py.TestBigQueryAsyncExtractor.test_unavailable_xcom_raises_exception", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.snowflake.operators.test_snowflake_py.TestSnowflakeOperatorAsync.test_snowflake_execute_operator_async_deffered", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.snowflake.operators.test_snowflake_py.TestSnowflakeOperatorAsync.test_snowflake_execute_operator_async_finish_before_deferred", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.snowflake.sensors.test_snowflake_py.TestPytestSnowflakeSensorAsync.test_snowflake_execute_operator_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperator.test_kubernetes_pod_operator_active_deadline_seconds", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_already_checked_on_failure", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_already_checked_on_success", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_changing_base_container_name_failure", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_changing_base_container_name_no_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_changing_base_container_name_no_logs_long", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_changing_base_container_name_with_get_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_config_path_move", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_delete_operator_pod", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_disable_privilege_escalation", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_env_vars", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_faulty_image", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_faulty_service_account", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_fs_group", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_full_pod_spec", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_init_container"] | The function (create_context) is a public module-level function. It starts at line 13 and ends at line 70, contains 31 lines of code, and has a cyclomatic complexity of 1. It takes 8 parameters (name, description, is_versioned, allow_duplicates, unique_keys, auto_counting, project, api_key) and does not return any value. 
It calls 4 functions: ["_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"]. It has 123 functions calling it, which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.amazon.aws.operators.test_redshift_sql_py.TestRedshiftSQLOperatorAsync.test_redshiftsql_op_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.amazon.aws.operators.test_redshift_sql_py.TestRedshiftSQLOperatorAsync.test_redshiftsql_op_async_aborted_before_deferred", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.amazon.aws.operators.test_redshift_sql_py.TestRedshiftSQLOperatorAsync.test_redshiftsql_op_async_failed_before_deferred", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.amazon.aws.operators.test_redshift_sql_py.TestRedshiftSQLOperatorAsync.test_redshiftsql_op_async_finished_before_deferred", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.google.cloud.extractors.test_bigquery_py.TestBigQueryAsyncExtractor.test_extract_on_complete", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.google.cloud.extractors.test_bigquery_py.TestBigQueryAsyncExtractor.test_unavailable_xcom_raises_exception", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.snowflake.operators.test_snowflake_py.TestSnowflakeOperatorAsync.test_snowflake_execute_operator_async_deffered", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.snowflake.operators.test_snowflake_py.TestSnowflakeOperatorAsync.test_snowflake_execute_operator_async_finish_before_deferred", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69484938_astronomer_astronomer_providers.tests.snowflake.sensors.test_snowflake_py.TestPytestSnowflakeSensorAsync.test_snowflake_execute_operator_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperator.test_kubernetes_pod_operator_active_deadline_seconds", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_already_checked_on_failure", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_already_checked_on_success", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_changing_base_container_name_failure", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_changing_base_container_name_no_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_changing_base_container_name_no_logs_long", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_changing_base_container_name_with_get_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_config_path_move", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_delete_operator_pod", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_disable_privilege_escalation", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_env_vars", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_faulty_image", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_faulty_service_account", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_fs_group", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_full_pod_spec", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.kubernetes_tests.tests.kubernetes_tests.test_kubernetes_pod_operator_py.TestKubernetesPodOperatorSystem.test_init_container"]. |
unifyai_unify | public | public | 0 | 0 | create_contexts | def create_contexts(contexts: List[Union[Dict[str, str], str]],*,project: Optional[str] = None,api_key: Optional[str] = None,) -> None:"""Create multiple contexts.Args:contexts: List of contexts to create. Each context can be a list of context names or a dictionary with the following keys, only the name is required:- name: Name of the context.- description: Description of the context.- is_versioned: Whether the context is tracked via version control.- allow_duplicates: Whether to allow duplicates in the context.- unique_keys: Unique key definition. Keys are column names, values are types('str', 'int', 'float', 'bool', 'datetime', 'time', 'date', 'timedelta', 'dict', 'list').- auto_counting: Auto-counting configuration. Keys are column names to auto-increment,values are parent counter names (None for independent counters).project: Name of the project the contexts belong to.api_key: If specified, unify API key to be used. Defaults to the value in the `UNIFY_KEY` environment variable.Returns:A message indicating whether the contexts were successfully created."""project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)headers = _create_request_header(api_key)response = http.post(BASE_URL + f"/project/{project}/contexts",headers=headers,json=contexts,)return response.json() | 1 | 18 | 3 | 91 | 3 | 73 | 112 | 73 | contexts,project,api_key | ['headers', 'response', 'project'] | None | {"Assign": 3, "Expr": 1, "Return": 1} | 4 | 40 | 4 | ["_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"] | 0 | [] | The function (create_contexts) is a public module-level function. It starts at line 73 and ends at line 112, contains 18 lines of code, and has a cyclomatic complexity of 1. It takes 3 parameters (contexts, project, api_key) and does not return any value. It calls 4 functions: ["_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"]. |
unifyai_unify | public | public | 0 | 0 | rename_context | def rename_context(name: str,new_name: str,*,project: Optional[str] = None,api_key: Optional[str] = None,) -> None:"""Rename a context.Args:name: Name of the context to rename.new_name: New name of the context.project: Name of the project the context belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the `UNIFY_KEY` environment variable."""project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)headers = _create_request_header(api_key)response = http.patch(BASE_URL + f"/project/{project}/contexts/{name}/rename",headers=headers,json={"name": new_name},)return response.json() | 1 | 19 | 4 | 86 | 3 | 115 | 146 | 115 | name,new_name,project,api_key | ['headers', 'response', 'project'] | None | {"Assign": 3, "Expr": 1, "Return": 1} | 4 | 32 | 4 | ["_get_and_maybe_create_project", "_create_request_header", "http.patch", "response.json"] | 0 | [] | The function (rename_context) is a public module-level function. It starts at line 115 and ends at line 146, contains 19 lines of code, and has a cyclomatic complexity of 1. It takes 4 parameters (name, new_name, project, api_key) and does not return any value. It calls 4 functions: ["_get_and_maybe_create_project", "_create_request_header", "http.patch", "response.json"]. |
unifyai_unify | public | public | 0 | 0 | get_context | def get_context(name: str,*,project: Optional[str] = None,api_key: Optional[str] = None,) -> Dict[str, str]:"""Get information about a specific context including its versioning status and current version.Args:name: Name of the context to get.project: Name of the project the context belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable."""project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)headers = _create_request_header(api_key)response = http.get(BASE_URL + f"/project/{project}/contexts/{name}",headers=headers,)return response.json() | 1 | 17 | 3 | 79 | 3 | 149 | 176 | 149 | name,project,api_key | ['headers', 'response', 'project'] | Dict[str, str] | {"Assign": 3, "Expr": 1, "Return": 1} | 4 | 28 | 4 | ["_get_and_maybe_create_project", "_create_request_header", "http.get", "response.json"] | 39 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libcmds_py.SetupHome._setup_gitconfig", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libcmds_py.SetupHome.execute", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.download_buildtools", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.get_build_environ", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.get_buildtools_dir", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.get_buildtools_filename", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.get_buildtools_url", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.run_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.run_cmd_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.ssh_cleanup_agent", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.ssh_setup_agent", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.clean_py.CleanAll.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.clean_py.CleanSstate.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.clean_py.Purge.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.dump_py.Dump.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.lock_py.Lock.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.GitRepo.checkout_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.GitRepo.clone_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.GitRepo.fetch_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.MercurialRepo.checkout_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.Repo.factory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.Repo.keyhandler", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.Repo.signers_type", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.RepoImpl.apply_patches_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.RepoImpl.checkout"] | The function (get_context) is defined within the public class called public. The function starts at line 149 and ends at 176. It contains 17 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (name, project, api_key) and returns a Dict[str, str]. It makes 4 function calls: ["_get_and_maybe_create_project", "_create_request_header", "http.get", "response.json"], and it is called by 39 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libcmds_py.SetupHome._setup_gitconfig", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libcmds_py.SetupHome.execute", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.download_buildtools", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.get_build_environ", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.get_buildtools_dir", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.get_buildtools_filename", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.get_buildtools_url", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.run_cmd", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.run_cmd_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.ssh_cleanup_agent", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.libkas_py.ssh_setup_agent", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.clean_py.CleanAll.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.clean_py.CleanSstate.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.clean_py.Purge.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.dump_py.Dump.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.plugins.lock_py.Lock.run", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.GitRepo.checkout_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.GitRepo.clone_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.GitRepo.fetch_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.MercurialRepo.checkout_cmd", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.Repo.factory", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.Repo.keyhandler", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.Repo.signers_type", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.RepoImpl.apply_patches_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3984122_siemens_kas.kas.repos_py.RepoImpl.checkout"]. |
unifyai_unify | public | public | 0 | 0 | get_contexts | def get_contexts(project: Optional[str] = None,*,prefix: Optional[str] = None,api_key: Optional[str] = None,) -> Dict[str, str]:"""Gets all contexts associated with a project, with the corresponding prefix.Args:prefix: Prefix of the contexts to get.project: Name of the project the artifacts belong to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable.kwargs: Dictionary containing one or more key:value pairs that will be storedas artifacts.Returns:A message indicating whether the artifacts were successfully added."""headers = _create_request_header(api_key)project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)response = http.get(BASE_URL + f"/project/{project}/contexts",headers=headers,)contexts = response.json()contexts = {context["name"]: context["description"] for context in contexts}if prefix:contexts = {context: descriptionfor context, description in contexts.items()if context.startswith(prefix)}return contexts | 5 | 25 | 3 | 131 | 4 | 179 | 220 | 179 | project,prefix,api_key | ['headers', 'contexts', 'response', 'project'] | Dict[str, str] | {"Assign": 6, "Expr": 1, "If": 1, "Return": 1} | 6 | 42 | 6 | ["_create_request_header", "_get_and_maybe_create_project", "http.get", "response.json", "contexts.items", "context.startswith"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.contexts_py.delete_context", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.list_datasets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.initialize_cache"] | The function (get_contexts) defined within the public class called public.The function start at line 179 and ends at 220. 
It contains 25 lines of code and has a cyclomatic complexity of 5. It takes 3 parameters (project, prefix, api_key) and returns a Dict[str, str]. It makes 6 function calls: ["_create_request_header", "_get_and_maybe_create_project", "http.get", "response.json", "contexts.items", "context.startswith"], and it is called by 3 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.contexts_py.delete_context", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.list_datasets", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.utils.caching.remote_cache_py.RemoteCache.initialize_cache"]. |
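The prefix handling in get_contexts reduces to a single dict comprehension over the name-to-description mapping returned by the server. A standalone sketch of just that filtering step (`filter_contexts` is a hypothetical name, not part of the SDK):

```python
from typing import Dict, Optional


def filter_contexts(contexts: Dict[str, str], prefix: Optional[str] = None) -> Dict[str, str]:
    # Mirrors the comprehension in get_contexts: keep only names starting with prefix.
    if prefix:
        return {name: desc for name, desc in contexts.items() if name.startswith(prefix)}
    return contexts
```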
unifyai_unify | public | public | 0 | 0 | delete_context | def delete_context(name: str,*,delete_children: bool = True,project: Optional[str] = None,api_key: Optional[str] = None,) -> None:"""Delete a context from the server.Args:name: Name of the context to delete.delete_children: Whether to delete child contexts (which share the same "/" separated prefix).project: Name of the project the context belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable."""project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)headers = _create_request_header(api_key)# ToDo: remove this hack once this task [https://app.clickup.com/t/86c3kuch6] is doneall_contexts = get_contexts(project, prefix=name)for ctx in all_contexts:response = http.delete(BASE_URL + f"/project/{project}/contexts/{ctx}",headers=headers,)if all_contexts:return response.json() | 3 | 21 | 4 | 98 | 4 | 223 | 258 | 223 | name,delete_children,project,api_key | ['headers', 'all_contexts', 'response', 'project'] | None | {"Assign": 4, "Expr": 1, "For": 1, "If": 1, "Return": 1} | 5 | 36 | 5 | ["_get_and_maybe_create_project", "_create_request_header", "get_contexts", "http.delete", "response.json"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.delete_dataset"] | The function (delete_context) defined within the public class called public.The function start at line 223 and ends at 258. It contains 21 lines of code and it has a cyclomatic complexity of 3. It takes 4 parameters, represented as [223.0] and does not return any value. 
It makes 5 function calls: ["_get_and_maybe_create_project", "_create_request_header", "get_contexts", "http.delete", "response.json"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.datasets_py.delete_dataset"]. |
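Because of the ToDo hack noted in the delete_context body, the function currently deletes every context returned by get_contexts(project, prefix=name), i.e. the target context plus all "/"-separated children, and the delete_children flag is never read. A sketch of which names get selected (`contexts_to_delete` is a hypothetical helper):

```python
from typing import List


def contexts_to_delete(all_names: List[str], target: str) -> List[str]:
    # Prefix match, as in the current delete_context implementation:
    # the target itself and any "/"-separated child contexts are selected.
    return [name for name in all_names if name.startswith(target)]
```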
unifyai_unify | public | public | 0 | 0 | add_logs_to_context | def add_logs_to_context(log_ids: List[int],*,context: Optional[str] = None,project: Optional[str] = None,api_key: Optional[str] = None,) -> None:"""Add logs to a context.Args:log_ids: List of log ids to add to the context.context: Name of the context to add the logs to.project: Name of the project the logs belong to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable.Returns:A message indicating whether the logs were successfully added to the context."""context = context if context else CONTEXT_WRITE.get()project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)headers = _create_request_header(api_key)body = {"context_name": context,"log_ids": log_ids,}response = http.post(BASE_URL + f"/project/{project}/contexts/add_logs",headers=headers,json=body,)return response.json() | 2 | 24 | 4 | 113 | 5 | 261 | 300 | 261 | log_ids,context,project,api_key | ['body', 'headers', 'context', 'project', 'response'] | None | {"Assign": 5, "Expr": 1, "Return": 1} | 5 | 40 | 5 | ["CONTEXT_WRITE.get", "_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"] | 0 | [] | The function (add_logs_to_context) defined within the public class called public.The function start at line 261 and ends at 300. It contains 24 lines of code and it has a cyclomatic complexity of 2. It takes 4 parameters, represented as [261.0] and does not return any value. It declares 5.0 functions, and It has 5.0 functions called inside which are ["CONTEXT_WRITE.get", "_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"]. |
unifyai_unify | public | public | 0 | 0 | commit_context | def commit_context(name: str,commit_message: str,*,project: Optional[str] = None,api_key: Optional[str] = None,) -> Dict[str, str]:"""Creates a commit for a single context.Args:name: Name of the context to commit.commit_message: A description of the changes being saved.project: Name of the project the context belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable.Returns:A dictionary containing the new commit_hash."""project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)headers = _create_request_header(api_key)body = {"commit_message": commit_message}response = http.post(BASE_URL + f"/project/{project}/contexts/{name}/commit",headers=headers,json=body,)return response.json() | 1 | 20 | 4 | 94 | 4 | 303 | 335 | 303 | name,commit_message,project,api_key | ['headers', 'body', 'response', 'project'] | Dict[str, str] | {"Assign": 4, "Expr": 1, "Return": 1} | 4 | 33 | 4 | ["_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"] | 0 | [] | The function (commit_context) defined within the public class called public.The function start at line 303 and ends at 335. It contains 20 lines of code and it has a cyclomatic complexity of 1. It takes 4 parameters, represented as [303.0] and does not return any value. It declares 4.0 functions, and It has 4.0 functions called inside which are ["_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"]. |
unifyai_unify | public | public | 0 | 0 | rollback_context | def rollback_context(name: str,commit_hash: str,*,project: Optional[str] = None,api_key: Optional[str] = None,) -> Dict[str, str]:"""Rolls back a single context to a specific commit.Args:name: Name of the context to roll back.commit_hash: The hash of the commit to restore.project: Name of the project the context belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable.Returns:A message indicating the success of the rollback operation."""project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)headers = _create_request_header(api_key)body = {"commit_hash": commit_hash}response = http.post(BASE_URL + f"/project/{project}/contexts/{name}/rollback",headers=headers,json=body,)return response.json() | 1 | 20 | 4 | 94 | 4 | 338 | 370 | 338 | name,commit_hash,project,api_key | ['headers', 'body', 'response', 'project'] | Dict[str, str] | {"Assign": 4, "Expr": 1, "Return": 1} | 4 | 33 | 4 | ["_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"] | 0 | [] | The function (rollback_context) defined within the public class called public.The function start at line 338 and ends at 370. It contains 20 lines of code and it has a cyclomatic complexity of 1. It takes 4 parameters, represented as [338.0] and does not return any value. It declares 4.0 functions, and It has 4.0 functions called inside which are ["_get_and_maybe_create_project", "_create_request_header", "http.post", "response.json"]. |
unifyai_unify | public | public | 0 | 0 | get_context_commits | def get_context_commits(name: str,*,project: Optional[str] = None,api_key: Optional[str] = None,) -> List[Dict]:"""Retrieves the commit history for a context.Args:name: Name of the context.project: Name of the project the context belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable.Returns:A list of dictionaries, each representing a commit."""project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)headers = _create_request_header(api_key)response = http.get(BASE_URL + f"/project/{project}/contexts/{name}/commits",headers=headers,)return response.json() | 1 | 17 | 3 | 77 | 3 | 373 | 401 | 373 | name,project,api_key | ['headers', 'response', 'project'] | List[Dict] | {"Assign": 3, "Expr": 1, "Return": 1} | 4 | 29 | 4 | ["_get_and_maybe_create_project", "_create_request_header", "http.get", "response.json"] | 0 | [] | The function (get_context_commits) defined within the public class called public.The function start at line 373 and ends at 401. It contains 17 lines of code and it has a cyclomatic complexity of 1. It takes 3 parameters, represented as [373.0] and does not return any value. It declares 4.0 functions, and It has 4.0 functions called inside which are ["_get_and_maybe_create_project", "_create_request_header", "http.get", "response.json"]. |
unifyai_unify | public | public | 0 | 0 | list_datasets | def list_datasets(*,project: Optional[str] = None,prefix: str = "",api_key: Optional[str] = None,) -> Dict[str, str]:"""List all datasets associated with a project and context.Args:project: Name of the project the datasets belong to.prefix: Prefix of the datasets to get.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable.Returns:A list of datasets."""contexts = get_contexts(prefix=f"Datasets/{prefix}",project=project,api_key=api_key,)return {"/".join(name.split("/")[1:]): descriptionfor name, description in contexts.items()} | 2 | 15 | 3 | 86 | 1 | 12 | 40 | 12 | project,prefix,api_key | ['contexts'] | Dict[str, str] | {"Assign": 1, "Expr": 1, "Return": 1} | 4 | 29 | 4 | ["get_contexts", "join", "name.split", "contexts.items"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.google.src.airflow.providers.google.cloud.hooks.bigquery_py.BigQueryHook.get_datasets_list"] | The function (list_datasets) defined within the public class called public.The function start at line 12 and ends at 40. It contains 15 lines of code and it has a cyclomatic complexity of 2. It takes 3 parameters, represented as [12.0] and does not return any value. It declares 4.0 functions, It has 4.0 functions called inside which are ["get_contexts", "join", "name.split", "contexts.items"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.google.src.airflow.providers.google.cloud.hooks.bigquery_py.BigQueryHook.get_datasets_list"]. |
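list_datasets strips the leading "Datasets/" component from each context name via split/join, preserving any nested remainder of the name. The transformation in isolation (`strip_dataset_prefix` is a hypothetical name for illustration):

```python
def strip_dataset_prefix(context_name: str) -> str:
    # "Datasets/my_set" -> "my_set"; deeper names keep the remainder intact.
    return "/".join(context_name.split("/")[1:])
```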
unifyai_unify | public | public | 0 | 0 | upload_dataset | def upload_dataset(name: str,data: List[Any],*,overwrite: bool = False,allow_duplicates: bool = False,project: Optional[str] = None,api_key: Optional[str] = None,) -> List[int]:"""Upload a dataset to the server.Args:name: Name of the dataset.data: Contents of the dataset.overwrite: Whether to overwrite the dataset if it already exists.allow_duplicates: Whether to allow duplicates in the dataset.project: Name of the project the dataset belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable.Returns:A list all log ids in the dataset."""api_key = _validate_api_key(api_key)project = _get_and_maybe_create_project(project, api_key=api_key)context = f"Datasets/{name}"log_instances = [isinstance(item, unify.Log) for item in data]are_logs = Falseif not allow_duplicates and not overwrite:# ToDo: remove this verbose logic once ignore_duplicates is implementedif name in unify.list_datasets():upstream_dataset = unify.Dataset(unify.download_dataset(name, project=project, api_key=api_key),)else:upstream_dataset = unify.Dataset([])if any(log_instances):assert all(log_instances), "If any items are logs, all items must be logs"are_logs = True# ToDo: remove this verbose logic once ignore_duplicates is implementedif not allow_duplicates and not overwrite:data = [l for l in data if l not in upstream_dataset]elif not all(isinstance(item, dict) for item in data):# ToDo: remove this verbose logic once ignore_duplicates is implementedif not allow_duplicates and not overwrite:data = [item for item in data if item not in upstream_dataset]data = [{"data": item} for item in data]if name in unify.list_datasets():upstream_ids = get_logs(project=project,context=context,return_ids_only=True,)else:upstream_ids = []if not are_logs:return upstream_ids + create_logs(project=project,context=context,entries=data,mutable=True,batched=True,# ToDo: uncomment once ignore_duplicates is 
implemented# ignore_duplicates=not allow_duplicates,)local_ids = [l.id for l in data]matching_ids = [id for id in upstream_ids if id in local_ids]matching_data = [l.entries for l in data if l.id in matching_ids]assert len(matching_data) == len(matching_ids,), "matching data and ids must be the same length"if matching_data:update_logs(logs=matching_ids,api_key=api_key,entries=matching_data,overwrite=True,)if overwrite:upstream_only_ids = [id for id in upstream_ids if id not in local_ids]if upstream_only_ids:delete_logs(logs=upstream_only_ids,context=context,project=project,api_key=api_key,)upstream_ids = [id for id in upstream_ids if id not in upstream_only_ids]ids_not_in_dataset = [id for id in local_ids if id not in matching_ids and id is not None]if ids_not_in_dataset:if context not in unify.get_contexts():unify.create_context(context,project=project,api_key=api_key,)unify.add_logs_to_context(log_ids=ids_not_in_dataset,context=context,project=project,api_key=api_key,)local_only_data = [l.entries for l in data if l.id is None]if local_only_data:return upstream_ids + create_logs(project=project,context=context,entries=local_only_data,mutable=True,batched=True,)return upstream_ids + ids_not_in_dataset | 39 | 95 | 6 | 537 | 14 | 43 | 161 | 43 | name,data,overwrite,allow_duplicates,project,api_key | ['are_logs', 'upstream_dataset', 'upstream_ids', 'upstream_only_ids', 'local_ids', 'log_instances', 'local_only_data', 'context', 'matching_ids', 'api_key', 'project', 'ids_not_in_dataset', 'matching_data', 'data'] | List[int] | {"Assign": 20, "Expr": 5, "If": 14, "Return": 3} | 22 | 119 | 22 | ["_validate_api_key", "_get_and_maybe_create_project", "isinstance", "unify.list_datasets", "unify.Dataset", "unify.download_dataset", "unify.Dataset", "any", "all", "all", "isinstance", "unify.list_datasets", "get_logs", "create_logs", "len", "len", "update_logs", "delete_logs", "unify.get_contexts", "unify.create_context", "unify.add_logs_to_context", "create_logs"] | 0 | [] | The 
function (upload_dataset) is defined within the public class called public. The function starts at line 43 and ends at 161. It contains 95 lines of code and has a cyclomatic complexity of 39. It takes 6 parameters (name, data, overwrite, allow_duplicates, project, api_key) and returns a List[int]. It makes 22 function calls: ["_validate_api_key", "_get_and_maybe_create_project", "isinstance", "unify.list_datasets", "unify.Dataset", "unify.download_dataset", "unify.Dataset", "any", "all", "all", "isinstance", "unify.list_datasets", "get_logs", "create_logs", "len", "len", "update_logs", "delete_logs", "unify.get_contexts", "unify.create_context", "unify.add_logs_to_context", "create_logs"]. |
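When the data items are logs, upload_dataset partitions log ids into three buckets: ids present both locally and upstream (updated in place), ids present only upstream (deleted when overwrite=True), and local ids not yet in the dataset (attached to the context). A sketch of that partitioning, which follows the comprehensions in the body (`partition_ids` is a hypothetical helper):

```python
from typing import List, Optional, Tuple


def partition_ids(
    upstream_ids: List[int],
    local_ids: List[Optional[int]],
) -> Tuple[List[int], List[int], List[int]]:
    # matching: ids in both places (candidates for update_logs with overwrite=True)
    matching = [i for i in upstream_ids if i in local_ids]
    # upstream_only: ids only on the server (deleted when overwrite=True)
    upstream_only = [i for i in upstream_ids if i not in local_ids]
    # local_new: local ids not yet in the dataset (added via add_logs_to_context)
    local_new = [i for i in local_ids if i not in matching and i is not None]
    return matching, upstream_only, local_new
```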
unifyai_unify | public | public | 0 | 0 | download_dataset | def download_dataset(name: str,*,project: Optional[str] = None,api_key: Optional[str] = None,) -> List[Log]:"""Download a dataset from the server.Args:name: Name of the dataset.project: Name of the project the dataset belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable."""api_key = _validate_api_key(api_key)project = _get_and_maybe_create_project(project, api_key=api_key)logs = get_logs(project=project,context=f"Datasets/{name}",)return list(reversed(logs)) | 1 | 13 | 3 | 72 | 3 | 164 | 187 | 164 | name,project,api_key | ['logs', 'api_key', 'project'] | List[Log] | {"Assign": 3, "Expr": 1, "Return": 1} | 5 | 24 | 5 | ["_validate_api_key", "_get_and_maybe_create_project", "get_logs", "list", "reversed"] | 4 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3701896_chainer_chainer_chemistry.chainer_chemistry.datasets.molnet.molnet_py.get_grid_featurized_pdbbind_dirpath", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3701896_chainer_chainer_chemistry.chainer_chemistry.datasets.molnet.molnet_py.get_molnet_filepath", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69376580_substra_substra_tests.substratest.client_py.Client.download_opener", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.dataset.repository._gp_copula_2019_py.generate_gp_copula_dataset"] | The function (download_dataset) defined within the public class called public.The function start at line 164 and ends at 187. It contains 13 lines of code and it has a cyclomatic complexity of 1. It takes 3 parameters, represented as [164.0] and does not return any value. 
It makes 5 function calls: ["_validate_api_key", "_get_and_maybe_create_project", "get_logs", "list", "reversed"], and it is called by 4 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3701896_chainer_chainer_chemistry.chainer_chemistry.datasets.molnet.molnet_py.get_grid_featurized_pdbbind_dirpath", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3701896_chainer_chainer_chemistry.chainer_chemistry.datasets.molnet.molnet_py.get_molnet_filepath", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69376580_substra_substra_tests.substratest.client_py.Client.download_opener", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.70598532_awslabs_gluonts.src.gluonts.dataset.repository._gp_copula_2019_py.generate_gp_copula_dataset"]. |
unifyai_unify | public | public | 0 | 0 | delete_dataset | def delete_dataset(name: str,*,project: Optional[str] = None,api_key: Optional[str] = None,) -> None:"""Delete a dataset from the server.Args:name: Name of the dataset.project: Name of the project the dataset belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable."""api_key = _validate_api_key(api_key)project = _get_and_maybe_create_project(project, api_key=api_key)delete_context(f"Datasets/{name}", project=project, api_key=api_key) | 1 | 9 | 3 | 60 | 2 | 190 | 209 | 190 | name,project,api_key | ['api_key', 'project'] | None | {"Assign": 2, "Expr": 2} | 3 | 20 | 3 | ["_validate_api_key", "_get_and_maybe_create_project", "delete_context"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.google.src.airflow.providers.google.cloud.hooks.bigquery_py.BigQueryHook.delete_dataset"] | The function (delete_dataset) defined within the public class called public.The function start at line 190 and ends at 209. It contains 9 lines of code and it has a cyclomatic complexity of 1. It takes 3 parameters, represented as [190.0] and does not return any value. It declares 3.0 functions, It has 3.0 functions called inside which are ["_validate_api_key", "_get_and_maybe_create_project", "delete_context"], It has 1.0 function calling this function which is ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.94556628_apache_airflow.providers.google.src.airflow.providers.google.cloud.hooks.bigquery_py.BigQueryHook.delete_dataset"]. |
unifyai_unify | public | public | 0 | 0 | add_dataset_entries | def add_dataset_entries(name: str,data: List[Any],*,project: Optional[str] = None,api_key: Optional[str] = None,) -> List[int]:"""Adds entries to an existing dataset in the server.Args:name: Name of the dataset.contents: Contents to add to the dataset.project: Name of the project the dataset belongs to.api_key: If specified, unify API key to be used. Defaults to the value in the`UNIFY_KEY` environment variable.Returns:A list of the newly added dataset logs."""api_key = _validate_api_key(api_key)project = _get_and_maybe_create_project(project,api_key=api_key,create_if_missing=False,)if not all(isinstance(item, dict) for item in data):data = [{"data": item} for item in data]logs = create_logs(project=project,context=f"Datasets/{name}",entries=data,mutable=True,batched=True,)return logs | 4 | 23 | 4 | 119 | 4 | 212 | 249 | 212 | name,data,project,api_key | ['logs', 'data', 'api_key', 'project'] | List[int] | {"Assign": 4, "Expr": 1, "If": 1, "Return": 1} | 5 | 38 | 5 | ["_validate_api_key", "_get_and_maybe_create_project", "all", "isinstance", "create_logs"] | 0 | [] | The function (add_dataset_entries) defined within the public class called public.The function start at line 212 and ends at 249. It contains 23 lines of code and it has a cyclomatic complexity of 4. It takes 4 parameters, represented as [212.0] and does not return any value. It declares 5.0 functions, and It has 5.0 functions called inside which are ["_validate_api_key", "_get_and_maybe_create_project", "all", "isinstance", "create_logs"]. |
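add_dataset_entries normalizes entries so every item reaching create_logs is a dict: if any item is not a dict, all items are wrapped as {"data": item}. The check in isolation (`wrap_entries` is a hypothetical name):

```python
from typing import Any, Dict, List


def wrap_entries(data: List[Any]) -> List[Dict[str, Any]]:
    # Mirrors add_dataset_entries: wrap every item if any item is a non-dict.
    if not all(isinstance(item, dict) for item in data):
        return [{"data": item} for item in data]
    return data
```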
unifyai_unify | _TraceLogState | protected | 0 | 0 | __init__ | def __init__(self) -> None:self.lock = asyncio.Lock()self.processing = Falseself.pending_value: Dict[str, Any] | None = None | 1 | 4 | 1 | 35 | 0 | 105 | 108 | 105 | self | [] | None | {"AnnAssign": 1, "Assign": 2} | 1 | 4 | 1 | ["asyncio.Lock"] | 14,667 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"] | The function (__init__) defined within the protected class called _TraceLogState.The function start at line 105 and ends at 108. It contains 4 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declare 1.0 function, It has 1.0 function called inside which is ["asyncio.Lock"], It has 14667.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]. |
unifyai_unify | _TraceLogState | protected | 0 | 0 | __init__ | def __init__(self) -> None:self._states: dict[str, _TraceLogState] = {}self.stopped = Falseself._api_key = _validate_api_key(None)self._pending_submit_requests = 0atexit.register(self.shutdown, flush=True)signal.signal(signal.SIGINT, self._on_sigint)headers = _create_request_header(self._api_key)self._loop = asyncio.new_event_loop()self._loop.set_default_executor(ThreadPoolExecutor(thread_name_prefix="UnifyTraceLogger"),)self._client = aiohttp.ClientSession(loop=self._loop, headers=headers)self._thread = threading.Thread(name="UnifyTraceLogger",target=self._loop.run_forever,daemon=True,)self._thread.start() | 1 | 19 | 1 | 143 | 0 | 112 | 133 | 112 | self | [] | None | {"AnnAssign": 1, "Assign": 2} | 1 | 4 | 1 | ["asyncio.Lock"] | 14,667 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"] | The function (__init__) defined within the protected class called _TraceLogState.The function start at line 112 and ends at 133. It contains 19 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. 
It declare 1.0 function, It has 1.0 function called inside which is ["asyncio.Lock"], It has 14667.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.AllocationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ContentError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.ParameterValidationError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.15914487_netbox_community_pynetbox.pynetbox.core.query_py.RequestError.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.16851155_pbui_bobbit.src.bobbit.http_client_py.HTTPClient.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Command.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._commands_py.Group.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._context_py.Context.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._option_groups_py.OptionGroupMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Argument.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._params_py.Option.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._sections_py.SectionMixin.__init__", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup._util_py.FrozenSpaceMeta.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.AcceptBetween.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._core_py.RequireExactly.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints._support_py.ConstraintMixin.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.ConstraintViolated.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.constraints.exceptions_py.UnsatisfiableConstraint.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.cloup.formatting._formatter_py.HelpFormatter.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.18358916_janluke_cloup.tests.test_commands_py.test_group_command_class_is_used_to_create_subcommands", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.explorer_py.MainWindow.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.search_filter_py.QudFilterModel.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudObjTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudPopTreeView.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.26231153_trashmonks_qud_wiki.qbe.tree_view_py.QudTreeView.__init__"]. |
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | update_trace | def update_trace(self, log: unify.Log, trace: dict):metadata = {"id": log.id,"project": unify.active_project(),"context": log.context,}fut = asyncio.run_coroutine_threadsafe(self._update_log(metadata, trace),self._loop,)self._pending_submit_requests += 1fut.add_done_callback(self._update_log_callback) | 1 | 12 | 3 | 72 | 0 | 135 | 146 | 135 | self,log,trace | [] | None | {"Assign": 2, "AugAssign": 1, "Expr": 1} | 4 | 12 | 4 | ["unify.active_project", "asyncio.run_coroutine_threadsafe", "self._update_log", "fut.add_done_callback"] | 4 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__exit__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._create_span", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._finalize_span"] | The function (update_trace) is defined within the protected class _AsyncTraceLogger. It starts at line 135 and ends at line 146. It contains 12 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (self, log, trace) and does not return any value. It declares 4 functions; the 4 functions called inside are ["unify.active_project", "asyncio.run_coroutine_threadsafe", "self._update_log", "fut.add_done_callback"], and the 4 functions calling this function are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__exit__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._create_span", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._finalize_span"]. |
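The update_trace body above hands a coroutine to a background event loop via `asyncio.run_coroutine_threadsafe`, bumps a pending counter, and decrements it in a done callback. A self-contained sketch of that submit-and-count pattern, assuming a stand-in coroutine instead of the real HTTP call (the `Submitter` class is hypothetical, for illustration only):

```python
import asyncio
import threading


class Submitter:
    def __init__(self):
        self.pending = 0
        # Background event loop running in a daemon thread, as in the source.
        self._loop = asyncio.new_event_loop()
        self._thread = threading.Thread(target=self._loop.run_forever, daemon=True)
        self._thread.start()

    async def _work(self, payload):
        await asyncio.sleep(0)  # stand-in for the real async request
        return payload

    def submit(self, payload):
        # Schedule the coroutine on the background loop from this thread.
        fut = asyncio.run_coroutine_threadsafe(self._work(payload), self._loop)
        self.pending += 1
        # Decrement the counter when the work finishes (runs in the loop thread).
        fut.add_done_callback(lambda f: setattr(self, "pending", self.pending - 1))
        return fut


s = Submitter()
fut = s.submit({"trace": {}})
print(fut.result(timeout=2))  # blocks until the background coroutine completes
```

`run_coroutine_threadsafe` returns a `concurrent.futures.Future`, so the caller can block on `.result()` or attach callbacks without touching the loop directly.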
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | is_processing | def is_processing(self) -> bool:return self._pending_submit_requests > 0 or len(self._states) > 0 | 2 | 2 | 1 | 22 | 0 | 148 | 149 | 148 | self | [] | bool | {"Return": 1} | 1 | 2 | 1 | ["len"] | 0 | [] | The function (is_processing) is defined within the protected class _AsyncTraceLogger. It starts at line 148 and ends at line 149. It contains 2 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (self) and returns a bool. It declares 1 function; the 1 function called inside is ["len"]. |
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | _update_log_callback | def _update_log_callback(self, fut):self._pending_submit_requests -= 1 | 1 | 2 | 2 | 12 | 0 | 151 | 152 | 151 | self,fut | [] | None | {"AugAssign": 1} | 0 | 2 | 0 | [] | 0 | [] | The function (_update_log_callback) is defined within the protected class _AsyncTraceLogger. It starts at line 151 and ends at line 152. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (self, fut) and does not return any value. |
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | shutdown | def shutdown(self, flush: bool = True) -> None:if self.stopped:returnself.stopped = Trueif flush:drain_future = asyncio.run_coroutine_threadsafe(self._drain(),self._loop,)from concurrent.futures import TimeoutErrorwhile True:try:drain_future.result(timeout=0.1,)# blocks until all requests are doneexcept (asyncio.TimeoutError, TimeoutError):continueelse:breakelse:asyncio.run_coroutine_threadsafe(self._shutdown_tasks(),self._loop,)close_future = asyncio.run_coroutine_threadsafe(self._close_client(),self._loop,)close_future.result()self._loop.call_soon_threadsafe(self._loop.stop)self._thread.join()self._loop.close() | 5 | 32 | 2 | 144 | 0 | 154 | 189 | 154 | self,flush | [] | None | {"Assign": 3, "Expr": 6, "If": 2, "Return": 1, "Try": 1, "While": 1} | 11 | 36 | 11 | ["asyncio.run_coroutine_threadsafe", "self._drain", "drain_future.result", "asyncio.run_coroutine_threadsafe", "self._shutdown_tasks", "asyncio.run_coroutine_threadsafe", "self._close_client", "close_future.result", "self._loop.call_soon_threadsafe", "self._thread.join", "self._loop.close"] | 17 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3660028_openprocurement_openprocurement_auction.openprocurement.auction.helpers.chronograph_py.AuctionScheduler.shutdown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3703461_scoutapp_scout_apm_python.tests.unit.test_core_py.test_shutdown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3703461_scoutapp_scout_apm_python.tests.unit.test_core_py.test_shutdown_message_disabled", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3718286_gawel_pyquery.README_fixt_py.teardown_test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963052_sibson_vncdotool.vncdotool.api_py.connect", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.alpa.testing_py.PipelineBasicTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.pipeline_parallel.test_inference_only_py.PipelineInferenceTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_create_state_py.CreateStateTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_device_mesh_py.DeviceMeshTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_device_mesh_py.DeviceMesh_ResourceAwareness.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_dist_save_load_py.DistSaveLoadTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_follow_parallel_py.FollowParallelTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_memory_leak_py.MemoryLeakTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_parallel_plan_py.ParallelPlanTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_random_seed_py.RandomSeedTest.test_remat_rng", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_tracing_py.TracingTest.tearDown", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.shard_parallel.test_gradient_accumulation_py.GradAccumulationTest.run_gradient_accumulation"] | The function (shutdown) defined within the protected class called _AsyncTraceLogger.The function start at line 154 and ends at 189. It contains 32 lines of code and it has a cyclomatic complexity of 5. It takes 2 parameters, represented as [154.0] and does not return any value. It declares 11.0 functions, It has 11.0 functions called inside which are ["asyncio.run_coroutine_threadsafe", "self._drain", "drain_future.result", "asyncio.run_coroutine_threadsafe", "self._shutdown_tasks", "asyncio.run_coroutine_threadsafe", "self._close_client", "close_future.result", "self._loop.call_soon_threadsafe", "self._thread.join", "self._loop.close"], It has 17.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3660028_openprocurement_openprocurement_auction.openprocurement.auction.helpers.chronograph_py.AuctionScheduler.shutdown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3703461_scoutapp_scout_apm_python.tests.unit.test_core_py.test_shutdown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3703461_scoutapp_scout_apm_python.tests.unit.test_core_py.test_shutdown_message_disabled", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3718286_gawel_pyquery.README_fixt_py.teardown_test", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3963052_sibson_vncdotool.vncdotool.api_py.connect", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.alpa.testing_py.PipelineBasicTest.tearDown", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.pipeline_parallel.test_inference_only_py.PipelineInferenceTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_create_state_py.CreateStateTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_device_mesh_py.DeviceMeshTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_device_mesh_py.DeviceMesh_ResourceAwareness.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_dist_save_load_py.DistSaveLoadTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_follow_parallel_py.FollowParallelTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_memory_leak_py.MemoryLeakTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_parallel_plan_py.ParallelPlanTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_random_seed_py.RandomSeedTest.test_remat_rng", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.runtime.test_tracing_py.TracingTest.tearDown", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.69413963_alpa_projects_alpa.tests.shard_parallel.test_gradient_accumulation_py.GradAccumulationTest.run_gradient_accumulation"]. |
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | _update_log | async def _update_log(self, log_metadata, trace) -> None:state = self._states.setdefault(log_metadata["id"], _TraceLogState())async with state.lock:state.pending_value = {"trace": trace,"project": log_metadata["project"],"context": log_metadata["context"],}if not state.processing:state.processing = Trueasyncio.create_task(self._process_log(log_metadata["id"], state)) | 2 | 11 | 3 | 85 | 0 | 191 | 201 | 191 | self,log_metadata,trace | [] | None | {"Assign": 3, "Expr": 1, "If": 1} | 4 | 11 | 4 | ["self._states.setdefault", "_TraceLogState", "asyncio.create_task", "self._process_log"] | 0 | [] | The function (_update_log) is defined within the protected class _AsyncTraceLogger. It starts at line 191 and ends at line 201. It contains 11 lines of code and has a cyclomatic complexity of 2. It takes 3 parameters (self, log_metadata, trace) and does not return any value. It declares 4 functions; the 4 functions called inside are ["self._states.setdefault", "_TraceLogState", "asyncio.create_task", "self._process_log"]. |
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | _process_log | async def _process_log(self, log_id: int, state: _TraceLogState):try:while True:async with state.lock:value = state.pending_valuestate.pending_value = Noneif value is None:async with state.lock:state.processing = Falsereturntry:await self._send_request(log_id, value)except Exception as e:logger.error(f"error updating trace {log_id!r}: {e}")if value["trace"].get("completed") == True:async with state.lock:state.processing = Falseself._states.pop(log_id, None)returnexcept asyncio.CancelledError:pass | 6 | 21 | 3 | 115 | 0 | 203 | 226 | 203 | self,log_id,state | [] | None | {"Assign": 4, "Expr": 3, "If": 2, "Return": 2, "Try": 2, "While": 1} | 4 | 24 | 4 | ["self._send_request", "logger.error", "get", "self._states.pop"] | 0 | [] | The function (_process_log) is defined within the protected class _AsyncTraceLogger. It starts at line 203 and ends at line 226. It contains 21 lines of code and has a cyclomatic complexity of 6. It takes 3 parameters (self, log_id, state) and does not return any value. It declares 4 functions; the 4 functions called inside are ["self._send_request", "logger.error", "get", "self._states.pop"]. |
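Taken together, the _update_log and _process_log bodies implement a per-key coalescing worker: writers overwrite a single `pending_value` slot under a lock, and one worker task per key drains the latest value, so a burst of updates collapses into fewer sends. A stripped-down sketch of the pattern (the names `State`, `update`, and `process` are illustrative, not the source's API):

```python
import asyncio


class State:
    def __init__(self):
        self.lock = asyncio.Lock()
        self.processing = False
        self.pending = None  # latest value wins; earlier ones are overwritten


sent = []  # records what actually gets "sent"


async def send(value):
    sent.append(value)  # stand-in for the HTTP request


async def process(state):
    # Single worker: drain the pending slot until it comes up empty.
    while True:
        async with state.lock:
            value, state.pending = state.pending, None
        if value is None:
            async with state.lock:
                state.processing = False
            return
        await send(value)


async def update(state, value):
    # Writer: overwrite the slot; spawn a worker only if none is running.
    async with state.lock:
        state.pending = value
        if not state.processing:
            state.processing = True
            asyncio.create_task(process(state))


async def main():
    st = State()
    for i in range(5):
        await update(st, i)
    while st.processing:
        await asyncio.sleep(0.01)
    print(sent)  # rapid updates coalesce; the last value (4) is always sent


asyncio.run(main())
```

The swap `value, state.pending = state.pending, None` under the lock is what makes intermediate values droppable without losing the final one.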
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | _send_request | async def _send_request(self, log_id: str, value: Dict[str, Any]):entries = {"trace": value["trace"]}entries = _apply_col_context(**entries)entries = {**entries, **ACTIVE_ENTRIES_WRITE.get()}entries = _handle_special_types(entries)entries = _handle_mutability(True, entries)body = {"logs": [log_id],"project": value["project"],"context": value["context"],"entries": entries,"overwrite": True,}async with self._client.put(f"{BASE_URL}/logs",json=body,) as resp:resp.raise_for_status() | 1 | 18 | 3 | 118 | 0 | 228 | 247 | 228 | self,log_id,value | [] | None | {"Assign": 6, "Expr": 1} | 6 | 20 | 6 | ["_apply_col_context", "ACTIVE_ENTRIES_WRITE.get", "_handle_special_types", "_handle_mutability", "self._client.put", "resp.raise_for_status"] | 0 | [] | The function (_send_request) is defined within the protected class _AsyncTraceLogger. It starts at line 228 and ends at line 247. It contains 18 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (self, log_id, value) and does not return any value. It declares 6 functions; the 6 functions called inside are ["_apply_col_context", "ACTIVE_ENTRIES_WRITE.get", "_handle_special_types", "_handle_mutability", "self._client.put", "resp.raise_for_status"]. |
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | _drain | async def _drain(self):while any(state.processing for state in self._states.values()):await asyncio.sleep(0.05) | 3 | 3 | 1 | 32 | 0 | 249 | 251 | 249 | self | [] | None | {"Expr": 1, "While": 1} | 3 | 3 | 3 | ["any", "self._states.values", "asyncio.sleep"] | 0 | [] | The function (_drain) is defined within the protected class _AsyncTraceLogger. It starts at line 249 and ends at line 251. It contains 3 lines of code and has a cyclomatic complexity of 3. It takes 1 parameter (self) and does not return any value. It declares 3 functions; the 3 functions called inside are ["any", "self._states.values", "asyncio.sleep"]. |
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | _shutdown_tasks | async def _shutdown_tasks(self):for task in self.tasks:await task.cancel() | 2 | 3 | 1 | 18 | 0 | 253 | 255 | 253 | self | [] | None | {"Expr": 1, "For": 1} | 1 | 3 | 1 | ["task.cancel"] | 0 | [] | The function (_shutdown_tasks) is defined within the protected class _AsyncTraceLogger. It starts at line 253 and ends at line 255. It contains 3 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (self) and does not return any value. It declares 1 function; the 1 function called inside is ["task.cancel"]. |
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | _close_client | async def _close_client(self):await self._client.close()await asyncio.sleep(1) | 1 | 3 | 1 | 20 | 0 | 257 | 259 | 257 | self | [] | None | {"Expr": 2} | 2 | 3 | 2 | ["self._client.close", "asyncio.sleep"] | 0 | [] | The function (_close_client) is defined within the protected class _AsyncTraceLogger. It starts at line 257 and ends at line 259. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (self) and does not return any value. It declares 2 functions; the 2 functions called inside are ["self._client.close", "asyncio.sleep"]. |
unifyai_unify | _AsyncTraceLogger | protected | 0 | 0 | _on_sigint | def _on_sigint(self, signum, frame):self.shutdown(flush=False)exit(0) | 1 | 3 | 3 | 21 | 0 | 261 | 263 | 261 | self,signum,frame | [] | None | {"Expr": 2} | 2 | 3 | 2 | ["self.shutdown", "exit"] | 0 | [] | The function (_on_sigint) is defined within the protected class _AsyncTraceLogger. It starts at line 261 and ends at line 263. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 3 parameters (self, signum, frame) and does not return any value. It declares 2 functions; the 2 functions called inside are ["self.shutdown", "exit"]. |
unifyai_unify | public | public | 0 | 0 | _removes_unique_trace_values | def _removes_unique_trace_values(kw: Dict[str, Any]) -> Dict[str, Any]:del kw["id"]del kw["exec_time"]if "parent_span_id" in kw:del kw["parent_span_id"]if "child_spans" in kw:kw["child_spans"] = [_removes_unique_trace_values(cs) for cs in kw["child_spans"]]return kw | 4 | 10 | 1 | 64 | 0 | 266 | 275 | 266 | kw | [] | Dict[str, Any] | {"Assign": 1, "If": 2, "Return": 1} | 1 | 10 | 1 | ["_removes_unique_trace_values"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._handle_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._removes_unique_trace_values"] | The function (_removes_unique_trace_values) is defined within the public class called public. It starts at line 266 and ends at line 275. It contains 10 lines of code and has a cyclomatic complexity of 4. It takes 1 parameter (kw) and returns a value. It makes 1 (recursive) function call: ["_removes_unique_trace_values"]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._handle_cache", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._removes_unique_trace_values"].
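The function_body column above stores the source with its newlines stripped. Assuming the indentation implied by the if-statements and the list comprehension, the body can be reconstructed as a runnable sketch; the trace dict in the usage example is invented for illustration:

```python
from typing import Any, Dict

def _removes_unique_trace_values(kw: Dict[str, Any]) -> Dict[str, Any]:
    # Drop fields that differ on every execution (ids, timings) so that
    # two runs of the same trace yield an identical dict, e.g. for use
    # as a stable cache key.
    del kw["id"]
    del kw["exec_time"]
    if "parent_span_id" in kw:
        del kw["parent_span_id"]
    if "child_spans" in kw:
        # Recurse into nested spans, matching the recursive call recorded
        # in the outgoing_function_names column.
        kw["child_spans"] = [_removes_unique_trace_values(cs) for cs in kw["child_spans"]]
    return kw

trace = {"id": 1, "exec_time": 0.5, "name": "root",
         "child_spans": [{"id": 2, "exec_time": 0.1, "parent_span_id": 1, "name": "child"}]}
cleaned = _removes_unique_trace_values(trace)
```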
unifyai_unify | public | public | 0 | 0 | initialize_async_logger | def initialize_async_logger(queue_size: Optional[int] = 10000,api_key: Optional[str] = None,) -> None:"""Initialize the async logger with the specified configuration.Args:queue_size: Maximum number of log events to store in the queue, defaults to 10000.if maximum queue size is exceeded, calls to unify.log will block until space is available.api_key: API key for authentication"""global _async_logger, ASYNC_LOGGINGif _async_logger is not None:returnapi_key = _validate_api_key(api_key)_async_logger = AsyncLoggerManager(name="default",base_url=BASE_URL,api_key=api_key,max_queue_size=queue_size,)ASYNC_LOGGING = True | 2 | 15 | 2 | 66 | 3 | 278 | 302 | 278 | queue_size,api_key | ['ASYNC_LOGGING', '_async_logger', 'api_key'] | None | {"Assign": 3, "Expr": 1, "If": 1, "Return": 1} | 2 | 25 | 2 | ["_validate_api_key", "AsyncLoggerManager"] | 0 | [] | The function (initialize_async_logger) is defined within the public class called public. It starts at line 278 and ends at line 302. It contains 15 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters (queue_size, api_key) and does not return a value. It makes 2 function calls: ["_validate_api_key", "AsyncLoggerManager"].
unifyai_unify | public | public | 0 | 0 | shutdown_async_logger | def shutdown_async_logger(immediate=False) -> None:"""Gracefully shutdown the async logger, ensuring all pending logs are flushed."""global _async_logger, ASYNC_LOGGINGif _async_logger is not None:_async_logger.stop_sync(immediate=immediate)_async_logger = NoneASYNC_LOGGING = False | 2 | 6 | 1 | 34 | 2 | 305 | 314 | 305 | immediate | ['ASYNC_LOGGING', '_async_logger'] | None | {"Assign": 2, "Expr": 2, "If": 1} | 1 | 10 | 1 | ["_async_logger.stop_sync"] | 0 | [] | The function (shutdown_async_logger) is defined within the public class called public. It starts at line 305 and ends at line 314. It contains 6 lines of code and has a cyclomatic complexity of 2. It takes 1 parameter (immediate) and does not return a value. It makes 1 function call: ["_async_logger.stop_sync"].
unifyai_unify | public | public | 0 | 0 | initialize_trace_logger | def initialize_trace_logger():"""Initialize the trace logger. Must be called from the main thread."""global _trace_loggerif _trace_logger is None:_trace_logger = _AsyncTraceLogger() | 2 | 4 | 0 | 17 | 1 | 317 | 323 | 317 | ['_trace_logger'] | None | {"Assign": 1, "Expr": 1, "If": 1} | 1 | 7 | 1 | ["_AsyncTraceLogger"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"] | The function (initialize_trace_logger) is defined within the public class called public. It starts at line 317 and ends at line 323. It contains 4 lines of code and has a cyclomatic complexity of 2. It takes no parameters and does not return a value. It makes 1 function call: ["_AsyncTraceLogger"]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.traced"]. | |
unifyai_unify | public | public | 0 | 0 | _get_trace_logger | def _get_trace_logger():return _trace_logger | 1 | 2 | 0 | 6 | 0 | 326 | 327 | 326 | [] | Returns | {"Return": 1} | 0 | 2 | 0 | [] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_logging.test_tracing_py._wait_for_trace_logger", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__exit__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._create_span", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._finalize_span"] | The function (_get_trace_logger) defined within the public class called public.The function start at line 326 and ends at 327. It contains 2 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters, and this function return a value. It has 5.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.tests.test_logging.test_tracing_py._wait_for_trace_logger", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__exit__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._create_span", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._finalize_span"]. | |
unifyai_unify | public | public | 0 | 0 | _set_active_trace_parameters | def _set_active_trace_parameters(prune_empty: bool = True,span_type: str = "function",name: Optional[str] = None,filter: Optional[Callable[[callable], bool]] = None,fn_type: Optional[str] = None,recursive: bool = False,# Only valid for Functions.depth: Optional[int] = None,skip_modules: Optional[List[ModuleType]] = None,skip_functions: Optional[List[Callable]] = None,):token = ACTIVE_TRACE_PARAMETERS.set({"prune_empty": prune_empty,"span_type": span_type,"name": name,"filter": filter,"fn_type": fn_type,"recursive": recursive,"depth": depth,"skip_modules": skip_modules,"skip_functions": skip_functions,},)return token | 1 | 25 | 10 | 137 | 1 | 330 | 354 | 330 | prune_empty,span_type,name,filter,fn_type,recursive,depth,skip_modules,skip_functions | ['token'] | Returns | {"Assign": 1, "Return": 1} | 1 | 25 | 1 | ["ACTIVE_TRACE_PARAMETERS.set"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_wrapper_factory"] | The function (_set_active_trace_parameters) is defined within the public class called public. It starts at line 330 and ends at line 354. It contains 25 lines of code and has a cyclomatic complexity of 1. It takes 10 parameters and returns a value. It makes 1 function call: ["ACTIVE_TRACE_PARAMETERS.set"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_wrapper_factory"].
unifyai_unify | public | public | 0 | 0 | _reset_active_trace_parameters | def _reset_active_trace_parameters(token):ACTIVE_TRACE_PARAMETERS.reset(token) | 1 | 2 | 1 | 11 | 0 | 357 | 358 | 357 | token | [] | None | {"Expr": 1} | 1 | 2 | 1 | ["ACTIVE_TRACE_PARAMETERS.reset"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_wrapper_factory"] | The function (_reset_active_trace_parameters) is defined within the public class called public. It starts at line 357 and ends at line 358. It contains 2 lines of code and has a cyclomatic complexity of 1. It takes 1 parameter (token) and does not return a value. It makes 1 function call: ["ACTIVE_TRACE_PARAMETERS.reset"]. It is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._trace_wrapper_factory"].
unifyai_unify | public | public | 0 | 0 | set_trace_context | def set_trace_context(context: str):global TRACING_LOG_CONTEXTctx_wrt = CONTEXT_WRITE.get()if ctx_wrt:if context:context = f"{ctx_wrt}/{context}"else:context = ctx_wrtTRACING_LOG_CONTEXT = contextif context is None:returnnames = [name for name, _ in unify.get_contexts().items()]if context not in names:unify.create_context(name=context) | 6 | 14 | 1 | 73 | 4 | 361 | 374 | 361 | context | ['context', 'ctx_wrt', 'names', 'TRACING_LOG_CONTEXT'] | None | {"Assign": 5, "Expr": 1, "If": 4, "Return": 1} | 4 | 14 | 4 | ["CONTEXT_WRITE.get", "items", "unify.get_contexts", "unify.create_context"] | 0 | [] | The function (set_trace_context) is defined within the public class called public. It starts at line 361 and ends at line 374. It contains 14 lines of code and has a cyclomatic complexity of 6. It takes 1 parameter (context) and does not return a value. It makes 4 function calls: ["CONTEXT_WRITE.get", "items", "unify.get_contexts", "unify.create_context"].
unifyai_unify | public | public | 0 | 0 | get_trace_context | def get_trace_context():global TRACING_LOG_CONTEXTreturn TRACING_LOG_CONTEXT | 1 | 3 | 0 | 8 | 0 | 377 | 379 | 377 | [] | Returns | {"Return": 1} | 0 | 3 | 0 | [] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._Traced.__enter__"] | The function (get_trace_context) defined within the public class called public.The function start at line 377 and ends at 379. It contains 3 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters, and this function return a value. It has 2.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Traced.__enter__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py._Traced.__enter__"]. | |
unifyai_unify | public | public | 0 | 0 | mark_spans_as_done._traverse_trace_and_mark_done | def _traverse_trace_and_mark_done(trace: dict):if span_ids is None:trace["completed"] = Trueelif trace["id"] in span_ids:trace["completed"] = Truefor span in trace["child_spans"]:_traverse_trace_and_mark_done(span) | 4 | 7 | 1 | 44 | 0 | 399 | 406 | 399 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (mark_spans_as_done._traverse_trace_and_mark_done) is defined within the public class called public. It starts at line 399 and ends at line 406. It contains 7 lines of code and has a cyclomatic complexity of 4. It takes 1 parameter (trace) and does not return a value.
unifyai_unify | public | public | 0 | 0 | mark_spans_as_done | def mark_spans_as_done(log_ids: Optional[Union[int, List[int]]] = None,span_ids: Optional[Union[str, List[str]]] = None,*,project: Optional[str] = None,contexts: Optional[Union[str, List[str]]] = None,):"""Marks all of the listed span ids for the listed logs in the listed contexts as completed.In all cases of this specification hierarchy, if none are provided then all associated spans are marked as complete."""if log_ids is not None:log_ids = [log_ids] if isinstance(log_ids, int) else log_idsif span_ids is not None:span_ids = [span_ids] if isinstance(span_ids, str) else span_idsdef _traverse_trace_and_mark_done(trace: dict):if span_ids is None:trace["completed"] = Trueelif trace["id"] in span_ids:trace["completed"] = Truefor span in trace["child_spans"]:_traverse_trace_and_mark_done(span)if log_ids:logs = unify.get_logs(project=project, context=contexts, from_ids=log_ids)else:if isinstance(contexts, list):logs = []for context in contexts:logs.extend(unify.get_logs(project=project,context=context,from_fields=["trace"],filter=f"trace != None",),)else:logs = unify.get_logs(project=project,context=contexts,from_fields=["trace"],filter=f"trace != None",)for log in logs:_traverse_trace_and_mark_done(log.entries["trace"])unify.update_logs(logs=log.id,entries={"trace": log.entries["trace"]},overwrite=True,) | 9 | 40 | 4 | 249 | 3 | 382 | 436 | 382 | log_ids,span_ids,project,contexts | ['logs', 'span_ids', 'log_ids'] | None | {"Assign": 7, "Expr": 5, "For": 3, "If": 6} | 10 | 55 | 10 | ["isinstance", "isinstance", "_traverse_trace_and_mark_done", "unify.get_logs", "isinstance", "logs.extend", "unify.get_logs", "unify.get_logs", "_traverse_trace_and_mark_done", "unify.update_logs"] | 0 | [] | The function (mark_spans_as_done) defined within the public class called public.The function start at line 382 and ends at 436. It contains 40 lines of code and it has a cyclomatic complexity of 9. 
It takes 4 parameters (log_ids, span_ids, project, contexts) and does not return a value. It makes 10 function calls: ["isinstance", "isinstance", "_traverse_trace_and_mark_done", "unify.get_logs", "isinstance", "logs.extend", "unify.get_logs", "unify.get_logs", "_traverse_trace_and_mark_done", "unify.update_logs"].
unifyai_unify | public | public | 0 | 0 | _apply_row_ids | def _apply_row_ids(row_ids_data: Optional[Dict[str, Any]],entries: List[Dict[str, Any]],) -> None:"""Apply row_ids data from server response to entry dictionaries.Handles the new standardized format {'names': List[str], 'ids': List[List[int]]}while maintaining backward compatibility with the old format during transition.Args:row_ids_data: The row_ids data from server responseentries: List of entry dictionaries to update with row_ids"""if not row_ids_data:returnnames = row_ids_data.get("names")ids = row_ids_data.get("ids")if not names or not ids:return# Ensure names is always treated as a list for consistent processingif not isinstance(names, list):names = [names]# Apply IDs to entriesfor entry, id_values in zip(entries, ids):if id_values is not None:# Handle both nested ID format (list of lists) and flat format (list of values)if isinstance(id_values, list) and len(names) > 1:# Nested format: zip names with id_valuesid_dict = dict(zip(names, id_values))entry.update(id_dict)else:# Single ID format: use first name with the id_valueif isinstance(id_values, list) and len(id_values) == 1:entry[names[0]] = id_values[0]else:entry[names[0]] = id_values | 11 | 22 | 2 | 164 | 3 | 439 | 479 | 439 | row_ids_data,entries | ['ids', 'id_dict', 'names'] | None | {"Assign": 6, "Expr": 2, "For": 1, "If": 6, "Return": 2} | 11 | 41 | 11 | ["row_ids_data.get", "row_ids_data.get", "isinstance", "zip", "isinstance", "len", "dict", "zip", "entry.update", "isinstance", "len"] | 2 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._sync_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.create_logs"] | The function (_apply_row_ids) defined within the public class called public.The function start at line 439 and ends at 479. 
It contains 22 lines of code and has a cyclomatic complexity of 11. It takes 2 parameters (row_ids_data, entries) and does not return a value. It makes 11 function calls: ["row_ids_data.get", "row_ids_data.get", "isinstance", "zip", "isinstance", "len", "dict", "zip", "entry.update", "isinstance", "len"]. It is called by 2 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._sync_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.create_logs"].
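The _apply_row_ids row above records a body whose newlines were stripped; assuming the indentation implied by its comments and branches, it can be reconstructed as a self-contained sketch. The sample row_ids payloads in the usage lines are invented for illustration:

```python
from typing import Any, Dict, List, Optional

def _apply_row_ids(
    row_ids_data: Optional[Dict[str, Any]],
    entries: List[Dict[str, Any]],
) -> None:
    # Copy server-assigned row IDs onto the entry dicts, in place.
    if not row_ids_data:
        return
    names = row_ids_data.get("names")
    ids = row_ids_data.get("ids")
    if not names or not ids:
        return
    # Ensure names is always a list for consistent processing.
    if not isinstance(names, list):
        names = [names]
    for entry, id_values in zip(entries, ids):
        if id_values is None:
            continue
        if isinstance(id_values, list) and len(names) > 1:
            # Nested format: zip names with id_values.
            entry.update(dict(zip(names, id_values)))
        elif isinstance(id_values, list) and len(id_values) == 1:
            # Single ID wrapped in a list.
            entry[names[0]] = id_values[0]
        else:
            # Flat single value.
            entry[names[0]] = id_values

single = [{}, {}]
_apply_row_ids({"names": ["row_id"], "ids": [[1], [2]]}, single)
multi = [{}]
_apply_row_ids({"names": ["a", "b"], "ids": [[10, 20]]}, multi)
```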
unifyai_unify | public | public | 0 | 0 | _handle_cache.wrapped | def wrapped(*args, **kwargs):if not is_caching_enabled():return fn(*args, **kwargs)kw_for_key = flexible_deepcopy(kwargs)if fn.__name__ == "add_log_entries" and "trace" in kwargs:kw_for_key["trace"] = _removes_unique_trace_values(kw_for_key["trace"])combined_kw = {**{f"arg{i}": a for i, a in enumerate(args)}, **kw_for_key}ret = _get_cache(fn_name=fn.__name__,kw=combined_kw,)if ret is not None:return retret = fn(*args, **kwargs)_write_to_cache(fn_name=fn.__name__,kw=combined_kw,response=ret,)return ret | 6 | 20 | 2 | 128 | 0 | 483 | 502 | 483 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_handle_cache.wrapped) is defined within the public class called public. It starts at line 483 and ends at line 502. It contains 20 lines of code and has a cyclomatic complexity of 6. It takes 2 parameters (*args, **kwargs) and returns a value.
unifyai_unify | public | public | 0 | 0 | _handle_cache | def _handle_cache(fn: Callable) -> Callable:def wrapped(*args, **kwargs):if not is_caching_enabled():return fn(*args, **kwargs)kw_for_key = flexible_deepcopy(kwargs)if fn.__name__ == "add_log_entries" and "trace" in kwargs:kw_for_key["trace"] = _removes_unique_trace_values(kw_for_key["trace"])combined_kw = {**{f"arg{i}": a for i, a in enumerate(args)}, **kw_for_key}ret = _get_cache(fn_name=fn.__name__,kw=combined_kw,)if ret is not None:return retret = fn(*args, **kwargs)_write_to_cache(fn_name=fn.__name__,kw=combined_kw,response=ret,)return retreturn wrapped | 1 | 3 | 1 | 13 | 3 | 482 | 504 | 482 | fn | ['kw_for_key', 'combined_kw', 'ret'] | Callable | {"Assign": 5, "Expr": 1, "If": 3, "Return": 4} | 8 | 23 | 8 | ["is_caching_enabled", "fn", "flexible_deepcopy", "_removes_unique_trace_values", "enumerate", "_get_cache", "fn", "_write_to_cache"] | 0 | [] | The function (_handle_cache) defined within the public class called public.The function start at line 482 and ends at 504. It contains 3 lines of code and it has a cyclomatic complexity of 1. The function does not take any parameters and does not return any value. It declares 8.0 functions, and It has 8.0 functions called inside which are ["is_caching_enabled", "fn", "flexible_deepcopy", "_removes_unique_trace_values", "enumerate", "_get_cache", "fn", "_write_to_cache"]. |
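The _handle_cache row records a cache-through decorator. A minimal runnable sketch of the same pattern is below; the helpers is_caching_enabled, _get_cache, and _write_to_cache are hypothetical stand-ins (the real implementations live elsewhere in the repo), flexible_deepcopy is approximated with copy.deepcopy, and the add_log_entries trace-normalization branch is omitted:

```python
import copy
from functools import wraps
from typing import Any, Callable, Dict

# Hypothetical stand-ins for the module's cache helpers.
_CACHE: Dict[str, Any] = {}

def is_caching_enabled() -> bool:
    return True

def _get_cache(fn_name: str, kw: Dict[str, Any]) -> Any:
    return _CACHE.get(f"{fn_name}:{sorted(kw.items())}")

def _write_to_cache(fn_name: str, kw: Dict[str, Any], response: Any) -> None:
    _CACHE[f"{fn_name}:{sorted(kw.items())}"] = response

def _handle_cache(fn: Callable) -> Callable:
    @wraps(fn)
    def wrapped(*args, **kwargs):
        if not is_caching_enabled():
            return fn(*args, **kwargs)
        # Copy kwargs so mutations can't corrupt the cache key, and fold
        # positional args into the key as arg0, arg1, ...
        kw_for_key = copy.deepcopy(kwargs)
        combined_kw = {**{f"arg{i}": a for i, a in enumerate(args)}, **kw_for_key}
        ret = _get_cache(fn_name=fn.__name__, kw=combined_kw)
        if ret is not None:
            return ret
        ret = fn(*args, **kwargs)
        _write_to_cache(fn_name=fn.__name__, kw=combined_kw, response=ret)
        return ret
    return wrapped

calls = []

@_handle_cache
def add(a, b=0):
    calls.append(1)
    return a + b
```

Note that, matching the `if ret is not None` check in the recorded body, a cached None result is indistinguishable from a cache miss in this sketch.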
unifyai_unify | public | public | 0 | 0 | _handle_special_types | def _handle_special_types(kwargs: Dict[str, Any],) -> Dict[str, Any]:new_kwargs = dict()for k, v in kwargs.items():if isinstance(v, unify.Dataset):v.upload()new_kwargs[k] = v.nameelif callable(v):new_kwargs[k] = inspect.getsource(v)else:new_kwargs[k] = vreturn new_kwargs | 4 | 13 | 1 | 86 | 1 | 507 | 519 | 507 | kwargs | ['new_kwargs'] | Dict[str, Any] | {"Assign": 4, "Expr": 1, "For": 1, "If": 2, "Return": 1} | 6 | 13 | 6 | ["dict", "kwargs.items", "isinstance", "v.upload", "callable", "inspect.getsource"] | 5 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Entries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Params.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._AsyncTraceLogger._send_request", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"] | The function (_handle_special_types) defined within the public class called public.The function start at line 507 and ends at 519. It contains 13 lines of code and it has a cyclomatic complexity of 4. The function does not take any parameters and does not return any value. 
It declares 6.0 functions, It has 6.0 functions called inside which are ["dict", "kwargs.items", "isinstance", "v.upload", "callable", "inspect.getsource"], It has 5.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Entries.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.logs_py.Params.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._AsyncTraceLogger._send_request", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"]. |
unifyai_unify | public | public | 0 | 0 | _to_log_ids.resolve_log_id | def resolve_log_id(log):if isinstance(log, unify.Log):if log.id is None and hasattr(log, "_future"):try:# Wait (with timeout) for the future to resolvelog._id = log._future.result(timeout=5)except Exception as e:raise Exception(f"Failed to resolve log id: {e}")return log.idreturn log | 5 | 9 | 1 | 62 | 0 | 525 | 534 | 525 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (_to_log_ids.resolve_log_id) is defined within the public class called public. It starts at line 525 and ends at line 534. It contains 9 lines of code and has a cyclomatic complexity of 5. It takes 1 parameter (log) and returns a value.
unifyai_unify | public | public | 0 | 0 | _to_log_ids | def _to_log_ids(logs: Optional[Union[int, unify.Log, List[Union[int, unify.Log]]]] = None,):def resolve_log_id(log):if isinstance(log, unify.Log):if log.id is None and hasattr(log, "_future"):try:# Wait (with timeout) for the future to resolvelog._id = log._future.result(timeout=5)except Exception as e:raise Exception(f"Failed to resolve log id: {e}")return log.idreturn logif logs is None:current_active_logs = ACTIVE_LOG.get()if not current_active_logs:raise Exception("If logs is unspecified, then current_global_active_log must be.",)return [resolve_log_id(current_active_logs[-1])]elif isinstance(logs, int):return [logs]elif isinstance(logs, unify.Log):return [resolve_log_id(logs)]elif isinstance(logs, list):if not logs:return logselif isinstance(logs[0], int):return logselif isinstance(logs[0], unify.Log):return [resolve_log_id(lg) for lg in logs]else:raise Exception(f"list must contain int or unify.Log types, but found first entry {logs[0]} of type {type(logs[0])}",)raise Exception(f"logs argument must be of type int, unify.Log, or list, but found {logs} of type {type(logs)}",) | 10 | 29 | 1 | 163 | 1 | 522 | 560 | 522 | logs | ['current_active_logs'] | Returns | {"Assign": 2, "If": 10, "Return": 8, "Try": 1} | 18 | 39 | 18 | ["isinstance", "hasattr", "log._future.result", "Exception", "ACTIVE_LOG.get", "Exception", "resolve_log_id", "isinstance", "isinstance", "resolve_log_id", "isinstance", "isinstance", "isinstance", "resolve_log_id", "Exception", "type", "Exception", "type"] | 6 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.add_log_entries", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.add_log_params", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.delete_log_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.delete_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.update_logs"] | The function (_to_log_ids) defined within the public class called public.The function start at line 522 and ends at 560. It contains 29 lines of code and it has a cyclomatic complexity of 10. The function does not take any parameters, and this function return a value. It declares 18.0 functions, It has 18.0 functions called inside which are ["isinstance", "hasattr", "log._future.result", "Exception", "ACTIVE_LOG.get", "Exception", "resolve_log_id", "isinstance", "isinstance", "resolve_log_id", "isinstance", "isinstance", "isinstance", "resolve_log_id", "Exception", "type", "Exception", "type"], It has 6.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.add_log_entries", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.add_log_params", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.delete_log_fields", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.delete_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.update_logs"]. |
unifyai_unify | public | public | 0 | 0 | _apply_col_context | def _apply_col_context(**data):if COLUMN_CONTEXT_MODE.get() == "both":assert COLUMN_CONTEXT_WRITE.get() == COLUMN_CONTEXT_READ.get()col_context = COLUMN_CONTEXT_WRITE.get()elif COLUMN_CONTEXT_MODE.get() == "write":col_context = COLUMN_CONTEXT_WRITE.get()elif COLUMN_CONTEXT_MODE.get() == "read":col_context = COLUMN_CONTEXT_READ.get()return {os.path.join(col_context, k): v for k, v in data.items()} | 5 | 9 | 1 | 91 | 1 | 563 | 571 | 563 | **data | ['col_context'] | Returns | {"Assign": 3, "If": 3, "Return": 1} | 10 | 9 | 10 | ["COLUMN_CONTEXT_MODE.get", "COLUMN_CONTEXT_WRITE.get", "COLUMN_CONTEXT_READ.get", "COLUMN_CONTEXT_WRITE.get", "COLUMN_CONTEXT_MODE.get", "COLUMN_CONTEXT_WRITE.get", "COLUMN_CONTEXT_MODE.get", "COLUMN_CONTEXT_READ.get", "os.path.join", "data.items"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._AsyncTraceLogger._send_request", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"] | The function (_apply_col_context) defined within the public class called public.The function start at line 563 and ends at 571. It contains 9 lines of code and it has a cyclomatic complexity of 5. The function does not take any parameters, and this function return a value. 
It declares 10.0 functions, It has 10.0 functions called inside which are ["COLUMN_CONTEXT_MODE.get", "COLUMN_CONTEXT_WRITE.get", "COLUMN_CONTEXT_READ.get", "COLUMN_CONTEXT_WRITE.get", "COLUMN_CONTEXT_MODE.get", "COLUMN_CONTEXT_WRITE.get", "COLUMN_CONTEXT_MODE.get", "COLUMN_CONTEXT_READ.get", "os.path.join", "data.items"], It has 3.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._AsyncTraceLogger._send_request", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"]. |
unifyai_unify | public | public | 0 | 0 | _handle_context | def _handle_context(context: Optional[Union[str, Dict[str, str]]] = None):if context is None:return {"name": CONTEXT_WRITE.get()}if isinstance(context, str):return {"name": context}else:return context | 3 | 7 | 1 | 55 | 0 | 574 | 580 | 574 | context | [] | Returns | {"If": 2, "Return": 3} | 2 | 7 | 2 | ["CONTEXT_WRITE.get", "isinstance"] | 3 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.create_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"] | The function (_handle_context) defined within the public class called public.The function start at line 574 and ends at 580. It contains 7 lines of code and it has a cyclomatic complexity of 3. The function does not take any parameters, and this function return a value. It declares 2.0 functions, It has 2.0 functions called inside which are ["CONTEXT_WRITE.get", "isinstance"], It has 3.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.create_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"]. |
unifyai_unify | public | public | 0 | 0 | _handle_mutability | def _handle_mutability(mutable: Optional[Union[bool, Dict[str, bool]]],data: Optional[Union[List[Dict[str, Any]], Dict[str, Any]]] = None,):if mutable is None or data is None:return dataif isinstance(data, list):single_item = Falsenew_data = flexible_deepcopy(data, on_fail="shallow")else:single_item = Truenew_data = [flexible_deepcopy(data, on_fail="shallow")]if isinstance(mutable, dict):for field, mut in mutable.items():for item in new_data:if field in item:item.setdefault("explicit_types", {})[field] = {"mutable": mut}elif isinstance(mutable, bool):for item in new_data:for k in list(item.keys()):if k != "explicit_types":item.setdefault("explicit_types", {})[k] = {"mutable": mutable}if single_item:return new_data[0]return new_data | 13 | 25 | 3 | 202 | 2 | 583 | 608 | 583 | mutable,data | ['new_data', 'single_item'] | Returns | {"Assign": 6, "For": 4, "If": 7, "Return": 3} | 10 | 26 | 10 | ["isinstance", "flexible_deepcopy", "flexible_deepcopy", "isinstance", "mutable.items", "item.setdefault", "isinstance", "list", "item.keys", "item.setdefault"] | 4 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._AsyncTraceLogger._send_request", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.create_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"] | The function (_handle_mutability) is defined within the public class called public. The function starts at line 583 and ends at 608. It contains 25 lines of code and has a cyclomatic complexity of 13. It takes 2 parameters (mutable and data) and returns a value. It declares 10 functions and calls 10 functions internally: ["isinstance", "flexible_deepcopy", "flexible_deepcopy", "isinstance", "mutable.items", "item.setdefault", "isinstance", "list", "item.keys", "item.setdefault"]. It is called by 4 functions: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._AsyncTraceLogger._send_request", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py._add_to_log", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.create_logs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.log"]. |
unifyai_unify | public | public | 0 | 0 | _json_chunker | def _json_chunker(big_dict, chunk_size=1024 * 1024):json_string = json.dumps(big_dict)total_bytes = len(json_string)pbar = tqdm(total=total_bytes, unit="B", unit_scale=True, desc="Uploading JSON")start = 0while start < total_bytes:end = min(start + chunk_size, total_bytes)chunk = json_string[start:end]yield chunkpbar.update(len(chunk))start = endpbar.close() | 2 | 12 | 2 | 90 | 6 | 611 | 622 | 611 | big_dict,chunk_size | ['end', 'start', 'chunk', 'total_bytes', 'json_string', 'pbar'] | None | {"Assign": 7, "Expr": 3, "While": 1} | 7 | 12 | 7 | ["json.dumps", "len", "tqdm", "min", "pbar.update", "len", "pbar.close"] | 1 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.create_logs"] | The function (_json_chunker) is defined within the public class called public. The function starts at line 611 and ends at 622. It contains 12 lines of code and has a cyclomatic complexity of 2. It takes 2 parameters and does not return a value (it is a generator that yields chunks). It declares 7 functions and calls 7 functions internally: ["json.dumps", "len", "tqdm", "min", "pbar.update", "len", "pbar.close"], and it is called by 1 function: ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.95053461_unifyai_unify.unify.logging.utils.logs_py.create_logs"]. |
unifyai_unify | public | public | 0 | 0 | log.async_wrapper | async def async_wrapper(*args, **kwargs):transformed = log_decorator(fn)return await transformed(*args, **kwargs) | 1 | 3 | 2 | 25 | 0 | 680 | 682 | 680 | null | [] | None | null | 0 | 0 | 0 | null | 0 | null | The function (log.async_wrapper) is defined within the public class called public. The function starts at line 680 and ends at 682. It contains 3 lines of code and has a cyclomatic complexity of 1. It takes 2 parameters (*args and **kwargs) and returns the awaited result of the wrapped call. |
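The row above is the coroutine branch of `log`'s decorator mode. The dispatch pattern can be sketched in isolation; `log_decorator` here is a hypothetical stand-in (the real one lives in unify.logging.logs) that simply tags results so the pattern is testable:

```python
import asyncio
import functools
import inspect


def log_decorator(fn):
    # Hypothetical stand-in for unify's log_decorator: it just tags the
    # result so the coroutine/regular dispatch can be demonstrated.
    @functools.wraps(fn)
    def sync_inner(*args, **kwargs):
        return ("logged", fn(*args, **kwargs))

    @functools.wraps(fn)
    async def async_inner(*args, **kwargs):
        return ("logged", await fn(*args, **kwargs))

    return async_inner if inspect.iscoroutinefunction(fn) else sync_inner


def wrap(fn):
    # Mirrors the dispatch in `log`: coroutine functions get an async
    # wrapper so `await` keeps working; everything else is wrapped directly.
    if inspect.iscoroutinefunction(fn):
        async def async_wrapper(*args, **kwargs):
            transformed = log_decorator(fn)
            return await transformed(*args, **kwargs)
        return async_wrapper
    return log_decorator(fn)


async def add_async(a, b):
    return a + b


def add(a, b):
    return a + b
```

Without the async branch, decorating a coroutine function would return an unawaited coroutine object instead of its result, which is why the dispatch is needed.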
unifyai_unify | public | public | 0 | 0 | log | def log(fn: Optional[Callable] = None,*,project: Optional[str] = None,context: Optional[str] = None,params: Dict[str, Any] = None,new: bool = False,overwrite: bool = False,mutable: Optional[Union[bool, Dict[str, bool]]] = True,api_key: Optional[str] = None,**entries,) -> Union[unify.Log, Callable]:"""Can be used either as a regular function to create logs or as a decorator to log function inputs, intermediates and outputs.When used as a regular function:Creates one or more logs associated to a project. unify.Logs are LLM-call-level datathat might depend on other variables.When used as a decorator:Logs function inputs and intermediate values.Args:fn: When used as a decorator, this is the function to be wrapped.project: Name of the project the stored logs will be associated to.context: Context for the logs.params: Dictionary containing one or more key:value pairs that will belogged into the platform as params.new: Whether to create a new log if there is a currently active global log.Defaults to False, in which case log will add to the existing log.overwrite: If adding to an existing log, dictates whether or not to overwritefields with the same name.mutable: Either a boolean to apply uniform mutability for all fields, or a dictionary mapping field names to booleans for per-field control. Defaults to True.api_key: If specified, unify API key to be used. 
Defaults to the value in the`UNIFY_KEY` environment variable.entries: Dictionary containing one or more key:value pairs that will be loggedinto the platform as entries.Returns:When used as a regular function: The unique id of newly created log.When used as a decorator: The wrapped function."""# If used as a decoratorif fn is not None and callable(fn):from unify.logging.logs import log_decoratorif inspect.iscoroutinefunction(fn):async def async_wrapper(*args, **kwargs):transformed = log_decorator(fn)return await transformed(*args, **kwargs)return async_wrappertransformed = log_decorator(fn)return transformed# Regular log function logicglobal ASYNC_LOGGINGapi_key = _validate_api_key(api_key)context = _handle_context(context)if not new and ACTIVE_LOG.get():_add_to_log(context=context,mode="entries",overwrite=overwrite,mutable=mutable,api_key=api_key,**entries,)_add_to_log(context=context,mode="params",overwrite=overwrite,mutable=mutable,api_key=api_key,**(params if params is not None else {}),)log = ACTIVE_LOG.get()[-1]if USR_LOGGING:logger.info(f"Updated Log({log.id})")return log# Process parameters and entriesparams = _apply_col_context(**(params if params else {}))params = {**params, **ACTIVE_PARAMS_WRITE.get()}params = _handle_special_types(params)params = _handle_mutability(mutable, params)entries = _apply_col_context(**entries)entries = {**entries, **ACTIVE_ENTRIES_WRITE.get()}entries = _handle_special_types(entries)entries = _handle_mutability(mutable, entries)project = _get_and_maybe_create_project(project, api_key=api_key)if ASYNC_LOGGING and _async_logger is not None:# Use async logging: enqueue a create event and capture the Future.log_future = _async_logger.log_create(project=project,context=context,params=params,entries=entries,)created_log = unify.Log(id=None,# Placeholder; will be updated when the Future resolves._future=log_future,api_key=api_key,**entries,params=params,context=context,)else:# Use synchronous loggingcreated_log = 
_sync_log(project=project,context=context,params=params,entries=entries,api_key=api_key,)created_log.entries.pop("explicit_types", None)if PARAMS_NEST_LEVEL.get() > 0 or ENTRIES_NEST_LEVEL.get() > 0:LOGGED.set({**LOGGED.get(),created_log.id: list(params.keys()) + list(entries.keys()),},)if USR_LOGGING:logger.info(f"Created Log({created_log.id})")return created_log | 14 | 86 | 9 | 495 | 9 | 625 | 760 | 625 | fn,project,context,params,new,overwrite,mutable,api_key,**entries | ['entries', 'transformed', 'log', 'log_future', 'context', 'api_key', 'project', 'params', 'created_log'] | Union[unify.Log, Callable] | {"Assign": 17, "Expr": 7, "If": 7, "Return": 5} | 34 | 136 | 34 | ["callable", "inspect.iscoroutinefunction", "log_decorator", "transformed", "log_decorator", "_validate_api_key", "_handle_context", "ACTIVE_LOG.get", "_add_to_log", "_add_to_log", "ACTIVE_LOG.get", "logger.info", "_apply_col_context", "ACTIVE_PARAMS_WRITE.get", "_handle_special_types", "_handle_mutability", "_apply_col_context", "ACTIVE_ENTRIES_WRITE.get", "_handle_special_types", "_handle_mutability", "_get_and_maybe_create_project", "_async_logger.log_create", "unify.Log", "_sync_log", "created_log.entries.pop", "PARAMS_NEST_LEVEL.get", "ENTRIES_NEST_LEVEL.get", "LOGGED.set", "LOGGED.get", "list", "params.keys", "list", "entries.keys", "logger.info"] | 148 | ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3538486_freenas_corral_build.build.lib.utils_py.debug", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.cores.fifo.fifo_async_py.fifo_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.cores.fifo.fifo_fast_py.fifo_fast", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.cores.fifo.fifo_sync_py.fifo_sync", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.system.memmap.barebone_py.Barebone.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.system.stream.avalonst_py.AvalonStream.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.test.test_system.test_to_generic_py.testbench_to_generic", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3675814_usyd_blockchain_vandal.tools.bulk_analyser.analyse_py.analyse_contract", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3675814_usyd_blockchain_vandal.tools.bulk_analyser.analyse_py.handle_signal", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.admit_abstract_py.transform_abstract_to_admit_statement", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.binding_util_py.process_maybe_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.coq_version_py.subprocess_Popen_memoized", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.diagnose_error_py.memory_robust_timeout_Popen_communicate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.find_bug_py.default_on_fatal", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.split_definitions_old_py.get_all_semiwait_iter", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.split_definitions_old_py.split_statements_to_definitions", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.split_definitions_py.split_statements_to_definitions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.build_dists", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.bump_version", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.drop_dist_dirs", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.generate_changelog", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.tag_version", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.upload_dists", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913638_scijava_scyjava.src.scyjava._convert_py._convert", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917005_sergeypirogov_webdriver_manager.tests.conftest_py.delete_drivers_dir"] | The function (log) defined within the public class called public.The function start at line 625 and ends at 760. It contains 86 lines of code and it has a cyclomatic complexity of 14. It takes 9 parameters, represented as [625.0] and does not return any value. 
It declares 34.0 functions, It has 34.0 functions called inside which are ["callable", "inspect.iscoroutinefunction", "log_decorator", "transformed", "log_decorator", "_validate_api_key", "_handle_context", "ACTIVE_LOG.get", "_add_to_log", "_add_to_log", "ACTIVE_LOG.get", "logger.info", "_apply_col_context", "ACTIVE_PARAMS_WRITE.get", "_handle_special_types", "_handle_mutability", "_apply_col_context", "ACTIVE_ENTRIES_WRITE.get", "_handle_special_types", "_handle_mutability", "_get_and_maybe_create_project", "_async_logger.log_create", "unify.Log", "_sync_log", "created_log.entries.pop", "PARAMS_NEST_LEVEL.get", "ENTRIES_NEST_LEVEL.get", "LOGGED.set", "LOGGED.get", "list", "params.keys", "list", "entries.keys", "logger.info"], It has 148.0 functions calling this function which are ["_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3538486_freenas_corral_build.build.lib.utils_py.debug", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.cores.fifo.fifo_async_py.fifo_async", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.cores.fifo.fifo_fast_py.fifo_fast", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.cores.fifo.fifo_sync_py.fifo_sync", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.system.memmap.barebone_py.Barebone.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.rhea.system.stream.avalonst_py.AvalonStream.__init__", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3588611_cfelton_rhea.test.test_system.test_to_generic_py.testbench_to_generic", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3675814_usyd_blockchain_vandal.tools.bulk_analyser.analyse_py.analyse_contract", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3675814_usyd_blockchain_vandal.tools.bulk_analyser.analyse_py.handle_signal", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.admit_abstract_py.transform_abstract_to_admit_statement", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.binding_util_py.process_maybe_list", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.coq_version_py.subprocess_Popen_memoized", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.diagnose_error_py.memory_robust_timeout_Popen_communicate", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.find_bug_py.default_on_fatal", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.split_definitions_old_py.get_all_semiwait_iter", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.split_definitions_old_py.split_statements_to_definitions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3691476_jasongross_coq_tools.coq_tools.split_definitions_py.split_statements_to_definitions", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.build_dists", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.bump_version", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.drop_dist_dirs", 
"_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.generate_changelog", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.tag_version", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3705359_sarugaku_pythonfinder.tasks.release_py.upload_dists", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3913638_scijava_scyjava.src.scyjava._convert_py._convert", "_.content.gdrive.MyDrive.Phd_Thesis.Dataset_Creation.Output.Cloned_Repo_3.3917005_sergeypirogov_webdriver_manager.tests.conftest_py.delete_drivers_dir"]. |