ZTWHHH committed on
Commit 56e4d17 · verified · 1 parent: 30a252b

Add files using upload-large-folder tool

This view is limited to 50 files because the commit contains too many changes.
Files changed (50)
  1. minigpt2/lib/python3.10/site-packages/networkx/classes/__init__.py +13 -0
  2. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/__init__.cpython-310.pyc +0 -0
  3. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/coreviews.cpython-310.pyc +0 -0
  4. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/digraph.cpython-310.pyc +0 -0
  5. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/filters.cpython-310.pyc +0 -0
  6. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/function.cpython-310.pyc +0 -0
  7. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/graph.cpython-310.pyc +0 -0
  8. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/graphviews.cpython-310.pyc +0 -0
  9. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/multidigraph.cpython-310.pyc +0 -0
  10. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/multigraph.cpython-310.pyc +0 -0
  11. minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/reportviews.cpython-310.pyc +0 -0
  12. minigpt2/lib/python3.10/site-packages/networkx/classes/filters.py +95 -0
  13. minigpt2/lib/python3.10/site-packages/networkx/classes/function.py +1407 -0
  14. minigpt2/lib/python3.10/site-packages/networkx/classes/graph.py +2058 -0
  15. minigpt2/lib/python3.10/site-packages/networkx/classes/graphviews.py +269 -0
  16. minigpt2/lib/python3.10/site-packages/networkx/classes/multidigraph.py +966 -0
  17. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__init__.py +0 -0
  18. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/__init__.cpython-310.pyc +0 -0
  19. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/dispatch_interface.cpython-310.pyc +0 -0
  20. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/historical_tests.cpython-310.pyc +0 -0
  21. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_coreviews.cpython-310.pyc +0 -0
  22. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_digraph.cpython-310.pyc +0 -0
  23. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_digraph_historical.cpython-310.pyc +0 -0
  24. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_graph.cpython-310.pyc +0 -0
  25. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_graph_historical.cpython-310.pyc +0 -0
  26. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_multidigraph.cpython-310.pyc +0 -0
  27. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_reportviews.cpython-310.pyc +0 -0
  28. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_special.cpython-310.pyc +0 -0
  29. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_subgraphviews.cpython-310.pyc +0 -0
  30. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_coreviews.py +362 -0
  31. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_digraph_historical.py +111 -0
  32. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_filters.py +177 -0
  33. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_graph_historical.py +13 -0
  34. minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_multidigraph.py +459 -0
  35. minigpt2/lib/python3.10/site-packages/open_flamingo/__init__.py +2 -0
  36. minigpt2/lib/python3.10/site-packages/open_flamingo/eval/__init__.py +1 -0
  37. minigpt2/lib/python3.10/site-packages/open_flamingo/eval/eval_datasets.py +95 -0
  38. minigpt2/lib/python3.10/site-packages/open_flamingo/eval/evaluate.py +961 -0
  39. minigpt2/lib/python3.10/site-packages/open_flamingo/eval/ok_vqa_utils.py +214 -0
  40. minigpt2/lib/python3.10/site-packages/open_flamingo/eval/vqa_metric.py +578 -0
  41. minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/INSTALLER +1 -0
  42. minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/LICENSE-3RD-PARTY.txt +0 -0
  43. minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/LICENSE.txt +21 -0
  44. minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/METADATA +305 -0
  45. minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/RECORD +112 -0
  46. minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/REQUESTED +0 -0
  47. minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/WHEEL +6 -0
  48. minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/top_level.txt +1 -0
  49. minigpt2/lib/python3.10/site-packages/orjson-3.10.14.dist-info/INSTALLER +1 -0
  50. minigpt2/lib/python3.10/site-packages/orjson-3.10.14.dist-info/METADATA +1141 -0
minigpt2/lib/python3.10/site-packages/networkx/classes/__init__.py ADDED
@@ -0,0 +1,13 @@
+ from .graph import Graph
+ from .digraph import DiGraph
+ from .multigraph import MultiGraph
+ from .multidigraph import MultiDiGraph
+
+ from .function import *
+ from .graphviews import subgraph_view, reverse_view
+
+ from networkx.classes import filters
+
+ from networkx.classes import coreviews
+ from networkx.classes import graphviews
+ from networkx.classes import reportviews
minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (574 Bytes)

minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/coreviews.cpython-310.pyc ADDED
Binary file (16.4 kB)

minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/digraph.cpython-310.pyc ADDED
Binary file (46.6 kB)

minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/filters.cpython-310.pyc ADDED
Binary file (5.02 kB)

minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/function.cpython-310.pyc ADDED
Binary file (39.4 kB)

minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/graph.cpython-310.pyc ADDED
Binary file (69.8 kB)

minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/graphviews.cpython-310.pyc ADDED
Binary file (8.12 kB)

minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/multidigraph.cpython-310.pyc ADDED
Binary file (36 kB)

minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/multigraph.cpython-310.pyc ADDED
Binary file (46.3 kB)

minigpt2/lib/python3.10/site-packages/networkx/classes/__pycache__/reportviews.cpython-310.pyc ADDED
Binary file (49.2 kB)
minigpt2/lib/python3.10/site-packages/networkx/classes/filters.py ADDED
@@ -0,0 +1,95 @@
+ """Filter factories to hide or show sets of nodes and edges.
+
+ These filters return the function used when creating `SubGraph`.
+ """
+
+ __all__ = [
+     "no_filter",
+     "hide_nodes",
+     "hide_edges",
+     "hide_multiedges",
+     "hide_diedges",
+     "hide_multidiedges",
+     "show_nodes",
+     "show_edges",
+     "show_multiedges",
+     "show_diedges",
+     "show_multidiedges",
+ ]
+
+
+ def no_filter(*items):
+     """Returns a filter function that always evaluates to True."""
+     return True
+
+
+ def hide_nodes(nodes):
+     """Returns a filter function that hides specific nodes."""
+     nodes = set(nodes)
+     return lambda node: node not in nodes
+
+
+ def hide_diedges(edges):
+     """Returns a filter function that hides specific directed edges."""
+     edges = {(u, v) for u, v in edges}
+     return lambda u, v: (u, v) not in edges
+
+
+ def hide_edges(edges):
+     """Returns a filter function that hides specific undirected edges."""
+     alledges = set(edges) | {(v, u) for (u, v) in edges}
+     return lambda u, v: (u, v) not in alledges
+
+
+ def hide_multidiedges(edges):
+     """Returns a filter function that hides specific multi-directed edges."""
+     edges = {(u, v, k) for u, v, k in edges}
+     return lambda u, v, k: (u, v, k) not in edges
+
+
+ def hide_multiedges(edges):
+     """Returns a filter function that hides specific multi-undirected edges."""
+     alledges = set(edges) | {(v, u, k) for (u, v, k) in edges}
+     return lambda u, v, k: (u, v, k) not in alledges
+
+
+ # write show_nodes as a class to make SubGraph pickleable
+ class show_nodes:
+     """Filter class to show specific nodes.
+
+     Attach the set of nodes as an attribute to speed up this commonly used filter.
+
+     Note that another allowed attribute for filters is to store the number of nodes
+     on the filter as attribute `length` (used in `__len__`). It is a user
+     responsibility to ensure this attribute is accurate if present.
+     """
+
+     def __init__(self, nodes):
+         self.nodes = set(nodes)
+
+     def __call__(self, node):
+         return node in self.nodes
+
+
+ def show_diedges(edges):
+     """Returns a filter function that shows specific directed edges."""
+     edges = {(u, v) for u, v in edges}
+     return lambda u, v: (u, v) in edges
+
+
+ def show_edges(edges):
+     """Returns a filter function that shows specific undirected edges."""
+     alledges = set(edges) | {(v, u) for (u, v) in edges}
+     return lambda u, v: (u, v) in alledges
+
+
+ def show_multidiedges(edges):
+     """Returns a filter function that shows specific multi-directed edges."""
+     edges = {(u, v, k) for u, v, k in edges}
+     return lambda u, v, k: (u, v, k) in edges
+
+
+ def show_multiedges(edges):
+     """Returns a filter function that shows specific multi-undirected edges."""
+     alledges = set(edges) | {(v, u, k) for (u, v, k) in edges}
+     return lambda u, v, k: (u, v, k) in alledges
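The filter factories above are plain closures, so their behavior can be checked without networkx itself. A minimal standalone sketch, reproducing `hide_edges` and `hide_diedges` from the diff to show how the undirected variant hides both orientations of an edge while the directed variant hides only the given one:

```python
def hide_edges(edges):
    # Undirected case: store both orientations so (u, v) and (v, u) hide together.
    alledges = set(edges) | {(v, u) for (u, v) in edges}
    return lambda u, v: (u, v) not in alledges


def hide_diedges(edges):
    # Directed case: only the given orientation is hidden.
    edges = {(u, v) for u, v in edges}
    return lambda u, v: (u, v) not in edges


hide_undirected = hide_edges([(1, 2)])
hide_directed = hide_diedges([(1, 2)])
print(hide_undirected(2, 1))  # False -- the reversed edge is hidden too
print(hide_directed(2, 1))    # True -- only (1, 2) itself is hidden
```

This symmetry is why `subgraph_view` can use the same filter protocol for directed and undirected graphs: the factory, not the view, decides how edge orientation is treated.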
minigpt2/lib/python3.10/site-packages/networkx/classes/function.py ADDED
@@ -0,0 +1,1407 @@
1
+ """Functional interface to graph methods and assorted utilities."""
2
+
3
+ from collections import Counter
4
+ from itertools import chain
5
+
6
+ import networkx as nx
7
+ from networkx.utils import not_implemented_for, pairwise
8
+
9
+ __all__ = [
10
+ "nodes",
11
+ "edges",
12
+ "degree",
13
+ "degree_histogram",
14
+ "neighbors",
15
+ "number_of_nodes",
16
+ "number_of_edges",
17
+ "density",
18
+ "is_directed",
19
+ "freeze",
20
+ "is_frozen",
21
+ "subgraph",
22
+ "induced_subgraph",
23
+ "edge_subgraph",
24
+ "restricted_view",
25
+ "to_directed",
26
+ "to_undirected",
27
+ "add_star",
28
+ "add_path",
29
+ "add_cycle",
30
+ "create_empty_copy",
31
+ "set_node_attributes",
32
+ "get_node_attributes",
33
+ "remove_node_attributes",
34
+ "set_edge_attributes",
35
+ "get_edge_attributes",
36
+ "remove_edge_attributes",
37
+ "all_neighbors",
38
+ "non_neighbors",
39
+ "non_edges",
40
+ "common_neighbors",
41
+ "is_weighted",
42
+ "is_negatively_weighted",
43
+ "is_empty",
44
+ "selfloop_edges",
45
+ "nodes_with_selfloops",
46
+ "number_of_selfloops",
47
+ "path_weight",
48
+ "is_path",
49
+ ]
50
+
51
+
52
+ def nodes(G):
53
+ """Returns a NodeView over the graph nodes.
54
+
55
+ This function wraps the :func:`G.nodes <networkx.Graph.nodes>` property.
56
+ """
57
+ return G.nodes()
58
+
59
+
60
+ def edges(G, nbunch=None):
61
+ """Returns an edge view of edges incident to nodes in nbunch.
62
+
63
+ Return all edges if nbunch is unspecified or nbunch=None.
64
+
65
+ For digraphs, edges=out_edges
66
+
67
+ This function wraps the :func:`G.edges <networkx.Graph.edges>` property.
68
+ """
69
+ return G.edges(nbunch)
70
+
71
+
72
+ def degree(G, nbunch=None, weight=None):
73
+ """Returns a degree view of single node or of nbunch of nodes.
74
+ If nbunch is omitted, then return degrees of *all* nodes.
75
+
76
+ This function wraps the :func:`G.degree <networkx.Graph.degree>` property.
77
+ """
78
+ return G.degree(nbunch, weight)
79
+
80
+
81
+ def neighbors(G, n):
82
+ """Returns an iterator over all neighbors of node n.
83
+
84
+ This function wraps the :func:`G.neighbors <networkx.Graph.neighbors>` function.
85
+ """
86
+ return G.neighbors(n)
87
+
88
+
89
+ def number_of_nodes(G):
90
+ """Returns the number of nodes in the graph.
91
+
92
+ This function wraps the :func:`G.number_of_nodes <networkx.Graph.number_of_nodes>` function.
93
+ """
94
+ return G.number_of_nodes()
95
+
96
+
97
+ def number_of_edges(G):
98
+ """Returns the number of edges in the graph.
99
+
100
+ This function wraps the :func:`G.number_of_edges <networkx.Graph.number_of_edges>` function.
101
+ """
102
+ return G.number_of_edges()
103
+
104
+
105
+ def density(G):
106
+ r"""Returns the density of a graph.
107
+
108
+ The density for undirected graphs is
109
+
110
+ .. math::
111
+
112
+ d = \frac{2m}{n(n-1)},
113
+
114
+ and for directed graphs is
115
+
116
+ .. math::
117
+
118
+ d = \frac{m}{n(n-1)},
119
+
120
+ where `n` is the number of nodes and `m` is the number of edges in `G`.
121
+
122
+ Notes
123
+ -----
124
+ The density is 0 for a graph without edges and 1 for a complete graph.
125
+ The density of multigraphs can be higher than 1.
126
+
127
+ Self loops are counted in the total number of edges so graphs with self
128
+ loops can have density higher than 1.
129
+ """
130
+ n = number_of_nodes(G)
131
+ m = number_of_edges(G)
132
+ if m == 0 or n <= 1:
133
+ return 0
134
+ d = m / (n * (n - 1))
135
+ if not G.is_directed():
136
+ d *= 2
137
+ return d
138
+
139
+
140
+ def degree_histogram(G):
141
+ """Returns a list of the frequency of each degree value.
142
+
143
+ Parameters
144
+ ----------
145
+ G : Networkx graph
146
+ A graph
147
+
148
+ Returns
149
+ -------
150
+ hist : list
151
+ A list of frequencies of degrees.
152
+ The degree values are the index in the list.
153
+
154
+ Notes
155
+ -----
156
+ Note: the bins are width one, hence len(list) can be large
157
+ (Order(number_of_edges))
158
+ """
159
+ counts = Counter(d for n, d in G.degree())
160
+ return [counts.get(i, 0) for i in range(max(counts) + 1 if counts else 0)]
161
+
162
+
163
+ def is_directed(G):
164
+ """Return True if graph is directed."""
165
+ return G.is_directed()
166
+
167
+
168
+ def frozen(*args, **kwargs):
169
+ """Dummy method for raising errors when trying to modify frozen graphs"""
170
+ raise nx.NetworkXError("Frozen graph can't be modified")
171
+
172
+
173
+ def freeze(G):
174
+ """Modify graph to prevent further change by adding or removing
175
+ nodes or edges.
176
+
177
+ Node and edge data can still be modified.
178
+
179
+ Parameters
180
+ ----------
181
+ G : graph
182
+ A NetworkX graph
183
+
184
+ Examples
185
+ --------
186
+ >>> G = nx.path_graph(4)
187
+ >>> G = nx.freeze(G)
188
+ >>> try:
189
+ ... G.add_edge(4, 5)
190
+ ... except nx.NetworkXError as err:
191
+ ... print(str(err))
192
+ Frozen graph can't be modified
193
+
194
+ Notes
195
+ -----
196
+ To "unfreeze" a graph you must make a copy by creating a new graph object:
197
+
198
+ >>> graph = nx.path_graph(4)
199
+ >>> frozen_graph = nx.freeze(graph)
200
+ >>> unfrozen_graph = nx.Graph(frozen_graph)
201
+ >>> nx.is_frozen(unfrozen_graph)
202
+ False
203
+
204
+ See Also
205
+ --------
206
+ is_frozen
207
+ """
208
+ G.add_node = frozen
209
+ G.add_nodes_from = frozen
210
+ G.remove_node = frozen
211
+ G.remove_nodes_from = frozen
212
+ G.add_edge = frozen
213
+ G.add_edges_from = frozen
214
+ G.add_weighted_edges_from = frozen
215
+ G.remove_edge = frozen
216
+ G.remove_edges_from = frozen
217
+ G.clear = frozen
218
+ G.clear_edges = frozen
219
+ G.frozen = True
220
+ return G
221
+
222
+
223
+ def is_frozen(G):
224
+ """Returns True if graph is frozen.
225
+
226
+ Parameters
227
+ ----------
228
+ G : graph
229
+ A NetworkX graph
230
+
231
+ See Also
232
+ --------
233
+ freeze
234
+ """
235
+ try:
236
+ return G.frozen
237
+ except AttributeError:
238
+ return False
239
+
240
+
241
+ def add_star(G_to_add_to, nodes_for_star, **attr):
242
+ """Add a star to Graph G_to_add_to.
243
+
244
+ The first node in `nodes_for_star` is the middle of the star.
245
+ It is connected to all other nodes.
246
+
247
+ Parameters
248
+ ----------
249
+ G_to_add_to : graph
250
+ A NetworkX graph
251
+ nodes_for_star : iterable container
252
+ A container of nodes.
253
+ attr : keyword arguments, optional (default= no attributes)
254
+ Attributes to add to every edge in star.
255
+
256
+ See Also
257
+ --------
258
+ add_path, add_cycle
259
+
260
+ Examples
261
+ --------
262
+ >>> G = nx.Graph()
263
+ >>> nx.add_star(G, [0, 1, 2, 3])
264
+ >>> nx.add_star(G, [10, 11, 12], weight=2)
265
+ """
266
+ nlist = iter(nodes_for_star)
267
+ try:
268
+ v = next(nlist)
269
+ except StopIteration:
270
+ return
271
+ G_to_add_to.add_node(v)
272
+ edges = ((v, n) for n in nlist)
273
+ G_to_add_to.add_edges_from(edges, **attr)
274
+
275
+
276
+ def add_path(G_to_add_to, nodes_for_path, **attr):
277
+ """Add a path to the Graph G_to_add_to.
278
+
279
+ Parameters
280
+ ----------
281
+ G_to_add_to : graph
282
+ A NetworkX graph
283
+ nodes_for_path : iterable container
284
+ A container of nodes. A path will be constructed from
285
+ the nodes (in order) and added to the graph.
286
+ attr : keyword arguments, optional (default= no attributes)
287
+ Attributes to add to every edge in path.
288
+
289
+ See Also
290
+ --------
291
+ add_star, add_cycle
292
+
293
+ Examples
294
+ --------
295
+ >>> G = nx.Graph()
296
+ >>> nx.add_path(G, [0, 1, 2, 3])
297
+ >>> nx.add_path(G, [10, 11, 12], weight=7)
298
+ """
299
+ nlist = iter(nodes_for_path)
300
+ try:
301
+ first_node = next(nlist)
302
+ except StopIteration:
303
+ return
304
+ G_to_add_to.add_node(first_node)
305
+ G_to_add_to.add_edges_from(pairwise(chain((first_node,), nlist)), **attr)
306
+
307
+
308
+ def add_cycle(G_to_add_to, nodes_for_cycle, **attr):
309
+ """Add a cycle to the Graph G_to_add_to.
310
+
311
+ Parameters
312
+ ----------
313
+ G_to_add_to : graph
314
+ A NetworkX graph
315
+ nodes_for_cycle: iterable container
316
+ A container of nodes. A cycle will be constructed from
317
+ the nodes (in order) and added to the graph.
318
+ attr : keyword arguments, optional (default= no attributes)
319
+ Attributes to add to every edge in cycle.
320
+
321
+ See Also
322
+ --------
323
+ add_path, add_star
324
+
325
+ Examples
326
+ --------
327
+ >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc
328
+ >>> nx.add_cycle(G, [0, 1, 2, 3])
329
+ >>> nx.add_cycle(G, [10, 11, 12], weight=7)
330
+ """
331
+ nlist = iter(nodes_for_cycle)
332
+ try:
333
+ first_node = next(nlist)
334
+ except StopIteration:
335
+ return
336
+ G_to_add_to.add_node(first_node)
337
+ G_to_add_to.add_edges_from(
338
+ pairwise(chain((first_node,), nlist), cyclic=True), **attr
339
+ )
340
+
341
+
342
+ def subgraph(G, nbunch):
343
+ """Returns the subgraph induced on nodes in nbunch.
344
+
345
+ Parameters
346
+ ----------
347
+ G : graph
348
+ A NetworkX graph
349
+
350
+ nbunch : list, iterable
351
+ A container of nodes that will be iterated through once (thus
352
+ it should be an iterator or be iterable). Each element of the
353
+ container should be a valid node type: any hashable type except
354
+ None. If nbunch is None, return all edges data in the graph.
355
+ Nodes in nbunch that are not in the graph will be (quietly)
356
+ ignored.
357
+
358
+ Notes
359
+ -----
360
+ subgraph(G) calls G.subgraph()
361
+ """
362
+ return G.subgraph(nbunch)
363
+
364
+
365
+ def induced_subgraph(G, nbunch):
366
+ """Returns a SubGraph view of `G` showing only nodes in nbunch.
367
+
368
+ The induced subgraph of a graph on a set of nodes N is the
369
+ graph with nodes N and edges from G which have both ends in N.
370
+
371
+ Parameters
372
+ ----------
373
+ G : NetworkX Graph
374
+ nbunch : node, container of nodes or None (for all nodes)
375
+
376
+ Returns
377
+ -------
378
+ subgraph : SubGraph View
379
+ A read-only view of the subgraph in `G` induced by the nodes.
380
+ Changes to the graph `G` will be reflected in the view.
381
+
382
+ Notes
383
+ -----
384
+ To create a mutable subgraph with its own copies of nodes
385
+ edges and attributes use `subgraph.copy()` or `Graph(subgraph)`
386
+
387
+ For an inplace reduction of a graph to a subgraph you can remove nodes:
388
+ `G.remove_nodes_from(n in G if n not in set(nbunch))`
389
+
390
+ If you are going to compute subgraphs of your subgraphs you could
391
+ end up with a chain of views that can be very slow once the chain
392
+ has about 15 views in it. If they are all induced subgraphs, you
393
+ can short-cut the chain by making them all subgraphs of the original
394
+ graph. The graph class method `G.subgraph` does this when `G` is
395
+ a subgraph. In contrast, this function allows you to choose to build
396
+ chains or not, as you wish. The returned subgraph is a view on `G`.
397
+
398
+ Examples
399
+ --------
400
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
401
+ >>> H = nx.induced_subgraph(G, [0, 1, 3])
402
+ >>> list(H.edges)
403
+ [(0, 1)]
404
+ >>> list(H.nodes)
405
+ [0, 1, 3]
406
+ """
407
+ induced_nodes = nx.filters.show_nodes(G.nbunch_iter(nbunch))
408
+ return nx.subgraph_view(G, filter_node=induced_nodes)
409
+
410
+
411
+ def edge_subgraph(G, edges):
412
+ """Returns a view of the subgraph induced by the specified edges.
413
+
414
+ The induced subgraph contains each edge in `edges` and each
415
+ node incident to any of those edges.
416
+
417
+ Parameters
418
+ ----------
419
+ G : NetworkX Graph
420
+ edges : iterable
421
+ An iterable of edges. Edges not present in `G` are ignored.
422
+
423
+ Returns
424
+ -------
425
+ subgraph : SubGraph View
426
+ A read-only edge-induced subgraph of `G`.
427
+ Changes to `G` are reflected in the view.
428
+
429
+ Notes
430
+ -----
431
+ To create a mutable subgraph with its own copies of nodes
432
+ edges and attributes use `subgraph.copy()` or `Graph(subgraph)`
433
+
434
+ If you create a subgraph of a subgraph recursively you can end up
435
+ with a chain of subgraphs that becomes very slow with about 15
436
+ nested subgraph views. Luckily the edge_subgraph filter nests
437
+ nicely so you can use the original graph as G in this function
438
+ to avoid chains. We do not rule out chains programmatically so
439
+ that odd cases like an `edge_subgraph` of a `restricted_view`
440
+ can be created.
441
+
442
+ Examples
443
+ --------
444
+ >>> G = nx.path_graph(5)
445
+ >>> H = G.edge_subgraph([(0, 1), (3, 4)])
446
+ >>> list(H.nodes)
447
+ [0, 1, 3, 4]
448
+ >>> list(H.edges)
449
+ [(0, 1), (3, 4)]
450
+ """
451
+ nxf = nx.filters
452
+ edges = set(edges)
453
+ nodes = set()
454
+ for e in edges:
455
+ nodes.update(e[:2])
456
+ induced_nodes = nxf.show_nodes(nodes)
457
+ if G.is_multigraph():
458
+ if G.is_directed():
459
+ induced_edges = nxf.show_multidiedges(edges)
460
+ else:
461
+ induced_edges = nxf.show_multiedges(edges)
462
+ else:
463
+ if G.is_directed():
464
+ induced_edges = nxf.show_diedges(edges)
465
+ else:
466
+ induced_edges = nxf.show_edges(edges)
467
+ return nx.subgraph_view(G, filter_node=induced_nodes, filter_edge=induced_edges)
468
+
469
+
470
+ def restricted_view(G, nodes, edges):
471
+ """Returns a view of `G` with hidden nodes and edges.
472
+
473
+ The resulting subgraph filters out node `nodes` and edges `edges`.
474
+ Filtered out nodes also filter out any of their edges.
475
+
476
+ Parameters
477
+ ----------
478
+ G : NetworkX Graph
479
+ nodes : iterable
480
+ An iterable of nodes. Nodes not present in `G` are ignored.
481
+ edges : iterable
482
+ An iterable of edges. Edges not present in `G` are ignored.
483
+
484
+ Returns
485
+ -------
486
+ subgraph : SubGraph View
487
+ A read-only restricted view of `G` filtering out nodes and edges.
488
+ Changes to `G` are reflected in the view.
489
+
490
+ Notes
491
+ -----
492
+ To create a mutable subgraph with its own copies of nodes
493
+ edges and attributes use `subgraph.copy()` or `Graph(subgraph)`
494
+
495
+ If you create a subgraph of a subgraph recursively you may end up
496
+ with a chain of subgraph views. Such chains can get quite slow
497
+ for lengths near 15. To avoid long chains, try to make your subgraph
498
+ based on the original graph. We do not rule out chains programmatically
499
+ so that odd cases like an `edge_subgraph` of a `restricted_view`
500
+ can be created.
501
+
502
+ Examples
503
+ --------
504
+ >>> G = nx.path_graph(5)
505
+ >>> H = nx.restricted_view(G, [0], [(1, 2), (3, 4)])
506
+ >>> list(H.nodes)
507
+ [1, 2, 3, 4]
508
+ >>> list(H.edges)
509
+ [(2, 3)]
510
+ """
511
+ nxf = nx.filters
512
+ hide_nodes = nxf.hide_nodes(nodes)
513
+ if G.is_multigraph():
514
+ if G.is_directed():
515
+ hide_edges = nxf.hide_multidiedges(edges)
516
+ else:
517
+ hide_edges = nxf.hide_multiedges(edges)
518
+ else:
519
+ if G.is_directed():
520
+ hide_edges = nxf.hide_diedges(edges)
521
+ else:
522
+ hide_edges = nxf.hide_edges(edges)
523
+ return nx.subgraph_view(G, filter_node=hide_nodes, filter_edge=hide_edges)
524
+
525
+
526
+ def to_directed(graph):
527
+ """Returns a directed view of the graph `graph`.
528
+
529
+ Identical to graph.to_directed(as_view=True)
530
+ Note that graph.to_directed defaults to `as_view=False`
531
+ while this function always provides a view.
532
+ """
533
+ return graph.to_directed(as_view=True)
534
+
535
+
536
+ def to_undirected(graph):
537
+ """Returns an undirected view of the graph `graph`.
538
+
539
+ Identical to graph.to_undirected(as_view=True)
540
+ Note that graph.to_undirected defaults to `as_view=False`
541
+ while this function always provides a view.
542
+ """
543
+ return graph.to_undirected(as_view=True)
544
+
545
+
546
+ def create_empty_copy(G, with_data=True):
547
+ """Returns a copy of the graph G with all of the edges removed.
548
+
549
+ Parameters
550
+ ----------
551
+ G : graph
552
+ A NetworkX graph
553
+
554
+ with_data : bool (default=True)
555
+ Propagate Graph and Nodes data to the new graph.
556
+
557
+ See Also
558
+ --------
559
+ empty_graph
560
+
561
+ """
562
+ H = G.__class__()
563
+ H.add_nodes_from(G.nodes(data=with_data))
564
+ if with_data:
565
+ H.graph.update(G.graph)
566
+ return H
567
+
568
+
569
+ def set_node_attributes(G, values, name=None):
570
+ """Sets node attributes from a given value or dictionary of values.
571
+
572
+ .. Warning:: The call order of arguments `values` and `name`
573
+ switched between v1.x & v2.x.
574
+
575
+ Parameters
576
+ ----------
577
+ G : NetworkX Graph
578
+
579
+ values : scalar value, dict-like
580
+ What the node attribute should be set to. If `values` is
581
+ not a dictionary, then it is treated as a single attribute value
582
+ that is then applied to every node in `G`. This means that if
583
+ you provide a mutable object, like a list, updates to that object
584
+ will be reflected in the node attribute for every node.
585
+ The attribute name will be `name`.
586
+
587
+ If `values` is a dict or a dict of dict, it should be keyed
588
+ by node to either an attribute value or a dict of attribute key/value
589
+ pairs used to update the node's attributes.
590
+
591
+ name : string (optional, default=None)
592
+ Name of the node attribute to set if values is a scalar.
593
+
594
+ Examples
595
+ --------
596
+ After computing some property of the nodes of a graph, you may want
597
+ to assign a node attribute to store the value of that property for
598
+ each node::
599
+
600
+ >>> G = nx.path_graph(3)
601
+ >>> bb = nx.betweenness_centrality(G)
602
+ >>> isinstance(bb, dict)
603
+ True
604
+ >>> nx.set_node_attributes(G, bb, "betweenness")
605
+ >>> G.nodes[1]["betweenness"]
606
+ 1.0
607
+
608
+ If you provide a list as the second argument, updates to the list
609
+ will be reflected in the node attribute for each node::
610
+
611
+ >>> G = nx.path_graph(3)
612
+ >>> labels = []
613
+ >>> nx.set_node_attributes(G, labels, "labels")
614
+ >>> labels.append("foo")
615
+ >>> G.nodes[0]["labels"]
616
+ ['foo']
617
+ >>> G.nodes[1]["labels"]
618
+ ['foo']
619
+ >>> G.nodes[2]["labels"]
620
+ ['foo']
621
+
622
+ If you provide a dictionary of dictionaries as the second argument,
623
+ the outer dictionary is assumed to be keyed by node to an inner
624
+ dictionary of node attributes for that node::
625
+
626
+ >>> G = nx.path_graph(3)
627
+ >>> attrs = {0: {"attr1": 20, "attr2": "nothing"}, 1: {"attr2": 3}}
628
+ >>> nx.set_node_attributes(G, attrs)
629
+ >>> G.nodes[0]["attr1"]
630
+ 20
631
+ >>> G.nodes[0]["attr2"]
632
+ 'nothing'
633
+ >>> G.nodes[1]["attr2"]
634
+ 3
635
+ >>> G.nodes[2]
636
+ {}
637
+
638
+ Note that if the dictionary contains nodes that are not in `G`, the
639
+ values are silently ignored::
640
+
641
+ >>> G = nx.Graph()
642
+ >>> G.add_node(0)
643
+ >>> nx.set_node_attributes(G, {0: "red", 1: "blue"}, name="color")
644
+ >>> G.nodes[0]["color"]
645
+ 'red'
646
+ >>> 1 in G.nodes
647
+ False
648
+
649
+ """
650
+ # Set node attributes based on type of `values`
651
+ if name is not None: # `values` must not be a dict of dict
652
+ try: # `values` is a dict
653
+ for n, v in values.items():
654
+ try:
655
+ G.nodes[n][name] = v
656
+ except KeyError:
657
+ pass
658
+ except AttributeError: # `values` is a constant
659
+ for n in G:
660
+ G.nodes[n][name] = values
661
+ else: # `values` must be dict of dict
662
+ for n, d in values.items():
663
+ try:
664
+ G.nodes[n].update(d)
665
+ except KeyError:
666
+ pass
667
+ nx._clear_cache(G)
668
+
669
+
670
+ def get_node_attributes(G, name, default=None):
671
+ """Get node attributes from graph
672
+
673
+ Parameters
674
+ ----------
675
+ G : NetworkX Graph
676
+
677
+ name : string
678
+ Attribute name
679
+
680
+ default: object (default=None)
681
+ Default value of the node attribute if there is no value set for that
682
+ node in graph. If `None` then nodes without this attribute are not
683
+ included in the returned dict.
684
+
685
+ Returns
686
+ -------
687
+ Dictionary of attributes keyed by node.
688
+
689
+ Examples
690
+ --------
691
+ >>> G = nx.Graph()
692
+ >>> G.add_nodes_from([1, 2, 3], color="red")
693
+ >>> color = nx.get_node_attributes(G, "color")
694
+ >>> color[1]
695
+ 'red'
696
+ >>> G.add_node(4)
697
+ >>> color = nx.get_node_attributes(G, "color", default="yellow")
698
+ >>> color[4]
699
+ 'yellow'
700
+ """
701
+ if default is not None:
702
+ return {n: d.get(name, default) for n, d in G.nodes.items()}
703
+ return {n: d[name] for n, d in G.nodes.items() if name in d}
704
+
705
+
706
+ def remove_node_attributes(G, *attr_names, nbunch=None):
707
+ """Remove node attributes from all nodes in the graph.
708
+
709
+ Parameters
710
+ ----------
711
+ G : NetworkX Graph
712
+
713
+ *attr_names : List of Strings
714
+ The attribute names to remove from the graph.
715
+
716
+ nbunch : List of Nodes
717
+ Remove the node attributes only from the nodes in this list.
718
+
719
+ Examples
720
+ --------
721
+ >>> G = nx.Graph()
722
+ >>> G.add_nodes_from([1, 2, 3], color="blue")
723
+ >>> nx.get_node_attributes(G, "color")
724
+ {1: 'blue', 2: 'blue', 3: 'blue'}
725
+ >>> nx.remove_node_attributes(G, "color")
726
+ >>> nx.get_node_attributes(G, "color")
727
+ {}
728
+ """
729
+
730
+ if nbunch is None:
731
+ nbunch = G.nodes()
732
+
733
+ for attr in attr_names:
734
+ for n, d in G.nodes(data=True):
735
+ if n in nbunch:
736
+ try:
737
+ del d[attr]
738
+ except KeyError:
739
+ pass
740
+
741
+
742
+ def set_edge_attributes(G, values, name=None):
743
+ """Sets edge attributes from a given value or dictionary of values.
744
+
745
+ .. Warning:: The call order of arguments `values` and `name`
746
+ switched between v1.x & v2.x.
747
+
748
+ Parameters
749
+ ----------
750
+ G : NetworkX Graph
751
+
752
+ values : scalar value, dict-like
753
+ What the edge attribute should be set to. If `values` is
754
+ not a dictionary, then it is treated as a single attribute value
755
+ that is then applied to every edge in `G`. This means that if
756
+ you provide a mutable object, like a list, updates to that object
757
+ will be reflected in the edge attribute for each edge. The attribute
758
+ name will be `name`.
759
+
760
+ If `values` is a dict or a dict of dict, it should be keyed
761
+ by edge tuple to either an attribute value or a dict of attribute
762
+ key/value pairs used to update the edge's attributes.
763
+ For multigraphs, the edge tuples must be of the form ``(u, v, key)``,
764
+ where `u` and `v` are nodes and `key` is the edge key.
765
+ For non-multigraphs, the keys must be tuples of the form ``(u, v)``.
766
+
767
+ name : string (optional, default=None)
768
+ Name of the edge attribute to set if values is a scalar.
769
+
770
+ Examples
771
+ --------
772
+ After computing some property of the edges of a graph, you may want
773
+ to assign an edge attribute to store the value of that property for
774
+ each edge::
775
+
776
+ >>> G = nx.path_graph(3)
777
+ >>> bb = nx.edge_betweenness_centrality(G, normalized=False)
778
+ >>> nx.set_edge_attributes(G, bb, "betweenness")
779
+ >>> G.edges[1, 2]["betweenness"]
780
+ 2.0
781
+
782
+ If you provide a list as the second argument, updates to the list
783
+ will be reflected in the edge attribute for each edge::
784
+
785
+ >>> labels = []
786
+ >>> nx.set_edge_attributes(G, labels, "labels")
787
+ >>> labels.append("foo")
788
+ >>> G.edges[0, 1]["labels"]
789
+ ['foo']
790
+ >>> G.edges[1, 2]["labels"]
791
+ ['foo']
792
+
793
+ If you provide a dictionary of dictionaries as the second argument,
794
+ the entire dictionary will be used to update edge attributes::
795
+
796
+ >>> G = nx.path_graph(3)
797
+ >>> attrs = {(0, 1): {"attr1": 20, "attr2": "nothing"}, (1, 2): {"attr2": 3}}
798
+ >>> nx.set_edge_attributes(G, attrs)
799
+ >>> G[0][1]["attr1"]
800
+ 20
801
+ >>> G[0][1]["attr2"]
802
+ 'nothing'
803
+ >>> G[1][2]["attr2"]
804
+ 3
805
+
806
+ The attributes of one Graph can be used to set those of another.
807
+
808
+ >>> H = nx.path_graph(3)
809
+ >>> nx.set_edge_attributes(H, G.edges)
810
+
811
+ Note that if the dict contains edges that are not in `G`, they are
812
+ silently ignored::
813
+
814
+ >>> G = nx.Graph([(0, 1)])
815
+ >>> nx.set_edge_attributes(G, {(1, 2): {"weight": 2.0}})
816
+ >>> (1, 2) in G.edges()
817
+ False
818
+
819
+ For multigraphs, the `values` dict is expected to be keyed by 3-tuples
820
+ including the edge key::
821
+
822
+ >>> MG = nx.MultiGraph()
823
+ >>> edges = [(0, 1), (0, 1)]
824
+ >>> MG.add_edges_from(edges) # Returns list of edge keys
825
+ [0, 1]
826
+ >>> attributes = {(0, 1, 0): {"cost": 21}, (0, 1, 1): {"cost": 7}}
827
+ >>> nx.set_edge_attributes(MG, attributes)
828
+ >>> MG[0][1][0]["cost"]
829
+ 21
830
+ >>> MG[0][1][1]["cost"]
831
+ 7
832
+
833
+ If MultiGraph attributes are desired for a Graph, you must convert the 3-tuple
834
+ multiedge to a 2-tuple edge and the last multiedge's attribute value will
835
+ overwrite the previous values. Continuing from the previous case we get::
836
+
837
+ >>> H = nx.path_graph([0, 1, 2])
838
+ >>> nx.set_edge_attributes(H, {(u, v): ed for u, v, ed in MG.edges.data()})
839
+ >>> nx.get_edge_attributes(H, "cost")
840
+ {(0, 1): 7}
841
+
842
+ """
843
+ if name is not None:
844
+ # `values` does not contain attribute names
845
+ try:
846
+ # if `values` is a dict using `.items()` => {edge: value}
847
+ if G.is_multigraph():
848
+ for (u, v, key), value in values.items():
849
+ try:
850
+ G._adj[u][v][key][name] = value
851
+ except KeyError:
852
+ pass
853
+ else:
854
+ for (u, v), value in values.items():
855
+ try:
856
+ G._adj[u][v][name] = value
857
+ except KeyError:
858
+ pass
859
+ except AttributeError:
860
+ # treat `values` as a constant
861
+ for u, v, data in G.edges(data=True):
862
+ data[name] = values
863
+ else:
864
+ # `values` is a dict-of-dict of the form {edge: {attr: value}}
865
+ if G.is_multigraph():
866
+ for (u, v, key), d in values.items():
867
+ try:
868
+ G._adj[u][v][key].update(d)
869
+ except KeyError:
870
+ pass
871
+ else:
872
+ for (u, v), d in values.items():
873
+ try:
874
+ G._adj[u][v].update(d)
875
+ except KeyError:
876
+ pass
877
+ nx._clear_cache(G)
878
+
879
+
880
+ def get_edge_attributes(G, name, default=None):
881
+ """Get edge attributes from graph
882
+
883
+ Parameters
884
+ ----------
885
+ G : NetworkX Graph
886
+
887
+ name : string
888
+ Attribute name
889
+
890
+ default: object (default=None)
891
+ Default value of the edge attribute if there is no value set for that
892
+ edge in graph. If `None` then edges without this attribute are not
893
+ included in the returned dict.
894
+
895
+ Returns
896
+ -------
897
+ Dictionary of attributes keyed by edge. For (di)graphs, the keys are
898
+ 2-tuples of the form: (u, v). For multi(di)graphs, the keys are 3-tuples of
899
+ the form: (u, v, key).
900
+
901
+ Examples
902
+ --------
903
+ >>> G = nx.Graph()
904
+ >>> nx.add_path(G, [1, 2, 3], color="red")
905
+ >>> color = nx.get_edge_attributes(G, "color")
906
+ >>> color[(1, 2)]
907
+ 'red'
908
+ >>> G.add_edge(3, 4)
909
+ >>> color = nx.get_edge_attributes(G, "color", default="yellow")
910
+ >>> color[(3, 4)]
911
+ 'yellow'
912
+ """
913
+ if G.is_multigraph():
914
+ edges = G.edges(keys=True, data=True)
915
+ else:
916
+ edges = G.edges(data=True)
917
+ if default is not None:
918
+ return {x[:-1]: x[-1].get(name, default) for x in edges}
919
+ return {x[:-1]: x[-1][name] for x in edges if name in x[-1]}
920
+
921
+
922
+ def remove_edge_attributes(G, *attr_names, ebunch=None):
923
+ """Remove edge attributes from all edges in the graph.
924
+
925
+ Parameters
926
+ ----------
927
+ G : NetworkX Graph
928
+
929
+ *attr_names : List of Strings
930
+ The attribute names to remove from the graph.
931
+
932
+ ebunch : iterable of edges (optional)
+ Remove the edge attributes only from the edges in this iterable.
+
+ Examples
933
+ --------
934
+ >>> G = nx.path_graph(3)
935
+ >>> nx.set_edge_attributes(G, {(u, v): u + v for u, v in G.edges()}, name="weight")
936
+ >>> nx.get_edge_attributes(G, "weight")
937
+ {(0, 1): 1, (1, 2): 3}
938
+ >>> remove_edge_attributes(G, "weight")
939
+ >>> nx.get_edge_attributes(G, "weight")
940
+ {}
941
+ """
942
+ if ebunch is None:
943
+ ebunch = G.edges(keys=True) if G.is_multigraph() else G.edges()
944
+
945
+ for attr in attr_names:
946
+ edges = (
947
+ G.edges(keys=True, data=True) if G.is_multigraph() else G.edges(data=True)
948
+ )
949
+ for *e, d in edges:
950
+ if tuple(e) in ebunch:
951
+ try:
952
+ del d[attr]
953
+ except KeyError:
954
+ pass
955
+
956
+
957
+ def all_neighbors(graph, node):
958
+ """Returns all of the neighbors of a node in the graph.
959
+
960
+ If the graph is directed returns predecessors as well as successors.
961
+
962
+ Parameters
963
+ ----------
964
+ graph : NetworkX graph
965
+ Graph to find neighbors.
966
+
967
+ node : node
968
+ The node whose neighbors will be returned.
969
+
970
+ Returns
971
+ -------
972
+ neighbors : iterator
973
+ Iterator of neighbors
974
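+
+ Examples
+ --------
+ >>> G = nx.path_graph(3)
+ >>> sorted(nx.all_neighbors(G, 1))
+ [0, 2]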
+ """
975
+ if graph.is_directed():
976
+ values = chain(graph.predecessors(node), graph.successors(node))
977
+ else:
978
+ values = graph.neighbors(node)
979
+ return values
980
+
981
+
982
+ def non_neighbors(graph, node):
983
+ """Returns the non-neighbors of the node in the graph.
984
+
985
+ Parameters
986
+ ----------
987
+ graph : NetworkX graph
988
+ Graph to find neighbors.
989
+
990
+ node : node
991
+ The node whose neighbors will be returned.
992
+
993
+ Returns
994
+ -------
995
+ non_neighbors : set
996
+ Set of nodes in the graph that are not neighbors of the node.
997
+ """
998
+ return graph._adj.keys() - graph._adj[node].keys() - {node}
999
+
1000
+
1001
+ def non_edges(graph):
1002
+ """Returns the nonexistent edges in the graph.
1003
+
1004
+ Parameters
1005
+ ----------
1006
+ graph : NetworkX graph.
1007
+ Graph to find nonexistent edges.
1008
+
1009
+ Returns
1010
+ -------
1011
+ non_edges : iterator
1012
+ Iterator of edges that are not in the graph.
1013
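+
+ Examples
+ --------
+ >>> G = nx.DiGraph([(0, 1), (1, 2)])
+ >>> sorted(nx.non_edges(G))
+ [(0, 2), (1, 0), (2, 0), (2, 1)]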
+ """
1014
+ if graph.is_directed():
1015
+ for u in graph:
1016
+ for v in non_neighbors(graph, u):
1017
+ yield (u, v)
1018
+ else:
1019
+ nodes = set(graph)
1020
+ while nodes:
1021
+ u = nodes.pop()
1022
+ for v in nodes - set(graph[u]):
1023
+ yield (u, v)
1024
+
1025
+
1026
+ @not_implemented_for("directed")
1027
+ def common_neighbors(G, u, v):
1028
+ """Returns the common neighbors of two nodes in a graph.
1029
+
1030
+ Parameters
1031
+ ----------
1032
+ G : graph
1033
+ A NetworkX undirected graph.
1034
+
1035
+ u, v : nodes
1036
+ Nodes in the graph.
1037
+
1038
+ Returns
1039
+ -------
1040
+ cnbors : set
1041
+ Set of common neighbors of u and v in the graph.
1042
+
1043
+ Raises
1044
+ ------
1045
+ NetworkXError
1046
+ If u or v is not a node in the graph.
1047
+
1048
+ Examples
1049
+ --------
1050
+ >>> G = nx.complete_graph(5)
1051
+ >>> sorted(nx.common_neighbors(G, 0, 1))
1052
+ [2, 3, 4]
1053
+ """
1054
+ if u not in G:
1055
+ raise nx.NetworkXError("u is not in the graph.")
1056
+ if v not in G:
1057
+ raise nx.NetworkXError("v is not in the graph.")
1058
+
1059
+ return G._adj[u].keys() & G._adj[v].keys() - {u, v}
1060
+
1061
+
1062
+ def is_weighted(G, edge=None, weight="weight"):
1063
+ """Returns True if `G` has weighted edges.
1064
+
1065
+ Parameters
1066
+ ----------
1067
+ G : graph
1068
+ A NetworkX graph.
1069
+
1070
+ edge : tuple, optional
1071
+ A 2-tuple specifying the only edge in `G` that will be tested. If
1072
+ None, then every edge in `G` is tested.
1073
+
1074
+ weight: string, optional
1075
+ The attribute name used to query for edge weights.
1076
+
1077
+ Returns
1078
+ -------
1079
+ bool
1080
+ A boolean signifying if `G`, or the specified edge, is weighted.
1081
+
1082
+ Raises
1083
+ ------
1084
+ NetworkXError
1085
+ If the specified edge does not exist.
1086
+
1087
+ Examples
1088
+ --------
1089
+ >>> G = nx.path_graph(4)
1090
+ >>> nx.is_weighted(G)
1091
+ False
1092
+ >>> nx.is_weighted(G, (2, 3))
1093
+ False
1094
+
1095
+ >>> G = nx.DiGraph()
1096
+ >>> G.add_edge(1, 2, weight=1)
1097
+ >>> nx.is_weighted(G)
1098
+ True
1099
+
1100
+ """
1101
+ if edge is not None:
1102
+ data = G.get_edge_data(*edge)
1103
+ if data is None:
1104
+ msg = f"Edge {edge!r} does not exist."
1105
+ raise nx.NetworkXError(msg)
1106
+ return weight in data
1107
+
1108
+ if is_empty(G):
1109
+ # Special handling required since: all([]) == True
1110
+ return False
1111
+
1112
+ return all(weight in data for u, v, data in G.edges(data=True))
1113
+
1114
+
1115
+ @nx._dispatchable(edge_attrs="weight")
1116
+ def is_negatively_weighted(G, edge=None, weight="weight"):
1117
+ """Returns True if `G` has negatively weighted edges.
1118
+
1119
+ Parameters
1120
+ ----------
1121
+ G : graph
1122
+ A NetworkX graph.
1123
+
1124
+ edge : tuple, optional
1125
+ A 2-tuple specifying the only edge in `G` that will be tested. If
1126
+ None, then every edge in `G` is tested.
1127
+
1128
+ weight: string, optional
1129
+ The attribute name used to query for edge weights.
1130
+
1131
+ Returns
1132
+ -------
1133
+ bool
1134
+ A boolean signifying if `G`, or the specified edge, is negatively
1135
+ weighted.
1136
+
1137
+ Raises
1138
+ ------
1139
+ NetworkXError
1140
+ If the specified edge does not exist.
1141
+
1142
+ Examples
1143
+ --------
1144
+ >>> G = nx.Graph()
1145
+ >>> G.add_edges_from([(1, 3), (2, 4), (2, 6)])
1146
+ >>> G.add_edge(1, 2, weight=4)
1147
+ >>> nx.is_negatively_weighted(G, (1, 2))
1148
+ False
1149
+ >>> G[2][4]["weight"] = -2
1150
+ >>> nx.is_negatively_weighted(G)
1151
+ True
1152
+ >>> G = nx.DiGraph()
1153
+ >>> edges = [("0", "3", 3), ("0", "1", -5), ("1", "0", -2)]
1154
+ >>> G.add_weighted_edges_from(edges)
1155
+ >>> nx.is_negatively_weighted(G)
1156
+ True
1157
+
1158
+ """
1159
+ if edge is not None:
1160
+ data = G.get_edge_data(*edge)
1161
+ if data is None:
1162
+ msg = f"Edge {edge!r} does not exist."
1163
+ raise nx.NetworkXError(msg)
1164
+ return weight in data and data[weight] < 0
1165
+
1166
+ return any(weight in data and data[weight] < 0 for u, v, data in G.edges(data=True))
1167
+
1168
+
1169
+ def is_empty(G):
1170
+ """Returns True if `G` has no edges.
1171
+
1172
+ Parameters
1173
+ ----------
1174
+ G : graph
1175
+ A NetworkX graph.
1176
+
1177
+ Returns
1178
+ -------
1179
+ bool
1180
+ True if `G` has no edges, and False otherwise.
1181
+
1182
+ Notes
1183
+ -----
1184
+ An empty graph can have nodes but not edges. The empty graph with zero
1185
+ nodes is known as the null graph. This is an $O(n)$ operation where n
1186
+ is the number of nodes in the graph.
1187
+
1188
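+ Examples
+ --------
+ >>> G = nx.empty_graph(3)
+ >>> nx.is_empty(G)
+ True
+ >>> G.add_edge(0, 1)
+ >>> nx.is_empty(G)
+ False
+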
+ """
1189
+ return not any(G._adj.values())
1190
+
1191
+
1192
+ def nodes_with_selfloops(G):
1193
+ """Returns an iterator over nodes with self loops.
1194
+
1195
+ A node with a self loop has an edge with both ends adjacent
1196
+ to that node.
1197
+
1198
+ Returns
1199
+ -------
1200
+ nodelist : iterator
1201
+ A iterator over nodes with self loops.
1202
+
1203
+ See Also
1204
+ --------
1205
+ selfloop_edges, number_of_selfloops
1206
+
1207
+ Examples
1208
+ --------
1209
+ >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc
1210
+ >>> G.add_edge(1, 1)
1211
+ >>> G.add_edge(1, 2)
1212
+ >>> list(nx.nodes_with_selfloops(G))
1213
+ [1]
1214
+
1215
+ """
1216
+ return (n for n, nbrs in G._adj.items() if n in nbrs)
1217
+
1218
+
1219
+ def selfloop_edges(G, data=False, keys=False, default=None):
1220
+ """Returns an iterator over selfloop edges.
1221
+
1222
+ A selfloop edge has the same node at both ends.
1223
+
1224
+ Parameters
1225
+ ----------
1226
+ G : graph
1227
+ A NetworkX graph.
1228
+ data : string or bool, optional (default=False)
1229
+ Return selfloop edges as two tuples (u, v) (data=False)
1230
+ or three-tuples (u, v, datadict) (data=True)
1231
+ or three-tuples (u, v, datavalue) (data='attrname')
1232
+ keys : bool, optional (default=False)
1233
+ If True, return edge keys with each edge.
1234
+ default : value, optional (default=None)
1235
+ Value used for edges that don't have the requested attribute.
1236
+ Only relevant if data is not True or False.
1237
+
1238
+ Returns
1239
+ -------
1240
+ edgeiter : iterator over edge tuples
1241
+ An iterator over all selfloop edges.
1242
+
1243
+ See Also
1244
+ --------
1245
+ nodes_with_selfloops, number_of_selfloops
1246
+
1247
+ Examples
1248
+ --------
1249
+ >>> G = nx.MultiGraph() # or Graph, DiGraph, MultiDiGraph, etc
1250
+ >>> ekey = G.add_edge(1, 1)
1251
+ >>> ekey = G.add_edge(1, 2)
1252
+ >>> list(nx.selfloop_edges(G))
1253
+ [(1, 1)]
1254
+ >>> list(nx.selfloop_edges(G, data=True))
1255
+ [(1, 1, {})]
1256
+ >>> list(nx.selfloop_edges(G, keys=True))
1257
+ [(1, 1, 0)]
1258
+ >>> list(nx.selfloop_edges(G, keys=True, data=True))
1259
+ [(1, 1, 0, {})]
1260
+ """
1261
+ if data is True:
1262
+ if G.is_multigraph():
1263
+ if keys is True:
1264
+ return (
1265
+ (n, n, k, d)
1266
+ for n, nbrs in G._adj.items()
1267
+ if n in nbrs
1268
+ for k, d in nbrs[n].items()
1269
+ )
1270
+ else:
1271
+ return (
1272
+ (n, n, d)
1273
+ for n, nbrs in G._adj.items()
1274
+ if n in nbrs
1275
+ for d in nbrs[n].values()
1276
+ )
1277
+ else:
1278
+ return ((n, n, nbrs[n]) for n, nbrs in G._adj.items() if n in nbrs)
1279
+ elif data is not False:
1280
+ if G.is_multigraph():
1281
+ if keys is True:
1282
+ return (
1283
+ (n, n, k, d.get(data, default))
1284
+ for n, nbrs in G._adj.items()
1285
+ if n in nbrs
1286
+ for k, d in nbrs[n].items()
1287
+ )
1288
+ else:
1289
+ return (
1290
+ (n, n, d.get(data, default))
1291
+ for n, nbrs in G._adj.items()
1292
+ if n in nbrs
1293
+ for d in nbrs[n].values()
1294
+ )
1295
+ else:
1296
+ return (
1297
+ (n, n, nbrs[n].get(data, default))
1298
+ for n, nbrs in G._adj.items()
1299
+ if n in nbrs
1300
+ )
1301
+ else:
1302
+ if G.is_multigraph():
1303
+ if keys is True:
1304
+ return (
1305
+ (n, n, k)
1306
+ for n, nbrs in G._adj.items()
1307
+ if n in nbrs
1308
+ for k in nbrs[n]
1309
+ )
1310
+ else:
1311
+ return (
1312
+ (n, n)
1313
+ for n, nbrs in G._adj.items()
1314
+ if n in nbrs
1315
+ for i in range(len(nbrs[n])) # for easy edge removal (#4068)
1316
+ )
1317
+ else:
1318
+ return ((n, n) for n, nbrs in G._adj.items() if n in nbrs)
1319
+
1320
+
1321
+ def number_of_selfloops(G):
1322
+ """Returns the number of selfloop edges.
1323
+
1324
+ A selfloop edge has the same node at both ends.
1325
+
1326
+ Returns
1327
+ -------
1328
+ nloops : int
1329
+ The number of selfloops.
1330
+
1331
+ See Also
1332
+ --------
1333
+ nodes_with_selfloops, selfloop_edges
1334
+
1335
+ Examples
1336
+ --------
1337
+ >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc
1338
+ >>> G.add_edge(1, 1)
1339
+ >>> G.add_edge(1, 2)
1340
+ >>> nx.number_of_selfloops(G)
1341
+ 1
1342
+ """
1343
+ return sum(1 for _ in nx.selfloop_edges(G))
1344
+
1345
+
1346
+ def is_path(G, path):
1347
+ """Returns whether or not the specified path exists.
1348
+
1349
+ For it to return True, every node on the path must exist and
1350
+ each consecutive pair must be connected via one or more edges.
1351
+
1352
+ Parameters
1353
+ ----------
1354
+ G : graph
1355
+ A NetworkX graph.
1356
+
1357
+ path : list
1358
+ A list of nodes which defines the path to traverse
1359
+
1360
+ Returns
1361
+ -------
1362
+ bool
1363
+ True if `path` is a valid path in `G`
1364
+
1365
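+ Examples
+ --------
+ >>> G = nx.path_graph(4)
+ >>> nx.is_path(G, [0, 1, 2])
+ True
+ >>> nx.is_path(G, [0, 2])
+ False
+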
+ """
1366
+ try:
1367
+ return all(nbr in G._adj[node] for node, nbr in nx.utils.pairwise(path))
1368
+ except (KeyError, TypeError):
1369
+ return False
1370
+
1371
+
1372
+ def path_weight(G, path, weight):
1373
+ """Returns total cost associated with specified path and weight
1374
+
1375
+ Parameters
1376
+ ----------
1377
+ G : graph
1378
+ A NetworkX graph.
1379
+
1380
+ path: list
1381
+ A list of node labels which defines the path to traverse
1382
+
1383
+ weight: string
1384
+ A string indicating which edge attribute to use for path cost
1385
+
1386
+ Returns
1387
+ -------
1388
+ cost: int or float
1389
+ An integer or a float representing the total cost with respect to the
1390
+ specified weight of the specified path
1391
+
1392
+ Raises
1393
+ ------
1394
+ NetworkXNoPath
1395
+ If `path` is not a valid path in `G`.
1396
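+
+ Examples
+ --------
+ >>> G = nx.Graph()
+ >>> G.add_edge(0, 1, weight=2)
+ >>> G.add_edge(1, 2, weight=3)
+ >>> nx.path_weight(G, [0, 1, 2], weight="weight")
+ 5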
+ """
1397
+ multigraph = G.is_multigraph()
1398
+ cost = 0
1399
+
1400
+ if not nx.is_path(G, path):
1401
+ raise nx.NetworkXNoPath("path does not exist")
1402
+ for node, nbr in nx.utils.pairwise(path):
1403
+ if multigraph:
1404
+ cost += min(v[weight] for v in G._adj[node][nbr].values())
1405
+ else:
1406
+ cost += G._adj[node][nbr][weight]
1407
+ return cost
minigpt2/lib/python3.10/site-packages/networkx/classes/graph.py ADDED
@@ -0,0 +1,2058 @@
1
+ """Base class for undirected graphs.
2
+
3
+ The Graph class allows any hashable object as a node
4
+ and can associate key/value attribute pairs with each undirected edge.
5
+
6
+ Self-loops are allowed but multiple edges are not (see MultiGraph).
7
+
8
+ For directed graphs see DiGraph and MultiDiGraph.
9
+ """
10
+
11
+ from copy import deepcopy
12
+ from functools import cached_property
13
+
14
+ import networkx as nx
15
+ from networkx import convert
16
+ from networkx.classes.coreviews import AdjacencyView
17
+ from networkx.classes.reportviews import DegreeView, EdgeView, NodeView
18
+ from networkx.exception import NetworkXError
19
+
20
+ __all__ = ["Graph"]
21
+
22
+
23
+ class _CachedPropertyResetterAdj:
24
+ """Data Descriptor class for _adj that resets ``adj`` cached_property when needed
25
+
26
+ This assumes that the ``cached_property`` ``G.adj`` should be reset whenever
27
+ ``G._adj`` is set to a new value.
28
+
29
+ This object sits on a class and ensures that any instance of that
30
+ class clears its cached property "adj" whenever the underlying
31
+ instance attribute "_adj" is set to a new object. It only affects
32
+ the set process of the obj._adj attribute. All get/del operations
33
+ act as they normally would.
34
+
35
+ For info on Data Descriptors see: https://docs.python.org/3/howto/descriptor.html
36
+ """
37
+
38
+ def __set__(self, obj, value):
39
+ od = obj.__dict__
40
+ od["_adj"] = value
41
+ # reset cached properties
42
+ props = ["adj", "edges", "degree"]
43
+ for prop in props:
44
+ if prop in od:
45
+ del od[prop]
46
+
47
+
48
+ class _CachedPropertyResetterNode:
49
+ """Data Descriptor class for _node that resets ``nodes`` cached_property when needed
50
+
51
+ This assumes that the ``cached_property`` ``G.node`` should be reset whenever
52
+ ``G._node`` is set to a new value.
53
+
54
+ This object sits on a class and ensures that any instance of that
55
+ class clears its cached property "nodes" whenever the underlying
56
+ instance attribute "_node" is set to a new object. It only affects
57
+ the set process of the obj._adj attribute. All get/del operations
58
+ act as they normally would.
59
+
60
+ For info on Data Descriptors see: https://docs.python.org/3/howto/descriptor.html
61
+ """
62
+
63
+ def __set__(self, obj, value):
64
+ od = obj.__dict__
65
+ od["_node"] = value
66
+ # reset cached properties
67
+ if "nodes" in od:
68
+ del od["nodes"]
69
+
70
+
71
+ class Graph:
72
+ """
73
+ Base class for undirected graphs.
74
+
75
+ A Graph stores nodes and edges with optional data, or attributes.
76
+
77
+ Graphs hold undirected edges. Self loops are allowed but multiple
78
+ (parallel) edges are not.
79
+
80
+ Nodes can be arbitrary (hashable) Python objects with optional
81
+ key/value attributes, except that `None` is not allowed as a node.
82
+
83
+ Edges are represented as links between nodes with optional
84
+ key/value attributes.
85
+
86
+ Parameters
87
+ ----------
88
+ incoming_graph_data : input graph (optional, default: None)
89
+ Data to initialize graph. If None (default) an empty
90
+ graph is created. The data can be any format that is supported
91
+ by the to_networkx_graph() function, currently including edge list,
92
+ dict of dicts, dict of lists, NetworkX graph, 2D NumPy array, SciPy
93
+ sparse matrix, or PyGraphviz graph.
94
+
95
+ attr : keyword arguments, optional (default= no attributes)
96
+ Attributes to add to graph as key=value pairs.
97
+
98
+ See Also
99
+ --------
100
+ DiGraph
101
+ MultiGraph
102
+ MultiDiGraph
103
+
104
+ Examples
105
+ --------
106
+ Create an empty graph structure (a "null graph") with no nodes and
107
+ no edges.
108
+
109
+ >>> G = nx.Graph()
110
+
111
+ G can be grown in several ways.
112
+
113
+ **Nodes:**
114
+
115
+ Add one node at a time:
116
+
117
+ >>> G.add_node(1)
118
+
119
+ Add the nodes from any container (a list, dict, set or
120
    even the lines from a file or the nodes from another graph).

    >>> G.add_nodes_from([2, 3])
    >>> G.add_nodes_from(range(100, 110))
    >>> H = nx.path_graph(10)
    >>> G.add_nodes_from(H)

    In addition to strings and integers any hashable Python object
    (except None) can represent a node, e.g. a customized node object,
    or even another Graph.

    >>> G.add_node(H)

    **Edges:**

    G can also be grown by adding edges.

    Add one edge,

    >>> G.add_edge(1, 2)

    a list of edges,

    >>> G.add_edges_from([(1, 2), (1, 3)])

    or a collection of edges,

    >>> G.add_edges_from(H.edges)

    If some edges connect nodes not yet in the graph, the nodes
    are added automatically. There are no errors when adding
    nodes or edges that already exist.

    **Attributes:**

    Each graph, node, and edge can hold key/value attribute pairs
    in an associated attribute dictionary (the keys must be hashable).
    By default these are empty, but can be added or changed using
    add_edge, add_node or direct manipulation of the attribute
    dictionaries named graph, node and edge respectively.

    >>> G = nx.Graph(day="Friday")
    >>> G.graph
    {'day': 'Friday'}

    Add node attributes using add_node(), add_nodes_from() or G.nodes

    >>> G.add_node(1, time="5pm")
    >>> G.add_nodes_from([3], time="2pm")
    >>> G.nodes[1]
    {'time': '5pm'}
    >>> G.nodes[1]["room"] = 714  # node must exist already to use G.nodes
    >>> del G.nodes[1]["room"]  # remove attribute
    >>> list(G.nodes(data=True))
    [(1, {'time': '5pm'}), (3, {'time': '2pm'})]

    Add edge attributes using add_edge(), add_edges_from(), subscript
    notation, or G.edges.

    >>> G.add_edge(1, 2, weight=4.7)
    >>> G.add_edges_from([(3, 4), (4, 5)], color="red")
    >>> G.add_edges_from([(1, 2, {"color": "blue"}), (2, 3, {"weight": 8})])
    >>> G[1][2]["weight"] = 4.7
    >>> G.edges[1, 2]["weight"] = 4

    Warning: we protect the graph data structure by making `G.edges` a
    read-only dict-like structure. However, you can assign to attributes
    in e.g. `G.edges[1, 2]`. Thus, use 2 sets of brackets to add/change
    data attributes: `G.edges[1, 2]['weight'] = 4`
    (For multigraphs: `MG.edges[u, v, key][name] = value`).

    **Shortcuts:**

    Many common graph features allow python syntax to speed reporting.

    >>> 1 in G  # check if node in graph
    True
    >>> [n for n in G if n < 3]  # iterate through nodes
    [1, 2]
    >>> len(G)  # number of nodes in graph
    5

    Often the best way to traverse all edges of a graph is via the neighbors.
    The neighbors are reported as an adjacency-dict `G.adj` or `G.adjacency()`

    >>> for n, nbrsdict in G.adjacency():
    ...     for nbr, eattr in nbrsdict.items():
    ...         if "weight" in eattr:
    ...             # Do something useful with the edges
    ...             pass

    But the edges() method is often more convenient:

    >>> for u, v, weight in G.edges.data("weight"):
    ...     if weight is not None:
    ...         # Do something useful with the edges
    ...         pass

    **Reporting:**

    Simple graph information is obtained using object-attributes and methods.
    Reporting typically provides views instead of containers to reduce memory
    usage. The views update as the graph is updated similarly to dict-views.
    The objects `nodes`, `edges` and `adj` provide access to data attributes
    via lookup (e.g. `nodes[n]`, `edges[u, v]`, `adj[u][v]`) and iteration
    (e.g. `nodes.items()`, `nodes.data('color')`,
    `nodes.data('color', default='blue')` and similarly for `edges`)
    Views exist for `nodes`, `edges`, `neighbors()`/`adj` and `degree`.

    For details on these and other miscellaneous methods, see below.

    **Subclasses (Advanced):**

    The Graph class uses a dict-of-dict-of-dict data structure.
    The outer dict (node_dict) holds adjacency information keyed by node.
    The next dict (adjlist_dict) represents the adjacency information and holds
    edge data keyed by neighbor. The inner dict (edge_attr_dict) represents
    the edge data and holds edge attribute values keyed by attribute names.

    Each of these three dicts can be replaced in a subclass by a user defined
    dict-like object. In general, the dict-like features should be
    maintained but extra features can be added. To replace one of the
    dicts create a new graph class by changing the class(!) variable
    holding the factory for that dict-like structure.

    node_dict_factory : function, (default: dict)
        Factory function to be used to create the dict containing node
        attributes, keyed by node id.
        It should require no arguments and return a dict-like object

    node_attr_dict_factory : function, (default: dict)
        Factory function to be used to create the node attribute
        dict which holds attribute values keyed by attribute name.
        It should require no arguments and return a dict-like object

    adjlist_outer_dict_factory : function, (default: dict)
        Factory function to be used to create the outer-most dict
        in the data structure that holds adjacency info keyed by node.
        It should require no arguments and return a dict-like object.

    adjlist_inner_dict_factory : function, (default: dict)
        Factory function to be used to create the adjacency list
        dict which holds edge data keyed by neighbor.
        It should require no arguments and return a dict-like object

    edge_attr_dict_factory : function, (default: dict)
        Factory function to be used to create the edge attribute
        dict which holds attribute values keyed by attribute name.
        It should require no arguments and return a dict-like object.

    graph_attr_dict_factory : function, (default: dict)
        Factory function to be used to create the graph attribute
        dict which holds attribute values keyed by attribute name.
        It should require no arguments and return a dict-like object.

    Typically, if your extension doesn't impact the data structure all
    methods will inherit without issue except: `to_directed/to_undirected`.
    By default these methods create a DiGraph/Graph class and you probably
    want them to create your extension of a DiGraph/Graph. To facilitate
    this we define two class variables that you can set in your subclass.

    to_directed_class : callable, (default: DiGraph or MultiDiGraph)
        Class to create a new graph structure in the `to_directed` method.
        If `None`, a NetworkX class (DiGraph or MultiDiGraph) is used.

    to_undirected_class : callable, (default: Graph or MultiGraph)
        Class to create a new graph structure in the `to_undirected` method.
        If `None`, a NetworkX class (Graph or MultiGraph) is used.

    **Subclassing Example**

    Create a low memory graph class that effectively disallows edge
    attributes by using a single attribute dict for all edges.
    This reduces the memory used, but you lose edge attributes.

    >>> class ThinGraph(nx.Graph):
    ...     all_edge_dict = {"weight": 1}
    ...
    ...     def single_edge_dict(self):
    ...         return self.all_edge_dict
    ...
    ...     edge_attr_dict_factory = single_edge_dict
    >>> G = ThinGraph()
    >>> G.add_edge(2, 1)
    >>> G[2][1]
    {'weight': 1}
    >>> G.add_edge(2, 2)
    >>> G[2][1] is G[2][2]
    True
    """

    __networkx_backend__ = "networkx"

    _adj = _CachedPropertyResetterAdj()
    _node = _CachedPropertyResetterNode()

    node_dict_factory = dict
    node_attr_dict_factory = dict
    adjlist_outer_dict_factory = dict
    adjlist_inner_dict_factory = dict
    edge_attr_dict_factory = dict
    graph_attr_dict_factory = dict

    def to_directed_class(self):
        """Returns the class to use for empty directed copies.

        If you subclass the base classes, use this to designate
        what directed class to use for `to_directed()` copies.
        """
        return nx.DiGraph

    def to_undirected_class(self):
        """Returns the class to use for empty undirected copies.

        If you subclass the base classes, use this to designate
        what undirected class to use for `to_undirected()` copies.
        """
        return Graph

    def __init__(self, incoming_graph_data=None, **attr):
        """Initialize a graph with edges, name, or graph attributes.

        Parameters
        ----------
        incoming_graph_data : input graph (optional, default: None)
            Data to initialize graph. If None (default) an empty
            graph is created. The data can be an edge list, or any
            NetworkX graph object. If the corresponding optional Python
            packages are installed the data can also be a 2D NumPy array, a
            SciPy sparse array, or a PyGraphviz graph.

        attr : keyword arguments, optional (default= no attributes)
            Attributes to add to graph as key=value pairs.

        See Also
        --------
        convert

        Examples
        --------
        >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G = nx.Graph(name="my graph")
        >>> e = [(1, 2), (2, 3), (3, 4)]  # list of edges
        >>> G = nx.Graph(e)

        Arbitrary graph attribute pairs (key=value) may be assigned

        >>> G = nx.Graph(e, day="Friday")
        >>> G.graph
        {'day': 'Friday'}

        """
        self.graph = self.graph_attr_dict_factory()  # dictionary for graph attributes
        self._node = self.node_dict_factory()  # empty node attribute dict
        self._adj = self.adjlist_outer_dict_factory()  # empty adjacency dict
        self.__networkx_cache__ = {}
        # attempt to load graph with data
        if incoming_graph_data is not None:
            convert.to_networkx_graph(incoming_graph_data, create_using=self)
        # load graph attributes (must be after convert)
        self.graph.update(attr)

    @cached_property
    def adj(self):
        """Graph adjacency object holding the neighbors of each node.

        This object is a read-only dict-like structure with node keys
        and neighbor-dict values. The neighbor-dict is keyed by neighbor
        to the edge-data-dict. So `G.adj[3][2]['color'] = 'blue'` sets
        the color of the edge `(3, 2)` to `"blue"`.

        Iterating over G.adj behaves like a dict. Useful idioms include
        `for nbr, datadict in G.adj[n].items():`.

        The neighbor information is also provided by subscripting the graph.
        So `for nbr, foovalue in G[node].data('foo', default=1):` works.

        For directed graphs, `G.adj` holds outgoing (successor) info.
        """
        return AdjacencyView(self._adj)

    @property
    def name(self):
        """String identifier of the graph.

        This graph attribute appears in the attribute dict G.graph
        keyed by the string `"name"`, as well as an attribute (technically
        a property) `G.name`. This is entirely user controlled.
        """
        return self.graph.get("name", "")

    @name.setter
    def name(self, s):
        self.graph["name"] = s
        nx._clear_cache(self)

    def __str__(self):
        """Returns a short summary of the graph.

        Returns
        -------
        info : string
            Graph information including the graph name (if any), graph type, and the
            number of nodes and edges.

        Examples
        --------
        >>> G = nx.Graph(name="foo")
        >>> str(G)
        "Graph named 'foo' with 0 nodes and 0 edges"

        >>> G = nx.path_graph(3)
        >>> str(G)
        'Graph with 3 nodes and 2 edges'

        """
        return "".join(
            [
                type(self).__name__,
                f" named {self.name!r}" if self.name else "",
                f" with {self.number_of_nodes()} nodes and {self.number_of_edges()} edges",
            ]
        )

    def __iter__(self):
        """Iterate over the nodes. Use: 'for n in G'.

        Returns
        -------
        niter : iterator
            An iterator over all nodes in the graph.

        Examples
        --------
        >>> G = nx.path_graph(4)  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> [n for n in G]
        [0, 1, 2, 3]
        >>> list(G)
        [0, 1, 2, 3]
        """
        return iter(self._node)

    def __contains__(self, n):
        """Returns True if n is a node, False otherwise. Use: 'n in G'.

        Examples
        --------
        >>> G = nx.path_graph(4)  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> 1 in G
        True
        """
        try:
            return n in self._node
        except TypeError:
            return False

    def __len__(self):
        """Returns the number of nodes in the graph. Use: 'len(G)'.

        Returns
        -------
        nnodes : int
            The number of nodes in the graph.

        See Also
        --------
        number_of_nodes: identical method
        order: identical method

        Examples
        --------
        >>> G = nx.path_graph(4)  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> len(G)
        4

        """
        return len(self._node)

    def __getitem__(self, n):
        """Returns a dict of neighbors of node n. Use: 'G[n]'.

        Parameters
        ----------
        n : node
            A node in the graph.

        Returns
        -------
        adj_dict : dictionary
            The adjacency dictionary for nodes connected to n.

        Notes
        -----
        G[n] is the same as G.adj[n] and similar to G.neighbors(n)
        (which is an iterator over G.adj[n])

        Examples
        --------
        >>> G = nx.path_graph(4)  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G[0]
        AtlasView({1: {}})
        """
        return self.adj[n]
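
    # Illustrative sketch (not part of the library source): `G[n]` and
    # `G.adj[n]` expose the same adjacency mapping for node n, so either
    # spelling supports neighbor iteration and edge-data lookup.

```python
import networkx as nx

G = nx.Graph()
G.add_edge(0, 1, weight=2)
G.add_edge(0, 2)

print(sorted(G[0]))          # neighbors of 0
print(G[0][1]["weight"])     # edge-data lookup via subscript
print(list(G[0]) == list(G.adj[0]))  # same underlying mapping
```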

    def add_node(self, node_for_adding, **attr):
        """Add a single node `node_for_adding` and update node attributes.

        Parameters
        ----------
        node_for_adding : node
            A node can be any hashable Python object except None.
        attr : keyword arguments, optional
            Set or change node attributes using key=value.

        See Also
        --------
        add_nodes_from

        Examples
        --------
        >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_node(1)
        >>> G.add_node("Hello")
        >>> K3 = nx.Graph([(0, 1), (1, 2), (2, 0)])
        >>> G.add_node(K3)
        >>> G.number_of_nodes()
        3

        Use keywords to set/change node attributes:

        >>> G.add_node(1, size=10)
        >>> G.add_node(3, weight=0.4, UTM=("13S", 382871, 3972649))

        Notes
        -----
        A hashable object is one that can be used as a key in a Python
        dictionary. This includes strings, numbers, tuples of strings
        and numbers, etc.

        On many platforms hashable items also include mutables such as
        NetworkX Graphs, though one should be careful that the hash
        doesn't change on mutables.
        """
        if node_for_adding not in self._node:
            if node_for_adding is None:
                raise ValueError("None cannot be a node")
            self._adj[node_for_adding] = self.adjlist_inner_dict_factory()
            attr_dict = self._node[node_for_adding] = self.node_attr_dict_factory()
            attr_dict.update(attr)
        else:  # update attr even if node already exists
            self._node[node_for_adding].update(attr)
        nx._clear_cache(self)

    def add_nodes_from(self, nodes_for_adding, **attr):
        """Add multiple nodes.

        Parameters
        ----------
        nodes_for_adding : iterable container
            A container of nodes (list, dict, set, etc.).
            OR
            A container of (node, attribute dict) tuples.
            Node attributes are updated using the attribute dict.
        attr : keyword arguments, optional (default= no attributes)
            Update attributes for all nodes in nodes.
            Node attributes specified in nodes as a tuple take
            precedence over attributes specified via keyword arguments.

        See Also
        --------
        add_node

        Notes
        -----
        When adding nodes from an iterator over the graph you are changing,
        a `RuntimeError` can be raised with message:
        `RuntimeError: dictionary changed size during iteration`. This
        happens when the graph's underlying dictionary is modified during
        iteration. To avoid this error, evaluate the iterator into a separate
        object, e.g. by using `list(iterator_of_nodes)`, and pass this
        object to `G.add_nodes_from`.

        Examples
        --------
        >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_nodes_from("Hello")
        >>> K3 = nx.Graph([(0, 1), (1, 2), (2, 0)])
        >>> G.add_nodes_from(K3)
        >>> sorted(G.nodes(), key=str)
        [0, 1, 2, 'H', 'e', 'l', 'o']

        Use keywords to update specific node attributes for every node.

        >>> G.add_nodes_from([1, 2], size=10)
        >>> G.add_nodes_from([3, 4], weight=0.4)

        Use (node, attrdict) tuples to update attributes for specific nodes.

        >>> G.add_nodes_from([(1, dict(size=11)), (2, {"color": "blue"})])
        >>> G.nodes[1]["size"]
        11
        >>> H = nx.Graph()
        >>> H.add_nodes_from(G.nodes(data=True))
        >>> H.nodes[1]["size"]
        11

        Evaluate an iterator over a graph if using it to modify the same graph

        >>> G = nx.Graph([(0, 1), (1, 2), (3, 4)])
        >>> # wrong way - will raise RuntimeError
        >>> # G.add_nodes_from(n + 1 for n in G.nodes)
        >>> # correct way
        >>> G.add_nodes_from(list(n + 1 for n in G.nodes))
        """
        for n in nodes_for_adding:
            try:
                newnode = n not in self._node
                newdict = attr
            except TypeError:
                n, ndict = n
                newnode = n not in self._node
                newdict = attr.copy()
                newdict.update(ndict)
            if newnode:
                if n is None:
                    raise ValueError("None cannot be a node")
                self._adj[n] = self.adjlist_inner_dict_factory()
                self._node[n] = self.node_attr_dict_factory()
            self._node[n].update(newdict)
        nx._clear_cache(self)

    def remove_node(self, n):
        """Remove node n.

        Removes the node n and all adjacent edges.
        Attempting to remove a nonexistent node will raise an exception.

        Parameters
        ----------
        n : node
            A node in the graph

        Raises
        ------
        NetworkXError
            If n is not in the graph.

        See Also
        --------
        remove_nodes_from

        Examples
        --------
        >>> G = nx.path_graph(3)  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> list(G.edges)
        [(0, 1), (1, 2)]
        >>> G.remove_node(1)
        >>> list(G.edges)
        []

        """
        adj = self._adj
        try:
            nbrs = list(adj[n])  # list handles self-loops (allows mutation)
            del self._node[n]
        except KeyError as err:  # NetworkXError if n not in self
            raise NetworkXError(f"The node {n} is not in the graph.") from err
        for u in nbrs:
            del adj[u][n]  # remove all edges n-u in graph
        del adj[n]  # now remove node
        nx._clear_cache(self)

    def remove_nodes_from(self, nodes):
        """Remove multiple nodes.

        Parameters
        ----------
        nodes : iterable container
            A container of nodes (list, dict, set, etc.). If a node
            in the container is not in the graph it is silently
            ignored.

        See Also
        --------
        remove_node

        Notes
        -----
        When removing nodes from an iterator over the graph you are changing,
        a `RuntimeError` will be raised with message:
        `RuntimeError: dictionary changed size during iteration`. This
        happens when the graph's underlying dictionary is modified during
        iteration. To avoid this error, evaluate the iterator into a separate
        object, e.g. by using `list(iterator_of_nodes)`, and pass this
        object to `G.remove_nodes_from`.

        Examples
        --------
        >>> G = nx.path_graph(3)  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> e = list(G.nodes)
        >>> e
        [0, 1, 2]
        >>> G.remove_nodes_from(e)
        >>> list(G.nodes)
        []

        Evaluate an iterator over a graph if using it to modify the same graph

        >>> G = nx.Graph([(0, 1), (1, 2), (3, 4)])
        >>> # this command will fail, as the graph's dict is modified during iteration
        >>> # G.remove_nodes_from(n for n in G.nodes if n < 2)
        >>> # this command will work, since the dictionary underlying graph is not modified
        >>> G.remove_nodes_from(list(n for n in G.nodes if n < 2))
        """
        adj = self._adj
        for n in nodes:
            try:
                del self._node[n]
                for u in list(adj[n]):  # list handles self-loops
                    del adj[u][n]  # (allows mutation of dict in loop)
                del adj[n]
            except KeyError:
                pass
        nx._clear_cache(self)

    @cached_property
    def nodes(self):
        """A NodeView of the Graph as G.nodes or G.nodes().

        Can be used as `G.nodes` for data lookup and for set-like operations.
        Can also be used as `G.nodes(data='color', default=None)` to return a
        NodeDataView which reports specific node data but no set operations.
        It presents a dict-like interface as well with `G.nodes.items()`
        iterating over `(node, nodedata)` 2-tuples and `G.nodes[3]['foo']`
        providing the value of the `foo` attribute for node `3`. In addition,
        a view `G.nodes.data('foo')` provides a dict-like interface to the
        `foo` attribute of each node. `G.nodes.data('foo', default=1)`
        provides a default for nodes that do not have attribute `foo`.

        Parameters
        ----------
        data : string or bool, optional (default=False)
            The node attribute returned in 2-tuple (n, ddict[data]).
            If True, return entire node attribute dict as (n, ddict).
            If False, return just the nodes n.

        default : value, optional (default=None)
            Value used for nodes that don't have the requested attribute.
            Only relevant if data is not True or False.

        Returns
        -------
        NodeView
            Allows set-like operations over the nodes as well as node
            attribute dict lookup and calling to get a NodeDataView.
            A NodeDataView iterates over `(n, data)` and has no set operations.
            A NodeView iterates over `n` and includes set operations.

            When called, if data is False, an iterator over nodes.
            Otherwise an iterator of 2-tuples (node, attribute value)
            where the attribute is specified in `data`.
            If data is True then the attribute becomes the
            entire data dictionary.

        Notes
        -----
        If your node data is not needed, it is simpler and equivalent
        to use the expression ``for n in G``, or ``list(G)``.

        Examples
        --------
        There are two simple ways of getting a list of all nodes in the graph:

        >>> G = nx.path_graph(3)
        >>> list(G.nodes)
        [0, 1, 2]
        >>> list(G)
        [0, 1, 2]

        To get the node data along with the nodes:

        >>> G.add_node(1, time="5pm")
        >>> G.nodes[0]["foo"] = "bar"
        >>> list(G.nodes(data=True))
        [(0, {'foo': 'bar'}), (1, {'time': '5pm'}), (2, {})]
        >>> list(G.nodes.data())
        [(0, {'foo': 'bar'}), (1, {'time': '5pm'}), (2, {})]

        >>> list(G.nodes(data="foo"))
        [(0, 'bar'), (1, None), (2, None)]
        >>> list(G.nodes.data("foo"))
        [(0, 'bar'), (1, None), (2, None)]

        >>> list(G.nodes(data="time"))
        [(0, None), (1, '5pm'), (2, None)]
        >>> list(G.nodes.data("time"))
        [(0, None), (1, '5pm'), (2, None)]

        >>> list(G.nodes(data="time", default="Not Available"))
        [(0, 'Not Available'), (1, '5pm'), (2, 'Not Available')]
        >>> list(G.nodes.data("time", default="Not Available"))
        [(0, 'Not Available'), (1, '5pm'), (2, 'Not Available')]

        If some of your nodes have an attribute and the rest are assumed
        to have a default attribute value you can create a dictionary
        from node/attribute pairs using the `default` keyword argument
        to guarantee the value is never None::

            >>> G = nx.Graph()
            >>> G.add_node(0)
            >>> G.add_node(1, weight=2)
            >>> G.add_node(2, weight=3)
            >>> dict(G.nodes(data="weight", default=1))
            {0: 1, 1: 2, 2: 3}

        """
        return NodeView(self)

    def number_of_nodes(self):
        """Returns the number of nodes in the graph.

        Returns
        -------
        nnodes : int
            The number of nodes in the graph.

        See Also
        --------
        order: identical method
        __len__: identical method

        Examples
        --------
        >>> G = nx.path_graph(3)  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.number_of_nodes()
        3
        """
        return len(self._node)

    def order(self):
        """Returns the number of nodes in the graph.

        Returns
        -------
        nnodes : int
            The number of nodes in the graph.

        See Also
        --------
        number_of_nodes: identical method
        __len__: identical method

        Examples
        --------
        >>> G = nx.path_graph(3)  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.order()
        3
        """
        return len(self._node)

    def has_node(self, n):
        """Returns True if the graph contains the node n.

        Identical to `n in G`

        Parameters
        ----------
        n : node

        Examples
        --------
        >>> G = nx.path_graph(3)  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.has_node(0)
        True

        It is more readable and simpler to use

        >>> 0 in G
        True

        """
        try:
            return n in self._node
        except TypeError:
            return False

    def add_edge(self, u_of_edge, v_of_edge, **attr):
        """Add an edge between u and v.

        The nodes u and v will be automatically added if they are
        not already in the graph.

        Edge attributes can be specified with keywords or by directly
        accessing the edge's attribute dictionary. See examples below.

        Parameters
        ----------
        u_of_edge, v_of_edge : nodes
            Nodes can be, for example, strings or numbers.
            Nodes must be hashable (and not None) Python objects.
        attr : keyword arguments, optional
            Edge data (or labels or objects) can be assigned using
            keyword arguments.

        See Also
        --------
        add_edges_from : add a collection of edges

        Notes
        -----
        Adding an edge that already exists updates the edge data.

        Many NetworkX algorithms designed for weighted graphs use
        an edge attribute (by default `weight`) to hold a numerical value.

        Examples
        --------
        The following all add the edge e=(1, 2) to graph G:

        >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> e = (1, 2)
        >>> G.add_edge(1, 2)  # explicit two-node form
        >>> G.add_edge(*e)  # single edge as tuple of two nodes
        >>> G.add_edges_from([(1, 2)])  # add edges from iterable container

        Associate data to edges using keywords:

        >>> G.add_edge(1, 2, weight=3)
        >>> G.add_edge(1, 3, weight=7, capacity=15, length=342.7)

        For non-string attribute keys, use subscript notation.

        >>> G.add_edge(1, 2)
        >>> G[1][2].update({0: 5})
        >>> G.edges[1, 2].update({0: 5})
        """
        u, v = u_of_edge, v_of_edge
        # add nodes
        if u not in self._node:
            if u is None:
                raise ValueError("None cannot be a node")
            self._adj[u] = self.adjlist_inner_dict_factory()
            self._node[u] = self.node_attr_dict_factory()
        if v not in self._node:
            if v is None:
                raise ValueError("None cannot be a node")
            self._adj[v] = self.adjlist_inner_dict_factory()
            self._node[v] = self.node_attr_dict_factory()
        # add the edge
        datadict = self._adj[u].get(v, self.edge_attr_dict_factory())
        datadict.update(attr)
        self._adj[u][v] = datadict
        self._adj[v][u] = datadict
        nx._clear_cache(self)
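
    # Illustrative sketch (not part of the library source): re-adding an
    # existing edge merges the new keyword attributes into the existing
    # edge-data dict rather than replacing it, and no duplicate edge is
    # created on a plain Graph.

```python
import networkx as nx

G = nx.Graph()
G.add_edge(1, 2, weight=3)
G.add_edge(1, 2, color="red")  # same edge, extra attribute merged in

print(G.edges[1, 2])       # both attributes present
print(G.number_of_edges())
```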

    def add_edges_from(self, ebunch_to_add, **attr):
        """Add all the edges in ebunch_to_add.

        Parameters
        ----------
        ebunch_to_add : container of edges
            Each edge given in the container will be added to the
            graph. The edges must be given as 2-tuples (u, v) or
            3-tuples (u, v, d) where d is a dictionary containing edge data.
        attr : keyword arguments, optional
            Edge data (or labels or objects) can be assigned using
            keyword arguments.

        See Also
        --------
        add_edge : add a single edge
        add_weighted_edges_from : convenient way to add weighted edges

        Notes
        -----
        Adding the same edge twice has no effect but any edge data
        will be updated when each duplicate edge is added.

        Edge attributes specified in an ebunch take precedence over
        attributes specified via keyword arguments.

        When adding edges from an iterator over the graph you are changing,
        a `RuntimeError` can be raised with message:
        `RuntimeError: dictionary changed size during iteration`. This
        happens when the graph's underlying dictionary is modified during
        iteration. To avoid this error, evaluate the iterator into a separate
        object, e.g. by using `list(iterator_of_edges)`, and pass this
        object to `G.add_edges_from`.

        Examples
        --------
        >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_edges_from([(0, 1), (1, 2)])  # using a list of edge tuples
        >>> e = zip(range(0, 3), range(1, 4))
        >>> G.add_edges_from(e)  # Add the path graph 0-1-2-3

        Associate data to edges

        >>> G.add_edges_from([(1, 2), (2, 3)], weight=3)
        >>> G.add_edges_from([(3, 4), (1, 4)], label="WN2898")

        Evaluate an iterator over a graph if using it to modify the same graph

        >>> G = nx.Graph([(1, 2), (2, 3), (3, 4)])
        >>> # Grow graph by one new node, adding edges to all existing nodes.
        >>> # wrong way - will raise RuntimeError
        >>> # G.add_edges_from(((5, n) for n in G.nodes))
        >>> # correct way - note that there will be no self-edge for node 5
        >>> G.add_edges_from(list((5, n) for n in G.nodes))
        """
        for e in ebunch_to_add:
            ne = len(e)
            if ne == 3:
                u, v, dd = e
            elif ne == 2:
                u, v = e
                dd = {}  # doesn't need edge_attr_dict_factory
            else:
                raise NetworkXError(f"Edge tuple {e} must be a 2-tuple or 3-tuple.")
            if u not in self._node:
                if u is None:
                    raise ValueError("None cannot be a node")
                self._adj[u] = self.adjlist_inner_dict_factory()
                self._node[u] = self.node_attr_dict_factory()
            if v not in self._node:
                if v is None:
                    raise ValueError("None cannot be a node")
                self._adj[v] = self.adjlist_inner_dict_factory()
                self._node[v] = self.node_attr_dict_factory()
            datadict = self._adj[u].get(v, self.edge_attr_dict_factory())
            datadict.update(attr)
            datadict.update(dd)
            self._adj[u][v] = datadict
            self._adj[v][u] = datadict
        nx._clear_cache(self)

    def add_weighted_edges_from(self, ebunch_to_add, weight="weight", **attr):
        """Add weighted edges in `ebunch_to_add` with specified weight attr

        Parameters
        ----------
        ebunch_to_add : container of edges
            Each edge given in the list or container will be added
            to the graph. The edges must be given as 3-tuples (u, v, w)
            where w is a number.
        weight : string, optional (default= 'weight')
            The attribute name for the edge weights to be added.
        attr : keyword arguments, optional (default= no attributes)
            Edge attributes to add/update for all edges.

        See Also
        --------
        add_edge : add a single edge
        add_edges_from : add multiple edges

        Notes
        -----
        Adding the same edge twice for Graph/DiGraph simply updates
        the edge data. For MultiGraph/MultiDiGraph, duplicate edges
        are stored.

        When adding edges from an iterator over the graph you are changing,
        a `RuntimeError` can be raised with message:
        `RuntimeError: dictionary changed size during iteration`. This
        happens when the graph's underlying dictionary is modified during
        iteration. To avoid this error, evaluate the iterator into a separate
        object, e.g. by using `list(iterator_of_edges)`, and pass this
        object to `G.add_weighted_edges_from`.

        Examples
        --------
        >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_weighted_edges_from([(0, 1, 3.0), (1, 2, 7.5)])

        Evaluate an iterator over edges before passing it

        >>> G = nx.Graph([(1, 2), (2, 3), (3, 4)])
        >>> weight = 0.1
        >>> # Grow graph by one new node, adding edges to all existing nodes.
        >>> # wrong way - will raise RuntimeError
        >>> # G.add_weighted_edges_from(((5, n, weight) for n in G.nodes))
        >>> # correct way - note that there will be no self-edge for node 5
        >>> G.add_weighted_edges_from(list((5, n, weight) for n in G.nodes))
        """
        self.add_edges_from(((u, v, {weight: d}) for u, v, d in ebunch_to_add), **attr)
        nx._clear_cache(self)
1106
+
1107
+ def remove_edge(self, u, v):
1108
+ """Remove the edge between u and v.
1109
+
1110
+ Parameters
1111
+ ----------
1112
+ u, v : nodes
1113
+ Remove the edge between nodes u and v.
1114
+
1115
+ Raises
1116
+ ------
1117
+ NetworkXError
1118
+ If there is not an edge between u and v.
1119
+
1120
+ See Also
1121
+ --------
1122
+ remove_edges_from : remove a collection of edges
1123
+
1124
+ Examples
1125
+ --------
1126
+ >>> G = nx.path_graph(4) # or DiGraph, etc
1127
+ >>> G.remove_edge(0, 1)
1128
+ >>> e = (1, 2)
1129
+ >>> G.remove_edge(*e) # unpacks e from an edge tuple
1130
+ >>> e = (2, 3, {"weight": 7}) # an edge with attribute data
1131
+ >>> G.remove_edge(*e[:2]) # select first part of edge tuple
1132
+ """
1133
+ try:
1134
+ del self._adj[u][v]
1135
+ if u != v: # self-loop needs only one entry removed
1136
+ del self._adj[v][u]
1137
+ except KeyError as err:
1138
+ raise NetworkXError(f"The edge {u}-{v} is not in the graph") from err
1139
+ nx._clear_cache(self)
1140
+
1141
+ def remove_edges_from(self, ebunch):
1142
+ """Remove all edges specified in ebunch.
1143
+
1144
+ Parameters
1145
+ ----------
1146
+ ebunch: list or container of edge tuples
1147
+ Each edge given in the list or container will be removed
1148
+ from the graph. The edges can be:
1149
+
1150
+ - 2-tuples (u, v) edge between u and v.
1151
+ - 3-tuples (u, v, k) where k is ignored.
1152
+
1153
+ See Also
1154
+ --------
1155
+ remove_edge : remove a single edge
1156
+
1157
+ Notes
1158
+ -----
1159
+ Will fail silently if an edge in ebunch is not in the graph.
1160
+
1161
+ Examples
1162
+ --------
1163
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1164
+ >>> ebunch = [(1, 2), (2, 3)]
1165
+ >>> G.remove_edges_from(ebunch)
1166
+ """
1167
+ adj = self._adj
1168
+ for e in ebunch:
1169
+ u, v = e[:2] # ignore edge data if present
1170
+ if u in adj and v in adj[u]:
1171
+ del adj[u][v]
1172
+ if u != v: # self-loop needs only one entry removed
1173
+ del adj[v][u]
1174
+ nx._clear_cache(self)
1175
+
1176
+ def update(self, edges=None, nodes=None):
1177
+ """Update the graph using nodes/edges/graphs as input.
1178
+
1179
+ Like dict.update, this method takes a graph as input, adding the
1180
+ graph's nodes and edges to this graph. It can also take two inputs:
1181
+ edges and nodes. Finally, it can take either edges or nodes.
1182
+ To specify only nodes the keyword `nodes` must be used.
1183
+
1184
+ The collections of edges and nodes are treated similarly to
1185
+ the add_edges_from/add_nodes_from methods. When iterated, they
1186
+ should yield 2-tuples (u, v) or 3-tuples (u, v, datadict).
1187
+
1188
+ Parameters
1189
+ ----------
1190
+ edges : Graph object, collection of edges, or None
1191
+ The first parameter can be a graph or some edges. If it has
1192
+ attributes `nodes` and `edges`, then it is taken to be a
1193
+ Graph-like object and those attributes are used as collections
1194
+ of nodes and edges to be added to the graph.
1195
+ If the first parameter does not have those attributes, it is
1196
+ treated as a collection of edges and added to the graph.
1197
+ If the first argument is None, no edges are added.
1198
+ nodes : collection of nodes, or None
1199
+ The second parameter is treated as a collection of nodes
1200
+ to be added to the graph unless it is None.
1201
+ If `edges is None` and `nodes is None` an exception is raised.
1202
+ If the first parameter is a Graph, then `nodes` is ignored.
1203
+
1204
+ Examples
1205
+ --------
1206
+ >>> G = nx.path_graph(5)
1207
+ >>> G.update(nx.complete_graph(range(4, 10)))
1208
+ >>> from itertools import combinations
1209
+ >>> edges = (
1210
+ ... (u, v, {"power": u * v})
1211
+ ... for u, v in combinations(range(10, 20), 2)
1212
+ ... if u * v < 225
1213
+ ... )
1214
+ >>> nodes = [1000] # for singleton, use a container
1215
+ >>> G.update(edges, nodes)
1216
+
1217
+ Notes
1218
+ -----
1219
+ If you want to update the graph using an adjacency structure
1220
+ it is straightforward to obtain the edges/nodes from adjacency.
1221
+ The following examples provide common cases; your adjacency may
1222
+ be slightly different and require tweaks of these examples::
1223
+
1224
+ >>> # dict-of-set/list/tuple
1225
+ >>> adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
1226
+ >>> e = [(u, v) for u, nbrs in adj.items() for v in nbrs]
1227
+ >>> G.update(edges=e, nodes=adj)
1228
+
1229
+ >>> DG = nx.DiGraph()
1230
+ >>> # dict-of-dict-of-attribute
1231
+ >>> adj = {1: {2: 1.3, 3: 0.7}, 2: {1: 1.4}, 3: {1: 0.7}}
1232
+ >>> e = [
1233
+ ... (u, v, {"weight": d})
1234
+ ... for u, nbrs in adj.items()
1235
+ ... for v, d in nbrs.items()
1236
+ ... ]
1237
+ >>> DG.update(edges=e, nodes=adj)
1238
+
1239
+ >>> # dict-of-dict-of-dict
1240
+ >>> adj = {1: {2: {"weight": 1.3}, 3: {"color": 0.7, "weight": 1.2}}}
1241
+ >>> e = [
1242
+ ... (u, v, d)
1243
+ ... for u, nbrs in adj.items()
1244
+ ... for v, d in nbrs.items()
1245
+ ... ]
1246
+ >>> DG.update(edges=e, nodes=adj)
1247
+
1248
+ >>> # predecessor adjacency (dict-of-set)
1249
+ >>> pred = {1: {2, 3}, 2: {3}, 3: {3}}
1250
+ >>> e = [(v, u) for u, nbrs in pred.items() for v in nbrs]
1251
+
1252
+ >>> # MultiGraph dict-of-dict-of-dict-of-attribute
1253
+ >>> MDG = nx.MultiDiGraph()
1254
+ >>> adj = {
1255
+ ... 1: {2: {0: {"weight": 1.3}, 1: {"weight": 1.2}}},
1256
+ ... 3: {2: {0: {"weight": 0.7}}},
1257
+ ... }
1258
+ >>> e = [
1259
+ ... (u, v, ekey, d)
1260
+ ... for u, nbrs in adj.items()
1261
+ ... for v, keydict in nbrs.items()
1262
+ ... for ekey, d in keydict.items()
1263
+ ... ]
1264
+ >>> MDG.update(edges=e)
1265
+
1266
+ See Also
1267
+ --------
1268
+ add_edges_from: add multiple edges to a graph
1269
+ add_nodes_from: add multiple nodes to a graph
1270
+ """
1271
+ if edges is not None:
1272
+ if nodes is not None:
1273
+ self.add_nodes_from(nodes)
1274
+ self.add_edges_from(edges)
1275
+ else:
1276
+ # check if edges is a Graph object
1277
+ try:
1278
+ graph_nodes = edges.nodes
1279
+ graph_edges = edges.edges
1280
+ except AttributeError:
1281
+ # edges not Graph-like
1282
+ self.add_edges_from(edges)
1283
+ else: # edges is Graph-like
1284
+ self.add_nodes_from(graph_nodes.data())
1285
+ self.add_edges_from(graph_edges.data())
1286
+ self.graph.update(edges.graph)
1287
+ elif nodes is not None:
1288
+ self.add_nodes_from(nodes)
1289
+ else:
1290
+ raise NetworkXError("update needs nodes or edges input")
1291
+
1292
+ def has_edge(self, u, v):
1293
+ """Returns True if the edge (u, v) is in the graph.
1294
+
1295
+ This is the same as `v in G[u]` without KeyError exceptions.
1296
+
1297
+ Parameters
1298
+ ----------
1299
+ u, v : nodes
1300
+ Nodes can be, for example, strings or numbers.
1301
+ Nodes must be hashable (and not None) Python objects.
1302
+
1303
+ Returns
1304
+ -------
1305
+ edge_ind : bool
1306
+ True if edge is in the graph, False otherwise.
1307
+
1308
+ Examples
1309
+ --------
1310
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1311
+ >>> G.has_edge(0, 1) # using two nodes
1312
+ True
1313
+ >>> e = (0, 1)
1314
+ >>> G.has_edge(*e) # e is a 2-tuple (u, v)
1315
+ True
1316
+ >>> e = (0, 1, {"weight": 7})
1317
+ >>> G.has_edge(*e[:2]) # e is a 3-tuple (u, v, data_dictionary)
1318
+ True
1319
+
1320
+ The following syntaxes are equivalent:
1321
+
1322
+ >>> G.has_edge(0, 1)
1323
+ True
1324
+ >>> 1 in G[0] # though this gives KeyError if 0 not in G
1325
+ True
1326
+
1327
+ """
1328
+ try:
1329
+ return v in self._adj[u]
1330
+ except KeyError:
1331
+ return False
1332
+
1333
+ def neighbors(self, n):
1334
+ """Returns an iterator over all neighbors of node n.
1335
+
1336
+ This is identical to `iter(G[n])`.
1337
+
1338
+ Parameters
1339
+ ----------
1340
+ n : node
1341
+ A node in the graph
1342
+
1343
+ Returns
1344
+ -------
1345
+ neighbors : iterator
1346
+ An iterator over all neighbors of node n
1347
+
1348
+ Raises
1349
+ ------
1350
+ NetworkXError
1351
+ If the node n is not in the graph.
1352
+
1353
+ Examples
1354
+ --------
1355
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1356
+ >>> [n for n in G.neighbors(0)]
1357
+ [1]
1358
+
1359
+ Notes
1360
+ -----
1361
+ Alternate ways to access the neighbors are ``G.adj[n]`` or ``G[n]``:
1362
+
1363
+ >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc
1364
+ >>> G.add_edge("a", "b", weight=7)
1365
+ >>> G["a"]
1366
+ AtlasView({'b': {'weight': 7}})
1367
+ >>> G = nx.path_graph(4)
1368
+ >>> [n for n in G[0]]
1369
+ [1]
1370
+ """
1371
+ try:
1372
+ return iter(self._adj[n])
1373
+ except KeyError as err:
1374
+ raise NetworkXError(f"The node {n} is not in the graph.") from err
1375
+
1376
+ @cached_property
1377
+ def edges(self):
1378
+ """An EdgeView of the Graph as G.edges or G.edges().
1379
+
1380
+ edges(self, nbunch=None, data=False, default=None)
1381
+
1382
+ The EdgeView provides set-like operations on the edge-tuples
1383
+ as well as edge attribute lookup. When called, it also provides
1384
+ an EdgeDataView object which allows control of access to edge
1385
+ attributes (but does not provide set-like operations).
1386
+ Hence, `G.edges[u, v]['color']` provides the value of the color
1387
+ attribute for edge `(u, v)` while
1388
+ `for (u, v, c) in G.edges.data('color', default='red'):`
1389
+ iterates through all the edges yielding the color attribute
1390
+ with default `'red'` if no color attribute exists.
1391
+
1392
+ Parameters
1393
+ ----------
1394
+ nbunch : single node, container, or all nodes (default= all nodes)
1395
+ The view will only report edges from these nodes.
1396
+ data : string or bool, optional (default=False)
1397
+ The edge attribute returned in 3-tuple (u, v, ddict[data]).
1398
+ If True, return edge attribute dict in 3-tuple (u, v, ddict).
1399
+ If False, return 2-tuple (u, v).
1400
+ default : value, optional (default=None)
1401
+ Value used for edges that don't have the requested attribute.
1402
+ Only relevant if data is not True or False.
1403
+
1404
+ Returns
1405
+ -------
1406
+ edges : EdgeView
1407
+ A view of edge attributes, usually it iterates over (u, v)
1408
+ or (u, v, d) tuples of edges, but can also be used for
1409
+ attribute lookup as `edges[u, v]['foo']`.
1410
+
1411
+ Notes
1412
+ -----
1413
+ Nodes in nbunch that are not in the graph will be (quietly) ignored.
1414
+ For directed graphs this returns the out-edges.
1415
+
1416
+ Examples
1417
+ --------
1418
+ >>> G = nx.path_graph(3) # or MultiGraph, etc
1419
+ >>> G.add_edge(2, 3, weight=5)
1420
+ >>> [e for e in G.edges]
1421
+ [(0, 1), (1, 2), (2, 3)]
1422
+ >>> G.edges.data() # default data is {} (empty dict)
1423
+ EdgeDataView([(0, 1, {}), (1, 2, {}), (2, 3, {'weight': 5})])
1424
+ >>> G.edges.data("weight", default=1)
1425
+ EdgeDataView([(0, 1, 1), (1, 2, 1), (2, 3, 5)])
1426
+ >>> G.edges([0, 3]) # only edges from these nodes
1427
+ EdgeDataView([(0, 1), (3, 2)])
1428
+ >>> G.edges(0) # only edges from node 0
1429
+ EdgeDataView([(0, 1)])
1430
+ """
1431
+ return EdgeView(self)
1432
+
1433
+ def get_edge_data(self, u, v, default=None):
1434
+ """Returns the attribute dictionary associated with edge (u, v).
1435
+
1436
+ This is identical to `G[u][v]` except the default is returned
1437
+ instead of an exception if the edge doesn't exist.
1438
+
1439
+ Parameters
1440
+ ----------
1441
+ u, v : nodes
1442
+ default: any Python object (default=None)
1443
+ Value to return if the edge (u, v) is not found.
1444
+
1445
+ Returns
1446
+ -------
1447
+ edge_dict : dictionary
1448
+ The edge attribute dictionary.
1449
+
1450
+ Examples
1451
+ --------
1452
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1453
+ >>> G[0][1]
1454
+ {}
1455
+
1456
+ Warning: Assigning to `G[u][v]` is not permitted.
1457
+ But it is safe to assign attributes `G[u][v]['foo']`
1458
+
1459
+ >>> G[0][1]["weight"] = 7
1460
+ >>> G[0][1]["weight"]
1461
+ 7
1462
+ >>> G[1][0]["weight"]
1463
+ 7
1464
+
1465
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1466
+ >>> G.get_edge_data(0, 1) # default edge data is {}
1467
+ {}
1468
+ >>> e = (0, 1)
1469
+ >>> G.get_edge_data(*e) # tuple form
1470
+ {}
1471
+ >>> G.get_edge_data("a", "b", default=0) # edge not in graph, return 0
1472
+ 0
1473
+ """
1474
+ try:
1475
+ return self._adj[u][v]
1476
+ except KeyError:
1477
+ return default
1478
+
1479
+ def adjacency(self):
1480
+ """Returns an iterator over (node, adjacency dict) tuples for all nodes.
1481
+
1482
+ For directed graphs, only outgoing neighbors/adjacencies are included.
1483
+
1484
+ Returns
1485
+ -------
1486
+ adj_iter : iterator
1487
+ An iterator over (node, adjacency dictionary) for all nodes in
1488
+ the graph.
1489
+
1490
+ Examples
1491
+ --------
1492
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1493
+ >>> [(n, nbrdict) for n, nbrdict in G.adjacency()]
1494
+ [(0, {1: {}}), (1, {0: {}, 2: {}}), (2, {1: {}, 3: {}}), (3, {2: {}})]
1495
+
1496
+ """
1497
+ return iter(self._adj.items())
1498
+
1499
+ @cached_property
1500
+ def degree(self):
1501
+ """A DegreeView for the Graph as G.degree or G.degree().
1502
+
1503
+ The node degree is the number of edges adjacent to the node.
1504
+ The weighted node degree is the sum of the edge weights for
1505
+ edges incident to that node.
1506
+
1507
+ This object provides an iterator for (node, degree) as well as
1508
+ lookup for the degree for a single node.
1509
+
1510
+ Parameters
1511
+ ----------
1512
+ nbunch : single node, container, or all nodes (default= all nodes)
1513
+ The view will only report edges incident to these nodes.
1514
+
1515
+ weight : string or None, optional (default=None)
1516
+ The name of an edge attribute that holds the numerical value used
1517
+ as a weight. If None, then each edge has weight 1.
1518
+ The degree is the sum of the edge weights adjacent to the node.
1519
+
1520
+ Returns
1521
+ -------
1522
+ DegreeView or int
1523
+ If multiple nodes are requested (the default), returns a `DegreeView`
1524
+ mapping nodes to their degree.
1525
+ If a single node is requested, returns the degree of the node as an integer.
1526
+
1527
+ Examples
1528
+ --------
1529
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1530
+ >>> G.degree[0] # node 0 has degree 1
1531
+ 1
1532
+ >>> list(G.degree([0, 1, 2]))
1533
+ [(0, 1), (1, 2), (2, 2)]
1534
+ """
1535
+ return DegreeView(self)
1536
+
1537
+ def clear(self):
1538
+ """Remove all nodes and edges from the graph.
1539
+
1540
+ This also removes the name, and all graph, node, and edge attributes.
1541
+
1542
+ Examples
1543
+ --------
1544
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1545
+ >>> G.clear()
1546
+ >>> list(G.nodes)
1547
+ []
1548
+ >>> list(G.edges)
1549
+ []
1550
+
1551
+ """
1552
+ self._adj.clear()
1553
+ self._node.clear()
1554
+ self.graph.clear()
1555
+ nx._clear_cache(self)
1556
+
1557
+ def clear_edges(self):
1558
+ """Remove all edges from the graph without altering nodes.
1559
+
1560
+ Examples
1561
+ --------
1562
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1563
+ >>> G.clear_edges()
1564
+ >>> list(G.nodes)
1565
+ [0, 1, 2, 3]
1566
+ >>> list(G.edges)
1567
+ []
1568
+ """
1569
+ for nbr_dict in self._adj.values():
1570
+ nbr_dict.clear()
1571
+ nx._clear_cache(self)
1572
+
1573
+ def is_multigraph(self):
1574
+ """Returns True if graph is a multigraph, False otherwise."""
1575
+ return False
1576
+
1577
+ def is_directed(self):
1578
+ """Returns True if graph is directed, False otherwise."""
1579
+ return False
1580
+
1581
+ def copy(self, as_view=False):
1582
+ """Returns a copy of the graph.
1583
+
1584
+ The copy method by default returns an independent shallow copy
1585
+ of the graph and attributes. That is, if an attribute is a
1586
+ container, that container is shared by the original and the copy.
1587
+ Use Python's `copy.deepcopy` for new containers.
1588
+
1589
+ If `as_view` is True then a view is returned instead of a copy.
1590
+
1591
+ Notes
1592
+ -----
1593
+ All copies reproduce the graph structure, but data attributes
1594
+ may be handled in different ways. There are four types of copies
1595
+ of a graph that people might want.
1596
+
1597
+ Deepcopy -- A "deepcopy" copies the graph structure as well as
1598
+ all data attributes and any objects they might contain.
1599
+ The entire graph object is new so that changes in the copy
1600
+ do not affect the original object. (see Python's copy.deepcopy)
1601
+
1602
+ Data Reference (Shallow) -- For a shallow copy the graph structure
1603
+ is copied but the edge, node and graph attribute dicts are
1604
+ references to those in the original graph. This saves
1605
+ time and memory but could cause confusion if you change an attribute
1606
+ in one graph and it changes the attribute in the other.
1607
+ NetworkX does not provide this level of shallow copy.
1608
+
1609
+ Independent Shallow -- This copy creates new independent attribute
1610
+ dicts and then does a shallow copy of the attributes. That is, any
1611
+ attributes that are containers are shared between the new graph
1612
+ and the original. This is exactly what `dict.copy()` provides.
1613
+ You can obtain this style copy using:
1614
+
1615
+ >>> G = nx.path_graph(5)
1616
+ >>> H = G.copy()
1617
+ >>> H = G.copy(as_view=False)
1618
+ >>> H = nx.Graph(G)
1619
+ >>> H = G.__class__(G)
1620
+
1621
+ Fresh Data -- For fresh data, the graph structure is copied while
1622
+ new empty data attribute dicts are created. The resulting graph
1623
+ is independent of the original and it has no edge, node or graph
1624
+ attributes. Fresh copies are not enabled. Instead use:
1625
+
1626
+ >>> H = G.__class__()
1627
+ >>> H.add_nodes_from(G)
1628
+ >>> H.add_edges_from(G.edges)
1629
+
1630
+ View -- Inspired by dict-views, graph-views act like read-only
1631
+ versions of the original graph, providing a copy of the original
1632
+ structure without requiring any memory for copying the information.
1633
+
1634
+ See the Python copy module for more information on shallow
1635
+ and deep copies, https://docs.python.org/3/library/copy.html.
1636
+
1637
+ Parameters
1638
+ ----------
1639
+ as_view : bool, optional (default=False)
1640
+ If True, the returned graph-view provides a read-only view
1641
+ of the original graph without actually copying any data.
1642
+
1643
+ Returns
1644
+ -------
1645
+ G : Graph
1646
+ A copy of the graph.
1647
+
1648
+ See Also
1649
+ --------
1650
+ to_directed: return a directed copy of the graph.
1651
+
1652
+ Examples
1653
+ --------
1654
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1655
+ >>> H = G.copy()
1656
+
1657
+ """
1658
+ if as_view is True:
1659
+ return nx.graphviews.generic_graph_view(self)
1660
+ G = self.__class__()
1661
+ G.graph.update(self.graph)
1662
+ G.add_nodes_from((n, d.copy()) for n, d in self._node.items())
1663
+ G.add_edges_from(
1664
+ (u, v, datadict.copy())
1665
+ for u, nbrs in self._adj.items()
1666
+ for v, datadict in nbrs.items()
1667
+ )
1668
+ return G
1669
+
1670
+ def to_directed(self, as_view=False):
1671
+ """Returns a directed representation of the graph.
1672
+
1673
+ Returns
1674
+ -------
1675
+ G : DiGraph
1676
+ A directed graph with the same name, same nodes, and with
1677
+ each edge (u, v, data) replaced by two directed edges
1678
+ (u, v, data) and (v, u, data).
1679
+
1680
+ Notes
1681
+ -----
1682
+ This returns a "deepcopy" of the edge, node, and
1683
+ graph attributes which attempts to completely copy
1684
+ all of the data and references.
1685
+
1686
+ This is in contrast to the similar `D = nx.DiGraph(G)` which returns a
1687
+ shallow copy of the data.
1688
+
1689
+ See the Python copy module for more information on shallow
1690
+ and deep copies, https://docs.python.org/3/library/copy.html.
1691
+
1692
+ Warning: If you have subclassed Graph to use dict-like objects
1693
+ in the data structure, those changes do not transfer to the
1694
+ DiGraph created by this method.
1695
+
1696
+ Examples
1697
+ --------
1698
+ >>> G = nx.Graph() # or MultiGraph, etc
1699
+ >>> G.add_edge(0, 1)
1700
+ >>> H = G.to_directed()
1701
+ >>> list(H.edges)
1702
+ [(0, 1), (1, 0)]
1703
+
1704
+ If already directed, return a (deep) copy
1705
+
1706
+ >>> G = nx.DiGraph() # or MultiDiGraph, etc
1707
+ >>> G.add_edge(0, 1)
1708
+ >>> H = G.to_directed()
1709
+ >>> list(H.edges)
1710
+ [(0, 1)]
1711
+ """
1712
+ graph_class = self.to_directed_class()
1713
+ if as_view is True:
1714
+ return nx.graphviews.generic_graph_view(self, graph_class)
1715
+ # deepcopy when not a view
1716
+ G = graph_class()
1717
+ G.graph.update(deepcopy(self.graph))
1718
+ G.add_nodes_from((n, deepcopy(d)) for n, d in self._node.items())
1719
+ G.add_edges_from(
1720
+ (u, v, deepcopy(data))
1721
+ for u, nbrs in self._adj.items()
1722
+ for v, data in nbrs.items()
1723
+ )
1724
+ return G
1725
+
1726
+ def to_undirected(self, as_view=False):
1727
+ """Returns an undirected copy of the graph.
1728
+
1729
+ Parameters
1730
+ ----------
1731
+ as_view : bool (optional, default=False)
1732
+ If True return a view of the original undirected graph.
1733
+
1734
+ Returns
1735
+ -------
1736
+ G : Graph/MultiGraph
1737
+ A deepcopy of the graph.
1738
+
1739
+ See Also
1740
+ --------
1741
+ Graph, copy, add_edge, add_edges_from
1742
+
1743
+ Notes
1744
+ -----
1745
+ This returns a "deepcopy" of the edge, node, and
1746
+ graph attributes which attempts to completely copy
1747
+ all of the data and references.
1748
+
1749
+ This is in contrast to the similar `G = nx.DiGraph(D)` which returns a
1750
+ shallow copy of the data.
1751
+
1752
+ See the Python copy module for more information on shallow
1753
+ and deep copies, https://docs.python.org/3/library/copy.html.
1754
+
1755
+ Warning: If you have subclassed DiGraph to use dict-like objects
1756
+ in the data structure, those changes do not transfer to the
1757
+ Graph created by this method.
1758
+
1759
+ Examples
1760
+ --------
1761
+ >>> G = nx.path_graph(2) # or MultiGraph, etc
1762
+ >>> H = G.to_directed()
1763
+ >>> list(H.edges)
1764
+ [(0, 1), (1, 0)]
1765
+ >>> G2 = H.to_undirected()
1766
+ >>> list(G2.edges)
1767
+ [(0, 1)]
1768
+ """
1769
+ graph_class = self.to_undirected_class()
1770
+ if as_view is True:
1771
+ return nx.graphviews.generic_graph_view(self, graph_class)
1772
+ # deepcopy when not a view
1773
+ G = graph_class()
1774
+ G.graph.update(deepcopy(self.graph))
1775
+ G.add_nodes_from((n, deepcopy(d)) for n, d in self._node.items())
1776
+ G.add_edges_from(
1777
+ (u, v, deepcopy(d))
1778
+ for u, nbrs in self._adj.items()
1779
+ for v, d in nbrs.items()
1780
+ )
1781
+ return G
1782
+
1783
+ def subgraph(self, nodes):
1784
+ """Returns a SubGraph view of the subgraph induced on `nodes`.
1785
+
1786
+ The induced subgraph of the graph contains the nodes in `nodes`
1787
+ and the edges between those nodes.
1788
+
1789
+ Parameters
1790
+ ----------
1791
+ nodes : list, iterable
1792
+ A container of nodes which will be iterated through once.
1793
+
1794
+ Returns
1795
+ -------
1796
+ G : SubGraph View
1797
+ A subgraph view of the graph. The graph structure cannot be
1798
+ changed but node/edge attributes can and are shared with the
1799
+ original graph.
1800
+
1801
+ Notes
1802
+ -----
1803
+ The graph, edge and node attributes are shared with the original graph.
1804
+ Changes to the graph structure are ruled out by the view, but changes
1805
+ to attributes are reflected in the original graph.
1806
+
1807
+ To create a subgraph with its own copy of the edge/node attributes use:
1808
+ G.subgraph(nodes).copy()
1809
+
1810
+ For an inplace reduction of a graph to a subgraph you can remove nodes:
1811
+ G.remove_nodes_from([n for n in G if n not in set(nodes)])
1812
+
1813
+ Subgraph views are sometimes NOT what you want. In most cases where
1814
+ you want to do more than simply look at the induced edges, it makes
1815
+ more sense to just create the subgraph as its own graph with code like:
1816
+
1817
+ ::
1818
+
1819
+ # Create a subgraph SG based on a (possibly multigraph) G
1820
+ SG = G.__class__()
1821
+ SG.add_nodes_from((n, G.nodes[n]) for n in largest_wcc)
1822
+ if SG.is_multigraph():
1823
+ SG.add_edges_from(
1824
+ (n, nbr, key, d)
1825
+ for n, nbrs in G.adj.items()
1826
+ if n in largest_wcc
1827
+ for nbr, keydict in nbrs.items()
1828
+ if nbr in largest_wcc
1829
+ for key, d in keydict.items()
1830
+ )
1831
+ else:
1832
+ SG.add_edges_from(
1833
+ (n, nbr, d)
1834
+ for n, nbrs in G.adj.items()
1835
+ if n in largest_wcc
1836
+ for nbr, d in nbrs.items()
1837
+ if nbr in largest_wcc
1838
+ )
1839
+ SG.graph.update(G.graph)
1840
+
1841
+ Examples
1842
+ --------
1843
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1844
+ >>> H = G.subgraph([0, 1, 2])
1845
+ >>> list(H.edges)
1846
+ [(0, 1), (1, 2)]
1847
+ """
1848
+ induced_nodes = nx.filters.show_nodes(self.nbunch_iter(nodes))
1849
+ # if already a subgraph, don't make a chain
1850
+ subgraph = nx.subgraph_view
1851
+ if hasattr(self, "_NODE_OK"):
1852
+ return subgraph(
1853
+ self._graph, filter_node=induced_nodes, filter_edge=self._EDGE_OK
1854
+ )
1855
+ return subgraph(self, filter_node=induced_nodes)
1856
+
1857
+ def edge_subgraph(self, edges):
1858
+ """Returns the subgraph induced by the specified edges.
1859
+
1860
+ The induced subgraph contains each edge in `edges` and each
1861
+ node incident to any one of those edges.
1862
+
1863
+ Parameters
1864
+ ----------
1865
+ edges : iterable
1866
+ An iterable of edges in this graph.
1867
+
1868
+ Returns
1869
+ -------
1870
+ G : Graph
1871
+ An edge-induced subgraph of this graph with the same edge
1872
+ attributes.
1873
+
1874
+ Notes
1875
+ -----
1876
+ The graph, edge, and node attributes in the returned subgraph
1877
+ view are references to the corresponding attributes in the original
1878
+ graph. The view is read-only.
1879
+
1880
+ To create a full graph version of the subgraph with its own copy
1881
+ of the edge or node attributes, use::
1882
+
1883
+ G.edge_subgraph(edges).copy()
1884
+
1885
+ Examples
1886
+ --------
1887
+ >>> G = nx.path_graph(5)
1888
+ >>> H = G.edge_subgraph([(0, 1), (3, 4)])
1889
+ >>> list(H.nodes)
1890
+ [0, 1, 3, 4]
1891
+ >>> list(H.edges)
1892
+ [(0, 1), (3, 4)]
1893
+
1894
+ """
1895
+ return nx.edge_subgraph(self, edges)
1896
+
1897
+ def size(self, weight=None):
1898
+ """Returns the number of edges or total of all edge weights.
1899
+
1900
+ Parameters
1901
+ ----------
1902
+ weight : string or None, optional (default=None)
1903
+ The edge attribute that holds the numerical value used
1904
+ as a weight. If None, then each edge has weight 1.
1905
+
1906
+ Returns
1907
+ -------
1908
+ size : numeric
1909
+ The number of edges or
1910
+ (if weight keyword is provided) the total weight sum.
1911
+
1912
+ If weight is None, returns an int. Otherwise a float
1913
+ (or more general numeric if the weights are more general).
1914
+
1915
+ See Also
1916
+ --------
1917
+ number_of_edges
1918
+
1919
+ Examples
1920
+ --------
1921
+ >>> G = nx.path_graph(4) # or DiGraph, MultiGraph, MultiDiGraph, etc
1922
+ >>> G.size()
1923
+ 3
1924
+
1925
+ >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc
1926
+ >>> G.add_edge("a", "b", weight=2)
1927
+ >>> G.add_edge("b", "c", weight=4)
1928
+ >>> G.size()
1929
+ 2
1930
+ >>> G.size(weight="weight")
1931
+ 6.0
1932
+ """
1933
+ s = sum(d for v, d in self.degree(weight=weight))
1934
+ # If `weight` is None, the sum of the degrees is guaranteed to be
1935
+ # even, so we can perform integer division and hence return an
1936
+ # integer. Otherwise, the sum of the weighted degrees is not
1937
+ # guaranteed to be an integer, so we perform "real" division.
1938
+ return s // 2 if weight is None else s / 2
1939
+
1940
+ def number_of_edges(self, u=None, v=None):
1941
+ """Returns the number of edges between two nodes.
1942
+
1943
+ Parameters
1944
+ ----------
1945
+ u, v : nodes, optional (default=all edges)
1946
+ If u and v are specified, return the number of edges between
1947
+ u and v. Otherwise return the total number of all edges.
1948
+
1949
+ Returns
1950
+ -------
1951
+ nedges : int
1952
+ The number of edges in the graph. If nodes `u` and `v` are
1953
+ specified return the number of edges between those nodes. If
1954
+ the graph is directed, this only returns the number of edges
1955
+ from `u` to `v`.
1956
+
1957
+ See Also
1958
+ --------
1959
+ size
1960
+
1961
+ Examples
1962
+ --------
1963
+ For undirected graphs, this method counts the total number of
1964
+ edges in the graph:
1965
+
1966
+ >>> G = nx.path_graph(4)
1967
+ >>> G.number_of_edges()
1968
+ 3
1969
+
1970
+ If you specify two nodes, this counts the total number of edges
1971
+ joining the two nodes:
1972
+
1973
+ >>> G.number_of_edges(0, 1)
1974
+ 1
1975
+
1976
+ For directed graphs, this method can count the total number of
1977
+ directed edges from `u` to `v`:
1978
+
1979
+ >>> G = nx.DiGraph()
1980
+ >>> G.add_edge(0, 1)
1981
+ >>> G.add_edge(1, 0)
1982
+ >>> G.number_of_edges(0, 1)
1983
+ 1
1984
+
1985
+ """
1986
+ if u is None:
1987
+ return int(self.size())
1988
+ if v in self._adj[u]:
1989
+ return 1
1990
+ return 0
1991
+
1992
+ def nbunch_iter(self, nbunch=None):
1993
+ """Returns an iterator over nodes contained in nbunch that are
1994
+ also in the graph.
1995
+
1996
+ The nodes in nbunch are checked for membership in the graph
1997
+ and if not are silently ignored.
1998
+
1999
+ Parameters
2000
+ ----------
2001
+ nbunch : single node, container, or all nodes (default= all nodes)
2002
+ The view will only report edges incident to these nodes.
2003
+
2004
+ Returns
2005
+ -------
2006
+ niter : iterator
2007
+ An iterator over nodes in nbunch that are also in the graph.
2008
+ If nbunch is None, iterate over all nodes in the graph.
2009
+
2010
+ Raises
2011
+ ------
2012
+ NetworkXError
2013
+ If nbunch is not a node or sequence of nodes.
2014
+ If a node in nbunch is not hashable.
2015
+
2016
+ See Also
2017
+ --------
2018
+ Graph.__iter__
2019
+
2020
+ Notes
2021
+ -----
2022
+ When nbunch is an iterator, the returned iterator yields values
2023
+ directly from nbunch, becoming exhausted when nbunch is exhausted.
2024
+
2025
+ To test whether nbunch is a single node, one can use
2026
+ "if nbunch in self:", even after processing with this routine.
2027
+
2028
+ If nbunch is not a node or a (possibly empty) sequence/iterator
2029
+ or None, a :exc:`NetworkXError` is raised. Also, if any object in
2030
+ nbunch is not hashable, a :exc:`NetworkXError` is raised.
2031
+ """
2032
+ if nbunch is None: # include all nodes via iterator
2033
+ bunch = iter(self._adj)
2034
+ elif nbunch in self: # if nbunch is a single node
2035
+ bunch = iter([nbunch])
2036
+ else: # if nbunch is a sequence of nodes
2037
+
2038
+ def bunch_iter(nlist, adj):
2039
+ try:
2040
+ for n in nlist:
2041
+ if n in adj:
2042
+ yield n
2043
+ except TypeError as err:
2044
+ exc, message = err, err.args[0]
2045
+ # capture error for non-sequence/iterator nbunch.
2046
+ if "iter" in message:
2047
+ exc = NetworkXError(
2048
+ "nbunch is not a node or a sequence of nodes."
2049
+ )
2050
+ # capture error for unhashable node.
2051
+ if "hashable" in message:
2052
+ exc = NetworkXError(
2053
+ f"Node {n} in sequence nbunch is not a valid node."
2054
+ )
2055
+ raise exc
2056
+
2057
+ bunch = bunch_iter(nbunch, self._adj)
2058
+ return bunch
minigpt2/lib/python3.10/site-packages/networkx/classes/graphviews.py ADDED
@@ -0,0 +1,269 @@
1
+ """View of Graphs as SubGraph, Reverse, Directed, Undirected.
2
+
3
+ In some algorithms it is convenient to temporarily morph
4
+ a graph to exclude some nodes or edges. It is usually better
5
+ to do that via a view than to remove and then re-add.
6
+ In other algorithms it is convenient to temporarily morph
7
+ a graph to reverse directed edges, or treat a directed graph
8
+ as undirected, etc. This module provides those graph views.
9
+
10
+ The resulting views are essentially read-only graphs that
11
+ report data from the original graph object. We provide an
12
+ attribute G._graph which points to the underlying graph object.
13
+
14
+ Note: Since graphviews look like graphs, one can end up with
15
+ view-of-view-of-view chains. Be careful with chains because
16
+ they become very slow with about 15 nested views.
17
+ For the common simple case of node induced subgraphs created
18
+ from the graph class, we short-cut the chain by returning a
19
+ subgraph of the original graph directly rather than a subgraph
20
+ of a subgraph. We are careful not to disrupt any edge filter in
21
+ the middle subgraph. In general, determining how to short-cut
22
+ the chain is tricky and much harder with restricted_views than
23
+ with induced subgraphs.
24
+ Often it is easiest to use .copy() to avoid chains.
25
+ """
26
+
27
+ import networkx as nx
28
+ from networkx.classes.coreviews import (
29
+ FilterAdjacency,
30
+ FilterAtlas,
31
+ FilterMultiAdjacency,
32
+ UnionAdjacency,
33
+ UnionMultiAdjacency,
34
+ )
35
+ from networkx.classes.filters import no_filter
36
+ from networkx.exception import NetworkXError
37
+ from networkx.utils import not_implemented_for
38
+
39
+ __all__ = ["generic_graph_view", "subgraph_view", "reverse_view"]
40
+
41
+
42
+ def generic_graph_view(G, create_using=None):
43
+ """Returns a read-only view of `G`.
44
+
45
+ The graph `G` and its attributes are not copied but viewed through the new graph object
46
+ of the same class as `G` (or of the class specified in `create_using`).
47
+
48
+ Parameters
49
+ ----------
50
+ G : graph
51
+ A directed/undirected graph/multigraph.
52
+
53
+ create_using : NetworkX graph constructor, optional (default=None)
54
+ Graph type to create. If graph instance, then cleared before populated.
55
+ If `None`, then the appropriate Graph type is inferred from `G`.
56
+
57
+ Returns
58
+ -------
59
+ newG : graph
60
+ A view of the input graph `G` and its attributes as viewed through
61
+ the `create_using` class.
62
+
63
+ Raises
64
+ ------
65
+ NetworkXError
66
+ If `G` is a multigraph (or multidigraph) but `create_using` is not, or vice versa.
67
+
68
+ Notes
69
+ -----
70
+ The returned graph view is read-only (cannot modify the graph).
71
+ Yet the view reflects any changes in `G`. The intent is to mimic dict views.
72
+
73
+ Examples
74
+ --------
75
+ >>> G = nx.Graph()
76
+ >>> G.add_edge(1, 2, weight=0.3)
77
+ >>> G.add_edge(2, 3, weight=0.5)
78
+ >>> G.edges(data=True)
79
+ EdgeDataView([(1, 2, {'weight': 0.3}), (2, 3, {'weight': 0.5})])
80
+
81
+ The view exposes the attributes from the original graph.
82
+
83
+ >>> viewG = nx.graphviews.generic_graph_view(G)
84
+ >>> viewG.edges(data=True)
85
+ EdgeDataView([(1, 2, {'weight': 0.3}), (2, 3, {'weight': 0.5})])
86
+
87
+ Changes to `G` are reflected in `viewG`.
88
+
89
+ >>> G.remove_edge(2, 3)
90
+ >>> G.edges(data=True)
91
+ EdgeDataView([(1, 2, {'weight': 0.3})])
92
+
93
+ >>> viewG.edges(data=True)
94
+ EdgeDataView([(1, 2, {'weight': 0.3})])
95
+
96
+ We can change the graph type with the `create_using` parameter.
97
+
98
+ >>> type(G)
99
+ <class 'networkx.classes.graph.Graph'>
100
+ >>> viewDG = nx.graphviews.generic_graph_view(G, create_using=nx.DiGraph)
101
+ >>> type(viewDG)
102
+ <class 'networkx.classes.digraph.DiGraph'>
103
+ """
104
+ if create_using is None:
105
+ newG = G.__class__()
106
+ else:
107
+ newG = nx.empty_graph(0, create_using)
108
+ if G.is_multigraph() != newG.is_multigraph():
109
+ raise NetworkXError("Multigraph for G must agree with create_using")
110
+ newG = nx.freeze(newG)
111
+
112
+ # create view by assigning attributes from G
113
+ newG._graph = G
114
+ newG.graph = G.graph
115
+
116
+ newG._node = G._node
117
+ if newG.is_directed():
118
+ if G.is_directed():
119
+ newG._succ = G._succ
120
+ newG._pred = G._pred
121
+ # newG._adj is synced with _succ
122
+ else:
123
+ newG._succ = G._adj
124
+ newG._pred = G._adj
125
+ # newG._adj is synced with _succ
126
+ elif G.is_directed():
127
+ if G.is_multigraph():
128
+ newG._adj = UnionMultiAdjacency(G._succ, G._pred)
129
+ else:
130
+ newG._adj = UnionAdjacency(G._succ, G._pred)
131
+ else:
132
+ newG._adj = G._adj
133
+ return newG
134
+
135
+
136
+ def subgraph_view(G, *, filter_node=no_filter, filter_edge=no_filter):
137
+ """View of `G` applying a filter on nodes and edges.
138
+
139
+ `subgraph_view` provides a read-only view of the input graph that excludes
140
+ nodes and edges based on the outcome of two filter functions `filter_node`
141
+ and `filter_edge`.
142
+
143
+ The `filter_node` function takes one argument --- the node --- and returns
144
+ `True` if the node should be included in the subgraph, and `False` if it
145
+ should not be included.
146
+
147
+ The `filter_edge` function takes two arguments (or three, if `G` is a
148
+ multi-graph) --- the nodes describing an edge, plus the edge-key if
149
+ parallel edges are possible --- and returns `True` if the edge should be
150
+ included in the subgraph, and `False` if it should not be included.
151
+
152
+ Both node and edge filter functions are called on graph elements as they
153
+ are queried, meaning there is no up-front cost to creating the view.
154
+
155
+ Parameters
156
+ ----------
157
+ G : networkx.Graph
158
+ A directed/undirected graph/multigraph
159
+
160
+ filter_node : callable, optional
161
+ A function taking a node as input, which returns `True` if the node
162
+ should appear in the view.
163
+
164
+ filter_edge : callable, optional
165
+ A function taking as input the two nodes describing an edge (plus the
166
+ edge-key if `G` is a multi-graph), which returns `True` if the edge
167
+ should appear in the view.
168
+
169
+ Returns
170
+ -------
171
+ graph : networkx.Graph
172
+ A read-only graph view of the input graph.
173
+
174
+ Examples
175
+ --------
176
+ >>> G = nx.path_graph(6)
177
+
178
+ Filter functions operate on the node, and return `True` if the node should
179
+ appear in the view:
180
+
181
+ >>> def filter_node(n1):
182
+ ... return n1 != 5
183
+ >>> view = nx.subgraph_view(G, filter_node=filter_node)
184
+ >>> view.nodes()
185
+ NodeView((0, 1, 2, 3, 4))
186
+
187
+ We can use a closure pattern to filter graph elements based on additional
188
+ data --- for example, filtering on edge data attached to the graph:
189
+
190
+ >>> G[3][4]["cross_me"] = False
191
+ >>> def filter_edge(n1, n2):
192
+ ... return G[n1][n2].get("cross_me", True)
193
+ >>> view = nx.subgraph_view(G, filter_edge=filter_edge)
194
+ >>> view.edges()
195
+ EdgeView([(0, 1), (1, 2), (2, 3), (4, 5)])
196
+
197
+ >>> view = nx.subgraph_view(
198
+ ... G,
199
+ ... filter_node=filter_node,
200
+ ... filter_edge=filter_edge,
201
+ ... )
202
+ >>> view.nodes()
203
+ NodeView((0, 1, 2, 3, 4))
204
+ >>> view.edges()
205
+ EdgeView([(0, 1), (1, 2), (2, 3)])
206
+ """
207
+ newG = nx.freeze(G.__class__())
208
+ newG._NODE_OK = filter_node
209
+ newG._EDGE_OK = filter_edge
210
+
211
+ # create view by assigning attributes from G
212
+ newG._graph = G
213
+ newG.graph = G.graph
214
+
215
+ newG._node = FilterAtlas(G._node, filter_node)
216
+ if G.is_multigraph():
217
+ Adj = FilterMultiAdjacency
218
+
219
+ def reverse_edge(u, v, k=None):
220
+ return filter_edge(v, u, k)
221
+
222
+ else:
223
+ Adj = FilterAdjacency
224
+
225
+ def reverse_edge(u, v, k=None):
226
+ return filter_edge(v, u)
227
+
228
+ if G.is_directed():
229
+ newG._succ = Adj(G._succ, filter_node, filter_edge)
230
+ newG._pred = Adj(G._pred, filter_node, reverse_edge)
231
+ # newG._adj is synced with _succ
232
+ else:
233
+ newG._adj = Adj(G._adj, filter_node, filter_edge)
234
+ return newG
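The `FilterAtlas` used for `newG._node` above can be sketched in isolation: a read-only `Mapping` wrapper that hides keys failing a predicate, which is exactly what `subgraph_view` needs for its node dict. The class name here is illustrative, not the NetworkX implementation:

```python
from collections.abc import Mapping


class FilterAtlasSketch(Mapping):
    """Read-only live view of dict `d` restricted to keys passing `ok`."""

    def __init__(self, d, ok):
        self._atlas = d   # the underlying dict; never copied
        self._ok = ok     # predicate: include key iff ok(key) is True

    def __len__(self):
        return sum(1 for k in self._atlas if self._ok(k))

    def __iter__(self):
        return (k for k in self._atlas if self._ok(k))

    def __getitem__(self, key):
        if key in self._atlas and self._ok(key):
            return self._atlas[key]
        raise KeyError(f"Key {key} not found")
```

Since the wrapper holds a reference to the original dict, later changes to that dict show through the view, and the predicate runs only as elements are queried, so there is no up-front cost.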
235
+
236
+
237
+ @not_implemented_for("undirected")
238
+ def reverse_view(G):
239
+ """View of `G` with edge directions reversed
240
+
241
+ `reverse_view` returns a read-only view of the input graph where
242
+ edge directions are reversed.
243
+
244
+ Identical to digraph.reverse(copy=False)
245
+
246
+ Parameters
247
+ ----------
248
+ G : networkx.DiGraph
249
+
250
+ Returns
251
+ -------
252
+ graph : networkx.DiGraph
253
+
254
+ Examples
255
+ --------
256
+ >>> G = nx.DiGraph()
257
+ >>> G.add_edge(1, 2)
258
+ >>> G.add_edge(2, 3)
259
+ >>> G.edges()
260
+ OutEdgeView([(1, 2), (2, 3)])
261
+
262
+ >>> view = nx.reverse_view(G)
263
+ >>> view.edges()
264
+ OutEdgeView([(2, 1), (3, 2)])
265
+ """
266
+ newG = generic_graph_view(G)
267
+ newG._succ, newG._pred = G._pred, G._succ
268
+ # newG._adj is synced with _succ
269
+ return newG
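The swap of `_succ` and `_pred` above is the whole trick: a DiGraph stores both successor and predecessor adjacency dicts, so reversing every edge is just exchanging which dict plays which role. A self-contained sketch with plain dicts (toy data, not NetworkX internals):

```python
# Successor and predecessor adjacency for the path 1 -> 2 -> 3.
succ = {1: {2: {}}, 2: {3: {}}, 3: {}}
pred = {1: {}, 2: {1: {}}, 3: {2: {}}}

# "Reversing" the graph copies nothing: the roles are swapped in O(1).
rev_succ, rev_pred = pred, succ

# Out-edges of the reversed graph, read from its successor dict.
edges = [(u, v) for u, nbrs in rev_succ.items() for v in nbrs]
```

Both views share the underlying dicts, so mutations to the original graph are immediately visible through the reversed one.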
minigpt2/lib/python3.10/site-packages/networkx/classes/multidigraph.py ADDED
@@ -0,0 +1,966 @@
1
+ """Base class for MultiDiGraph."""
2
+
3
+ from copy import deepcopy
4
+ from functools import cached_property
5
+
6
+ import networkx as nx
7
+ from networkx import convert
8
+ from networkx.classes.coreviews import MultiAdjacencyView
9
+ from networkx.classes.digraph import DiGraph
10
+ from networkx.classes.multigraph import MultiGraph
11
+ from networkx.classes.reportviews import (
12
+ DiMultiDegreeView,
13
+ InMultiDegreeView,
14
+ InMultiEdgeView,
15
+ OutMultiDegreeView,
16
+ OutMultiEdgeView,
17
+ )
18
+ from networkx.exception import NetworkXError
19
+
20
+ __all__ = ["MultiDiGraph"]
21
+
22
+
23
+ class MultiDiGraph(MultiGraph, DiGraph):
24
+ """A directed graph class that can store multiedges.
25
+
26
+ Multiedges are multiple edges between two nodes. Each edge
27
+ can hold optional data or attributes.
28
+
29
+ A MultiDiGraph holds directed edges. Self loops are allowed.
30
+
31
+ Nodes can be arbitrary (hashable) Python objects with optional
32
+ key/value attributes. By convention `None` is not used as a node.
33
+
34
+ Edges are represented as links between nodes with optional
35
+ key/value attributes.
36
+
37
+ Parameters
38
+ ----------
39
+ incoming_graph_data : input graph (optional, default: None)
40
+ Data to initialize graph. If None (default) an empty
41
+ graph is created. The data can be any format that is supported
42
+ by the to_networkx_graph() function, currently including edge list,
43
+ dict of dicts, dict of lists, NetworkX graph, 2D NumPy array, SciPy
44
+ sparse matrix, or PyGraphviz graph.
45
+
46
+ multigraph_input : bool or None (default None)
47
+ Note: Only used when `incoming_graph_data` is a dict.
48
+ If True, `incoming_graph_data` is assumed to be a
49
+ dict-of-dict-of-dict-of-dict structure keyed by
50
+ node to neighbor to edge keys to edge data for multi-edges.
51
+ A NetworkXError is raised if this is not the case.
52
+ If False, :func:`to_networkx_graph` is used to try to determine
53
+ the dict's graph data structure as either a dict-of-dict-of-dict
54
+ keyed by node to neighbor to edge data, or a dict-of-iterable
55
+ keyed by node to neighbors.
56
+ If None, the treatment for True is tried, but if it fails,
57
+ the treatment for False is tried.
58
+
59
+ attr : keyword arguments, optional (default= no attributes)
60
+ Attributes to add to graph as key=value pairs.
61
+
62
+ See Also
63
+ --------
64
+ Graph
65
+ DiGraph
66
+ MultiGraph
67
+
68
+ Examples
69
+ --------
70
+ Create an empty graph structure (a "null graph") with no nodes and
71
+ no edges.
72
+
73
+ >>> G = nx.MultiDiGraph()
74
+
75
+ G can be grown in several ways.
76
+
77
+ **Nodes:**
78
+
79
+ Add one node at a time:
80
+
81
+ >>> G.add_node(1)
82
+
83
+ Add the nodes from any container (a list, dict, set or
84
+ even the lines from a file or the nodes from another graph).
85
+
86
+ >>> G.add_nodes_from([2, 3])
87
+ >>> G.add_nodes_from(range(100, 110))
88
+ >>> H = nx.path_graph(10)
89
+ >>> G.add_nodes_from(H)
90
+
91
+ In addition to strings and integers any hashable Python object
92
+ (except None) can represent a node, e.g. a customized node object,
93
+ or even another Graph.
94
+
95
+ >>> G.add_node(H)
96
+
97
+ **Edges:**
98
+
99
+ G can also be grown by adding edges.
100
+
101
+ Add one edge,
102
+
103
+ >>> key = G.add_edge(1, 2)
104
+
105
+ a list of edges,
106
+
107
+ >>> keys = G.add_edges_from([(1, 2), (1, 3)])
108
+
109
+ or a collection of edges,
110
+
111
+ >>> keys = G.add_edges_from(H.edges)
112
+
113
+ If some edges connect nodes not yet in the graph, the nodes
114
+ are added automatically. If an edge already exists, an additional
115
+ edge is created and stored using a key to identify the edge.
116
+ By default the key is the lowest unused integer.
117
+
118
+ >>> keys = G.add_edges_from([(4, 5, dict(route=282)), (4, 5, dict(route=37))])
119
+ >>> G[4]
120
+ AdjacencyView({5: {0: {}, 1: {'route': 282}, 2: {'route': 37}}})
121
+
122
+ **Attributes:**
123
+
124
+ Each graph, node, and edge can hold key/value attribute pairs
125
+ in an associated attribute dictionary (the keys must be hashable).
126
+ By default these are empty, but can be added or changed using
127
+ add_edge, add_node or direct manipulation of the attribute
128
+ dictionaries named graph, node and edge respectively.
129
+
130
+ >>> G = nx.MultiDiGraph(day="Friday")
131
+ >>> G.graph
132
+ {'day': 'Friday'}
133
+
134
+ Add node attributes using add_node(), add_nodes_from() or G.nodes
135
+
136
+ >>> G.add_node(1, time="5pm")
137
+ >>> G.add_nodes_from([3], time="2pm")
138
+ >>> G.nodes[1]
139
+ {'time': '5pm'}
140
+ >>> G.nodes[1]["room"] = 714
141
+ >>> del G.nodes[1]["room"] # remove attribute
142
+ >>> list(G.nodes(data=True))
143
+ [(1, {'time': '5pm'}), (3, {'time': '2pm'})]
144
+
145
+ Add edge attributes using add_edge(), add_edges_from(), subscript
146
+ notation, or G.edges.
147
+
148
+ >>> key = G.add_edge(1, 2, weight=4.7)
149
+ >>> keys = G.add_edges_from([(3, 4), (4, 5)], color="red")
150
+ >>> keys = G.add_edges_from([(1, 2, {"color": "blue"}), (2, 3, {"weight": 8})])
151
+ >>> G[1][2][0]["weight"] = 4.7
152
+ >>> G.edges[1, 2, 0]["weight"] = 4
153
+
154
+ Warning: we protect the graph data structure by making `G.edges[1,
155
+ 2, 0]` a read-only dict-like structure. However, you can assign to
156
+ attributes in e.g. `G.edges[1, 2, 0]`. Thus, use 2 sets of brackets
157
+ to add/change data attributes: `G.edges[1, 2, 0]['weight'] = 4`
158
+ (for multigraphs the edge key is required: `MG.edges[u, v,
159
+ key][name] = value`).
160
+
161
+ **Shortcuts:**
162
+
163
+ Many common graph features allow python syntax to speed reporting.
164
+
165
+ >>> 1 in G # check if node in graph
166
+ True
167
+ >>> [n for n in G if n < 3] # iterate through nodes
168
+ [1, 2]
169
+ >>> len(G) # number of nodes in graph
170
+ 5
171
+ >>> G[1] # adjacency dict-like view mapping neighbor -> edge key -> edge attributes
172
+ AdjacencyView({2: {0: {'weight': 4}, 1: {'color': 'blue'}}})
173
+
174
+ Often the best way to traverse all edges of a graph is via the neighbors.
175
+ The neighbors are available as an adjacency-view `G.adj` object or via
176
+ the method `G.adjacency()`.
177
+
178
+ >>> for n, nbrsdict in G.adjacency():
179
+ ... for nbr, keydict in nbrsdict.items():
180
+ ... for key, eattr in keydict.items():
181
+ ... if "weight" in eattr:
182
+ ... # Do something useful with the edges
183
+ ... pass
184
+
185
+ But the edges() method is often more convenient:
186
+
187
+ >>> for u, v, keys, weight in G.edges(data="weight", keys=True):
188
+ ... if weight is not None:
189
+ ... # Do something useful with the edges
190
+ ... pass
191
+
192
+ **Reporting:**
193
+
194
+ Simple graph information is obtained using methods and object-attributes.
195
+ Reporting usually provides views instead of containers to reduce memory
196
+ usage. The views update as the graph is updated similarly to dict-views.
197
+ The objects `nodes`, `edges` and `adj` provide access to data attributes
198
+ via lookup (e.g. `nodes[n]`, `edges[u, v, k]`, `adj[u][v]`) and iteration
199
+ (e.g. `nodes.items()`, `nodes.data('color')`,
200
+ `nodes.data('color', default='blue')` and similarly for `edges`)
201
+ Views exist for `nodes`, `edges`, `neighbors()`/`adj` and `degree`.
202
+
203
+ For details on these and other miscellaneous methods, see below.
204
+
205
+ **Subclasses (Advanced):**
206
+
207
+ The MultiDiGraph class uses a dict-of-dict-of-dict-of-dict structure.
208
+ The outer dict (node_dict) holds adjacency information keyed by node.
209
+ The next dict (adjlist_dict) represents the adjacency information
210
+ and holds edge_key dicts keyed by neighbor. The edge_key dict holds
211
+ each edge_attr dict keyed by edge key. The inner dict
212
+ (edge_attr_dict) represents the edge data and holds edge attribute
213
+ values keyed by attribute names.
214
+
215
+ Each of these four dicts in the dict-of-dict-of-dict-of-dict
216
+ structure can be replaced by a user defined dict-like object.
217
+ In general, the dict-like features should be maintained but
218
+ extra features can be added. To replace one of the dicts create
219
+ a new graph class by changing the class(!) variable holding the
220
+ factory for that dict-like structure. The variable names are
221
+ node_dict_factory, node_attr_dict_factory, adjlist_inner_dict_factory,
222
+ adjlist_outer_dict_factory, edge_key_dict_factory, edge_attr_dict_factory
223
+ and graph_attr_dict_factory.
224
+
225
+ node_dict_factory : function, (default: dict)
226
+ Factory function to be used to create the dict containing node
227
+ attributes, keyed by node id.
228
+ It should require no arguments and return a dict-like object
229
+
230
+ node_attr_dict_factory: function, (default: dict)
231
+ Factory function to be used to create the node attribute
232
+ dict which holds attribute values keyed by attribute name.
233
+ It should require no arguments and return a dict-like object
234
+
235
+ adjlist_outer_dict_factory : function, (default: dict)
236
+ Factory function to be used to create the outer-most dict
237
+ in the data structure that holds adjacency info keyed by node.
238
+ It should require no arguments and return a dict-like object.
239
+
240
+ adjlist_inner_dict_factory : function, (default: dict)
241
+ Factory function to be used to create the adjacency list
242
+ dict which holds multiedge key dicts keyed by neighbor.
243
+ It should require no arguments and return a dict-like object.
244
+
245
+ edge_key_dict_factory : function, (default: dict)
246
+ Factory function to be used to create the edge key dict
247
+ which holds edge data keyed by edge key.
248
+ It should require no arguments and return a dict-like object.
249
+
250
+ edge_attr_dict_factory : function, (default: dict)
251
+ Factory function to be used to create the edge attribute
252
+ dict which holds attribute values keyed by attribute name.
253
+ It should require no arguments and return a dict-like object.
254
+
255
+ graph_attr_dict_factory : function, (default: dict)
256
+ Factory function to be used to create the graph attribute
257
+ dict which holds attribute values keyed by attribute name.
258
+ It should require no arguments and return a dict-like object.
259
+
260
+ Typically, if your extension doesn't impact the data structure, all
261
+ methods will be inherited without issue except: `to_directed/to_undirected`.
262
+ By default these methods create a DiGraph/Graph class and you probably
263
+ want them to create your extension of a DiGraph/Graph. To facilitate
264
+ this we define two class variables that you can set in your subclass.
265
+
266
+ to_directed_class : callable, (default: DiGraph or MultiDiGraph)
267
+ Class to create a new graph structure in the `to_directed` method.
268
+ If `None`, a NetworkX class (DiGraph or MultiDiGraph) is used.
269
+
270
+ to_undirected_class : callable, (default: Graph or MultiGraph)
271
+ Class to create a new graph structure in the `to_undirected` method.
272
+ If `None`, a NetworkX class (Graph or MultiGraph) is used.
273
+
274
+ **Subclassing Example**
275
+
276
+ Create a low memory graph class that effectively disallows edge
277
+ attributes by using a single attribute dict for all edges.
278
+ This reduces the memory used, but you lose edge attributes.
279
+
280
+ >>> class ThinGraph(nx.Graph):
281
+ ... all_edge_dict = {"weight": 1}
282
+ ...
283
+ ... def single_edge_dict(self):
284
+ ... return self.all_edge_dict
285
+ ...
286
+ ... edge_attr_dict_factory = single_edge_dict
287
+ >>> G = ThinGraph()
288
+ >>> G.add_edge(2, 1)
289
+ >>> G[2][1]
290
+ {'weight': 1}
291
+ >>> G.add_edge(2, 2)
292
+ >>> G[2][1] is G[2][2]
293
+ True
294
+ """
295
+
296
+ # node_dict_factory = dict # already assigned in Graph
297
+ # adjlist_outer_dict_factory = dict
298
+ # adjlist_inner_dict_factory = dict
299
+ edge_key_dict_factory = dict
300
+ # edge_attr_dict_factory = dict
301
+
302
+ def __init__(self, incoming_graph_data=None, multigraph_input=None, **attr):
303
+ """Initialize a graph with edges, name, or graph attributes.
304
+
305
+ Parameters
306
+ ----------
307
+ incoming_graph_data : input graph
308
+ Data to initialize graph. If incoming_graph_data=None (default)
309
+ an empty graph is created. The data can be an edge list, or any
310
+ NetworkX graph object. If the corresponding optional Python
311
+ packages are installed the data can also be a 2D NumPy array, a
312
+ SciPy sparse array, or a PyGraphviz graph.
313
+
314
+ multigraph_input : bool or None (default None)
315
+ Note: Only used when `incoming_graph_data` is a dict.
316
+ If True, `incoming_graph_data` is assumed to be a
317
+ dict-of-dict-of-dict-of-dict structure keyed by
318
+ node to neighbor to edge keys to edge data for multi-edges.
319
+ A NetworkXError is raised if this is not the case.
320
+ If False, :func:`to_networkx_graph` is used to try to determine
321
+ the dict's graph data structure as either a dict-of-dict-of-dict
322
+ keyed by node to neighbor to edge data, or a dict-of-iterable
323
+ keyed by node to neighbors.
324
+ If None, the treatment for True is tried, but if it fails,
325
+ the treatment for False is tried.
326
+
327
+ attr : keyword arguments, optional (default= no attributes)
328
+ Attributes to add to graph as key=value pairs.
329
+
330
+ See Also
331
+ --------
332
+ convert
333
+
334
+ Examples
335
+ --------
336
+ >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc
337
+ >>> G = nx.Graph(name="my graph")
338
+ >>> e = [(1, 2), (2, 3), (3, 4)] # list of edges
339
+ >>> G = nx.Graph(e)
340
+
341
+ Arbitrary graph attribute pairs (key=value) may be assigned
342
+
343
+ >>> G = nx.Graph(e, day="Friday")
344
+ >>> G.graph
345
+ {'day': 'Friday'}
346
+
347
+ """
348
+ # multigraph_input can be None/True/False. So check "is not False"
349
+ if isinstance(incoming_graph_data, dict) and multigraph_input is not False:
350
+ DiGraph.__init__(self)
351
+ try:
352
+ convert.from_dict_of_dicts(
353
+ incoming_graph_data, create_using=self, multigraph_input=True
354
+ )
355
+ self.graph.update(attr)
356
+ except Exception as err:
357
+ if multigraph_input is True:
358
+ raise nx.NetworkXError(
359
+ f"converting multigraph_input raised:\n{type(err)}: {err}"
360
+ )
361
+ DiGraph.__init__(self, incoming_graph_data, **attr)
362
+ else:
363
+ DiGraph.__init__(self, incoming_graph_data, **attr)
364
+
365
+ @cached_property
366
+ def adj(self):
367
+ """Graph adjacency object holding the neighbors of each node.
368
+
369
+ This object is a read-only dict-like structure with node keys
370
+ and neighbor-dict values. The neighbor-dict is keyed by neighbor
371
+ to the edgekey-dict. So `G.adj[3][2][0]['color'] = 'blue'` sets
372
+ the color of the edge `(3, 2, 0)` to `"blue"`.
373
+
374
+ Iterating over G.adj behaves like a dict. Useful idioms include
375
+ `for nbr, datadict in G.adj[n].items():`.
376
+
377
+ The neighbor information is also provided by subscripting the graph.
378
+ So `for nbr, foovalue in G[node].data('foo', default=1):` works.
379
+
380
+ For directed graphs, `G.adj` holds outgoing (successor) info.
381
+ """
382
+ return MultiAdjacencyView(self._succ)
383
+
384
+ @cached_property
385
+ def succ(self):
386
+ """Graph adjacency object holding the successors of each node.
387
+
388
+ This object is a read-only dict-like structure with node keys
389
+ and neighbor-dict values. The neighbor-dict is keyed by neighbor
390
+ to the edgekey-dict. So `G.adj[3][2][0]['color'] = 'blue'` sets
391
+ the color of the edge `(3, 2, 0)` to `"blue"`.
392
+
393
+ Iterating over G.adj behaves like a dict. Useful idioms include
394
+ `for nbr, datadict in G.adj[n].items():`.
395
+
396
+ The neighbor information is also provided by subscripting the graph.
397
+ So `for nbr, foovalue in G[node].data('foo', default=1):` works.
398
+
399
+ For directed graphs, `G.succ` is identical to `G.adj`.
400
+ """
401
+ return MultiAdjacencyView(self._succ)
402
+
403
+ @cached_property
404
+ def pred(self):
405
+ """Graph adjacency object holding the predecessors of each node.
406
+
407
+ This object is a read-only dict-like structure with node keys
408
+ and neighbor-dict values. The neighbor-dict is keyed by neighbor
409
+ to the edgekey-dict. So `G.adj[3][2][0]['color'] = 'blue'` sets
410
+ the color of the edge `(3, 2, 0)` to `"blue"`.
411
+
412
+ Iterating over G.adj behaves like a dict. Useful idioms include
413
+ `for nbr, datadict in G.adj[n].items():`.
414
+ """
415
+ return MultiAdjacencyView(self._pred)
416
+
417
+ def add_edge(self, u_for_edge, v_for_edge, key=None, **attr):
418
+ """Add an edge between u and v.
419
+
420
+ The nodes u and v will be automatically added if they are
421
+ not already in the graph.
422
+
423
+ Edge attributes can be specified with keywords or by directly
424
+ accessing the edge's attribute dictionary. See examples below.
425
+
426
+ Parameters
427
+ ----------
428
+ u_for_edge, v_for_edge : nodes
429
+ Nodes can be, for example, strings or numbers.
430
+ Nodes must be hashable (and not None) Python objects.
431
+ key : hashable identifier, optional (default=lowest unused integer)
432
+ Used to distinguish multiedges between a pair of nodes.
433
+ attr : keyword arguments, optional
434
+ Edge data (or labels or objects) can be assigned using
435
+ keyword arguments.
436
+
437
+ Returns
438
+ -------
439
+ The edge key assigned to the edge.
440
+
441
+ See Also
442
+ --------
443
+ add_edges_from : add a collection of edges
444
+
445
+ Notes
446
+ -----
447
+ To replace/update edge data, use the optional key argument
448
+ to identify a unique edge. Otherwise a new edge will be created.
449
+
450
+ NetworkX algorithms designed for weighted graphs cannot use
451
+ multigraphs directly because it is not clear how to handle
452
+ multiedge weights. Convert to Graph using edge attribute
453
+ 'weight' to enable weighted graph algorithms.
454
+
455
+ Default keys are generated using the method `new_edge_key()`.
456
+ This method can be overridden by subclassing the base class and
457
+ providing a custom `new_edge_key()` method.
458
+
459
+ Examples
460
+ --------
461
+ The following all add the edge e=(1, 2) to graph G:
462
+
463
+ >>> G = nx.MultiDiGraph()
464
+ >>> e = (1, 2)
465
+ >>> key = G.add_edge(1, 2) # explicit two-node form
466
+ >>> G.add_edge(*e) # single edge as tuple of two nodes
467
+ 1
468
+ >>> G.add_edges_from([(1, 2)]) # add edges from iterable container
469
+ [2]
470
+
471
+ Associate data to edges using keywords:
472
+
473
+ >>> key = G.add_edge(1, 2, weight=3)
474
+ >>> key = G.add_edge(1, 2, key=0, weight=4) # update data for key=0
475
+ >>> key = G.add_edge(1, 3, weight=7, capacity=15, length=342.7)
476
+
477
+ For non-string attribute keys, use subscript notation.
478
+
479
+ >>> ekey = G.add_edge(1, 2)
480
+ >>> G[1][2][0].update({0: 5})
481
+ >>> G.edges[1, 2, 0].update({0: 5})
482
+ """
483
+ u, v = u_for_edge, v_for_edge
484
+ # add nodes
485
+ if u not in self._succ:
486
+ if u is None:
487
+ raise ValueError("None cannot be a node")
488
+ self._succ[u] = self.adjlist_inner_dict_factory()
489
+ self._pred[u] = self.adjlist_inner_dict_factory()
490
+ self._node[u] = self.node_attr_dict_factory()
491
+ if v not in self._succ:
492
+ if v is None:
493
+ raise ValueError("None cannot be a node")
494
+ self._succ[v] = self.adjlist_inner_dict_factory()
495
+ self._pred[v] = self.adjlist_inner_dict_factory()
496
+ self._node[v] = self.node_attr_dict_factory()
497
+ if key is None:
498
+ key = self.new_edge_key(u, v)
499
+ if v in self._succ[u]:
500
+ keydict = self._adj[u][v]
501
+ datadict = keydict.get(key, self.edge_attr_dict_factory())
502
+ datadict.update(attr)
503
+ keydict[key] = datadict
504
+ else:
505
+ # selfloops work this way without special treatment
506
+ datadict = self.edge_attr_dict_factory()
507
+ datadict.update(attr)
508
+ keydict = self.edge_key_dict_factory()
509
+ keydict[key] = datadict
510
+ self._succ[u][v] = keydict
511
+ self._pred[v][u] = keydict
512
+ nx._clear_cache(self)
513
+ return key
514
+
515
+ def remove_edge(self, u, v, key=None):
516
+ """Remove an edge between u and v.
517
+
518
+ Parameters
519
+ ----------
520
+ u, v : nodes
521
+ Remove an edge between nodes u and v.
522
+ key : hashable identifier, optional (default=None)
523
+ Used to distinguish multiple edges between a pair of nodes.
524
+ If None, remove a single edge between u and v. If there are
525
+ multiple edges, removes the last edge added in terms of
526
+ insertion order.
527
+
528
+ Raises
529
+ ------
530
+ NetworkXError
531
+ If there is not an edge between u and v, or
532
+ if there is no edge with the specified key.
533
+
534
+ See Also
535
+ --------
536
+ remove_edges_from : remove a collection of edges
537
+
538
+ Examples
539
+ --------
540
+ >>> G = nx.MultiDiGraph()
541
+ >>> nx.add_path(G, [0, 1, 2, 3])
542
+ >>> G.remove_edge(0, 1)
543
+ >>> e = (1, 2)
544
+ >>> G.remove_edge(*e) # unpacks e from an edge tuple
545
+
546
+ For multiple edges
547
+
548
+ >>> G = nx.MultiDiGraph()
549
+ >>> G.add_edges_from([(1, 2), (1, 2), (1, 2)]) # key_list returned
550
+ [0, 1, 2]
551
+
552
+ When ``key=None`` (the default), edges are removed in the reverse of
553
+ the order in which they were added:
554
+
555
+ >>> G.remove_edge(1, 2)
556
+ >>> G.edges(keys=True)
557
+ OutMultiEdgeView([(1, 2, 0), (1, 2, 1)])
558
+
559
+ For edges with keys
560
+
561
+ >>> G = nx.MultiDiGraph()
562
+ >>> G.add_edge(1, 2, key="first")
563
+ 'first'
564
+ >>> G.add_edge(1, 2, key="second")
565
+ 'second'
566
+ >>> G.remove_edge(1, 2, key="first")
567
+ >>> G.edges(keys=True)
568
+ OutMultiEdgeView([(1, 2, 'second')])
569
+
570
+ """
571
+ try:
572
+ d = self._adj[u][v]
573
+ except KeyError as err:
574
+ raise NetworkXError(f"The edge {u}-{v} is not in the graph.") from err
575
+ # remove the edge with specified data
576
+ if key is None:
577
+ d.popitem()
578
+ else:
579
+ try:
580
+ del d[key]
581
+ except KeyError as err:
582
+ msg = f"The edge {u}-{v} with key {key} is not in the graph."
583
+ raise NetworkXError(msg) from err
584
+ if len(d) == 0:
585
+ # remove the key entries if last edge
586
+ del self._succ[u][v]
587
+ del self._pred[v][u]
588
+ nx._clear_cache(self)
589
+
590
+ @cached_property
591
+ def edges(self):
592
+ """An OutMultiEdgeView of the Graph as G.edges or G.edges().
593
+
594
+ edges(self, nbunch=None, data=False, keys=False, default=None)
595
+
596
+ The OutMultiEdgeView provides set-like operations on the edge-tuples
597
+ as well as edge attribute lookup. When called, it also provides
598
+ an EdgeDataView object which allows control of access to edge
599
+ attributes (but does not provide set-like operations).
600
+ Hence, ``G.edges[u, v, k]['color']`` provides the value of the color
601
+ attribute for the edge from ``u`` to ``v`` with key ``k`` while
602
+ ``for (u, v, k, c) in G.edges(data='color', default='red', keys=True):``
603
+ iterates through all the edges yielding the color attribute with
604
+ default `'red'` if no color attribute exists.
605
+
606
+ Edges are returned as tuples with optional data and keys
607
+ in the order (node, neighbor, key, data). If ``keys=True`` is not
608
+ provided, the tuples will just be (node, neighbor, data), but
609
+ multiple tuples with the same node and neighbor will be
610
+ generated when multiple edges between two nodes exist.
611
+
612
+ Parameters
613
+ ----------
614
+ nbunch : single node, container, or all nodes (default= all nodes)
615
+ The view will only report edges from these nodes.
616
+ data : string or bool, optional (default=False)
617
+ The edge attribute returned in 3-tuple (u, v, ddict[data]).
618
+ If True, return edge attribute dict in 3-tuple (u, v, ddict).
619
+ If False, return 2-tuple (u, v).
620
+ keys : bool, optional (default=False)
621
+ If True, return edge keys with each edge, creating (u, v, k,
622
+ d) tuples when data is also requested (the default) and (u,
623
+ v, k) tuples when data is not requested.
624
+ default : value, optional (default=None)
625
+ Value used for edges that don't have the requested attribute.
626
+ Only relevant if data is not True or False.
627
+
628
+ Returns
629
+ -------
630
+ edges : OutMultiEdgeView
631
+ A view of edge attributes; usually it iterates over (u, v),
632
+ (u, v, k), or (u, v, k, d) tuples of edges, but can also be
633
+ used for attribute lookup as ``edges[u, v, k]['foo']``.
634
+
635
+ Notes
636
+ -----
637
+ Nodes in nbunch that are not in the graph will be (quietly) ignored.
638
+ For directed graphs this returns the out-edges.
639
+
640
+ Examples
641
+ --------
642
+ >>> G = nx.MultiDiGraph()
643
+ >>> nx.add_path(G, [0, 1, 2])
644
+ >>> key = G.add_edge(2, 3, weight=5)
645
+ >>> key2 = G.add_edge(1, 2) # second edge between these nodes
646
+ >>> [e for e in G.edges()]
647
+ [(0, 1), (1, 2), (1, 2), (2, 3)]
648
+ >>> list(G.edges(data=True)) # default data is {} (empty dict)
649
+ [(0, 1, {}), (1, 2, {}), (1, 2, {}), (2, 3, {'weight': 5})]
650
+ >>> list(G.edges(data="weight", default=1))
651
+ [(0, 1, 1), (1, 2, 1), (1, 2, 1), (2, 3, 5)]
652
+ >>> list(G.edges(keys=True)) # default keys are integers
653
+ [(0, 1, 0), (1, 2, 0), (1, 2, 1), (2, 3, 0)]
654
+ >>> list(G.edges(data=True, keys=True))
655
+ [(0, 1, 0, {}), (1, 2, 0, {}), (1, 2, 1, {}), (2, 3, 0, {'weight': 5})]
656
+ >>> list(G.edges(data="weight", default=1, keys=True))
657
+ [(0, 1, 0, 1), (1, 2, 0, 1), (1, 2, 1, 1), (2, 3, 0, 5)]
658
+ >>> list(G.edges([0, 2]))
659
+ [(0, 1), (2, 3)]
660
+ >>> list(G.edges(0))
661
+ [(0, 1)]
662
+ >>> list(G.edges(1))
663
+ [(1, 2), (1, 2)]
664
+
665
+ See Also
666
+ --------
667
+ in_edges, out_edges
668
+ """
669
+ return OutMultiEdgeView(self)
670
+
671
+ # alias out_edges to edges
672
+ @cached_property
673
+ def out_edges(self):
674
+ return OutMultiEdgeView(self)
675
+
676
+ out_edges.__doc__ = edges.__doc__
677
+
678
+ @cached_property
679
+ def in_edges(self):
680
+ """A view of the in edges of the graph as G.in_edges or G.in_edges().
681
+
682
+ in_edges(self, nbunch=None, data=False, keys=False, default=None)
683
+
684
+ Parameters
685
+ ----------
686
+ nbunch : single node, container, or all nodes (default= all nodes)
687
+ The view will only report edges incident to these nodes.
688
+ data : string or bool, optional (default=False)
689
+ The edge attribute returned in 3-tuple (u, v, ddict[data]).
690
+ If True, return edge attribute dict in 3-tuple (u, v, ddict).
691
+ If False, return 2-tuple (u, v).
692
+ keys : bool, optional (default=False)
693
+ If True, return edge keys with each edge, creating 3-tuples
694
+ (u, v, k) or with data, 4-tuples (u, v, k, d).
695
+ default : value, optional (default=None)
696
+ Value used for edges that don't have the requested attribute.
697
+ Only relevant if data is not True or False.
698
+
699
+ Returns
700
+ -------
701
+ in_edges : InMultiEdgeView or InMultiEdgeDataView
702
+ A view of edge attributes, usually it iterates over (u, v)
703
+ or (u, v, k) or (u, v, k, d) tuples of edges, but can also be
704
+ used for attribute lookup as `edges[u, v, k]['foo']`.
705
+
706
+ See Also
707
+ --------
708
+ edges
709
+ """
710
+ return InMultiEdgeView(self)
711
+
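The ``in_edges`` docstring above has no examples, so here is a brief usage sketch (illustrative, not part of the upstream file) showing the view reporting edges that point into the requested node, with and without keys and data.

```python
import networkx as nx

G = nx.MultiDiGraph()
G.add_edge(1, 2)
G.add_edge(3, 2, weight=5)

# in_edges reports edges pointing *into* the given nodes.
print(list(G.in_edges(2)))  # [(1, 2), (3, 2)]
print(list(G.in_edges(2, keys=True)))  # [(1, 2, 0), (3, 2, 0)]
print(list(G.in_edges(2, data="weight", default=1)))  # [(1, 2, 1), (3, 2, 5)]
```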
712
+ @cached_property
713
+ def degree(self):
714
+ """A DegreeView for the Graph as G.degree or G.degree().
715
+
716
+ The node degree is the number of edges adjacent to the node.
717
+ The weighted node degree is the sum of the edge weights for
718
+ edges incident to that node.
719
+
720
+ This object provides an iterator for (node, degree) as well as
721
+ lookup for the degree for a single node.
722
+
723
+ Parameters
724
+ ----------
725
+ nbunch : single node, container, or all nodes (default= all nodes)
726
+ The view will only report edges incident to these nodes.
727
+
728
+ weight : string or None, optional (default=None)
729
+ The name of an edge attribute that holds the numerical value used
730
+ as a weight. If None, then each edge has weight 1.
731
+ The degree is the sum of the edge weights adjacent to the node.
732
+
733
+ Returns
734
+ -------
735
+ DiMultiDegreeView or int
736
+ If multiple nodes are requested (the default), returns a `DiMultiDegreeView`
737
+ mapping nodes to their degree.
738
+ If a single node is requested, returns the degree of the node as an integer.
739
+
740
+ See Also
741
+ --------
742
+ out_degree, in_degree
743
+
744
+ Examples
745
+ --------
746
+ >>> G = nx.MultiDiGraph()
747
+ >>> nx.add_path(G, [0, 1, 2, 3])
748
+ >>> G.degree(0) # node 0 with degree 1
749
+ 1
750
+ >>> list(G.degree([0, 1, 2]))
751
+ [(0, 1), (1, 2), (2, 2)]
752
+ >>> G.add_edge(0, 1) # parallel edge
753
+ 1
754
+ >>> list(G.degree([0, 1, 2])) # parallel edges are counted
755
+ [(0, 2), (1, 3), (2, 2)]
756
+
757
+ """
758
+ return DiMultiDegreeView(self)
759
+
760
+ @cached_property
761
+ def in_degree(self):
762
+ """A DegreeView for (node, in_degree) or in_degree for single node.
763
+
764
+ The node in-degree is the number of edges pointing into the node.
765
+ The weighted node degree is the sum of the edge weights for
766
+ edges incident to that node.
767
+
768
+ This object provides an iterator for (node, degree) as well as
769
+ lookup for the degree for a single node.
770
+
771
+ Parameters
772
+ ----------
773
+ nbunch : single node, container, or all nodes (default= all nodes)
774
+ The view will only report edges incident to these nodes.
775
+
776
+ weight : string or None, optional (default=None)
777
+ The edge attribute that holds the numerical value used
778
+ as a weight. If None, then each edge has weight 1.
779
+ The degree is the sum of the edge weights adjacent to the node.
780
+
781
+ Returns
782
+ -------
783
+ InMultiDegreeView or int
785
+ If multiple nodes are requested (the default), returns an
786
+ `InMultiDegreeView` mapping nodes to their in-degree.
787
+ If a single node is requested, returns its in-degree as an integer.
790
+
791
+ See Also
792
+ --------
793
+ degree, out_degree
794
+
795
+ Examples
796
+ --------
797
+ >>> G = nx.MultiDiGraph()
798
+ >>> nx.add_path(G, [0, 1, 2, 3])
799
+ >>> G.in_degree(0) # node 0 with degree 0
800
+ 0
801
+ >>> list(G.in_degree([0, 1, 2]))
802
+ [(0, 0), (1, 1), (2, 1)]
803
+ >>> G.add_edge(0, 1) # parallel edge
804
+ 1
805
+ >>> list(G.in_degree([0, 1, 2])) # parallel edges counted
806
+ [(0, 0), (1, 2), (2, 1)]
807
+
808
+ """
809
+ return InMultiDegreeView(self)
810
+
811
+ @cached_property
812
+ def out_degree(self):
813
+ """Returns an iterator for (node, out-degree) or out-degree for single node.
814
+
815
+ out_degree(self, nbunch=None, weight=None)
816
+
817
+ The node out-degree is the number of edges pointing out of the node.
818
+ This function returns the out-degree for a single node, or an iterator
819
+ over (node, out-degree) pairs for a bunch of nodes (all nodes by default).
820
+
821
+ Parameters
822
+ ----------
823
+ nbunch : single node, container, or all nodes (default= all nodes)
824
+ The view will only report edges incident to these nodes.
825
+
826
+ weight : string or None, optional (default=None)
827
+ The edge attribute that holds the numerical value used
828
+ as a weight. If None, then each edge has weight 1.
829
+ The degree is the sum of the edge weights.
830
+
831
+ Returns
832
+ -------
833
+ OutMultiDegreeView or int
834
+ If multiple nodes are requested (the default), returns an
835
+ `OutMultiDegreeView` mapping nodes to their out-degree.
836
+ If a single node is requested, returns its out-degree as an integer.
840
+
841
+ See Also
842
+ --------
843
+ degree, in_degree
844
+
845
+ Examples
846
+ --------
847
+ >>> G = nx.MultiDiGraph()
848
+ >>> nx.add_path(G, [0, 1, 2, 3])
849
+ >>> G.out_degree(0) # node 0 with degree 1
850
+ 1
851
+ >>> list(G.out_degree([0, 1, 2]))
852
+ [(0, 1), (1, 1), (2, 1)]
853
+ >>> G.add_edge(0, 1) # parallel edge
854
+ 1
855
+ >>> list(G.out_degree([0, 1, 2])) # counts parallel edges
856
+ [(0, 2), (1, 1), (2, 1)]
857
+
858
+ """
859
+ return OutMultiDegreeView(self)
860
+
861
+ def is_multigraph(self):
862
+ """Returns True if graph is a multigraph, False otherwise."""
863
+ return True
864
+
865
+ def is_directed(self):
866
+ """Returns True if graph is directed, False otherwise."""
867
+ return True
868
+
869
+ def to_undirected(self, reciprocal=False, as_view=False):
870
+ """Returns an undirected representation of the digraph.
871
+
872
+ Parameters
873
+ ----------
874
+ reciprocal : bool (optional)
875
+ If True, only keep edges that appear in both directions
876
+ in the original digraph.
877
+ as_view : bool (optional, default=False)
878
+ If True return an undirected view of the original directed graph.
879
+
880
+ Returns
881
+ -------
882
+ G : MultiGraph
883
+ An undirected graph with the same name and nodes and
884
+ with edge (u, v, data) if either (u, v, data) or (v, u, data)
885
+ is in the digraph. If both edges exist in digraph and
886
+ their edge data is different, only one edge is created
887
+ with an arbitrary choice of which edge data to use.
888
+ You must check and correct for this manually if desired.
889
+
890
+ See Also
891
+ --------
892
+ MultiGraph, copy, add_edge, add_edges_from
893
+
894
+ Notes
895
+ -----
896
+ This returns a "deepcopy" of the edge, node, and
897
+ graph attributes which attempts to completely copy
898
+ all of the data and references.
899
+
900
+ This is in contrast to the similar D=MultiDiGraph(G) which
901
+ returns a shallow copy of the data.
902
+
903
+ See the Python copy module for more information on shallow
904
+ and deep copies, https://docs.python.org/3/library/copy.html.
905
+
906
+ Warning: If you have subclassed MultiDiGraph to use dict-like
907
+ objects in the data structure, those changes do not transfer
908
+ to the MultiGraph created by this method.
909
+
910
+ Examples
911
+ --------
912
+ >>> G = nx.path_graph(2) # or MultiGraph, etc
913
+ >>> H = G.to_directed()
914
+ >>> list(H.edges)
915
+ [(0, 1), (1, 0)]
916
+ >>> G2 = H.to_undirected()
917
+ >>> list(G2.edges)
918
+ [(0, 1)]
919
+ """
920
+ graph_class = self.to_undirected_class()
921
+ if as_view is True:
922
+ return nx.graphviews.generic_graph_view(self, graph_class)
923
+ # deepcopy when not a view
924
+ G = graph_class()
925
+ G.graph.update(deepcopy(self.graph))
926
+ G.add_nodes_from((n, deepcopy(d)) for n, d in self._node.items())
927
+ if reciprocal is True:
928
+ G.add_edges_from(
929
+ (u, v, key, deepcopy(data))
930
+ for u, nbrs in self._adj.items()
931
+ for v, keydict in nbrs.items()
932
+ for key, data in keydict.items()
933
+ if v in self._pred[u] and key in self._pred[u][v]
934
+ )
935
+ else:
936
+ G.add_edges_from(
937
+ (u, v, key, deepcopy(data))
938
+ for u, nbrs in self._adj.items()
939
+ for v, keydict in nbrs.items()
940
+ for key, data in keydict.items()
941
+ )
942
+ return G
943
+
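A sketch of the ``reciprocal`` behavior described in the docstring above (illustrative, not part of the upstream file): by default every directed edge produces an undirected one, whereas ``reciprocal=True`` keeps only edge keys present in both directions. Note that for multigraphs, a reciprocal pair sharing the same key collapses into a single undirected edge either way.

```python
import networkx as nx

G = nx.MultiDiGraph([(0, 1), (1, 0), (1, 2)])

# Default: any directed edge yields an undirected edge; the reciprocal
# pair (0, 1)/(1, 0) shares key 0, so it merges into one edge.
print(sorted(G.to_undirected().edges()))  # [(0, 1), (1, 2)]

# reciprocal=True keeps only edges whose key exists in both directions.
print(sorted(G.to_undirected(reciprocal=True).edges()))  # [(0, 1)]
```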
944
+ def reverse(self, copy=True):
945
+ """Returns the reverse of the graph.
946
+
947
+ The reverse is a graph with the same nodes and edges
948
+ but with the directions of the edges reversed.
949
+
950
+ Parameters
951
+ ----------
952
+ copy : bool optional (default=True)
953
+ If True, return a new DiGraph holding the reversed edges.
954
+ If False, the reverse graph is created using a view of
955
+ the original graph.
956
+ """
957
+ if copy:
958
+ H = self.__class__()
959
+ H.graph.update(deepcopy(self.graph))
960
+ H.add_nodes_from((n, deepcopy(d)) for n, d in self._node.items())
961
+ H.add_edges_from(
962
+ (v, u, k, deepcopy(d))
963
+ for u, v, k, d in self.edges(keys=True, data=True)
964
+ )
965
+ return H
966
+ return nx.reverse_view(self)
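To illustrate the ``copy`` parameter of ``reverse`` (a usage sketch, not part of the upstream file): ``copy=True`` builds an independent reversed graph, while ``copy=False`` returns a read-only view backed by the original, so later mutations of the original show through.

```python
import networkx as nx

G = nx.MultiDiGraph([(0, 1), (1, 2)])

# copy=True (the default) builds an independent reversed graph.
R = G.reverse()
print(sorted(R.edges()))  # [(1, 0), (2, 1)]

# copy=False returns a view backed by G: later changes to G show up.
V = G.reverse(copy=False)
G.add_edge(2, 3)
print((3, 2) in V.edges())  # True
print((3, 2) in R.edges())  # False
```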
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__init__.py ADDED
File without changes
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (176 Bytes). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/dispatch_interface.cpython-310.pyc ADDED
Binary file (5.49 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/historical_tests.cpython-310.pyc ADDED
Binary file (14 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_coreviews.cpython-310.pyc ADDED
Binary file (13.4 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_digraph.cpython-310.pyc ADDED
Binary file (13.2 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_digraph_historical.cpython-310.pyc ADDED
Binary file (4.88 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_graph.cpython-310.pyc ADDED
Binary file (31.7 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_graph_historical.cpython-310.pyc ADDED
Binary file (693 Bytes). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_multidigraph.cpython-310.pyc ADDED
Binary file (14.7 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_reportviews.cpython-310.pyc ADDED
Binary file (41.6 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_special.cpython-310.pyc ADDED
Binary file (5.16 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/__pycache__/test_subgraphviews.cpython-310.pyc ADDED
Binary file (12.8 kB). View file
 
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_coreviews.py ADDED
@@ -0,0 +1,362 @@
1
+ import pickle
2
+
3
+ import pytest
4
+
5
+ import networkx as nx
6
+
7
+
8
+ class TestAtlasView:
9
+ # node->data
10
+ def setup_method(self):
11
+ self.d = {0: {"color": "blue", "weight": 1.2}, 1: {}, 2: {"color": 1}}
12
+ self.av = nx.classes.coreviews.AtlasView(self.d)
13
+
14
+ def test_pickle(self):
15
+ view = self.av
16
+ pview = pickle.loads(pickle.dumps(view, -1))
17
+ assert view == pview
18
+ assert view.__slots__ == pview.__slots__
19
+ pview = pickle.loads(pickle.dumps(view))
20
+ assert view == pview
21
+ assert view.__slots__ == pview.__slots__
22
+
23
+ def test_len(self):
24
+ assert len(self.av) == len(self.d)
25
+
26
+ def test_iter(self):
27
+ assert list(self.av) == list(self.d)
28
+
29
+ def test_getitem(self):
30
+ assert self.av[1] is self.d[1]
31
+ assert self.av[2]["color"] == 1
32
+ pytest.raises(KeyError, self.av.__getitem__, 3)
33
+
34
+ def test_copy(self):
35
+ avcopy = self.av.copy()
36
+ assert avcopy[0] == self.av[0]
37
+ assert avcopy == self.av
38
+ assert avcopy[0] is not self.av[0]
39
+ assert avcopy is not self.av
40
+ avcopy[5] = {}
41
+ assert avcopy != self.av
42
+
43
+ avcopy[0]["ht"] = 4
44
+ assert avcopy[0] != self.av[0]
45
+ self.av[0]["ht"] = 4
46
+ assert avcopy[0] == self.av[0]
47
+ del self.av[0]["ht"]
48
+
49
+ assert not hasattr(self.av, "__setitem__")
50
+
51
+ def test_items(self):
52
+ assert sorted(self.av.items()) == sorted(self.d.items())
53
+
54
+ def test_str(self):
55
+ out = str(self.d)
56
+ assert str(self.av) == out
57
+
58
+ def test_repr(self):
59
+ out = "AtlasView(" + str(self.d) + ")"
60
+ assert repr(self.av) == out
61
+
62
+
63
+ class TestAdjacencyView:
64
+ # node->nbr->data
65
+ def setup_method(self):
66
+ dd = {"color": "blue", "weight": 1.2}
67
+ self.nd = {0: dd, 1: {}, 2: {"color": 1}}
68
+ self.adj = {3: self.nd, 0: {3: dd}, 1: {}, 2: {3: {"color": 1}}}
69
+ self.adjview = nx.classes.coreviews.AdjacencyView(self.adj)
70
+
71
+ def test_pickle(self):
72
+ view = self.adjview
73
+ pview = pickle.loads(pickle.dumps(view, -1))
74
+ assert view == pview
75
+ assert view.__slots__ == pview.__slots__
76
+
77
+ def test_len(self):
78
+ assert len(self.adjview) == len(self.adj)
79
+
80
+ def test_iter(self):
81
+ assert list(self.adjview) == list(self.adj)
82
+
83
+ def test_getitem(self):
84
+ assert self.adjview[1] is not self.adj[1]
85
+ assert self.adjview[3][0] is self.adjview[0][3]
86
+ assert self.adjview[2][3]["color"] == 1
87
+ pytest.raises(KeyError, self.adjview.__getitem__, 4)
88
+
89
+ def test_copy(self):
90
+ avcopy = self.adjview.copy()
91
+ assert avcopy[0] == self.adjview[0]
92
+ assert avcopy[0] is not self.adjview[0]
93
+
94
+ avcopy[2][3]["ht"] = 4
95
+ assert avcopy[2] != self.adjview[2]
96
+ self.adjview[2][3]["ht"] = 4
97
+ assert avcopy[2] == self.adjview[2]
98
+ del self.adjview[2][3]["ht"]
99
+
100
+ assert not hasattr(self.adjview, "__setitem__")
101
+
102
+ def test_items(self):
103
+ view_items = sorted((n, dict(d)) for n, d in self.adjview.items())
104
+ assert view_items == sorted(self.adj.items())
105
+
106
+ def test_str(self):
107
+ out = str(dict(self.adj))
108
+ assert str(self.adjview) == out
109
+
110
+ def test_repr(self):
111
+ out = self.adjview.__class__.__name__ + "(" + str(self.adj) + ")"
112
+ assert repr(self.adjview) == out
113
+
114
+
115
+ class TestMultiAdjacencyView(TestAdjacencyView):
116
+ # node->nbr->key->data
117
+ def setup_method(self):
118
+ dd = {"color": "blue", "weight": 1.2}
119
+ self.kd = {0: dd, 1: {}, 2: {"color": 1}}
120
+ self.nd = {3: self.kd, 0: {3: dd}, 1: {0: {}}, 2: {3: {"color": 1}}}
121
+ self.adj = {3: self.nd, 0: {3: {3: dd}}, 1: {}, 2: {3: {8: {}}}}
122
+ self.adjview = nx.classes.coreviews.MultiAdjacencyView(self.adj)
123
+
124
+ def test_getitem(self):
125
+ assert self.adjview[1] is not self.adj[1]
126
+ assert self.adjview[3][0][3] is self.adjview[0][3][3]
127
+ assert self.adjview[3][2][3]["color"] == 1
128
+ pytest.raises(KeyError, self.adjview.__getitem__, 4)
129
+
130
+ def test_copy(self):
131
+ avcopy = self.adjview.copy()
132
+ assert avcopy[0] == self.adjview[0]
133
+ assert avcopy[0] is not self.adjview[0]
134
+
135
+ avcopy[2][3][8]["ht"] = 4
136
+ assert avcopy[2] != self.adjview[2]
137
+ self.adjview[2][3][8]["ht"] = 4
138
+ assert avcopy[2] == self.adjview[2]
139
+ del self.adjview[2][3][8]["ht"]
140
+
141
+ assert not hasattr(self.adjview, "__setitem__")
142
+
143
+
144
+ class TestUnionAtlas:
145
+ # node->data
146
+ def setup_method(self):
147
+ self.s = {0: {"color": "blue", "weight": 1.2}, 1: {}, 2: {"color": 1}}
148
+ self.p = {3: {"color": "blue", "weight": 1.2}, 4: {}, 2: {"watch": 2}}
149
+ self.av = nx.classes.coreviews.UnionAtlas(self.s, self.p)
150
+
151
+ def test_pickle(self):
152
+ view = self.av
153
+ pview = pickle.loads(pickle.dumps(view, -1))
154
+ assert view == pview
155
+ assert view.__slots__ == pview.__slots__
156
+
157
+ def test_len(self):
158
+ assert len(self.av) == len(self.s.keys() | self.p.keys()) == 5
159
+
160
+ def test_iter(self):
161
+ assert set(self.av) == set(self.s) | set(self.p)
162
+
163
+ def test_getitem(self):
164
+ assert self.av[0] is self.s[0]
165
+ assert self.av[4] is self.p[4]
166
+ assert self.av[2]["color"] == 1
167
+ pytest.raises(KeyError, self.av[2].__getitem__, "watch")
168
+ pytest.raises(KeyError, self.av.__getitem__, 8)
169
+
170
+ def test_copy(self):
171
+ avcopy = self.av.copy()
172
+ assert avcopy[0] == self.av[0]
173
+ assert avcopy[0] is not self.av[0]
174
+ assert avcopy is not self.av
175
+ avcopy[5] = {}
176
+ assert avcopy != self.av
177
+
178
+ avcopy[0]["ht"] = 4
179
+ assert avcopy[0] != self.av[0]
180
+ self.av[0]["ht"] = 4
181
+ assert avcopy[0] == self.av[0]
182
+ del self.av[0]["ht"]
183
+
184
+ assert not hasattr(self.av, "__setitem__")
185
+
186
+ def test_items(self):
187
+ expected = dict(self.p.items())
188
+ expected.update(self.s)
189
+ assert sorted(self.av.items()) == sorted(expected.items())
190
+
191
+ def test_str(self):
192
+ out = str(dict(self.av))
193
+ assert str(self.av) == out
194
+
195
+ def test_repr(self):
196
+ out = f"{self.av.__class__.__name__}({self.s}, {self.p})"
197
+ assert repr(self.av) == out
198
+
199
+
200
+ class TestUnionAdjacency:
201
+ # node->nbr->data
202
+ def setup_method(self):
203
+ dd = {"color": "blue", "weight": 1.2}
204
+ self.nd = {0: dd, 1: {}, 2: {"color": 1}}
205
+ self.s = {3: self.nd, 0: {}, 1: {}, 2: {3: {"color": 1}}}
206
+ self.p = {3: {}, 0: {3: dd}, 1: {0: {}}, 2: {1: {"color": 1}}}
207
+ self.adjview = nx.classes.coreviews.UnionAdjacency(self.s, self.p)
208
+
209
+ def test_pickle(self):
210
+ view = self.adjview
211
+ pview = pickle.loads(pickle.dumps(view, -1))
212
+ assert view == pview
213
+ assert view.__slots__ == pview.__slots__
214
+
215
+ def test_len(self):
216
+ assert len(self.adjview) == len(self.s)
217
+
218
+ def test_iter(self):
219
+ assert sorted(self.adjview) == sorted(self.s)
220
+
221
+ def test_getitem(self):
222
+ assert self.adjview[1] is not self.s[1]
223
+ assert self.adjview[3][0] is self.adjview[0][3]
224
+ assert self.adjview[2][3]["color"] == 1
225
+ pytest.raises(KeyError, self.adjview.__getitem__, 4)
226
+
227
+ def test_copy(self):
228
+ avcopy = self.adjview.copy()
229
+ assert avcopy[0] == self.adjview[0]
230
+ assert avcopy[0] is not self.adjview[0]
231
+
232
+ avcopy[2][3]["ht"] = 4
233
+ assert avcopy[2] != self.adjview[2]
234
+ self.adjview[2][3]["ht"] = 4
235
+ assert avcopy[2] == self.adjview[2]
236
+ del self.adjview[2][3]["ht"]
237
+
238
+ assert not hasattr(self.adjview, "__setitem__")
239
+
240
+ def test_str(self):
241
+ out = str(dict(self.adjview))
242
+ assert str(self.adjview) == out
243
+
244
+ def test_repr(self):
245
+ clsname = self.adjview.__class__.__name__
246
+ out = f"{clsname}({self.s}, {self.p})"
247
+ assert repr(self.adjview) == out
248
+
249
+
250
+ class TestUnionMultiInner(TestUnionAdjacency):
251
+ # nbr->key->data
252
+ def setup_method(self):
253
+ dd = {"color": "blue", "weight": 1.2}
254
+ self.kd = {7: {}, "ekey": {}, 9: {"color": 1}}
255
+ self.s = {3: self.kd, 0: {7: dd}, 1: {}, 2: {"key": {"color": 1}}}
256
+ self.p = {3: {}, 0: {3: dd}, 1: {}, 2: {1: {"span": 2}}}
257
+ self.adjview = nx.classes.coreviews.UnionMultiInner(self.s, self.p)
258
+
259
+ def test_len(self):
260
+ assert len(self.adjview) == len(self.s.keys() | self.p.keys()) == 4
261
+
262
+ def test_getitem(self):
263
+ assert self.adjview[1] is not self.s[1]
264
+ assert self.adjview[0][7] is self.adjview[0][3]
265
+ assert self.adjview[2]["key"]["color"] == 1
266
+ assert self.adjview[2][1]["span"] == 2
267
+ pytest.raises(KeyError, self.adjview.__getitem__, 4)
268
+ pytest.raises(KeyError, self.adjview[1].__getitem__, "key")
269
+
270
+ def test_copy(self):
271
+ avcopy = self.adjview.copy()
272
+ assert avcopy[0] == self.adjview[0]
273
+ assert avcopy[0] is not self.adjview[0]
274
+
275
+ avcopy[2][1]["width"] = 8
276
+ assert avcopy[2] != self.adjview[2]
277
+ self.adjview[2][1]["width"] = 8
278
+ assert avcopy[2] == self.adjview[2]
279
+ del self.adjview[2][1]["width"]
280
+
281
+ assert not hasattr(self.adjview, "__setitem__")
282
+ assert hasattr(avcopy, "__setitem__")
283
+
284
+
285
+ class TestUnionMultiAdjacency(TestUnionAdjacency):
286
+ # node->nbr->key->data
287
+ def setup_method(self):
288
+ dd = {"color": "blue", "weight": 1.2}
289
+ self.kd = {7: {}, 8: {}, 9: {"color": 1}}
290
+ self.nd = {3: self.kd, 0: {9: dd}, 1: {8: {}}, 2: {9: {"color": 1}}}
291
+ self.s = {3: self.nd, 0: {3: {7: dd}}, 1: {}, 2: {3: {8: {}}}}
292
+ self.p = {3: {}, 0: {3: {9: dd}}, 1: {}, 2: {1: {8: {}}}}
293
+ self.adjview = nx.classes.coreviews.UnionMultiAdjacency(self.s, self.p)
294
+
295
+ def test_getitem(self):
296
+ assert self.adjview[1] is not self.s[1]
297
+ assert self.adjview[3][0][9] is self.adjview[0][3][9]
298
+ assert self.adjview[3][2][9]["color"] == 1
299
+ pytest.raises(KeyError, self.adjview.__getitem__, 4)
300
+
301
+ def test_copy(self):
302
+ avcopy = self.adjview.copy()
303
+ assert avcopy[0] == self.adjview[0]
304
+ assert avcopy[0] is not self.adjview[0]
305
+
306
+ avcopy[2][3][8]["ht"] = 4
307
+ assert avcopy[2] != self.adjview[2]
308
+ self.adjview[2][3][8]["ht"] = 4
309
+ assert avcopy[2] == self.adjview[2]
310
+ del self.adjview[2][3][8]["ht"]
311
+
312
+ assert not hasattr(self.adjview, "__setitem__")
313
+ assert hasattr(avcopy, "__setitem__")
314
+
315
+
316
+ class TestFilteredGraphs:
317
+ def setup_method(self):
318
+ self.Graphs = [nx.Graph, nx.DiGraph, nx.MultiGraph, nx.MultiDiGraph]
319
+
320
+ def test_hide_show_nodes(self):
321
+ SubGraph = nx.subgraph_view
322
+ for Graph in self.Graphs:
323
+ G = nx.path_graph(4, Graph)
324
+ SG = G.subgraph([2, 3])
325
+ RG = SubGraph(G, filter_node=nx.filters.hide_nodes([0, 1]))
326
+ assert SG.nodes == RG.nodes
327
+ assert SG.edges == RG.edges
328
+ SGC = SG.copy()
329
+ RGC = RG.copy()
330
+ assert SGC.nodes == RGC.nodes
331
+ assert SGC.edges == RGC.edges
332
+
333
+ def test_str_repr(self):
334
+ SubGraph = nx.subgraph_view
335
+ for Graph in self.Graphs:
336
+ G = nx.path_graph(4, Graph)
337
+ SG = G.subgraph([2, 3])
338
+ RG = SubGraph(G, filter_node=nx.filters.hide_nodes([0, 1]))
339
+ str(SG.adj)
340
+ str(RG.adj)
341
+ repr(SG.adj)
342
+ repr(RG.adj)
343
+ str(SG.adj[2])
344
+ str(RG.adj[2])
345
+ repr(SG.adj[2])
346
+ repr(RG.adj[2])
347
+
348
+ def test_copy(self):
349
+ SubGraph = nx.subgraph_view
350
+ for Graph in self.Graphs:
351
+ G = nx.path_graph(4, Graph)
352
+ SG = G.subgraph([2, 3])
353
+ RG = SubGraph(G, filter_node=nx.filters.hide_nodes([0, 1]))
354
+ RsG = SubGraph(G, filter_node=nx.filters.show_nodes([2, 3]))
355
+ assert G.adj.copy() == G.adj
356
+ assert G.adj[2].copy() == G.adj[2]
357
+ assert SG.adj.copy() == SG.adj
358
+ assert SG.adj[2].copy() == SG.adj[2]
359
+ assert RG.adj.copy() == RG.adj
360
+ assert RG.adj[2].copy() == RG.adj[2]
361
+ assert RsG.adj.copy() == RsG.adj
362
+ assert RsG.adj[2].copy() == RsG.adj[2]
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_digraph_historical.py ADDED
@@ -0,0 +1,111 @@
1
+ """Original NetworkX graph tests"""
2
+
3
+ import pytest
4
+
5
+ import networkx
6
+ import networkx as nx
7
+
8
+ from .historical_tests import HistoricalTests
9
+
10
+
11
+ class TestDiGraphHistorical(HistoricalTests):
12
+ @classmethod
13
+ def setup_class(cls):
14
+ HistoricalTests.setup_class()
15
+ cls.G = nx.DiGraph
16
+
17
+ def test_in_degree(self):
18
+ G = self.G()
19
+ G.add_nodes_from("GJK")
20
+ G.add_edges_from([("A", "B"), ("A", "C"), ("B", "D"), ("B", "C"), ("C", "D")])
21
+
22
+ assert sorted(d for n, d in G.in_degree()) == [0, 0, 0, 0, 1, 2, 2]
23
+ assert dict(G.in_degree()) == {
24
+ "A": 0,
25
+ "C": 2,
26
+ "B": 1,
27
+ "D": 2,
28
+ "G": 0,
29
+ "K": 0,
30
+ "J": 0,
31
+ }
32
+
33
+ def test_out_degree(self):
34
+ G = self.G()
35
+ G.add_nodes_from("GJK")
36
+ G.add_edges_from([("A", "B"), ("A", "C"), ("B", "D"), ("B", "C"), ("C", "D")])
37
+ assert sorted(v for k, v in G.out_degree()) == [0, 0, 0, 0, 1, 2, 2]
38
+ assert dict(G.out_degree()) == {
39
+ "A": 2,
40
+ "C": 1,
41
+ "B": 2,
42
+ "D": 0,
43
+ "G": 0,
44
+ "K": 0,
45
+ "J": 0,
46
+ }
47
+
48
+ def test_degree_digraph(self):
49
+ H = nx.DiGraph()
50
+ H.add_edges_from([(1, 24), (1, 2)])
51
+ assert sorted(d for n, d in H.in_degree([1, 24])) == [0, 1]
52
+ assert sorted(d for n, d in H.out_degree([1, 24])) == [0, 2]
53
+ assert sorted(d for n, d in H.degree([1, 24])) == [1, 2]
54
+
55
+ def test_neighbors(self):
56
+ G = self.G()
57
+ G.add_nodes_from("GJK")
58
+ G.add_edges_from([("A", "B"), ("A", "C"), ("B", "D"), ("B", "C"), ("C", "D")])
59
+
60
+ assert sorted(G.neighbors("C")) == ["D"]
61
+ assert sorted(G["C"]) == ["D"]
62
+ assert sorted(G.neighbors("A")) == ["B", "C"]
63
+ pytest.raises(nx.NetworkXError, G.neighbors, "j")
64
+ pytest.raises(nx.NetworkXError, G.neighbors, "j")
65
+
66
+ def test_successors(self):
67
+ G = self.G()
68
+ G.add_nodes_from("GJK")
69
+ G.add_edges_from([("A", "B"), ("A", "C"), ("B", "D"), ("B", "C"), ("C", "D")])
70
+ assert sorted(G.successors("A")) == ["B", "C"]
71
+ assert sorted(G.successors("A")) == ["B", "C"]
72
+ assert sorted(G.successors("G")) == []
73
+ assert sorted(G.successors("D")) == []
74
+ assert sorted(G.successors("G")) == []
75
+ pytest.raises(nx.NetworkXError, G.successors, "j")
76
+ pytest.raises(nx.NetworkXError, G.successors, "j")
77
+
78
+ def test_predecessors(self):
79
+ G = self.G()
80
+ G.add_nodes_from("GJK")
81
+ G.add_edges_from([("A", "B"), ("A", "C"), ("B", "D"), ("B", "C"), ("C", "D")])
82
+ assert sorted(G.predecessors("C")) == ["A", "B"]
83
+ assert sorted(G.predecessors("C")) == ["A", "B"]
84
+ assert sorted(G.predecessors("G")) == []
85
+ assert sorted(G.predecessors("A")) == []
86
+ assert sorted(G.predecessors("G")) == []
87
+ assert sorted(G.predecessors("A")) == []
88
+ assert sorted(G.successors("D")) == []
89
+
90
+ pytest.raises(nx.NetworkXError, G.predecessors, "j")
91
+ pytest.raises(nx.NetworkXError, G.predecessors, "j")
92
+
93
+ def test_reverse(self):
94
+ G = nx.complete_graph(10)
95
+ H = G.to_directed()
96
+ HR = H.reverse()
97
+ assert nx.is_isomorphic(H, HR)
98
+ assert sorted(H.edges()) == sorted(HR.edges())
99
+
100
+ def test_reverse2(self):
101
+ H = nx.DiGraph()
102
+ foo = [H.add_edge(u, u + 1) for u in range(5)]
103
+ HR = H.reverse()
104
+ for u in range(5):
105
+ assert HR.has_edge(u + 1, u)
106
+
107
+ def test_reverse3(self):
108
+ H = nx.DiGraph()
109
+ H.add_nodes_from([1, 2, 3, 4])
110
+ HR = H.reverse()
111
+ assert sorted(HR.nodes()) == [1, 2, 3, 4]
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_filters.py ADDED
@@ -0,0 +1,177 @@
+ import pytest
+
+ import networkx as nx
+
+
+ class TestFilterFactory:
+     def test_no_filter(self):
+         nf = nx.filters.no_filter
+         assert nf()
+         assert nf(1)
+         assert nf(2, 1)
+
+     def test_hide_nodes(self):
+         f = nx.classes.filters.hide_nodes([1, 2, 3])
+         assert not f(1)
+         assert not f(2)
+         assert not f(3)
+         assert f(4)
+         assert f(0)
+         assert f("a")
+         pytest.raises(TypeError, f, 1, 2)
+         pytest.raises(TypeError, f)
+
+     def test_show_nodes(self):
+         f = nx.classes.filters.show_nodes([1, 2, 3])
+         assert f(1)
+         assert f(2)
+         assert f(3)
+         assert not f(4)
+         assert not f(0)
+         assert not f("a")
+         pytest.raises(TypeError, f, 1, 2)
+         pytest.raises(TypeError, f)
+
+     def test_hide_edges(self):
+         factory = nx.classes.filters.hide_edges
+         f = factory([(1, 2), (3, 4)])
+         assert not f(1, 2)
+         assert not f(3, 4)
+         assert not f(4, 3)
+         assert f(2, 3)
+         assert f(0, -1)
+         assert f("a", "b")
+         pytest.raises(TypeError, f, 1, 2, 3)
+         pytest.raises(TypeError, f, 1)
+         pytest.raises(TypeError, f)
+         pytest.raises(TypeError, factory, [1, 2, 3])
+         pytest.raises(ValueError, factory, [(1, 2, 3)])
+
+     def test_show_edges(self):
+         factory = nx.classes.filters.show_edges
+         f = factory([(1, 2), (3, 4)])
+         assert f(1, 2)
+         assert f(3, 4)
+         assert f(4, 3)
+         assert not f(2, 3)
+         assert not f(0, -1)
+         assert not f("a", "b")
+         pytest.raises(TypeError, f, 1, 2, 3)
+         pytest.raises(TypeError, f, 1)
+         pytest.raises(TypeError, f)
+         pytest.raises(TypeError, factory, [1, 2, 3])
+         pytest.raises(ValueError, factory, [(1, 2, 3)])
+
+     def test_hide_diedges(self):
+         factory = nx.classes.filters.hide_diedges
+         f = factory([(1, 2), (3, 4)])
+         assert not f(1, 2)
+         assert not f(3, 4)
+         assert f(4, 3)
+         assert f(2, 3)
+         assert f(0, -1)
+         assert f("a", "b")
+         pytest.raises(TypeError, f, 1, 2, 3)
+         pytest.raises(TypeError, f, 1)
+         pytest.raises(TypeError, f)
+         pytest.raises(TypeError, factory, [1, 2, 3])
+         pytest.raises(ValueError, factory, [(1, 2, 3)])
+
+     def test_show_diedges(self):
+         factory = nx.classes.filters.show_diedges
+         f = factory([(1, 2), (3, 4)])
+         assert f(1, 2)
+         assert f(3, 4)
+         assert not f(4, 3)
+         assert not f(2, 3)
+         assert not f(0, -1)
+         assert not f("a", "b")
+         pytest.raises(TypeError, f, 1, 2, 3)
+         pytest.raises(TypeError, f, 1)
+         pytest.raises(TypeError, f)
+         pytest.raises(TypeError, factory, [1, 2, 3])
+         pytest.raises(ValueError, factory, [(1, 2, 3)])
+
+     def test_hide_multiedges(self):
+         factory = nx.classes.filters.hide_multiedges
+         f = factory([(1, 2, 0), (3, 4, 1), (1, 2, 1)])
+         assert not f(1, 2, 0)
+         assert not f(1, 2, 1)
+         assert f(1, 2, 2)
+         assert f(3, 4, 0)
+         assert not f(3, 4, 1)
+         assert not f(4, 3, 1)
+         assert f(4, 3, 0)
+         assert f(2, 3, 0)
+         assert f(0, -1, 0)
+         assert f("a", "b", 0)
+         pytest.raises(TypeError, f, 1, 2, 3, 4)
+         pytest.raises(TypeError, f, 1, 2)
+         pytest.raises(TypeError, f, 1)
+         pytest.raises(TypeError, f)
+         pytest.raises(TypeError, factory, [1, 2, 3])
+         pytest.raises(ValueError, factory, [(1, 2)])
+         pytest.raises(ValueError, factory, [(1, 2, 3, 4)])
+
+     def test_show_multiedges(self):
+         factory = nx.classes.filters.show_multiedges
+         f = factory([(1, 2, 0), (3, 4, 1), (1, 2, 1)])
+         assert f(1, 2, 0)
+         assert f(1, 2, 1)
+         assert not f(1, 2, 2)
+         assert not f(3, 4, 0)
+         assert f(3, 4, 1)
+         assert f(4, 3, 1)
+         assert not f(4, 3, 0)
+         assert not f(2, 3, 0)
+         assert not f(0, -1, 0)
+         assert not f("a", "b", 0)
+         pytest.raises(TypeError, f, 1, 2, 3, 4)
+         pytest.raises(TypeError, f, 1, 2)
+         pytest.raises(TypeError, f, 1)
+         pytest.raises(TypeError, f)
+         pytest.raises(TypeError, factory, [1, 2, 3])
+         pytest.raises(ValueError, factory, [(1, 2)])
+         pytest.raises(ValueError, factory, [(1, 2, 3, 4)])
+
+     def test_hide_multidiedges(self):
+         factory = nx.classes.filters.hide_multidiedges
+         f = factory([(1, 2, 0), (3, 4, 1), (1, 2, 1)])
+         assert not f(1, 2, 0)
+         assert not f(1, 2, 1)
+         assert f(1, 2, 2)
+         assert f(3, 4, 0)
+         assert not f(3, 4, 1)
+         assert f(4, 3, 1)
+         assert f(4, 3, 0)
+         assert f(2, 3, 0)
+         assert f(0, -1, 0)
+         assert f("a", "b", 0)
+         pytest.raises(TypeError, f, 1, 2, 3, 4)
+         pytest.raises(TypeError, f, 1, 2)
+         pytest.raises(TypeError, f, 1)
+         pytest.raises(TypeError, f)
+         pytest.raises(TypeError, factory, [1, 2, 3])
+         pytest.raises(ValueError, factory, [(1, 2)])
+         pytest.raises(ValueError, factory, [(1, 2, 3, 4)])
+
+     def test_show_multidiedges(self):
+         factory = nx.classes.filters.show_multidiedges
+         f = factory([(1, 2, 0), (3, 4, 1), (1, 2, 1)])
+         assert f(1, 2, 0)
+         assert f(1, 2, 1)
+         assert not f(1, 2, 2)
+         assert not f(3, 4, 0)
+         assert f(3, 4, 1)
+         assert not f(4, 3, 1)
+         assert not f(4, 3, 0)
+         assert not f(2, 3, 0)
+         assert not f(0, -1, 0)
+         assert not f("a", "b", 0)
+         pytest.raises(TypeError, f, 1, 2, 3, 4)
+         pytest.raises(TypeError, f, 1, 2)
+         pytest.raises(TypeError, f, 1)
+         pytest.raises(TypeError, f)
+         pytest.raises(TypeError, factory, [1, 2, 3])
+         pytest.raises(ValueError, factory, [(1, 2)])
+         pytest.raises(ValueError, factory, [(1, 2, 3, 4)])
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_graph_historical.py ADDED
@@ -0,0 +1,13 @@
+ """Original NetworkX graph tests"""
+
+ import networkx
+ import networkx as nx
+
+ from .historical_tests import HistoricalTests
+
+
+ class TestGraphHistorical(HistoricalTests):
+     @classmethod
+     def setup_class(cls):
+         HistoricalTests.setup_class()
+         cls.G = nx.Graph
minigpt2/lib/python3.10/site-packages/networkx/classes/tests/test_multidigraph.py ADDED
@@ -0,0 +1,459 @@
+ from collections import UserDict
+
+ import pytest
+
+ import networkx as nx
+ from networkx.utils import edges_equal
+
+ from .test_multigraph import BaseMultiGraphTester
+ from .test_multigraph import TestEdgeSubgraph as _TestMultiGraphEdgeSubgraph
+ from .test_multigraph import TestMultiGraph as _TestMultiGraph
+
+
+ class BaseMultiDiGraphTester(BaseMultiGraphTester):
+     def test_edges(self):
+         G = self.K3
+         edges = [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]
+         assert sorted(G.edges()) == edges
+         assert sorted(G.edges(0)) == [(0, 1), (0, 2)]
+         pytest.raises((KeyError, nx.NetworkXError), G.edges, -1)
+
+     def test_edges_data(self):
+         G = self.K3
+         edges = [(0, 1, {}), (0, 2, {}), (1, 0, {}), (1, 2, {}), (2, 0, {}), (2, 1, {})]
+         assert sorted(G.edges(data=True)) == edges
+         assert sorted(G.edges(0, data=True)) == [(0, 1, {}), (0, 2, {})]
+         pytest.raises((KeyError, nx.NetworkXError), G.neighbors, -1)
+
+     def test_edges_multi(self):
+         G = self.K3
+         assert sorted(G.edges()) == [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]
+         assert sorted(G.edges(0)) == [(0, 1), (0, 2)]
+         G.add_edge(0, 1)
+         assert sorted(G.edges()) == [
+             (0, 1),
+             (0, 1),
+             (0, 2),
+             (1, 0),
+             (1, 2),
+             (2, 0),
+             (2, 1),
+         ]
+
+     def test_out_edges(self):
+         G = self.K3
+         assert sorted(G.out_edges()) == [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]
+         assert sorted(G.out_edges(0)) == [(0, 1), (0, 2)]
+         pytest.raises((KeyError, nx.NetworkXError), G.out_edges, -1)
+         assert sorted(G.out_edges(0, keys=True)) == [(0, 1, 0), (0, 2, 0)]
+
+     def test_out_edges_multi(self):
+         G = self.K3
+         assert sorted(G.out_edges()) == [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]
+         assert sorted(G.out_edges(0)) == [(0, 1), (0, 2)]
+         G.add_edge(0, 1, 2)
+         assert sorted(G.out_edges()) == [
+             (0, 1),
+             (0, 1),
+             (0, 2),
+             (1, 0),
+             (1, 2),
+             (2, 0),
+             (2, 1),
+         ]
+
+     def test_out_edges_data(self):
+         G = self.K3
+         assert sorted(G.edges(0, data=True)) == [(0, 1, {}), (0, 2, {})]
+         G.remove_edge(0, 1)
+         G.add_edge(0, 1, data=1)
+         assert sorted(G.edges(0, data=True)) == [(0, 1, {"data": 1}), (0, 2, {})]
+         assert sorted(G.edges(0, data="data")) == [(0, 1, 1), (0, 2, None)]
+         assert sorted(G.edges(0, data="data", default=-1)) == [(0, 1, 1), (0, 2, -1)]
+
+     def test_in_edges(self):
+         G = self.K3
+         assert sorted(G.in_edges()) == [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]
+         assert sorted(G.in_edges(0)) == [(1, 0), (2, 0)]
+         pytest.raises((KeyError, nx.NetworkXError), G.in_edges, -1)
+         G.add_edge(0, 1, 2)
+         assert sorted(G.in_edges()) == [
+             (0, 1),
+             (0, 1),
+             (0, 2),
+             (1, 0),
+             (1, 2),
+             (2, 0),
+             (2, 1),
+         ]
+         assert sorted(G.in_edges(0, keys=True)) == [(1, 0, 0), (2, 0, 0)]
+
+     def test_in_edges_no_keys(self):
+         G = self.K3
+         assert sorted(G.in_edges()) == [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]
+         assert sorted(G.in_edges(0)) == [(1, 0), (2, 0)]
+         G.add_edge(0, 1, 2)
+         assert sorted(G.in_edges()) == [
+             (0, 1),
+             (0, 1),
+             (0, 2),
+             (1, 0),
+             (1, 2),
+             (2, 0),
+             (2, 1),
+         ]
+
+         assert sorted(G.in_edges(data=True, keys=False)) == [
+             (0, 1, {}),
+             (0, 1, {}),
+             (0, 2, {}),
+             (1, 0, {}),
+             (1, 2, {}),
+             (2, 0, {}),
+             (2, 1, {}),
+         ]
+
+     def test_in_edges_data(self):
+         G = self.K3
+         assert sorted(G.in_edges(0, data=True)) == [(1, 0, {}), (2, 0, {})]
+         G.remove_edge(1, 0)
+         G.add_edge(1, 0, data=1)
+         assert sorted(G.in_edges(0, data=True)) == [(1, 0, {"data": 1}), (2, 0, {})]
+         assert sorted(G.in_edges(0, data="data")) == [(1, 0, 1), (2, 0, None)]
+         assert sorted(G.in_edges(0, data="data", default=-1)) == [(1, 0, 1), (2, 0, -1)]
+
+     def is_shallow(self, H, G):
+         # graph
+         assert G.graph["foo"] == H.graph["foo"]
+         G.graph["foo"].append(1)
+         assert G.graph["foo"] == H.graph["foo"]
+         # node
+         assert G.nodes[0]["foo"] == H.nodes[0]["foo"]
+         G.nodes[0]["foo"].append(1)
+         assert G.nodes[0]["foo"] == H.nodes[0]["foo"]
+         # edge
+         assert G[1][2][0]["foo"] == H[1][2][0]["foo"]
+         G[1][2][0]["foo"].append(1)
+         assert G[1][2][0]["foo"] == H[1][2][0]["foo"]
+
+     def is_deep(self, H, G):
+         # graph
+         assert G.graph["foo"] == H.graph["foo"]
+         G.graph["foo"].append(1)
+         assert G.graph["foo"] != H.graph["foo"]
+         # node
+         assert G.nodes[0]["foo"] == H.nodes[0]["foo"]
+         G.nodes[0]["foo"].append(1)
+         assert G.nodes[0]["foo"] != H.nodes[0]["foo"]
+         # edge
+         assert G[1][2][0]["foo"] == H[1][2][0]["foo"]
+         G[1][2][0]["foo"].append(1)
+         assert G[1][2][0]["foo"] != H[1][2][0]["foo"]
+
+     def test_to_undirected(self):
+         # MultiDiGraph -> MultiGraph changes number of edges so it is
+         # not a copy operation... use is_shallow, not is_shallow_copy
+         G = self.K3
+         self.add_attributes(G)
+         H = nx.MultiGraph(G)
+         # self.is_shallow(H, G)
+         # the result is traversal order dependent so we
+         # can't use the is_shallow() test here.
+         try:
+             assert edges_equal(H.edges(), [(0, 1), (1, 2), (2, 0)])
+         except AssertionError:
+             assert edges_equal(H.edges(), [(0, 1), (1, 2), (1, 2), (2, 0)])
+         H = G.to_undirected()
+         self.is_deep(H, G)
+
+     def test_has_successor(self):
+         G = self.K3
+         assert G.has_successor(0, 1)
+         assert not G.has_successor(0, -1)
+
+     def test_successors(self):
+         G = self.K3
+         assert sorted(G.successors(0)) == [1, 2]
+         pytest.raises((KeyError, nx.NetworkXError), G.successors, -1)
+
+     def test_has_predecessor(self):
+         G = self.K3
+         assert G.has_predecessor(0, 1)
+         assert not G.has_predecessor(0, -1)
+
+     def test_predecessors(self):
+         G = self.K3
+         assert sorted(G.predecessors(0)) == [1, 2]
+         pytest.raises((KeyError, nx.NetworkXError), G.predecessors, -1)
+
+     def test_degree(self):
+         G = self.K3
+         assert sorted(G.degree()) == [(0, 4), (1, 4), (2, 4)]
+         assert dict(G.degree()) == {0: 4, 1: 4, 2: 4}
+         assert G.degree(0) == 4
+         assert list(G.degree(iter([0]))) == [(0, 4)]
+         G.add_edge(0, 1, weight=0.3, other=1.2)
+         assert sorted(G.degree(weight="weight")) == [(0, 4.3), (1, 4.3), (2, 4)]
+         assert sorted(G.degree(weight="other")) == [(0, 5.2), (1, 5.2), (2, 4)]
+
+     def test_in_degree(self):
+         G = self.K3
+         assert sorted(G.in_degree()) == [(0, 2), (1, 2), (2, 2)]
+         assert dict(G.in_degree()) == {0: 2, 1: 2, 2: 2}
+         assert G.in_degree(0) == 2
+         assert list(G.in_degree(iter([0]))) == [(0, 2)]
+         assert G.in_degree(0, weight="weight") == 2
+
+     def test_out_degree(self):
+         G = self.K3
+         assert sorted(G.out_degree()) == [(0, 2), (1, 2), (2, 2)]
+         assert dict(G.out_degree()) == {0: 2, 1: 2, 2: 2}
+         assert G.out_degree(0) == 2
+         assert list(G.out_degree(iter([0]))) == [(0, 2)]
+         assert G.out_degree(0, weight="weight") == 2
+
+     def test_size(self):
+         G = self.K3
+         assert G.size() == 6
+         assert G.number_of_edges() == 6
+         G.add_edge(0, 1, weight=0.3, other=1.2)
+         assert round(G.size(weight="weight"), 2) == 6.3
+         assert round(G.size(weight="other"), 2) == 7.2
+
+     def test_to_undirected_reciprocal(self):
+         G = self.Graph()
+         G.add_edge(1, 2)
+         assert G.to_undirected().has_edge(1, 2)
+         assert not G.to_undirected(reciprocal=True).has_edge(1, 2)
+         G.add_edge(2, 1)
+         assert G.to_undirected(reciprocal=True).has_edge(1, 2)
+
+     def test_reverse_copy(self):
+         G = nx.MultiDiGraph([(0, 1), (0, 1)])
+         R = G.reverse()
+         assert sorted(R.edges()) == [(1, 0), (1, 0)]
+         R.remove_edge(1, 0)
+         assert sorted(R.edges()) == [(1, 0)]
+         assert sorted(G.edges()) == [(0, 1), (0, 1)]
+
+     def test_reverse_nocopy(self):
+         G = nx.MultiDiGraph([(0, 1), (0, 1)])
+         R = G.reverse(copy=False)
+         assert sorted(R.edges()) == [(1, 0), (1, 0)]
+         pytest.raises(nx.NetworkXError, R.remove_edge, 1, 0)
+
+     def test_di_attributes_cached(self):
+         G = self.K3.copy()
+         assert id(G.in_edges) == id(G.in_edges)
+         assert id(G.out_edges) == id(G.out_edges)
+         assert id(G.in_degree) == id(G.in_degree)
+         assert id(G.out_degree) == id(G.out_degree)
+         assert id(G.succ) == id(G.succ)
+         assert id(G.pred) == id(G.pred)
+
+
+ class TestMultiDiGraph(BaseMultiDiGraphTester, _TestMultiGraph):
+     def setup_method(self):
+         self.Graph = nx.MultiDiGraph
+         # build K3
+         self.k3edges = [(0, 1), (0, 2), (1, 2)]
+         self.k3nodes = [0, 1, 2]
+         self.K3 = self.Graph()
+         self.K3._succ = {0: {}, 1: {}, 2: {}}
+         # K3._adj is synced with K3._succ
+         self.K3._pred = {0: {}, 1: {}, 2: {}}
+         for u in self.k3nodes:
+             for v in self.k3nodes:
+                 if u == v:
+                     continue
+                 d = {0: {}}
+                 self.K3._succ[u][v] = d
+                 self.K3._pred[v][u] = d
+         self.K3._node = {}
+         self.K3._node[0] = {}
+         self.K3._node[1] = {}
+         self.K3._node[2] = {}
+
+     def test_add_edge(self):
+         G = self.Graph()
+         G.add_edge(0, 1)
+         assert G._adj == {0: {1: {0: {}}}, 1: {}}
+         assert G._succ == {0: {1: {0: {}}}, 1: {}}
+         assert G._pred == {0: {}, 1: {0: {0: {}}}}
+         G = self.Graph()
+         G.add_edge(*(0, 1))
+         assert G._adj == {0: {1: {0: {}}}, 1: {}}
+         assert G._succ == {0: {1: {0: {}}}, 1: {}}
+         assert G._pred == {0: {}, 1: {0: {0: {}}}}
+         with pytest.raises(ValueError, match="None cannot be a node"):
+             G.add_edge(None, 3)
+
+     def test_add_edges_from(self):
+         G = self.Graph()
+         G.add_edges_from([(0, 1), (0, 1, {"weight": 3})])
+         assert G._adj == {0: {1: {0: {}, 1: {"weight": 3}}}, 1: {}}
+         assert G._succ == {0: {1: {0: {}, 1: {"weight": 3}}}, 1: {}}
+         assert G._pred == {0: {}, 1: {0: {0: {}, 1: {"weight": 3}}}}
+
+         G.add_edges_from([(0, 1), (0, 1, {"weight": 3})], weight=2)
+         assert G._succ == {
+             0: {1: {0: {}, 1: {"weight": 3}, 2: {"weight": 2}, 3: {"weight": 3}}},
+             1: {},
+         }
+         assert G._pred == {
+             0: {},
+             1: {0: {0: {}, 1: {"weight": 3}, 2: {"weight": 2}, 3: {"weight": 3}}},
+         }
+
+         G = self.Graph()
+         edges = [
+             (0, 1, {"weight": 3}),
+             (0, 1, (("weight", 2),)),
+             (0, 1, 5),
+             (0, 1, "s"),
+         ]
+         G.add_edges_from(edges)
+         keydict = {0: {"weight": 3}, 1: {"weight": 2}, 5: {}, "s": {}}
+         assert G._succ == {0: {1: keydict}, 1: {}}
+         assert G._pred == {1: {0: keydict}, 0: {}}
+
+         # too few in tuple
+         pytest.raises(nx.NetworkXError, G.add_edges_from, [(0,)])
+         # too many in tuple
+         pytest.raises(nx.NetworkXError, G.add_edges_from, [(0, 1, 2, 3, 4)])
+         # not a tuple
+         pytest.raises(TypeError, G.add_edges_from, [0])
+         with pytest.raises(ValueError, match="None cannot be a node"):
+             G.add_edges_from([(None, 3), (3, 2)])
+
+     def test_remove_edge(self):
+         G = self.K3
+         G.remove_edge(0, 1)
+         assert G._succ == {
+             0: {2: {0: {}}},
+             1: {0: {0: {}}, 2: {0: {}}},
+             2: {0: {0: {}}, 1: {0: {}}},
+         }
+         assert G._pred == {
+             0: {1: {0: {}}, 2: {0: {}}},
+             1: {2: {0: {}}},
+             2: {0: {0: {}}, 1: {0: {}}},
+         }
+         pytest.raises((KeyError, nx.NetworkXError), G.remove_edge, -1, 0)
+         pytest.raises((KeyError, nx.NetworkXError), G.remove_edge, 0, 2, key=1)
+
+     def test_remove_multiedge(self):
+         G = self.K3
+         G.add_edge(0, 1, key="parallel edge")
+         G.remove_edge(0, 1, key="parallel edge")
+         assert G._adj == {
+             0: {1: {0: {}}, 2: {0: {}}},
+             1: {0: {0: {}}, 2: {0: {}}},
+             2: {0: {0: {}}, 1: {0: {}}},
+         }
+
+         assert G._succ == {
+             0: {1: {0: {}}, 2: {0: {}}},
+             1: {0: {0: {}}, 2: {0: {}}},
+             2: {0: {0: {}}, 1: {0: {}}},
+         }
+
+         assert G._pred == {
+             0: {1: {0: {}}, 2: {0: {}}},
+             1: {0: {0: {}}, 2: {0: {}}},
+             2: {0: {0: {}}, 1: {0: {}}},
+         }
+         G.remove_edge(0, 1)
+         assert G._succ == {
+             0: {2: {0: {}}},
+             1: {0: {0: {}}, 2: {0: {}}},
+             2: {0: {0: {}}, 1: {0: {}}},
+         }
+         assert G._pred == {
+             0: {1: {0: {}}, 2: {0: {}}},
+             1: {2: {0: {}}},
+             2: {0: {0: {}}, 1: {0: {}}},
+         }
+         pytest.raises((KeyError, nx.NetworkXError), G.remove_edge, -1, 0)
+
+     def test_remove_edges_from(self):
+         G = self.K3
+         G.remove_edges_from([(0, 1)])
+         assert G._succ == {
+             0: {2: {0: {}}},
+             1: {0: {0: {}}, 2: {0: {}}},
+             2: {0: {0: {}}, 1: {0: {}}},
+         }
+         assert G._pred == {
+             0: {1: {0: {}}, 2: {0: {}}},
+             1: {2: {0: {}}},
+             2: {0: {0: {}}, 1: {0: {}}},
+         }
+         G.remove_edges_from([(0, 0)])  # silent fail
+
+
+ class TestEdgeSubgraph(_TestMultiGraphEdgeSubgraph):
+     """Unit tests for the :meth:`MultiDiGraph.edge_subgraph` method."""
+
+     def setup_method(self):
+         # Create a quadruply-linked path graph on five nodes.
+         G = nx.MultiDiGraph()
+         nx.add_path(G, range(5))
+         nx.add_path(G, range(5))
+         nx.add_path(G, reversed(range(5)))
+         nx.add_path(G, reversed(range(5)))
+         # Add some node, edge, and graph attributes.
+         for i in range(5):
+             G.nodes[i]["name"] = f"node{i}"
+         G.adj[0][1][0]["name"] = "edge010"
+         G.adj[0][1][1]["name"] = "edge011"
+         G.adj[3][4][0]["name"] = "edge340"
+         G.adj[3][4][1]["name"] = "edge341"
+         G.graph["name"] = "graph"
+         # Get the subgraph induced by one of the first edges and one of
+         # the last edges.
+         self.G = G
+         self.H = G.edge_subgraph([(0, 1, 0), (3, 4, 1)])
+
+
+ class CustomDictClass(UserDict):
+     pass
+
+
+ class MultiDiGraphSubClass(nx.MultiDiGraph):
+     node_dict_factory = CustomDictClass  # type: ignore[assignment]
+     node_attr_dict_factory = CustomDictClass  # type: ignore[assignment]
+     adjlist_outer_dict_factory = CustomDictClass  # type: ignore[assignment]
+     adjlist_inner_dict_factory = CustomDictClass  # type: ignore[assignment]
+     edge_key_dict_factory = CustomDictClass  # type: ignore[assignment]
+     edge_attr_dict_factory = CustomDictClass  # type: ignore[assignment]
+     graph_attr_dict_factory = CustomDictClass  # type: ignore[assignment]
+
+
+ class TestMultiDiGraphSubclass(TestMultiDiGraph):
+     def setup_method(self):
+         self.Graph = MultiDiGraphSubClass
+         # build K3
+         self.k3edges = [(0, 1), (0, 2), (1, 2)]
+         self.k3nodes = [0, 1, 2]
+         self.K3 = self.Graph()
+         self.K3._succ = self.K3.adjlist_outer_dict_factory(
+             {
+                 0: self.K3.adjlist_inner_dict_factory(),
+                 1: self.K3.adjlist_inner_dict_factory(),
+                 2: self.K3.adjlist_inner_dict_factory(),
+             }
+         )
+         # K3._adj is synced with K3._succ
+         self.K3._pred = {0: {}, 1: {}, 2: {}}
+         for u in self.k3nodes:
+             for v in self.k3nodes:
+                 if u == v:
+                     continue
+                 d = {0: {}}
+                 self.K3._succ[u][v] = d
+                 self.K3._pred[v][u] = d
+         self.K3._node = self.K3.node_dict_factory()
+         self.K3._node[0] = self.K3.node_attr_dict_factory()
+         self.K3._node[1] = self.K3.node_attr_dict_factory()
+         self.K3._node[2] = self.K3.node_attr_dict_factory()
minigpt2/lib/python3.10/site-packages/open_flamingo/__init__.py ADDED
@@ -0,0 +1,2 @@
+ from .src.flamingo import Flamingo
+ from .src.factory import create_model_and_transforms
minigpt2/lib/python3.10/site-packages/open_flamingo/eval/__init__.py ADDED
@@ -0,0 +1 @@
+
minigpt2/lib/python3.10/site-packages/open_flamingo/eval/eval_datasets.py ADDED
@@ -0,0 +1,95 @@
+ import json
+ import os
+
+ from PIL import Image
+ from torch.utils.data import Dataset
+ from torchvision.datasets import ImageFolder
+
+ from open_flamingo.eval.imagenet_utils import IMAGENET_1K_CLASS_ID_TO_LABEL
+
+
+ class COCOFlickrDataset(Dataset):
+     def __init__(
+         self,
+         image_dir_path="/mmfs1/gscratch/efml/anasa2/data/coco/train2017/",
+         annotations_path="/mmfs1/gscratch/efml/anasa2/data/coco/annotations/captions_train2017.json",
+         is_flickr=False,
+     ):
+         self.image_dir_path = image_dir_path
+         self.annotations = json.load(open(annotations_path))["annotations"]
+         self.is_flickr = is_flickr
+
+     def __len__(self):
+         return len(self.annotations)
+
+     def get_img_path(self, idx):
+         if self.is_flickr:
+             return f"{self.image_dir_path}/{self.annotations[idx]['image_id']}.jpg"
+         else:
+             return f"{self.image_dir_path}/COCO_train2014_{self.annotations[idx]['image_id']:012d}.jpg"
+
+     def __getitem__(self, idx):
+         image = Image.open(self.get_img_path(idx))
+         caption = self.annotations[idx]["caption"]
+         return {
+             "image": image,
+             "caption": caption,
+             "image_id": self.annotations[idx]["image_id"],
+         }
+
+
+ class VQADataset(Dataset):
+     def __init__(
+         self,
+         image_dir_path="/mmfs1/gscratch/efml/anasa2/data/vqav2/train2014/",
+         question_path="/mmfs1/gscratch/efml/anasa2/data/vqav2/v2_OpenEnded_mscoco_train2014_questions.json",
+         annotations_path="/mmfs1/gscratch/efml/anasa2/data/vqav2/v2_mscoco_train2014_annotations.json",
+         vqa_dataset="vqa",
+     ):
+         self.questions = json.load(open(question_path, "r"))["questions"]
+         self.answers = json.load(open(annotations_path, "r"))["annotations"]
+         self.image_dir_path = image_dir_path
+         self.vqa_dataset = vqa_dataset
+
+     def __len__(self):
+         return len(self.questions)
+
+     def get_img_path(self, question):
+         if self.vqa_dataset == "vqa":
+             return os.path.join(
+                 self.image_dir_path, f"COCO_train2014_{question['image_id']:012d}.jpg"
+             )
+         elif self.vqa_dataset == "ok_vqa":
+             return os.path.join(
+                 self.image_dir_path, f"COCO_val2014_{question['image_id']:012d}.jpg"
+             )
+         else:
+             raise Exception(f"Unknown VQA dataset {self.vqa_dataset}")
+
+     def __getitem__(self, idx):
+         question = self.questions[idx]
+         answers = self.answers[idx]
+         img_path = self.get_img_path(question)
+         image = Image.open(img_path)
+         return {
+             "image": image,
+             "question": question["question"],
+             "answers": [a["answer"] for a in answers["answers"]],
+             "question_id": question["question_id"],
+         }
+
+
+ class ImageNetDataset(ImageFolder):
+     """Class to represent the ImageNet1k dataset."""
+
+     def __init__(self, root, **kwargs):
+         super().__init__(root=root, **kwargs)
+
+     def __getitem__(self, idx):
+         sample, target = super().__getitem__(idx)
+         target_label = IMAGENET_1K_CLASS_ID_TO_LABEL[target]
+         return {
+             "image": sample,
+             "class_id": target,  # numeric ID of the ImageNet class
+             "class_name": target_label,  # human-readable name of ImageNet class
+         }
minigpt2/lib/python3.10/site-packages/open_flamingo/eval/evaluate.py ADDED
@@ -0,0 +1,961 @@
+ import argparse
+ import json
+ from math import ceil
+ import os
+ import random
+ import uuid
+ from collections import defaultdict
+ from typing import Callable
+
+ import more_itertools
+ import numpy as np
+ import torch
+ from coco_metric import compute_cider, postprocess_captioning_generation
+ from eval_datasets import COCOFlickrDataset, VQADataset, ImageNetDataset
+ from tqdm import tqdm
+
+ from open_flamingo.eval.ok_vqa_utils import postprocess_ok_vqa_generation
+ from vqa_metric import compute_vqa_accuracy, postprocess_vqa_generation
+ from open_flamingo.eval.classification import (
+     compute_per_sample_probs,
+     compute_per_sample_loss,
+ )
+ from open_flamingo.eval.imagenet_utils import (
+     openai_imagenet_classnames,
+     IMAGENET_1K_CLASS_ID_TO_LABEL,
+ )
+
+ from open_flamingo.src.factory import create_model_and_transforms
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("--lm_path", type=str, default="facebook/opt-1.3b")
+ parser.add_argument("--lm_tokenizer_path", type=str, default="facebook/opt-30b")
+ parser.add_argument("--vision_encoder_path", default="ViT-L-14", type=str)
+ parser.add_argument("--vision_encoder_pretrained", default="openai", type=str)
+ parser.add_argument("--checkpoint_path", type=str, required=True)
+ parser.add_argument(
+     "--cross_attn_every_n_layers",
+     type=int,
+     default=1,
+     help="how often to add a cross-attention layer after each transformer layer",
+ )
+ parser.add_argument(
+     "--results_file", type=str, default=None, help="JSON file to save results"
+ )
+
+ # Trial arguments
+ parser.add_argument("--shots", nargs="+", default=[0, 4, 8, 16, 32], type=int)
+ parser.add_argument(
+     "--num_trials",
+     type=int,
+     default=1,
+     help="Number of trials to run for each shot using different demonstrations",
+ )
+ parser.add_argument(
+     "--trial_seeds",
+     nargs="+",
+     default=[0],
+     help="Seeds to use for each trial for picking demonstrations and eval sets",
+ )
+ parser.add_argument(
+     "--num_samples", type=int, default=5000, help="Number of samples to evaluate on"
+ )
+
+ parser.add_argument("--batch_size", type=int, default=8)
+ parser.add_argument("--device", type=int, default=0)
+
+ # Per-dataset evaluation flags
+ parser.add_argument(
+     "--eval_coco",
+     action="store_true",
+     default=False,
+     help="Whether to evaluate on COCO.",
+ )
+ parser.add_argument(
+     "--eval_vqav2",
+     action="store_true",
+     default=False,
+     help="Whether to evaluate on VQAV2.",
+ )
+ parser.add_argument(
+     "--eval_ok_vqa",
+     action="store_true",
+     default=False,
+     help="Whether to evaluate on OK-VQA.",
+ )
+ parser.add_argument(
+     "--eval_imagenet",
+     action="store_true",
+     default=False,
+     help="Whether to evaluate on ImageNet.",
+ )
+
+ parser.add_argument(
+     "--eval_flickr30",
+     action="store_true",
+     default=False,
+     help="Whether to evaluate on Flickr30.",
+ )
+
+ # Dataset arguments
+
+ ## Flickr30 Dataset
+ parser.add_argument(
+     "--flickr_image_dir_path",
+     type=str,
+     help="Path to the flickr30/flickr30k_images directory.",
+     default=None,
+ )
+ parser.add_argument(
+     "--flickr_annotations_json_path",
+     type=str,
+     help="Path to the dataset_flickr30k_coco_style.json file.",
+     default=None,
+ )
+
+ ## COCO Dataset
+ parser.add_argument(
+     "--coco_image_dir_path",
+     type=str,
+     help="Path to the COCO image directory.",
+     default=None,
+ )
+ parser.add_argument(
+     "--coco_annotations_json_path",
+     type=str,
+     default=None,
+ )
+
+ ## VQAV2 Dataset
+ parser.add_argument(
+     "--vqav2_image_dir_path",
+     type=str,
+     default=None,
+ )
+ parser.add_argument(
+     "--vqav2_questions_json_path",
+     type=str,
+     default=None,
+ )
+ parser.add_argument(
+     "--vqav2_annotations_json_path",
+     type=str,
+     default=None,
+ )
+
+ ## OK-VQA Dataset
+ parser.add_argument(
+     "--ok_vqa_image_dir_path",
+     type=str,
+     help="Path to the vqav2/train2014 directory.",
+     default=None,
+ )
+ parser.add_argument(
+     "--ok_vqa_questions_json_path",
+     type=str,
+     help="Path to the v2_OpenEnded_mscoco_train2014_questions.json file.",
+     default=None,
+ )
+ parser.add_argument(
+     "--ok_vqa_annotations_json_path",
+     type=str,
+     help="Path to the v2_mscoco_train2014_annotations.json file.",
+     default=None,
+ )
+
+ ## ImageNet dataset
+ parser.add_argument("--imagenet_root", type=str, default="/tmp")
+
+
+ def main():
+     args = parser.parse_args()
+
+     # load model
+     flamingo, image_processor, tokenizer = create_model_and_transforms(
+         args.vision_encoder_path,
+         args.vision_encoder_pretrained,
+         args.lm_path,
+         args.lm_tokenizer_path,
+         cross_attn_every_n_layers=args.cross_attn_every_n_layers,
+     )
+
+     checkpoint = torch.load(args.checkpoint_path, map_location="cpu")
+     flamingo.load_state_dict(checkpoint, strict=False)
+     flamingo.to(args.device if args.device >= 0 else "cpu")
+
+     results = defaultdict(list)
+
+     if args.eval_flickr30:
+         print("Evaluating on Flickr30...")
+         for shot in args.shots:
+             scores = []
+             for seed, trial in zip(args.trial_seeds, range(args.num_trials)):
+                 cider_score = evaluate_coco_flickr(
+                     model=flamingo,
+                     tokenizer=tokenizer,
+                     image_processor=image_processor,
+                     batch_size=args.batch_size,
+                     image_dir_path=args.flickr_image_dir_path,
+                     annotations_json_path=args.flickr_annotations_json_path,
+                     num_samples=args.num_samples,
+                     num_shots=shot,
+                     device=args.device,
+                     seed=seed,
+                     is_flickr=True,
+                 )
+                 print(f"Shots {shot} Trial {trial} CIDEr score: {cider_score}")
+                 scores.append(cider_score)
+             print(f"Shots {shot} Mean CIDEr score: {np.mean(scores)}")
+             results["flickr30"].append(
+                 {"shots": shot, "trials": scores, "mean": np.mean(scores)}
+             )
+
+     if args.eval_coco:
+         print("Evaluating on COCO...")
+         for shot in args.shots:
+             scores = []
+             for seed, trial in zip(args.trial_seeds, range(args.num_trials)):
+                 cider_score = evaluate_coco_flickr(
+                     model=flamingo,
+                     tokenizer=tokenizer,
+                     image_processor=image_processor,
+                     batch_size=args.batch_size,
+                     image_dir_path=args.coco_image_dir_path,
+                     annotations_json_path=args.coco_annotations_json_path,
+                     num_samples=args.num_samples,
+                     num_shots=shot,
+                     device=args.device,
+                     seed=seed,
+                 )
+                 print(f"Shots {shot} Trial {trial} CIDEr score: {cider_score}")
+                 scores.append(cider_score)
+             print(f"Shots {shot} Mean CIDEr score: {np.mean(scores)}")
+             results["coco"].append(
+                 {"shots": shot, "trials": scores, "mean": np.mean(scores)}
+             )
+
+     if args.eval_ok_vqa:
+         print("Evaluating on OK-VQA...")
+         for shot in args.shots:
+             scores = []
+             for seed, trial in zip(args.trial_seeds, range(args.num_trials)):
+                 ok_vqa_score = evaluate_vqa(
+                     model=flamingo,
+                     tokenizer=tokenizer,
+                     image_processor=image_processor,
+                     batch_size=args.batch_size,
+                     num_samples=args.num_samples,
+                     num_shots=shot,
+                     device=args.device,
+                     seed=seed,
+                     image_dir_path=args.ok_vqa_image_dir_path,
+                     questions_json_path=args.ok_vqa_questions_json_path,
+                     annotations_json_path=args.ok_vqa_annotations_json_path,
+                     vqa_dataset="ok_vqa",
+                 )
+                 print(f"Shots {shot} Trial {trial} OK-VQA score: {ok_vqa_score}")
+                 scores.append(ok_vqa_score)
+             print(f"Shots {shot} Mean OK-VQA score: {np.mean(scores)}")
+             results["ok_vqa"].append(
+                 {"shots": shot, "trials": scores, "mean": np.mean(scores)}
+             )
+
+     if args.eval_vqav2:
+         print("Evaluating on VQAv2...")
+         for shot in args.shots:
+             scores = []
+             for seed, trial in zip(args.trial_seeds, range(args.num_trials)):
+                 vqa_score = evaluate_vqa(
+                     model=flamingo,
+                     tokenizer=tokenizer,
+                     image_processor=image_processor,
+                     batch_size=args.batch_size,
+                     num_samples=args.num_samples,
+                     num_shots=shot,
+                     device=args.device,
+                     seed=seed,
+                     image_dir_path=args.vqav2_image_dir_path,
+                     questions_json_path=args.vqav2_questions_json_path,
+                     annotations_json_path=args.vqav2_annotations_json_path,
+                     vqa_dataset="vqa",
+                 )
+                 print(f"Shots {shot} Trial {trial} VQA score: {vqa_score}")
+                 scores.append(vqa_score)
+             print(f"Shots {shot} Mean VQA score: {np.mean(scores)}")
+             results["vqav2"].append(
+                 {"shots": shot, "trials": scores, "mean": np.mean(scores)}
+             )
+
+     if args.eval_imagenet:
+         print("Evaluating on ImageNet...")
+         for shot in args.shots:
+             scores = []
+             for seed, trial in zip(args.trial_seeds, range(args.num_trials)):
+                 imagenet_score = evaluate_imagenet(
+                     model=flamingo,
+                     tokenizer=tokenizer,
+                     image_processor=image_processor,
+                     batch_size=args.batch_size,
+                     num_samples=args.num_samples,
+                     num_shots=shot,
+                     device=args.device,
+                     seed=seed,
+                     imagenet_root=args.imagenet_root,
+                 )
+                 print(
+                     f"Shots {shot} Trial {trial} ImageNet score: {imagenet_score}"
+                 )
+                 scores.append(imagenet_score)
+             print(f"Shots {shot} Mean ImageNet score: {np.mean(scores)}")
+             results["imagenet"].append(
+                 {"shots": shot, "trials": scores, "mean": np.mean(scores)}
+             )
+
+     if args.results_file is not None:
+         with open(args.results_file, "w") as f:
+             json.dump(results, f)
+
+
+
+ def get_random_indices(num_samples, query_set_size, full_dataset, seed):
+     if num_samples + query_set_size > len(full_dataset):
+         raise ValueError(
+             f"num_samples + query_set_size must be less than or equal to {len(full_dataset)}"
+         )
+
+     # get a random subset of the dataset
+     np.random.seed(seed)
+     random_indices = np.random.choice(
+         len(full_dataset), num_samples + query_set_size, replace=False
+     )
+     return random_indices
+
+
+ def prepare_eval_samples_and_dataset(full_dataset, random_indices, query_set_size):
+     # get in-context samples
+     in_context_samples = [full_dataset[i] for i in random_indices[:query_set_size]]
+     eval_dataset = torch.utils.data.Subset(
+         full_dataset, random_indices[query_set_size:]
+     )
+     return in_context_samples, eval_dataset
+
+
+ def get_context_images(image_processor, in_context_samples, num_shots):
+     if num_shots > 0:
+         context_images = [
+             image_processor(s["image"]).unsqueeze(0) for s in in_context_samples
+         ]
+         context_images = torch.cat(context_images, dim=0)
+         context_images = context_images.unsqueeze(1).unsqueeze(0)
+     else:
+         context_images = None
+     return context_images
+
+
+ def get_context_text(
+     get_prompt: Callable[[dict], str],
+     in_context_samples,
+     effective_num_shots,
+     num_shots,
+ ) -> str:
+     context_text = (
+         "".join([get_prompt(s) for s in in_context_samples])
+         if effective_num_shots > 0
+         else ""
+     )
+
+     if num_shots == 0:
+         context_text = context_text.replace("<image>", "")
+     return context_text
+
+
+ def prepare_batch_images(batch, image_processor, context_images, num_shots):
+     batch_images = None
+     for b, sample_imgs in zip(batch, context_images):
+         b_image = image_processor(b["image"]).unsqueeze(0).unsqueeze(1).unsqueeze(0)
+         b_image = torch.cat([sample_imgs, b_image], dim=1) if num_shots > 0 else b_image
+
+         if batch_images is None:
+             batch_images = b_image
+         else:
+             batch_images = torch.cat([batch_images, b_image], dim=0)
+     return batch_images
+
+
+ def sample_batch_demos_from_query_set(query_set, num_samples, batch_size):
+     return [random.sample(query_set, num_samples) for _ in range(batch_size)]
+
+
+ def get_outputs(
+     model,
+     batch_images,
+     device,
+     attention_mask,
+     max_generation_length,
+     num_beams,
+     length_penalty,
+     input_ids,
+ ):
+     with torch.inference_mode():
+         outputs = model.generate(
+             batch_images.to(device if device >= 0 else "cpu"),
+             input_ids.to(device if device >= 0 else "cpu"),
+             attention_mask=attention_mask.to(device if device >= 0 else "cpu"),
+             max_new_tokens=max_generation_length,
+             num_beams=num_beams,
+             length_penalty=length_penalty,
+         )
+
+     outputs = outputs[:, len(input_ids[0]) :]
+     return outputs
+
+
+
+ def evaluate_coco_flickr(
+     model,
+     tokenizer,
+     image_processor,
+     batch_size,
+     image_dir_path,
+     annotations_json_path,
+     seed=42,
+     max_generation_length=20,
+     num_beams=3,
+     length_penalty=-2.0,
+     num_samples=5000,
+     query_set_size=2048,
+     num_shots=8,
+     device=-1,
+     is_flickr=False,
+ ):
+     """Evaluate a model on the COCO or Flickr30 captioning dataset.
+
+     Args:
+         model (nn.Module): model to evaluate
+         tokenizer (transformers.PreTrainedTokenizer): tokenizer for the model
+         image_processor : image processor for the model
+         batch_size (int): batch size
+         image_dir_path (str, optional): path to the directory containing the images.
+         annotations_json_path (str, optional): path to the json file containing the annotations.
+         seed (int, optional): seed for random number generator. Defaults to 42.
+         max_generation_length (int, optional): maximum length of the generated caption. Defaults to 20.
+         num_beams (int, optional): number of beams to use for beam search. Defaults to 3.
+         length_penalty (float, optional): length penalty for beam search. Defaults to -2.0.
+         num_samples (int, optional): number of samples to evaluate on. Defaults to 5000.
+         query_set_size (int, optional): number of samples to use for the query set. Defaults to 2048.
+         num_shots (int, optional): number of in-context samples to use. Defaults to 8.
+         device (int, optional): device to use. Defaults to -1 (cpu).
+         is_flickr (bool): whether the data is Flickr30 (True) or COCO (False). Defaults to False.
+
+     Returns:
+         float: CIDEr score
+
+     """
+
+     full_dataset = COCOFlickrDataset(
+         image_dir_path=image_dir_path,
+         annotations_path=annotations_json_path,
+         is_flickr=is_flickr,
+     )
+     effective_num_shots = num_shots if num_shots > 0 else 2
+     random_indices = get_random_indices(num_samples, query_set_size, full_dataset, seed)
+
+     in_context_samples, eval_dataset = prepare_eval_samples_and_dataset(
+         full_dataset=full_dataset,
+         random_indices=random_indices,
+         query_set_size=query_set_size,
+     )
+
+     model.eval()
+
+     def get_prompt(sample):
+         return f"<image>Output:{sample['caption'].strip()}<|endofchunk|>"
+
+     predictions = defaultdict()
+
+     desc = "Running inference Flickr30" if is_flickr else "Running inference COCO"
+
+     for batch in more_itertools.chunked(tqdm(eval_dataset, desc=desc), batch_size):
+         batch_demo_samples = sample_batch_demos_from_query_set(
+             in_context_samples, effective_num_shots, len(batch)
+         )
+
+         context_images = [
+             get_context_images(
+                 image_processor=image_processor,
+                 in_context_samples=batch_demo_samples[i],
+                 num_shots=num_shots,
+             )
+             for i in range(len(batch))
+         ]
+
+         context_text = [
+             get_context_text(
+                 get_prompt,
+                 in_context_samples=batch_demo_samples[i],
+                 effective_num_shots=effective_num_shots,
+                 num_shots=num_shots,
+             )
+             for i in range(len(batch))
+         ]
+
+         batch_images = prepare_batch_images(
+             batch=batch,
+             image_processor=image_processor,
+             context_images=context_images,
+             num_shots=num_shots,
+         )
+
+         batch_text = [f"{context_text[i]}<image>Output:" for i in range(len(batch))]
+
+         tokenizer.padding_side = "left"
+         encodings = tokenizer(
+             batch_text,
+             padding="longest",
+             truncation=True,
+             return_tensors="pt",
+             max_length=2000,
+         )
+         input_ids = encodings["input_ids"]
+         attention_mask = encodings["attention_mask"]
+
+         outputs = get_outputs(
+             model=model,
+             batch_images=batch_images,
+             device=device,
+             attention_mask=attention_mask,
+             max_generation_length=max_generation_length,
+             num_beams=num_beams,
+             length_penalty=length_penalty,
+             input_ids=input_ids,
+         )
+         new_predictions = [
+             postprocess_captioning_generation(out).replace('"', "")
+             for out in tokenizer.batch_decode(outputs, skip_special_tokens=True)
+         ]
+
+         for i, sample in enumerate(batch):
+             predictions[sample["image_id"]] = {
+                 "caption": new_predictions[i],
+             }
+
+     # save the predictions to a temporary file
+     random_uuid = str(uuid.uuid4())
+     results_path = (
+         f"flickrresults_{random_uuid}.json"
+         if is_flickr
+         else f"cocoresults_{random_uuid}.json"
+     )
+     with open(results_path, "w") as f:
+         f.write(
+             json.dumps(
+                 [
+                     {"image_id": k, "caption": predictions[k]["caption"]}
+                     for k in predictions
+                 ],
+                 indent=4,
+             )
+         )
+
+     metrics = compute_cider(
+         result_path=results_path,
+         annotations_path=annotations_json_path,
+     )
+
+     # delete the temporary file
+     os.remove(results_path)
+
+     return metrics["CIDEr"] * 100.0
+
+
+ def evaluate_vqa(
+     model,
+     tokenizer,
+     image_processor,
+     batch_size,
+     image_dir_path,
+     questions_json_path,
+     annotations_json_path,
+     seed=42,
+     max_generation_length=5,
+     num_beams=3,
+     length_penalty=-2.0,
+     num_samples=5000,
+     query_set_size=2048,
+     num_shots=8,
+     device=-1,
+     vqa_dataset="vqa",
+ ):
+     """
+     Evaluate a model on VQA datasets. Currently supports VQAv2 and OK-VQA.
+
+     Args:
+         model (nn.Module): model to evaluate
+         tokenizer (transformers.PreTrainedTokenizer): tokenizer for the model
+         image_processor : image processor for the model
+         batch_size (int): batch size
+         image_dir_path (str): path to image directory
+         questions_json_path (str): path to questions json file
+         annotations_json_path (str): path to annotations json file
+         seed (int, optional): random seed. Defaults to 42.
+         max_generation_length (int, optional): max generation length. Defaults to 5.
+         num_beams (int, optional): number of beams to use for beam search. Defaults to 3.
+         length_penalty (float, optional): length penalty for beam search. Defaults to -2.0.
+         num_samples (int, optional): number of samples to evaluate on. Defaults to 5000.
+         query_set_size (int, optional): size of the query set. Defaults to 2048.
+         num_shots (int, optional): number of shots to use. Defaults to 8.
+         device (int, optional): device to use. Defaults to -1 (cpu).
+         vqa_dataset (str): type of VQA dataset; currently supports "vqa" and "ok_vqa". Defaults to "vqa".
+
+     Returns:
+         float: accuracy score
+     """
+
+     full_dataset = VQADataset(
+         image_dir_path=image_dir_path,
+         question_path=questions_json_path,
+         annotations_path=annotations_json_path,
+         vqa_dataset=vqa_dataset,
+     )
+
+     effective_num_shots = num_shots if num_shots > 0 else 2
+
+     if num_samples + effective_num_shots > len(full_dataset):
+         raise ValueError(
+             f"num_samples + num_shots must be less than or equal to {len(full_dataset)}"
+         )
+
+     random_indices = get_random_indices(num_samples, query_set_size, full_dataset, seed)
+
+     def get_prompt(sample, train=True):
+         return f"<image>Question:{sample['question'].strip()} Short Answer:{sample['answers'][0].strip() if train else ''}{'<|endofchunk|>' if train else ''}"
+
+     in_context_samples, eval_dataset = prepare_eval_samples_and_dataset(
+         full_dataset=full_dataset,
+         random_indices=random_indices,
+         query_set_size=query_set_size,
+     )
+
+     model.eval()
+     predictions = []
+
+     for batch in more_itertools.chunked(
+         tqdm(eval_dataset, desc="Running inference"), batch_size
+     ):
+         batch_demo_samples = sample_batch_demos_from_query_set(
+             in_context_samples, effective_num_shots, len(batch)
+         )
+
+         context_images = [
+             get_context_images(
+                 image_processor=image_processor,
+                 in_context_samples=batch_demo_samples[i],
+                 num_shots=num_shots,
+             )
+             for i in range(len(batch))
+         ]
+
+         context_text = [
+             get_context_text(
+                 get_prompt,
+                 in_context_samples=batch_demo_samples[i],
+                 effective_num_shots=effective_num_shots,
+                 num_shots=num_shots,
+             )
+             for i in range(len(batch))
+         ]
+
+         batch_images = prepare_batch_images(
+             batch=batch,
+             image_processor=image_processor,
+             context_images=context_images,
+             num_shots=num_shots,
+         )
+
+         batch_text = [
+             context_text[i] + get_prompt(s, train=False) for i, s in enumerate(batch)
+         ]
+
+         tokenizer.padding_side = "left"
+         encodings = tokenizer(
+             batch_text,
+             return_tensors="pt",
+             padding="longest",
+             truncation=True,
+             max_length=2000,
+         )
+         input_ids = encodings["input_ids"].to(device if device >= 0 else "cpu")
+         attention_mask = encodings["attention_mask"].to(
+             device if device >= 0 else "cpu"
+         )
+
+         outputs = get_outputs(
+             model=model,
+             batch_images=batch_images,
+             device=device,
+             attention_mask=attention_mask,
+             max_generation_length=max_generation_length,
+             num_beams=num_beams,
+             length_penalty=length_penalty,
+             input_ids=input_ids,
+         )
+
+         process_function = (
+             postprocess_vqa_generation
+             if vqa_dataset == "vqa"
+             else postprocess_ok_vqa_generation
+         )
+
+         new_predictions = [
+             process_function(out)
+             for out in tokenizer.batch_decode(outputs, skip_special_tokens=True)
+         ]
+
+         predictions.extend(
+             [
+                 {"answer": p, "question_id": sample["question_id"]}
+                 for p, sample in zip(new_predictions, batch)
+             ]
+         )
+     # save the predictions to a temporary file
+     random_uuid = str(uuid.uuid4())
+     with open(f"{vqa_dataset}results_{random_uuid}.json", "w") as f:
+         f.write(json.dumps(predictions, indent=4))
+
+     acc = compute_vqa_accuracy(
+         f"{vqa_dataset}results_{random_uuid}.json",
+         questions_json_path,
+         annotations_json_path,
+     )
+
+     # delete the temporary file
+     os.remove(f"{vqa_dataset}results_{random_uuid}.json")
+
+     return acc
+
+
+ def evaluate_imagenet(
+     model,
+     tokenizer,
+     image_processor,
+     batch_size: int,
+     imagenet_root: str,
+     seed: int = 42,
+     num_samples: int = 5000,
+     num_shots: int = 8,
+     device: int = -1,
+ ):
+     """
+     Evaluate a model on the ImageNet dataset.
+
+     Args:
+         model: model to evaluate
+         tokenizer (transformers.PreTrainedTokenizer): tokenizer for the model
+         image_processor : image processor for the model
+         batch_size (int): batch size
+         imagenet_root (str): path to imagenet root for the specified split.
+         seed (int, optional): random seed. Defaults to 42.
+         num_samples (int, optional): number of samples to evaluate on. Defaults to 5000.
+         num_shots (int, optional): number of shots to use. Defaults to 8.
+         device (int, optional): device to use. Defaults to -1 (cpu).
+
+     Returns:
+         float: accuracy score
+     """
+
+     full_dataset = ImageNetDataset(root=imagenet_root)
+
+     effective_num_shots = num_shots if num_shots > 0 else 2
+
+     if num_samples + effective_num_shots > len(full_dataset):
+         raise ValueError(
+             f"num_samples + num_shots must be less than or equal to "
+             f"{len(full_dataset)} "
+         )
+
+     random_indices = get_random_indices(
+         num_samples, effective_num_shots, full_dataset, seed
+     )
+
+     eoc_token = "<|endofchunk|>"
+     eoc_token_id = tokenizer.additional_special_tokens_ids[
+         tokenizer.additional_special_tokens.index(eoc_token)
+     ]
+
+     # Padding from the right allows efficient precomputing of context activations.
+     tokenizer.padding_side = "right"
+
+     def _imagenet_prompt(class_name, is_context: bool = True):
+         """Construct an ImageNet prompt for a given label."""
+         prefix = "<image>A photo of a "
+         if is_context:
+             return prefix + class_name.strip()
+         else:
+             # Not a context example; insert an EOS token before the class name
+             # so that we can compute the loss on the class name tokens only.
+             return prefix + tokenizer.eos_token + class_name.strip()
+
+     def get_imagenet_prompt(x: dict, is_context: bool = True) -> str:
+         """Construct an ImageNet prompt for an example, using its label."""
+         return _imagenet_prompt(x["class_name"], is_context=is_context)
+
+     in_context_samples, eval_dataset = prepare_eval_samples_and_dataset(
+         full_dataset=full_dataset,
+         random_indices=random_indices,
+         # NOTE: here we replace query_set_size with effective_num_shots, but this
+         # is not the ideal evaluation setting.
+         # TODO: add a query_set_size argument and use it to randomly sample the
+         # context for each example. This would be more consistent with the
+         # evaluation setting in the paper but requires reworking the caching.
+         query_set_size=effective_num_shots,
+     )
+
+     device = device if device >= 0 else "cpu"
+
+     model.eval()
+     # Predictions based on the class target sequence with the maximal
+     # predicted probability
+     predictions_max_prob = []
+     # Predictions based on the class target sequence with the minimal loss on
+     # the model logits
+     predictions_min_loss = []
+     labels = []
+
+     context_images = [
+         get_context_images(
+             image_processor=image_processor,
+             in_context_samples=in_context_samples,
+             num_shots=num_shots,
+         )
+         for _ in range(batch_size)
+     ]
+
+     context_text = get_context_text(
+         get_imagenet_prompt,
+         in_context_samples=in_context_samples,
+         effective_num_shots=effective_num_shots,
+         num_shots=num_shots,
+     )
+
+     # kwargs to use when calling the tokenizer
+     tokenizer_kwargs = {
+         "return_tensors": "pt",
+         "padding": True,
+         "truncation": True,
+         "max_length": 256,
+     }
+
+     for i, batch in enumerate(more_itertools.chunked(eval_dataset, batch_size)):
+         print(f"processing batch {i} of {ceil(len(eval_dataset) / batch_size)}")
+         batch_per_class_probs = []
+         batch_per_class_losses = []
+         batch_images = prepare_batch_images(
+             batch=batch,
+             image_processor=image_processor,
+             context_images=context_images,
+             num_shots=num_shots,
+         )
+
+         # Process the images only once.
+         batch_images = batch_images.to(device)
+         model._encode_vision_x(vision_x=batch_images)
+
+         # Process the context text only once.
+         context_encodings = tokenizer([context_text] * batch_size, **tokenizer_kwargs)
+         context_ids = context_encodings["input_ids"].to(device)
+         context_len = context_ids.shape[-1]
+         context_precomputed = model(
+             None,
+             context_ids,
+             use_cached_vision_x=True,
+             clear_conditioned_layers=False,
+             use_cache=True,
+         )
+
+         # For each ImageNet class, construct the output prompt, compute a
+         # forward pass, and store the results.
+         for imagenet_class_name in tqdm(openai_imagenet_classnames):
+             batch_text = [
+                 context_text + _imagenet_prompt(imagenet_class_name, False) + eoc_token
+             ] * batch_size
+
+             full_batch_encodings = tokenizer(batch_text, **tokenizer_kwargs)
+
+             # full_batch_input_ids has shape [batch_size, seq_len], but we
+             # only need to run inference on the [batch_size, context_len:]
+             # inputs that have not been precomputed and vary per class.
+             full_batch_input_ids = full_batch_encodings["input_ids"].to(device)
+             full_batch_attention_mask = full_batch_encodings["attention_mask"].to(
+                 device
+             )
+
+             # Sanity check that the encoded inputs with context are the same
+             # as the encoded context alone, for every example in the batch
+             assert torch.all(
+                 context_ids[0, :] == full_batch_input_ids[:, :context_len]
+             ).item()
+
+             # Clone the nested structure of the past key values
+             past_key_values = tuple(
+                 tuple(x.clone() for x in inner)
+                 for inner in context_precomputed.past_key_values
+             )
+
+             # Compute the outputs without recomputing context representations.
+             outputs = model(
+                 vision_x=None,
+                 lang_x=full_batch_input_ids[:, context_len:],
+                 attention_mask=full_batch_attention_mask,
+                 use_cached_vision_x=True,
+                 clear_conditioned_layers=False,
+                 past_key_values=past_key_values,
+                 use_cache=True,
+             )
+
+             logits = torch.concat((context_precomputed.logits, outputs.logits), 1)
+
+             per_sample_probs = compute_per_sample_probs(
+                 encodings=full_batch_encodings,
+                 tokenizer=tokenizer,
+                 logits=logits,
+                 eoc_token_id=eoc_token_id,
+             )
+             per_sample_loss = compute_per_sample_loss(
+                 encodings=full_batch_encodings,
+                 tokenizer=tokenizer,
+                 logits=logits,
+                 eoc_token_id=eoc_token_id,
+             )
+             batch_per_class_probs.append(per_sample_probs.detach())
+             batch_per_class_losses.append(per_sample_loss.detach())
+
+         # Tensor of shape [batch_size, 1000] where the [i, j]th element is
+         # the probability (or loss) for batch element i on ImageNet class j.
+         batch_probs = torch.stack(batch_per_class_probs, 1)
+         batch_losses = torch.stack(batch_per_class_losses, 1)
+
+         predictions_max_prob.extend(torch.argmax(batch_probs, 1).detach().tolist())
+         predictions_min_loss.extend(torch.argmin(batch_losses, 1).detach().tolist())
+         labels.extend(x["class_id"] for x in batch)
+
+     acc_max_prob = (np.array(predictions_max_prob) == np.array(labels)).mean()
+     acc_min_loss = (np.array(predictions_min_loss) == np.array(labels)).mean()
+     print(f"[DEBUG] ImageNet accuracy with max prob method is {acc_max_prob}")
+     print(f"[DEBUG] ImageNet accuracy with min loss method is {acc_min_loss}")
+     print("[DEBUG] printing ImageNet predictions and labels:")
+     for yhat_prob, yhat_loss, y in zip(
+         predictions_max_prob, predictions_min_loss, labels
+     ):
+         print(
+             " " * 30 + f"label: {IMAGENET_1K_CLASS_ID_TO_LABEL[y]}"
+             f"\nprediction (max prob method): "
+             f"{IMAGENET_1K_CLASS_ID_TO_LABEL[yhat_prob]}"
+             f"\nprediction (min loss method): "
+             f"{IMAGENET_1K_CLASS_ID_TO_LABEL[yhat_loss]}\n"
+             + "#" * 25
+         )
+     return acc_max_prob
+
+
+ if __name__ == "__main__":
+     main()
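A standalone sketch (not part of the diff) of the zero-shot trick used by `get_context_text` in the evaluation script above: with `num_shots == 0` the script still builds `effective_num_shots = 2` text-only demonstrations, but strips their `<image>` tags so the model sees example prompts without expecting demonstration images. The sample captions and prompt format below are illustrative only.

```python
from typing import Callable


def get_context_text(
    get_prompt: Callable[[dict], str],
    in_context_samples,
    effective_num_shots,
    num_shots,
) -> str:
    # Join the demonstration prompts; with zero shots, drop the image tags
    # so the demonstrations become text-only.
    context_text = (
        "".join(get_prompt(s) for s in in_context_samples)
        if effective_num_shots > 0
        else ""
    )
    if num_shots == 0:
        context_text = context_text.replace("<image>", "")
    return context_text


samples = [{"caption": "a cat"}, {"caption": "a dog"}]
prompt = lambda s: f"<image>Output:{s['caption']}<|endofchunk|>"

# Zero-shot: demonstrations are kept as text, image tags removed.
print(get_context_text(prompt, samples, effective_num_shots=2, num_shots=0))
# -> Output:a cat<|endofchunk|>Output:a dog<|endofchunk|>
```

With `num_shots > 0` the same call would keep the `<image>` tags, so each demonstration is paired with one of the context images prepared by `get_context_images`.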
minigpt2/lib/python3.10/site-packages/open_flamingo/eval/ok_vqa_utils.py ADDED
@@ -0,0 +1,214 @@
+ # These are manual mappings that are not caught by our stemming rules or
+ # would be handled incorrectly by our automatic stemming rule. In detail,
+ # the keys of the _MANUAL_MATCHES dict contain the original word and the
+ # values contain the transformation of the word expected by the OK-VQA
+ # stemming rule. These manual rules were found by checking the `raw_answers`
+ # and the `answers` fields of the released OK-VQA dataset and checking all
+ # things that were not properly mapped by our automatic rules. In
+ # particular, some of the mappings are constant, e.g. christmas ->
+ # christmas, which would otherwise be incorrectly singularized by our
+ # inflection.singularize.
+ import re
+ import nltk
+ from nltk.corpus.reader import VERB
+ import inflection
+
+ _MANUAL_MATCHES = {
+     "police": "police",
+     "las": "las",
+     "vegas": "vegas",
+     "yes": "yes",
+     "jeans": "jean",
+     "hell's": "hell",
+     "domino's": "domino",
+     "morning": "morn",
+     "clothes": "cloth",
+     "are": "are",
+     "riding": "ride",
+     "leaves": "leaf",
+     "dangerous": "danger",
+     "clothing": "cloth",
+     "texting": "text",
+     "kiting": "kite",
+     "firefighters": "firefight",
+     "ties": "tie",
+     "married": "married",
+     "teething": "teeth",
+     "gloves": "glove",
+     "tennis": "tennis",
+     "dining": "dine",
+     "directions": "direct",
+     "waves": "wave",
+     "christmas": "christmas",
+     "drives": "drive",
+     "pudding": "pud",
+     "coding": "code",
+     "plating": "plate",
+     "quantas": "quanta",
+     "hornes": "horn",
+     "graves": "grave",
+     "mating": "mate",
+     "paned": "pane",
+     "alertness": "alert",
+     "sunbathing": "sunbath",
+     "tenning": "ten",
+     "wetness": "wet",
+     "urinating": "urine",
+     "sickness": "sick",
+     "braves": "brave",
+     "firefighting": "firefight",
+     "lenses": "lens",
+     "reflections": "reflect",
+     "backpackers": "backpack",
+     "eatting": "eat",
+     "designers": "design",
+     "curiousity": "curious",
+     "playfulness": "play",
+     "blindness": "blind",
+     "hawke": "hawk",
+     "tomatoe": "tomato",
+     "rodeoing": "rodeo",
+     "brightness": "bright",
+     "circuses": "circus",
+     "skateboarders": "skateboard",
+     "staring": "stare",
+     "electronics": "electron",
+     "electicity": "elect",
+     "mountainous": "mountain",
+     "socializing": "social",
+     "hamburgers": "hamburg",
+     "caves": "cave",
+     "transitions": "transit",
+     "wading": "wade",
+     "creame": "cream",
+     "toileting": "toilet",
+     "sautee": "saute",
+     "buildings": "build",
+     "belongings": "belong",
+     "stockings": "stock",
+     "walle": "wall",
+     "cumulis": "cumuli",
+     "travelers": "travel",
+     "conducter": "conduct",
+     "browsing": "brows",
+     "pooping": "poop",
+     "haircutting": "haircut",
+     "toppings": "top",
+     "hearding": "heard",
+     "sunblocker": "sunblock",
+     "bases": "base",
+     "markings": "mark",
+     "mopeds": "mope",
+     "kindergartener": "kindergarten",
+     "pies": "pie",
+     "scrapbooking": "scrapbook",
+     "couponing": "coupon",
+     "meetings": "meet",
+     "elevators": "elev",
+     "lowes": "low",
+     "men's": "men",
+     "childrens": "children",
+     "shelves": "shelve",
+     "paintings": "paint",
+     "raines": "rain",
+     "paring": "pare",
+     "expressions": "express",
+     "routes": "rout",
+     "pease": "peas",
+     "vastness": "vast",
+     "awning": "awn",
+     "boy's": "boy",
+     "drunkenness": "drunken",
121
+ "teasing": "teas",
122
+ "conferences": "confer",
123
+ "ripeness": "ripe",
124
+ "suspenders": "suspend",
125
+ "earnings": "earn",
126
+ "reporters": "report",
127
+ "kid's": "kid",
128
+ "containers": "contain",
129
+ "corgie": "corgi",
130
+ "porche": "porch",
131
+ "microwaves": "microwave",
132
+ "batter's": "batter",
133
+ "sadness": "sad",
134
+ "apartments": "apart",
135
+ "oxygenize": "oxygen",
136
+ "striping": "stripe",
137
+ "purring": "pure",
138
+ "professionals": "profession",
139
+ "piping": "pipe",
140
+ "farmer's": "farmer",
141
+ "potatoe": "potato",
142
+ "emirates": "emir",
143
+ "womens": "women",
144
+ "veteran's": "veteran",
145
+ "wilderness": "wilder",
146
+ "propellers": "propel",
147
+ "alpes": "alp",
148
+ "charioteering": "chariot",
149
+ "swining": "swine",
150
+ "illness": "ill",
151
+ "crepte": "crept",
152
+ "adhesives": "adhesive",
153
+ "regent's": "regent",
154
+ "decorations": "decor",
155
+ "rabbies": "rabbi",
156
+ "overseas": "oversea",
157
+ "travellers": "travel",
158
+ "casings": "case",
159
+ "smugness": "smug",
160
+ "doves": "dove",
161
+ "nationals": "nation",
162
+ "mustange": "mustang",
163
+ "ringe": "ring",
164
+ "gondoliere": "gondolier",
165
+ "vacationing": "vacate",
166
+ "reminders": "remind",
167
+ "baldness": "bald",
168
+ "settings": "set",
169
+ "glaced": "glace",
170
+ "coniferous": "conifer",
171
+ "revelations": "revel",
172
+ "personals": "person",
173
+ "daughter's": "daughter",
174
+ "badness": "bad",
175
+ "projections": "project",
176
+ "polarizing": "polar",
177
+ "vandalizers": "vandal",
178
+ "minerals": "miner",
179
+ "protesters": "protest",
180
+ "controllers": "control",
181
+ "weddings": "wed",
182
+ "sometimes": "sometime",
183
+ "earing": "ear",
184
+ }
185
+
186
+
187
+ class OKVQAStemmer:
188
+ """Stemmer to match OKVQA v1.1 procedure."""
189
+
190
+ def __init__(self):
191
+ self._wordnet_lemmatizer = nltk.stem.WordNetLemmatizer()
192
+
193
+ def stem(self, input_string):
194
+ """Apply stemming."""
195
+ word_and_pos = nltk.pos_tag(nltk.tokenize.word_tokenize(input_string))
196
+ stemmed_words = []
197
+ for w, p in word_and_pos:
198
+ if w in _MANUAL_MATCHES:
199
+ w = _MANUAL_MATCHES[w]
200
+ elif w.endswith("ing"):
201
+ w = self._wordnet_lemmatizer.lemmatize(w, VERB)
202
+ elif p.startswith("NNS") or p.startswith("NNPS"):
203
+ w = inflection.singularize(w)
204
+ stemmed_words.append(w)
205
+ return " ".join(stemmed_words)
206
+
207
+
208
+ stemmer = OKVQAStemmer()
209
+
210
+
211
+ def postprocess_ok_vqa_generation(predictions) -> str:
212
+ prediction = re.split("Question|Answer", predictions, maxsplit=1)[0]
213
+ prediction_stem = stemmer.stem(prediction)
214
+ return prediction_stem
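The rule ordering in `OKVQAStemmer.stem` above (manual override first, then `-ing` lemmatization, then plural singularization) can be sketched without the NLTK and inflection dependencies. The `toy_lemmatize`/`toy_singularize` helpers below are naive illustrative stand-ins, not the real library calls:

```python
# Illustrative sketch of the OKVQA stemming rule precedence. The helpers are
# naive stand-ins for WordNetLemmatizer.lemmatize(w, VERB) and
# inflection.singularize; only the rule *ordering* mirrors the class above.
_MANUAL = {"christmas": "christmas", "leaves": "leaf"}

def toy_lemmatize(word):
    # naive stand-in: strip a trailing "ing"
    return word[:-3] if word.endswith("ing") else word

def toy_singularize(word):
    # naive stand-in: strip a trailing "s"
    return word[:-1] if word.endswith("s") else word

def stem_word(word, pos_tag):
    if word in _MANUAL:                       # rule 1: manual override wins
        return _MANUAL[word]
    if word.endswith("ing"):                  # rule 2: verb-like -ing forms
        return toy_lemmatize(word)
    if pos_tag.startswith("NNS") or pos_tag.startswith("NNPS"):
        return toy_singularize(word)          # rule 3: plural nouns
    return word

print(stem_word("christmas", "NN"))  # christmas (manual identity mapping)
print(stem_word("leaves", "NNS"))    # leaf (manual, not the naive "leave")
print(stem_word("dogs", "NNS"))      # dog
```

Note that the manual table is consulted before the POS tag is even considered, which is how identity entries like `christmas` shield words from the automatic rules.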
minigpt2/lib/python3.10/site-packages/open_flamingo/eval/vqa_metric.py ADDED
@@ -0,0 +1,578 @@
1
+ import copy
2
+ import datetime
3
+ import json
4
+ import os
5
+ import random
6
+ import re
7
+ import sys
8
+
9
+ # Interface for accessing the VQA dataset.
10
+
11
+ # This code is based on the code written by Tsung-Yi Lin for MSCOCO Python API available at the following link:
12
+ # (https://github.com/pdollar/coco/blob/master/PythonAPI/pycocotools/coco.py).
13
+
14
+ # The following functions are defined:
15
+ # VQA - VQA class that loads VQA annotation file and prepares data structures.
16
+ # getQuesIds - Get question ids that satisfy given filter conditions.
17
+ # getImgIds - Get image ids that satisfy given filter conditions.
18
+ # loadQA - Load questions and answers with the specified question ids.
19
+ # showQA - Display the specified questions and answers.
20
+ # loadRes - Load result file and create result object.
21
+
22
+ # Help on each function can be accessed by: "help(COCO.function)"
23
+
24
+
25
+ class VQA:
26
+ def __init__(self, annotation_file=None, question_file=None):
27
+ """
28
+ Constructor of VQA helper class for reading and visualizing questions and answers.
29
+ :param annotation_file (str): location of VQA annotation file
30
+ :return:
31
+ """
32
+ # load dataset
33
+ self.dataset = {}
34
+ self.questions = {}
35
+ self.qa = {}
36
+ self.qqa = {}
37
+ self.imgToQA = {}
38
+ if annotation_file is not None and question_file is not None:
39
+ print("loading VQA annotations and questions into memory...")
40
+ time_t = datetime.datetime.utcnow()
41
+ dataset = json.load(open(annotation_file, "r"))
42
+ questions = json.load(open(question_file, "r"))
43
+ print(datetime.datetime.utcnow() - time_t)
44
+ self.dataset = dataset
45
+ self.questions = questions
46
+ self.createIndex()
47
+
48
+ def createIndex(self):
49
+ # create index
50
+ print("creating index...")
51
+ imgToQA = {ann["image_id"]: [] for ann in self.dataset["annotations"]}
52
+ qa = {ann["question_id"]: [] for ann in self.dataset["annotations"]}
53
+ qqa = {ann["question_id"]: [] for ann in self.dataset["annotations"]}
54
+ for ann in self.dataset["annotations"]:
55
+ imgToQA[ann["image_id"]] += [ann]
56
+ qa[ann["question_id"]] = ann
57
+ for ques in self.questions["questions"]:
58
+ qqa[ques["question_id"]] = ques
59
+ print("index created!")
60
+
61
+ # create class members
62
+ self.qa = qa
63
+ self.qqa = qqa
64
+ self.imgToQA = imgToQA
65
+
66
+ def info(self):
67
+ """
68
+ Print information about the VQA annotation file.
69
+ :return:
70
+ """
71
+ for key, value in self.dataset["info"].items():
72
+ print("%s: %s" % (key, value))
73
+
74
+ def getQuesIds(self, imgIds=[], quesTypes=[], ansTypes=[]):
75
+ """
76
+ Get question ids that satisfy given filter conditions. default skips that filter
77
+ :param imgIds (int array) : get question ids for given imgs
78
+ quesTypes (str array) : get question ids for given question types
79
+ ansTypes (str array) : get question ids for given answer types
80
+ :return: ids (int array) : integer array of question ids
81
+ """
82
+ imgIds = imgIds if type(imgIds) == list else [imgIds]
83
+ quesTypes = quesTypes if type(quesTypes) == list else [quesTypes]
84
+ ansTypes = ansTypes if type(ansTypes) == list else [ansTypes]
85
+
86
+ if len(imgIds) == len(quesTypes) == len(ansTypes) == 0:
87
+ anns = self.dataset["annotations"]
88
+ else:
89
+ if not len(imgIds) == 0:
90
+ anns = sum(
91
+ [self.imgToQA[imgId] for imgId in imgIds if imgId in self.imgToQA],
92
+ [],
93
+ )
94
+ else:
95
+ anns = self.dataset["annotations"]
96
+ anns = (
97
+ anns
98
+ if len(quesTypes) == 0
99
+ else [ann for ann in anns if ann["question_type"] in quesTypes]
100
+ )
101
+ anns = (
102
+ anns
103
+ if len(ansTypes) == 0
104
+ else [ann for ann in anns if ann["answer_type"] in ansTypes]
105
+ )
106
+ ids = [ann["question_id"] for ann in anns]
107
+ return ids
108
+
109
+ def getImgIds(self, quesIds=[], quesTypes=[], ansTypes=[]):
110
+ """
111
+ Get image ids that satisfy given filter conditions. default skips that filter
112
+ :param quesIds (int array) : get image ids for given question ids
113
+ quesTypes (str array) : get image ids for given question types
114
+ ansTypes (str array) : get image ids for given answer types
115
+ :return: ids (int array) : integer array of image ids
116
+ """
117
+ quesIds = quesIds if type(quesIds) == list else [quesIds]
118
+ quesTypes = quesTypes if type(quesTypes) == list else [quesTypes]
119
+ ansTypes = ansTypes if type(ansTypes) == list else [ansTypes]
120
+
121
+ if len(quesIds) == len(quesTypes) == len(ansTypes) == 0:
122
+ anns = self.dataset["annotations"]
123
+ else:
124
+ if not len(quesIds) == 0:
125
+ anns = sum(
126
+ [self.qa[quesId] for quesId in quesIds if quesId in self.qa], []
127
+ )
128
+ else:
129
+ anns = self.dataset["annotations"]
130
+ anns = (
131
+ anns
132
+ if len(quesTypes) == 0
133
+ else [ann for ann in anns if ann["question_type"] in quesTypes]
134
+ )
135
+ anns = (
136
+ anns
137
+ if len(ansTypes) == 0
138
+ else [ann for ann in anns if ann["answer_type"] in ansTypes]
139
+ )
140
+ ids = [ann["image_id"] for ann in anns]
141
+ return ids
142
+
143
+ def loadQA(self, ids=[]):
144
+ """
145
+ Load questions and answers with the specified question ids.
146
+ :param ids (int array) : integer ids specifying question ids
147
+ :return: qa (object array) : loaded qa objects
148
+ """
149
+ if type(ids) == list:
150
+ return [self.qa[id] for id in ids]
151
+ elif type(ids) == int:
152
+ return [self.qa[ids]]
153
+
154
+ def showQA(self, anns):
155
+ """
156
+ Display the specified annotations.
157
+ :param anns (array of object): annotations to display
158
+ :return: None
159
+ """
160
+ if len(anns) == 0:
161
+ return 0
162
+ for ann in anns:
163
+ quesId = ann["question_id"]
164
+ print("Question: %s" % (self.qqa[quesId]["question"]))
165
+ for ans in ann["answers"]:
166
+ print("Answer %d: %s" % (ans["answer_id"], ans["answer"]))
167
+
168
+ def loadRes(self, resFile, quesFile):
169
+ """
170
+ Load result file and return a result object.
171
+ :param resFile (str) : file name of result file
172
+ :return: res (obj) : result api object
173
+ """
174
+ res = VQA()
175
+ res.questions = json.load(open(quesFile))
176
+ res.dataset["info"] = copy.deepcopy(self.questions["info"])
177
+ res.dataset["task_type"] = copy.deepcopy(self.questions["task_type"])
178
+ res.dataset["data_type"] = copy.deepcopy(self.questions["data_type"])
179
+ res.dataset["data_subtype"] = copy.deepcopy(self.questions["data_subtype"])
180
+ res.dataset["license"] = copy.deepcopy(self.questions["license"])
181
+
182
+ print("Loading and preparing results... ")
183
+ time_t = datetime.datetime.utcnow()
184
+ anns = json.load(open(resFile))
185
+ assert type(anns) == list, "results is not an array of objects"
186
+ annsQuesIds = [ann["question_id"] for ann in anns]
187
+ # print set of question ids that do not have corresponding annotations
188
+
189
+ # assert set(annsQuesIds) == set(self.getQuesIds()), \
190
+ # 'Results do not correspond to current VQA set. Either the results do not have predictions for all question ids in annotation file or there is atleast one question id that does not belong to the question ids in the annotation file.'
191
+ for ann in anns:
192
+ quesId = ann["question_id"]
193
+ if res.dataset["task_type"] == "Multiple Choice":
194
+ assert (
195
+ ann["answer"] in self.qqa[quesId]["multiple_choices"]
196
+ ), "predicted answer is not one of the multiple choices"
197
+ qaAnn = self.qa[quesId]
198
+ ann["image_id"] = qaAnn["image_id"]
199
+ ann["question_type"] = qaAnn["question_type"]
200
+ ann["answer_type"] = qaAnn["answer_type"]
201
+ print(
202
+ "DONE (t=%0.2fs)" % ((datetime.datetime.utcnow() - time_t).total_seconds())
203
+ )
204
+
205
+ res.dataset["annotations"] = anns
206
+ res.createIndex()
207
+ return res
208
+
209
+
210
+ class VQAEval:
211
+ def __init__(self, vqa, vqaRes, n=2):
212
+ self.n = n
213
+ self.accuracy = {}
214
+ self.evalQA = {}
215
+ self.evalQuesType = {}
216
+ self.evalAnsType = {}
217
+ self.vqa = vqa
218
+ self.vqaRes = vqaRes
219
+ self.params = {"question_id": vqaRes.getQuesIds()}
220
+ self.contractions = {
221
+ "aint": "ain't",
222
+ "arent": "aren't",
223
+ "cant": "can't",
224
+ "couldve": "could've",
225
+ "couldnt": "couldn't",
226
+ "couldn'tve": "couldn't've",
227
+ "couldnt've": "couldn't've",
228
+ "didnt": "didn't",
229
+ "doesnt": "doesn't",
230
+ "dont": "don't",
231
+ "hadnt": "hadn't",
232
+ "hadnt've": "hadn't've",
233
+ "hadn'tve": "hadn't've",
234
+ "hasnt": "hasn't",
235
+ "havent": "haven't",
236
+ "hed": "he'd",
237
+ "hed've": "he'd've",
238
+ "he'dve": "he'd've",
239
+ "hes": "he's",
240
+ "howd": "how'd",
241
+ "howll": "how'll",
242
+ "hows": "how's",
243
+ "Id've": "I'd've",
244
+ "I'dve": "I'd've",
245
+ "Im": "I'm",
246
+ "Ive": "I've",
247
+ "isnt": "isn't",
248
+ "itd": "it'd",
249
+ "itd've": "it'd've",
250
+ "it'dve": "it'd've",
251
+ "itll": "it'll",
252
+ "let's": "let's",
253
+ "maam": "ma'am",
254
+ "mightnt": "mightn't",
255
+ "mightnt've": "mightn't've",
256
+ "mightn'tve": "mightn't've",
257
+ "mightve": "might've",
258
+ "mustnt": "mustn't",
259
+ "mustve": "must've",
260
+ "neednt": "needn't",
261
+ "notve": "not've",
262
+ "oclock": "o'clock",
263
+ "oughtnt": "oughtn't",
264
+ "ow's'at": "'ow's'at",
265
+ "'ows'at": "'ow's'at",
266
+ "'ow'sat": "'ow's'at",
267
+ "shant": "shan't",
268
+ "shed've": "she'd've",
269
+ "she'dve": "she'd've",
270
+ "she's": "she's",
271
+ "shouldve": "should've",
272
+ "shouldnt": "shouldn't",
273
+ "shouldnt've": "shouldn't've",
274
+ "shouldn'tve": "shouldn't've",
275
+ "somebody'd": "somebodyd",
276
+ "somebodyd've": "somebody'd've",
277
+ "somebody'dve": "somebody'd've",
278
+ "somebodyll": "somebody'll",
279
+ "somebodys": "somebody's",
280
+ "someoned": "someone'd",
281
+ "someoned've": "someone'd've",
282
+ "someone'dve": "someone'd've",
283
+ "someonell": "someone'll",
284
+ "someones": "someone's",
285
+ "somethingd": "something'd",
286
+ "somethingd've": "something'd've",
287
+ "something'dve": "something'd've",
288
+ "somethingll": "something'll",
289
+ "thats": "that's",
290
+ "thered": "there'd",
291
+ "thered've": "there'd've",
292
+ "there'dve": "there'd've",
293
+ "therere": "there're",
294
+ "theres": "there's",
295
+ "theyd": "they'd",
296
+ "theyd've": "they'd've",
297
+ "they'dve": "they'd've",
298
+ "theyll": "they'll",
299
+ "theyre": "they're",
300
+ "theyve": "they've",
301
+ "twas": "'twas",
302
+ "wasnt": "wasn't",
303
+ "wed've": "we'd've",
304
+ "we'dve": "we'd've",
305
+ "weve": "we've",
306
+ "werent": "weren't",
307
+ "whatll": "what'll",
308
+ "whatre": "what're",
309
+ "whats": "what's",
310
+ "whatve": "what've",
311
+ "whens": "when's",
312
+ "whered": "where'd",
313
+ "wheres": "where's",
314
+ "whereve": "where've",
315
+ "whod": "who'd",
316
+ "whod've": "who'd've",
317
+ "who'dve": "who'd've",
318
+ "wholl": "who'll",
319
+ "whos": "who's",
320
+ "whove": "who've",
321
+ "whyll": "why'll",
322
+ "whyre": "why're",
323
+ "whys": "why's",
324
+ "wont": "won't",
325
+ "wouldve": "would've",
326
+ "wouldnt": "wouldn't",
327
+ "wouldnt've": "wouldn't've",
328
+ "wouldn'tve": "wouldn't've",
329
+ "yall": "y'all",
330
+ "yall'll": "y'all'll",
331
+ "y'allll": "y'all'll",
332
+ "yall'd've": "y'all'd've",
333
+ "y'alld've": "y'all'd've",
334
+ "y'all'dve": "y'all'd've",
335
+ "youd": "you'd",
336
+ "youd've": "you'd've",
337
+ "you'dve": "you'd've",
338
+ "youll": "you'll",
339
+ "youre": "you're",
340
+ "youve": "you've",
341
+ }
342
+ self.manualMap = {
343
+ "none": "0",
344
+ "zero": "0",
345
+ "one": "1",
346
+ "two": "2",
347
+ "three": "3",
348
+ "four": "4",
349
+ "five": "5",
350
+ "six": "6",
351
+ "seven": "7",
352
+ "eight": "8",
353
+ "nine": "9",
354
+ "ten": "10",
355
+ }
356
+ self.articles = ["a", "an", "the"]
357
+
358
+ self.periodStrip = re.compile(r"(?<!\d)(\.)(?!\d)")
359
+ self.commaStrip = re.compile(r"(\d)(,)(\d)")
360
+ self.punct = [
361
+ ";",
362
+ r"/",
363
+ "[",
364
+ "]",
365
+ '"',
366
+ "{",
367
+ "}",
368
+ "(",
369
+ ")",
370
+ "=",
371
+ "+",
372
+ "\\",
373
+ "_",
374
+ "-",
375
+ ">",
376
+ "<",
377
+ "@",
378
+ "`",
379
+ ",",
380
+ "?",
381
+ "!",
382
+ ]
383
+
384
+ def evaluate(self, quesIds=None):
385
+ if quesIds is None:
386
+ quesIds = list(self.params["question_id"])
387
+ gts = {}
388
+ res = {}
389
+ for quesId in quesIds:
390
+ gts[quesId] = self.vqa.qa[quesId]
391
+ res[quesId] = self.vqaRes.qa[quesId]
392
+
393
+ # =================================================
394
+ # Compute accuracy
395
+ # =================================================
396
+ accQA = []
397
+ accQuesType = {}
398
+ accAnsType = {}
399
+ print("computing accuracy")
400
+ step = 0
401
+ for quesId in quesIds:
402
+ for ansDic in gts[quesId]["answers"]:
403
+ ansDic["answer"] = ansDic["answer"].replace("\n", " ")
404
+ ansDic["answer"] = ansDic["answer"].replace("\t", " ")
405
+ ansDic["answer"] = ansDic["answer"].strip()
406
+ resAns = res[quesId]["answer"]
407
+ resAns = resAns.replace("\n", " ")
408
+ resAns = resAns.replace("\t", " ")
409
+ resAns = resAns.strip()
410
+ gtAcc = []
411
+ gtAnswers = [ans["answer"] for ans in gts[quesId]["answers"]]
412
+
413
+ if len(set(gtAnswers)) > 1:
414
+ for ansDic in gts[quesId]["answers"]:
415
+ ansDic["answer"] = self.processPunctuation(ansDic["answer"])
416
+ ansDic["answer"] = self.processDigitArticle(ansDic["answer"])
417
+ resAns = self.processPunctuation(resAns)
418
+ resAns = self.processDigitArticle(resAns)
419
+
420
+ for gtAnsDatum in gts[quesId]["answers"]:
421
+ otherGTAns = [
422
+ item for item in gts[quesId]["answers"] if item != gtAnsDatum
423
+ ]
424
+ matchingAns = [item for item in otherGTAns if item["answer"] == resAns]
425
+ acc = min(1, float(len(matchingAns)) / 3)
426
+ gtAcc.append(acc)
427
+ quesType = gts[quesId]["question_type"]
428
+ ansType = gts[quesId]["answer_type"]
429
+ avgGTAcc = float(sum(gtAcc)) / len(gtAcc)
430
+ accQA.append(avgGTAcc)
431
+ if quesType not in accQuesType:
432
+ accQuesType[quesType] = []
433
+ accQuesType[quesType].append(avgGTAcc)
434
+ if ansType not in accAnsType:
435
+ accAnsType[ansType] = []
436
+ accAnsType[ansType].append(avgGTAcc)
437
+ self.setEvalQA(quesId, avgGTAcc)
438
+ self.setEvalQuesType(quesId, quesType, avgGTAcc)
439
+ self.setEvalAnsType(quesId, ansType, avgGTAcc)
440
+ if step % 100 == 0:
441
+ self.updateProgress(step / float(len(quesIds)))
442
+ step = step + 1
443
+
444
+ self.setAccuracy(accQA, accQuesType, accAnsType)
445
+ print("Done computing accuracy")
446
+
447
+ def processPunctuation(self, inText):
448
+ outText = inText
449
+ for p in self.punct:
450
+ if (p + " " in inText or " " + p in inText) or (
451
+ re.search(self.commaStrip, inText) is not None
452
+ ):
453
+ outText = outText.replace(p, "")
454
+ else:
455
+ outText = outText.replace(p, " ")
456
+ outText = self.periodStrip.sub("", outText)
457
+ return outText
458
+
459
+ def processDigitArticle(self, inText):
460
+ outText = []
461
+ tempText = inText.lower().split()
462
+ for word in tempText:
463
+ word = self.manualMap.setdefault(word, word)
464
+ if word not in self.articles:
465
+ outText.append(word)
466
+ else:
467
+ pass
468
+ for wordId, word in enumerate(outText):
469
+ if word in self.contractions:
470
+ outText[wordId] = self.contractions[word]
471
+ outText = " ".join(outText)
472
+ return outText
473
+
474
+ def setAccuracy(self, accQA, accQuesType, accAnsType):
475
+ self.accuracy["overall"] = round(100 * float(sum(accQA)) / len(accQA), self.n)
476
+ self.accuracy["perQuestionType"] = {
477
+ quesType: round(
478
+ 100 * float(sum(accQuesType[quesType])) / len(accQuesType[quesType]),
479
+ self.n,
480
+ )
481
+ for quesType in accQuesType
482
+ }
483
+ self.accuracy["perAnswerType"] = {
484
+ ansType: round(
485
+ 100 * float(sum(accAnsType[ansType])) / len(accAnsType[ansType]), self.n
486
+ )
487
+ for ansType in accAnsType
488
+ }
489
+
490
+ def setEvalQA(self, quesId, acc):
491
+ self.evalQA[quesId] = round(100 * acc, self.n)
492
+
493
+ def setEvalQuesType(self, quesId, quesType, acc):
494
+ if quesType not in self.evalQuesType:
495
+ self.evalQuesType[quesType] = {}
496
+ self.evalQuesType[quesType][quesId] = round(100 * acc, self.n)
497
+
498
+ def setEvalAnsType(self, quesId, ansType, acc):
499
+ if ansType not in self.evalAnsType:
500
+ self.evalAnsType[ansType] = {}
501
+ self.evalAnsType[ansType][quesId] = round(100 * acc, self.n)
502
+
503
+ def updateProgress(self, progress):
504
+ barLength = 20
505
+ status = ""
506
+ if isinstance(progress, int):
507
+ progress = float(progress)
508
+ if not isinstance(progress, float):
509
+ progress = 0
510
+ status = "error: progress var must be float\r\n"
511
+ if progress < 0:
512
+ progress = 0
513
+ status = "Halt...\r\n"
514
+ if progress >= 1:
515
+ progress = 1
516
+ status = "Done...\r\n"
517
+ block = int(round(barLength * progress))
518
+ text = "\rFinished Percent: [{0}] {1}% {2}".format(
519
+ "#" * block + "-" * (barLength - block), int(progress * 100), status
520
+ )
521
+ sys.stdout.write(text)
522
+ sys.stdout.flush()
523
+
524
+
525
+ def compute_vqa_accuracy(result_json_path, question_json_path, annotation_json_path):
526
+ """Compute the VQA accuracy metric.
527
+
528
+ Args:
529
+ result_json_path (str): path to the result JSON file with model predictions
530
+ question_json_path (str): path to the VQA questions JSON file
+ annotation_json_path (str): path to the VQA annotations JSON file
531
+
532
+ Returns:
533
+ float: VQA accuracy
534
+ """
535
+ # coding: utf-8
536
+ # dataDir = data_dir
537
+
538
+ # set up file names and paths
539
+ # versionType = 'v2_' # this should be '' when using VQA v2.0 dataset
540
+ # 'OpenEnded' only for v2.0. 'OpenEnded' or 'MultipleChoice' for v1.0
541
+ # taskType = 'OpenEnded'
542
+ # 'mscoco' only for v1.0. 'mscoco' for real and 'abstract_v002' for abstract for v1.0.
543
+ # dataType = 'mscoco'
544
+ # dataSubType = 'train2014'
545
+ # annFile = '%s/%s%s_%s_annotations.json' % (
546
+ # dataDir, versionType, dataType, dataSubType)
547
+ # quesFile = '%s/%s%s_%s_%s_questions.json' % (
548
+ # dataDir, versionType, taskType, dataType, dataSubType)
549
+ # imgDir = '%s/%s/%s/' % (dataDir, dataType, dataSubType)
550
+ # resultType = res_file_name
551
+ # fileTypes = ['results', 'accuracy',
552
+ # 'evalQA', 'evalQuesType', 'evalAnsType']
553
+
554
+ # An example result json file has been provided in './Results' folder.
555
+
556
+ # [resFile, accuracyFile, evalQAFile, evalQuesTypeFile, evalAnsTypeFile] = ['%s/%s%s_%s_%s_%s_%s.json' % (dataDir, versionType, taskType, dataType, dataSubType,
557
+ # resultType, fileType) for fileType in fileTypes]
558
+
559
+ # create vqa object and vqaRes object
560
+ vqa = VQA(annotation_json_path, question_json_path)
561
+ vqaRes = vqa.loadRes(result_json_path, question_json_path)
562
+
563
+ # create vqaEval object by taking vqa and vqaRes
564
+ # n is precision of accuracy (number of places after decimal), default is 2
565
+ vqaEval = VQAEval(vqa, vqaRes, n=2)
566
+
567
+ # evaluate results
568
+ """
569
+ If you have a list of question ids on which you would like to evaluate your results, pass it as a list to below function
570
+ By default it uses all the question ids in annotation file
571
+ """
572
+ vqaEval.evaluate()
573
+
574
+ return vqaEval.accuracy["overall"]
575
+
576
+
577
+ def postprocess_vqa_generation(predictions):
578
+ return re.split("Question|Answer", predictions, maxsplit=1)[0]
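The per-question scoring rule inside `VQAEval.evaluate` above (`acc = min(1, matches / 3)`, averaged over leaving each of the ten human answers out in turn) can be reproduced with a small self-contained sketch; `vqa_accuracy` here is a hypothetical helper for illustration, not part of this module:

```python
# Toy illustration of the VQA accuracy rule used in VQAEval.evaluate:
# score the prediction against each leave-one-annotator-out subset and
# average min(#matching answers / 3, 1) over the ten subsets.
def vqa_accuracy(prediction, human_answers):
    accs = []
    for i in range(len(human_answers)):
        others = human_answers[:i] + human_answers[i + 1:]
        matches = sum(1 for ans in others if ans == prediction)
        accs.append(min(1.0, matches / 3.0))
    return sum(accs) / len(accs)

humans = ["cat"] * 2 + ["dog"] * 8
print(round(vqa_accuracy("cat", humans), 2))  # 0.6  (only 2 of 10 humans said cat)
print(round(vqa_accuracy("dog", humans), 2))  # 1.0  (3+ humans always agree)
```

The `min(·, 1)` cap means an answer given by at least three annotators scores full credit, which is why a prediction matching the majority answer reaches 1.0 even without unanimity.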
minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
1
+ pip
minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/LICENSE-3RD-PARTY.txt ADDED
The diff for this file is too large to render. See raw diff
 
minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/LICENSE.txt ADDED
@@ -0,0 +1,21 @@
1
+ MIT License
2
+
3
+ Copyright (c) Olli-Pekka Heinisuo
4
+
5
+ Permission is hereby granted, free of charge, to any person obtaining a copy
6
+ of this software and associated documentation files (the "Software"), to deal
7
+ in the Software without restriction, including without limitation the rights
8
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
+ copies of the Software, and to permit persons to whom the Software is
10
+ furnished to do so, subject to the following conditions:
11
+
12
+ The above copyright notice and this permission notice shall be included in all
13
+ copies or substantial portions of the Software.
14
+
15
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
+ SOFTWARE.
minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/METADATA ADDED
@@ -0,0 +1,305 @@
1
+ Metadata-Version: 2.1
2
+ Name: opencv-python-headless
3
+ Version: 4.10.0.84
4
+ Summary: Wrapper package for OpenCV python bindings.
5
+ Home-page: https://github.com/opencv/opencv-python
6
+ Maintainer: OpenCV Team
7
+ License: Apache 2.0
8
+ Platform: UNKNOWN
9
+ Classifier: Development Status :: 5 - Production/Stable
10
+ Classifier: Environment :: Console
11
+ Classifier: Intended Audience :: Developers
12
+ Classifier: Intended Audience :: Education
13
+ Classifier: Intended Audience :: Information Technology
14
+ Classifier: Intended Audience :: Science/Research
15
+ Classifier: License :: OSI Approved :: Apache Software License
16
+ Classifier: Operating System :: MacOS
17
+ Classifier: Operating System :: Microsoft :: Windows
18
+ Classifier: Operating System :: POSIX
19
+ Classifier: Operating System :: Unix
20
+ Classifier: Programming Language :: Python
21
+ Classifier: Programming Language :: Python :: 3
22
+ Classifier: Programming Language :: Python :: 3 :: Only
23
+ Classifier: Programming Language :: Python :: 3.6
24
+ Classifier: Programming Language :: Python :: 3.7
25
+ Classifier: Programming Language :: Python :: 3.8
26
+ Classifier: Programming Language :: Python :: 3.9
27
+ Classifier: Programming Language :: Python :: 3.10
28
+ Classifier: Programming Language :: Python :: 3.11
29
+ Classifier: Programming Language :: Python :: 3.12
30
+ Classifier: Programming Language :: C++
31
+ Classifier: Programming Language :: Python :: Implementation :: CPython
32
+ Classifier: Topic :: Scientific/Engineering
33
+ Classifier: Topic :: Scientific/Engineering :: Image Recognition
34
+ Classifier: Topic :: Software Development
35
+ Requires-Python: >=3.6
36
+ Description-Content-Type: text/markdown
37
+ License-File: LICENSE-3RD-PARTY.txt
38
+ License-File: LICENSE.txt
39
+ Requires-Dist: numpy >=1.13.3 ; python_version < "3.7"
40
+ Requires-Dist: numpy >=1.21.0 ; python_version <= "3.9" and platform_system == "Darwin" and platform_machine == "arm64"
41
+ Requires-Dist: numpy >=1.21.2 ; python_version >= "3.10"
42
+ Requires-Dist: numpy >=1.21.4 ; python_version >= "3.10" and platform_system == "Darwin"
43
+ Requires-Dist: numpy >=1.23.5 ; python_version >= "3.11"
44
+ Requires-Dist: numpy >=1.26.0 ; python_version >= "3.12"
45
+ Requires-Dist: numpy >=1.19.3 ; python_version >= "3.6" and platform_system == "Linux" and platform_machine == "aarch64"
46
+ Requires-Dist: numpy >=1.17.0 ; python_version >= "3.7"
47
+ Requires-Dist: numpy >=1.17.3 ; python_version >= "3.8"
48
+ Requires-Dist: numpy >=1.19.3 ; python_version >= "3.9"
49
+
50
+ [![Downloads](https://static.pepy.tech/badge/opencv-python)](http://pepy.tech/project/opencv-python)
51
+
52
+ ### Keep OpenCV Free
53
+
54
+ OpenCV is raising funds to keep the library free for everyone, and we need the support of the entire community to do it. [Donate to OpenCV on Github](https://github.com/sponsors/opencv) to show your support.
55
+
56
+ - [OpenCV on Wheels](#opencv-on-wheels)
57
+ - [Installation and Usage](#installation-and-usage)
58
+ - [Frequently Asked Questions](#frequently-asked-questions)
59
+ - [Documentation for opencv-python](#documentation-for-opencv-python)
60
+ - [CI build process](#ci-build-process)
61
+ - [Manual builds](#manual-builds)
62
+ - [Manual debug builds](#manual-debug-builds)
63
+ - [Source distributions](#source-distributions)
64
+ - [Licensing](#licensing)
65
+ - [Versioning](#versioning)
66
+ - [Releases](#releases)
67
+ - [Development builds](#development-builds)
68
+ - [Manylinux wheels](#manylinux-wheels)
69
+ - [Supported Python versions](#supported-python-versions)
70
+ - [Backward compatibility](#backward-compatibility)
71
+
72
+ ## OpenCV on Wheels
73
+
74
+ Pre-built CPU-only OpenCV packages for Python.
75
+
76
+ Check the manual build section if you wish to compile the bindings from source to enable additional modules such as CUDA.
77
+
78
+ ### Installation and Usage
79
+
80
+ 1. If you have a previous/other manually installed (= not installed via ``pip``) version of OpenCV (e.g. a cv2 module in the root of Python's site-packages), remove it before installation to avoid conflicts.
81
+ 2. Make sure that your `pip` version is up-to-date (19.3 is the minimum supported version): `pip install --upgrade pip`. Check the version with `pip -V`. For example, Linux distributions usually ship with very old `pip` versions, which cause a lot of unexpected problems, especially with the `manylinux` format.
82
+ 3. Select the correct package for your environment:
83
+
84
+ There are four different packages (see options 1, 2, 3 and 4 below) and you should **SELECT ONLY ONE OF THEM**. Do not install multiple different packages in the same environment. There is no plugin architecture: all the packages use the same namespace (`cv2`). If you installed multiple different packages in the same environment, uninstall them all with ``pip uninstall`` and reinstall only one package.
85
+
86
+ **a.** Packages for standard desktop environments (Windows, macOS, almost any GNU/Linux distribution)
87
+
88
+ - Option 1 - Main modules package: ``pip install opencv-python``
89
+ - Option 2 - Full package (contains both main modules and contrib/extra modules): ``pip install opencv-contrib-python`` (check contrib/extra modules listing from [OpenCV documentation](https://docs.opencv.org/master/))
90
+
91
+ **b.** Packages for server (headless) environments (such as Docker, cloud environments etc.), no GUI library dependencies
92
+
93
+ These packages are smaller than the two packages above because they do not contain any GUI functionality (they are not compiled with Qt or other GUI components). This means that the packages avoid a heavy dependency chain to X11 libraries, so you will, for example, get smaller Docker images as a result. You should always use these packages if you do not use `cv2.imshow` et al., or if you are using some package other than OpenCV (such as PyQt) to create your GUI.
94
+
95
+ - Option 3 - Headless main modules package: ``pip install opencv-python-headless``
96
+ - Option 4 - Headless full package (contains both main modules and contrib/extra modules): ``pip install opencv-contrib-python-headless`` (check contrib/extra modules listing from [OpenCV documentation](https://docs.opencv.org/master/))
97
+
98
+ 4. Import the package:
99
+
100
+ ``import cv2``
101
+
102
+ All packages contain Haar cascade files. ``cv2.data.haarcascades`` can be used as a shortcut to the data folder. For example:
103
+
104
+ ``cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")``
105
+
106
+ 5. Read [OpenCV documentation](https://docs.opencv.org/master/)
107
+
108
+ 6. Before opening a new issue, read the FAQ below and have a look at the other issues which are already open.
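The "select only one package" rule from step 3 can be checked mechanically. A hedged, stdlib-only sketch (the set of package names comes from the four options above):

```python
# Hedged sketch: verify that at most ONE of the four mutually exclusive
# OpenCV packages is installed in the current environment (stdlib only).
from importlib.metadata import distributions

OPENCV_PACKAGES = {
    "opencv-python",
    "opencv-contrib-python",
    "opencv-python-headless",
    "opencv-contrib-python-headless",
}

installed = sorted(
    name
    for dist in distributions()
    if (name := dist.metadata["Name"]) in OPENCV_PACKAGES
)
assert len(installed) <= 1, f"Conflicting OpenCV packages: {installed}"
print(installed)
```

If the assertion fails, uninstall all of the listed packages with ``pip uninstall`` and reinstall only one.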
109
+
110
+ Frequently Asked Questions
111
+ --------------------------
112
+
113
+ **Q: Do I need to install also OpenCV separately?**
114
+
115
+ A: No, the packages are special wheel binary packages and they already contain statically built OpenCV binaries.
116
+
117
+ **Q: Pip install fails with ``ModuleNotFoundError: No module named 'skbuild'``?**
118
+
119
+ A: Since ``opencv-python`` version 4.3.0.\*, ``manylinux1`` wheels were replaced by ``manylinux2014`` wheels. If your pip is too old, it will try to use the new source distribution introduced in 4.3.0.38 to build OpenCV manually because it does not know how to install ``manylinux2014`` wheels. However, the source build will also fail, because an old ``pip`` does not understand the build dependencies in ``pyproject.toml``. To use the new ``manylinux2014`` pre-built wheels (or to build from source), your ``pip`` version must be >= 19.3. Please upgrade ``pip`` with ``pip install --upgrade pip``.
120
+
121
+ **Q: Import fails on Windows: ``ImportError: DLL load failed: The specified module could not be found.``?**
122
+
123
+ A: If the import fails on Windows, make sure you have [Visual C++ redistributable 2015](https://www.microsoft.com/en-us/download/details.aspx?id=48145) installed. If you are using an older Windows version than Windows 10 and the latest system updates are not installed, the [Universal C Runtime](https://support.microsoft.com/en-us/help/2999226/update-for-universal-c-runtime-in-windows) might also be required.
124
+
125
+ Windows N and KN editions do not include the Media Feature Pack, which is required by OpenCV. If you are using a Windows N or KN edition, please also install the [Windows Media Feature Pack](https://support.microsoft.com/en-us/help/3145500/media-feature-pack-list-for-windows-n-editions).
126
+
127
+ If you have Windows Server 2012+, media DLLs are probably missing too; please install the feature called "Media Foundation" in the Server Manager. Beware: some posts advise installing the "Windows Server Essentials Media Pack", but it requires the "Windows Server Essentials Experience" role, and this role will deeply affect your Windows Server configuration (by enforcing Active Directory integration etc.); installing just the "Media Foundation" feature should be the safer choice.
128
+
129
+ If the above does not help, check if you are using Anaconda. Old Anaconda versions have a bug which causes the error, see [this issue](https://github.com/opencv/opencv-python/issues/36) for a manual fix.
130
+
131
+ If you still encounter the error after you have checked all the previous solutions, download [Dependencies](https://github.com/lucasg/Dependencies) and open the ``cv2.pyd`` file (usually located at ``C:\Users\username\AppData\Local\Programs\Python\PythonXX\Lib\site-packages\cv2``) with it to debug missing DLL issues.
132
+
133
+ **Q: I have some other import errors?**
134
+
135
+ A: Make sure you have removed old manual installations of the OpenCV Python bindings (cv2.so or cv2.pyd in site-packages).
136
+
137
+ **Q: Function foo() or method bar() returns wrong result, throws exception or crashes interpreter. What should I do?**
138
+
139
+ A: The repository contains only the OpenCV-Python package build scripts, not OpenCV itself. The Python bindings for OpenCV are developed in the official OpenCV repository, which is the best place to report issues. Also, please check the [OpenCV wiki](https://github.com/opencv/opencv/wiki) and [the official OpenCV forum](https://forum.opencv.org/) before filing new bugs.
140
+
141
+ **Q: Why do the packages not include non-free algorithms?**
142
+
143
+ A: Non-free algorithms such as SURF are not included in these packages because they are patented / non-free and therefore cannot be distributed as built binaries. Note that SIFT is included in the builds since OpenCV versions 4.3.0 and 3.4.10 due to patent expiration. See this issue for more info: https://github.com/skvark/opencv-python/issues/126
144
+
145
+ **Q: Why are the package and import names different (opencv-python vs. cv2)?**
146
+
147
+ A: It's easier for users to understand ``opencv-python`` than ``cv2``, and it makes the package easier to find with search engines. `cv2` (the old interface in old OpenCV versions was named `cv`) is the name that the OpenCV developers chose when they created the binding generators. This is kept as the import name to be consistent with the different kinds of tutorials around the internet. Changing the import name or behaviour would also be confusing to experienced users who are accustomed to ``import cv2``.
148
+
149
+ ## Documentation for opencv-python
150
+
151
+ [![Windows Build Status](https://github.com/opencv/opencv-python/actions/workflows/build_wheels_windows.yml/badge.svg)](https://github.com/opencv/opencv-python/actions/workflows/build_wheels_windows.yml)
152
+ [![(Linux Build status)](https://github.com/opencv/opencv-python/actions/workflows/build_wheels_linux.yml/badge.svg)](https://github.com/opencv/opencv-python/actions/workflows/build_wheels_linux.yml)
153
+ [![(Mac OS Build status)](https://github.com/opencv/opencv-python/actions/workflows/build_wheels_macos.yml/badge.svg)](https://github.com/opencv/opencv-python/actions/workflows/build_wheels_macos.yml)
154
+
155
+ The aim of this repository is to provide the means to package each new [OpenCV release](https://github.com/opencv/opencv/releases) for the most used Python versions and platforms.
156
+
157
+ ### CI build process
158
+
159
+ The project is structured like a normal Python package with a standard ``setup.py`` file.
160
+ The build process for a single entry in the build matrices is as follows (see for example `.github/workflows/build_wheels_linux.yml` file):
161
+
162
+ 0. In the Linux and macOS builds: get OpenCV's optional C dependencies that we compile against
163
+
164
+ 1. Checkout repository and submodules
165
+
166
+ - OpenCV is included as a submodule and the version is updated
167
+ manually by maintainers when a new OpenCV release has been made
168
+ - Contrib modules are also included as a submodule
169
+
170
+ 2. Find OpenCV version from the sources
171
+
172
+ 3. Build OpenCV
173
+
174
+ - tests are disabled, otherwise build time increases too much
175
+ - there are 4 build matrix entries for each build combination: with and without contrib modules, with and without GUI (headless)
176
+ - Linux builds run in manylinux Docker containers (based on CentOS 7 for manylinux2014)
177
+ - source distributions are separate entries in the build matrix
178
+
179
+ 4. Rearrange OpenCV's build result, add our custom files and generate wheel
180
+
181
+ 5. Linux and macOS wheels are transformed with auditwheel and delocate, respectively
182
+
183
+ 6. Install the generated wheel
184
+ 7. Test that Python can import the library and run some sanity checks
185
+ 8. Use twine to upload the generated wheel to PyPI (only in release builds)
186
+
187
+ Steps 1–4 are handled by ``pip wheel``.
188
+
189
+ The build can be customized with environment variables. In addition to any variables that OpenCV's build accepts, we recognize:
190
+
191
+ - ``CI_BUILD``. Set to ``1`` to emulate the CI environment build behaviour. Used only in CI builds to force certain build flags on in ``setup.py``. Do not use this unless you know what you are doing.
192
+ - ``ENABLE_CONTRIB`` and ``ENABLE_HEADLESS``. Set to ``1`` to build the contrib and/or headless version
193
+ - ``ENABLE_JAVA``. Set to ``1`` to enable the Java client build. This is disabled by default.
194
+ - ``CMAKE_ARGS``. Additional arguments for OpenCV's CMake invocation. You can use this to make a custom build.
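As a rough illustration of how ``ENABLE_CONTRIB`` and ``ENABLE_HEADLESS`` map to the four package flavors (``package_name`` is a hypothetical helper for this sketch, not part of the build scripts; the real selection logic lives in ``setup.py``):

```python
# Hypothetical helper mirroring how the two flags select the package flavor.
def package_name(enable_contrib: bool, enable_headless: bool) -> str:
    name = "opencv-contrib-python" if enable_contrib else "opencv-python"
    if enable_headless:
        name += "-headless"
    return name

assert package_name(False, False) == "opencv-python"
assert package_name(True, False) == "opencv-contrib-python"
assert package_name(False, True) == "opencv-python-headless"
assert package_name(True, True) == "opencv-contrib-python-headless"
```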
195
+
196
+ See the next section for more info about manual builds outside the CI environment.
197
+
198
+ ### Manual builds
199
+
200
+ If some dependency is not enabled in the pre-built wheels, you can also run the build locally to create a custom wheel.
201
+
202
+ 1. Clone this repository: `git clone --recursive https://github.com/opencv/opencv-python.git`
203
+ 2. ``cd opencv-python``
204
+ - you can use `git` to checkout some other version of OpenCV in the `opencv` and `opencv_contrib` submodules if needed
205
+ 3. Add custom CMake flags if needed, for example: `export CMAKE_ARGS="-DSOME_FLAG=ON -DSOME_OTHER_FLAG=OFF"` (on Windows you need to set environment variables differently depending on whether you use Command Prompt or PowerShell)
206
+ 4. Select the package flavor which you wish to build with `ENABLE_CONTRIB` and `ENABLE_HEADLESS`: i.e. `export ENABLE_CONTRIB=1` if you wish to build `opencv-contrib-python`
207
+ 5. Run ``pip wheel . --verbose``. NOTE: make sure you have the latest ``pip`` version; the ``pip wheel`` command replaces the old ``python setup.py bdist_wheel`` command, which does not support ``pyproject.toml``.
208
+ - this might take anything from 5 minutes to over 2 hours depending on your hardware
209
+ 6. Pip will print the location of the freshly built wheel at the end of the build procedure. If you use the old approach with the `setup.py` file, the wheel package will be placed in the `dist` folder. The package is ready and you can do with it whatever you wish.
210
+ - Optional: on Linux use one of the `manylinux` images as a build host if maximum portability is needed, and run `auditwheel` on the wheel after the build
211
+ - Optional: on macOS use ``delocate`` (same as ``auditwheel`` but for macOS) for better portability
212
+
213
+ #### Manual debug builds
214
+
215
+ In order to build `opencv-python` in an unoptimized debug build, you need to side-step the normal process a bit.
216
+
217
+ 1. Install the packages `scikit-build` and `numpy` via pip.
218
+ 2. Run the command `python setup.py bdist_wheel --build-type=Debug`.
219
+ 3. Install the generated wheel file in the `dist/` folder with `pip install dist/wheelname.whl`.
220
+
221
+ If you would like the build to produce all compiler commands, then the following combination of flags and environment variables has been tested to work on Linux:
222
+ ```
223
+ export CMAKE_ARGS='-DCMAKE_VERBOSE_MAKEFILE=ON'
224
+ export VERBOSE=1
225
+
226
+ python3 setup.py bdist_wheel --build-type=Debug
227
+ ```
228
+
229
+ See this issue for more discussion: https://github.com/opencv/opencv-python/issues/424
230
+
231
+ #### Source distributions
232
+
233
+ Since OpenCV version 4.3.0, source distributions are also provided on PyPI. This means that if your system is not compatible with any of the wheels on PyPI, ``pip`` will attempt to build OpenCV from sources. If you need an OpenCV version which is not available on PyPI as a source distribution, please follow the manual build guidance above instead of this one.
234
+
235
+ You can also force ``pip`` to build the wheels from the source distribution. Some examples:
236
+
237
+ - ``pip install --no-binary opencv-python opencv-python``
238
+ - ``pip install --no-binary :all: opencv-python``
239
+
240
+ If you need the contrib modules or the headless version, just change the package name (step 4 in the previous section is not needed). However, any additional CMake flags can be provided via environment variables as described in step 3 of the manual build section. If none are provided, OpenCV's CMake scripts will attempt to find and enable any suitable dependencies. Headless distributions have hard-coded CMake flags which disable all possible GUI dependencies.
241
+
242
+ On slow systems such as the Raspberry Pi the full build may take several hours. On an 8-core Ryzen 7 3700X the build takes about 6 minutes.
243
+
244
+ ### Licensing
245
+
246
+ The opencv-python package (scripts in this repository) is available under the MIT license.
247
+
248
+ OpenCV itself is available under [Apache 2](https://github.com/opencv/opencv/blob/master/LICENSE) license.
249
+
250
+ Third party package licenses are at [LICENSE-3RD-PARTY.txt](https://github.com/opencv/opencv-python/blob/master/LICENSE-3RD-PARTY.txt).
251
+
252
+ All wheels ship with [FFmpeg](http://ffmpeg.org) licensed under the [LGPLv2.1](http://www.gnu.org/licenses/old-licenses/lgpl-2.1.html).
253
+
254
+ Non-headless Linux wheels ship with [Qt 5](http://doc.qt.io/qt-5/lgpl.html) licensed under the [LGPLv3](http://www.gnu.org/licenses/lgpl-3.0.html).
255
+
256
+ The packages also include other binaries. The full list of licenses can be found in [LICENSE-3RD-PARTY.txt](https://github.com/opencv/opencv-python/blob/master/LICENSE-3RD-PARTY.txt).
257
+
258
+ ### Versioning
259
+
260
+ The ``find_version.py`` script searches for the version information in the OpenCV sources and also appends a revision number specific to this repository to the version string. It saves the version information to the ``version.py`` file under ``cv2``, in addition to some other flags.
261
+
262
+ ### Releases
263
+
264
+ A release is made and uploaded to PyPI when a new tag is pushed to the master branch. These tags differentiate packages (this repo might have modifications but the OpenCV version stays the same) and should be incremented sequentially. In practice, release version numbers look like this:
265
+
266
+ ``cv_major.cv_minor.cv_revision.package_revision`` e.g. ``3.1.0.0``
267
+
268
+ The master branch follows OpenCV master branch releases. The 3.4 branch follows OpenCV 3.4 bugfix releases.
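The ``cv_major.cv_minor.cv_revision.package_revision`` scheme above splits mechanically; ``split_release_version`` is a hypothetical helper for illustration only:

```python
# Hypothetical helper splitting a release version string into its parts,
# following the cv_major.cv_minor.cv_revision.package_revision scheme.
def split_release_version(version: str) -> dict:
    cv_major, cv_minor, cv_revision, package_revision = version.split(".")
    return {
        "opencv_version": f"{cv_major}.{cv_minor}.{cv_revision}",
        "package_revision": int(package_revision),
    }

assert split_release_version("3.1.0.0") == {
    "opencv_version": "3.1.0",
    "package_revision": 0,
}
```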
269
+
270
+ ### Development builds
271
+
272
+ Every commit to the master branch of this repo will be built. Possible build artifacts use local version identifiers:
273
+
274
+ ``cv_major.cv_minor.cv_revision+git_hash_of_this_repo`` e.g. ``3.1.0+14a8d39``
275
+
276
+ These artifacts can't be and will not be uploaded to PyPI.
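The two formats (a four-part release version vs. a development build with a local version identifier) can be told apart with simple patterns; these regexes are illustrative sketches, not the official versioning rules:

```python
import re

# Release: four dot-separated numbers (cv_major.cv_minor.cv_revision.package_revision).
RELEASE = re.compile(r"^\d+\.\d+\.\d+\.\d+$")
# Development build: three numbers plus a "+git_hash" local version identifier.
DEV_BUILD = re.compile(r"^\d+\.\d+\.\d+\+[0-9a-f]+$")

assert RELEASE.match("3.1.0.0")
assert DEV_BUILD.match("3.1.0+14a8d39")
assert not RELEASE.match("3.1.0+14a8d39")
```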
277
+
278
+ ### Manylinux wheels
279
+
280
+ Linux wheels are built using [manylinux2014](https://github.com/pypa/manylinux). These wheels should work out of the box for most distros (which use the GNU C standard library), since they are built against an old version of glibc.
281
+
282
+ The default ``manylinux2014`` images have been extended with some OpenCV dependencies. See [Docker folder](https://github.com/skvark/opencv-python/tree/master/docker) for more info.
283
+
284
+ ### Supported Python versions
285
+
286
+ Python 3.x compatible pre-built wheels are provided for the officially supported Python versions (not end-of-life):
287
+
288
+ - 3.7
289
+ - 3.8
290
+ - 3.9
291
+ - 3.10
292
+ - 3.11
293
+ - 3.12
294
+
295
+ ### Backward compatibility
296
+
297
+ Starting from the 4.2.0 and 3.4.9 builds, the macOS Travis build environment was updated to XCode 9.4. The change effectively dropped support for macOS versions older than 10.13.
298
+
299
+ Starting from the 4.3.0 and 3.4.10 builds, the Linux build environment was updated from `manylinux1` to `manylinux2014`. This dropped support for old Linux distributions.
300
+
301
+ Starting from version 4.7.0, the macOS GitHub Actions build environment was updated to version 11. macOS 10.x support is deprecated. See https://github.com/actions/runner-images/issues/5583
302
+
303
+ Starting from version 4.9.0, the macOS GitHub Actions build environment was updated to version 12. macOS 10.x support is deprecated by Brew and most of the used packages.
304
+
305
+
minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/RECORD ADDED
@@ -0,0 +1,112 @@
1
+ cv2/Error/__init__.pyi,sha256=A6NKtoMeZAvZWHC6DrJiwMVChY7LLxFfvuZ2dW4KSm8,4076
2
+ cv2/LICENSE-3RD-PARTY.txt,sha256=T8PBE9U-ldoPPEM2VaZwZ7cxDlOvMettKA6UzGkno2M,152360
3
+ cv2/LICENSE.txt,sha256=CdcZBY54Kse8cbohyUThE2zeK7lXwOiIEh8CGNa18Cw,1070
4
+ cv2/__init__.py,sha256=k2vZTFpd6_AhL8dRr3nToWNlLz6FAlnfIVnbaqPtitg,6612
5
+ cv2/__init__.pyi,sha256=OhpFobK-D08EJnTFveROVi0u4TwA5_7wuDpCCN4M01k,297966
6
+ cv2/__pycache__/__init__.cpython-310.pyc,,
7
+ cv2/__pycache__/config-3.cpython-310.pyc,,
8
+ cv2/__pycache__/config.cpython-310.pyc,,
9
+ cv2/__pycache__/load_config_py2.cpython-310.pyc,,
10
+ cv2/__pycache__/load_config_py3.cpython-310.pyc,,
11
+ cv2/__pycache__/version.cpython-310.pyc,,
12
+ cv2/aruco/__init__.pyi,sha256=XOaNz4SbfQ0UFH8guZ9WgTybx8gekTOWr8452Yjz54E,13995
13
+ cv2/barcode/__init__.pyi,sha256=19t0bbiTB8nxuT0DyqcTwEWGBynXm6NkaZg646flAL0,1441
14
+ cv2/config-3.py,sha256=mnqt9yS4IgAfXpY7Af1ON11F4su-Mo0sp7QqRAwIOhw,724
15
+ cv2/config.py,sha256=l04tQJbuGpqaNB3xvzPhaXNoO_GsczAG3if_LyO8WE0,111
16
+ cv2/cuda/__init__.pyi,sha256=gNkBAoEdrvkxwo4brAXNBCU_RDWixz575CWi2YEvYK4,16036
17
+ cv2/cv2.abi3.so,sha256=csf-mziezZVJtunNyjXuEZQN4V0T1GciuEEoZcr6vHg,65651673
18
+ cv2/data/__init__.py,sha256=125Pcte_OtB55ZxjWg5ko8ugpnogZ1sRMyP48dtBCMw,70
19
+ cv2/data/__pycache__/__init__.cpython-310.pyc,,
20
+ cv2/data/haarcascade_eye.xml,sha256=ccxk_DBaNV3GAGeID2-71D3RVb1j7jhEZhob2jSy_Yw,341406
21
+ cv2/data/haarcascade_eye_tree_eyeglasses.xml,sha256=4y-cZ5NcM-nRMx6xT6WFVP8Xg1wDdCZjvLl6iS6Talc,601661
22
+ cv2/data/haarcascade_frontalcatface.xml,sha256=rCusk07yQoTviisunY5X7vhKwdaUO00R5cnoWE3Aacg,411388
23
+ cv2/data/haarcascade_frontalcatface_extended.xml,sha256=_9DR0o8H0DdsidtMmEUAnChVzHbIz_dj1TMdyTYdqFQ,382918
24
+ cv2/data/haarcascade_frontalface_alt.xml,sha256=YoHfE0Wcwhj_BH0Csq44WbEv8UqT_-iVL3sz-te5aXs,676709
25
+ cv2/data/haarcascade_frontalface_alt2.xml,sha256=ewyWfZq7373gJeuceGlH0VG2QmBA0HqPlWLtj9kHJLQ,540616
26
+ cv2/data/haarcascade_frontalface_alt_tree.xml,sha256=Dl7kfswTJp1U3XpV-LU3UhZ8Ulh3IId3MjiPsHigSAo,2689040
27
+ cv2/data/haarcascade_frontalface_default.xml,sha256=D31FJ4ROtRTUpJSOgi2pD7sWo0oLu7xq3GSYdHpar7A,930127
28
+ cv2/data/haarcascade_fullbody.xml,sha256=BBdFxx7vG1yGrvIk8XznWwQtMzFMyPZ1dCT4vYzTCqE,476827
29
+ cv2/data/haarcascade_lefteye_2splits.xml,sha256=dMMjx4yBR1_JFY-sv7hmuwzKBr5B9XHfR9SsjQH5zkw,195369
30
+ cv2/data/haarcascade_license_plate_rus_16stages.xml,sha256=TRxEv3obxOIE-iWwRu0Kz_1_cTzBP-KVi2l3Elxg3eo,47775
31
+ cv2/data/haarcascade_lowerbody.xml,sha256=HmluHHxmxDmuIpz_-IcfQgN8NX6eHgkKK1nrwfj_XLs,395322
32
+ cv2/data/haarcascade_profileface.xml,sha256=s5pKO-RVOdsUan_B0-dhopLBluuIQhGF5qYVswVeYS0,828514
33
+ cv2/data/haarcascade_righteye_2splits.xml,sha256=TPDXK-pzB-mvfrmdSsvhXXEBpnwi_Nz77v1pKtN893Y,196170
34
+ cv2/data/haarcascade_russian_plate_number.xml,sha256=gUy1lUaCr1cOWDYfnl-LW1E6QRJ3a7nsrO-fDkymwtc,75482
35
+ cv2/data/haarcascade_smile.xml,sha256=TKHzBOq9C1rjAYDIGstT4Walhn5b4Xsxa9PzLP34fYo,188506
36
+ cv2/data/haarcascade_upperbody.xml,sha256=cyirT9sVkvU9mNfqWxudkOAa9dlfISrzeMfrV5BIu18,785819
37
+ cv2/detail/__init__.pyi,sha256=FXndW6oxsE46hjgKBezLvqJ_iEAcOCmNOAZSpbSM_-8,22374
38
+ cv2/dnn/__init__.pyi,sha256=v_SSO59MvE3Ys1To0zcO0QpJVK9XANaJf8JUxgjtjqI,22811
39
+ cv2/fisheye/__init__.pyi,sha256=Nbxh4ounDQfzsAxkM_hJAPp7zxiIO9ZNqke0JjFG3hs,8520
40
+ cv2/flann/__init__.pyi,sha256=ZxYG07bhFyFRA2d1lbPmAm_KEknsTcE1_NNw_Ksz1HQ,2677
41
+ cv2/gapi/__init__.py,sha256=6WBAjfq1FCiRADgYXGAKITHdBB6t0_jZ8hkTU8Biz-M,10298
42
+ cv2/gapi/__init__.pyi,sha256=zCLTsHvmbiGmlDUXPWqOGdgFcj66_iw7FXiTr4Y91m0,14636
43
+ cv2/gapi/__pycache__/__init__.cpython-310.pyc,,
44
+ cv2/gapi/core/__init__.pyi,sha256=_3OM_ITOrZomn7gs4HM-DRk8ngbjWkdr26KrmH3t4ks,142
45
+ cv2/gapi/core/cpu/__init__.pyi,sha256=MfRTDEPtcQekGnrvoaSSadxyylXPfa2lz8ucAkzjmh8,93
46
+ cv2/gapi/core/fluid/__init__.pyi,sha256=MfRTDEPtcQekGnrvoaSSadxyylXPfa2lz8ucAkzjmh8,93
47
+ cv2/gapi/core/ocl/__init__.pyi,sha256=MfRTDEPtcQekGnrvoaSSadxyylXPfa2lz8ucAkzjmh8,93
48
+ cv2/gapi/ie/__init__.pyi,sha256=rbOXOU39Wpt9Lhh1o1qr7Zj7qljqAu6aqoYsm4433yQ,1117
49
+ cv2/gapi/ie/detail/__init__.pyi,sha256=hGTS3yIiIq1B-djXgSQIPmeF7VDyeyucUuZOnd4O0OQ,269
50
+ cv2/gapi/imgproc/__init__.pyi,sha256=UUtPJcDK_UaE_TKN8K9Oz1TEChCQHDDB_eTI08mVXmU,71
51
+ cv2/gapi/imgproc/fluid/__init__.pyi,sha256=MfRTDEPtcQekGnrvoaSSadxyylXPfa2lz8ucAkzjmh8,93
52
+ cv2/gapi/oak/__init__.pyi,sha256=Tb7YXytKxnBFZZ8qTqHSZsDEpRt2937NXtbOQK23Ksc,1734
53
+ cv2/gapi/onnx/__init__.pyi,sha256=XAQ4M2p7kcm0gSL_2OJkjoI8h5AzlHQh6xDQEX7z5e4,1344
54
+ cv2/gapi/onnx/ep/__init__.pyi,sha256=dUYUbcjIjWtx7peQLPKU60qUzMqEH8On9mU4lsdXbmQ,1357
55
+ cv2/gapi/ot/__init__.pyi,sha256=XTMT90lnElxl_KfhFi5xDwQWvB0g5N8tf7Cgb8VHcAY,720
56
+ cv2/gapi/ot/cpu/__init__.pyi,sha256=MfRTDEPtcQekGnrvoaSSadxyylXPfa2lz8ucAkzjmh8,93
57
+ cv2/gapi/ov/__init__.pyi,sha256=3BqKzC_lV-wzhwu2cawCBvGbMG_zxt5D6anjhORXvuM,2647
58
+ cv2/gapi/own/__init__.pyi,sha256=GzL91pOQQNsGcBGmZ_XDAXaLoF4N9qVgj_IaYzduSNc,69
59
+ cv2/gapi/own/detail/__init__.pyi,sha256=sTC8JFcjDcVxnaFfFc-VmuxjHBg6RMzfafFHtS8yrFU,140
60
+ cv2/gapi/render/__init__.pyi,sha256=S4FWzy_CJqqs3dPYl3bXJoLQSGeVZdoBK7EmHvbPVOM,66
61
+ cv2/gapi/render/ocv/__init__.pyi,sha256=MfRTDEPtcQekGnrvoaSSadxyylXPfa2lz8ucAkzjmh8,93
62
+ cv2/gapi/streaming/__init__.pyi,sha256=qIOndKlPMevrSglTW-vVugzy_n7nITT6lr_zrlUv9cI,813
63
+ cv2/gapi/video/__init__.pyi,sha256=V0Emspufw7x2-knfd7kE8LnLjY_ujIz_TaxR_oIyAps,150
64
+ cv2/gapi/wip/__init__.pyi,sha256=f7mz60ehM9yrK0_Vt28NP--WietDE65EjM5O91LVx5M,1086
65
+ cv2/gapi/wip/draw/__init__.pyi,sha256=x2BhywI5C-uMHF1H6L9AwrgjRtKHFr032TOnqtE9a9Q,3162
66
+ cv2/gapi/wip/gst/__init__.pyi,sha256=8VtSKP9duTmY7ETAACwzVEWP9xdDW0pW82UtL_8Z7Aw,467
67
+ cv2/gapi/wip/onevpl/__init__.pyi,sha256=eLbVPey7JCU5YdRSUH6lLlD1eT-1s7YqZrQh6xNdIlo,397
68
+ cv2/ipp/__init__.pyi,sha256=WSHVIqIT97vmudtuJjhOJYiZ0iBdYx4AtB0iJqtdD0o,223
69
+ cv2/load_config_py2.py,sha256=xP_h2pObzfbN8tONV7CAQmGh94fQ-0t0HysrXDDlt_Q,151
70
+ cv2/load_config_py3.py,sha256=A9wfETdKZnybfbEN1SdtZAsMLVsueGa0zO93JzK9OFI,262
71
+ cv2/mat_wrapper/__init__.py,sha256=i2JwY6kmDL_s7YXzIl-JZuWCMVYkRi4F6j60W3j4P9A,1124
72
+ cv2/mat_wrapper/__pycache__/__init__.cpython-310.pyc,,
73
+ cv2/misc/__init__.py,sha256=yr9PkxKslxRc87hhtIJRn5RommP9jaqksYr-ZDuj7cU,37
74
+ cv2/misc/__pycache__/__init__.cpython-310.pyc,,
75
+ cv2/misc/__pycache__/version.cpython-310.pyc,,
76
+ cv2/misc/version.py,sha256=iTExq1jwGgAv3jtYQHRI8pSpmfzPsjkG9brsH0bdYhk,90
77
+ cv2/ml/__init__.pyi,sha256=KGiSrNBU8YWqJzhV3owS_b_nKl_40EXwdGrmC1e41J4,22803
78
+ cv2/ocl/__init__.pyi,sha256=qv_ilpHZosfPEMHEEqqQLe6cJpsb9PiiwIZMbd---ho,5527
79
+ cv2/ogl/__init__.pyi,sha256=KxTX9DHYyXg2ipvOJiFeAsRivAjmvBkqeiLZV-0snII,1472
80
+ cv2/parallel/__init__.pyi,sha256=tc5nNoWrTkD7VAfhbajumKF79LBolpqlKjYX-lY2__8,129
81
+ cv2/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
82
+ cv2/samples/__init__.pyi,sha256=cjSW5vo2oMpIWHwP-3IY4hWjlKUTz8gd1MX7pLOCWKo,324
83
+ cv2/segmentation/__init__.pyi,sha256=jwKBUCRaXhHAM3FdzpLuGucGfNLWxWu5CDfLOpkcan4,1739
84
+ cv2/typing/__init__.py,sha256=lXV-dc4P2hCjZ63ZVA6Jwy_Cn34EfNHuhKAGJpgenlk,5256
85
+ cv2/typing/__pycache__/__init__.cpython-310.pyc,,
86
+ cv2/utils/__init__.py,sha256=fuw4GHHOXsxxKc-AadAEOKQq_I1Gr4G3yMlRvAbTP30,330
87
+ cv2/utils/__init__.pyi,sha256=q7PpnVUH597R_sF7AGrsRVDOIGKflT0b77ll-mkmb7g,3592
88
+ cv2/utils/__pycache__/__init__.cpython-310.pyc,,
89
+ cv2/utils/fs/__init__.pyi,sha256=lu2cK1Dbd7wRTOTju_kVVCvU4mNB5v5hSVpBxSXXvJg,87
90
+ cv2/utils/nested/__init__.pyi,sha256=n2J3aSxC2MrPKaKb4igY_d49luuuQqW7A_YTx6eZz9Q,573
91
+ cv2/version.py,sha256=_JDvSGXdI1Xyzdw247WfPCFhF2-HUjpfEGLtwbkI6bA,92
92
+ cv2/videoio_registry/__init__.pyi,sha256=h-7AlM3cFG5xxcPwZiVQ3n3ibe7BpGPlhgDcWOqZPA4,783
93
+ opencv_python_headless-4.10.0.84.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
94
+ opencv_python_headless-4.10.0.84.dist-info/LICENSE-3RD-PARTY.txt,sha256=T8PBE9U-ldoPPEM2VaZwZ7cxDlOvMettKA6UzGkno2M,152360
95
+ opencv_python_headless-4.10.0.84.dist-info/LICENSE.txt,sha256=CdcZBY54Kse8cbohyUThE2zeK7lXwOiIEh8CGNa18Cw,1070
96
+ opencv_python_headless-4.10.0.84.dist-info/METADATA,sha256=Qz3309JaFw44BrbKrzV2TjzgzY890rmgHV3rTICX55k,20268
97
+ opencv_python_headless-4.10.0.84.dist-info/RECORD,,
98
+ opencv_python_headless-4.10.0.84.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
99
+ opencv_python_headless-4.10.0.84.dist-info/WHEEL,sha256=EAtJEfUIJ_UiIhsbDQlddQslMIo1TSnut_vtG8YV5KA,142
100
+ opencv_python_headless-4.10.0.84.dist-info/top_level.txt,sha256=SY8vrf_sYOg99OP9euhz7q36pPy_2VK5vbeEWXwwSoc,4
101
+ opencv_python_headless.libs/libavcodec-9aae324f.so.59.37.100,sha256=gqg2Ki-6C7bWuPVlxHLyrjD6-g9oNsdwGg6kZB2tBhY,13448513
102
+ opencv_python_headless.libs/libavformat-3ff1be5b.so.59.27.100,sha256=_A2syd44-eJSf4nnEmfX337E9XT5WstE8IOb2bfs8Gg,2571489
103
+ opencv_python_headless.libs/libavutil-a0a0531e.so.57.28.100,sha256=_HhiKqfwZH7fZ95HlYWD9p3ANOucUPLvqFPHvhxTq6Y,844673
104
+ opencv_python_headless.libs/libcrypto-337dac8b.so.1.1,sha256=0opzjndvX1wXs1d7FrbfwJMTIGBhJ2nQPPQEjroQt6o,3481345
105
+ opencv_python_headless.libs/libgfortran-91cc3cb1.so.3.0.0,sha256=VePrZzBsL_F-b4oIEOqg3LJulM2DkkxQZdUEDoeBRgg,1259665
106
+ opencv_python_headless.libs/libopenblas-r0-f650aae0.3.3.so,sha256=eewCtT9XPNcRaonwTDl0cwGOf9oFcgs1TUNQXBnUeVg,37325001
107
+ opencv_python_headless.libs/libpng16-1bde1c40.so.16.43.0,sha256=02j5YLlUW3rzjlXdakRnHd852_9hWJ6dbvZ-Kwoex2Y,1105201
108
+ opencv_python_headless.libs/libquadmath-96973f99.so.0.0.0,sha256=k0wi3tDn0WnE1GeIdslgUa3z2UVF2pYvYLQWWbB12js,247609
109
+ opencv_python_headless.libs/libssl-28bef1ac.so.1.1,sha256=ztxM3ZFLkgmYMbZoTqNGqj_ycgrn64a6Wa9Ni66AWmU,736177
110
+ opencv_python_headless.libs/libswresample-2ec4394e.so.4.7.100,sha256=53S-M_Gn06zoAaUbYkdaMuLvXEWu2Mv1_YLkiW2oJ9I,132417
111
+ opencv_python_headless.libs/libswscale-2c3c8be7.so.6.7.100,sha256=Lp2HzwvDYmIHUUay0z4VqLo5jICmVQr3Z4uD1C1IXVA,619945
112
+ opencv_python_headless.libs/libvpx-c3a7933e.so.9.0.0,sha256=IGHYF4IPzg_AB5f9LeyGhur0ZGy4xgi_j_cJBUbdVF8,3508265
minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/REQUESTED ADDED
File without changes
minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/WHEEL ADDED
@@ -0,0 +1,6 @@
1
+ Wheel-Version: 1.0
2
+ Generator: skbuild 0.17.6
3
+ Root-Is-Purelib: false
4
+ Tag: cp37-abi3-manylinux_2_17_x86_64
5
+ Tag: cp37-abi3-manylinux2014_x86_64
6
+
minigpt2/lib/python3.10/site-packages/opencv_python_headless-4.10.0.84.dist-info/top_level.txt ADDED
@@ -0,0 +1 @@
1
+ cv2
minigpt2/lib/python3.10/site-packages/orjson-3.10.14.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
1
+ pip
minigpt2/lib/python3.10/site-packages/orjson-3.10.14.dist-info/METADATA ADDED
@@ -0,0 +1,1141 @@
Metadata-Version: 2.4
Name: orjson
Version: 3.10.14
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: MacOS
Classifier: Operating System :: Microsoft :: Windows
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python
Classifier: Programming Language :: Rust
Classifier: Typing :: Typed
License-File: LICENSE-APACHE
License-File: LICENSE-MIT
Summary: Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy
Keywords: fast,json,dataclass,dataclasses,datetime,rfc,8259,3339
Home-Page: https://github.com/ijl/orjson
Author: ijl <ijl@mailbox.org>
Author-email: ijl <ijl@mailbox.org>
License: Apache-2.0 OR MIT
Requires-Python: >=3.8
Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
Project-URL: Documentation, https://github.com/ijl/orjson
Project-URL: Changelog, https://github.com/ijl/orjson/blob/master/CHANGELOG.md

# orjson

orjson is a fast, correct JSON library for Python. It
[benchmarks](https://github.com/ijl/orjson?tab=readme-ov-file#performance) as the fastest Python
library for JSON and is more correct than the standard json library or other
third-party libraries. It serializes
[dataclass](https://github.com/ijl/orjson?tab=readme-ov-file#dataclass),
[datetime](https://github.com/ijl/orjson?tab=readme-ov-file#datetime),
[numpy](https://github.com/ijl/orjson?tab=readme-ov-file#numpy), and
[UUID](https://github.com/ijl/orjson?tab=readme-ov-file#uuid) instances natively.

[orjson.dumps()](https://github.com/ijl/orjson?tab=readme-ov-file#serialize) is
something like 10x as fast as `json`, serializes
common types and subtypes, has a `default` parameter for the caller to specify
how to serialize arbitrary types, and has a number of flags controlling output.

[orjson.loads()](https://github.com/ijl/orjson?tab=readme-ov-file#deserialize)
is something like 2x as fast as `json`, and is strictly compliant with UTF-8 and
RFC 8259 ("The JavaScript Object Notation (JSON) Data Interchange Format").

Reading from and writing to files, line-delimited JSON files, and so on is
not provided by the library.

orjson supports CPython 3.8, 3.9, 3.10, 3.11, 3.12, 3.13, and 3.14.

It distributes amd64/x86_64, i686/x86, aarch64/armv8, arm7, POWER/ppc64le,
and s390x wheels for Linux, amd64 and aarch64 wheels for macOS, and amd64
and i686/x86 wheels for Windows.

orjson does not and will not support PyPy, embedded Python builds for
Android/iOS, or PEP 554 subinterpreters.

Releases follow semantic versioning and serializing a new object type
without an opt-in flag is considered a breaking change.

orjson is licensed under both the Apache 2.0 and MIT licenses. The
repository and issue tracker are
[github.com/ijl/orjson](https://github.com/ijl/orjson), and patches may be
submitted there. There is a
[CHANGELOG](https://github.com/ijl/orjson/blob/master/CHANGELOG.md)
available in the repository.

1. [Usage](https://github.com/ijl/orjson?tab=readme-ov-file#usage)
    1. [Install](https://github.com/ijl/orjson?tab=readme-ov-file#install)
    2. [Quickstart](https://github.com/ijl/orjson?tab=readme-ov-file#quickstart)
    3. [Migrating](https://github.com/ijl/orjson?tab=readme-ov-file#migrating)
    4. [Serialize](https://github.com/ijl/orjson?tab=readme-ov-file#serialize)
        1. [default](https://github.com/ijl/orjson?tab=readme-ov-file#default)
        2. [option](https://github.com/ijl/orjson?tab=readme-ov-file#option)
        3. [Fragment](https://github.com/ijl/orjson?tab=readme-ov-file#fragment)
    5. [Deserialize](https://github.com/ijl/orjson?tab=readme-ov-file#deserialize)
2. [Types](https://github.com/ijl/orjson?tab=readme-ov-file#types)
    1. [dataclass](https://github.com/ijl/orjson?tab=readme-ov-file#dataclass)
    2. [datetime](https://github.com/ijl/orjson?tab=readme-ov-file#datetime)
    3. [enum](https://github.com/ijl/orjson?tab=readme-ov-file#enum)
    4. [float](https://github.com/ijl/orjson?tab=readme-ov-file#float)
    5. [int](https://github.com/ijl/orjson?tab=readme-ov-file#int)
    6. [numpy](https://github.com/ijl/orjson?tab=readme-ov-file#numpy)
    7. [str](https://github.com/ijl/orjson?tab=readme-ov-file#str)
    8. [uuid](https://github.com/ijl/orjson?tab=readme-ov-file#uuid)
3. [Testing](https://github.com/ijl/orjson?tab=readme-ov-file#testing)
4. [Performance](https://github.com/ijl/orjson?tab=readme-ov-file#performance)
    1. [Latency](https://github.com/ijl/orjson?tab=readme-ov-file#latency)
    2. [Reproducing](https://github.com/ijl/orjson?tab=readme-ov-file#reproducing)
5. [Questions](https://github.com/ijl/orjson?tab=readme-ov-file#questions)
6. [Packaging](https://github.com/ijl/orjson?tab=readme-ov-file#packaging)
7. [License](https://github.com/ijl/orjson?tab=readme-ov-file#license)

## Usage

### Install

To install a wheel from PyPI, install the `orjson` package.

In `requirements.in` or `requirements.txt` format, specify:

```txt
orjson >= 3.10,<4
```

In `pyproject.toml` format, specify:

```toml
orjson = "^3.10"
```

To build a wheel, see [packaging](https://github.com/ijl/orjson?tab=readme-ov-file#packaging).

### Quickstart

This is an example of serializing, with options specified, and deserializing:

```python
>>> import orjson, datetime, numpy
>>> data = {
    "type": "job",
    "created_at": datetime.datetime(1970, 1, 1),
    "status": "🆗",
    "payload": numpy.array([[1, 2], [3, 4]]),
}
>>> orjson.dumps(data, option=orjson.OPT_NAIVE_UTC | orjson.OPT_SERIALIZE_NUMPY)
b'{"type":"job","created_at":"1970-01-01T00:00:00+00:00","status":"\xf0\x9f\x86\x97","payload":[[1,2],[3,4]]}'
>>> orjson.loads(_)
{'type': 'job', 'created_at': '1970-01-01T00:00:00+00:00', 'status': '🆗', 'payload': [[1, 2], [3, 4]]}
```

### Migrating

orjson version 3 serializes more types than version 2. Subclasses of `str`,
`int`, `dict`, and `list` are now serialized. This is faster and more similar
to the standard library. It can be disabled with
`orjson.OPT_PASSTHROUGH_SUBCLASS`. `dataclasses.dataclass` instances
are now serialized by default and cannot be customized in a
`default` function unless `option=orjson.OPT_PASSTHROUGH_DATACLASS` is
specified. `uuid.UUID` instances are serialized by default.
For any type that is now serialized,
implementations in a `default` function and options enabling them can be
removed but do not need to be. There was no change in deserialization.

To migrate from the standard library, the largest difference is that
`orjson.dumps` returns `bytes` and `json.dumps` returns a `str`.

Users with `dict` objects using non-`str` keys should specify `option=orjson.OPT_NON_STR_KEYS`.

`sort_keys` is replaced by `option=orjson.OPT_SORT_KEYS`.

`indent` is replaced by `option=orjson.OPT_INDENT_2` and other levels of indentation are not
supported.

`ensure_ascii` is probably not relevant today and UTF-8 characters cannot be
escaped to ASCII.
167
+
168
+ ### Serialize
169
+
170
+ ```python
171
+ def dumps(
172
+ __obj: Any,
173
+ default: Optional[Callable[[Any], Any]] = ...,
174
+ option: Optional[int] = ...,
175
+ ) -> bytes: ...
176
+ ```
177
+
178
+ `dumps()` serializes Python objects to JSON.
179
+
180
+ It natively serializes
181
+ `str`, `dict`, `list`, `tuple`, `int`, `float`, `bool`, `None`,
182
+ `dataclasses.dataclass`, `typing.TypedDict`, `datetime.datetime`,
183
+ `datetime.date`, `datetime.time`, `uuid.UUID`, `numpy.ndarray`, and
184
+ `orjson.Fragment` instances. It supports arbitrary types through `default`. It
185
+ serializes subclasses of `str`, `int`, `dict`, `list`,
186
+ `dataclasses.dataclass`, and `enum.Enum`. It does not serialize subclasses
187
+ of `tuple` to avoid serializing `namedtuple` objects as arrays. To avoid
188
+ serializing subclasses, specify the option `orjson.OPT_PASSTHROUGH_SUBCLASS`.
189
+
190
+ The output is a `bytes` object containing UTF-8.
191
+
192
+ The global interpreter lock (GIL) is held for the duration of the call.
193
+
194
+ It raises `JSONEncodeError` on an unsupported type. This exception message
195
+ describes the invalid object with the error message
196
+ `Type is not JSON serializable: ...`. To fix this, specify
197
+ [default](https://github.com/ijl/orjson?tab=readme-ov-file#default).
198
+
199
+ It raises `JSONEncodeError` on a `str` that contains invalid UTF-8.
200
+
201
+ It raises `JSONEncodeError` on an integer that exceeds 64 bits by default or,
202
+ with `OPT_STRICT_INTEGER`, 53 bits.
203
+
204
+ It raises `JSONEncodeError` if a `dict` has a key of a type other than `str`,
205
+ unless `OPT_NON_STR_KEYS` is specified.
206
+
207
+ It raises `JSONEncodeError` if the output of `default` recurses to handling by
208
+ `default` more than 254 levels deep.
209
+
210
+ It raises `JSONEncodeError` on circular references.
211
+
212
+ It raises `JSONEncodeError` if a `tzinfo` on a datetime object is
213
+ unsupported.
214
+
215
+ `JSONEncodeError` is a subclass of `TypeError`. This is for compatibility
216
+ with the standard library.
217
+
218
+ If the failure was caused by an exception in `default` then
219
+ `JSONEncodeError` chains the original exception as `__cause__`.
220
+
221
+ #### default
222
+
223
+ To serialize a subclass or arbitrary types, specify `default` as a
224
+ callable that returns a supported type. `default` may be a function,
225
+ lambda, or callable class instance. To specify that a type was not
226
+ handled by `default`, raise an exception such as `TypeError`.
227
+
228
+ ```python
229
+ >>> import orjson, decimal
230
+ >>>
231
+ def default(obj):
232
+ if isinstance(obj, decimal.Decimal):
233
+ return str(obj)
234
+ raise TypeError
235
+
236
+ >>> orjson.dumps(decimal.Decimal("0.0842389659712649442845"))
237
+ JSONEncodeError: Type is not JSON serializable: decimal.Decimal
238
+ >>> orjson.dumps(decimal.Decimal("0.0842389659712649442845"), default=default)
239
+ b'"0.0842389659712649442845"'
240
+ >>> orjson.dumps({1, 2}, default=default)
241
+ orjson.JSONEncodeError: Type is not JSON serializable: set
242
+ ```
243
+
244
+ The `default` callable may return an object that itself
245
+ must be handled by `default` up to 254 times before an exception
246
+ is raised.
247
+
248
+ It is important that `default` raise an exception if a type cannot be handled.
249
+ Python otherwise implicitly returns `None`, which appears to the caller
250
+ like a legitimate value and is serialized:
251
+
252
+ ```python
253
+ >>> import orjson, json
254
+ >>>
255
+ def default(obj):
256
+ if isinstance(obj, decimal.Decimal):
257
+ return str(obj)
258
+
259
+ >>> orjson.dumps({"set":{1, 2}}, default=default)
260
+ b'{"set":null}'
261
+ >>> json.dumps({"set":{1, 2}}, default=default)
262
+ '{"set":null}'
263
+ ```
264
+
265
+ #### option
266
+
267
+ To modify how data is serialized, specify `option`. Each `option` is an integer
268
+ constant in `orjson`. To specify multiple options, mask them together, e.g.,
269
+ `option=orjson.OPT_STRICT_INTEGER | orjson.OPT_NAIVE_UTC`.
270
+
271
+ ##### OPT_APPEND_NEWLINE
272
+
273
+ Append `\n` to the output. This is a convenience and optimization for the
274
+ pattern of `dumps(...) + "\n"`. `bytes` objects are immutable and this
275
+ pattern copies the original contents.
276
+
277
+ ```python
278
+ >>> import orjson
279
+ >>> orjson.dumps([])
280
+ b"[]"
281
+ >>> orjson.dumps([], option=orjson.OPT_APPEND_NEWLINE)
282
+ b"[]\n"
283
+ ```
284
+
285
+ ##### OPT_INDENT_2
286
+
287
+ Pretty-print output with an indent of two spaces. This is equivalent to
288
+ `indent=2` in the standard library. Pretty printing is slower and the output
289
+ larger. orjson is the fastest compared library at pretty printing and has
290
+ much less of a slowdown to pretty print than the standard library does. This
291
+ option is compatible with all other options.
292
+
293
+ ```python
294
+ >>> import orjson
295
+ >>> orjson.dumps({"a": "b", "c": {"d": True}, "e": [1, 2]})
296
+ b'{"a":"b","c":{"d":true},"e":[1,2]}'
297
+ >>> orjson.dumps(
298
+ {"a": "b", "c": {"d": True}, "e": [1, 2]},
299
+ option=orjson.OPT_INDENT_2
300
+ )
301
+ b'{\n "a": "b",\n "c": {\n "d": true\n },\n "e": [\n 1,\n 2\n ]\n}'
302
+ ```
303
+
304
+ If displayed, the indentation and linebreaks appear like this:
305
+
306
+ ```json
307
+ {
308
+ "a": "b",
309
+ "c": {
310
+ "d": true
311
+ },
312
+ "e": [
313
+ 1,
314
+ 2
315
+ ]
316
+ }
317
+ ```
318
+
319
+ This measures serializing the github.json fixture as compact (52KiB) or
320
+ pretty (64KiB):
321
+
322
+ | Library | compact (ms) | pretty (ms) | vs. orjson |
323
+ |-----------|----------------|---------------|--------------|
324
+ | orjson | 0.01 | 0.02 | 1 |
325
+ | json | 0.13 | 0.54 | 34 |
326
+
327
+ This measures serializing the citm_catalog.json fixture, more of a worst
328
+ case due to the amount of nesting and newlines, as compact (489KiB) or
329
+ pretty (1.1MiB):
330
+
331
+ | Library | compact (ms) | pretty (ms) | vs. orjson |
332
+ |-----------|----------------|---------------|--------------|
333
+ | orjson | 0.25 | 0.45 | 1 |
334
+ | json | 3.01 | 24.42 | 54.4 |
335
+
336
+ This can be reproduced using the `pyindent` script.
337
+
338
+ ##### OPT_NAIVE_UTC
339
+
340
+ Serialize `datetime.datetime` objects without a `tzinfo` as UTC. This
341
+ has no effect on `datetime.datetime` objects that have `tzinfo` set.
342
+
343
+ ```python
344
+ >>> import orjson, datetime
345
+ >>> orjson.dumps(
346
+ datetime.datetime(1970, 1, 1, 0, 0, 0),
347
+ )
348
+ b'"1970-01-01T00:00:00"'
349
+ >>> orjson.dumps(
350
+ datetime.datetime(1970, 1, 1, 0, 0, 0),
351
+ option=orjson.OPT_NAIVE_UTC,
352
+ )
353
+ b'"1970-01-01T00:00:00+00:00"'
354
+ ```
355
+
356
+ ##### OPT_NON_STR_KEYS
357
+
358
+ Serialize `dict` keys of type other than `str`. This allows `dict` keys
359
+ to be one of `str`, `int`, `float`, `bool`, `None`, `datetime.datetime`,
360
+ `datetime.date`, `datetime.time`, `enum.Enum`, and `uuid.UUID`. For comparison,
361
+ the standard library serializes `str`, `int`, `float`, `bool` or `None` by
362
+ default. orjson benchmarks as being faster at serializing non-`str` keys
363
+ than other libraries. This option is slower for `str` keys than the default.
364
+
365
+ ```python
366
+ >>> import orjson, datetime, uuid
367
+ >>> orjson.dumps(
368
+ {uuid.UUID("7202d115-7ff3-4c81-a7c1-2a1f067b1ece"): [1, 2, 3]},
369
+ option=orjson.OPT_NON_STR_KEYS,
370
+ )
371
+ b'{"7202d115-7ff3-4c81-a7c1-2a1f067b1ece":[1,2,3]}'
372
+ >>> orjson.dumps(
373
+ {datetime.datetime(1970, 1, 1, 0, 0, 0): [1, 2, 3]},
374
+ option=orjson.OPT_NON_STR_KEYS | orjson.OPT_NAIVE_UTC,
375
+ )
376
+ b'{"1970-01-01T00:00:00+00:00":[1,2,3]}'
377
+ ```
378
+
379
+ These types are generally serialized how they would be as
380
+ values, e.g., `datetime.datetime` is still an RFC 3339 string and respects
381
+ options affecting it. The exception is that `int` serialization does not
382
+ respect `OPT_STRICT_INTEGER`.
383
+
384
+ This option has the risk of creating duplicate keys. This is because non-`str`
385
+ objects may serialize to the same `str` as an existing key, e.g.,
386
+ `{"1": true, 1: false}`. The last key to be inserted to the `dict` will be
387
+ serialized last and a JSON deserializer will presumably take the last
388
+ occurrence of a key (in the above, `false`). The first value will be lost.
389
+
390
+ This option is compatible with `orjson.OPT_SORT_KEYS`. If sorting is used,
391
+ note the sort is unstable and will be unpredictable for duplicate keys.
392
+
393
+ ```python
394
+ >>> import orjson, datetime
395
+ >>> orjson.dumps(
396
+ {"other": 1, datetime.date(1970, 1, 5): 2, datetime.date(1970, 1, 3): 3},
397
+ option=orjson.OPT_NON_STR_KEYS | orjson.OPT_SORT_KEYS
398
+ )
399
+ b'{"1970-01-03":3,"1970-01-05":2,"other":1}'
400
+ ```
401
+
402
+ This measures serializing 589KiB of JSON comprising a `list` of 100 `dict`
403
+ in which each `dict` has both 365 randomly-sorted `int` keys representing epoch
404
+ timestamps as well as one `str` key and the value for each key is a
405
+ single integer. In "str keys", the keys were converted to `str` before
406
+ serialization, and orjson still specifes `option=orjson.OPT_NON_STR_KEYS`
407
+ (which is always somewhat slower).
408
+
409
+ | Library | str keys (ms) | int keys (ms) | int keys sorted (ms) |
410
+ |-----------|-----------------|-----------------|------------------------|
411
+ | orjson | 0.5 | 0.93 | 2.08 |
412
+ | json | 2.72 | 3.59 | |
413
+
414
+ json is blank because it
415
+ raises `TypeError` on attempting to sort before converting all keys to `str`.
416
+ This can be reproduced using the `pynonstr` script.
417
+
418
+ ##### OPT_OMIT_MICROSECONDS
419
+
420
+ Do not serialize the `microsecond` field on `datetime.datetime` and
421
+ `datetime.time` instances.
422
+
423
+ ```python
424
+ >>> import orjson, datetime
425
+ >>> orjson.dumps(
426
+ datetime.datetime(1970, 1, 1, 0, 0, 0, 1),
427
+ )
428
+ b'"1970-01-01T00:00:00.000001"'
429
+ >>> orjson.dumps(
430
+ datetime.datetime(1970, 1, 1, 0, 0, 0, 1),
431
+ option=orjson.OPT_OMIT_MICROSECONDS,
432
+ )
433
+ b'"1970-01-01T00:00:00"'
434
+ ```
435
+
436
+ ##### OPT_PASSTHROUGH_DATACLASS
437
+
438
+ Passthrough `dataclasses.dataclass` instances to `default`. This allows
439
+ customizing their output but is much slower.
440
+
441
+
442
+ ```python
443
+ >>> import orjson, dataclasses
444
+ >>>
445
+ @dataclasses.dataclass
446
+ class User:
447
+ id: str
448
+ name: str
449
+ password: str
450
+
451
+ def default(obj):
452
+ if isinstance(obj, User):
453
+ return {"id": obj.id, "name": obj.name}
454
+ raise TypeError
455
+
456
+ >>> orjson.dumps(User("3b1", "asd", "zxc"))
457
+ b'{"id":"3b1","name":"asd","password":"zxc"}'
458
+ >>> orjson.dumps(User("3b1", "asd", "zxc"), option=orjson.OPT_PASSTHROUGH_DATACLASS)
459
+ TypeError: Type is not JSON serializable: User
460
+ >>> orjson.dumps(
461
+ User("3b1", "asd", "zxc"),
462
+ option=orjson.OPT_PASSTHROUGH_DATACLASS,
463
+ default=default,
464
+ )
465
+ b'{"id":"3b1","name":"asd"}'
466
+ ```
467
+
468
+ ##### OPT_PASSTHROUGH_DATETIME
469
+
470
+ Passthrough `datetime.datetime`, `datetime.date`, and `datetime.time` instances
471
+ to `default`. This allows serializing datetimes to a custom format, e.g.,
472
+ HTTP dates:
473
+
474
+ ```python
475
+ >>> import orjson, datetime
476
+ >>>
477
+ def default(obj):
478
+ if isinstance(obj, datetime.datetime):
479
+ return obj.strftime("%a, %d %b %Y %H:%M:%S GMT")
480
+ raise TypeError
481
+
482
+ >>> orjson.dumps({"created_at": datetime.datetime(1970, 1, 1)})
483
+ b'{"created_at":"1970-01-01T00:00:00"}'
484
+ >>> orjson.dumps({"created_at": datetime.datetime(1970, 1, 1)}, option=orjson.OPT_PASSTHROUGH_DATETIME)
485
+ TypeError: Type is not JSON serializable: datetime.datetime
486
+ >>> orjson.dumps(
487
+ {"created_at": datetime.datetime(1970, 1, 1)},
488
+ option=orjson.OPT_PASSTHROUGH_DATETIME,
489
+ default=default,
490
+ )
491
+ b'{"created_at":"Thu, 01 Jan 1970 00:00:00 GMT"}'
492
+ ```
493
+
494
+ This does not affect datetimes in `dict` keys if using OPT_NON_STR_KEYS.
495
+
496
+ ##### OPT_PASSTHROUGH_SUBCLASS
497
+
498
+ Passthrough subclasses of builtin types to `default`.
499
+
500
+ ```python
501
+ >>> import orjson
502
+ >>>
503
+ class Secret(str):
504
+ pass
505
+
506
+ def default(obj):
507
+ if isinstance(obj, Secret):
508
+ return "******"
509
+ raise TypeError
510
+
511
+ >>> orjson.dumps(Secret("zxc"))
512
+ b'"zxc"'
513
+ >>> orjson.dumps(Secret("zxc"), option=orjson.OPT_PASSTHROUGH_SUBCLASS)
514
+ TypeError: Type is not JSON serializable: Secret
515
+ >>> orjson.dumps(Secret("zxc"), option=orjson.OPT_PASSTHROUGH_SUBCLASS, default=default)
516
+ b'"******"'
517
+ ```
518
+
519
+ This does not affect serializing subclasses as `dict` keys if using
520
+ OPT_NON_STR_KEYS.
521
+
522
+ ##### OPT_SERIALIZE_DATACLASS
523
+
524
+ This is deprecated and has no effect in version 3. In version 2 this was
525
+ required to serialize `dataclasses.dataclass` instances. For more, see
526
+ [dataclass](https://github.com/ijl/orjson?tab=readme-ov-file#dataclass).
527
+
528
+ ##### OPT_SERIALIZE_NUMPY
529
+
530
+ Serialize `numpy.ndarray` instances. For more, see
531
+ [numpy](https://github.com/ijl/orjson?tab=readme-ov-file#numpy).
532
+
533
+ ##### OPT_SERIALIZE_UUID
534
+
535
+ This is deprecated and has no effect in version 3. In version 2 this was
536
+ required to serialize `uuid.UUID` instances. For more, see
537
+ [UUID](https://github.com/ijl/orjson?tab=readme-ov-file#UUID).
538
+
539
+ ##### OPT_SORT_KEYS
540
+
541
+ Serialize `dict` keys in sorted order. The default is to serialize in an
542
+ unspecified order. This is equivalent to `sort_keys=True` in the standard
543
+ library.
544
+
545
+ This can be used to ensure the order is deterministic for hashing or tests.
546
+ It has a substantial performance penalty and is not recommended in general.
547
+
548
+ ```python
549
+ >>> import orjson
550
+ >>> orjson.dumps({"b": 1, "c": 2, "a": 3})
551
+ b'{"b":1,"c":2,"a":3}'
552
+ >>> orjson.dumps({"b": 1, "c": 2, "a": 3}, option=orjson.OPT_SORT_KEYS)
553
+ b'{"a":3,"b":1,"c":2}'
554
+ ```
555
+
556
+ This measures serializing the twitter.json fixture unsorted and sorted:
557
+
558
+ | Library | unsorted (ms) | sorted (ms) | vs. orjson |
559
+ |-----------|-----------------|---------------|--------------|
560
+ | orjson | 0.11 | 0.3 | 1 |
561
+ | json | 1.36 | 1.93 | 6.4 |
562
+
563
+ The benchmark can be reproduced using the `pysort` script.
564
+
565
+ The sorting is not collation/locale-aware:
566
+
567
+ ```python
568
+ >>> import orjson
569
+ >>> orjson.dumps({"a": 1, "ä": 2, "A": 3}, option=orjson.OPT_SORT_KEYS)
570
+ b'{"A":3,"a":1,"\xc3\xa4":2}'
571
+ ```
572
+
573
+ This is the same sorting behavior as the standard library.
574
+
575
+ `dataclass` also serialize as maps but this has no effect on them.
576
+
577
+ ##### OPT_STRICT_INTEGER
578
+
579
+ Enforce 53-bit limit on integers. The limit is otherwise 64 bits, the same as
580
+ the Python standard library. For more, see [int](https://github.com/ijl/orjson?tab=readme-ov-file#int).
581
+
582
+ ##### OPT_UTC_Z
583
+
584
+ Serialize a UTC timezone on `datetime.datetime` instances as `Z` instead
585
+ of `+00:00`.
586
+
587
+ ```python
588
+ >>> import orjson, datetime, zoneinfo
589
+ >>> orjson.dumps(
590
+ datetime.datetime(1970, 1, 1, 0, 0, 0, tzinfo=zoneinfo.ZoneInfo("UTC")),
591
+ )
592
+ b'"1970-01-01T00:00:00+00:00"'
593
+ >>> orjson.dumps(
594
+ datetime.datetime(1970, 1, 1, 0, 0, 0, tzinfo=zoneinfo.ZoneInfo("UTC")),
595
+ option=orjson.OPT_UTC_Z
596
+ )
597
+ b'"1970-01-01T00:00:00Z"'
598
+ ```
599
+
600
+ #### Fragment
601
+
602
+ `orjson.Fragment` includes already-serialized JSON in a document. This is an
603
+ efficient way to include JSON blobs from a cache, JSONB field, or separately
604
+ serialized object without first deserializing to Python objects via `loads()`.
605
+
606
+ ```python
607
+ >>> import orjson
608
+ >>> orjson.dumps({"key": "zxc", "data": orjson.Fragment(b'{"a": "b", "c": 1}')})
609
+ b'{"key":"zxc","data":{"a": "b", "c": 1}}'
610
+ ```
611
+
612
+ It does no reformatting: `orjson.OPT_INDENT_2` will not affect a
613
+ compact blob nor will a pretty-printed JSON blob be rewritten as compact.
614
+
615
+ The input must be `bytes` or `str` and given as a positional argument.
616
+
617
+ This raises `orjson.JSONEncodeError` if a `str` is given and the input is
618
+ not valid UTF-8. It otherwise does no validation and it is possible to
619
+ write invalid JSON. This does not escape characters. The implementation is
620
+ tested to not crash if given invalid strings or invalid JSON.
621
+
622
+ ### Deserialize
623
+
624
+ ```python
625
+ def loads(__obj: Union[bytes, bytearray, memoryview, str]) -> Any: ...
626
+ ```
627
+
628
+ `loads()` deserializes JSON to Python objects. It deserializes to `dict`,
629
+ `list`, `int`, `float`, `str`, `bool`, and `None` objects.
630
+
631
+ `bytes`, `bytearray`, `memoryview`, and `str` input are accepted. If the input
632
+ exists as a `memoryview`, `bytearray`, or `bytes` object, it is recommended to
633
+ pass these directly rather than creating an unnecessary `str` object. That is,
634
+ `orjson.loads(b"{}")` instead of `orjson.loads(b"{}".decode("utf-8"))`. This
635
+ has lower memory usage and lower latency.
636
+
637
+ The input must be valid UTF-8.
638
+
639
+ orjson maintains a cache of map keys for the duration of the process. This
640
+ causes a net reduction in memory usage by avoiding duplicate strings. The
641
+ keys must be at most 64 bytes to be cached and 2048 entries are stored.
642
+
643
+ The global interpreter lock (GIL) is held for the duration of the call.
644
+
645
+ It raises `JSONDecodeError` if given an invalid type or invalid
646
+ JSON. This includes if the input contains `NaN`, `Infinity`, or `-Infinity`,
647
+ which the standard library allows, but is not valid JSON.
648
+
649
+ It raises `JSONDecodeError` if a combination of array or object recurses
650
+ 1024 levels deep.
651
+
652
+ `JSONDecodeError` is a subclass of `json.JSONDecodeError` and `ValueError`.
653
+ This is for compatibility with the standard library.
654
+
655
+ ## Types
656
+
657
+ ### dataclass
658
+
659
+ orjson serializes instances of `dataclasses.dataclass` natively. It serializes
660
+ instances 40-50x as fast as other libraries and avoids a severe slowdown seen
661
+ in other libraries compared to serializing `dict`.
662
+
663
+ It is supported to pass all variants of dataclasses, including dataclasses
664
+ using `__slots__`, frozen dataclasses, those with optional or default
665
+ attributes, and subclasses. There is a performance benefit to not
666
+ using `__slots__`.
667
+
668
+ | Library | dict (ms) | dataclass (ms) | vs. orjson |
669
+ |-----------|-------------|------------------|--------------|
670
+ | orjson | 0.43 | 0.95 | 1 |
671
+ | json | 5.81 | 38.32 | 40 |
672
+
673
+ This measures serializing 555KiB of JSON, orjson natively and other libraries
674
+ using `default` to serialize the output of `dataclasses.asdict()`. This can be
675
+ reproduced using the `pydataclass` script.
676
+
677
+ Dataclasses are serialized as maps, with every attribute serialized and in
678
+ the order given on class definition:
679
+
680
+ ```python
681
+ >>> import dataclasses, orjson, typing
682
+
683
+ @dataclasses.dataclass
684
+ class Member:
685
+ id: int
686
+ active: bool = dataclasses.field(default=False)
687
+
688
+ @dataclasses.dataclass
689
+ class Object:
690
+ id: int
691
+ name: str
692
+ members: typing.List[Member]
693
+
694
+ >>> orjson.dumps(Object(1, "a", [Member(1, True), Member(2)]))
695
+ b'{"id":1,"name":"a","members":[{"id":1,"active":true},{"id":2,"active":false}]}'
696
+ ```
697
+
698
+ ### datetime
699
+
700
+ orjson serializes `datetime.datetime` objects to
701
+ [RFC 3339](https://tools.ietf.org/html/rfc3339) format,
702
+ e.g., "1970-01-01T00:00:00+00:00". This is a subset of ISO 8601 and is
703
+ compatible with `isoformat()` in the standard library.
704
+
705
+ ```python
706
+ >>> import orjson, datetime, zoneinfo
707
+ >>> orjson.dumps(
708
+ datetime.datetime(2018, 12, 1, 2, 3, 4, 9, tzinfo=zoneinfo.ZoneInfo("Australia/Adelaide"))
709
+ )
710
+ b'"2018-12-01T02:03:04.000009+10:30"'
711
+ >>> orjson.dumps(
712
+ datetime.datetime(2100, 9, 1, 21, 55, 2).replace(tzinfo=zoneinfo.ZoneInfo("UTC"))
713
+ )
714
+ b'"2100-09-01T21:55:02+00:00"'
715
+ >>> orjson.dumps(
716
+ datetime.datetime(2100, 9, 1, 21, 55, 2)
717
+ )
718
+ b'"2100-09-01T21:55:02"'
719
+ ```
720
+
721
+ `datetime.datetime` supports instances with a `tzinfo` that is `None`,
722
+ `datetime.timezone.utc`, a timezone instance from the python3.9+ `zoneinfo`
723
+ module, or a timezone instance from the third-party `pendulum`, `pytz`, or
724
+ `dateutil`/`arrow` libraries.
725
+
726
+ It is fastest to use the standard library's `zoneinfo.ZoneInfo` for timezones.
727
+
728
+ `datetime.time` objects must not have a `tzinfo`.
729
+
730
+ ```python
731
+ >>> import orjson, datetime
732
+ >>> orjson.dumps(datetime.time(12, 0, 15, 290))
733
+ b'"12:00:15.000290"'
734
+ ```
735
+
736
+ `datetime.date` objects will always serialize.
737
+
738
+ ```python
739
+ >>> import orjson, datetime
740
+ >>> orjson.dumps(datetime.date(1900, 1, 2))
741
+ b'"1900-01-02"'
742
+ ```
743
+
744
+ Errors with `tzinfo` result in `JSONEncodeError` being raised.
745
+
746
+ To disable serialization of `datetime` objects specify the option
747
+ `orjson.OPT_PASSTHROUGH_DATETIME`.
748
+
749
+ To use "Z" suffix instead of "+00:00" to indicate UTC ("Zulu") time, use the option
750
+ `orjson.OPT_UTC_Z`.
751
+
752
+ To assume datetimes without timezone are UTC, use the option `orjson.OPT_NAIVE_UTC`.
753
+
754
+ ### enum
755
+
756
+ orjson serializes enums natively. Options apply to their values.
757
+
758
+ ```python
759
+ >>> import enum, datetime, orjson
760
+ >>>
761
+ class DatetimeEnum(enum.Enum):
762
+ EPOCH = datetime.datetime(1970, 1, 1, 0, 0, 0)
763
+ >>> orjson.dumps(DatetimeEnum.EPOCH)
764
+ b'"1970-01-01T00:00:00"'
765
+ >>> orjson.dumps(DatetimeEnum.EPOCH, option=orjson.OPT_NAIVE_UTC)
766
+ b'"1970-01-01T00:00:00+00:00"'
767
+ ```
768
+
769
+ Enums with members that are not supported types can be serialized using
770
+ `default`:
771
+
772
+ ```python
773
+ >>> import enum, orjson
774
+ >>>
775
+ class Custom:
776
+ def __init__(self, val):
777
+ self.val = val
778
+
779
+ def default(obj):
780
+ if isinstance(obj, Custom):
781
+ return obj.val
782
+ raise TypeError
783
+
784
+ class CustomEnum(enum.Enum):
785
+ ONE = Custom(1)
786
+
787
+ >>> orjson.dumps(CustomEnum.ONE, default=default)
788
+ b'1'
789
+ ```
790
+
791
+ ### float
792
+
793
+ orjson serializes and deserializes double precision floats with no loss of
794
+ precision and consistent rounding.
795
+
796
+ `orjson.dumps()` serializes Nan, Infinity, and -Infinity, which are not
compliant JSON, as `null`:

```python
>>> import orjson, json
>>> orjson.dumps([float("NaN"), float("Infinity"), float("-Infinity")])
b'[null,null,null]'
>>> json.dumps([float("NaN"), float("Infinity"), float("-Infinity")])
'[NaN, Infinity, -Infinity]'
```

### int

orjson serializes and deserializes 64-bit integers by default. The range
supported is a signed 64-bit integer's minimum (-9223372036854775807) to
an unsigned 64-bit integer's maximum (18446744073709551615). This
is widely compatible, but there are implementations
that only support 53 bits for integers, e.g.,
web browsers. For those implementations, `dumps()` can be configured to
raise a `JSONEncodeError` on values exceeding the 53-bit range.

```python
>>> import orjson
>>> orjson.dumps(9007199254740992)
b'9007199254740992'
>>> orjson.dumps(9007199254740992, option=orjson.OPT_STRICT_INTEGER)
JSONEncodeError: Integer exceeds 53-bit range
>>> orjson.dumps(-9007199254740992, option=orjson.OPT_STRICT_INTEGER)
JSONEncodeError: Integer exceeds 53-bit range
```

### numpy

orjson natively serializes `numpy.ndarray` and individual
`numpy.float64`, `numpy.float32`, `numpy.float16` (`numpy.half`),
`numpy.int64`, `numpy.int32`, `numpy.int16`, `numpy.int8`,
`numpy.uint64`, `numpy.uint32`, `numpy.uint16`, `numpy.uint8`,
`numpy.uintp`, `numpy.intp`, `numpy.datetime64`, and `numpy.bool`
instances.

orjson is compatible with both numpy v1 and v2.

orjson is faster than all compared libraries at serializing
numpy instances. Serializing numpy data requires specifying
`option=orjson.OPT_SERIALIZE_NUMPY`.

```python
>>> import orjson, numpy
>>> orjson.dumps(
        numpy.array([[1, 2, 3], [4, 5, 6]]),
        option=orjson.OPT_SERIALIZE_NUMPY,
)
b'[[1,2,3],[4,5,6]]'
```

The array must be a contiguous C array (`C_CONTIGUOUS`) and one of the
supported datatypes.

Note a difference between serializing `numpy.float32` using `ndarray.tolist()`
or `orjson.dumps(..., option=orjson.OPT_SERIALIZE_NUMPY)`: `tolist()` converts
to a `double` before serializing and orjson's native path does not. This
can result in different rounding.

`numpy.datetime64` instances are serialized as RFC 3339 strings and
datetime options affect them.

```python
>>> import orjson, numpy
>>> orjson.dumps(
        numpy.datetime64("2021-01-01T00:00:00.172"),
        option=orjson.OPT_SERIALIZE_NUMPY,
)
b'"2021-01-01T00:00:00.172000"'
>>> orjson.dumps(
        numpy.datetime64("2021-01-01T00:00:00.172"),
        option=(
            orjson.OPT_SERIALIZE_NUMPY |
            orjson.OPT_NAIVE_UTC |
            orjson.OPT_OMIT_MICROSECONDS
        ),
)
b'"2021-01-01T00:00:00+00:00"'
```

If an array is not a contiguous C array, contains an unsupported datatype,
or contains a `numpy.datetime64` using an unsupported representation
(e.g., picoseconds), orjson falls through to `default`. In `default`,
`obj.tolist()` can be specified.

If an array is not in the native endianness, e.g., an array of big-endian values
on a little-endian system, `orjson.JSONEncodeError` is raised.

If an array is malformed, `orjson.JSONEncodeError` is raised.

This measures serializing 92MiB of JSON from a `numpy.ndarray` with
dimensions of `(50000, 100)` and `numpy.float64` values:

| Library   | Latency (ms)   | RSS diff (MiB)   | vs. orjson   |
|-----------|----------------|------------------|--------------|
| orjson    | 105            | 105              | 1            |
| json      | 1,481          | 295              | 14.2         |

This measures serializing 100MiB of JSON from a `numpy.ndarray` with
dimensions of `(100000, 100)` and `numpy.int32` values:

| Library   | Latency (ms)   | RSS diff (MiB)   | vs. orjson   |
|-----------|----------------|------------------|--------------|
| orjson    | 68             | 119              | 1            |
| json      | 684            | 501              | 10.1         |

This measures serializing 105MiB of JSON from a `numpy.ndarray` with
dimensions of `(100000, 200)` and `numpy.bool` values:

| Library   | Latency (ms)   | RSS diff (MiB)   | vs. orjson   |
|-----------|----------------|------------------|--------------|
| orjson    | 50             | 125              | 1            |
| json      | 573            | 398              | 11.5         |

In these benchmarks, orjson serializes natively and `json` serializes
`ndarray.tolist()` via `default`. The RSS column measures peak memory
usage during serialization. This can be reproduced using the `pynumpy` script.

orjson does not have an installation or compilation dependency on numpy. The
implementation is independent, reading `numpy.ndarray` using
`PyArrayInterface`.

### str

orjson is strict about UTF-8 conformance. This is stricter than the standard
library's json module, which will serialize and deserialize UTF-16 surrogates,
e.g., "\ud800", that are invalid UTF-8.

If `orjson.dumps()` is given a `str` that does not contain valid UTF-8,
`orjson.JSONEncodeError` is raised. If `loads()` receives invalid UTF-8,
`orjson.JSONDecodeError` is raised.

```python
>>> import orjson, json
>>> orjson.dumps('\ud800')
JSONEncodeError: str is not valid UTF-8: surrogates not allowed
>>> json.dumps('\ud800')
'"\\ud800"'
>>> orjson.loads('"\\ud800"')
JSONDecodeError: unexpected end of hex escape at line 1 column 8: line 1 column 1 (char 0)
>>> json.loads('"\\ud800"')
'\ud800'
```

To make a best effort at deserializing bad input, first decode `bytes` using
the `replace` or `lossy` argument for `errors`:

```python
>>> import orjson
>>> orjson.loads(b'"\xed\xa0\x80"')
JSONDecodeError: str is not valid UTF-8: surrogates not allowed
>>> orjson.loads(b'"\xed\xa0\x80"'.decode("utf-8", "replace"))
'���'
```

### uuid

orjson serializes `uuid.UUID` instances to
[RFC 4122](https://tools.ietf.org/html/rfc4122) format, e.g.,
"f81d4fae-7dec-11d0-a765-00a0c91e6bf6".

```python
>>> import orjson, uuid
>>> orjson.dumps(uuid.uuid5(uuid.NAMESPACE_DNS, "python.org"))
b'"886313e1-3b8a-5372-9b90-0c9aee199e5d"'
```

## Testing

The library has comprehensive tests. There are tests against fixtures in the
[JSONTestSuite](https://github.com/nst/JSONTestSuite) and
[nativejson-benchmark](https://github.com/miloyip/nativejson-benchmark)
repositories. It is tested to not crash against the
[Big List of Naughty Strings](https://github.com/minimaxir/big-list-of-naughty-strings).
It is tested to not leak memory. It is tested to not crash
against and not accept invalid UTF-8. There are integration tests
exercising the library's use in web servers (gunicorn using multiprocess/forked
workers) and when multithreaded. It also uses some tests from the ultrajson
library.

orjson is the most correct of the compared libraries. This table shows how each
library handles a combined 342 JSON fixtures from the
[JSONTestSuite](https://github.com/nst/JSONTestSuite) and
[nativejson-benchmark](https://github.com/miloyip/nativejson-benchmark) tests:

| Library    | Invalid JSON documents not rejected   | Valid JSON documents not deserialized   |
|------------|---------------------------------------|-----------------------------------------|
| orjson     | 0                                     | 0                                       |
| json       | 17                                    | 0                                       |

This shows that all libraries deserialize valid JSON but only orjson
correctly rejects the given invalid JSON fixtures. Errors are largely due to
accepting invalid strings and numbers.

The table above can be reproduced using the `pycorrectness` script.

## Performance

Serialization and deserialization performance of orjson is consistently better
than the standard library's `json`. The graphs below illustrate a few commonly
used documents.

### Latency

![Serialization](doc/serialization.png)

![Deserialization](doc/deserialization.png)

#### twitter.json serialization

| Library   | Median latency (milliseconds)   | Operations per second   | Relative (latency)   |
|-----------|---------------------------------|-------------------------|----------------------|
| orjson    | 0.1                             | 8453                    | 1                    |
| json      | 1.3                             | 765                     | 11.1                 |

#### twitter.json deserialization

| Library   | Median latency (milliseconds)   | Operations per second   | Relative (latency)   |
|-----------|---------------------------------|-------------------------|----------------------|
| orjson    | 0.5                             | 1889                    | 1                    |
| json      | 2.2                             | 453                     | 4.2                  |

#### github.json serialization

| Library   | Median latency (milliseconds)   | Operations per second   | Relative (latency)   |
|-----------|---------------------------------|-------------------------|----------------------|
| orjson    | 0.01                            | 103693                  | 1                    |
| json      | 0.13                            | 7648                    | 13.6                 |

#### github.json deserialization

| Library   | Median latency (milliseconds)   | Operations per second   | Relative (latency)   |
|-----------|---------------------------------|-------------------------|----------------------|
| orjson    | 0.04                            | 23264                   | 1                    |
| json      | 0.1                             | 10430                   | 2.2                  |

#### citm_catalog.json serialization

| Library   | Median latency (milliseconds)   | Operations per second   | Relative (latency)   |
|-----------|---------------------------------|-------------------------|----------------------|
| orjson    | 0.3                             | 3975                    | 1                    |
| json      | 3                               | 338                     | 11.8                 |

#### citm_catalog.json deserialization

| Library   | Median latency (milliseconds)   | Operations per second   | Relative (latency)   |
|-----------|---------------------------------|-------------------------|----------------------|
| orjson    | 1.3                             | 781                     | 1                    |
| json      | 4                               | 250                     | 3.1                  |

#### canada.json serialization

| Library   | Median latency (milliseconds)   | Operations per second   | Relative (latency)   |
|-----------|---------------------------------|-------------------------|----------------------|
| orjson    | 2.5                             | 399                     | 1                    |
| json      | 29.8                            | 33                      | 11.9                 |

#### canada.json deserialization

| Library   | Median latency (milliseconds)   | Operations per second   | Relative (latency)   |
|-----------|---------------------------------|-------------------------|----------------------|
| orjson    | 3                               | 333                     | 1                    |
| json      | 18                              | 55                      | 6                    |

### Reproducing

The above was measured using Python 3.11.10 in a Fedora 42 container on an
x86-64-v4 machine using the
`orjson-3.10.11-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl`
artifact on PyPI. The latency results can be reproduced using the `pybench`
script.

## Questions

### Why can't I install it from PyPI?

Probably `pip` needs to be upgraded to version 20.3 or later to support
the latest manylinux_x_y or universal2 wheel formats.

### "Cargo, the Rust package manager, is not installed or is not on PATH."

This happens when there are no binary wheels (like manylinux) for your
platform on PyPI. You can install [Rust](https://www.rust-lang.org/) through
`rustup` or a package manager and then it will compile.

### Will it deserialize to dataclasses, UUIDs, decimals, etc or support object_hook?

No. This requires a schema specifying what types are expected and how to
handle errors etc. This is addressed by data validation libraries a
level above this.

### Will it serialize to `str`?

No. `bytes` is the correct type for a serialized blob.

### Will it support NDJSON or JSONL?

No. [orjsonl](https://github.com/umarbutler/orjsonl) may be appropriate.
1097
+
1098
+ ### Will it support JSON5 or RJSON?
1099
+
1100
+ No, it supports RFC 8259.
1101
+
1102
+ ## Packaging
1103
+
1104
+ To package orjson requires at least [Rust](https://www.rust-lang.org/) 1.82
1105
+ and the [maturin](https://github.com/PyO3/maturin) build tool. The recommended
1106
+ build command is:
1107
+
1108
+ ```sh
1109
+ maturin build --release --strip
1110
+ ```
1111
+
1112
+ It benefits from also having a C build environment to compile a faster
1113
+ deserialization backend. See this project's `manylinux_2_28` builds for an
1114
+ example using clang and LTO.
1115
+
1116
+ The project's own CI tests against `nightly-2025-01-07` and stable 1.72. It
1117
+ is prudent to pin the nightly version because that channel can introduce
1118
+ breaking changes. There is a significant performance benefit to using
1119
+ nightly.
1120
+
1121
+ orjson is tested for amd64, aarch64, and i686 on Linux and cross-compiles for
1122
+ arm7, ppc64le, and s390x. It is tested for either aarch64 or amd64 on macOS and
1123
+ cross-compiles for the other, depending on version. For Windows it is
1124
+ tested on amd64 and i686.
1125
+
1126
+ There are no runtime dependencies other than libc.
1127
+
1128
+ The source distribution on PyPI contains all dependencies' source and can be
1129
+ built without network access. The file can be downloaded from
1130
+ `https://files.pythonhosted.org/packages/source/o/orjson/orjson-${version}.tar.gz`.
1131
+
1132
+ orjson's tests are included in the source distribution on PyPI. The
1133
+ requirements to run the tests are specified in `test/requirements.txt`. The
1134
+ tests should be run as part of the build. It can be run with
1135
+ `pytest -q test`.
1136
+
1137
+ ## License
1138
+
1139
+ orjson was written by ijl <<ijl@mailbox.org>>, copyright 2018 - 2025, available
1140
+ to you under either the Apache 2 license or MIT license at your choice.
1141
+