xiaoanyu123 committed on
Commit cc0b6ff · verified · 1 Parent(s): b458de1

Add files using upload-large-folder tool

Files changed (20)
  1. pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/__init__.cpython-310.pyc +0 -0
  2. pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/matchhelpers.cpython-310.pyc +0 -0
  3. pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/temporalisomorphvf2.cpython-310.pyc +0 -0
  4. pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/tree_isomorphism.cpython-310.pyc +0 -0
  5. pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/vf2pp.cpython-310.pyc +0 -0
  6. pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/vf2userfunc.cpython-310.pyc +0 -0
  7. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/__init__.py +2 -0
  8. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/__pycache__/__init__.cpython-310.pyc +0 -0
  9. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/__pycache__/hits_alg.cpython-310.pyc +0 -0
  10. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/__pycache__/pagerank_alg.cpython-310.pyc +0 -0
  11. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/hits_alg.py +337 -0
  12. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/pagerank_alg.py +499 -0
  13. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/__init__.py +0 -0
  14. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/__pycache__/__init__.cpython-310.pyc +0 -0
  15. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/__pycache__/test_hits.cpython-310.pyc +0 -0
  16. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/__pycache__/test_pagerank.cpython-310.pyc +0 -0
  17. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/test_hits.py +78 -0
  18. pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/test_pagerank.py +217 -0
  19. pythonProject/.venv/Lib/site-packages/networkx/algorithms/minors/__init__.py +27 -0
  20. pythonProject/.venv/Lib/site-packages/networkx/algorithms/minors/contraction.py +633 -0
pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (578 Bytes).
 
pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/matchhelpers.cpython-310.pyc ADDED
Binary file (10.9 kB).
 
pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/temporalisomorphvf2.cpython-310.pyc ADDED
Binary file (10.8 kB).
 
pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/tree_isomorphism.cpython-310.pyc ADDED
Binary file (7.44 kB).
 
pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/vf2pp.cpython-310.pyc ADDED
Binary file (28.5 kB).
 
pythonProject/.venv/Lib/site-packages/networkx/algorithms/isomorphism/__pycache__/vf2userfunc.cpython-310.pyc ADDED
Binary file (6.58 kB).
 
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/__init__.py ADDED
@@ -0,0 +1,2 @@
+ from networkx.algorithms.link_analysis.hits_alg import *
+ from networkx.algorithms.link_analysis.pagerank_alg import *
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (307 Bytes).
 
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/__pycache__/hits_alg.cpython-310.pyc ADDED
Binary file (9.7 kB).
 
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/__pycache__/pagerank_alg.cpython-310.pyc ADDED
Binary file (16.3 kB).
 
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/hits_alg.py ADDED
@@ -0,0 +1,337 @@
+ """Hubs and authorities analysis of graph structure.
+ """
+ import networkx as nx
+
+ __all__ = ["hits"]
+
+
+ @nx._dispatchable(preserve_edge_attrs={"G": {"weight": 1}})
+ def hits(G, max_iter=100, tol=1.0e-8, nstart=None, normalized=True):
+     """Returns HITS hubs and authorities values for nodes.
+
+     The HITS algorithm computes two numbers for a node.
+     Authorities estimates the node value based on the incoming links.
+     Hubs estimates the node value based on outgoing links.
+
+     Parameters
+     ----------
+     G : graph
+       A NetworkX graph
+
+     max_iter : integer, optional
+       Maximum number of iterations in power method.
+
+     tol : float, optional
+       Error tolerance used to check convergence in power method iteration.
+
+     nstart : dictionary, optional
+       Starting value of each node for power method iteration.
+
+     normalized : bool (default=True)
+       Normalize results by the sum of all of the values.
+
+     Returns
+     -------
+     (hubs, authorities) : two-tuple of dictionaries
+       Two dictionaries keyed by node containing the hub and authority
+       values.
+
+     Raises
+     ------
+     PowerIterationFailedConvergence
+         If the algorithm fails to converge to the specified tolerance
+         within the specified number of iterations of the power iteration
+         method.
+
+     Examples
+     --------
+     >>> G = nx.path_graph(4)
+     >>> h, a = nx.hits(G)
+
+     Notes
+     -----
+     The eigenvector calculation is done by the power iteration method
+     and has no guarantee of convergence. The iteration will stop
+     after max_iter iterations or an error tolerance of
+     number_of_nodes(G)*tol has been reached.
+
+     The HITS algorithm was designed for directed graphs but this
+     algorithm does not check if the input graph is directed and will
+     execute on undirected graphs.
+
+     References
+     ----------
+     .. [1] A. Langville and C. Meyer,
+        "A survey of eigenvector methods of web information retrieval."
+        http://citeseer.ist.psu.edu/713792.html
+     .. [2] Jon Kleinberg,
+        Authoritative sources in a hyperlinked environment
+        Journal of the ACM 46 (5): 604-632, 1999.
+        doi:10.1145/324133.324140.
+        http://www.cs.cornell.edu/home/kleinber/auth.pdf.
+     """
+     import numpy as np
+     import scipy as sp
+
+     if len(G) == 0:
+         return {}, {}
+     A = nx.adjacency_matrix(G, nodelist=list(G), dtype=float)
+
+     if nstart is not None:
+         nstart = np.array(list(nstart.values()))
+     if max_iter <= 0:
+         raise nx.PowerIterationFailedConvergence(max_iter)
+     try:
+         _, _, vt = sp.sparse.linalg.svds(A, k=1, v0=nstart, maxiter=max_iter, tol=tol)
+     except sp.sparse.linalg.ArpackNoConvergence as exc:
+         raise nx.PowerIterationFailedConvergence(max_iter) from exc
+
+     a = vt.flatten().real
+     h = A @ a
+     if normalized:
+         h /= h.sum()
+         a /= a.sum()
+     hubs = dict(zip(G, map(float, h)))
+     authorities = dict(zip(G, map(float, a)))
+     return hubs, authorities
+
+
+ def _hits_python(G, max_iter=100, tol=1.0e-8, nstart=None, normalized=True):
+     if isinstance(G, nx.MultiGraph | nx.MultiDiGraph):
+         raise Exception("hits() not defined for graphs with multiedges.")
+     if len(G) == 0:
+         return {}, {}
+     # choose fixed starting vector if not given
+     if nstart is None:
+         h = dict.fromkeys(G, 1.0 / G.number_of_nodes())
+     else:
+         h = nstart
+     # normalize starting vector
+     s = 1.0 / sum(h.values())
+     for k in h:
+         h[k] *= s
+     for _ in range(max_iter):  # power iteration: make up to max_iter iterations
+         hlast = h
+         h = dict.fromkeys(hlast.keys(), 0)
+         a = dict.fromkeys(hlast.keys(), 0)
+         # this "matrix multiply" looks odd because it is
+         # doing a left multiply a^T=hlast^T*G
+         for n in h:
+             for nbr in G[n]:
+                 a[nbr] += hlast[n] * G[n][nbr].get("weight", 1)
+         # now multiply h=Ga
+         for n in h:
+             for nbr in G[n]:
+                 h[n] += a[nbr] * G[n][nbr].get("weight", 1)
+         # normalize vector
+         s = 1.0 / max(h.values())
+         for n in h:
+             h[n] *= s
+         # normalize vector
+         s = 1.0 / max(a.values())
+         for n in a:
+             a[n] *= s
+         # check convergence, l1 norm
+         err = sum(abs(h[n] - hlast[n]) for n in h)
+         if err < tol:
+             break
+     else:
+         raise nx.PowerIterationFailedConvergence(max_iter)
+     if normalized:
+         s = 1.0 / sum(a.values())
+         for n in a:
+             a[n] *= s
+         s = 1.0 / sum(h.values())
+         for n in h:
+             h[n] *= s
+     return h, a
+
+
+ def _hits_numpy(G, normalized=True):
+     """Returns HITS hubs and authorities values for nodes.
+
+     The HITS algorithm computes two numbers for a node.
+     Authorities estimates the node value based on the incoming links.
+     Hubs estimates the node value based on outgoing links.
+
+     Parameters
+     ----------
+     G : graph
+       A NetworkX graph
+
+     normalized : bool (default=True)
+       Normalize results by the sum of all of the values.
+
+     Returns
+     -------
+     (hubs, authorities) : two-tuple of dictionaries
+       Two dictionaries keyed by node containing the hub and authority
+       values.
+
+     Examples
+     --------
+     >>> G = nx.path_graph(4)
+
+     The `hubs` and `authorities` are given by the eigenvectors corresponding to the
+     maximum eigenvalues of the hubs_matrix and the authority_matrix, respectively.
+
+     The ``hubs`` and ``authority`` matrices are computed from the adjacency
+     matrix:
+
+     >>> adj_ary = nx.to_numpy_array(G)
+     >>> hubs_matrix = adj_ary @ adj_ary.T
+     >>> authority_matrix = adj_ary.T @ adj_ary
+
+     `_hits_numpy` maps the eigenvector corresponding to the maximum eigenvalue
+     of the respective matrices to the nodes in `G`:
+
+     >>> from networkx.algorithms.link_analysis.hits_alg import _hits_numpy
+     >>> hubs, authority = _hits_numpy(G)
+
+     Notes
+     -----
+     The eigenvector calculation uses NumPy's interface to LAPACK.
+
+     The HITS algorithm was designed for directed graphs but this
+     algorithm does not check if the input graph is directed and will
+     execute on undirected graphs.
+
+     References
+     ----------
+     .. [1] A. Langville and C. Meyer,
+        "A survey of eigenvector methods of web information retrieval."
+        http://citeseer.ist.psu.edu/713792.html
+     .. [2] Jon Kleinberg,
+        Authoritative sources in a hyperlinked environment
+        Journal of the ACM 46 (5): 604-632, 1999.
+        doi:10.1145/324133.324140.
+        http://www.cs.cornell.edu/home/kleinber/auth.pdf.
+     """
+     import numpy as np
+
+     if len(G) == 0:
+         return {}, {}
+     adj_ary = nx.to_numpy_array(G)
+     # Hub matrix
+     H = adj_ary @ adj_ary.T
+     e, ev = np.linalg.eig(H)
+     h = ev[:, np.argmax(e)]  # eigenvector corresponding to the maximum eigenvalue
+     # Authority matrix
+     A = adj_ary.T @ adj_ary
+     e, ev = np.linalg.eig(A)
+     a = ev[:, np.argmax(e)]  # eigenvector corresponding to the maximum eigenvalue
+     if normalized:
+         h /= h.sum()
+         a /= a.sum()
+     else:
+         h /= h.max()
+         a /= a.max()
+     hubs = dict(zip(G, map(float, h)))
+     authorities = dict(zip(G, map(float, a)))
+     return hubs, authorities
+
+
+ def _hits_scipy(G, max_iter=100, tol=1.0e-6, nstart=None, normalized=True):
+     """Returns HITS hubs and authorities values for nodes.
+
+     The HITS algorithm computes two numbers for a node.
+     Authorities estimates the node value based on the incoming links.
+     Hubs estimates the node value based on outgoing links.
+
+     Parameters
+     ----------
+     G : graph
+       A NetworkX graph
+
+     max_iter : integer, optional
+       Maximum number of iterations in power method.
+
+     tol : float, optional
+       Error tolerance used to check convergence in power method iteration.
+
+     nstart : dictionary, optional
+       Starting value of each node for power method iteration.
+
+     normalized : bool (default=True)
+       Normalize results by the sum of all of the values.
+
+     Returns
+     -------
+     (hubs, authorities) : two-tuple of dictionaries
+       Two dictionaries keyed by node containing the hub and authority
+       values.
+
+     Examples
+     --------
+     >>> from networkx.algorithms.link_analysis.hits_alg import _hits_scipy
+     >>> G = nx.path_graph(4)
+     >>> h, a = _hits_scipy(G)
+
+     Notes
+     -----
+     This implementation uses SciPy sparse matrices.
+
+     The eigenvector calculation is done by the power iteration method
+     and has no guarantee of convergence. The iteration will stop
+     after max_iter iterations or an error tolerance of
+     number_of_nodes(G)*tol has been reached.
+
+     The HITS algorithm was designed for directed graphs but this
+     algorithm does not check if the input graph is directed and will
+     execute on undirected graphs.
+
+     Raises
+     ------
+     PowerIterationFailedConvergence
+         If the algorithm fails to converge to the specified tolerance
+         within the specified number of iterations of the power iteration
+         method.
+
+     References
+     ----------
+     .. [1] A. Langville and C. Meyer,
+        "A survey of eigenvector methods of web information retrieval."
+        http://citeseer.ist.psu.edu/713792.html
+     .. [2] Jon Kleinberg,
+        Authoritative sources in a hyperlinked environment
+        Journal of the ACM 46 (5): 604-632, 1999.
+        doi:10.1145/324133.324140.
+        http://www.cs.cornell.edu/home/kleinber/auth.pdf.
+     """
+     import numpy as np
+
+     if len(G) == 0:
+         return {}, {}
+     A = nx.to_scipy_sparse_array(G, nodelist=list(G))
+     (n, _) = A.shape  # should be square
+     ATA = A.T @ A  # authority matrix
+     # choose fixed starting vector if not given
+     if nstart is None:
+         x = np.ones((n, 1)) / n
+     else:
+         x = np.array([nstart.get(n, 0) for n in list(G)], dtype=float)
+         x /= x.sum()
+
+     # power iteration on authority matrix
+     i = 0
+     while True:
+         xlast = x
+         x = ATA @ x
+         x /= x.max()
+         # check convergence, l1 norm
+         err = np.absolute(x - xlast).sum()
+         if err < tol:
+             break
+         if i > max_iter:
+             raise nx.PowerIterationFailedConvergence(max_iter)
+         i += 1
+
+     a = x.flatten()
+     h = A @ a
+     if normalized:
+         h /= h.sum()
+         a /= a.sum()
+     hubs = dict(zip(G, map(float, h)))
+     authorities = dict(zip(G, map(float, a)))
+     return hubs, authorities
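For anyone reviewing the diff, the `hits_alg` module added above can be exercised with a minimal sketch like the following (assuming `networkx` and its SciPy dependency are installed; the small graph here is illustrative and not part of the commit):

```python
import networkx as nx

# Small directed graph: node 0 links out to everyone (a natural hub),
# node 3 is linked to by everyone else (a natural authority).
G = nx.DiGraph([(0, 1), (0, 2), (0, 3), (1, 3), (2, 3)])

hubs, authorities = nx.hits(G)

# With normalized=True (the default), each score set sums to 1.
top_hub = max(hubs, key=hubs.get)
top_authority = max(authorities, key=authorities.get)
print(top_hub, top_authority)  # → 0 3
```

Note that `hits()` delegates the eigenvector computation to `scipy.sparse.linalg.svds`, so SciPy must be importable even though only `networkx` is imported here.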
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/pagerank_alg.py ADDED
@@ -0,0 +1,499 @@
+ """PageRank analysis of graph structure. """
+ from warnings import warn
+
+ import networkx as nx
+
+ __all__ = ["pagerank", "google_matrix"]
+
+
+ @nx._dispatchable(edge_attrs="weight")
+ def pagerank(
+     G,
+     alpha=0.85,
+     personalization=None,
+     max_iter=100,
+     tol=1.0e-6,
+     nstart=None,
+     weight="weight",
+     dangling=None,
+ ):
+     """Returns the PageRank of the nodes in the graph.
+
+     PageRank computes a ranking of the nodes in the graph G based on
+     the structure of the incoming links. It was originally designed as
+     an algorithm to rank web pages.
+
+     Parameters
+     ----------
+     G : graph
+       A NetworkX graph. Undirected graphs will be converted to a directed
+       graph with two directed edges for each undirected edge.
+
+     alpha : float, optional
+       Damping parameter for PageRank, default=0.85.
+
+     personalization: dict, optional
+       The "personalization vector" consisting of a dictionary with a
+       key some subset of graph nodes and personalization value each of those.
+       At least one personalization value must be non-zero.
+       If not specified, a nodes personalization value will be zero.
+       By default, a uniform distribution is used.
+
+     max_iter : integer, optional
+       Maximum number of iterations in power method eigenvalue solver.
+
+     tol : float, optional
+       Error tolerance used to check convergence in power method solver.
+       The iteration will stop after a tolerance of ``len(G) * tol`` is reached.
+
+     nstart : dictionary, optional
+       Starting value of PageRank iteration for each node.
+
+     weight : key, optional
+       Edge data key to use as weight. If None weights are set to 1.
+
+     dangling: dict, optional
+       The outedges to be assigned to any "dangling" nodes, i.e., nodes without
+       any outedges. The dict key is the node the outedge points to and the dict
+       value is the weight of that outedge. By default, dangling nodes are given
+       outedges according to the personalization vector (uniform if not
+       specified). This must be selected to result in an irreducible transition
+       matrix (see notes under google_matrix). It may be common to have the
+       dangling dict to be the same as the personalization dict.
+
+     Returns
+     -------
+     pagerank : dictionary
+       Dictionary of nodes with PageRank as value
+
+     Examples
+     --------
+     >>> G = nx.DiGraph(nx.path_graph(4))
+     >>> pr = nx.pagerank(G, alpha=0.9)
+
+     Notes
+     -----
+     The eigenvector calculation is done by the power iteration method
+     and has no guarantee of convergence. The iteration will stop after
+     an error tolerance of ``len(G) * tol`` has been reached. If the
+     number of iterations exceed `max_iter`, a
+     :exc:`networkx.exception.PowerIterationFailedConvergence` exception
+     is raised.
+
+     The PageRank algorithm was designed for directed graphs but this
+     algorithm does not check if the input graph is directed and will
+     execute on undirected graphs by converting each edge in the
+     directed graph to two edges.
+
+     See Also
+     --------
+     google_matrix
+
+     Raises
+     ------
+     PowerIterationFailedConvergence
+         If the algorithm fails to converge to the specified tolerance
+         within the specified number of iterations of the power iteration
+         method.
+
+     References
+     ----------
+     .. [1] A. Langville and C. Meyer,
+        "A survey of eigenvector methods of web information retrieval."
+        http://citeseer.ist.psu.edu/713792.html
+     .. [2] Page, Lawrence; Brin, Sergey; Motwani, Rajeev and Winograd, Terry,
+        The PageRank citation ranking: Bringing order to the Web. 1999
+        http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf
+     """
+     return _pagerank_scipy(
+         G, alpha, personalization, max_iter, tol, nstart, weight, dangling
+     )
+
+
+ def _pagerank_python(
+     G,
+     alpha=0.85,
+     personalization=None,
+     max_iter=100,
+     tol=1.0e-6,
+     nstart=None,
+     weight="weight",
+     dangling=None,
+ ):
+     if len(G) == 0:
+         return {}
+
+     D = G.to_directed()
+
+     # Create a copy in (right) stochastic form
+     W = nx.stochastic_graph(D, weight=weight)
+     N = W.number_of_nodes()
+
+     # Choose fixed starting vector if not given
+     if nstart is None:
+         x = dict.fromkeys(W, 1.0 / N)
+     else:
+         # Normalized nstart vector
+         s = sum(nstart.values())
+         x = {k: v / s for k, v in nstart.items()}
+
+     if personalization is None:
+         # Assign uniform personalization vector if not given
+         p = dict.fromkeys(W, 1.0 / N)
+     else:
+         s = sum(personalization.values())
+         p = {k: v / s for k, v in personalization.items()}
+
+     if dangling is None:
+         # Use personalization vector if dangling vector not specified
+         dangling_weights = p
+     else:
+         s = sum(dangling.values())
+         dangling_weights = {k: v / s for k, v in dangling.items()}
+     dangling_nodes = [n for n in W if W.out_degree(n, weight=weight) == 0.0]
+
+     # power iteration: make up to max_iter iterations
+     for _ in range(max_iter):
+         xlast = x
+         x = dict.fromkeys(xlast.keys(), 0)
+         danglesum = alpha * sum(xlast[n] for n in dangling_nodes)
+         for n in x:
+             # this matrix multiply looks odd because it is
+             # doing a left multiply x^T=xlast^T*W
+             for _, nbr, wt in W.edges(n, data=weight):
+                 x[nbr] += alpha * xlast[n] * wt
+             x[n] += danglesum * dangling_weights.get(n, 0) + (1.0 - alpha) * p.get(n, 0)
+         # check convergence, l1 norm
+         err = sum(abs(x[n] - xlast[n]) for n in x)
+         if err < N * tol:
+             return x
+     raise nx.PowerIterationFailedConvergence(max_iter)
+
+
+ @nx._dispatchable(edge_attrs="weight")
+ def google_matrix(
+     G, alpha=0.85, personalization=None, nodelist=None, weight="weight", dangling=None
+ ):
+     """Returns the Google matrix of the graph.
+
+     Parameters
+     ----------
+     G : graph
+       A NetworkX graph. Undirected graphs will be converted to a directed
+       graph with two directed edges for each undirected edge.
+
+     alpha : float
+       The damping factor.
+
+     personalization: dict, optional
+       The "personalization vector" consisting of a dictionary with a
+       key some subset of graph nodes and personalization value each of those.
+       At least one personalization value must be non-zero.
+       If not specified, a nodes personalization value will be zero.
+       By default, a uniform distribution is used.
+
+     nodelist : list, optional
+       The rows and columns are ordered according to the nodes in nodelist.
+       If nodelist is None, then the ordering is produced by G.nodes().
+
+     weight : key, optional
+       Edge data key to use as weight. If None weights are set to 1.
+
+     dangling: dict, optional
+       The outedges to be assigned to any "dangling" nodes, i.e., nodes without
+       any outedges. The dict key is the node the outedge points to and the dict
+       value is the weight of that outedge. By default, dangling nodes are given
+       outedges according to the personalization vector (uniform if not
+       specified) This must be selected to result in an irreducible transition
+       matrix (see notes below). It may be common to have the dangling dict to
+       be the same as the personalization dict.
+
+     Returns
+     -------
+     A : 2D NumPy ndarray
+       Google matrix of the graph
+
+     Notes
+     -----
+     The array returned represents the transition matrix that describes the
+     Markov chain used in PageRank. For PageRank to converge to a unique
+     solution (i.e., a unique stationary distribution in a Markov chain), the
+     transition matrix must be irreducible. In other words, it must be that
+     there exists a path between every pair of nodes in the graph, or else there
+     is the potential of "rank sinks."
+
+     This implementation works with Multi(Di)Graphs. For multigraphs the
+     weight between two nodes is set to be the sum of all edge weights
+     between those nodes.
+
+     See Also
+     --------
+     pagerank
+     """
+     import numpy as np
+
+     if nodelist is None:
+         nodelist = list(G)
+
+     A = nx.to_numpy_array(G, nodelist=nodelist, weight=weight)
+     N = len(G)
+     if N == 0:
+         return A
+
+     # Personalization vector
+     if personalization is None:
+         p = np.repeat(1.0 / N, N)
+     else:
+         p = np.array([personalization.get(n, 0) for n in nodelist], dtype=float)
+         if p.sum() == 0:
+             raise ZeroDivisionError
+         p /= p.sum()
+
+     # Dangling nodes
+     if dangling is None:
+         dangling_weights = p
+     else:
+         # Convert the dangling dictionary into an array in nodelist order
+         dangling_weights = np.array([dangling.get(n, 0) for n in nodelist], dtype=float)
+         dangling_weights /= dangling_weights.sum()
+     dangling_nodes = np.where(A.sum(axis=1) == 0)[0]
+
+     # Assign dangling_weights to any dangling nodes (nodes with no out links)
+     A[dangling_nodes] = dangling_weights
+
+     A /= A.sum(axis=1)[:, np.newaxis]  # Normalize rows to sum to 1
+
+     return alpha * A + (1 - alpha) * p
+
+
+ def _pagerank_numpy(
+     G, alpha=0.85, personalization=None, weight="weight", dangling=None
+ ):
+     """Returns the PageRank of the nodes in the graph.
+
+     PageRank computes a ranking of the nodes in the graph G based on
+     the structure of the incoming links. It was originally designed as
+     an algorithm to rank web pages.
+
+     Parameters
+     ----------
+     G : graph
+       A NetworkX graph. Undirected graphs will be converted to a directed
+       graph with two directed edges for each undirected edge.
+
+     alpha : float, optional
+       Damping parameter for PageRank, default=0.85.
+
+     personalization: dict, optional
+       The "personalization vector" consisting of a dictionary with a
+       key some subset of graph nodes and personalization value each of those.
+       At least one personalization value must be non-zero.
+       If not specified, a nodes personalization value will be zero.
+       By default, a uniform distribution is used.
+
+     weight : key, optional
+       Edge data key to use as weight. If None weights are set to 1.
+
+     dangling: dict, optional
+       The outedges to be assigned to any "dangling" nodes, i.e., nodes without
+       any outedges. The dict key is the node the outedge points to and the dict
+       value is the weight of that outedge. By default, dangling nodes are given
+       outedges according to the personalization vector (uniform if not
+       specified) This must be selected to result in an irreducible transition
+       matrix (see notes under google_matrix). It may be common to have the
+       dangling dict to be the same as the personalization dict.
+
+     Returns
+     -------
+     pagerank : dictionary
+       Dictionary of nodes with PageRank as value.
+
+     Examples
+     --------
+     >>> from networkx.algorithms.link_analysis.pagerank_alg import _pagerank_numpy
+     >>> G = nx.DiGraph(nx.path_graph(4))
+     >>> pr = _pagerank_numpy(G, alpha=0.9)
+
+     Notes
+     -----
+     The eigenvector calculation uses NumPy's interface to the LAPACK
+     eigenvalue solvers. This will be the fastest and most accurate
+     for small graphs.
+
+     This implementation works with Multi(Di)Graphs. For multigraphs the
+     weight between two nodes is set to be the sum of all edge weights
+     between those nodes.
+
+     See Also
+     --------
+     pagerank, google_matrix
+
+     References
+     ----------
+     .. [1] A. Langville and C. Meyer,
+        "A survey of eigenvector methods of web information retrieval."
+        http://citeseer.ist.psu.edu/713792.html
+     .. [2] Page, Lawrence; Brin, Sergey; Motwani, Rajeev and Winograd, Terry,
+        The PageRank citation ranking: Bringing order to the Web. 1999
+        http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf
+     """
+     import numpy as np
+
+     if len(G) == 0:
+         return {}
+     M = google_matrix(
+         G, alpha, personalization=personalization, weight=weight, dangling=dangling
+     )
+     # use numpy LAPACK solver
+     eigenvalues, eigenvectors = np.linalg.eig(M.T)
+     ind = np.argmax(eigenvalues)
+     # eigenvector of largest eigenvalue is at ind, normalized
+     largest = np.array(eigenvectors[:, ind]).flatten().real
+     norm = largest.sum()
+     return dict(zip(G, map(float, largest / norm)))
+
+
+ def _pagerank_scipy(
+     G,
+     alpha=0.85,
+     personalization=None,
+     max_iter=100,
+     tol=1.0e-6,
+     nstart=None,
+     weight="weight",
+     dangling=None,
+ ):
+     """Returns the PageRank of the nodes in the graph.
+
+     PageRank computes a ranking of the nodes in the graph G based on
+     the structure of the incoming links. It was originally designed as
+     an algorithm to rank web pages.
+
+     Parameters
+     ----------
+     G : graph
+       A NetworkX graph. Undirected graphs will be converted to a directed
+       graph with two directed edges for each undirected edge.
+
+     alpha : float, optional
+       Damping parameter for PageRank, default=0.85.
+
+     personalization: dict, optional
+       The "personalization vector" consisting of a dictionary with a
+       key some subset of graph nodes and personalization value each of those.
+       At least one personalization value must be non-zero.
+       If not specified, a nodes personalization value will be zero.
+       By default, a uniform distribution is used.
+
+     max_iter : integer, optional
+       Maximum number of iterations in power method eigenvalue solver.
+
+     tol : float, optional
+       Error tolerance used to check convergence in power method solver.
+       The iteration will stop after a tolerance of ``len(G) * tol`` is reached.
+
+     nstart : dictionary, optional
+       Starting value of PageRank iteration for each node.
+
+     weight : key, optional
+       Edge data key to use as weight. If None weights are set to 1.
+
+     dangling: dict, optional
+       The outedges to be assigned to any "dangling" nodes, i.e., nodes without
+       any outedges. The dict key is the node the outedge points to and the dict
+       value is the weight of that outedge. By default, dangling nodes are given
+       outedges according to the personalization vector (uniform if not
+       specified) This must be selected to result in an irreducible transition
+       matrix (see notes under google_matrix). It may be common to have the
+       dangling dict to be the same as the personalization dict.
+
+     Returns
+     -------
+     pagerank : dictionary
+       Dictionary of nodes with PageRank as value
+
+     Examples
+     --------
+     >>> from networkx.algorithms.link_analysis.pagerank_alg import _pagerank_scipy
+     >>> G = nx.DiGraph(nx.path_graph(4))
+     >>> pr = _pagerank_scipy(G, alpha=0.9)
+
+     Notes
+     -----
+     The eigenvector calculation uses power iteration with a SciPy
+     sparse matrix representation.
+
+     This implementation works with Multi(Di)Graphs. For multigraphs the
+     weight between two nodes is set to be the sum of all edge weights
+     between those nodes.
+
+     See Also
+     --------
+     pagerank
+
+     Raises
+     ------
+     PowerIterationFailedConvergence
+         If the algorithm fails to converge to the specified tolerance
+         within the specified number of iterations of the power iteration
+         method.
+
+     References
+     ----------
+     .. [1] A. Langville and C. Meyer,
+        "A survey of eigenvector methods of web information retrieval."
+        http://citeseer.ist.psu.edu/713792.html
+     .. [2] Page, Lawrence; Brin, Sergey; Motwani, Rajeev and Winograd, Terry,
+        The PageRank citation ranking: Bringing order to the Web. 1999
+        http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf
+     """
+     import numpy as np
+     import scipy as sp
+
+     N = len(G)
+     if N == 0:
+         return {}
+
+     nodelist = list(G)
+     A = nx.to_scipy_sparse_array(G, nodelist=nodelist, weight=weight, dtype=float)
+     S = A.sum(axis=1)
+     S[S != 0] = 1.0 / S[S != 0]
+     # TODO: csr_array
+     Q = sp.sparse.csr_array(sp.sparse.spdiags(S.T, 0, *A.shape))
+     A = Q @ A
+
+     # initial vector
+     if nstart is None:
+         x = np.repeat(1.0 / N, N)
+     else:
+         x = np.array([nstart.get(n, 0) for n in nodelist], dtype=float)
+         x /= x.sum()
+
+     # Personalization vector
+     if personalization is None:
+         p = np.repeat(1.0 / N, N)
+     else:
+         p = np.array([personalization.get(n, 0) for n in nodelist], dtype=float)
+         if p.sum() == 0:
+             raise ZeroDivisionError
+         p /= p.sum()
+     # Dangling nodes
+     if dangling is None:
+         dangling_weights = p
+     else:
+         # Convert the dangling dictionary into an array in nodelist order
+         dangling_weights = np.array([dangling.get(n, 0) for n in nodelist], dtype=float)
488
+ dangling_weights /= dangling_weights.sum()
489
+ is_dangling = np.where(S == 0)[0]
490
+
491
+ # power iteration: make up to max_iter iterations
492
+ for _ in range(max_iter):
493
+ xlast = x
494
+ x = alpha * (x @ A + sum(x[is_dangling]) * dangling_weights) + (1 - alpha) * p
495
+ # check convergence, l1 norm
496
+ err = np.absolute(x - xlast).sum()
497
+ if err < N * tol:
498
+ return dict(zip(nodelist, map(float, x)))
499
+ raise nx.PowerIterationFailedConvergence(max_iter)
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/__init__.py ADDED
File without changes
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (196 Bytes)
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/__pycache__/test_hits.cpython-310.pyc ADDED
Binary file (2.96 kB)
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/__pycache__/test_pagerank.cpython-310.pyc ADDED
Binary file (7.76 kB)
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/test_hits.py ADDED
@@ -0,0 +1,78 @@
1
+ import pytest
2
+
3
+ import networkx as nx
4
+
5
+ np = pytest.importorskip("numpy")
6
+ sp = pytest.importorskip("scipy")
7
+
8
+ from networkx.algorithms.link_analysis.hits_alg import (
9
+ _hits_numpy,
10
+ _hits_python,
11
+ _hits_scipy,
12
+ )
13
+
14
+ # Example from
15
+ # A. Langville and C. Meyer, "A survey of eigenvector methods of web
16
+ # information retrieval." http://citeseer.ist.psu.edu/713792.html
17
+
18
+
19
+ class TestHITS:
20
+ @classmethod
21
+ def setup_class(cls):
22
+ G = nx.DiGraph()
23
+
24
+ edges = [(1, 3), (1, 5), (2, 1), (3, 5), (5, 4), (5, 3), (6, 5)]
25
+
26
+ G.add_edges_from(edges, weight=1)
27
+ cls.G = G
28
+ cls.G.a = dict(
29
+ zip(sorted(G), [0.000000, 0.000000, 0.366025, 0.133975, 0.500000, 0.000000])
30
+ )
31
+ cls.G.h = dict(
32
+ zip(sorted(G), [0.366025, 0.000000, 0.211325, 0.000000, 0.211325, 0.211325])
33
+ )
34
+
35
+ def test_hits_numpy(self):
36
+ G = self.G
37
+ h, a = _hits_numpy(G)
38
+ for n in G:
39
+ assert h[n] == pytest.approx(G.h[n], abs=1e-4)
40
+ for n in G:
41
+ assert a[n] == pytest.approx(G.a[n], abs=1e-4)
42
+
43
+ @pytest.mark.parametrize("hits_alg", (nx.hits, _hits_python, _hits_scipy))
44
+ def test_hits(self, hits_alg):
45
+ G = self.G
46
+ h, a = hits_alg(G, tol=1.0e-08)
47
+ for n in G:
48
+ assert h[n] == pytest.approx(G.h[n], abs=1e-4)
49
+ for n in G:
50
+ assert a[n] == pytest.approx(G.a[n], abs=1e-4)
51
+ nstart = {i: 1.0 / 2 for i in G}
52
+ h, a = hits_alg(G, nstart=nstart)
53
+ for n in G:
54
+ assert h[n] == pytest.approx(G.h[n], abs=1e-4)
55
+ for n in G:
56
+ assert a[n] == pytest.approx(G.a[n], abs=1e-4)
57
+
58
+ def test_empty(self):
59
+ G = nx.Graph()
60
+ assert nx.hits(G) == ({}, {})
61
+ assert _hits_numpy(G) == ({}, {})
62
+ assert _hits_python(G) == ({}, {})
63
+ assert _hits_scipy(G) == ({}, {})
64
+
65
+ def test_hits_not_convergent(self):
66
+ G = nx.path_graph(50)
67
+ with pytest.raises(nx.PowerIterationFailedConvergence):
68
+ _hits_scipy(G, max_iter=1)
69
+ with pytest.raises(nx.PowerIterationFailedConvergence):
70
+ _hits_python(G, max_iter=1)
71
+ with pytest.raises(nx.PowerIterationFailedConvergence):
72
+ _hits_scipy(G, max_iter=0)
73
+ with pytest.raises(nx.PowerIterationFailedConvergence):
74
+ _hits_python(G, max_iter=0)
75
+ with pytest.raises(nx.PowerIterationFailedConvergence):
76
+ nx.hits(G, max_iter=0)
77
+ with pytest.raises(nx.PowerIterationFailedConvergence):
78
+ nx.hits(G, max_iter=1)
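The hub/authority iteration these tests exercise can be sketched standalone on the same Langville–Meyer example graph. This is a hedged illustration with L1 normalization (matching the networkx output convention); `hits_sketch` is an illustrative name only.

```python
# Hedged pure-Python sketch of the HITS iteration tested above.
def hits_sketch(edges, max_iter=500, tol=1e-8):
    nodes = sorted({n for e in edges for n in e})
    h = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(max_iter):
        hlast = h
        # Authority score: sum of hub scores over in-neighbors.
        a = {n: sum(hlast[u] for u, v in edges if v == n) for n in nodes}
        # Hub score: sum of authority scores over out-neighbors.
        h = {n: sum(a[v] for u, v in edges if u == n) for n in nodes}
        s = sum(h.values())
        h = {n: v / s for n, v in h.items()}  # L1-normalize
        if sum(abs(h[n] - hlast[n]) for n in nodes) < tol:
            s = sum(a.values())
            return h, {n: v / s for n, v in a.items()}
    raise RuntimeError("HITS failed to converge")

# Same edge list as TestHITS.setup_class above.
edges = [(1, 3), (1, 5), (2, 1), (3, 5), (5, 4), (5, 3), (6, 5)]
h, a = hits_sketch(edges)
```

At convergence this reproduces the expected fixture values, e.g. `h[1]` near 0.366025 and `a[5]` near 0.5.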
pythonProject/.venv/Lib/site-packages/networkx/algorithms/link_analysis/tests/test_pagerank.py ADDED
@@ -0,0 +1,217 @@
1
+ import random
2
+
3
+ import pytest
4
+
5
+ import networkx as nx
6
+ from networkx.classes.tests import dispatch_interface
7
+
8
+ np = pytest.importorskip("numpy")
9
+ pytest.importorskip("scipy")
10
+
11
+ from networkx.algorithms.link_analysis.pagerank_alg import (
12
+ _pagerank_numpy,
13
+ _pagerank_python,
14
+ _pagerank_scipy,
15
+ )
16
+
17
+ # Example from
18
+ # A. Langville and C. Meyer, "A survey of eigenvector methods of web
19
+ # information retrieval." http://citeseer.ist.psu.edu/713792.html
20
+
21
+
22
+ class TestPageRank:
23
+ @classmethod
24
+ def setup_class(cls):
25
+ G = nx.DiGraph()
26
+ edges = [
27
+ (1, 2),
28
+ (1, 3),
29
+ # 2 is a dangling node
30
+ (3, 1),
31
+ (3, 2),
32
+ (3, 5),
33
+ (4, 5),
34
+ (4, 6),
35
+ (5, 4),
36
+ (5, 6),
37
+ (6, 4),
38
+ ]
39
+ G.add_edges_from(edges)
40
+ cls.G = G
41
+ cls.G.pagerank = dict(
42
+ zip(
43
+ sorted(G),
44
+ [
45
+ 0.03721197,
46
+ 0.05395735,
47
+ 0.04150565,
48
+ 0.37508082,
49
+ 0.20599833,
50
+ 0.28624589,
51
+ ],
52
+ )
53
+ )
54
+ cls.dangling_node_index = 1
55
+ cls.dangling_edges = {1: 2, 2: 3, 3: 0, 4: 0, 5: 0, 6: 0}
56
+ cls.G.dangling_pagerank = dict(
57
+ zip(
58
+ sorted(G),
59
+ [0.10844518, 0.18618601, 0.0710892, 0.2683668, 0.15919783, 0.20671497],
60
+ )
61
+ )
62
+
63
+ @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python))
64
+ def test_pagerank(self, alg):
65
+ G = self.G
66
+ p = alg(G, alpha=0.9, tol=1.0e-08)
67
+ for n in G:
68
+ assert p[n] == pytest.approx(G.pagerank[n], abs=1e-4)
69
+
70
+ nstart = {n: random.random() for n in G}
71
+ p = alg(G, alpha=0.9, tol=1.0e-08, nstart=nstart)
72
+ for n in G:
73
+ assert p[n] == pytest.approx(G.pagerank[n], abs=1e-4)
74
+
75
+ @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python))
76
+ def test_pagerank_max_iter(self, alg):
77
+ with pytest.raises(nx.PowerIterationFailedConvergence):
78
+ alg(self.G, max_iter=0)
79
+
80
+ def test_numpy_pagerank(self):
81
+ G = self.G
82
+ p = _pagerank_numpy(G, alpha=0.9)
83
+ for n in G:
84
+ assert p[n] == pytest.approx(G.pagerank[n], abs=1e-4)
85
+
86
+ # This additionally tests the @nx._dispatchable mechanism, treating
87
+ # nx.google_matrix as if it were a re-implementation from another package
88
+ @pytest.mark.parametrize("wrapper", [lambda x: x, dispatch_interface.convert])
89
+ def test_google_matrix(self, wrapper):
90
+ G = wrapper(self.G)
91
+ M = nx.google_matrix(G, alpha=0.9, nodelist=sorted(G))
92
+ _, ev = np.linalg.eig(M.T)
93
+ p = ev[:, 0] / ev[:, 0].sum()
94
+ for a, b in zip(p, self.G.pagerank.values()):
95
+ assert a == pytest.approx(b, abs=1e-7)
96
+
97
+ @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python, _pagerank_numpy))
98
+ def test_personalization(self, alg):
99
+ G = nx.complete_graph(4)
100
+ personalize = {0: 1, 1: 1, 2: 4, 3: 4}
101
+ answer = {
102
+ 0: 0.23246732615667579,
103
+ 1: 0.23246732615667579,
104
+ 2: 0.267532673843324,
105
+ 3: 0.2675326738433241,
106
+ }
107
+ p = alg(G, alpha=0.85, personalization=personalize)
108
+ for n in G:
109
+ assert p[n] == pytest.approx(answer[n], abs=1e-4)
110
+
111
+ @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python, nx.google_matrix))
112
+ def test_zero_personalization_vector(self, alg):
113
+ G = nx.complete_graph(4)
114
+ personalize = {0: 0, 1: 0, 2: 0, 3: 0}
115
+ pytest.raises(ZeroDivisionError, alg, G, personalization=personalize)
116
+
117
+ @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python))
118
+ def test_one_nonzero_personalization_value(self, alg):
119
+ G = nx.complete_graph(4)
120
+ personalize = {0: 0, 1: 0, 2: 0, 3: 1}
121
+ answer = {
122
+ 0: 0.22077931820379187,
123
+ 1: 0.22077931820379187,
124
+ 2: 0.22077931820379187,
125
+ 3: 0.3376620453886241,
126
+ }
127
+ p = alg(G, alpha=0.85, personalization=personalize)
128
+ for n in G:
129
+ assert p[n] == pytest.approx(answer[n], abs=1e-4)
130
+
131
+ @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python))
132
+ def test_incomplete_personalization(self, alg):
133
+ G = nx.complete_graph(4)
134
+ personalize = {3: 1}
135
+ answer = {
136
+ 0: 0.22077931820379187,
137
+ 1: 0.22077931820379187,
138
+ 2: 0.22077931820379187,
139
+ 3: 0.3376620453886241,
140
+ }
141
+ p = alg(G, alpha=0.85, personalization=personalize)
142
+ for n in G:
143
+ assert p[n] == pytest.approx(answer[n], abs=1e-4)
144
+
145
+ def test_dangling_matrix(self):
146
+ """
147
+ Tests that the google_matrix doesn't change except for the dangling
148
+ nodes.
149
+ """
150
+ G = self.G
151
+ dangling = self.dangling_edges
152
+ dangling_sum = sum(dangling.values())
153
+ M1 = nx.google_matrix(G, personalization=dangling)
154
+ M2 = nx.google_matrix(G, personalization=dangling, dangling=dangling)
155
+ for i in range(len(G)):
156
+ for j in range(len(G)):
157
+ if i == self.dangling_node_index and (j + 1) in dangling:
158
+ assert M2[i, j] == pytest.approx(
159
+ dangling[j + 1] / dangling_sum, abs=1e-4
160
+ )
161
+ else:
162
+ assert M2[i, j] == pytest.approx(M1[i, j], abs=1e-4)
163
+
164
+ @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python, _pagerank_numpy))
165
+ def test_dangling_pagerank(self, alg):
166
+ pr = alg(self.G, dangling=self.dangling_edges)
167
+ for n in self.G:
168
+ assert pr[n] == pytest.approx(self.G.dangling_pagerank[n], abs=1e-4)
169
+
170
+ def test_empty(self):
171
+ G = nx.Graph()
172
+ assert nx.pagerank(G) == {}
173
+ assert _pagerank_python(G) == {}
174
+ assert _pagerank_numpy(G) == {}
175
+ assert nx.google_matrix(G).shape == (0, 0)
176
+
177
+ @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python))
178
+ def test_multigraph(self, alg):
179
+ G = nx.MultiGraph()
180
+ G.add_edges_from([(1, 2), (1, 2), (1, 2), (2, 3), (2, 3), ("3", 3), ("3", 3)])
181
+ answer = {
182
+ 1: 0.21066048614468322,
183
+ 2: 0.3395308825985378,
184
+ 3: 0.28933951385531687,
185
+ "3": 0.16046911740146227,
186
+ }
187
+ p = alg(G)
188
+ for n in G:
189
+ assert p[n] == pytest.approx(answer[n], abs=1e-4)
190
+
191
+
192
+ class TestPageRankScipy(TestPageRank):
193
+ def test_scipy_pagerank(self):
194
+ G = self.G
195
+ p = _pagerank_scipy(G, alpha=0.9, tol=1.0e-08)
196
+ for n in G:
197
+ assert p[n] == pytest.approx(G.pagerank[n], abs=1e-4)
198
+ personalize = {n: random.random() for n in G}
199
+ p = _pagerank_scipy(G, alpha=0.9, tol=1.0e-08, personalization=personalize)
200
+
201
+ nstart = {n: random.random() for n in G}
202
+ p = _pagerank_scipy(G, alpha=0.9, tol=1.0e-08, nstart=nstart)
203
+ for n in G:
204
+ assert p[n] == pytest.approx(G.pagerank[n], abs=1e-4)
205
+
206
+ def test_scipy_pagerank_max_iter(self):
207
+ with pytest.raises(nx.PowerIterationFailedConvergence):
208
+ _pagerank_scipy(self.G, max_iter=0)
209
+
210
+ def test_dangling_scipy_pagerank(self):
211
+ pr = _pagerank_scipy(self.G, dangling=self.dangling_edges)
212
+ for n in self.G:
213
+ assert pr[n] == pytest.approx(self.G.dangling_pagerank[n], abs=1e-4)
214
+
215
+ def test_empty_scipy(self):
216
+ G = nx.Graph()
217
+ assert _pagerank_scipy(G) == {}
pythonProject/.venv/Lib/site-packages/networkx/algorithms/minors/__init__.py ADDED
@@ -0,0 +1,27 @@
1
+ """
2
+ Subpackages related to graph-minor problems.
3
+
4
+ In graph theory, an undirected graph H is called a minor of the graph G if H
5
+ can be formed from G by deleting edges and vertices and by contracting edges
6
+ [1]_.
7
+
8
+ References
9
+ ----------
10
+ .. [1] https://en.wikipedia.org/wiki/Graph_minor
11
+ """
12
+
13
+ from networkx.algorithms.minors.contraction import (
14
+ contracted_edge,
15
+ contracted_nodes,
16
+ equivalence_classes,
17
+ identified_nodes,
18
+ quotient_graph,
19
+ )
20
+
21
+ __all__ = [
22
+ "contracted_edge",
23
+ "contracted_nodes",
24
+ "equivalence_classes",
25
+ "identified_nodes",
26
+ "quotient_graph",
27
+ ]
pythonProject/.venv/Lib/site-packages/networkx/algorithms/minors/contraction.py ADDED
@@ -0,0 +1,633 @@
1
+ """Provides functions for computing minors of a graph."""
2
+ from itertools import chain, combinations, permutations, product
3
+
4
+ import networkx as nx
5
+ from networkx import density
6
+ from networkx.exception import NetworkXException
7
+ from networkx.utils import arbitrary_element
8
+
9
+ __all__ = [
10
+ "contracted_edge",
11
+ "contracted_nodes",
12
+ "equivalence_classes",
13
+ "identified_nodes",
14
+ "quotient_graph",
15
+ ]
16
+
17
+ chaini = chain.from_iterable
18
+
19
+
20
+ def equivalence_classes(iterable, relation):
21
+ """Returns equivalence classes of `relation` when applied to `iterable`.
22
+
23
+ The equivalence classes, or blocks, consist of objects from `iterable`
24
+ which are all equivalent. They are defined to be equivalent if the
25
+ `relation` function returns `True` when passed any two objects from that
26
+ class, and `False` otherwise. To define an equivalence relation the
27
+ function must be reflexive, symmetric and transitive.
28
+
29
+ Parameters
30
+ ----------
31
+ iterable : list, tuple, or set
32
+ An iterable of elements/nodes.
33
+
34
+ relation : function
35
+ A Boolean-valued function that implements an equivalence relation
36
+ (reflexive, symmetric, transitive binary relation) on the elements
37
+ of `iterable` - it must take two elements and return `True` if
38
+ they are related, or `False` if not.
39
+
40
+ Returns
41
+ -------
42
+ set of frozensets
43
+ A set of frozensets representing the partition induced by the equivalence
44
+ relation function `relation` on the elements of `iterable`. Each
45
+ member set in the return set represents an equivalence class, or
46
+ block, of the partition.
47
+
48
+ Duplicate elements will be ignored so it makes the most sense for
49
+ `iterable` to be a :class:`set`.
50
+
51
+ Notes
52
+ -----
53
+ This function does not check that `relation` represents an equivalence
54
+ relation. You can check that your equivalence classes provide a partition
55
+ using `is_partition`.
56
+
57
+ Examples
58
+ --------
59
+ Let `X` be the set of integers from `0` to `9`, and consider an equivalence
60
+ relation `R` on `X` of congruence modulo `3`: this means that two integers
61
+ `x` and `y` in `X` are equivalent under `R` if they leave the same
62
+ remainder when divided by `3`, i.e. `(x - y) mod 3 = 0`.
63
+
64
+ The equivalence classes of this relation are `{0, 3, 6, 9}`, `{1, 4, 7}`,
65
+ `{2, 5, 8}`: `0`, `3`, `6`, `9` are all divisible by `3` and leave zero
66
+ remainder; `1`, `4`, `7` leave remainder `1`; while `2`, `5` and `8` leave
67
+ remainder `2`. We can see this by calling `equivalence_classes` with
68
+ `X` and a function implementation of `R`.
69
+
70
+ >>> X = set(range(10))
71
+ >>> def mod3(x, y):
72
+ ... return (x - y) % 3 == 0
73
+ >>> equivalence_classes(X, mod3) # doctest: +SKIP
74
+ {frozenset({1, 4, 7}), frozenset({8, 2, 5}), frozenset({0, 9, 3, 6})}
75
+ """
76
+ # For simplicity of implementation, we initialize the return value as a
77
+ # list of lists, then convert it to a set of sets at the end of the
78
+ # function.
79
+ blocks = []
80
+ # Determine the equivalence class for each element of the iterable.
81
+ for y in iterable:
82
+ # Each element y must be in *exactly one* equivalence class.
83
+ #
84
+ # Each block is guaranteed to be non-empty
85
+ for block in blocks:
86
+ x = arbitrary_element(block)
87
+ if relation(x, y):
88
+ block.append(y)
89
+ break
90
+ else:
91
+ # If the element y is not part of any known equivalence class, it
92
+ # must be in its own, so we create a new singleton equivalence
93
+ # class for it.
94
+ blocks.append([y])
95
+ return {frozenset(block) for block in blocks}
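The grouping loop above can be reproduced standalone on the docstring's mod-3 example. This is a hedged sketch: since each block here is a non-empty list, indexing the first element stands in for `arbitrary_element`, and `equivalence_classes_sketch` is an illustrative name.

```python
# Standalone sketch of the block-building loop in equivalence_classes.
def equivalence_classes_sketch(iterable, relation):
    blocks = []
    for y in iterable:
        # Place y in the first block whose representative relates to it.
        for block in blocks:
            if relation(block[0], y):
                block.append(y)
                break
        else:
            blocks.append([y])  # y starts its own singleton block
    return {frozenset(b) for b in blocks}

# Congruence mod 3 on the integers 0..9, as in the docstring.
classes = equivalence_classes_sketch(range(10), lambda x, y: (x - y) % 3 == 0)
```

This yields the three classes `{0, 3, 6, 9}`, `{1, 4, 7}` and `{2, 5, 8}`.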
96
+
97
+
98
+ @nx._dispatchable(edge_attrs="weight", returns_graph=True)
99
+ def quotient_graph(
100
+ G,
101
+ partition,
102
+ edge_relation=None,
103
+ node_data=None,
104
+ edge_data=None,
105
+ weight="weight",
106
+ relabel=False,
107
+ create_using=None,
108
+ ):
109
+ """Returns the quotient graph of `G` under the specified equivalence
110
+ relation on nodes.
111
+
112
+ Parameters
113
+ ----------
114
+ G : NetworkX graph
115
+ The graph for which to return the quotient graph with the
116
+ specified node relation.
117
+
118
+ partition : function, or dict or list of lists, tuples or sets
119
+ If a function, this function must represent an equivalence
120
+ relation on the nodes of `G`. It must take two arguments *u*
121
+ and *v* and return True exactly when *u* and *v* are in the
122
+ same equivalence class. The equivalence classes form the nodes
123
+ in the returned graph.
124
+
125
+ If a dict of lists/tuples/sets, the keys can be any meaningful
126
+ block labels, but the values must be the block lists/tuples/sets
127
+ (one list/tuple/set per block), and the blocks must form a valid
128
+ partition of the nodes of the graph. That is, each node must be
129
+ in exactly one block of the partition.
130
+
131
+ If a list of sets, the list must form a valid partition of
132
+ the nodes of the graph. That is, each node must be in exactly
133
+ one block of the partition.
134
+
135
+ edge_relation : Boolean function with two arguments
136
+ This function must represent an edge relation on the *blocks* of
137
+ the `partition` of `G`. It must take two arguments, *B* and *C*,
138
+ each one a set of nodes, and return True exactly when there should be
139
+ an edge joining block *B* to block *C* in the returned graph.
140
+
141
+ If `edge_relation` is not specified, it is assumed to be the
142
+ following relation. Block *B* is related to block *C* if and
143
+ only if some node in *B* is adjacent to some node in *C*,
144
+ according to the edge set of `G`.
145
+
146
+ node_data : function
147
+ This function takes one argument, *B*, a set of nodes in `G`,
148
+ and must return a dictionary representing the node data
149
+ attributes to set on the node representing *B* in the quotient graph.
150
+ If None, the following node attributes will be set:
151
+
152
+ * 'graph', the subgraph of the graph `G` that this block
153
+ represents,
154
+ * 'nnodes', the number of nodes in this block,
155
+ * 'nedges', the number of edges within this block,
156
+ * 'density', the density of the subgraph of `G` that this
157
+ block represents.
158
+
159
+ edge_data : function
160
+ This function takes two arguments, *B* and *C*, each one a set
161
+ of nodes, and must return a dictionary representing the edge
162
+ data attributes to set on the edge joining *B* and *C*, should
163
+ there be an edge joining *B* and *C* in the quotient graph (if
164
+ no such edge occurs in the quotient graph as determined by
165
+ `edge_relation`, then the output of this function is ignored).
166
+
167
+ If the quotient graph would be a multigraph, this function is
168
+ not applied, since the edge data from each edge in the graph
169
+ `G` appears in the edges of the quotient graph.
170
+
171
+ weight : string or None, optional (default="weight")
172
+ The name of an edge attribute that holds the numerical value
173
+ used as a weight. If None then each edge has weight 1.
174
+
175
+ relabel : bool
176
+ If True, relabel the nodes of the quotient graph to be
177
+ nonnegative integers. Otherwise, the nodes are identified with
178
+ :class:`frozenset` instances representing the blocks given in
179
+ `partition`.
180
+
181
+ create_using : NetworkX graph constructor, optional (default=nx.Graph)
182
+ Graph type to create. If graph instance, then cleared before populated.
183
+
184
+ Returns
185
+ -------
186
+ NetworkX graph
187
+ The quotient graph of `G` under the equivalence relation
188
+ specified by `partition`. If the partition were given as a
189
+ list of :class:`set` instances and `relabel` is False,
190
+ each node will be a :class:`frozenset` corresponding to the same
191
+ :class:`set`.
192
+
193
+ Raises
194
+ ------
195
+ NetworkXException
196
+ If the given partition is not a valid partition of the nodes of
197
+ `G`.
198
+
199
+ Examples
200
+ --------
201
+ The quotient graph of the complete bipartite graph under the "same
202
+ neighbors" equivalence relation is `K_2`. Under this relation, two nodes
203
+ are equivalent if they are not adjacent but have the same neighbor set.
204
+
205
+ >>> G = nx.complete_bipartite_graph(2, 3)
206
+ >>> same_neighbors = lambda u, v: (u not in G[v] and v not in G[u] and G[u] == G[v])
207
+ >>> Q = nx.quotient_graph(G, same_neighbors)
208
+ >>> K2 = nx.complete_graph(2)
209
+ >>> nx.is_isomorphic(Q, K2)
210
+ True
211
+
212
+ The quotient graph of a directed graph under the "same strongly connected
213
+ component" equivalence relation is the condensation of the graph (see
214
+ :func:`condensation`). This example comes from the Wikipedia article
215
+ *`Strongly connected component`_*.
216
+
217
+ >>> G = nx.DiGraph()
218
+ >>> edges = [
219
+ ... "ab",
220
+ ... "be",
221
+ ... "bf",
222
+ ... "bc",
223
+ ... "cg",
224
+ ... "cd",
225
+ ... "dc",
226
+ ... "dh",
227
+ ... "ea",
228
+ ... "ef",
229
+ ... "fg",
230
+ ... "gf",
231
+ ... "hd",
232
+ ... "hf",
233
+ ... ]
234
+ >>> G.add_edges_from(tuple(x) for x in edges)
235
+ >>> components = list(nx.strongly_connected_components(G))
236
+ >>> sorted(sorted(component) for component in components)
237
+ [['a', 'b', 'e'], ['c', 'd', 'h'], ['f', 'g']]
238
+ >>>
239
+ >>> C = nx.condensation(G, components)
240
+ >>> component_of = C.graph["mapping"]
241
+ >>> same_component = lambda u, v: component_of[u] == component_of[v]
242
+ >>> Q = nx.quotient_graph(G, same_component)
243
+ >>> nx.is_isomorphic(C, Q)
244
+ True
245
+
246
+ Node identification can be represented as the quotient of a graph under the
247
+ equivalence relation that places the two nodes in one block and each other
248
+ node in its own singleton block.
249
+
250
+ >>> K24 = nx.complete_bipartite_graph(2, 4)
251
+ >>> K34 = nx.complete_bipartite_graph(3, 4)
252
+ >>> C = nx.contracted_nodes(K34, 1, 2)
253
+ >>> nodes = {1, 2}
254
+ >>> is_contracted = lambda u, v: u in nodes and v in nodes
255
+ >>> Q = nx.quotient_graph(K34, is_contracted)
256
+ >>> nx.is_isomorphic(Q, C)
257
+ True
258
+ >>> nx.is_isomorphic(Q, K24)
259
+ True
260
+
261
+ The blockmodeling technique described in [1]_ can be implemented as a
262
+ quotient graph.
263
+
264
+ >>> G = nx.path_graph(6)
265
+ >>> partition = [{0, 1}, {2, 3}, {4, 5}]
266
+ >>> M = nx.quotient_graph(G, partition, relabel=True)
267
+ >>> list(M.edges())
268
+ [(0, 1), (1, 2)]
269
+
270
+ Here is the same example but using partition as a dict of block sets.
271
+
272
+ >>> G = nx.path_graph(6)
273
+ >>> partition = {0: {0, 1}, 2: {2, 3}, 4: {4, 5}}
274
+ >>> M = nx.quotient_graph(G, partition, relabel=True)
275
+ >>> list(M.edges())
276
+ [(0, 1), (1, 2)]
277
+
278
+ Partitions can be represented in various ways:
279
+
280
+ 0. a list/tuple/set of block lists/tuples/sets
281
+ 1. a dict with block labels as keys and blocks lists/tuples/sets as values
282
+ 2. a dict with block lists/tuples/sets as keys and block labels as values
283
+ 3. a function from nodes in the original iterable to block labels
284
+ 4. an equivalence relation function on the target iterable
285
+
286
+ As `quotient_graph` is designed to accept partitions represented as (0), (1) or
287
+ (4) only, the `equivalence_classes` function can be used to get the partitions
288
+ in the right form, in order to call `quotient_graph`.
289
+
290
+ .. _Strongly connected component: https://en.wikipedia.org/wiki/Strongly_connected_component
291
+
292
+ References
293
+ ----------
294
+ .. [1] Patrick Doreian, Vladimir Batagelj, and Anuska Ferligoj.
295
+ *Generalized Blockmodeling*.
296
+ Cambridge University Press, 2004.
297
+
298
+ """
299
+ # If the user provided an equivalence relation as a function to compute
300
+ # the blocks of the partition on the nodes of G induced by the
301
+ # equivalence relation.
302
+ if callable(partition):
303
+ # equivalence_classes always return partition of whole G.
304
+ partition = equivalence_classes(G, partition)
305
+ if not nx.community.is_partition(G, partition):
306
+ raise nx.NetworkXException(
307
+ "Input `partition` is not an equivalence relation for nodes of G"
308
+ )
309
+ return _quotient_graph(
310
+ G,
311
+ partition,
312
+ edge_relation,
313
+ node_data,
314
+ edge_data,
315
+ weight,
316
+ relabel,
317
+ create_using,
318
+ )
319
+
320
+ # If the partition is a dict, it is assumed to be one where the keys are
321
+ # user-defined block labels, and values are block lists, tuples or sets.
322
+ if isinstance(partition, dict):
323
+ partition = list(partition.values())
324
+
325
+ # If the user provided partition as a collection of sets. Then we
326
+ # need to check if partition covers all of G nodes. If the answer
327
+ # is 'No' then we need to prepare suitable subgraph view.
328
+ partition_nodes = set().union(*partition)
329
+ if len(partition_nodes) != len(G):
330
+ G = G.subgraph(partition_nodes)
331
+ # Each node in the graph/subgraph must be in exactly one block.
332
+ if not nx.community.is_partition(G, partition):
333
+ raise NetworkXException("each node must be in exactly one part of `partition`")
334
+ return _quotient_graph(
335
+ G,
336
+ partition,
337
+ edge_relation,
338
+ node_data,
339
+ edge_data,
340
+ weight,
341
+ relabel,
342
+ create_using,
343
+ )
344
+
345
+
346
+ def _quotient_graph(
347
+ G, partition, edge_relation, node_data, edge_data, weight, relabel, create_using
348
+ ):
349
+ """Construct the quotient graph assuming input has been checked"""
350
+ if create_using is None:
351
+ H = G.__class__()
352
+ else:
353
+ H = nx.empty_graph(0, create_using)
354
+ # By default set some basic information about the subgraph that each block
355
+ # represents on the nodes in the quotient graph.
356
+ if node_data is None:
357
+
358
+ def node_data(b):
359
+ S = G.subgraph(b)
360
+ return {
361
+ "graph": S,
362
+ "nnodes": len(S),
363
+ "nedges": S.number_of_edges(),
364
+ "density": density(S),
365
+ }
366
+
367
+ # Each block of the partition becomes a node in the quotient graph.
368
+ partition = [frozenset(b) for b in partition]
369
+ H.add_nodes_from((b, node_data(b)) for b in partition)
370
+ # By default, the edge relation is the relation defined as follows. B is
371
+ # adjacent to C if a node in B is adjacent to a node in C, according to the
372
+ # edge set of G.
373
+    #
+    # This is not a particularly efficient implementation of this relation:
+    # there are O(n^2) pairs to check and each check may require O(log n) time
+    # (to check set membership). This can certainly be parallelized.
+    if edge_relation is None:
+
+        def edge_relation(b, c):
+            return any(v in G[u] for u, v in product(b, c))
+
+    # By default, sum the weights of the edges joining pairs of nodes across
+    # blocks to get the weight of the edge joining those two blocks.
+    if edge_data is None:
+
+        def edge_data(b, c):
+            edgedata = (
+                d
+                for u, v, d in G.edges(b | c, data=True)
+                if (u in b and v in c) or (u in c and v in b)
+            )
+            return {"weight": sum(d.get(weight, 1) for d in edgedata)}
+
+    block_pairs = permutations(H, 2) if H.is_directed() else combinations(H, 2)
+    # In a multigraph, add one edge in the quotient graph for each edge
+    # in the original graph.
+    if H.is_multigraph():
+        edges = chaini(
+            (
+                (b, c, G.get_edge_data(u, v, default={}))
+                for u, v in product(b, c)
+                if v in G[u]
+            )
+            for b, c in block_pairs
+            if edge_relation(b, c)
+        )
+    # In a simple graph, apply the edge data function to each pair of
+    # blocks to determine the edge data attributes to apply to each edge
+    # in the quotient graph.
+    else:
+        edges = (
+            (b, c, edge_data(b, c)) for (b, c) in block_pairs if edge_relation(b, c)
+        )
+    H.add_edges_from(edges)
+    # If requested by the user, relabel the nodes to be integers,
+    # numbered in increasing order from zero in the same order as the
+    # iteration order of `partition`.
+    if relabel:
+        # Can't use nx.convert_node_labels_to_integers() here since we
+        # want the order of iteration to be the same for backward
+        # compatibility with the nx.blockmodel() function.
+        labels = {b: i for i, b in enumerate(partition)}
+        H = nx.relabel_nodes(H, labels)
+    return H
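The edge-construction logic above (default `edge_relation` joins two blocks whenever any cross-block edge exists in `G`, and the default `edge_data` sums the weights of those cross-block edges) can be exercised with a short illustrative example via the public `nx.quotient_graph` API; the partition chosen here is arbitrary:

```python
import networkx as nx

# Partition the 4-cycle (edges 0-1, 1-2, 2-3, 3-0) into two blocks.
# The cross-block edges (1, 2) and (3, 0) produce a single quotient edge
# whose default weight is the number of contributing edges.
G = nx.cycle_graph(4)
partition = [{0, 1}, {2, 3}]
Q = nx.quotient_graph(G, partition, relabel=True)
print(sorted(Q.nodes()))          # [0, 1]
print(sorted(Q.edges()))          # [(0, 1)]
print(Q.edges[0, 1]["weight"])    # 2
```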
+
+
+@nx._dispatchable(
+    preserve_all_attrs=True, mutates_input={"not copy": 4}, returns_graph=True
+)
+def contracted_nodes(G, u, v, self_loops=True, copy=True):
+    """Returns the graph that results from contracting `u` and `v`.
+
+    Node contraction identifies the two nodes as a single node incident to any
+    edge that was incident to the original two nodes.
+
+    Parameters
+    ----------
+    G : NetworkX graph
+        The graph whose nodes will be contracted.
+
+    u, v : nodes
+        Must be nodes in `G`.
+
+    self_loops : Boolean
+        If this is True, any edges joining `u` and `v` in `G` become
+        self-loops on the new node in the returned graph.
+
+    copy : Boolean
+        If this is True (default True), make a copy of
+        `G` and return that instead of directly changing `G`.
+
+    Returns
+    -------
+    Networkx graph
+        If copy is True,
+            A new graph object of the same type as `G` (leaving `G` unmodified)
+            with `u` and `v` identified in a single node. The right node `v`
+            will be merged into the node `u`, so only `u` will appear in the
+            returned graph.
+        If copy is False,
+            Modifies `G` with `u` and `v` identified in a single node.
+            The right node `v` will be merged into the node `u`, so
+            only `u` will appear in the returned graph.
+
+    Notes
+    -----
+    For multigraphs, the edge keys for the realigned edges may
+    not be the same as the edge keys for the old edges. This is
+    natural because edge keys are unique only within each pair of nodes.
+
+    For non-multigraphs where `u` and `v` are adjacent to a third node
+    `w`, the edge (`v`, `w`) will be contracted into the edge (`u`,
+    `w`) with its attributes stored into a "contraction" attribute.
+
+    This function is also available as `identified_nodes`.
+
+    Examples
+    --------
+    Contracting two nonadjacent nodes of the cycle graph on four nodes `C_4`
+    yields the path graph (ignoring parallel edges):
+
+    >>> G = nx.cycle_graph(4)
+    >>> M = nx.contracted_nodes(G, 1, 3)
+    >>> P3 = nx.path_graph(3)
+    >>> nx.is_isomorphic(M, P3)
+    True
+
+    >>> G = nx.MultiGraph(P3)
+    >>> M = nx.contracted_nodes(G, 0, 2)
+    >>> M.edges
+    MultiEdgeView([(0, 1, 0), (0, 1, 1)])
+
+    >>> G = nx.Graph([(1, 2), (2, 2)])
+    >>> H = nx.contracted_nodes(G, 1, 2, self_loops=False)
+    >>> list(H.nodes())
+    [1]
+    >>> list(H.edges())
+    [(1, 1)]
+
+    In a ``MultiDiGraph`` with a self loop, the in and out edges will
+    be treated separately as edges, so while contracting a node which
+    has a self loop the contraction will add multiple edges:
+
+    >>> G = nx.MultiDiGraph([(1, 2), (2, 2)])
+    >>> H = nx.contracted_nodes(G, 1, 2)
+    >>> list(H.edges())  # edge 1->2, 2->2, 2<-2 from the original Graph G
+    [(1, 1), (1, 1), (1, 1)]
+    >>> H = nx.contracted_nodes(G, 1, 2, self_loops=False)
+    >>> list(H.edges())  # edge 2->2, 2<-2 from the original Graph G
+    [(1, 1), (1, 1)]
+
+    See Also
+    --------
+    contracted_edge
+    quotient_graph
+
+    """
+    # Copying has significant overhead and can be disabled if needed
+    if copy:
+        H = G.copy()
+    else:
+        H = G
+
+    # edge code uses G.edges(v) instead of G.adj[v] to handle multiedges
+    if H.is_directed():
+        edges_to_remap = chain(G.in_edges(v, data=True), G.out_edges(v, data=True))
+    else:
+        edges_to_remap = G.edges(v, data=True)
+
+    # If H is G (copy=False), the generators change as H changes.
+    # Materializing them makes edges_to_remap independent of H.
+    if not copy:
+        edges_to_remap = list(edges_to_remap)
+
+    v_data = H.nodes[v]
+    H.remove_node(v)
+
+    for prev_w, prev_x, d in edges_to_remap:
+        w = prev_w if prev_w != v else u
+        x = prev_x if prev_x != v else u
+
+        if ({prev_w, prev_x} == {u, v}) and not self_loops:
+            continue
+
+        if not H.has_edge(w, x) or G.is_multigraph():
+            H.add_edge(w, x, **d)
+        else:
+            if "contraction" in H.edges[(w, x)]:
+                H.edges[(w, x)]["contraction"][(prev_w, prev_x)] = d
+            else:
+                H.edges[(w, x)]["contraction"] = {(prev_w, prev_x): d}
+
+    if "contraction" in H.nodes[u]:
+        H.nodes[u]["contraction"][v] = v_data
+    else:
+        H.nodes[u]["contraction"] = {v: v_data}
+    return H
+
+
+identified_nodes = contracted_nodes
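The "contraction" bookkeeping in the body above (non-multigraph branch: a remapped edge that collides with an existing edge is folded into a `"contraction"` attribute rather than overwriting it) can be seen directly in a small example; the graph and weights here are arbitrary:

```python
import networkx as nx

# u=1 and v=2 share the neighbor 3. Contracting v into u remaps (2, 3)
# onto the already-existing edge (1, 3), so the (2, 3) data is stored
# under the "contraction" edge attribute instead of replacing the weight.
G = nx.Graph()
G.add_edge(1, 3, weight=5)
G.add_edge(2, 3, weight=7)
H = nx.contracted_nodes(G, 1, 2)
print(sorted(H.nodes()))             # [1, 3]
print(H.edges[1, 3]["weight"])       # 5
print(H.edges[1, 3]["contraction"])  # {(2, 3): {'weight': 7}}
print(H.nodes[1]["contraction"])     # {2: {}}
```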
+
+
+@nx._dispatchable(
+    preserve_edge_attrs=True, mutates_input={"not copy": 3}, returns_graph=True
+)
+def contracted_edge(G, edge, self_loops=True, copy=True):
+    """Returns the graph that results from contracting the specified edge.
+
+    Edge contraction identifies the two endpoints of the edge as a single node
+    incident to any edge that was incident to the original two nodes. A graph
+    that results from edge contraction is called a *minor* of the original
+    graph.
+
+    Parameters
+    ----------
+    G : NetworkX graph
+        The graph whose edge will be contracted.
+
+    edge : tuple
+        Must be a pair of nodes in `G`.
+
+    self_loops : Boolean
+        If this is True, any edges (including `edge`) joining the
+        endpoints of `edge` in `G` become self-loops on the new node in the
+        returned graph.
+
+    copy : Boolean (default True)
+        If this is True, the contraction will be performed on a copy of `G`;
+        otherwise the contraction will happen in place.
+
+    Returns
+    -------
+    Networkx graph
+        A new graph object of the same type as `G` (leaving `G` unmodified)
+        with endpoints of `edge` identified in a single node. The right node
+        of `edge` will be merged into the left one, so only the left one will
+        appear in the returned graph.
+
+    Raises
+    ------
+    ValueError
+        If `edge` is not an edge in `G`.
+
+    Examples
+    --------
+    Attempting to contract two nonadjacent nodes yields an error:
+
+    >>> G = nx.cycle_graph(4)
+    >>> nx.contracted_edge(G, (1, 3))
+    Traceback (most recent call last):
+      ...
+    ValueError: Edge (1, 3) does not exist in graph G; cannot contract it
+
+    Contracting two adjacent nodes in the cycle graph on *n* nodes yields the
+    cycle graph on *n - 1* nodes:
+
+    >>> C5 = nx.cycle_graph(5)
+    >>> C4 = nx.cycle_graph(4)
+    >>> M = nx.contracted_edge(C5, (0, 1), self_loops=False)
+    >>> nx.is_isomorphic(M, C4)
+    True
+
+    See also
+    --------
+    contracted_nodes
+    quotient_graph
+
+    """
+    u, v = edge[:2]
+    if not G.has_edge(u, v):
+        raise ValueError(f"Edge {edge} does not exist in graph G; cannot contract it")
+    return contracted_nodes(G, u, v, self_loops=self_loops, copy=copy)
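Since `contracted_edge` simply delegates to `contracted_nodes`, the `copy=False` path documented above can be demonstrated with a short in-place example (the choice of graph is arbitrary):

```python
import networkx as nx

# With copy=False, the contraction mutates G itself and returns it:
# C5 collapses to C4 without allocating a copy of the graph.
G = nx.cycle_graph(5)
H = nx.contracted_edge(G, (0, 1), self_loops=False, copy=False)
print(H is G)                                   # True
print(nx.is_isomorphic(G, nx.cycle_graph(4)))   # True
```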