ZTWHHH committed
Commit c7173d3 · verified · 1 Parent(s): 3a72245

Add files using upload-large-folder tool

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/INSTALLER +1 -0
  2. valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/LICENSE +279 -0
  3. valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/METADATA +126 -0
  4. valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/RECORD +18 -0
  5. valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/REQUESTED +0 -0
  6. valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/WHEEL +4 -0
  7. valley/lib/python3.10/site-packages/cpuinfo/__init__.py +5 -0
  8. valley/lib/python3.10/site-packages/cpuinfo/__main__.py +5 -0
  9. valley/lib/python3.10/site-packages/cpuinfo/__pycache__/__init__.cpython-310.pyc +0 -0
  10. valley/lib/python3.10/site-packages/cpuinfo/__pycache__/__main__.cpython-310.pyc +0 -0
  11. valley/lib/python3.10/site-packages/cpuinfo/__pycache__/cpuinfo.cpython-310.pyc +0 -0
  12. valley/lib/python3.10/site-packages/cpuinfo/cpuinfo.py +2827 -0
  13. valley/lib/python3.10/site-packages/ffmpy-0.4.0.dist-info/INSTALLER +1 -0
  14. valley/lib/python3.10/site-packages/ffmpy-0.4.0.dist-info/RECORD +9 -0
  15. valley/lib/python3.10/site-packages/ffmpy-0.4.0.dist-info/REQUESTED +0 -0
  16. valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/AUTHORS +98 -0
  17. valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/INSTALLER +1 -0
  18. valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/LICENSE +20 -0
  19. valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/METADATA +37 -0
  20. valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/RECORD +30 -0
  21. valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/REQUESTED +0 -0
  22. valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/WHEEL +6 -0
  23. valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/top_level.txt +1 -0
  24. valley/lib/python3.10/site-packages/referencing/__init__.py +7 -0
  25. valley/lib/python3.10/site-packages/referencing/__pycache__/jsonschema.cpython-310.pyc +0 -0
  26. valley/lib/python3.10/site-packages/referencing/__pycache__/typing.cpython-310.pyc +0 -0
  27. valley/lib/python3.10/site-packages/referencing/_attrs.py +31 -0
  28. valley/lib/python3.10/site-packages/referencing/_attrs.pyi +20 -0
  29. valley/lib/python3.10/site-packages/referencing/_core.py +729 -0
  30. valley/lib/python3.10/site-packages/referencing/exceptions.py +165 -0
  31. valley/lib/python3.10/site-packages/referencing/jsonschema.py +642 -0
  32. valley/lib/python3.10/site-packages/referencing/py.typed +0 -0
  33. valley/lib/python3.10/site-packages/referencing/retrieval.py +87 -0
  34. valley/lib/python3.10/site-packages/referencing/tests/__init__.py +0 -0
  35. valley/lib/python3.10/site-packages/referencing/tests/__pycache__/test_core.cpython-310.pyc +0 -0
  36. valley/lib/python3.10/site-packages/referencing/tests/__pycache__/test_jsonschema.cpython-310.pyc +0 -0
  37. valley/lib/python3.10/site-packages/referencing/tests/test_core.py +1057 -0
  38. valley/lib/python3.10/site-packages/referencing/tests/test_exceptions.py +34 -0
  39. valley/lib/python3.10/site-packages/referencing/tests/test_jsonschema.py +382 -0
  40. valley/lib/python3.10/site-packages/referencing/tests/test_referencing_suite.py +66 -0
  41. valley/lib/python3.10/site-packages/referencing/tests/test_retrieval.py +106 -0
  42. valley/lib/python3.10/site-packages/referencing/typing.py +63 -0
  43. valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/INSTALLER +1 -0
  44. valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/LICENSE +3 -0
  45. valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/LICENSE.APACHE2 +202 -0
  46. valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/LICENSE.MIT +20 -0
  47. valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/METADATA +104 -0
  48. valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/RECORD +20 -0
  49. valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/REQUESTED +0 -0
  50. valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/WHEEL +5 -0
valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ pip
valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/LICENSE ADDED
@@ -0,0 +1,279 @@
+ A. HISTORY OF THE SOFTWARE
+ ==========================
+
+ Python was created in the early 1990s by Guido van Rossum at Stichting
+ Mathematisch Centrum (CWI, see https://www.cwi.nl) in the Netherlands
+ as a successor of a language called ABC. Guido remains Python's
+ principal author, although it includes many contributions from others.
+
+ In 1995, Guido continued his work on Python at the Corporation for
+ National Research Initiatives (CNRI, see https://www.cnri.reston.va.us)
+ in Reston, Virginia where he released several versions of the
+ software.
+
+ In May 2000, Guido and the Python core development team moved to
+ BeOpen.com to form the BeOpen PythonLabs team. In October of the same
+ year, the PythonLabs team moved to Digital Creations, which became
+ Zope Corporation. In 2001, the Python Software Foundation (PSF, see
+ https://www.python.org/psf/) was formed, a non-profit organization
+ created specifically to own Python-related Intellectual Property.
+ Zope Corporation was a sponsoring member of the PSF.
+
+ All Python releases are Open Source (see https://opensource.org for
+ the Open Source Definition). Historically, most, but not all, Python
+ releases have also been GPL-compatible; the table below summarizes
+ the various releases.
+
+     Release         Derived     Year        Owner       GPL-
+                     from                                compatible? (1)
+
+     0.9.0 thru 1.2              1991-1995   CWI         yes
+     1.3 thru 1.5.2  1.2         1995-1999   CNRI        yes
+     1.6             1.5.2       2000        CNRI        no
+     2.0             1.6         2000        BeOpen.com  no
+     1.6.1           1.6         2001        CNRI        yes (2)
+     2.1             2.0+1.6.1   2001        PSF         no
+     2.0.1           2.0+1.6.1   2001        PSF         yes
+     2.1.1           2.1+2.0.1   2001        PSF         yes
+     2.1.2           2.1.1       2002        PSF         yes
+     2.1.3           2.1.2       2002        PSF         yes
+     2.2 and above   2.1.1       2001-now    PSF         yes
+
+ Footnotes:
+
+ (1) GPL-compatible doesn't mean that we're distributing Python under
+ the GPL. All Python licenses, unlike the GPL, let you distribute
+ a modified version without making your changes open source. The
+ GPL-compatible licenses make it possible to combine Python with
+ other software that is released under the GPL; the others don't.
+
+ (2) According to Richard Stallman, 1.6.1 is not GPL-compatible,
+ because its license has a choice of law clause. According to
+ CNRI, however, Stallman's lawyer has told CNRI's lawyer that 1.6.1
+ is "not incompatible" with the GPL.
+
+ Thanks to the many outside volunteers who have worked under Guido's
+ direction to make these releases possible.
+
+
+ B. TERMS AND CONDITIONS FOR ACCESSING OR OTHERWISE USING PYTHON
+ ===============================================================
+
+ Python software and documentation are licensed under the
+ Python Software Foundation License Version 2.
+
+ Starting with Python 3.8.6, examples, recipes, and other code in
+ the documentation are dual licensed under the PSF License Version 2
+ and the Zero-Clause BSD license.
+
+ Some software incorporated into Python is under different licenses.
+ The licenses are listed with code falling under that license.
+
+
+ PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
+ --------------------------------------------
+
+ 1. This LICENSE AGREEMENT is between the Python Software Foundation
+ ("PSF"), and the Individual or Organization ("Licensee") accessing and
+ otherwise using this software ("Python") in source or binary form and
+ its associated documentation.
+
+ 2. Subject to the terms and conditions of this License Agreement, PSF hereby
+ grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
+ analyze, test, perform and/or display publicly, prepare derivative works,
+ distribute, and otherwise use Python alone or in any derivative version,
+ provided, however, that PSF's License Agreement and PSF's notice of copyright,
+ i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
+ 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023 Python Software Foundation;
+ All Rights Reserved" are retained in Python alone or in any derivative version
+ prepared by Licensee.
+
+ 3. In the event Licensee prepares a derivative work that is based on
+ or incorporates Python or any part thereof, and wants to make
+ the derivative work available to others as provided herein, then
+ Licensee hereby agrees to include in any such work a brief summary of
+ the changes made to Python.
+
+ 4. PSF is making Python available to Licensee on an "AS IS"
+ basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
+ IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
+ DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
+ FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
+ INFRINGE ANY THIRD PARTY RIGHTS.
+
+ 5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
+ FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
+ A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
+ OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
+
+ 6. This License Agreement will automatically terminate upon a material
+ breach of its terms and conditions.
+
+ 7. Nothing in this License Agreement shall be deemed to create any
+ relationship of agency, partnership, or joint venture between PSF and
+ Licensee. This License Agreement does not grant permission to use PSF
+ trademarks or trade name in a trademark sense to endorse or promote
+ products or services of Licensee, or any third party.
+
+ 8. By copying, installing or otherwise using Python, Licensee
+ agrees to be bound by the terms and conditions of this License
+ Agreement.
+
+
+ BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0
+ -------------------------------------------
+
+ BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1
+
+ 1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an
+ office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the
+ Individual or Organization ("Licensee") accessing and otherwise using
+ this software in source or binary form and its associated
+ documentation ("the Software").
+
+ 2. Subject to the terms and conditions of this BeOpen Python License
+ Agreement, BeOpen hereby grants Licensee a non-exclusive,
+ royalty-free, world-wide license to reproduce, analyze, test, perform
+ and/or display publicly, prepare derivative works, distribute, and
+ otherwise use the Software alone or in any derivative version,
+ provided, however, that the BeOpen Python License is retained in the
+ Software, alone or in any derivative version prepared by Licensee.
+
+ 3. BeOpen is making the Software available to Licensee on an "AS IS"
+ basis. BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
+ IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND
+ DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
+ FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT
+ INFRINGE ANY THIRD PARTY RIGHTS.
+
+ 4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE
+ SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS
+ AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY
+ DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
+
+ 5. This License Agreement will automatically terminate upon a material
+ breach of its terms and conditions.
+
+ 6. This License Agreement shall be governed by and interpreted in all
+ respects by the law of the State of California, excluding conflict of
+ law provisions. Nothing in this License Agreement shall be deemed to
+ create any relationship of agency, partnership, or joint venture
+ between BeOpen and Licensee. This License Agreement does not grant
+ permission to use BeOpen trademarks or trade names in a trademark
+ sense to endorse or promote products or services of Licensee, or any
+ third party. As an exception, the "BeOpen Python" logos available at
+ http://www.pythonlabs.com/logos.html may be used according to the
+ permissions granted on that web page.
+
+ 7. By copying, installing or otherwise using the software, Licensee
+ agrees to be bound by the terms and conditions of this License
+ Agreement.
+
+
+ CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1
+ ---------------------------------------
+
+ 1. This LICENSE AGREEMENT is between the Corporation for National
+ Research Initiatives, having an office at 1895 Preston White Drive,
+ Reston, VA 20191 ("CNRI"), and the Individual or Organization
+ ("Licensee") accessing and otherwise using Python 1.6.1 software in
+ source or binary form and its associated documentation.
+
+ 2. Subject to the terms and conditions of this License Agreement, CNRI
+ hereby grants Licensee a nonexclusive, royalty-free, world-wide
+ license to reproduce, analyze, test, perform and/or display publicly,
+ prepare derivative works, distribute, and otherwise use Python 1.6.1
+ alone or in any derivative version, provided, however, that CNRI's
+ License Agreement and CNRI's notice of copyright, i.e., "Copyright (c)
+ 1995-2001 Corporation for National Research Initiatives; All Rights
+ Reserved" are retained in Python 1.6.1 alone or in any derivative
+ version prepared by Licensee. Alternately, in lieu of CNRI's License
+ Agreement, Licensee may substitute the following text (omitting the
+ quotes): "Python 1.6.1 is made available subject to the terms and
+ conditions in CNRI's License Agreement. This Agreement together with
+ Python 1.6.1 may be located on the internet using the following
+ unique, persistent identifier (known as a handle): 1895.22/1013. This
+ Agreement may also be obtained from a proxy server on the internet
+ using the following URL: http://hdl.handle.net/1895.22/1013".
+
+ 3. In the event Licensee prepares a derivative work that is based on
+ or incorporates Python 1.6.1 or any part thereof, and wants to make
+ the derivative work available to others as provided herein, then
+ Licensee hereby agrees to include in any such work a brief summary of
+ the changes made to Python 1.6.1.
+
+ 4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS"
+ basis. CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
+ IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND
+ DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
+ FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT
+ INFRINGE ANY THIRD PARTY RIGHTS.
+
+ 5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
+ 1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
+ A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1,
+ OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
+
+ 6. This License Agreement will automatically terminate upon a material
+ breach of its terms and conditions.
+
+ 7. This License Agreement shall be governed by the federal
+ intellectual property law of the United States, including without
+ limitation the federal copyright law, and, to the extent such
+ U.S. federal law does not apply, by the law of the Commonwealth of
+ Virginia, excluding Virginia's conflict of law provisions.
+ Notwithstanding the foregoing, with regard to derivative works based
+ on Python 1.6.1 that incorporate non-separable material that was
+ previously distributed under the GNU General Public License (GPL), the
+ law of the Commonwealth of Virginia shall govern this License
+ Agreement only as to issues arising under or with respect to
+ Paragraphs 4, 5, and 7 of this License Agreement. Nothing in this
+ License Agreement shall be deemed to create any relationship of
+ agency, partnership, or joint venture between CNRI and Licensee. This
+ License Agreement does not grant permission to use CNRI trademarks or
+ trade name in a trademark sense to endorse or promote products or
+ services of Licensee, or any third party.
+
+ 8. By clicking on the "ACCEPT" button where indicated, or by copying,
+ installing or otherwise using Python 1.6.1, Licensee agrees to be
+ bound by the terms and conditions of this License Agreement.
+
+ ACCEPT
+
+
+ CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2
+ --------------------------------------------------
+
+ Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam,
+ The Netherlands. All rights reserved.
+
+ Permission to use, copy, modify, and distribute this software and its
+ documentation for any purpose and without fee is hereby granted,
+ provided that the above copyright notice appear in all copies and that
+ both that copyright notice and this permission notice appear in
+ supporting documentation, and that the name of Stichting Mathematisch
+ Centrum or CWI not be used in advertising or publicity pertaining to
+ distribution of the software without specific, written prior
+ permission.
+
+ STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
+ THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
+ FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
+ FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
+ OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+
+ ZERO-CLAUSE BSD LICENSE FOR CODE IN THE PYTHON DOCUMENTATION
+ ----------------------------------------------------------------------
+
+ Permission to use, copy, modify, and/or distribute this software for any
+ purpose with or without fee is hereby granted.
+
+ THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
+ REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
+ AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
+ INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
+ LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
+ OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
+ PERFORMANCE OF THIS SOFTWARE.
valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/METADATA ADDED
@@ -0,0 +1,126 @@
+ Metadata-Version: 2.1
+ Name: aiohappyeyeballs
+ Version: 2.4.2
+ Summary: Happy Eyeballs for asyncio
+ Home-page: https://github.com/aio-libs/aiohappyeyeballs
+ License: Python-2.0.1
+ Author: J. Nick Koston
+ Author-email: nick@koston.org
+ Requires-Python: >=3.8
+ Classifier: Development Status :: 5 - Production/Stable
+ Classifier: Intended Audience :: Developers
+ Classifier: License :: OSI Approved :: Python Software Foundation License
+ Classifier: License :: Other/Proprietary License
+ Classifier: Natural Language :: English
+ Classifier: Operating System :: OS Independent
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.8
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
+ Classifier: Topic :: Software Development :: Libraries
+ Project-URL: Bug Tracker, https://github.com/aio-libs/aiohappyeyeballs/issues
+ Project-URL: Changelog, https://github.com/aio-libs/aiohappyeyeballs/blob/main/CHANGELOG.md
+ Project-URL: Documentation, https://aiohappyeyeballs.readthedocs.io
+ Project-URL: Repository, https://github.com/aio-libs/aiohappyeyeballs
+ Description-Content-Type: text/markdown
+
+ # aiohappyeyeballs
+
+ <p align="center">
+ <a href="https://github.com/aio-libs/aiohappyeyeballs/actions/workflows/ci.yml?query=branch%3Amain">
+ <img src="https://img.shields.io/github/actions/workflow/status/aio-libs/aiohappyeyeballs/ci-cd.yml?branch=main&label=CI&logo=github&style=flat-square" alt="CI Status" >
+ </a>
+ <a href="https://aiohappyeyeballs.readthedocs.io">
+ <img src="https://img.shields.io/readthedocs/aiohappyeyeballs.svg?logo=read-the-docs&logoColor=fff&style=flat-square" alt="Documentation Status">
+ </a>
+ <a href="https://codecov.io/gh/aio-libs/aiohappyeyeballs">
+ <img src="https://img.shields.io/codecov/c/github/aio-libs/aiohappyeyeballs.svg?logo=codecov&logoColor=fff&style=flat-square" alt="Test coverage percentage">
+ </a>
+ </p>
+ <p align="center">
+ <a href="https://python-poetry.org/">
+ <img src="https://img.shields.io/badge/packaging-poetry-299bd7?style=flat-square&logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAA4AAAASCAYAAABrXO8xAAAACXBIWXMAAAsTAAALEwEAmpwYAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAJJSURBVHgBfZLPa1NBEMe/s7tNXoxW1KJQKaUHkXhQvHgW6UHQQ09CBS/6V3hKc/AP8CqCrUcpmop3Cx48eDB4yEECjVQrlZb80CRN8t6OM/teagVxYZi38+Yz853dJbzoMV3MM8cJUcLMSUKIE8AzQ2PieZzFxEJOHMOgMQQ+dUgSAckNXhapU/NMhDSWLs1B24A8sO1xrN4NECkcAC9ASkiIJc6k5TRiUDPhnyMMdhKc+Zx19l6SgyeW76BEONY9exVQMzKExGKwwPsCzza7KGSSWRWEQhyEaDXp6ZHEr416ygbiKYOd7TEWvvcQIeusHYMJGhTwF9y7sGnSwaWyFAiyoxzqW0PM/RjghPxF2pWReAowTEXnDh0xgcLs8l2YQmOrj3N7ByiqEoH0cARs4u78WgAVkoEDIDoOi3AkcLOHU60RIg5wC4ZuTC7FaHKQm8Hq1fQuSOBvX/sodmNJSB5geaF5CPIkUeecdMxieoRO5jz9bheL6/tXjrwCyX/UYBUcjCaWHljx1xiX6z9xEjkYAzbGVnB8pvLmyXm9ep+W8CmsSHQQY77Zx1zboxAV0w7ybMhQmfqdmmw3nEp1I0Z+FGO6M8LZdoyZnuzzBdjISicKRnpxzI9fPb+0oYXsNdyi+d3h9bm9MWYHFtPeIZfLwzmFDKy1ai3p+PDls1Llz4yyFpferxjnyjJDSEy9CaCx5m2cJPerq6Xm34eTrZt3PqxYO1XOwDYZrFlH1fWnpU38Y9HRze3lj0vOujZcXKuuXm3jP+s3KbZVra7y2EAAAAAASUVORK5CYII=" alt="Poetry">
+ </a>
+ <a href="https://github.com/ambv/black">
+ <img src="https://img.shields.io/badge/code%20style-black-000000.svg?style=flat-square" alt="black">
+ </a>
+ <a href="https://github.com/pre-commit/pre-commit">
+ <img src="https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white&style=flat-square" alt="pre-commit">
+ </a>
+ </p>
+ <p align="center">
+ <a href="https://pypi.org/project/aiohappyeyeballs/">
+ <img src="https://img.shields.io/pypi/v/aiohappyeyeballs.svg?logo=python&logoColor=fff&style=flat-square" alt="PyPI Version">
+ </a>
+ <img src="https://img.shields.io/pypi/pyversions/aiohappyeyeballs.svg?style=flat-square&logo=python&amp;logoColor=fff" alt="Supported Python versions">
+ <img src="https://img.shields.io/pypi/l/aiohappyeyeballs.svg?style=flat-square" alt="License">
+ </p>
+
+ ---
+
+ **Documentation**: <a href="https://aiohappyeyeballs.readthedocs.io" target="_blank">https://aiohappyeyeballs.readthedocs.io </a>
+
+ **Source Code**: <a href="https://github.com/aio-libs/aiohappyeyeballs" target="_blank">https://github.com/aio-libs/aiohappyeyeballs </a>
+
+ ---
+
+ [Happy Eyeballs](https://en.wikipedia.org/wiki/Happy_Eyeballs)
+ ([RFC 8305](https://www.rfc-editor.org/rfc/rfc8305.html))
+
+ ## Use case
+
+ This library exists to allow connecting with
+ [Happy Eyeballs](https://en.wikipedia.org/wiki/Happy_Eyeballs)
+ ([RFC 8305](https://www.rfc-editor.org/rfc/rfc8305.html))
+ when you
+ already have a list of addrinfo and not a DNS name.
+
+ The stdlib version of `loop.create_connection()`
+ will only work when you pass in an unresolved name which
+ is not a good fit when using DNS caching or resolving
+ names via another method such as `zeroconf`.
+
+ ## Installation
+
+ Install this via pip (or your favourite package manager):
+
+ `pip install aiohappyeyeballs`
+
+ ## License
+
+ [aiohappyeyeballs is licensed under the same terms as cpython itself.](https://github.com/python/cpython/blob/main/LICENSE)
+
+ ## Example usage
+
+ ```python
+
+ addr_infos = await loop.getaddrinfo("example.org", 80)
+
+ socket = await start_connection(addr_infos)
+ socket = await start_connection(addr_infos, local_addr_infos=local_addr_infos, happy_eyeballs_delay=0.2)
+
+ transport, protocol = await loop.create_connection(
+ MyProtocol, sock=socket, ...)
+
+ # Remove the first address for each family from addr_info
+ pop_addr_infos_interleave(addr_info, 1)
+
+ # Remove all matching address from addr_info
+ remove_addr_infos(addr_info, "dead::beef::")
+
+ # Convert a local_addr to local_addr_infos
+ local_addr_infos = addr_to_addr_infos(("127.0.0.1",0))
+ ```
+
+ ## Credits
+
+ This package contains code from cpython and is licensed under the same terms as cpython itself.
+
+ This package was created with
+ [Copier](https://copier.readthedocs.io/) and the
+ [browniebroke/pypackage-template](https://github.com/browniebroke/pypackage-template)
+ project template.
+
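The README above relies on aiohappyeyeballs' `start_connection`. As a rough, stdlib-only illustration of the staggered-attempt idea behind it (RFC 8305), here is a sketch; `staggered_attempts` is a hypothetical helper written for this note, not the library's actual API or implementation:

```python
import asyncio


async def staggered_attempts(coros, delay):
    """Start each coroutine `delay` seconds after the previous one and
    return the result of the first one to succeed, cancelling the rest.
    Raises OSError if every attempt fails."""
    tasks = []
    try:
        pending_starts = list(coros)
        while pending_starts or tasks:
            if pending_starts:
                # Stagger: launch the next attempt, then wait at most `delay`
                # before launching another one.
                tasks.append(asyncio.ensure_future(pending_starts.pop(0)))
                timeout = delay
            else:
                timeout = None  # everything is started; just wait
            done, _ = await asyncio.wait(
                tasks, timeout=timeout, return_when=asyncio.FIRST_COMPLETED
            )
            for task in done:
                tasks.remove(task)
                if task.exception() is None:
                    return task.result()  # first winner; losers get cancelled
        raise OSError("all connection attempts failed")
    finally:
        for task in tasks:
            task.cancel()
```

The real library applies this pattern to `socket.connect()` calls over the entries of an `addrinfo` list, alternating address families as RFC 8305 recommends.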
valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/RECORD ADDED
@@ -0,0 +1,18 @@
+ aiohappyeyeballs-2.4.2.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+ aiohappyeyeballs-2.4.2.dist-info/LICENSE,sha256=Oy-B_iHRgcSZxZolbI4ZaEVdZonSaaqFNzv7avQdo78,13936
+ aiohappyeyeballs-2.4.2.dist-info/METADATA,sha256=_ziodLe_RgvRdvRSJPZXCJ_yX5J6Phw8-mCHS4Mf__k,6038
+ aiohappyeyeballs-2.4.2.dist-info/RECORD,,
+ aiohappyeyeballs-2.4.2.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ aiohappyeyeballs-2.4.2.dist-info/WHEEL,sha256=sP946D7jFCHeNz5Iq4fL4Lu-PrWrFsgfLXbbkciIZwg,88
+ aiohappyeyeballs/__init__.py,sha256=Mh38VpQBOrri17V9BbQZlMA4wiAWECVXWRThzRStbaM,317
+ aiohappyeyeballs/__pycache__/__init__.cpython-310.pyc,,
+ aiohappyeyeballs/__pycache__/impl.cpython-310.pyc,,
+ aiohappyeyeballs/__pycache__/staggered.cpython-310.pyc,,
+ aiohappyeyeballs/__pycache__/types.cpython-310.pyc,,
+ aiohappyeyeballs/__pycache__/utils.cpython-310.pyc,,
+ aiohappyeyeballs/_staggered.py,sha256=IJaJ7byBVtgVwVzbcRluzXy4co3v3KdRDBIDXj6YVHE,3691
+ aiohappyeyeballs/impl.py,sha256=f0gsqRwZI4EaBifuNnYuggQkubFOK7PO_jnvmZAdJyU,7252
+ aiohappyeyeballs/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ aiohappyeyeballs/staggered.py,sha256=jS92EwfrnDQKSK1gU5K_QCnA-m7jz1bf5e2PaI-OKVk,249
+ aiohappyeyeballs/types.py,sha256=iYPiBTl5J7YEjnIqEOVUTRPzz2DwqSHBRhvbAlM0zv0,234
+ aiohappyeyeballs/utils.py,sha256=W_Oaf1iP8wYRHo6B95eYx-ZxbjpxyWwYgTdkhWqGF5c,3026
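Each RECORD entry above pairs a path with its hash and size. Per the wheel format (PEP 376/PEP 427), the digest is the urlsafe base64 encoding of the file's SHA-256, with trailing `=` padding stripped; it can be reproduced as:

```python
import base64
import hashlib


def record_hash(data: bytes) -> str:
    # Wheel RECORD digests: urlsafe base64 of the SHA-256 digest,
    # with the trailing '=' padding stripped (PEP 376 / PEP 427).
    digest = hashlib.sha256(data).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")


# Empty files such as REQUESTED and py.typed produce the fixed value
# seen in the RECORD above.
print(record_hash(b""))  # 47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU
```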
valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/REQUESTED ADDED
File without changes
valley/lib/python3.10/site-packages/aiohappyeyeballs-2.4.2.dist-info/WHEEL ADDED
@@ -0,0 +1,4 @@
+ Wheel-Version: 1.0
+ Generator: poetry-core 1.9.0
+ Root-Is-Purelib: true
+ Tag: py3-none-any
valley/lib/python3.10/site-packages/cpuinfo/__init__.py ADDED
@@ -0,0 +1,5 @@
+
+ import sys
+ from cpuinfo.cpuinfo import *
+
+
valley/lib/python3.10/site-packages/cpuinfo/__main__.py ADDED
@@ -0,0 +1,5 @@
+
+ import cpuinfo
+
+ cpuinfo.main()
+
valley/lib/python3.10/site-packages/cpuinfo/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (206 Bytes)
valley/lib/python3.10/site-packages/cpuinfo/__pycache__/__main__.cpython-310.pyc ADDED
Binary file (194 Bytes)
valley/lib/python3.10/site-packages/cpuinfo/__pycache__/cpuinfo.cpython-310.pyc ADDED
Binary file (58.7 kB)
valley/lib/python3.10/site-packages/cpuinfo/cpuinfo.py ADDED
@@ -0,0 +1,2827 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ #!/usr/bin/env python
2
+ # -*- coding: UTF-8 -*-
3
+
4
+ # Copyright (c) 2014-2022 Matthew Brennan Jones <matthew.brennan.jones@gmail.com>
5
+ # Py-cpuinfo gets CPU info with pure Python
6
+ # It uses the MIT License
7
+ # It is hosted at: https://github.com/workhorsy/py-cpuinfo
8
+ #
9
+ # Permission is hereby granted, free of charge, to any person obtaining
10
+ # a copy of this software and associated documentation files (the
11
+ # "Software"), to deal in the Software without restriction, including
12
+ # without limitation the rights to use, copy, modify, merge, publish,
13
+ # distribute, sublicense, and/or sell copies of the Software, and to
14
+ # permit persons to whom the Software is furnished to do so, subject to
15
+ # the following conditions:
16
+ #
17
+ # The above copyright notice and this permission notice shall be included
18
+ # in all copies or substantial portions of the Software.
19
+ #
20
+ # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
21
+ # EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
22
+ # MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
23
+ # IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
24
+ # CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
25
+ # TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
26
+ # SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
27
+
28
+ CPUINFO_VERSION = (9, 0, 0)
29
+ CPUINFO_VERSION_STRING = '.'.join([str(n) for n in CPUINFO_VERSION])
30
+
31
+ import os, sys
32
+ import platform
33
+ import multiprocessing
34
+ import ctypes
35
+
36
+
37
+ CAN_CALL_CPUID_IN_SUBPROCESS = True
38
+
39
+ g_trace = None
40
+
41
+
42
+ class Trace(object):
43
+ def __init__(self, is_active, is_stored_in_string):
44
+ self._is_active = is_active
45
+ if not self._is_active:
46
+ return
47
+
48
+ from datetime import datetime
49
+ from io import StringIO
50
+
51
+ if is_stored_in_string:
52
+ self._output = StringIO()
53
+ else:
54
+ date = datetime.now().strftime("%Y-%m-%d_%H-%M-%S-%f")
55
+ self._output = open('cpuinfo_trace_{0}.trace'.format(date), 'w')
56
+
57
+ self._stdout = StringIO()
58
+ self._stderr = StringIO()
59
+ self._err = None
60
+
61
+ def header(self, msg):
62
+ if not self._is_active: return
63
+
64
+ from inspect import stack
65
+ frame = stack()[1]
66
+ file = frame[1]
67
+ line = frame[2]
68
+ self._output.write("{0} ({1} {2})\n".format(msg, file, line))
69
+ self._output.flush()
70
+
71
+ def success(self):
72
+ if not self._is_active: return
73
+
74
+ from inspect import stack
75
+ frame = stack()[1]
76
+ file = frame[1]
77
+ line = frame[2]
78
+
79
+ self._output.write("Success ... ({0} {1})\n\n".format(file, line))
80
+ self._output.flush()
81
+
82
+ def fail(self, msg):
83
+ if not self._is_active: return
84
+
85
+ from inspect import stack
86
+ frame = stack()[1]
87
+ file = frame[1]
88
+ line = frame[2]
89
+
90
+ if isinstance(msg, str):
91
+ msg = ''.join(['\t' + line for line in msg.split('\n')]) + '\n'
92
+
93
+ self._output.write(msg)
94
+ self._output.write("Failed ... ({0} {1})\n\n".format(file, line))
95
+ self._output.flush()
96
+ elif isinstance(msg, Exception):
97
+ from traceback import format_exc
98
+ err_string = format_exc()
99
+ self._output.write("\tFailed ... ({0} {1})\n".format(file, line))
100
+ self._output.write(''.join(['\t\t{0}\n'.format(n) for n in err_string.split('\n')]) + '\n')
101
+ self._output.flush()
102
+
103
+ def command_header(self, msg):
104
+ if not self._is_active: return
105
+
106
+ from inspect import stack
107
+ frame = stack()[3]
108
+ file = frame[1]
109
+ line = frame[2]
110
+ self._output.write("\t{0} ({1} {2})\n".format(msg, file, line))
111
+ self._output.flush()
112
+
113
+ def command_output(self, msg, output):
114
+ if not self._is_active: return
115
+
116
+ self._output.write("\t\t{0}\n".format(msg))
117
+ self._output.write(''.join(['\t\t\t{0}\n'.format(n) for n in output.split('\n')]) + '\n')
118
+ self._output.flush()
119
+
120
+ def keys(self, keys, info, new_info):
121
+ if not self._is_active: return
122
+
123
+ from inspect import stack
124
+ frame = stack()[2]
125
+ file = frame[1]
126
+ line = frame[2]
127
+
128
+ # List updated keys
129
+ self._output.write("\tChanged keys ({0} {1})\n".format(file, line))
130
+ changed_keys = [key for key in keys if key in info and key in new_info and info[key] != new_info[key]]
131
+ if changed_keys:
132
+ for key in changed_keys:
133
+ self._output.write('\t\t{0}: {1} to {2}\n'.format(key, info[key], new_info[key]))
134
+ else:
135
+ self._output.write('\t\tNone\n')
136
+
137
+ # List new keys
138
+ self._output.write("\tNew keys ({0} {1})\n".format(file, line))
139
+ new_keys = [key for key in keys if key in new_info and key not in info]
140
+ if new_keys:
141
+ for key in new_keys:
142
+ self._output.write('\t\t{0}: {1}\n'.format(key, new_info[key]))
143
+ else:
144
+ self._output.write('\t\tNone\n')
145
+
146
+ self._output.write('\n')
147
+ self._output.flush()
148
+
149
+ def write(self, msg):
150
+ if not self._is_active: return
151
+
152
+ self._output.write(msg + '\n')
153
+ self._output.flush()
154
+
155
+ def to_dict(self, info, is_fail):
156
+ return {
157
+ 'output' : self._output.getvalue(),
158
+ 'stdout' : self._stdout.getvalue(),
159
+ 'stderr' : self._stderr.getvalue(),
160
+ 'info' : info,
161
+ 'err' : self._err,
162
+ 'is_fail' : is_fail
163
+ }
164
+
165
+ class DataSource(object):
166
+ bits = platform.architecture()[0]
167
+ cpu_count = multiprocessing.cpu_count()
168
+ is_windows = platform.system().lower() == 'windows'
169
+ arch_string_raw = platform.machine()
170
+ uname_string_raw = platform.uname()[5]
171
+ can_cpuid = True
172
+
173
+ @staticmethod
174
+ def has_proc_cpuinfo():
175
+ return os.path.exists('/proc/cpuinfo')
176
+
177
+ @staticmethod
178
+ def has_dmesg():
179
+ return len(_program_paths('dmesg')) > 0
180
+
181
+ @staticmethod
182
+ def has_var_run_dmesg_boot():
183
+ uname = platform.system().strip().strip('"').strip("'").strip().lower()
184
+ return 'linux' in uname and os.path.exists('/var/run/dmesg.boot')
185
+
186
+ @staticmethod
187
+ def has_cpufreq_info():
188
+ return len(_program_paths('cpufreq-info')) > 0
189
+
190
+ @staticmethod
191
+ def has_sestatus():
192
+ return len(_program_paths('sestatus')) > 0
193
+
194
+ @staticmethod
195
+ def has_sysctl():
196
+ return len(_program_paths('sysctl')) > 0
197
+
198
+ @staticmethod
199
+ def has_isainfo():
200
+ return len(_program_paths('isainfo')) > 0
201
+
202
+ @staticmethod
203
+ def has_kstat():
204
+ return len(_program_paths('kstat')) > 0
205
+
206
+ @staticmethod
207
+ def has_sysinfo():
208
+ uname = platform.system().strip().strip('"').strip("'").strip().lower()
209
+ is_beos = 'beos' in uname or 'haiku' in uname
210
+ return is_beos and len(_program_paths('sysinfo')) > 0
211
+
212
+ @staticmethod
213
+ def has_lscpu():
214
+ return len(_program_paths('lscpu')) > 0
215
+
216
+ @staticmethod
217
+ def has_ibm_pa_features():
218
+ return len(_program_paths('lsprop')) > 0
219
+
220
+ @staticmethod
221
+ def has_wmic():
222
+ returncode, output = _run_and_get_stdout(['wmic', 'os', 'get', 'Version'])
223
+ return returncode == 0 and len(output) > 0
224
+
225
+ @staticmethod
226
+ def cat_proc_cpuinfo():
227
+ return _run_and_get_stdout(['cat', '/proc/cpuinfo'])
228
+
229
+ @staticmethod
230
+ def cpufreq_info():
231
+ return _run_and_get_stdout(['cpufreq-info'])
232
+
233
+ @staticmethod
234
+ def sestatus_b():
235
+ return _run_and_get_stdout(['sestatus', '-b'])
236
+
237
+ @staticmethod
238
+ def dmesg_a():
239
+ return _run_and_get_stdout(['dmesg', '-a'])
240
+
241
+ @staticmethod
242
+ def cat_var_run_dmesg_boot():
243
+ return _run_and_get_stdout(['cat', '/var/run/dmesg.boot'])
244
+
245
+ @staticmethod
246
+ def sysctl_machdep_cpu_hw_cpufrequency():
247
+ return _run_and_get_stdout(['sysctl', 'machdep.cpu', 'hw.cpufrequency'])
248
+
249
+ @staticmethod
250
+ def isainfo_vb():
251
+ return _run_and_get_stdout(['isainfo', '-vb'])
252
+
253
+ @staticmethod
254
+ def kstat_m_cpu_info():
255
+ return _run_and_get_stdout(['kstat', '-m', 'cpu_info'])
256
+
257
+ @staticmethod
258
+ def sysinfo_cpu():
259
+ return _run_and_get_stdout(['sysinfo', '-cpu'])
260
+
261
+ @staticmethod
262
+ def lscpu():
263
+ return _run_and_get_stdout(['lscpu'])
264
+
265
+ @staticmethod
266
+ def ibm_pa_features():
267
+ import glob
268
+
269
+ ibm_features = glob.glob('/proc/device-tree/cpus/*/ibm,pa-features')
270
+ if ibm_features:
271
+ return _run_and_get_stdout(['lsprop', ibm_features[0]])
272
+
273
+ @staticmethod
274
+ def wmic_cpu():
275
+ return _run_and_get_stdout(['wmic', 'cpu', 'get', 'Name,CurrentClockSpeed,L2CacheSize,L3CacheSize,Description,Caption,Manufacturer', '/format:list'])
276
+
277
+ @staticmethod
278
+ def winreg_processor_brand():
279
+ processor_brand = _read_windows_registry_key(r"Hardware\Description\System\CentralProcessor\0", "ProcessorNameString")
280
+ return processor_brand.strip()
281
+
282
+ @staticmethod
283
+ def winreg_vendor_id_raw():
284
+ vendor_id_raw = _read_windows_registry_key(r"Hardware\Description\System\CentralProcessor\0", "VendorIdentifier")
285
+ return vendor_id_raw
286
+
287
+ @staticmethod
288
+ def winreg_arch_string_raw():
289
+ arch_string_raw = _read_windows_registry_key(r"SYSTEM\CurrentControlSet\Control\Session Manager\Environment", "PROCESSOR_ARCHITECTURE")
290
+ return arch_string_raw
291
+
292
+ @staticmethod
293
+ def winreg_hz_actual():
294
+ hz_actual = _read_windows_registry_key(r"Hardware\Description\System\CentralProcessor\0", "~Mhz")
295
+ hz_actual = _to_decimal_string(hz_actual)
296
+ return hz_actual
297
+
298
+ @staticmethod
299
+ def winreg_feature_bits():
300
+ feature_bits = _read_windows_registry_key(r"Hardware\Description\System\CentralProcessor\0", "FeatureSet")
301
+ return feature_bits
302
+
303
+
304
+ def _program_paths(program_name):
305
+ paths = []
306
+ exts = filter(None, os.environ.get('PATHEXT', '').split(os.pathsep))
307
+ for p in os.environ['PATH'].split(os.pathsep):
308
+ p = os.path.join(p, program_name)
309
+ if os.access(p, os.X_OK):
310
+ paths.append(p)
311
+ for e in exts:
312
+ pext = p + e
313
+ if os.access(pext, os.X_OK):
314
+ paths.append(pext)
315
+ return paths
316
+
317
+ def _run_and_get_stdout(command, pipe_command=None):
318
+ from subprocess import Popen, PIPE
319
+
320
+ g_trace.command_header('Running command "' + ' '.join(command) + '" ...')
321
+
322
+ # Run the command normally
323
+ if not pipe_command:
324
+ p1 = Popen(command, stdout=PIPE, stderr=PIPE, stdin=PIPE)
325
+ # Run the command and pipe it into another command
326
+ else:
327
+ p2 = Popen(command, stdout=PIPE, stderr=PIPE, stdin=PIPE)
328
+ p1 = Popen(pipe_command, stdin=p2.stdout, stdout=PIPE, stderr=PIPE)
329
+ p2.stdout.close()
330
+
331
+ # Get the stdout and stderr
332
+ stdout_output, stderr_output = p1.communicate()
333
+ stdout_output = stdout_output.decode(encoding='UTF-8')
334
+ stderr_output = stderr_output.decode(encoding='UTF-8')
335
+
336
+ # Send the result to the logger
337
+ g_trace.command_output('return code:', str(p1.returncode))
338
+ g_trace.command_output('stdout:', stdout_output)
339
+
340
+ # Return the return code and stdout
341
+ return p1.returncode, stdout_output
342
+
343
+ def _read_windows_registry_key(key_name, field_name):
344
+ g_trace.command_header('Reading Registry key "{0}" field "{1}" ...'.format(key_name, field_name))
345
+
346
+ try:
347
+ import _winreg as winreg
348
+ except ImportError as err:
349
+ try:
350
+ import winreg
351
+ except ImportError as err:
352
+ pass
353
+
354
+ key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_name)
355
+ value = winreg.QueryValueEx(key, field_name)[0]
356
+ winreg.CloseKey(key)
357
+ g_trace.command_output('value:', str(value))
358
+ return value
359
+
360
+ # Make sure we are running on a supported system
361
+ def _check_arch():
362
+ arch, bits = _parse_arch(DataSource.arch_string_raw)
363
+ if not arch in ['X86_32', 'X86_64', 'ARM_7', 'ARM_8',
364
+ 'PPC_64', 'S390X', 'MIPS_32', 'MIPS_64',
365
+ "RISCV_32", "RISCV_64"]:
366
+ raise Exception("py-cpuinfo currently only works on X86 "
367
+ "and some ARM/PPC/S390X/MIPS/RISCV CPUs.")
368
+
369
+ def _obj_to_b64(thing):
370
+ import pickle
371
+ import base64
372
+
373
+ a = thing
374
+ b = pickle.dumps(a)
375
+ c = base64.b64encode(b)
376
+ d = c.decode('utf8')
377
+ return d
378
+
379
+ def _b64_to_obj(thing):
380
+ import pickle
381
+ import base64
382
+
383
+ try:
384
+ a = base64.b64decode(thing)
385
+ b = pickle.loads(a)
386
+ return b
387
+ except Exception:
388
+ return {}
389
+
390
+ def _utf_to_str(input):
391
+ if isinstance(input, list):
392
+ return [_utf_to_str(element) for element in input]
393
+ elif isinstance(input, dict):
394
+ return {_utf_to_str(key): _utf_to_str(value)
395
+ for key, value in input.items()}
396
+ else:
397
+ return input
398
+
399
+ def _copy_new_fields(info, new_info):
400
+ keys = [
401
+ 'vendor_id_raw', 'hardware_raw', 'brand_raw', 'hz_advertised_friendly', 'hz_actual_friendly',
402
+ 'hz_advertised', 'hz_actual', 'arch', 'bits', 'count',
403
+ 'arch_string_raw', 'uname_string_raw',
404
+ 'l2_cache_size', 'l2_cache_line_size', 'l2_cache_associativity',
405
+ 'stepping', 'model', 'family',
406
+ 'processor_type', 'flags',
407
+ 'l3_cache_size', 'l1_data_cache_size', 'l1_instruction_cache_size'
408
+ ]
409
+
410
+ g_trace.keys(keys, info, new_info)
411
+
412
+ # Update the keys with new values
413
+ for key in keys:
414
+ if new_info.get(key, None) and not info.get(key, None):
415
+ info[key] = new_info[key]
416
+ elif key == 'flags' and new_info.get('flags'):
417
+ for f in new_info['flags']:
418
+ if f not in info['flags']: info['flags'].append(f)
419
+ info['flags'].sort()
420
+
421
+ def _get_field_actual(cant_be_number, raw_string, field_names):
422
+ for line in raw_string.splitlines():
423
+ for field_name in field_names:
424
+ field_name = field_name.lower()
425
+ if ':' in line:
426
+ left, right = line.split(':', 1)
427
+ left = left.strip().lower()
428
+ right = right.strip()
429
+ if left == field_name and len(right) > 0:
430
+ if cant_be_number:
431
+ if not right.isdigit():
432
+ return right
433
+ else:
434
+ return right
435
+
436
+ return None
437
+
438
+ def _get_field(cant_be_number, raw_string, convert_to, default_value, *field_names):
439
+ retval = _get_field_actual(cant_be_number, raw_string, field_names)
440
+
441
+ # Convert the return value
442
+ if retval and convert_to:
443
+ try:
444
+ retval = convert_to(retval)
445
+ except Exception:
446
+ retval = default_value
447
+
448
+ # Return the default if there is no return value
449
+ if retval is None:
450
+ retval = default_value
451
+
452
+ return retval
453
+
454
+ def _to_decimal_string(ticks):
455
+ try:
456
+ # Convert to string
457
+ ticks = '{0}'.format(ticks)
458
+ # Sometimes ',' is used as a decimal separator
459
+ ticks = ticks.replace(',', '.')
460
+
461
+ # Strip off non numbers and decimal places
462
+ ticks = "".join(n for n in ticks if n.isdigit() or n=='.').strip()
463
+ if ticks == '':
464
+ ticks = '0'
465
+
466
+ # Add decimal if missing
467
+ if '.' not in ticks:
468
+ ticks = '{0}.0'.format(ticks)
469
+
470
+ # Remove trailing zeros
471
+ ticks = ticks.rstrip('0')
472
+
473
+ # Add one trailing zero for empty right side
474
+ if ticks.endswith('.'):
475
+ ticks = '{0}0'.format(ticks)
476
+
477
+ # Make sure the number can be converted to a float
478
+ ticks = float(ticks)
479
+ ticks = '{0}'.format(ticks)
480
+ return ticks
481
+ except Exception:
482
+ return '0.0'
483
+
484
+ def _hz_short_to_full(ticks, scale):
485
+ try:
486
+ # Make sure the number can be converted to a float
487
+ ticks = float(ticks)
488
+ ticks = '{0}'.format(ticks)
489
+
490
+ # Scale the numbers
491
+ hz = ticks.lstrip('0')
492
+ old_index = hz.index('.')
493
+ hz = hz.replace('.', '')
494
+ hz = hz.ljust(scale + old_index+1, '0')
495
+ new_index = old_index + scale
496
+ hz = '{0}.{1}'.format(hz[:new_index], hz[new_index:])
497
+ left, right = hz.split('.')
498
+ left, right = int(left), int(right)
499
+ return (left, right)
500
+ except Exception:
501
+ return (0, 0)
502
+
503
+ def _hz_friendly_to_full(hz_string):
504
+ try:
505
+ hz_string = hz_string.strip().lower()
506
+ hz, scale = (None, None)
507
+
508
+ if hz_string.endswith('ghz'):
509
+ scale = 9
510
+ elif hz_string.endswith('mhz'):
511
+ scale = 6
512
+ elif hz_string.endswith('hz'):
513
+ scale = 0
514
+
515
+ hz = "".join(n for n in hz_string if n.isdigit() or n=='.').strip()
516
+ if not '.' in hz:
517
+ hz += '.0'
518
+
519
+ hz, scale = _hz_short_to_full(hz, scale)
520
+
521
+ return (hz, scale)
522
+ except Exception:
523
+ return (0, 0)
524
+
525
+ def _hz_short_to_friendly(ticks, scale):
526
+ try:
527
+ # Get the raw Hz as a string
528
+ left, right = _hz_short_to_full(ticks, scale)
529
+ result = '{0}.{1}'.format(left, right)
530
+
531
+ # Get the location of the dot, and remove said dot
532
+ dot_index = result.index('.')
533
+ result = result.replace('.', '')
534
+
535
+ # Get the Hz symbol and scale
536
+ symbol = "Hz"
537
+ scale = 0
538
+ if dot_index > 9:
539
+ symbol = "GHz"
540
+ scale = 9
541
+ elif dot_index > 6:
542
+ symbol = "MHz"
543
+ scale = 6
544
+ elif dot_index > 3:
545
+ symbol = "KHz"
546
+ scale = 3
547
+
548
+ # Get the Hz with the dot at the new scaled point
549
+ result = '{0}.{1}'.format(result[:-scale-1], result[-scale-1:])
550
+
551
+ # Format the ticks to have 4 numbers after the decimal
552
+ # and remove any superfluous zeroes.
553
+ result = '{0:.4f} {1}'.format(float(result), symbol)
554
+ result = result.rstrip('0')
555
+ return result
556
+ except Exception:
557
+ return '0.0000 Hz'
558
+
559
+ def _to_friendly_bytes(input):
560
+ import re
561
+
562
+ if not input:
563
+ return input
564
+ input = "{0}".format(input)
565
+
566
+ formats = {
567
+ r"^[0-9]+B$" : 'B',
568
+ r"^[0-9]+K$" : 'KB',
569
+ r"^[0-9]+M$" : 'MB',
570
+ r"^[0-9]+G$" : 'GB'
571
+ }
572
+
573
+ for pattern, friendly_size in formats.items():
574
+ if re.match(pattern, input):
575
+ return "{0} {1}".format(input[ : -1].strip(), friendly_size)
576
+
577
+ return input
578
+
579
+ def _friendly_bytes_to_int(friendly_bytes):
580
+ input = friendly_bytes.lower()
581
+
582
+ formats = [
583
+ {'gib' : 1024 * 1024 * 1024},
584
+ {'mib' : 1024 * 1024},
585
+ {'kib' : 1024},
586
+
587
+ {'gb' : 1024 * 1024 * 1024},
588
+ {'mb' : 1024 * 1024},
589
+ {'kb' : 1024},
590
+
591
+ {'g' : 1024 * 1024 * 1024},
592
+ {'m' : 1024 * 1024},
593
+ {'k' : 1024},
594
+ {'b' : 1},
595
+ ]
596
+
597
+ try:
598
+ for entry in formats:
599
+ pattern = list(entry.keys())[0]
600
+ multiplier = list(entry.values())[0]
601
+ if input.endswith(pattern):
602
+ return int(input.split(pattern)[0].strip()) * multiplier
603
+
604
+ except Exception as err:
605
+ pass
606
+
607
+ return friendly_bytes
608
+
609
+ def _parse_cpu_brand_string(cpu_string):
610
+ # Just return 0 if the processor brand does not have the Hz
611
+ if not 'hz' in cpu_string.lower():
612
+ return ('0.0', 0)
613
+
614
+ hz = cpu_string.lower()
615
+ scale = 0
616
+
617
+ if hz.endswith('mhz'):
618
+ scale = 6
619
+ elif hz.endswith('ghz'):
620
+ scale = 9
621
+ if '@' in hz:
622
+ hz = hz.split('@')[1]
623
+ else:
624
+ hz = hz.rsplit(None, 1)[1]
625
+
626
+ hz = hz.rstrip('mhz').rstrip('ghz').strip()
627
+ hz = _to_decimal_string(hz)
628
+
629
+ return (hz, scale)
630
+
631
+ def _parse_cpu_brand_string_dx(cpu_string):
632
+ import re
633
+
634
+ # Find all the strings inside brackets ()
635
+ starts = [m.start() for m in re.finditer(r"\(", cpu_string)]
636
+ ends = [m.start() for m in re.finditer(r"\)", cpu_string)]
637
+ insides = {k: v for k, v in zip(starts, ends)}
638
+ insides = [cpu_string[start+1 : end] for start, end in insides.items()]
639
+
640
+ # Find all the fields
641
+ vendor_id, stepping, model, family = (None, None, None, None)
642
+ for inside in insides:
643
+ for pair in inside.split(','):
644
+ pair = [n.strip() for n in pair.split(':')]
645
+ if len(pair) > 1:
646
+ name, value = pair[0], pair[1]
647
+ if name == 'origin':
648
+ vendor_id = value.strip('"')
649
+ elif name == 'stepping':
650
+ stepping = int(value.lstrip('0x'), 16)
651
+ elif name == 'model':
652
+ model = int(value.lstrip('0x'), 16)
653
+ elif name in ['fam', 'family']:
654
+ family = int(value.lstrip('0x'), 16)
655
+
656
+ # Find the Processor Brand
657
+ # Strip off extra strings in brackets at end
658
+ brand = cpu_string.strip()
659
+ is_working = True
660
+ while is_working:
661
+ is_working = False
662
+ for inside in insides:
663
+ full = "({0})".format(inside)
664
+ if brand.endswith(full):
665
+ brand = brand[ :-len(full)].strip()
666
+ is_working = True
667
+
668
+ # Find the Hz in the brand string
669
+ hz_brand, scale = _parse_cpu_brand_string(brand)
670
+
671
+ # Find Hz inside brackets () after the brand string
672
+ if hz_brand == '0.0':
673
+ for inside in insides:
674
+ hz = inside
675
+ for entry in ['GHz', 'MHz', 'Hz']:
676
+ if entry in hz:
677
+ hz = "CPU @ " + hz[ : hz.find(entry) + len(entry)]
678
+ hz_brand, scale = _parse_cpu_brand_string(hz)
679
+ break
680
+
681
+ return (hz_brand, scale, brand, vendor_id, stepping, model, family)
682
+
683
+ def _parse_dmesg_output(output):
684
+ try:
685
+ # Get all the dmesg lines that might contain a CPU string
686
+ lines = output.split(' CPU0:')[1:] + \
687
+ output.split(' CPU1:')[1:] + \
688
+ output.split(' CPU:')[1:] + \
689
+ output.split('\nCPU0:')[1:] + \
690
+ output.split('\nCPU1:')[1:] + \
691
+ output.split('\nCPU:')[1:]
692
+ lines = [l.split('\n')[0].strip() for l in lines]
693
+
694
+ # Convert the lines to CPU strings
695
+ cpu_strings = [_parse_cpu_brand_string_dx(l) for l in lines]
696
+
697
+ # Find the CPU string that has the most fields
698
+ best_string = None
699
+ highest_count = 0
700
+ for cpu_string in cpu_strings:
701
+ count = sum([n is not None for n in cpu_string])
702
+ if count > highest_count:
703
+ highest_count = count
704
+ best_string = cpu_string
705
+
706
+ # If no CPU string was found, return {}
707
+ if not best_string:
708
+ return {}
709
+
710
+ hz_actual, scale, processor_brand, vendor_id, stepping, model, family = best_string
711
+
712
+ # Origin
713
+ if ' Origin=' in output:
714
+ fields = output[output.find(' Origin=') : ].split('\n')[0]
715
+ fields = fields.strip().split()
716
+ fields = [n.strip().split('=') for n in fields]
717
+ fields = [{n[0].strip().lower() : n[1].strip()} for n in fields]
718
+
719
+ for field in fields:
720
+ name = list(field.keys())[0]
721
+ value = list(field.values())[0]
722
+
723
+ if name == 'origin':
724
+ vendor_id = value.strip('"')
725
+ elif name == 'stepping':
726
+ stepping = int(value.lstrip('0x'), 16)
727
+ elif name == 'model':
728
+ model = int(value.lstrip('0x'), 16)
729
+ elif name in ['fam', 'family']:
730
+ family = int(value.lstrip('0x'), 16)
731
+
732
+ # Features
733
+ flag_lines = []
734
+ for category in [' Features=', ' Features2=', ' AMD Features=', ' AMD Features2=']:
735
+ if category in output:
736
+ flag_lines.append(output.split(category)[1].split('\n')[0])
737
+
738
+ flags = []
739
+ for line in flag_lines:
740
+ line = line.split('<')[1].split('>')[0].lower()
741
+ for flag in line.split(','):
742
+ flags.append(flag)
743
+ flags.sort()
744
+
745
+ # Convert from GHz/MHz string to Hz
746
+ hz_advertised, scale = _parse_cpu_brand_string(processor_brand)
747
+
748
+ # If advertised hz not found, use the actual hz
749
+ if hz_advertised == '0.0':
750
+ scale = 6
751
+ hz_advertised = _to_decimal_string(hz_actual)
752
+
753
+ info = {
754
+ 'vendor_id_raw' : vendor_id,
755
+ 'brand_raw' : processor_brand,
756
+
757
+ 'stepping' : stepping,
758
+ 'model' : model,
759
+ 'family' : family,
760
+ 'flags' : flags
761
+ }
762
+
763
+ if hz_advertised and hz_advertised != '0.0':
764
+ info['hz_advertised_friendly'] = _hz_short_to_friendly(hz_advertised, scale)
765
+ info['hz_actual_friendly'] = _hz_short_to_friendly(hz_actual, scale)
766
+
767
+ if hz_advertised and hz_advertised != '0.0':
768
+ info['hz_advertised'] = _hz_short_to_full(hz_advertised, scale)
769
+ info['hz_actual'] = _hz_short_to_full(hz_actual, scale)
770
+
771
+ return {k: v for k, v in info.items() if v}
772
+ except Exception as err:
773
+ g_trace.fail(err)
774
+ #raise
775
+
776
+ return {}
777
+
778
+ def _parse_arch(arch_string_raw):
779
+ import re
780
+
781
+ arch, bits = None, None
782
+ arch_string_raw = arch_string_raw.lower()
783
+
784
+ # X86
785
+ if re.match(r'^i\d86$|^x86$|^x86_32$|^i86pc$|^ia32$|^ia-32$|^bepc$', arch_string_raw):
786
+ arch = 'X86_32'
787
+ bits = 32
788
+ elif re.match(r'^x64$|^x86_64$|^x86_64t$|^i686-64$|^amd64$|^ia64$|^ia-64$', arch_string_raw):
789
+ arch = 'X86_64'
790
+ bits = 64
791
+ # ARM
792
+ elif re.match(r'^armv8-a|aarch64|arm64$', arch_string_raw):
793
+ arch = 'ARM_8'
794
+ bits = 64
795
+ elif re.match(r'^armv7$|^armv7[a-z]$|^armv7-[a-z]$|^armv6[a-z]$', arch_string_raw):
796
+ arch = 'ARM_7'
797
+ bits = 32
798
+ elif re.match(r'^armv8$|^armv8[a-z]$|^armv8-[a-z]$', arch_string_raw):
799
+ arch = 'ARM_8'
800
+ bits = 32
801
+ # PPC
802
+ elif re.match(r'^ppc32$|^prep$|^pmac$|^powermac$', arch_string_raw):
803
+ arch = 'PPC_32'
804
+ bits = 32
805
+ elif re.match(r'^powerpc$|^ppc64$|^ppc64le$', arch_string_raw):
806
+ arch = 'PPC_64'
807
+ bits = 64
808
+ # SPARC
809
+ elif re.match(r'^sparc32$|^sparc$', arch_string_raw):
810
+ arch = 'SPARC_32'
811
+ bits = 32
812
+ elif re.match(r'^sparc64$|^sun4u$|^sun4v$', arch_string_raw):
813
+ arch = 'SPARC_64'
814
+ bits = 64
815
+ # S390X
816
+ elif re.match(r'^s390x$', arch_string_raw):
817
+ arch = 'S390X'
818
+ bits = 64
819
+ elif arch_string_raw == 'mips':
820
+ arch = 'MIPS_32'
821
+ bits = 32
822
+ elif arch_string_raw == 'mips64':
823
+ arch = 'MIPS_64'
824
+ bits = 64
825
+ # RISCV
826
+ elif re.match(r'^riscv$|^riscv32$|^riscv32be$', arch_string_raw):
827
+ arch = 'RISCV_32'
828
+ bits = 32
829
+ elif re.match(r'^riscv64$|^riscv64be$', arch_string_raw):
830
+ arch = 'RISCV_64'
831
+ bits = 64
832
+
833
+ return (arch, bits)
834
+
835
+ def _is_bit_set(reg, bit):
836
+ mask = 1 << bit
837
+ is_set = reg & mask > 0
838
+ return is_set
839
+
840
+
841
+ def _is_selinux_enforcing(trace):
842
+ # Just return if the SE Linux Status Tool is not installed
843
+ if not DataSource.has_sestatus():
844
+ trace.fail('Failed to find sestatus.')
845
+ return False
846
+
847
+ # Run the sestatus, and just return if it failed to run
848
+ returncode, output = DataSource.sestatus_b()
849
+ if returncode != 0:
850
+ trace.fail('Failed to run sestatus. Skipping ...')
851
+ return False
852
+
853
+ # Figure out if explicitly in enforcing mode
854
+ for line in output.splitlines():
855
+ line = line.strip().lower()
856
+ if line.startswith("current mode:"):
857
+ if line.endswith("enforcing"):
858
+ return True
859
+ else:
860
+ return False
861
+
862
+ # Figure out if we can execute heap and execute memory
863
+ can_selinux_exec_heap = False
864
+ can_selinux_exec_memory = False
865
+ for line in output.splitlines():
866
+ line = line.strip().lower()
867
+ if line.startswith("allow_execheap") and line.endswith("on"):
868
+ can_selinux_exec_heap = True
869
+ elif line.startswith("allow_execmem") and line.endswith("on"):
870
+ can_selinux_exec_memory = True
871
+
872
+ trace.command_output('can_selinux_exec_heap:', can_selinux_exec_heap)
873
+ trace.command_output('can_selinux_exec_memory:', can_selinux_exec_memory)
874
+
875
+ return (not can_selinux_exec_heap or not can_selinux_exec_memory)
876
+
877
+ def _filter_dict_keys_with_empty_values(info, acceptable_values = {}):
878
+ filtered_info = {}
879
+ for key in info:
880
+ value = info[key]
881
+
882
+ # Keep if value is acceptable
883
+ if key in acceptable_values:
884
+ if acceptable_values[key] == value:
885
+ filtered_info[key] = value
886
+ continue
887
+
888
+ # Filter out None, 0, "", (), {}, []
889
+ if not value:
890
+ continue
891
+
892
+ # Filter out (0, 0)
893
+ if value == (0, 0):
894
+ continue
895
+
896
+ # Filter out -1
897
+ if value == -1:
898
+ continue
899
+
900
+ # Filter out strings that start with "0.0"
901
+ if type(value) == str and value.startswith('0.0'):
902
+ continue
903
+
904
+ filtered_info[key] = value
905
+
906
+ return filtered_info
907
+
908
+ class ASM(object):
+     def __init__(self, restype=None, argtypes=(), machine_code=[]):
+         self.restype = restype
+         self.argtypes = argtypes
+         self.machine_code = machine_code
+         self.prochandle = None
+         self.mm = None
+         self.func = None
+         self.address = None
+         self.size = 0
+
+     def compile(self):
+         machine_code = bytes.join(b'', self.machine_code)
+         self.size = ctypes.c_size_t(len(machine_code))
+
+         if DataSource.is_windows:
+             # Allocate a memory segment the size of the machine code, and make it executable
+             size = len(machine_code)
+             # Alloc at least 1 page to ensure we own all pages that we want to change protection on
+             if size < 0x1000: size = 0x1000
+             MEM_COMMIT = ctypes.c_ulong(0x1000)
+             PAGE_READWRITE = ctypes.c_ulong(0x4)
+             pfnVirtualAlloc = ctypes.windll.kernel32.VirtualAlloc
+             pfnVirtualAlloc.restype = ctypes.c_void_p
+             self.address = pfnVirtualAlloc(None, ctypes.c_size_t(size), MEM_COMMIT, PAGE_READWRITE)
+             if not self.address:
+                 raise Exception("Failed to VirtualAlloc")
+
+             # Copy the machine code into the memory segment
+             memmove = ctypes.CFUNCTYPE(ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_size_t)(ctypes._memmove_addr)
+             if memmove(self.address, machine_code, size) < 0:
+                 raise Exception("Failed to memmove")
+
+             # Enable execute permissions
+             PAGE_EXECUTE = ctypes.c_ulong(0x10)
+             old_protect = ctypes.c_ulong(0)
+             pfnVirtualProtect = ctypes.windll.kernel32.VirtualProtect
+             res = pfnVirtualProtect(ctypes.c_void_p(self.address), ctypes.c_size_t(size), PAGE_EXECUTE, ctypes.byref(old_protect))
+             if not res:
+                 raise Exception("Failed VirtualProtect")
+
+             # Flush Instruction Cache
+             # First, get process Handle
+             if not self.prochandle:
+                 pfnGetCurrentProcess = ctypes.windll.kernel32.GetCurrentProcess
+                 pfnGetCurrentProcess.restype = ctypes.c_void_p
+                 self.prochandle = ctypes.c_void_p(pfnGetCurrentProcess())
+             # Actually flush cache
+             res = ctypes.windll.kernel32.FlushInstructionCache(self.prochandle, ctypes.c_void_p(self.address), ctypes.c_size_t(size))
+             if not res:
+                 raise Exception("Failed FlushInstructionCache")
+         else:
+             from mmap import mmap, MAP_PRIVATE, MAP_ANONYMOUS, PROT_WRITE, PROT_READ, PROT_EXEC
+
+             # Allocate a private and executable memory segment the size of the machine code
+             machine_code = bytes.join(b'', self.machine_code)
+             self.size = len(machine_code)
+             self.mm = mmap(-1, self.size, flags=MAP_PRIVATE | MAP_ANONYMOUS, prot=PROT_WRITE | PROT_READ | PROT_EXEC)
+
+             # Copy the machine code into the memory segment
+             self.mm.write(machine_code)
+             self.address = ctypes.addressof(ctypes.c_int.from_buffer(self.mm))
+
+         # Cast the memory segment into a function
+         functype = ctypes.CFUNCTYPE(self.restype, *self.argtypes)
+         self.func = functype(self.address)
+
+     def run(self):
+         # Call the machine code like a function
+         retval = self.func()
+
+         return retval
+
+     def free(self):
+         # Free the function memory segment
+         if DataSource.is_windows:
+             MEM_RELEASE = ctypes.c_ulong(0x8000)
+             ctypes.windll.kernel32.VirtualFree(ctypes.c_void_p(self.address), ctypes.c_size_t(0), MEM_RELEASE)
+         else:
+             self.mm.close()
+
+         self.prochandle = None
+         self.mm = None
+         self.func = None
+         self.address = None
+         self.size = 0
+
+
+ class CPUID(object):
+     def __init__(self, trace=None):
+         if trace is None:
+             trace = Trace(False, False)
+
+         # Figure out if SE Linux is on and in enforcing mode
+         self.is_selinux_enforcing = _is_selinux_enforcing(trace)
+
+     def _asm_func(self, restype=None, argtypes=(), machine_code=[]):
+         asm = ASM(restype, argtypes, machine_code)
+         asm.compile()
+         return asm
+
+     def _run_asm(self, *machine_code):
+         asm = ASM(ctypes.c_uint32, (), machine_code)
+         asm.compile()
+         retval = asm.run()
+         asm.free()
+         return retval
+
+     # http://en.wikipedia.org/wiki/CPUID#EAX.3D0:_Get_vendor_ID
+     def get_vendor_id(self):
+         # EBX
+         ebx = self._run_asm(
+             b"\x31\xC0",  # xor eax,eax
+             b"\x0F\xA2"   # cpuid
+             b"\x89\xD8"   # mov ax,bx
+             b"\xC3"       # ret
+         )
+
+         # ECX
+         ecx = self._run_asm(
+             b"\x31\xC0",  # xor eax,eax
+             b"\x0f\xa2"   # cpuid
+             b"\x89\xC8"   # mov ax,cx
+             b"\xC3"       # ret
+         )
+
+         # EDX
+         edx = self._run_asm(
+             b"\x31\xC0",  # xor eax,eax
+             b"\x0f\xa2"   # cpuid
+             b"\x89\xD0"   # mov ax,dx
+             b"\xC3"       # ret
+         )
+
+         # Each byte is an ASCII letter in the name
+         vendor_id = []
+         for reg in [ebx, edx, ecx]:
+             for n in [0, 8, 16, 24]:
+                 vendor_id.append(chr((reg >> n) & 0xFF))
+         vendor_id = ''.join(vendor_id)
+
+         return vendor_id
+
+     # http://en.wikipedia.org/wiki/CPUID#EAX.3D1:_Processor_Info_and_Feature_Bits
+     def get_info(self):
+         # EAX
+         eax = self._run_asm(
+             b"\xB8\x01\x00\x00\x00",  # mov eax,0x1
+             b"\x0f\xa2"               # cpuid
+             b"\xC3"                   # ret
+         )
+
+         # Get the CPU info
+         stepping_id = (eax >> 0) & 0xF # 4 bits
+         model = (eax >> 4) & 0xF # 4 bits
+         family_id = (eax >> 8) & 0xF # 4 bits
+         processor_type = (eax >> 12) & 0x3 # 2 bits
+         extended_model_id = (eax >> 16) & 0xF # 4 bits
+         extended_family_id = (eax >> 20) & 0xFF # 8 bits
+         family = 0
+
+         if family_id in [15]:
+             family = extended_family_id + family_id
+         else:
+             family = family_id
+
+         if family_id in [6, 15]:
+             model = (extended_model_id << 4) + model
+
+         return {
+             'stepping' : stepping_id,
+             'model' : model,
+             'family' : family,
+             'processor_type' : processor_type
+         }
+
+     # http://en.wikipedia.org/wiki/CPUID#EAX.3D80000000h:_Get_Highest_Extended_Function_Supported
+     def get_max_extension_support(self):
+         # Check for extension support
+         max_extension_support = self._run_asm(
+             b"\xB8\x00\x00\x00\x80"  # mov ax,0x80000000
+             b"\x0f\xa2"              # cpuid
+             b"\xC3"                  # ret
+         )
+
+         return max_extension_support
+
+     # http://en.wikipedia.org/wiki/CPUID#EAX.3D1:_Processor_Info_and_Feature_Bits
+     def get_flags(self, max_extension_support):
+         # EDX
+         edx = self._run_asm(
+             b"\xB8\x01\x00\x00\x00",  # mov eax,0x1
+             b"\x0f\xa2"               # cpuid
+             b"\x89\xD0"               # mov ax,dx
+             b"\xC3"                   # ret
+         )
+
+         # ECX
+         ecx = self._run_asm(
+             b"\xB8\x01\x00\x00\x00",  # mov eax,0x1
+             b"\x0f\xa2"               # cpuid
+             b"\x89\xC8"               # mov ax,cx
+             b"\xC3"                   # ret
+         )
+
+         # Get the CPU flags
+         flags = {
+             'fpu' : _is_bit_set(edx, 0),
+             'vme' : _is_bit_set(edx, 1),
+             'de' : _is_bit_set(edx, 2),
+             'pse' : _is_bit_set(edx, 3),
+             'tsc' : _is_bit_set(edx, 4),
+             'msr' : _is_bit_set(edx, 5),
+             'pae' : _is_bit_set(edx, 6),
+             'mce' : _is_bit_set(edx, 7),
+             'cx8' : _is_bit_set(edx, 8),
+             'apic' : _is_bit_set(edx, 9),
+             #'reserved1' : _is_bit_set(edx, 10),
+             'sep' : _is_bit_set(edx, 11),
+             'mtrr' : _is_bit_set(edx, 12),
+             'pge' : _is_bit_set(edx, 13),
+             'mca' : _is_bit_set(edx, 14),
+             'cmov' : _is_bit_set(edx, 15),
+             'pat' : _is_bit_set(edx, 16),
+             'pse36' : _is_bit_set(edx, 17),
+             'pn' : _is_bit_set(edx, 18),
+             'clflush' : _is_bit_set(edx, 19),
+             #'reserved2' : _is_bit_set(edx, 20),
+             'dts' : _is_bit_set(edx, 21),
+             'acpi' : _is_bit_set(edx, 22),
+             'mmx' : _is_bit_set(edx, 23),
+             'fxsr' : _is_bit_set(edx, 24),
+             'sse' : _is_bit_set(edx, 25),
+             'sse2' : _is_bit_set(edx, 26),
+             'ss' : _is_bit_set(edx, 27),
+             'ht' : _is_bit_set(edx, 28),
+             'tm' : _is_bit_set(edx, 29),
+             'ia64' : _is_bit_set(edx, 30),
+             'pbe' : _is_bit_set(edx, 31),
+
+             'pni' : _is_bit_set(ecx, 0),
+             'pclmulqdq' : _is_bit_set(ecx, 1),
+             'dtes64' : _is_bit_set(ecx, 2),
+             'monitor' : _is_bit_set(ecx, 3),
+             'ds_cpl' : _is_bit_set(ecx, 4),
+             'vmx' : _is_bit_set(ecx, 5),
+             'smx' : _is_bit_set(ecx, 6),
+             'est' : _is_bit_set(ecx, 7),
+             'tm2' : _is_bit_set(ecx, 8),
+             'ssse3' : _is_bit_set(ecx, 9),
+             'cid' : _is_bit_set(ecx, 10),
+             #'reserved3' : _is_bit_set(ecx, 11),
+             'fma' : _is_bit_set(ecx, 12),
+             'cx16' : _is_bit_set(ecx, 13),
+             'xtpr' : _is_bit_set(ecx, 14),
+             'pdcm' : _is_bit_set(ecx, 15),
+             #'reserved4' : _is_bit_set(ecx, 16),
+             'pcid' : _is_bit_set(ecx, 17),
+             'dca' : _is_bit_set(ecx, 18),
+             'sse4_1' : _is_bit_set(ecx, 19),
+             'sse4_2' : _is_bit_set(ecx, 20),
+             'x2apic' : _is_bit_set(ecx, 21),
+             'movbe' : _is_bit_set(ecx, 22),
+             'popcnt' : _is_bit_set(ecx, 23),
+             'tscdeadline' : _is_bit_set(ecx, 24),
+             'aes' : _is_bit_set(ecx, 25),
+             'xsave' : _is_bit_set(ecx, 26),
+             'osxsave' : _is_bit_set(ecx, 27),
+             'avx' : _is_bit_set(ecx, 28),
+             'f16c' : _is_bit_set(ecx, 29),
+             'rdrnd' : _is_bit_set(ecx, 30),
+             'hypervisor' : _is_bit_set(ecx, 31)
+         }
+
+         # Get a list of only the flags that are true
+         flags = [k for k, v in flags.items() if v]
+
+         # http://en.wikipedia.org/wiki/CPUID#EAX.3D7.2C_ECX.3D0:_Extended_Features
+         if max_extension_support >= 7:
+             # EBX
+             ebx = self._run_asm(
+                 b"\x31\xC9",             # xor ecx,ecx
+                 b"\xB8\x07\x00\x00\x00"  # mov eax,7
+                 b"\x0f\xa2"              # cpuid
+                 b"\x89\xD8"              # mov ax,bx
+                 b"\xC3"                  # ret
+             )
+
+             # ECX
+             ecx = self._run_asm(
+                 b"\x31\xC9",             # xor ecx,ecx
+                 b"\xB8\x07\x00\x00\x00"  # mov eax,7
+                 b"\x0f\xa2"              # cpuid
+                 b"\x89\xC8"              # mov ax,cx
+                 b"\xC3"                  # ret
+             )
+
+             # Get the extended CPU flags
+             extended_flags = {
+                 #'fsgsbase' : _is_bit_set(ebx, 0),
+                 #'IA32_TSC_ADJUST' : _is_bit_set(ebx, 1),
+                 'sgx' : _is_bit_set(ebx, 2),
+                 'bmi1' : _is_bit_set(ebx, 3),
+                 'hle' : _is_bit_set(ebx, 4),
+                 'avx2' : _is_bit_set(ebx, 5),
+                 #'reserved' : _is_bit_set(ebx, 6),
+                 'smep' : _is_bit_set(ebx, 7),
+                 'bmi2' : _is_bit_set(ebx, 8),
+                 'erms' : _is_bit_set(ebx, 9),
+                 'invpcid' : _is_bit_set(ebx, 10),
+                 'rtm' : _is_bit_set(ebx, 11),
+                 'pqm' : _is_bit_set(ebx, 12),
+                 #'FPU CS and FPU DS deprecated' : _is_bit_set(ebx, 13),
+                 'mpx' : _is_bit_set(ebx, 14),
+                 'pqe' : _is_bit_set(ebx, 15),
+                 'avx512f' : _is_bit_set(ebx, 16),
+                 'avx512dq' : _is_bit_set(ebx, 17),
+                 'rdseed' : _is_bit_set(ebx, 18),
+                 'adx' : _is_bit_set(ebx, 19),
+                 'smap' : _is_bit_set(ebx, 20),
+                 'avx512ifma' : _is_bit_set(ebx, 21),
+                 'pcommit' : _is_bit_set(ebx, 22),
+                 'clflushopt' : _is_bit_set(ebx, 23),
+                 'clwb' : _is_bit_set(ebx, 24),
+                 'intel_pt' : _is_bit_set(ebx, 25),
+                 'avx512pf' : _is_bit_set(ebx, 26),
+                 'avx512er' : _is_bit_set(ebx, 27),
+                 'avx512cd' : _is_bit_set(ebx, 28),
+                 'sha' : _is_bit_set(ebx, 29),
+                 'avx512bw' : _is_bit_set(ebx, 30),
+                 'avx512vl' : _is_bit_set(ebx, 31),
+
+                 'prefetchwt1' : _is_bit_set(ecx, 0),
+                 'avx512vbmi' : _is_bit_set(ecx, 1),
+                 'umip' : _is_bit_set(ecx, 2),
+                 'pku' : _is_bit_set(ecx, 3),
+                 'ospke' : _is_bit_set(ecx, 4),
+                 #'reserved' : _is_bit_set(ecx, 5),
+                 'avx512vbmi2' : _is_bit_set(ecx, 6),
+                 #'reserved' : _is_bit_set(ecx, 7),
+                 'gfni' : _is_bit_set(ecx, 8),
+                 'vaes' : _is_bit_set(ecx, 9),
+                 'vpclmulqdq' : _is_bit_set(ecx, 10),
+                 'avx512vnni' : _is_bit_set(ecx, 11),
+                 'avx512bitalg' : _is_bit_set(ecx, 12),
+                 #'reserved' : _is_bit_set(ecx, 13),
+                 'avx512vpopcntdq' : _is_bit_set(ecx, 14),
+                 #'reserved' : _is_bit_set(ecx, 15),
+                 #'reserved' : _is_bit_set(ecx, 16),
+                 #'mpx0' : _is_bit_set(ecx, 17),
+                 #'mpx1' : _is_bit_set(ecx, 18),
+                 #'mpx2' : _is_bit_set(ecx, 19),
+                 #'mpx3' : _is_bit_set(ecx, 20),
+                 #'mpx4' : _is_bit_set(ecx, 21),
+                 'rdpid' : _is_bit_set(ecx, 22),
+                 #'reserved' : _is_bit_set(ecx, 23),
+                 #'reserved' : _is_bit_set(ecx, 24),
+                 #'reserved' : _is_bit_set(ecx, 25),
+                 #'reserved' : _is_bit_set(ecx, 26),
+                 #'reserved' : _is_bit_set(ecx, 27),
+                 #'reserved' : _is_bit_set(ecx, 28),
+                 #'reserved' : _is_bit_set(ecx, 29),
+                 'sgx_lc' : _is_bit_set(ecx, 30),
+                 #'reserved' : _is_bit_set(ecx, 31)
+             }
+
+             # Get a list of only the flags that are true
+             extended_flags = [k for k, v in extended_flags.items() if v]
+             flags += extended_flags
+
+         # http://en.wikipedia.org/wiki/CPUID#EAX.3D80000001h:_Extended_Processor_Info_and_Feature_Bits
+         if max_extension_support >= 0x80000001:
+             # EBX
+             ebx = self._run_asm(
+                 b"\xB8\x01\x00\x00\x80"  # mov ax,0x80000001
+                 b"\x0f\xa2"              # cpuid
+                 b"\x89\xD8"              # mov ax,bx
+                 b"\xC3"                  # ret
+             )
+
+             # ECX
+             ecx = self._run_asm(
+                 b"\xB8\x01\x00\x00\x80"  # mov ax,0x80000001
+                 b"\x0f\xa2"              # cpuid
+                 b"\x89\xC8"              # mov ax,cx
+                 b"\xC3"                  # ret
+             )
+
+             # Get the extended CPU flags
+             extended_flags = {
+                 'fpu' : _is_bit_set(ebx, 0),
+                 'vme' : _is_bit_set(ebx, 1),
+                 'de' : _is_bit_set(ebx, 2),
+                 'pse' : _is_bit_set(ebx, 3),
+                 'tsc' : _is_bit_set(ebx, 4),
+                 'msr' : _is_bit_set(ebx, 5),
+                 'pae' : _is_bit_set(ebx, 6),
+                 'mce' : _is_bit_set(ebx, 7),
+                 'cx8' : _is_bit_set(ebx, 8),
+                 'apic' : _is_bit_set(ebx, 9),
+                 #'reserved' : _is_bit_set(ebx, 10),
+                 'syscall' : _is_bit_set(ebx, 11),
+                 'mtrr' : _is_bit_set(ebx, 12),
+                 'pge' : _is_bit_set(ebx, 13),
+                 'mca' : _is_bit_set(ebx, 14),
+                 'cmov' : _is_bit_set(ebx, 15),
+                 'pat' : _is_bit_set(ebx, 16),
+                 'pse36' : _is_bit_set(ebx, 17),
+                 #'reserved' : _is_bit_set(ebx, 18),
+                 'mp' : _is_bit_set(ebx, 19),
+                 'nx' : _is_bit_set(ebx, 20),
+                 #'reserved' : _is_bit_set(ebx, 21),
+                 'mmxext' : _is_bit_set(ebx, 22),
+                 'mmx' : _is_bit_set(ebx, 23),
+                 'fxsr' : _is_bit_set(ebx, 24),
+                 'fxsr_opt' : _is_bit_set(ebx, 25),
+                 'pdpe1gp' : _is_bit_set(ebx, 26),
+                 'rdtscp' : _is_bit_set(ebx, 27),
+                 #'reserved' : _is_bit_set(ebx, 28),
+                 'lm' : _is_bit_set(ebx, 29),
+                 '3dnowext' : _is_bit_set(ebx, 30),
+                 '3dnow' : _is_bit_set(ebx, 31),
+
+                 'lahf_lm' : _is_bit_set(ecx, 0),
+                 'cmp_legacy' : _is_bit_set(ecx, 1),
+                 'svm' : _is_bit_set(ecx, 2),
+                 'extapic' : _is_bit_set(ecx, 3),
+                 'cr8_legacy' : _is_bit_set(ecx, 4),
+                 'abm' : _is_bit_set(ecx, 5),
+                 'sse4a' : _is_bit_set(ecx, 6),
+                 'misalignsse' : _is_bit_set(ecx, 7),
+                 '3dnowprefetch' : _is_bit_set(ecx, 8),
+                 'osvw' : _is_bit_set(ecx, 9),
+                 'ibs' : _is_bit_set(ecx, 10),
+                 'xop' : _is_bit_set(ecx, 11),
+                 'skinit' : _is_bit_set(ecx, 12),
+                 'wdt' : _is_bit_set(ecx, 13),
+                 #'reserved' : _is_bit_set(ecx, 14),
+                 'lwp' : _is_bit_set(ecx, 15),
+                 'fma4' : _is_bit_set(ecx, 16),
+                 'tce' : _is_bit_set(ecx, 17),
+                 #'reserved' : _is_bit_set(ecx, 18),
+                 'nodeid_msr' : _is_bit_set(ecx, 19),
+                 #'reserved' : _is_bit_set(ecx, 20),
+                 'tbm' : _is_bit_set(ecx, 21),
+                 'topoext' : _is_bit_set(ecx, 22),
+                 'perfctr_core' : _is_bit_set(ecx, 23),
+                 'perfctr_nb' : _is_bit_set(ecx, 24),
+                 #'reserved' : _is_bit_set(ecx, 25),
+                 'dbx' : _is_bit_set(ecx, 26),
+                 'perftsc' : _is_bit_set(ecx, 27),
+                 'pci_l2i' : _is_bit_set(ecx, 28),
+                 #'reserved' : _is_bit_set(ecx, 29),
+                 #'reserved' : _is_bit_set(ecx, 30),
+                 #'reserved' : _is_bit_set(ecx, 31)
+             }
+
+             # Get a list of only the flags that are true
+             extended_flags = [k for k, v in extended_flags.items() if v]
+             flags += extended_flags
+
+         flags.sort()
+         return flags
+
+     # http://en.wikipedia.org/wiki/CPUID#EAX.3D80000002h.2C80000003h.2C80000004h:_Processor_Brand_String
+     def get_processor_brand(self, max_extension_support):
+         processor_brand = ""
+
+         # Processor brand string
+         if max_extension_support >= 0x80000004:
+             instructions = [
+                 b"\xB8\x02\x00\x00\x80",  # mov ax,0x80000002
+                 b"\xB8\x03\x00\x00\x80",  # mov ax,0x80000003
+                 b"\xB8\x04\x00\x00\x80"   # mov ax,0x80000004
+             ]
+             for instruction in instructions:
+                 # EAX
+                 eax = self._run_asm(
+                     instruction,  # mov ax,0x8000000?
+                     b"\x0f\xa2"   # cpuid
+                     b"\x89\xC0"   # mov ax,ax
+                     b"\xC3"       # ret
+                 )
+
+                 # EBX
+                 ebx = self._run_asm(
+                     instruction,  # mov ax,0x8000000?
+                     b"\x0f\xa2"   # cpuid
+                     b"\x89\xD8"   # mov ax,bx
+                     b"\xC3"       # ret
+                 )
+
+                 # ECX
+                 ecx = self._run_asm(
+                     instruction,  # mov ax,0x8000000?
+                     b"\x0f\xa2"   # cpuid
+                     b"\x89\xC8"   # mov ax,cx
+                     b"\xC3"       # ret
+                 )
+
+                 # EDX
+                 edx = self._run_asm(
+                     instruction,  # mov ax,0x8000000?
+                     b"\x0f\xa2"   # cpuid
+                     b"\x89\xD0"   # mov ax,dx
+                     b"\xC3"       # ret
+                 )
+
+                 # Combine each of the 4 bytes in each register into the string
+                 for reg in [eax, ebx, ecx, edx]:
+                     for n in [0, 8, 16, 24]:
+                         processor_brand += chr((reg >> n) & 0xFF)
+
+         # Strip off any trailing NULL terminators and white space
+         processor_brand = processor_brand.strip("\0").strip()
+
+         return processor_brand
+
+     # http://en.wikipedia.org/wiki/CPUID#EAX.3D80000006h:_Extended_L2_Cache_Features
+     def get_cache(self, max_extension_support):
+         cache_info = {}
+
+         # Just return if the cache feature is not supported
+         if max_extension_support < 0x80000006:
+             return cache_info
+
+         # ECX
+         ecx = self._run_asm(
+             b"\xB8\x06\x00\x00\x80"  # mov ax,0x80000006
+             b"\x0f\xa2"              # cpuid
+             b"\x89\xC8"              # mov ax,cx
+             b"\xC3"                  # ret
+         )
+
+         cache_info = {
+             'size_b' : (ecx & 0xFF) * 1024,
+             'associativity' : (ecx >> 12) & 0xF,
+             'line_size_b' : (ecx >> 16) & 0xFFFF
+         }
+
+         return cache_info
+
+     def get_ticks_func(self):
+         retval = None
+
+         if DataSource.bits == '32bit':
+             # Works on x86_32
+             restype = None
+             argtypes = (ctypes.POINTER(ctypes.c_uint), ctypes.POINTER(ctypes.c_uint))
+             get_ticks_x86_32 = self._asm_func(restype, argtypes,
+                 [
+                     b"\x55",          # push bp
+                     b"\x89\xE5",      # mov bp,sp
+                     b"\x31\xC0",      # xor ax,ax
+                     b"\x0F\xA2",      # cpuid
+                     b"\x0F\x31",      # rdtsc
+                     b"\x8B\x5D\x08",  # mov bx,[di+0x8]
+                     b"\x8B\x4D\x0C",  # mov cx,[di+0xc]
+                     b"\x89\x13",      # mov [bp+di],dx
+                     b"\x89\x01",      # mov [bx+di],ax
+                     b"\x5D",          # pop bp
+                     b"\xC3"           # ret
+                 ]
+             )
+
+             # Monkey patch func to combine high and low args into one return
+             old_func = get_ticks_x86_32.func
+             def new_func():
+                 # Pass two uint32s into function
+                 high = ctypes.c_uint32(0)
+                 low = ctypes.c_uint32(0)
+                 old_func(ctypes.byref(high), ctypes.byref(low))
+
+                 # Shift the two uint32s into one uint64
+                 retval = ((high.value << 32) & 0xFFFFFFFF00000000) | low.value
+                 return retval
+             get_ticks_x86_32.func = new_func
+
+             retval = get_ticks_x86_32
+         elif DataSource.bits == '64bit':
+             # Works on x86_64
+             restype = ctypes.c_uint64
+             argtypes = ()
+             get_ticks_x86_64 = self._asm_func(restype, argtypes,
+                 [
+                     b"\x48",          # dec ax
+                     b"\x31\xC0",      # xor ax,ax
+                     b"\x0F\xA2",      # cpuid
+                     b"\x0F\x31",      # rdtsc
+                     b"\x48",          # dec ax
+                     b"\xC1\xE2\x20",  # shl dx,byte 0x20
+                     b"\x48",          # dec ax
+                     b"\x09\xD0",      # or ax,dx
+                     b"\xC3",          # ret
+                 ]
+             )
+
+             retval = get_ticks_x86_64
+         return retval
+
+     def get_raw_hz(self):
+         from time import sleep
+
+         ticks_fn = self.get_ticks_func()
+
+         start = ticks_fn.func()
+         sleep(1)
+         end = ticks_fn.func()
+
+         ticks = (end - start)
+         ticks_fn.free()
+
+         return ticks
+
+ def _get_cpu_info_from_cpuid_actual():
+     '''
+     Warning! This function has the potential to crash the Python runtime.
+     Do not call it directly. Use the _get_cpu_info_from_cpuid function instead.
+     It will safely call this function in another process.
+     '''
+
+     from io import StringIO
+
+     trace = Trace(True, True)
+     info = {}
+
+     # Pipe stdout and stderr to strings
+     sys.stdout = trace._stdout
+     sys.stderr = trace._stderr
+
+     try:
+         # Get the CPU arch and bits
+         arch, bits = _parse_arch(DataSource.arch_string_raw)
+
+         # Return none if this is not an X86 CPU
+         if not arch in ['X86_32', 'X86_64']:
+             trace.fail('Not running on X86_32 or X86_64. Skipping ...')
+             return trace.to_dict(info, True)
+
+         # Return none if SE Linux is in enforcing mode
+         cpuid = CPUID(trace)
+         if cpuid.is_selinux_enforcing:
+             trace.fail('SELinux is enforcing. Skipping ...')
+             return trace.to_dict(info, True)
+
+         # Get the cpu info from the CPUID register
+         max_extension_support = cpuid.get_max_extension_support()
+         cache_info = cpuid.get_cache(max_extension_support)
+         info = cpuid.get_info()
+
+         processor_brand = cpuid.get_processor_brand(max_extension_support)
+
+         # Get the Hz and scale
+         hz_actual = cpuid.get_raw_hz()
+         hz_actual = _to_decimal_string(hz_actual)
+
+         # Get the Hz and scale
+         hz_advertised, scale = _parse_cpu_brand_string(processor_brand)
+         info = {
+             'vendor_id_raw' : cpuid.get_vendor_id(),
+             'hardware_raw' : '',
+             'brand_raw' : processor_brand,
+
+             'hz_advertised_friendly' : _hz_short_to_friendly(hz_advertised, scale),
+             'hz_actual_friendly' : _hz_short_to_friendly(hz_actual, 0),
+             'hz_advertised' : _hz_short_to_full(hz_advertised, scale),
+             'hz_actual' : _hz_short_to_full(hz_actual, 0),
+
+             'l2_cache_size' : cache_info['size_b'],
+             'l2_cache_line_size' : cache_info['line_size_b'],
+             'l2_cache_associativity' : cache_info['associativity'],
+
+             'stepping' : info['stepping'],
+             'model' : info['model'],
+             'family' : info['family'],
+             'processor_type' : info['processor_type'],
+             'flags' : cpuid.get_flags(max_extension_support)
+         }
+
+         info = _filter_dict_keys_with_empty_values(info)
+         trace.success()
+     except Exception as err:
+         from traceback import format_exc
+         err_string = format_exc()
+         trace._err = ''.join(['\t\t{0}\n'.format(n) for n in err_string.split('\n')]) + '\n'
+         return trace.to_dict(info, True)
+
+     return trace.to_dict(info, False)
+
+ def _get_cpu_info_from_cpuid_subprocess_wrapper(queue):
+     orig_stdout = sys.stdout
+     orig_stderr = sys.stderr
+
+     output = _get_cpu_info_from_cpuid_actual()
+
+     sys.stdout = orig_stdout
+     sys.stderr = orig_stderr
+
+     queue.put(_obj_to_b64(output))
+
+ def _get_cpu_info_from_cpuid():
+     '''
+     Returns the CPU info gathered by querying the X86 cpuid register in a new process.
+     Returns {} on non X86 cpus.
+     Returns {} if SELinux is in enforcing mode.
+     '''
+
+     g_trace.header('Trying to get info from CPUID ...')
+
+     from multiprocessing import Process, Queue
+
+     # Return {} if can't cpuid
+     if not DataSource.can_cpuid:
+         g_trace.fail('Can\'t CPUID. Skipping ...')
+         return {}
+
+     # Get the CPU arch and bits
+     arch, bits = _parse_arch(DataSource.arch_string_raw)
+
+     # Return {} if this is not an X86 CPU
+     if not arch in ['X86_32', 'X86_64']:
+         g_trace.fail('Not running on X86_32 or X86_64. Skipping ...')
+         return {}
+
+     try:
+         if CAN_CALL_CPUID_IN_SUBPROCESS:
+             # Start running the function in a subprocess
+             queue = Queue()
+             p = Process(target=_get_cpu_info_from_cpuid_subprocess_wrapper, args=(queue,))
+             p.start()
+
+             # Wait for the process to end, while it is still alive
+             while p.is_alive():
+                 p.join(0)
+
+             # Return {} if it failed
+             if p.exitcode != 0:
+                 g_trace.fail('Failed to run CPUID in process. Skipping ...')
+                 return {}
+
+             # Return {} if no results
+             if queue.empty():
+                 g_trace.fail('Failed to get anything from CPUID process. Skipping ...')
+                 return {}
+             # Return the result, only if there is something to read
+             else:
+                 output = _b64_to_obj(queue.get())
+                 import pprint
+                 pp = pprint.PrettyPrinter(indent=4)
+                 #pp.pprint(output)
+
+                 if 'output' in output and output['output']:
+                     g_trace.write(output['output'])
+
+                 if 'stdout' in output and output['stdout']:
+                     sys.stdout.write('{0}\n'.format(output['stdout']))
+                     sys.stdout.flush()
+
+                 if 'stderr' in output and output['stderr']:
+                     sys.stderr.write('{0}\n'.format(output['stderr']))
+                     sys.stderr.flush()
+
+                 if 'is_fail' not in output:
+                     g_trace.fail('Failed to get is_fail from CPUID process. Skipping ...')
+                     return {}
+
+                 # Fail if there was an exception
+                 if 'err' in output and output['err']:
+                     g_trace.fail('Failed to run CPUID in process. Skipping ...')
+                     g_trace.write(output['err'])
+                     g_trace.write('Failed ...')
+                     return {}
+
+                 if 'is_fail' in output and output['is_fail']:
+                     g_trace.write('Failed ...')
+                     return {}
+
+                 if 'info' not in output or not output['info']:
+                     g_trace.fail('Failed to get return info from CPUID process. Skipping ...')
+                     return {}
+
+                 return output['info']
+         else:
+             # FIXME: This should write the values like in the above call to actual
+             orig_stdout = sys.stdout
+             orig_stderr = sys.stderr
+
+             output = _get_cpu_info_from_cpuid_actual()
+
+             sys.stdout = orig_stdout
+             sys.stderr = orig_stderr
+
+             g_trace.success()
+             return output['info']
+     except Exception as err:
+         g_trace.fail(err)
+
+     # Return {} if everything failed
+     return {}
+
+ def _get_cpu_info_from_proc_cpuinfo():
+     '''
+     Returns the CPU info gathered from /proc/cpuinfo.
+     Returns {} if /proc/cpuinfo is not found.
+     '''
+
+     g_trace.header('Trying to get info from /proc/cpuinfo ...')
+
+     try:
+         # Just return {} if there is no cpuinfo
+         if not DataSource.has_proc_cpuinfo():
+             g_trace.fail('Failed to find /proc/cpuinfo. Skipping ...')
+             return {}
+
+         returncode, output = DataSource.cat_proc_cpuinfo()
+         if returncode != 0:
+             g_trace.fail('Failed to run cat /proc/cpuinfo. Skipping ...')
+             return {}
+
+         # Various fields
+         vendor_id = _get_field(False, output, None, '', 'vendor_id', 'vendor id', 'vendor')
+         processor_brand = _get_field(True, output, None, None, 'model name', 'cpu', 'processor', 'uarch')
+         cache_size = _get_field(False, output, None, '', 'cache size')
+         stepping = _get_field(False, output, int, -1, 'stepping')
+         model = _get_field(False, output, int, -1, 'model')
+         family = _get_field(False, output, int, -1, 'cpu family')
+         hardware = _get_field(False, output, None, '', 'Hardware')
+
+         # Flags
+         flags = _get_field(False, output, None, None, 'flags', 'Features', 'ASEs implemented')
+         if flags:
+             flags = flags.split()
+             flags.sort()
+
+         # Check for other cache format
+         if not cache_size:
+             try:
+                 for i in range(0, 10):
+                     name = "cache{0}".format(i)
+                     value = _get_field(False, output, None, None, name)
+                     if value:
+                         value = [entry.split('=') for entry in value.split(' ')]
+                         value = dict(value)
+                         if 'level' in value and value['level'] == '3' and 'size' in value:
+                             cache_size = value['size']
+                             break
+             except Exception:
+                 pass
+
+         # Convert from MHz string to Hz
+         hz_actual = _get_field(False, output, None, '', 'cpu MHz', 'cpu speed', 'clock', 'cpu MHz dynamic', 'cpu MHz static')
+         hz_actual = hz_actual.lower().rstrip('mhz').strip()
+         hz_actual = _to_decimal_string(hz_actual)
+
+         # Convert from GHz/MHz string to Hz
+         hz_advertised, scale = (None, 0)
+         try:
+             hz_advertised, scale = _parse_cpu_brand_string(processor_brand)
+         except Exception:
+             pass
+
+         info = {
+             'hardware_raw' : hardware,
+             'brand_raw' : processor_brand,
+
+             'l3_cache_size' : _friendly_bytes_to_int(cache_size),
+             'flags' : flags,
+             'vendor_id_raw' : vendor_id,
+             'stepping' : stepping,
+             'model' : model,
+             'family' : family,
+         }
+
+         # Make the Hz the same for actual and advertised if missing any
+         if not hz_advertised or hz_advertised == '0.0':
+             hz_advertised = hz_actual
+             scale = 6
+         elif not hz_actual or hz_actual == '0.0':
+             hz_actual = hz_advertised
+
+         # Add the Hz if there is one
+         if _hz_short_to_full(hz_advertised, scale) > (0, 0):
+             info['hz_advertised_friendly'] = _hz_short_to_friendly(hz_advertised, scale)
+             info['hz_advertised'] = _hz_short_to_full(hz_advertised, scale)
+         if _hz_short_to_full(hz_actual, scale) > (0, 0):
+             info['hz_actual_friendly'] = _hz_short_to_friendly(hz_actual, 6)
+             info['hz_actual'] = _hz_short_to_full(hz_actual, 6)
+
+         info = _filter_dict_keys_with_empty_values(info, {'stepping':0, 'model':0, 'family':0})
+         g_trace.success()
+         return info
+     except Exception as err:
+         g_trace.fail(err)
+         #raise # NOTE: To have this throw on error, uncomment this line
+         return {}
+
+ def _get_cpu_info_from_cpufreq_info():
+     '''
+     Returns the CPU info gathered from cpufreq-info.
+     Returns {} if cpufreq-info is not found.
+     '''
+
+     g_trace.header('Trying to get info from cpufreq-info ...')
+
+     try:
+         hz_brand, scale = '0.0', 0
+
+         if not DataSource.has_cpufreq_info():
+             g_trace.fail('Failed to find cpufreq-info. Skipping ...')
+             return {}
+
+         returncode, output = DataSource.cpufreq_info()
+         if returncode != 0:
+             g_trace.fail('Failed to run cpufreq-info. Skipping ...')
+             return {}
+
+         hz_brand = output.split('current CPU frequency is')[1].split('\n')[0]
+         i = hz_brand.find('Hz')
+         assert(i != -1)
+         hz_brand = hz_brand[0 : i+2].strip().lower()
+
+         if hz_brand.endswith('mhz'):
+             scale = 6
+         elif hz_brand.endswith('ghz'):
+             scale = 9
+         hz_brand = hz_brand.rstrip('mhz').rstrip('ghz').strip()
+         hz_brand = _to_decimal_string(hz_brand)
+
+         info = {
+             'hz_advertised_friendly' : _hz_short_to_friendly(hz_brand, scale),
+             'hz_actual_friendly' : _hz_short_to_friendly(hz_brand, scale),
+             'hz_advertised' : _hz_short_to_full(hz_brand, scale),
+             'hz_actual' : _hz_short_to_full(hz_brand, scale),
+         }
+
+         info = _filter_dict_keys_with_empty_values(info)
+         g_trace.success()
+         return info
+     except Exception as err:
+         g_trace.fail(err)
+         #raise # NOTE: To have this throw on error, uncomment this line
+         return {}
+
+def _get_cpu_info_from_lscpu():
+	'''
+	Returns the CPU info gathered from lscpu.
+	Returns {} if lscpu is not found.
+	'''
+
+	g_trace.header('Trying to get info from lscpu ...')
+
+	try:
+		if not DataSource.has_lscpu():
+			g_trace.fail('Failed to find lscpu. Skipping ...')
+			return {}
+
+		returncode, output = DataSource.lscpu()
+		if returncode != 0:
+			g_trace.fail('Failed to run lscpu. Skipping ...')
+			return {}
+
+		info = {}
+
+		new_hz = _get_field(False, output, None, None, 'CPU max MHz', 'CPU MHz')
+		if new_hz:
+			new_hz = _to_decimal_string(new_hz)
+			scale = 6
+			info['hz_advertised_friendly'] = _hz_short_to_friendly(new_hz, scale)
+			info['hz_actual_friendly'] = _hz_short_to_friendly(new_hz, scale)
+			info['hz_advertised'] = _hz_short_to_full(new_hz, scale)
+			info['hz_actual'] = _hz_short_to_full(new_hz, scale)
+
+		new_hz = _get_field(False, output, None, None, 'CPU dynamic MHz', 'CPU static MHz')
+		if new_hz:
+			new_hz = _to_decimal_string(new_hz)
+			scale = 6
+			info['hz_advertised_friendly'] = _hz_short_to_friendly(new_hz, scale)
+			info['hz_actual_friendly'] = _hz_short_to_friendly(new_hz, scale)
+			info['hz_advertised'] = _hz_short_to_full(new_hz, scale)
+			info['hz_actual'] = _hz_short_to_full(new_hz, scale)
+
+		vendor_id = _get_field(False, output, None, None, 'Vendor ID')
+		if vendor_id:
+			info['vendor_id_raw'] = vendor_id
+
+		brand = _get_field(False, output, None, None, 'Model name')
+		if brand:
+			info['brand_raw'] = brand
+		else:
+			brand = _get_field(False, output, None, None, 'Model')
+			if brand and not brand.isdigit():
+				info['brand_raw'] = brand
+
+		family = _get_field(False, output, None, None, 'CPU family')
+		if family and family.isdigit():
+			info['family'] = int(family)
+
+		stepping = _get_field(False, output, None, None, 'Stepping')
+		if stepping and stepping.isdigit():
+			info['stepping'] = int(stepping)
+
+		model = _get_field(False, output, None, None, 'Model')
+		if model and model.isdigit():
+			info['model'] = int(model)
+
+		l1_data_cache_size = _get_field(False, output, None, None, 'L1d cache')
+		if l1_data_cache_size:
+			l1_data_cache_size = l1_data_cache_size.split('(')[0].strip()
+			info['l1_data_cache_size'] = _friendly_bytes_to_int(l1_data_cache_size)
+
+		l1_instruction_cache_size = _get_field(False, output, None, None, 'L1i cache')
+		if l1_instruction_cache_size:
+			l1_instruction_cache_size = l1_instruction_cache_size.split('(')[0].strip()
+			info['l1_instruction_cache_size'] = _friendly_bytes_to_int(l1_instruction_cache_size)
+
+		l2_cache_size = _get_field(False, output, None, None, 'L2 cache', 'L2d cache')
+		if l2_cache_size:
+			l2_cache_size = l2_cache_size.split('(')[0].strip()
+			info['l2_cache_size'] = _friendly_bytes_to_int(l2_cache_size)
+
+		l3_cache_size = _get_field(False, output, None, None, 'L3 cache')
+		if l3_cache_size:
+			l3_cache_size = l3_cache_size.split('(')[0].strip()
+			info['l3_cache_size'] = _friendly_bytes_to_int(l3_cache_size)
+
+		# Flags
+		flags = _get_field(False, output, None, None, 'flags', 'Features', 'ASEs implemented')
+		if flags:
+			flags = flags.split()
+			flags.sort()
+			info['flags'] = flags
+
+		info = _filter_dict_keys_with_empty_values(info, {'stepping':0, 'model':0, 'family':0})
+		g_trace.success()
+		return info
+	except Exception as err:
+		g_trace.fail(err)
+		#raise # NOTE: To have this throw on error, uncomment this line
+		return {}
+
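The lscpu path above repeatedly asks `_get_field` for the value of a named `Name: value` line. A simplified, hypothetical stand-in (the real `_get_field` also supports type conversion and defaults) shows the core lookup:

```python
def get_field(output, *names):
    """Return the value of the first 'Name: value' line whose name
    matches one of *names, or None. Illustrative stand-in for the
    _get_field helper, limited to lscpu's colon-separated format.
    """
    for line in output.splitlines():
        if ':' not in line:
            continue
        key, _, value = line.partition(':')
        if key.strip() in names:
            return value.strip()
    return None
```

For example, `get_field(output, 'CPU max MHz', 'CPU MHz')` tries each field name in the lines of `output` and returns the first value found.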
+def _get_cpu_info_from_dmesg():
+	'''
+	Returns the CPU info gathered from dmesg.
+	Returns {} if dmesg is not found or does not have the desired info.
+	'''
+
+	g_trace.header('Trying to get info from the dmesg ...')
+
+	# Just return {} if this arch has an unreliable dmesg log
+	arch, bits = _parse_arch(DataSource.arch_string_raw)
+	if arch in ['S390X']:
+		g_trace.fail('Running on S390X. Skipping ...')
+		return {}
+
+	# Just return {} if there is no dmesg
+	if not DataSource.has_dmesg():
+		g_trace.fail('Failed to find dmesg. Skipping ...')
+		return {}
+
+	# If dmesg fails return {}
+	returncode, output = DataSource.dmesg_a()
+	if output is None or returncode != 0:
+		g_trace.fail('Failed to run \"dmesg -a\". Skipping ...')
+		return {}
+
+	info = _parse_dmesg_output(output)
+	g_trace.success()
+	return info
+
+
+# https://openpowerfoundation.org/wp-content/uploads/2016/05/LoPAPR_DRAFT_v11_24March2016_cmt1.pdf
+# page 767
+def _get_cpu_info_from_ibm_pa_features():
+	'''
+	Returns the CPU info gathered from lsprop /proc/device-tree/cpus/*/ibm,pa-features
+	Returns {} if lsprop is not found or ibm,pa-features does not have the desired info.
+	'''
+
+	g_trace.header('Trying to get info from lsprop ...')
+
+	try:
+		# Just return {} if there is no lsprop
+		if not DataSource.has_ibm_pa_features():
+			g_trace.fail('Failed to find lsprop. Skipping ...')
+			return {}
+
+		# If ibm,pa-features fails return {}
+		returncode, output = DataSource.ibm_pa_features()
+		if output is None or returncode != 0:
+			g_trace.fail('Failed to glob /proc/device-tree/cpus/*/ibm,pa-features. Skipping ...')
+			return {}
+
+		# Filter out invalid characters from output
+		value = output.split("ibm,pa-features")[1].lower()
+		value = [s for s in value if s in list('0123456789abcdef')]
+		value = ''.join(value)
+
+		# Get data converted to Uint32 chunks
+		left = int(value[0 : 8], 16)
+		right = int(value[8 : 16], 16)
+
+		# Get the CPU flags
+		flags = {
+			# Byte 0
+			'mmu' : _is_bit_set(left, 0),
+			'fpu' : _is_bit_set(left, 1),
+			'slb' : _is_bit_set(left, 2),
+			'run' : _is_bit_set(left, 3),
+			#'reserved' : _is_bit_set(left, 4),
+			'dabr' : _is_bit_set(left, 5),
+			'ne' : _is_bit_set(left, 6),
+			'wtr' : _is_bit_set(left, 7),
+
+			# Byte 1
+			'mcr' : _is_bit_set(left, 8),
+			'dsisr' : _is_bit_set(left, 9),
+			'lp' : _is_bit_set(left, 10),
+			'ri' : _is_bit_set(left, 11),
+			'dabrx' : _is_bit_set(left, 12),
+			'sprg3' : _is_bit_set(left, 13),
+			'rislb' : _is_bit_set(left, 14),
+			'pp' : _is_bit_set(left, 15),
+
+			# Byte 2
+			'vpm' : _is_bit_set(left, 16),
+			'dss_2.05' : _is_bit_set(left, 17),
+			#'reserved' : _is_bit_set(left, 18),
+			'dar' : _is_bit_set(left, 19),
+			#'reserved' : _is_bit_set(left, 20),
+			'ppr' : _is_bit_set(left, 21),
+			'dss_2.02' : _is_bit_set(left, 22),
+			'dss_2.06' : _is_bit_set(left, 23),
+
+			# Byte 3
+			'lsd_in_dscr' : _is_bit_set(left, 24),
+			'ugr_in_dscr' : _is_bit_set(left, 25),
+			#'reserved' : _is_bit_set(left, 26),
+			#'reserved' : _is_bit_set(left, 27),
+			#'reserved' : _is_bit_set(left, 28),
+			#'reserved' : _is_bit_set(left, 29),
+			#'reserved' : _is_bit_set(left, 30),
+			#'reserved' : _is_bit_set(left, 31),
+
+			# Byte 4
+			'sso_2.06' : _is_bit_set(right, 0),
+			#'reserved' : _is_bit_set(right, 1),
+			#'reserved' : _is_bit_set(right, 2),
+			#'reserved' : _is_bit_set(right, 3),
+			#'reserved' : _is_bit_set(right, 4),
+			#'reserved' : _is_bit_set(right, 5),
+			#'reserved' : _is_bit_set(right, 6),
+			#'reserved' : _is_bit_set(right, 7),
+
+			# Byte 5
+			'le' : _is_bit_set(right, 8),
+			'cfar' : _is_bit_set(right, 9),
+			'eb' : _is_bit_set(right, 10),
+			'lsq_2.07' : _is_bit_set(right, 11),
+			#'reserved' : _is_bit_set(right, 12),
+			#'reserved' : _is_bit_set(right, 13),
+			#'reserved' : _is_bit_set(right, 14),
+			#'reserved' : _is_bit_set(right, 15),
+
+			# Byte 6
+			'dss_2.07' : _is_bit_set(right, 16),
+			#'reserved' : _is_bit_set(right, 17),
+			#'reserved' : _is_bit_set(right, 18),
+			#'reserved' : _is_bit_set(right, 19),
+			#'reserved' : _is_bit_set(right, 20),
+			#'reserved' : _is_bit_set(right, 21),
+			#'reserved' : _is_bit_set(right, 22),
+			#'reserved' : _is_bit_set(right, 23),
+
+			# Byte 7
+			#'reserved' : _is_bit_set(right, 24),
+			#'reserved' : _is_bit_set(right, 25),
+			#'reserved' : _is_bit_set(right, 26),
+			#'reserved' : _is_bit_set(right, 27),
+			#'reserved' : _is_bit_set(right, 28),
+			#'reserved' : _is_bit_set(right, 29),
+			#'reserved' : _is_bit_set(right, 30),
+			#'reserved' : _is_bit_set(right, 31),
+		}
+
+		# Get a list of only the flags that are true
+		flags = [k for k, v in flags.items() if v]
+		flags.sort()
+
+		info = {
+			'flags' : flags
+		}
+		info = _filter_dict_keys_with_empty_values(info)
+		g_trace.success()
+		return info
+	except Exception as err:
+		g_trace.fail(err)
+		return {}
+
+
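The ibm,pa-features parsing above numbers bits big-endian style: bit 0 is the most significant bit of each 32-bit chunk, matching how the LoPAPR table lays out bytes. A minimal sketch of the assumed `_is_bit_set` contract under that convention:

```python
def is_bit_set(value, bit):
    """True if the given bit of a 32-bit value is set, counting bit 0
    as the most significant bit -- the numbering the ibm,pa-features
    parsing relies on. A sketch of the assumed _is_bit_set behavior.
    """
    mask = 0x80000000 >> bit
    return (value & mask) != 0
```

So for `left = 0x80000000`, only bit 0 ('mmu') is set, while `0x00000001` sets bit 31.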
+def _get_cpu_info_from_cat_var_run_dmesg_boot():
+	'''
+	Returns the CPU info gathered from /var/run/dmesg.boot.
+	Returns {} if dmesg is not found or does not have the desired info.
+	'''
+
+	g_trace.header('Trying to get info from the /var/run/dmesg.boot log ...')
+
+	# Just return {} if there is no /var/run/dmesg.boot
+	if not DataSource.has_var_run_dmesg_boot():
+		g_trace.fail('Failed to find /var/run/dmesg.boot file. Skipping ...')
+		return {}
+
+	# If dmesg.boot fails return {}
+	returncode, output = DataSource.cat_var_run_dmesg_boot()
+	if output is None or returncode != 0:
+		g_trace.fail('Failed to run \"cat /var/run/dmesg.boot\". Skipping ...')
+		return {}
+
+	info = _parse_dmesg_output(output)
+	g_trace.success()
+	return info
+
+
+def _get_cpu_info_from_sysctl():
+	'''
+	Returns the CPU info gathered from sysctl.
+	Returns {} if sysctl is not found.
+	'''
+
+	g_trace.header('Trying to get info from sysctl ...')
+
+	try:
+		# Just return {} if there is no sysctl
+		if not DataSource.has_sysctl():
+			g_trace.fail('Failed to find sysctl. Skipping ...')
+			return {}
+
+		# If sysctl fails return {}
+		returncode, output = DataSource.sysctl_machdep_cpu_hw_cpufrequency()
+		if output is None or returncode != 0:
+			g_trace.fail('Failed to run \"sysctl machdep.cpu hw.cpufrequency\". Skipping ...')
+			return {}
+
+		# Various fields
+		vendor_id = _get_field(False, output, None, None, 'machdep.cpu.vendor')
+		processor_brand = _get_field(True, output, None, None, 'machdep.cpu.brand_string')
+		cache_size = _get_field(False, output, int, 0, 'machdep.cpu.cache.size')
+		stepping = _get_field(False, output, int, 0, 'machdep.cpu.stepping')
+		model = _get_field(False, output, int, 0, 'machdep.cpu.model')
+		family = _get_field(False, output, int, 0, 'machdep.cpu.family')
+
+		# Flags
+		flags = _get_field(False, output, None, '', 'machdep.cpu.features').lower().split()
+		flags.extend(_get_field(False, output, None, '', 'machdep.cpu.leaf7_features').lower().split())
+		flags.extend(_get_field(False, output, None, '', 'machdep.cpu.extfeatures').lower().split())
+		flags.sort()
+
+		# Convert from GHz/MHz string to Hz
+		hz_advertised, scale = _parse_cpu_brand_string(processor_brand)
+		hz_actual = _get_field(False, output, None, None, 'hw.cpufrequency')
+		hz_actual = _to_decimal_string(hz_actual)
+
+		info = {
+			'vendor_id_raw' : vendor_id,
+			'brand_raw' : processor_brand,
+
+			'hz_advertised_friendly' : _hz_short_to_friendly(hz_advertised, scale),
+			'hz_actual_friendly' : _hz_short_to_friendly(hz_actual, 0),
+			'hz_advertised' : _hz_short_to_full(hz_advertised, scale),
+			'hz_actual' : _hz_short_to_full(hz_actual, 0),
+
+			'l2_cache_size' : int(cache_size) * 1024,
+
+			'stepping' : stepping,
+			'model' : model,
+			'family' : family,
+			'flags' : flags
+		}
+
+		info = _filter_dict_keys_with_empty_values(info)
+		g_trace.success()
+		return info
+	except Exception as err:
+		g_trace.fail(err)
+		return {}
+
+
+def _get_cpu_info_from_sysinfo():
+	'''
+	Returns the CPU info gathered from sysinfo.
+	Returns {} if sysinfo is not found.
+	'''
+
+	info = _get_cpu_info_from_sysinfo_v1()
+	info.update(_get_cpu_info_from_sysinfo_v2())
+	return info
+
+def _get_cpu_info_from_sysinfo_v1():
+	'''
+	Returns the CPU info gathered from sysinfo.
+	Returns {} if sysinfo is not found.
+	'''
+
+	g_trace.header('Trying to get info from sysinfo version 1 ...')
+
+	try:
+		# Just return {} if there is no sysinfo
+		if not DataSource.has_sysinfo():
+			g_trace.fail('Failed to find sysinfo. Skipping ...')
+			return {}
+
+		# If sysinfo fails return {}
+		returncode, output = DataSource.sysinfo_cpu()
+		if output is None or returncode != 0:
+			g_trace.fail('Failed to run \"sysinfo -cpu\". Skipping ...')
+			return {}
+
+		# Various fields
+		vendor_id = '' #_get_field(False, output, None, None, 'CPU #0: ')
+		processor_brand = output.split('CPU #0: "')[1].split('"\n')[0].strip()
+		cache_size = '' #_get_field(False, output, None, None, 'machdep.cpu.cache.size')
+		stepping = int(output.split(', stepping ')[1].split(',')[0].strip())
+		model = int(output.split(', model ')[1].split(',')[0].strip())
+		family = int(output.split(', family ')[1].split(',')[0].strip())
+
+		# Flags
+		flags = []
+		for line in output.split('\n'):
+			if line.startswith('\t\t'):
+				for flag in line.strip().lower().split():
+					flags.append(flag)
+		flags.sort()
+
+		# Convert from GHz/MHz string to Hz
+		hz_advertised, scale = _parse_cpu_brand_string(processor_brand)
+		hz_actual = hz_advertised
+
+		info = {
+			'vendor_id_raw' : vendor_id,
+			'brand_raw' : processor_brand,
+
+			'hz_advertised_friendly' : _hz_short_to_friendly(hz_advertised, scale),
+			'hz_actual_friendly' : _hz_short_to_friendly(hz_actual, scale),
+			'hz_advertised' : _hz_short_to_full(hz_advertised, scale),
+			'hz_actual' : _hz_short_to_full(hz_actual, scale),
+
+			'l2_cache_size' : _to_friendly_bytes(cache_size),
+
+			'stepping' : stepping,
+			'model' : model,
+			'family' : family,
+			'flags' : flags
+		}
+
+		info = _filter_dict_keys_with_empty_values(info)
+		g_trace.success()
+		return info
+	except Exception as err:
+		g_trace.fail(err)
+		#raise # NOTE: To have this throw on error, uncomment this line
+		return {}
+
+def _get_cpu_info_from_sysinfo_v2():
+	'''
+	Returns the CPU info gathered from sysinfo.
+	Returns {} if sysinfo is not found.
+	'''
+
+	g_trace.header('Trying to get info from sysinfo version 2 ...')
+
+	try:
+		# Just return {} if there is no sysinfo
+		if not DataSource.has_sysinfo():
+			g_trace.fail('Failed to find sysinfo. Skipping ...')
+			return {}
+
+		# If sysinfo fails return {}
+		returncode, output = DataSource.sysinfo_cpu()
+		if output is None or returncode != 0:
+			g_trace.fail('Failed to run \"sysinfo -cpu\". Skipping ...')
+			return {}
+
+		# Various fields
+		vendor_id = '' #_get_field(False, output, None, None, 'CPU #0: ')
+		processor_brand = output.split('CPU #0: "')[1].split('"\n')[0].strip()
+		cache_size = '' #_get_field(False, output, None, None, 'machdep.cpu.cache.size')
+		signature = output.split('Signature:')[1].split('\n')[0].strip()
+
+		stepping = int(signature.split('stepping ')[1].split(',')[0].strip())
+		model = int(signature.split('model ')[1].split(',')[0].strip())
+		family = int(signature.split('family ')[1].split(',')[0].strip())
+
+		# Flags
+		def get_subsection_flags(output):
+			retval = []
+			for line in output.split('\n')[1:]:
+				if not line.startswith(' ') and not line.startswith('\t'): break
+				for entry in line.strip().lower().split(' '):
+					retval.append(entry)
+			return retval
+
+		flags = get_subsection_flags(output.split('Features: ')[1]) + \
+			get_subsection_flags(output.split('Extended Features (0x00000001): ')[1]) + \
+			get_subsection_flags(output.split('Extended Features (0x80000001): ')[1])
+		flags.sort()
+
+		# Convert from GHz/MHz string to Hz
+		lines = [n for n in output.split('\n') if n]
+		raw_hz = lines[0].split('running at ')[1].strip().lower()
+		hz_advertised = raw_hz.rstrip('mhz').rstrip('ghz').strip()
+		hz_advertised = _to_decimal_string(hz_advertised)
+		hz_actual = hz_advertised
+
+		scale = 0
+		if raw_hz.endswith('mhz'):
+			scale = 6
+		elif raw_hz.endswith('ghz'):
+			scale = 9
+
+		info = {
+			'vendor_id_raw' : vendor_id,
+			'brand_raw' : processor_brand,
+
+			'hz_advertised_friendly' : _hz_short_to_friendly(hz_advertised, scale),
+			'hz_actual_friendly' : _hz_short_to_friendly(hz_actual, scale),
+			'hz_advertised' : _hz_short_to_full(hz_advertised, scale),
+			'hz_actual' : _hz_short_to_full(hz_actual, scale),
+
+			'l2_cache_size' : _to_friendly_bytes(cache_size),
+
+			'stepping' : stepping,
+			'model' : model,
+			'family' : family,
+			'flags' : flags
+		}
+
+		info = _filter_dict_keys_with_empty_values(info)
+		g_trace.success()
+		return info
+	except Exception as err:
+		g_trace.fail(err)
+		#raise # NOTE: To have this throw on error, uncomment this line
+		return {}
+
+def _get_cpu_info_from_wmic():
+	'''
+	Returns the CPU info gathered from WMI.
+	Returns {} if not on Windows, or wmic is not installed.
+	'''
+	g_trace.header('Trying to get info from wmic ...')
+
+	try:
+		# Just return {} if not Windows or there is no wmic
+		if not DataSource.is_windows or not DataSource.has_wmic():
+			g_trace.fail('Failed to find WMIC, or not on Windows. Skipping ...')
+			return {}
+
+		returncode, output = DataSource.wmic_cpu()
+		if output is None or returncode != 0:
+			g_trace.fail('Failed to run wmic. Skipping ...')
+			return {}
+
+		# Break the list into key/value pairs
+		value = output.split("\n")
+		value = [s.rstrip().split('=') for s in value if '=' in s]
+		value = {k: v for k, v in value if v}
+
+		# Get the advertised MHz
+		processor_brand = value.get('Name')
+		hz_advertised, scale_advertised = _parse_cpu_brand_string(processor_brand)
+
+		# Get the actual MHz
+		hz_actual = value.get('CurrentClockSpeed')
+		scale_actual = 6
+		if hz_actual:
+			hz_actual = _to_decimal_string(hz_actual)
+
+		# Get cache sizes
+		l2_cache_size = value.get('L2CacheSize') # NOTE: L2CacheSize is in kilobytes
+		if l2_cache_size:
+			l2_cache_size = int(l2_cache_size) * 1024
+
+		l3_cache_size = value.get('L3CacheSize') # NOTE: L3CacheSize is in kilobytes
+		if l3_cache_size:
+			l3_cache_size = int(l3_cache_size) * 1024
+
+		# Get family, model, and stepping
+		family, model, stepping = '', '', ''
+		description = value.get('Description') or value.get('Caption')
+		entries = description.split(' ')
+
+		if 'Family' in entries and entries.index('Family') < len(entries)-1:
+			i = entries.index('Family')
+			family = int(entries[i + 1])
+
+		if 'Model' in entries and entries.index('Model') < len(entries)-1:
+			i = entries.index('Model')
+			model = int(entries[i + 1])
+
+		if 'Stepping' in entries and entries.index('Stepping') < len(entries)-1:
+			i = entries.index('Stepping')
+			stepping = int(entries[i + 1])
+
+		info = {
+			'vendor_id_raw' : value.get('Manufacturer'),
+			'brand_raw' : processor_brand,
+
+			'hz_advertised_friendly' : _hz_short_to_friendly(hz_advertised, scale_advertised),
+			'hz_actual_friendly' : _hz_short_to_friendly(hz_actual, scale_actual),
+			'hz_advertised' : _hz_short_to_full(hz_advertised, scale_advertised),
+			'hz_actual' : _hz_short_to_full(hz_actual, scale_actual),
+
+			'l2_cache_size' : l2_cache_size,
+			'l3_cache_size' : l3_cache_size,
+
+			'stepping' : stepping,
+			'model' : model,
+			'family' : family,
+		}
+
+		info = _filter_dict_keys_with_empty_values(info)
+		g_trace.success()
+		return info
+	except Exception as err:
+		g_trace.fail(err)
+		#raise # NOTE: To have this throw on error, uncomment this line
+		return {}
+
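The wmic path above turns `Key=Value` output lines into a dict, dropping entries whose value is empty. A small illustrative sketch of that step (using `split('=', 1)` so values containing `=` cannot break the unpacking, a slight hardening over the two-way split above):

```python
def parse_wmic_output(output):
    """Turn wmic-style 'Key=Value' lines into a dict, dropping keys
    whose value is empty. Illustrative sketch of the parsing above."""
    pairs = [line.rstrip().split('=', 1) for line in output.splitlines() if '=' in line]
    return {k: v for k, v in pairs if v}
```

Fields like `Name`, `CurrentClockSpeed`, and `L2CacheSize` are then read out of the resulting dict with `.get()`.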
+def _get_cpu_info_from_registry():
+	'''
+	Returns the CPU info gathered from the Windows Registry.
+	Returns {} if not on Windows.
+	'''
+
+	g_trace.header('Trying to get info from Windows registry ...')
+
+	try:
+		# Just return {} if not on Windows
+		if not DataSource.is_windows:
+			g_trace.fail('Not running on Windows. Skipping ...')
+			return {}
+
+		# Get the CPU name
+		processor_brand = DataSource.winreg_processor_brand().strip()
+
+		# Get the CPU vendor id
+		vendor_id = DataSource.winreg_vendor_id_raw()
+
+		# Get the CPU arch and bits
+		arch_string_raw = DataSource.winreg_arch_string_raw()
+		arch, bits = _parse_arch(arch_string_raw)
+
+		# Get the actual CPU Hz
+		hz_actual = DataSource.winreg_hz_actual()
+		hz_actual = _to_decimal_string(hz_actual)
+
+		# Get the advertised CPU Hz
+		hz_advertised, scale = _parse_cpu_brand_string(processor_brand)
+
+		# If advertised hz not found, use the actual hz
+		if hz_advertised == '0.0':
+			scale = 6
+			hz_advertised = _to_decimal_string(hz_actual)
+
+		# Get the CPU features
+		feature_bits = DataSource.winreg_feature_bits()
+
+		def is_set(bit):
+			mask = 0x80000000 >> bit
+			retval = mask & feature_bits > 0
+			return retval
+
+		# http://en.wikipedia.org/wiki/CPUID
+		# http://unix.stackexchange.com/questions/43539/what-do-the-flags-in-proc-cpuinfo-mean
+		# http://www.lohninger.com/helpcsuite/public_constants_cpuid.htm
+		flags = {
+			'fpu' : is_set(0), # Floating Point Unit
+			'vme' : is_set(1), # V86 Mode Extensions
+			'de' : is_set(2), # Debug Extensions - I/O breakpoints supported
+			'pse' : is_set(3), # Page Size Extensions (4 MB pages supported)
+			'tsc' : is_set(4), # Time Stamp Counter and RDTSC instruction are available
+			'msr' : is_set(5), # Model Specific Registers
+			'pae' : is_set(6), # Physical Address Extensions (36 bit address, 2MB pages)
+			'mce' : is_set(7), # Machine Check Exception supported
+			'cx8' : is_set(8), # Compare Exchange Eight Byte instruction available
+			'apic' : is_set(9), # Local APIC present (multiprocessor operation support)
+			'sepamd' : is_set(10), # Fast system calls (AMD only)
+			'sep' : is_set(11), # Fast system calls
+			'mtrr' : is_set(12), # Memory Type Range Registers
+			'pge' : is_set(13), # Page Global Enable
+			'mca' : is_set(14), # Machine Check Architecture
+			'cmov' : is_set(15), # Conditional MOVe instructions
+			'pat' : is_set(16), # Page Attribute Table
+			'pse36' : is_set(17), # 36 bit Page Size Extensions
+			'serial' : is_set(18), # Processor Serial Number
+			'clflush' : is_set(19), # Cache Flush
+			#'reserved1' : is_set(20), # reserved
+			'dts' : is_set(21), # Debug Trace Store
+			'acpi' : is_set(22), # ACPI support
+			'mmx' : is_set(23), # MultiMedia Extensions
+			'fxsr' : is_set(24), # FXSAVE and FXRSTOR instructions
+			'sse' : is_set(25), # SSE instructions
+			'sse2' : is_set(26), # SSE2 (WNI) instructions
+			'ss' : is_set(27), # self snoop
+			#'reserved2' : is_set(28), # reserved
+			'tm' : is_set(29), # Automatic clock control
+			'ia64' : is_set(30), # IA64 instructions
+			'3dnow' : is_set(31) # 3DNow! instructions available
+		}
+
+		# Get a list of only the flags that are true
+		flags = [k for k, v in flags.items() if v]
+		flags.sort()
+
+		info = {
+			'vendor_id_raw' : vendor_id,
+			'brand_raw' : processor_brand,
+
+			'hz_advertised_friendly' : _hz_short_to_friendly(hz_advertised, scale),
+			'hz_actual_friendly' : _hz_short_to_friendly(hz_actual, 6),
+			'hz_advertised' : _hz_short_to_full(hz_advertised, scale),
+			'hz_actual' : _hz_short_to_full(hz_actual, 6),
+
+			'flags' : flags
+		}
+
+		info = _filter_dict_keys_with_empty_values(info)
+		g_trace.success()
+		return info
+	except Exception as err:
+		g_trace.fail(err)
+		return {}
+
+def _get_cpu_info_from_kstat():
+	'''
+	Returns the CPU info gathered from isainfo and kstat.
+	Returns {} if isainfo or kstat are not found.
+	'''
+
+	g_trace.header('Trying to get info from kstat ...')
+
+	try:
+		# Just return {} if there is no isainfo or kstat
+		if not DataSource.has_isainfo() or not DataSource.has_kstat():
+			g_trace.fail('Failed to find isainfo or kstat. Skipping ...')
+			return {}
+
+		# If isainfo fails return {}
+		returncode, flag_output = DataSource.isainfo_vb()
+		if flag_output is None or returncode != 0:
+			g_trace.fail('Failed to run \"isainfo -vb\". Skipping ...')
+			return {}
+
+		# If kstat fails return {}
+		returncode, kstat = DataSource.kstat_m_cpu_info()
+		if kstat is None or returncode != 0:
+			g_trace.fail('Failed to run \"kstat -m cpu_info\". Skipping ...')
+			return {}
+
+		# Various fields
+		vendor_id = kstat.split('\tvendor_id ')[1].split('\n')[0].strip()
+		processor_brand = kstat.split('\tbrand ')[1].split('\n')[0].strip()
+		stepping = int(kstat.split('\tstepping ')[1].split('\n')[0].strip())
+		model = int(kstat.split('\tmodel ')[1].split('\n')[0].strip())
+		family = int(kstat.split('\tfamily ')[1].split('\n')[0].strip())
+
+		# Flags
+		flags = flag_output.strip().split('\n')[-1].strip().lower().split()
+		flags.sort()
+
+		# Convert from MHz string to Hz
+		scale = 6
+		hz_advertised = kstat.split('\tclock_MHz ')[1].split('\n')[0].strip()
+		hz_advertised = _to_decimal_string(hz_advertised)
+
+		# The current clock is already in Hz
+		hz_actual = kstat.split('\tcurrent_clock_Hz ')[1].split('\n')[0].strip()
+		hz_actual = _to_decimal_string(hz_actual)
+
+		info = {
+			'vendor_id_raw' : vendor_id,
+			'brand_raw' : processor_brand,
+
+			'hz_advertised_friendly' : _hz_short_to_friendly(hz_advertised, scale),
+			'hz_actual_friendly' : _hz_short_to_friendly(hz_actual, 0),
+			'hz_advertised' : _hz_short_to_full(hz_advertised, scale),
+			'hz_actual' : _hz_short_to_full(hz_actual, 0),
+
+			'stepping' : stepping,
+			'model' : model,
+			'family' : family,
+			'flags' : flags
+		}
+
+		info = _filter_dict_keys_with_empty_values(info)
+		g_trace.success()
+		return info
+	except Exception as err:
+		g_trace.fail(err)
+		return {}
+
+def _get_cpu_info_from_platform_uname():
+
+	g_trace.header('Trying to get info from platform.uname ...')
+
+	try:
+		uname = DataSource.uname_string_raw.split(',')[0]
+
+		family, model, stepping = (None, None, None)
+		entries = uname.split(' ')
+
+		if 'Family' in entries and entries.index('Family') < len(entries)-1:
+			i = entries.index('Family')
+			family = int(entries[i + 1])
+
+		if 'Model' in entries and entries.index('Model') < len(entries)-1:
+			i = entries.index('Model')
+			model = int(entries[i + 1])
+
+		if 'Stepping' in entries and entries.index('Stepping') < len(entries)-1:
+			i = entries.index('Stepping')
+			stepping = int(entries[i + 1])
+
+		info = {
+			'family' : family,
+			'model' : model,
+			'stepping' : stepping
+		}
+		info = _filter_dict_keys_with_empty_values(info)
+		g_trace.success()
+		return info
+	except Exception as err:
+		g_trace.fail(err)
+		return {}
+
+def _get_cpu_info_internal():
+	'''
+	Returns the CPU info by using the best sources of information for your OS.
+	Returns {} if nothing is found.
+	'''
+
+	g_trace.write('!' * 80)
+
+	# Get the CPU arch and bits
+	arch, bits = _parse_arch(DataSource.arch_string_raw)
+
+	friendly_maxsize = { 2**31-1: '32 bit', 2**63-1: '64 bit' }.get(sys.maxsize) or 'unknown bits'
+	friendly_version = "{0}.{1}.{2}.{3}.{4}".format(*sys.version_info)
+	PYTHON_VERSION = "{0} ({1})".format(friendly_version, friendly_maxsize)
+
+	info = {
+		'python_version' : PYTHON_VERSION,
+		'cpuinfo_version' : CPUINFO_VERSION,
+		'cpuinfo_version_string' : CPUINFO_VERSION_STRING,
+		'arch' : arch,
+		'bits' : bits,
+		'count' : DataSource.cpu_count,
+		'arch_string_raw' : DataSource.arch_string_raw,
+	}
+
+	g_trace.write("python_version: {0}".format(info['python_version']))
+	g_trace.write("cpuinfo_version: {0}".format(info['cpuinfo_version']))
+	g_trace.write("arch: {0}".format(info['arch']))
+	g_trace.write("bits: {0}".format(info['bits']))
+	g_trace.write("count: {0}".format(info['count']))
+	g_trace.write("arch_string_raw: {0}".format(info['arch_string_raw']))
+
+	# Try the Windows wmic
+	_copy_new_fields(info, _get_cpu_info_from_wmic())
+
+	# Try the Windows registry
+	_copy_new_fields(info, _get_cpu_info_from_registry())
+
+	# Try /proc/cpuinfo
+	_copy_new_fields(info, _get_cpu_info_from_proc_cpuinfo())
+
+	# Try cpufreq-info
+	_copy_new_fields(info, _get_cpu_info_from_cpufreq_info())
+
+	# Try lscpu
+	_copy_new_fields(info, _get_cpu_info_from_lscpu())
+
+	# Try sysctl
+	_copy_new_fields(info, _get_cpu_info_from_sysctl())
+
+	# Try kstat
+	_copy_new_fields(info, _get_cpu_info_from_kstat())
+
+	# Try dmesg
+	_copy_new_fields(info, _get_cpu_info_from_dmesg())
+
+	# Try /var/run/dmesg.boot
+	_copy_new_fields(info, _get_cpu_info_from_cat_var_run_dmesg_boot())
+
+	# Try lsprop ibm,pa-features
+	_copy_new_fields(info, _get_cpu_info_from_ibm_pa_features())
+
+	# Try sysinfo
+	_copy_new_fields(info, _get_cpu_info_from_sysinfo())
+
+	# Try querying the CPU cpuid register
+	# FIXME: This should print stdout and stderr to trace log
+	_copy_new_fields(info, _get_cpu_info_from_cpuid())
+
+	# Try platform.uname
+	_copy_new_fields(info, _get_cpu_info_from_platform_uname())
+
+	g_trace.write('!' * 80)
+
+	return info
+
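The sources above run in priority order, and `_copy_new_fields` merges each result into `info`. A minimal sketch of the assumed merge behavior, where fields already supplied by an earlier, higher-priority source win (the real helper may additionally union list fields such as `flags`):

```python
def copy_new_fields(info, new_info):
    """Merge fields from new_info into info without overwriting values
    an earlier, higher-priority source already supplied. A sketch of
    the assumed _copy_new_fields contract, not the library's code.
    """
    for key, value in new_info.items():
        if key not in info:
            info[key] = value
```

So if wmic already filled in `brand_raw`, a later source reporting a different `brand_raw` would be ignored.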
+def get_cpu_info_json():
+	'''
+	Returns the CPU info by using the best sources of information for your OS.
+	Returns the result in a json string
+	'''
+
+	import json
+
+	output = None
+
+	# If running under pyinstaller, run normally
+	if getattr(sys, 'frozen', False):
+		info = _get_cpu_info_internal()
+		output = json.dumps(info)
+		output = "{0}".format(output)
+	# If not running under pyinstaller, run in another process.
+	# This is done because multiprocessing has a design flaw that
+	# causes non-main programs to run multiple times on Windows.
+	else:
+		from subprocess import Popen, PIPE
+
+		command = [sys.executable, __file__, '--json']
+		p1 = Popen(command, stdout=PIPE, stderr=PIPE, stdin=PIPE)
+		output = p1.communicate()[0]
+
+		if p1.returncode != 0:
+			return "{}"
+
+		output = output.decode(encoding='UTF-8')
+
+	return output
+
+ def get_cpu_info():
2752
+ '''
2753
+ Returns the CPU info by using the best sources of information for your OS.
2754
+ Returns the result in a dict
2755
+ '''
2756
+
2757
+ import json
2758
+
2759
+ output = get_cpu_info_json()
2760
+
2761
+ # Convert JSON to Python with non unicode strings
2762
+ output = json.loads(output, object_hook = _utf_to_str)
2763
+
2764
+ return output
2765
+
2766
+ def main():
2767
+ from argparse import ArgumentParser
2768
+ import json
2769
+
2770
+ # Parse args
2771
+ parser = ArgumentParser(description='Gets CPU info with pure Python')
2772
+ parser.add_argument('--json', action='store_true', help='Return the info in JSON format')
2773
+ parser.add_argument('--version', action='store_true', help='Return the version of py-cpuinfo')
2774
+ parser.add_argument('--trace', action='store_true', help='Traces code paths used to find CPU info to file')
2775
+ args = parser.parse_args()
2776
+
2777
+ global g_trace
2778
+ g_trace = Trace(args.trace, False)
2779
+
2780
+ try:
2781
+ _check_arch()
2782
+ except Exception as err:
2783
+ sys.stderr.write(str(err) + "\n")
2784
+ sys.exit(1)
2785
+
2786
+ info = _get_cpu_info_internal()
2787
+
2788
+ if not info:
2789
+ sys.stderr.write("Failed to find cpu info\n")
2790
+ sys.exit(1)
2791
+
2792
+ if args.json:
2793
+ print(json.dumps(info))
2794
+ elif args.version:
2795
+ print(CPUINFO_VERSION_STRING)
2796
+ else:
2797
+ print('Python Version: {0}'.format(info.get('python_version', '')))
2798
+ print('Cpuinfo Version: {0}'.format(info.get('cpuinfo_version_string', '')))
2799
+ print('Vendor ID Raw: {0}'.format(info.get('vendor_id_raw', '')))
2800
+ print('Hardware Raw: {0}'.format(info.get('hardware_raw', '')))
2801
+ print('Brand Raw: {0}'.format(info.get('brand_raw', '')))
2802
+ print('Hz Advertised Friendly: {0}'.format(info.get('hz_advertised_friendly', '')))
2803
+ print('Hz Actual Friendly: {0}'.format(info.get('hz_actual_friendly', '')))
2804
+ print('Hz Advertised: {0}'.format(info.get('hz_advertised', '')))
2805
+ print('Hz Actual: {0}'.format(info.get('hz_actual', '')))
2806
+ print('Arch: {0}'.format(info.get('arch', '')))
2807
+ print('Bits: {0}'.format(info.get('bits', '')))
2808
+ print('Count: {0}'.format(info.get('count', '')))
2809
+ print('Arch String Raw: {0}'.format(info.get('arch_string_raw', '')))
2810
+ print('L1 Data Cache Size: {0}'.format(info.get('l1_data_cache_size', '')))
2811
+ print('L1 Instruction Cache Size: {0}'.format(info.get('l1_instruction_cache_size', '')))
2812
+ print('L2 Cache Size: {0}'.format(info.get('l2_cache_size', '')))
2813
+ print('L2 Cache Line Size: {0}'.format(info.get('l2_cache_line_size', '')))
2814
+ print('L2 Cache Associativity: {0}'.format(info.get('l2_cache_associativity', '')))
2815
+ print('L3 Cache Size: {0}'.format(info.get('l3_cache_size', '')))
2816
+ print('Stepping: {0}'.format(info.get('stepping', '')))
2817
+ print('Model: {0}'.format(info.get('model', '')))
2818
+ print('Family: {0}'.format(info.get('family', '')))
2819
+ print('Processor Type: {0}'.format(info.get('processor_type', '')))
2820
+ print('Flags: {0}'.format(', '.join(info.get('flags', ''))))
2821
+
2822
+
2823
+ if __name__ == '__main__':
2824
+ main()
2825
+ else:
2826
+ g_trace = Trace(False, False)
2827
+ _check_arch()
valley/lib/python3.10/site-packages/ffmpy-0.4.0.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ pip
valley/lib/python3.10/site-packages/ffmpy-0.4.0.dist-info/RECORD ADDED
@@ -0,0 +1,9 @@
+ __pycache__/ffmpy.cpython-310.pyc,,
+ ffmpy-0.4.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+ ffmpy-0.4.0.dist-info/LICENSE,sha256=Ge5d77thSMVMLWBa3du0njex83-_GOW0Xk4frpS-UzE,1059
+ ffmpy-0.4.0.dist-info/METADATA,sha256=CIiGN8oUQb-wE_xqitX7ozL1YdE7fnxp0Cw8Pi9bzSE,2923
+ ffmpy-0.4.0.dist-info/RECORD,,
+ ffmpy-0.4.0.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ ffmpy-0.4.0.dist-info/WHEEL,sha256=sP946D7jFCHeNz5Iq4fL4Lu-PrWrFsgfLXbbkciIZwg,88
+ ffmpy.py,sha256=W_mYavEP_zT8_-jSmAWEJYJJMNpRz-7iXPqBsK-LhPs,9591
+ py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
valley/lib/python3.10/site-packages/ffmpy-0.4.0.dist-info/REQUESTED ADDED
File without changes
valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/AUTHORS ADDED
@@ -0,0 +1,98 @@
+ James Robert
+ github: jiaaro
+ twitter: @jiaaro
+ web: jiaaro.com
+ email: pydub@jiaaro.com
+
+ Marc Webbie
+ github: marcwebbie
+
+ Jean-philippe Serafin
+ github: jeanphix
+
+ Anurag Ramdasan
+ github: AnuragRamdasan
+
+ Choongmin Lee
+ github: clee704
+
+ Patrick Pittman
+ github: ptpittman
+
+ Hunter Lang
+ github: hunterlang
+
+ Alexey
+ github: nihisil
+
+ Jaymz Campbell
+ github: jaymzcd
+
+ Ross McFarland
+ github: ross
+
+ John McMellen
+ github: jmcmellen
+
+ Johan Lövgren
+ github: dashj
+
+ Joachim Krüger
+ github: jkrgr
+
+ Shichao An
+ github: shichao-an
+
+ Michael Bortnyck
+ github: mbortnyck
+
+ André Cloete
+ github: aj-cloete
+
+ David Acacio
+ github: dacacioa
+
+ Thiago Abdnur
+ github: bolaum
+
+ Aurélien Ooms
+ github: aureooms
+
+ Mike Mattozzi
+ github: mmattozzi
+
+ Marcio Mazza
+ github: marciomazza
+
+ Sungsu Lim
+ github: proflim
+
+ Evandro Myller
+ github: emyller
+
+ Sérgio Agostinho
+ github: SergioRAgostinho
+
+ Antonio Larrosa
+ github: antlarr
+
+ Aaron Craig
+ github: craigthelinguist
+
+ Carlos del Castillo
+ github: greyalien502
+
+ Yudong Sun
+ github: sunjerry019
+
+ Jorge Perianez
+ github: JPery
+
+ Chendi Luo
+ github: Creonalia
+
+ Daniel Lefevre
+ gitHub: dplefevre
+
+ Grzegorz Kotfis
+ github: gkotfis
valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ pip
valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/LICENSE ADDED
@@ -0,0 +1,20 @@
+ Copyright (c) 2011 James Robert, http://jiaaro.com
+
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of this software and associated documentation files (the
+ "Software"), to deal in the Software without restriction, including
+ without limitation the rights to use, copy, modify, merge, publish,
+ distribute, sublicense, and/or sell copies of the Software, and to
+ permit persons to whom the Software is furnished to do so, subject to
+ the following conditions:
+
+ The above copyright notice and this permission notice shall be
+ included in all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+ NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+ LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+ WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/METADATA ADDED
@@ -0,0 +1,37 @@
+ Metadata-Version: 2.1
+ Name: pydub
+ Version: 0.25.1
+ Summary: Manipulate audio with an simple and easy high level interface
+ Home-page: http://pydub.com
+ Author: James Robert
+ Author-email: jiaaro@gmail.com
+ License: MIT
+ Keywords: audio sound high-level
+ Platform: UNKNOWN
+ Classifier: Development Status :: 5 - Production/Stable
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Programming Language :: Python
+ Classifier: Programming Language :: Python :: 2
+ Classifier: Programming Language :: Python :: 2.7
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.4
+ Classifier: Programming Language :: Python :: 3.5
+ Classifier: Programming Language :: Python :: 3.6
+ Classifier: Programming Language :: Python :: 3.7
+ Classifier: Programming Language :: Python :: 3.8
+ Classifier: Intended Audience :: Developers
+ Classifier: Operating System :: OS Independent
+ Classifier: Topic :: Multimedia :: Sound/Audio
+ Classifier: Topic :: Multimedia :: Sound/Audio :: Analysis
+ Classifier: Topic :: Multimedia :: Sound/Audio :: Conversion
+ Classifier: Topic :: Multimedia :: Sound/Audio :: Editors
+ Classifier: Topic :: Multimedia :: Sound/Audio :: Mixers
+ Classifier: Topic :: Software Development :: Libraries
+ Classifier: Topic :: Utilities
+
+
+ Manipulate audio with an simple and easy high level interface.
+
+ See the README file for details, usage info, and a list of gotchas.
+
+
valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/RECORD ADDED
@@ -0,0 +1,30 @@
+ pydub-0.25.1.dist-info/AUTHORS,sha256=AyY2PS9I2enOyBnUnxcpeAX-NnMNWLQT4yDtg8IIy78,1250
+ pydub-0.25.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+ pydub-0.25.1.dist-info/LICENSE,sha256=roVlNiJMx6OJ6Wh3H8XyWYFL3Q2mNTnPcigq2672iXo,1074
+ pydub-0.25.1.dist-info/METADATA,sha256=f0M8_ZVtbiYoUI9ejXIeJ03Jo9A5Nbi-0V1bVqs5iYk,1406
+ pydub-0.25.1.dist-info/RECORD,,
+ pydub-0.25.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ pydub-0.25.1.dist-info/WHEEL,sha256=Z-nyYpwrcSqxfdux5Mbn_DQ525iP7J2DG3JgGvOYyTQ,110
+ pydub-0.25.1.dist-info/top_level.txt,sha256=PHhiDCQVZdycZxfKL2lQozruBT6ZhvyZAwqjRrw3t0w,6
+ pydub/__init__.py,sha256=w1Xv1awbaR3fMhTNE1-grnfswgARTNQrKpBzfZ--VBA,39
+ pydub/__pycache__/__init__.cpython-310.pyc,,
+ pydub/__pycache__/audio_segment.cpython-310.pyc,,
+ pydub/__pycache__/effects.cpython-310.pyc,,
+ pydub/__pycache__/exceptions.cpython-310.pyc,,
+ pydub/__pycache__/generators.cpython-310.pyc,,
+ pydub/__pycache__/logging_utils.cpython-310.pyc,,
+ pydub/__pycache__/playback.cpython-310.pyc,,
+ pydub/__pycache__/pyaudioop.cpython-310.pyc,,
+ pydub/__pycache__/scipy_effects.cpython-310.pyc,,
+ pydub/__pycache__/silence.cpython-310.pyc,,
+ pydub/__pycache__/utils.cpython-310.pyc,,
+ pydub/audio_segment.py,sha256=Nf5VkHGY1v9Jqb7NtEYfwRpLrfqusfBdPGOZsi7R5Cg,49185
+ pydub/effects.py,sha256=1HUMzhefrwG_E1rTnzvbl-P0-KNuwHklCnu8QCGS7jA,11507
+ pydub/exceptions.py,sha256=osgXoUujwpH8K6hr80iYpW30CMBDFwqyaRD-5d7ZpKs,455
+ pydub/generators.py,sha256=u6q7J8JLOY-uEZqMPUTzakxyua3XNQcPiDsuiK2-lLA,4045
+ pydub/logging_utils.py,sha256=WuSqfzn4zyT7PxXHGV-PXMDynufeM6sC6eSmVlGX2RU,374
+ pydub/playback.py,sha256=zFngVclUL_7oDipjzKC8b7jToPNV11DV28rGyH8pio0,1987
+ pydub/pyaudioop.py,sha256=Dp_cQgAyYjD4OV2ZHuxtKI2KABuPi9YYNRUF8giR80Q,13094
+ pydub/scipy_effects.py,sha256=U2p8AQuVreTp5MrtUAzRbWgOHUc6Dwq0TAG_RtEg-7g,6637
+ pydub/silence.py,sha256=F6MV0VlaO6mtuisjLGks_UR-GVmzO1v87_NKvzwRc30,6457
+ pydub/utils.py,sha256=W71pgJFbbNP3adH63yn0Eo0CLLVgzXG7WHYSXpWvdyc,12368
valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/REQUESTED ADDED
File without changes
valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/WHEEL ADDED
@@ -0,0 +1,6 @@
+ Wheel-Version: 1.0
+ Generator: bdist_wheel (0.36.2)
+ Root-Is-Purelib: true
+ Tag: py2-none-any
+ Tag: py3-none-any
+
valley/lib/python3.10/site-packages/pydub-0.25.1.dist-info/top_level.txt ADDED
@@ -0,0 +1 @@
+ pydub
valley/lib/python3.10/site-packages/referencing/__init__.py ADDED
@@ -0,0 +1,7 @@
+ """
+ Cross-specification, implementation-agnostic JSON referencing.
+ """
+
+ from referencing._core import Anchor, Registry, Resource, Specification
+
+ __all__ = ["Anchor", "Registry", "Resource", "Specification"]
valley/lib/python3.10/site-packages/referencing/__pycache__/jsonschema.cpython-310.pyc ADDED
Binary file (11.5 kB). View file
 
valley/lib/python3.10/site-packages/referencing/__pycache__/typing.cpython-310.pyc ADDED
Binary file (2.04 kB). View file
 
valley/lib/python3.10/site-packages/referencing/_attrs.py ADDED
@@ -0,0 +1,31 @@
+ from __future__ import annotations
+
+ from typing import NoReturn, TypeVar
+
+ from attrs import define as _define, frozen as _frozen
+
+ _T = TypeVar("_T")
+
+
+ def define(cls: type[_T]) -> type[_T]: # pragma: no cover
+ cls.__init_subclass__ = _do_not_subclass
+ return _define(cls)
+
+
+ def frozen(cls: type[_T]) -> type[_T]:
+ cls.__init_subclass__ = _do_not_subclass
+ return _frozen(cls)
+
+
+ class UnsupportedSubclassing(Exception):
+ def __str__(self):
+ return (
+ "Subclassing is not part of referencing's public API. "
+ "If no other suitable API exists for what you're trying to do, "
+ "feel free to file an issue asking for one."
+ )
+
+
+ @staticmethod
+ def _do_not_subclass() -> NoReturn: # pragma: no cover
+ raise UnsupportedSubclassing()
valley/lib/python3.10/site-packages/referencing/_attrs.pyi ADDED
@@ -0,0 +1,20 @@
+ from typing import Any, Callable, TypeVar, Union
+
+ from attr import attrib, field
+
+ class UnsupportedSubclassing(Exception): ...
+
+ _T = TypeVar("_T")
+
+ def __dataclass_transform__(
+ *,
+ frozen_default: bool = False,
+ field_descriptors: tuple[Union[type, Callable[..., Any]], ...] = ...,
+ ) -> Callable[[_T], _T]: ...
+ @__dataclass_transform__(field_descriptors=(attrib, field))
+ def define(cls: type[_T]) -> type[_T]: ...
+ @__dataclass_transform__(
+ frozen_default=True,
+ field_descriptors=(attrib, field),
+ )
+ def frozen(cls: type[_T]) -> type[_T]: ...
valley/lib/python3.10/site-packages/referencing/_core.py ADDED
@@ -0,0 +1,729 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ from __future__ import annotations
2
+
3
+ from collections.abc import Iterable, Iterator, Sequence
4
+ from enum import Enum
5
+ from typing import Any, Callable, ClassVar, Generic, Protocol, TypeVar
6
+ from urllib.parse import unquote, urldefrag, urljoin
7
+
8
+ from attrs import evolve, field
9
+ from rpds import HashTrieMap, HashTrieSet, List
10
+
11
+ from referencing import exceptions
12
+ from referencing._attrs import frozen
13
+ from referencing.typing import URI, Anchor as AnchorType, D, Mapping, Retrieve
14
+
15
+ EMPTY_UNCRAWLED: HashTrieSet[URI] = HashTrieSet()
16
+ EMPTY_PREVIOUS_RESOLVERS: List[URI] = List()
17
+
18
+
19
+ class _Unset(Enum):
20
+ """
21
+ What sillyness...
22
+ """
23
+
24
+ SENTINEL = 1
25
+
26
+
27
+ _UNSET = _Unset.SENTINEL
28
+
29
+
30
+ class _MaybeInSubresource(Protocol[D]):
31
+ def __call__(
32
+ self,
33
+ segments: Sequence[int | str],
34
+ resolver: Resolver[D],
35
+ subresource: Resource[D],
36
+ ) -> Resolver[D]: ...
37
+
38
+
39
+ def _detect_or_error(contents: D) -> Specification[D]:
40
+ if not isinstance(contents, Mapping):
41
+ raise exceptions.CannotDetermineSpecification(contents)
42
+
43
+ jsonschema_dialect_id = contents.get("$schema") # type: ignore[reportUnknownMemberType]
44
+ if not isinstance(jsonschema_dialect_id, str):
45
+ raise exceptions.CannotDetermineSpecification(contents)
46
+
47
+ from referencing.jsonschema import specification_with
48
+
49
+ return specification_with(jsonschema_dialect_id)
50
+
51
+
52
+ def _detect_or_default(
53
+ default: Specification[D],
54
+ ) -> Callable[[D], Specification[D]]:
55
+ def _detect(contents: D) -> Specification[D]:
56
+ if not isinstance(contents, Mapping):
57
+ return default
58
+
59
+ jsonschema_dialect_id = contents.get("$schema") # type: ignore[reportUnknownMemberType]
60
+ if jsonschema_dialect_id is None:
61
+ return default
62
+
63
+ from referencing.jsonschema import specification_with
64
+
65
+ return specification_with(
66
+ jsonschema_dialect_id, # type: ignore[reportUnknownArgumentType]
67
+ default=default,
68
+ )
69
+
70
+ return _detect
71
+
72
+
73
+ class _SpecificationDetector:
74
+ def __get__(
75
+ self,
76
+ instance: Specification[D] | None,
77
+ cls: type[Specification[D]],
78
+ ) -> Callable[[D], Specification[D]]:
79
+ if instance is None:
80
+ return _detect_or_error
81
+ else:
82
+ return _detect_or_default(instance)
83
+
84
+
85
+ @frozen
86
+ class Specification(Generic[D]):
87
+ """
88
+ A specification which defines referencing behavior.
89
+
90
+ The various methods of a `Specification` allow for varying referencing
91
+ behavior across JSON Schema specification versions, etc.
92
+ """
93
+
94
+ #: A short human-readable name for the specification, used for debugging.
95
+ name: str
96
+
97
+ #: Find the ID of a given document.
98
+ id_of: Callable[[D], URI | None]
99
+
100
+ #: Retrieve the subresources of the given document (without traversing into
101
+ #: the subresources themselves).
102
+ subresources_of: Callable[[D], Iterable[D]]
103
+
104
+ #: While resolving a JSON pointer, conditionally enter a subresource
105
+ #: (if e.g. we have just entered a keyword whose value is a subresource)
106
+ maybe_in_subresource: _MaybeInSubresource[D]
107
+
108
+ #: Retrieve the anchors contained in the given document.
109
+ _anchors_in: Callable[
110
+ [Specification[D], D],
111
+ Iterable[AnchorType[D]],
112
+ ] = field(alias="anchors_in")
113
+
114
+ #: An opaque specification where resources have no subresources
115
+ #: nor internal identifiers.
116
+ OPAQUE: ClassVar[Specification[Any]]
117
+
118
+ #: Attempt to discern which specification applies to the given contents.
119
+ #:
120
+ #: May be called either as an instance method or as a class method, with
121
+ #: slightly different behavior in the following case:
122
+ #:
123
+ #: Recall that not all contents contains enough internal information about
124
+ #: which specification it is written for -- the JSON Schema ``{}``,
125
+ #: for instance, is valid under many different dialects and may be
126
+ #: interpreted as any one of them.
127
+ #:
128
+ #: When this method is used as an instance method (i.e. called on a
129
+ #: specific specification), that specification is used as the default
130
+ #: if the given contents are unidentifiable.
131
+ #:
132
+ #: On the other hand when called as a class method, an error is raised.
133
+ #:
134
+ #: To reiterate, ``DRAFT202012.detect({})`` will return ``DRAFT202012``
135
+ #: whereas the class method ``Specification.detect({})`` will raise an
136
+ #: error.
137
+ #:
138
+ #: (Note that of course ``DRAFT202012.detect(...)`` may return some other
139
+ #: specification when given a schema which *does* identify as being for
140
+ #: another version).
141
+ #:
142
+ #: Raises:
143
+ #:
144
+ #: `CannotDetermineSpecification`
145
+ #:
146
+ #: if the given contents don't have any discernible
147
+ #: information which could be used to guess which
148
+ #: specification they identify as
149
+ detect = _SpecificationDetector()
150
+
151
+ def __repr__(self) -> str:
152
+ return f"<Specification name={self.name!r}>"
153
+
154
+ def anchors_in(self, contents: D):
155
+ """
156
+ Retrieve the anchors contained in the given document.
157
+ """
158
+ return self._anchors_in(self, contents)
159
+
160
+ def create_resource(self, contents: D) -> Resource[D]:
161
+ """
162
+ Create a resource which is interpreted using this specification.
163
+ """
164
+ return Resource(contents=contents, specification=self)
165
+
166
+
167
+ Specification.OPAQUE = Specification(
168
+ name="opaque",
169
+ id_of=lambda contents: None,
170
+ subresources_of=lambda contents: [],
171
+ anchors_in=lambda specification, contents: [],
172
+ maybe_in_subresource=lambda segments, resolver, subresource: resolver,
173
+ )
174
+
175
+
176
+ @frozen
177
+ class Resource(Generic[D]):
178
+ r"""
179
+ A document (deserialized JSON) with a concrete interpretation under a spec.
180
+
181
+ In other words, a Python object, along with an instance of `Specification`
182
+ which describes how the document interacts with referencing -- both
183
+ internally (how it refers to other `Resource`\ s) and externally (how it
184
+ should be identified such that it is referenceable by other documents).
185
+ """
186
+
187
+ contents: D
188
+ _specification: Specification[D] = field(alias="specification")
189
+
190
+ @classmethod
191
+ def from_contents(
192
+ cls,
193
+ contents: D,
194
+ default_specification: (
195
+ type[Specification[D]] | Specification[D]
196
+ ) = Specification,
197
+ ) -> Resource[D]:
198
+ """
199
+ Create a resource guessing which specification applies to the contents.
200
+
201
+ Raises:
202
+
203
+ `CannotDetermineSpecification`
204
+
205
+ if the given contents don't have any discernible
206
+ information which could be used to guess which
207
+ specification they identify as
208
+
209
+ """
210
+ specification = default_specification.detect(contents)
211
+ return specification.create_resource(contents=contents)
212
+
213
+ @classmethod
214
+ def opaque(cls, contents: D) -> Resource[D]:
215
+ """
216
+ Create an opaque `Resource` -- i.e. one with opaque specification.
217
+
218
+ See `Specification.OPAQUE` for details.
219
+ """
220
+ return Specification.OPAQUE.create_resource(contents=contents)
221
+
222
+ def id(self) -> URI | None:
223
+ """
224
+ Retrieve this resource's (specification-specific) identifier.
225
+ """
226
+ id = self._specification.id_of(self.contents)
227
+ if id is None:
228
+ return
229
+ return id.rstrip("#")
230
+
231
+ def subresources(self) -> Iterable[Resource[D]]:
232
+ """
233
+ Retrieve this resource's subresources.
234
+ """
235
+ return (
236
+ Resource.from_contents(
237
+ each,
238
+ default_specification=self._specification,
239
+ )
240
+ for each in self._specification.subresources_of(self.contents)
241
+ )
242
+
243
+ def anchors(self) -> Iterable[AnchorType[D]]:
244
+ """
245
+ Retrieve this resource's (specification-specific) identifier.
246
+ """
247
+ return self._specification.anchors_in(self.contents)
248
+
249
+ def pointer(self, pointer: str, resolver: Resolver[D]) -> Resolved[D]:
250
+ """
251
+ Resolve the given JSON pointer.
252
+
253
+ Raises:
254
+
255
+ `exceptions.PointerToNowhere`
256
+
257
+ if the pointer points to a location not present in the document
258
+
259
+ """
260
+ if not pointer:
261
+ return Resolved(contents=self.contents, resolver=resolver)
262
+
263
+ contents = self.contents
264
+ segments: list[int | str] = []
265
+ for segment in unquote(pointer[1:]).split("/"):
266
+ if isinstance(contents, Sequence):
267
+ segment = int(segment)
268
+ else:
269
+ segment = segment.replace("~1", "/").replace("~0", "~")
270
+ try:
271
+ contents = contents[segment] # type: ignore[reportUnknownArgumentType]
272
+ except LookupError as lookup_error:
273
+ error = exceptions.PointerToNowhere(ref=pointer, resource=self)
274
+ raise error from lookup_error
275
+
276
+ segments.append(segment)
277
+ last = resolver
278
+ resolver = self._specification.maybe_in_subresource(
279
+ segments=segments,
280
+ resolver=resolver,
281
+ subresource=self._specification.create_resource(contents),
282
+ )
283
+ if resolver is not last:
284
+ segments = []
285
+ return Resolved(contents=contents, resolver=resolver) # type: ignore[reportUnknownArgumentType]
286
+
287
+
288
+ def _fail_to_retrieve(uri: URI):
289
+ raise exceptions.NoSuchResource(ref=uri)
290
+
291
+
292
+ @frozen
293
+ class Registry(Mapping[URI, Resource[D]]):
294
+ r"""
295
+ A registry of `Resource`\ s, each identified by their canonical URIs.
296
+
297
+ Registries store a collection of in-memory resources, and optionally
298
+ enable additional resources which may be stored elsewhere (e.g. in a
299
+ database, a separate set of files, over the network, etc.).
300
+
301
+ They also lazily walk their known resources, looking for subresources
302
+ within them. In other words, subresources contained within any added
303
+ resources will be retrievable via their own IDs (though this discovery of
304
+ subresources will be delayed until necessary).
305
+
306
+ Registries are immutable, and their methods return new instances of the
307
+ registry with the additional resources added to them.
308
+
309
+ The ``retrieve`` argument can be used to configure retrieval of resources
310
+ dynamically, either over the network, from a database, or the like.
311
+ Pass it a callable which will be called if any URI not present in the
312
+ registry is accessed. It must either return a `Resource` or else raise a
313
+ `NoSuchResource` exception indicating that the resource does not exist
314
+ even according to the retrieval logic.
315
+ """
316
+
317
+ _resources: HashTrieMap[URI, Resource[D]] = field(
318
+ default=HashTrieMap(),
319
+ converter=HashTrieMap.convert, # type: ignore[reportGeneralTypeIssues]
320
+ alias="resources",
321
+ )
322
+ _anchors: HashTrieMap[tuple[URI, str], AnchorType[D]] = HashTrieMap()
323
+ _uncrawled: HashTrieSet[URI] = EMPTY_UNCRAWLED
324
+ _retrieve: Retrieve[D] = field(default=_fail_to_retrieve, alias="retrieve")
325
+
326
+ def __getitem__(self, uri: URI) -> Resource[D]:
327
+ """
328
+ Return the (already crawled) `Resource` identified by the given URI.
329
+ """
330
+ try:
331
+ return self._resources[uri.rstrip("#")]
332
+ except KeyError:
333
+ raise exceptions.NoSuchResource(ref=uri) from None
334
+
335
+ def __iter__(self) -> Iterator[URI]:
336
+ """
337
+ Iterate over all crawled URIs in the registry.
338
+ """
339
+ return iter(self._resources)
340
+
341
+ def __len__(self) -> int:
342
+ """
343
+ Count the total number of fully crawled resources in this registry.
344
+ """
345
+ return len(self._resources)
346
+
347
+ def __rmatmul__(
348
+ self,
349
+ new: Resource[D] | Iterable[Resource[D]],
350
+ ) -> Registry[D]:
351
+ """
352
+ Create a new registry with resource(s) added using their internal IDs.
353
+
354
+ Resources must have a internal IDs (e.g. the :kw:`$id` keyword in
355
+ modern JSON Schema versions), otherwise an error will be raised.
356
+
357
+ Both a single resource as well as an iterable of resources works, i.e.:
358
+
359
+ * ``resource @ registry`` or
360
+
361
+ * ``[iterable, of, multiple, resources] @ registry``
362
+
363
+ which -- again, assuming the resources have internal IDs -- is
364
+ equivalent to calling `Registry.with_resources` as such:
365
+
366
+ .. code:: python
367
+
368
+ registry.with_resources(
369
+ (resource.id(), resource) for resource in new_resources
370
+ )
371
+
372
+ Raises:
373
+
374
+ `NoInternalID`
375
+
376
+ if the resource(s) in fact do not have IDs
377
+
378
+ """
379
+ if isinstance(new, Resource):
380
+ new = (new,)
381
+
382
+ resources = self._resources
383
+ uncrawled = self._uncrawled
384
+ for resource in new:
385
+ id = resource.id()
386
+ if id is None:
387
+ raise exceptions.NoInternalID(resource=resource)
388
+ uncrawled = uncrawled.insert(id)
389
+ resources = resources.insert(id, resource)
390
+ return evolve(self, resources=resources, uncrawled=uncrawled)
391
+
392
+ def __repr__(self) -> str:
393
+ size = len(self)
394
+ pluralized = "resource" if size == 1 else "resources"
395
+ if self._uncrawled:
396
+ uncrawled = len(self._uncrawled)
397
+ if uncrawled == size:
398
+ summary = f"uncrawled {pluralized}"
399
+ else:
400
+ summary = f"{pluralized}, {uncrawled} uncrawled"
401
+ else:
402
+ summary = f"{pluralized}"
403
+         return f"&lt;Registry ({size} {summary})>"
+
+     def get_or_retrieve(self, uri: URI) -> Retrieved[D, Resource[D]]:
+         """
+         Get a resource from the registry, crawling or retrieving if necessary.
+
+         May involve crawling to find the given URI if it is not already known,
+         so the returned object is a `Retrieved` object which contains both the
+         resource value as well as the registry which ultimately contained it.
+         """
+         resource = self._resources.get(uri)
+         if resource is not None:
+             return Retrieved(registry=self, value=resource)
+
+         registry = self.crawl()
+         resource = registry._resources.get(uri)
+         if resource is not None:
+             return Retrieved(registry=registry, value=resource)
+
+         try:
+             resource = registry._retrieve(uri)
+         except (
+             exceptions.CannotDetermineSpecification,
+             exceptions.NoSuchResource,
+         ):
+             raise
+         except Exception as error:
+             raise exceptions.Unretrievable(ref=uri) from error
+         else:
+             registry = registry.with_resource(uri, resource)
+             return Retrieved(registry=registry, value=resource)
+
+     def remove(self, uri: URI):
+         """
+         Return a registry with the resource identified by a given URI removed.
+         """
+         if uri not in self._resources:
+             raise exceptions.NoSuchResource(ref=uri)
+
+         return evolve(
+             self,
+             resources=self._resources.remove(uri),
+             uncrawled=self._uncrawled.discard(uri),
+             anchors=HashTrieMap(
+                 (k, v) for k, v in self._anchors.items() if k[0] != uri
+             ),
+         )
+
+     def anchor(self, uri: URI, name: str):
+         """
+         Retrieve a given anchor from a resource which must already be crawled.
+         """
+         value = self._anchors.get((uri, name))
+         if value is not None:
+             return Retrieved(value=value, registry=self)
+
+         registry = self.crawl()
+         value = registry._anchors.get((uri, name))
+         if value is not None:
+             return Retrieved(value=value, registry=registry)
+
+         resource = self[uri]
+         canonical_uri = resource.id()
+         if canonical_uri is not None:
+             value = registry._anchors.get((canonical_uri, name))
+             if value is not None:
+                 return Retrieved(value=value, registry=registry)
+
+         if "/" in name:
+             raise exceptions.InvalidAnchor(
+                 ref=uri,
+                 resource=resource,
+                 anchor=name,
+             )
+         raise exceptions.NoSuchAnchor(ref=uri, resource=resource, anchor=name)
+
+     def contents(self, uri: URI) -> D:
+         """
+         Retrieve the (already crawled) contents identified by the given URI.
+         """
+         return self[uri].contents
+
+     def crawl(self) -> Registry[D]:
+         """
+         Crawl all added resources, discovering subresources.
+         """
+         resources = self._resources
+         anchors = self._anchors
+         uncrawled = [(uri, resources[uri]) for uri in self._uncrawled]
+         while uncrawled:
+             uri, resource = uncrawled.pop()
+
+             id = resource.id()
+             if id is not None:
+                 uri = urljoin(uri, id)
+                 resources = resources.insert(uri, resource)
+             for each in resource.anchors():
+                 anchors = anchors.insert((uri, each.name), each)
+             uncrawled.extend((uri, each) for each in resource.subresources())
+         return evolve(
+             self,
+             resources=resources,
+             anchors=anchors,
+             uncrawled=EMPTY_UNCRAWLED,
+         )
+
+     def with_resource(self, uri: URI, resource: Resource[D]):
+         """
+         Add the given `Resource` to the registry, without crawling it.
+         """
+         return self.with_resources([(uri, resource)])
+
+     def with_resources(
+         self,
+         pairs: Iterable[tuple[URI, Resource[D]]],
+     ) -> Registry[D]:
+         r"""
+         Add the given `Resource`\ s to the registry, without crawling them.
+         """
+         resources = self._resources
+         uncrawled = self._uncrawled
+         for uri, resource in pairs:
+             # Empty fragment URIs are equivalent to URIs without the fragment.
+             # TODO: Is this true for non JSON Schema resources? Probably not.
+             uri = uri.rstrip("#")
+             uncrawled = uncrawled.insert(uri)
+             resources = resources.insert(uri, resource)
+         return evolve(self, resources=resources, uncrawled=uncrawled)
+
+     def with_contents(
+         self,
+         pairs: Iterable[tuple[URI, D]],
+         **kwargs: Any,
+     ) -> Registry[D]:
+         r"""
+         Add the given contents to the registry, autodetecting when necessary.
+         """
+         return self.with_resources(
+             (uri, Resource.from_contents(each, **kwargs))
+             for uri, each in pairs
+         )
+
+     def combine(self, *registries: Registry[D]) -> Registry[D]:
+         """
+         Combine together one or more other registries, producing a unified one.
+         """
+         if registries == (self,):
+             return self
+         resources = self._resources
+         anchors = self._anchors
+         uncrawled = self._uncrawled
+         retrieve = self._retrieve
+         for registry in registries:
+             resources = resources.update(registry._resources)
+             anchors = anchors.update(registry._anchors)
+             uncrawled = uncrawled.update(registry._uncrawled)
+
+             if registry._retrieve is not _fail_to_retrieve:
+                 if registry._retrieve is not retrieve is not _fail_to_retrieve:
+                     raise ValueError(  # noqa: TRY003
+                         "Cannot combine registries with conflicting retrieval "
+                         "functions.",
+                     )
+                 retrieve = registry._retrieve
+         return evolve(
+             self,
+             anchors=anchors,
+             resources=resources,
+             uncrawled=uncrawled,
+             retrieve=retrieve,
+         )
+
+     def resolver(self, base_uri: URI = "") -> Resolver[D]:
+         """
+         Return a `Resolver` which resolves references against this registry.
+         """
+         return Resolver(base_uri=base_uri, registry=self)
+
+     def resolver_with_root(self, resource: Resource[D]) -> Resolver[D]:
+         """
+         Return a `Resolver` with a specific root resource.
+         """
+         uri = resource.id() or ""
+         return Resolver(
+             base_uri=uri,
+             registry=self.with_resource(uri, resource),
+         )
+
+
+ #: An anchor or resource.
+ AnchorOrResource = TypeVar("AnchorOrResource", AnchorType[Any], Resource[Any])
+
+
+ @frozen
+ class Retrieved(Generic[D, AnchorOrResource]):
+     """
+     A value retrieved from a `Registry`.
+     """
+
+     value: AnchorOrResource
+     registry: Registry[D]
+
+
+ @frozen
+ class Resolved(Generic[D]):
+     """
+     A reference resolved to its contents by a `Resolver`.
+     """
+
+     contents: D
+     resolver: Resolver[D]
+
+
+ @frozen
+ class Resolver(Generic[D]):
+     """
+     A reference resolver.
+
+     Resolvers help resolve references (including relative ones) by
+     pairing a fixed base URI with a `Registry`.
+
+     This object, under normal circumstances, is expected to be used by
+     *implementers of libraries* built on top of `referencing` (e.g. JSON Schema
+     implementations or other libraries resolving JSON references),
+     not directly by end-users populating registries or while writing
+     schemas or other resources.
+
+     References are resolved against the base URI, and the combined URI
+     is then looked up within the registry.
+
+     The process of resolving a reference may itself involve calculating
+     a *new* base URI for future reference resolution (e.g. if an
+     intermediate resource sets a new base URI), or may involve encountering
+     additional subresources and adding them to a new registry.
+     """
+
+     _base_uri: URI = field(alias="base_uri")
+     _registry: Registry[D] = field(alias="registry")
+     _previous: List[URI] = field(default=List(), repr=False, alias="previous")
+
+     def lookup(self, ref: URI) -> Resolved[D]:
+         """
+         Resolve the given reference to the resource it points to.
+
+         Raises:
+
+             `exceptions.Unresolvable`
+
+                 or a subclass thereof (see below) if the reference isn't
+                 resolvable
+
+             `exceptions.NoSuchAnchor`
+
+                 if the reference is to a URI where a resource exists but
+                 contains a plain name fragment which does not exist within
+                 the resource
+
+             `exceptions.PointerToNowhere`
+
+                 if the reference is to a URI where a resource exists but
+                 contains a JSON pointer to a location within the resource
+                 that does not exist
+
+         """
+         if ref.startswith("#"):
+             uri, fragment = self._base_uri, ref[1:]
+         else:
+             uri, fragment = urldefrag(urljoin(self._base_uri, ref))
+         try:
+             retrieved = self._registry.get_or_retrieve(uri)
+         except exceptions.NoSuchResource:
+             raise exceptions.Unresolvable(ref=ref) from None
+         except exceptions.Unretrievable as error:
+             raise exceptions.Unresolvable(ref=ref) from error
+
+         if fragment.startswith("/"):
+             resolver = self._evolve(registry=retrieved.registry, base_uri=uri)
+             return retrieved.value.pointer(pointer=fragment, resolver=resolver)
+
+         if fragment:
+             retrieved = retrieved.registry.anchor(uri, fragment)
+             resolver = self._evolve(registry=retrieved.registry, base_uri=uri)
+             return retrieved.value.resolve(resolver=resolver)
+
+         resolver = self._evolve(registry=retrieved.registry, base_uri=uri)
+         return Resolved(contents=retrieved.value.contents, resolver=resolver)
+
+     def in_subresource(self, subresource: Resource[D]) -> Resolver[D]:
+         """
+         Create a resolver for a subresource (which may have a new base URI).
+         """
+         id = subresource.id()
+         if id is None:
+             return self
+         return evolve(self, base_uri=urljoin(self._base_uri, id))
+
+     def dynamic_scope(self) -> Iterable[tuple[URI, Registry[D]]]:
+         """
+         In specs with such a notion, return the URIs in the dynamic scope.
+         """
+         for uri in self._previous:
+             yield uri, self._registry
+
+     def _evolve(self, base_uri: URI, **kwargs: Any):
+         """
+         Evolve, appending to the dynamic scope.
+         """
+         previous = self._previous
+         if self._base_uri and (not previous or base_uri != self._base_uri):
+             previous = previous.push_front(self._base_uri)
+         return evolve(self, base_uri=base_uri, previous=previous, **kwargs)
+
+
+ @frozen
+ class Anchor(Generic[D]):
+     """
+     A simple anchor in a `Resource`.
+     """
+
+     name: str
+     resource: Resource[D]
+
+     def resolve(self, resolver: Resolver[D]):
+         """
+         Return the resource for this anchor.
+         """
+         return Resolved(contents=self.resource.contents, resolver=resolver)
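A note on the URI arithmetic in `Resolver.lookup` above: a reference is first joined onto the base URI, then split into the document URI and its fragment (a JSON pointer or a plain-name anchor), with bare `#...` references keeping the current base. A minimal standard-library sketch of just that splitting step (not part of the uploaded files; the example URIs are hypothetical):

```python
from urllib.parse import urldefrag, urljoin

base_uri = "http://example.com/schemas/root.json"  # hypothetical base URI

# A relative reference with a JSON-pointer fragment is joined, then defragged:
uri, fragment = urldefrag(urljoin(base_uri, "other.json#/properties/name"))
print(uri)       # the document URI: http://example.com/schemas/other.json
print(fragment)  # the JSON pointer: /properties/name

# A bare "#..." reference keeps the current base URI, as in lookup():
ref = "#foo"
if ref.startswith("#"):
    uri, fragment = base_uri, ref[1:]
print(uri, fragment)
```

The library then dispatches on the fragment: pointers start with `/`, anything else non-empty is treated as an anchor name.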
valley/lib/python3.10/site-packages/referencing/exceptions.py ADDED
@@ -0,0 +1,165 @@
+ """
+ Errors, oh no!
+ """
+
+ from __future__ import annotations
+
+ from typing import TYPE_CHECKING, Any
+
+ import attrs
+
+ from referencing._attrs import frozen
+
+ if TYPE_CHECKING:
+     from referencing import Resource
+     from referencing.typing import URI
+
+
+ @frozen
+ class NoSuchResource(KeyError):
+     """
+     The given URI is not present in a registry.
+
+     Unlike most exceptions, this class *is* intended to be publicly
+     instantiable and *is* part of the public API of the package.
+     """
+
+     ref: URI
+
+     def __eq__(self, other: object) -> bool:
+         if self.__class__ is not other.__class__:
+             return NotImplemented
+         return attrs.astuple(self) == attrs.astuple(other)
+
+     def __hash__(self) -> int:
+         return hash(attrs.astuple(self))
+
+
+ @frozen
+ class NoInternalID(Exception):
+     """
+     A resource has no internal ID, but one is needed.
+
+     E.g. in modern JSON Schema drafts, this is the :kw:`$id` keyword.
+
+     One might be needed if a resource was to-be added to a registry but no
+     other URI is available, and the resource doesn't declare its canonical URI.
+     """
+
+     resource: Resource[Any]
+
+     def __eq__(self, other: object) -> bool:
+         if self.__class__ is not other.__class__:
+             return NotImplemented
+         return attrs.astuple(self) == attrs.astuple(other)
+
+     def __hash__(self) -> int:
+         return hash(attrs.astuple(self))
+
+
+ @frozen
+ class Unretrievable(KeyError):
+     """
+     The given URI is not present in a registry, and retrieving it failed.
+     """
+
+     ref: URI
+
+     def __eq__(self, other: object) -> bool:
+         if self.__class__ is not other.__class__:
+             return NotImplemented
+         return attrs.astuple(self) == attrs.astuple(other)
+
+     def __hash__(self) -> int:
+         return hash(attrs.astuple(self))
+
+
+ @frozen
+ class CannotDetermineSpecification(Exception):
+     """
+     Attempting to detect the appropriate `Specification` failed.
+
+     This happens if no discernible information is found in the contents of the
+     new resource which would help identify it.
+     """
+
+     contents: Any
+
+     def __eq__(self, other: object) -> bool:
+         if self.__class__ is not other.__class__:
+             return NotImplemented
+         return attrs.astuple(self) == attrs.astuple(other)
+
+     def __hash__(self) -> int:
+         return hash(attrs.astuple(self))
+
+
+ @attrs.frozen  # Because here we allow subclassing below.
+ class Unresolvable(Exception):
+     """
+     A reference was unresolvable.
+     """
+
+     ref: URI
+
+     def __eq__(self, other: object) -> bool:
+         if self.__class__ is not other.__class__:
+             return NotImplemented
+         return attrs.astuple(self) == attrs.astuple(other)
+
+     def __hash__(self) -> int:
+         return hash(attrs.astuple(self))
+
+
+ @frozen
+ class PointerToNowhere(Unresolvable):
+     """
+     A JSON Pointer leads to a part of a document that does not exist.
+     """
+
+     resource: Resource[Any]
+
+     def __str__(self) -> str:
+         msg = f"{self.ref!r} does not exist within {self.resource.contents!r}"
+         if self.ref == "/":
+             msg += (
+                 ". The pointer '/' is a valid JSON Pointer but it points to "
+                 "an empty string property ''. If you intended to point "
+                 "to the entire resource, you should use '#'."
+             )
+         return msg
+
+
+ @frozen
+ class NoSuchAnchor(Unresolvable):
+     """
+     An anchor does not exist within a particular resource.
+     """
+
+     resource: Resource[Any]
+     anchor: str
+
+     def __str__(self) -> str:
+         return (
+             f"{self.anchor!r} does not exist within {self.resource.contents!r}"
+         )
+
+
+ @frozen
+ class InvalidAnchor(Unresolvable):
+     """
+     An anchor which could never exist in a resource was dereferenced.
+
+     It is somehow syntactically invalid.
+     """
+
+     resource: Resource[Any]
+     anchor: str
+
+     def __str__(self) -> str:
+         return (
+             f"'#{self.anchor}' is not a valid anchor, neither as a "
+             "plain name anchor nor as a JSON Pointer. You may have intended "
+             f"to use '#/{self.anchor}', as the slash is required *before each "
+             "segment* of a JSON pointer."
+         )
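The exceptions above all hand-write `__eq__`/`__hash__` over `attrs.astuple` because Python exceptions otherwise compare by identity, not by field values. A standard-library sketch of the same pattern (the `NoSuchThing` class is a hypothetical analogue, not part of the package):

```python
# Plain exceptions compare by identity, so two "equal" errors are not ==:
assert Exception("x") != Exception("x")

class NoSuchThing(Exception):
    """Hypothetical analogue of NoSuchResource that compares by value."""

    def __init__(self, ref: str):
        super().__init__(ref)
        self.ref = ref

    def __eq__(self, other: object) -> bool:
        # Mirror the package's pattern: only same-class instances compare.
        if self.__class__ is not other.__class__:
            return NotImplemented
        return self.ref == other.ref

    def __hash__(self) -> int:
        return hash((self.__class__, self.ref))

assert NoSuchThing("urn:x") == NoSuchThing("urn:x")
assert NoSuchThing("urn:x") != NoSuchThing("urn:y")
```

Value equality is what lets callers (and tests) compare caught errors against expected ones.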
valley/lib/python3.10/site-packages/referencing/jsonschema.py ADDED
@@ -0,0 +1,642 @@
+ """
+ Referencing implementations for JSON Schema specs (historic & current).
+ """
+
+ from __future__ import annotations
+
+ from collections.abc import Sequence, Set
+ from typing import Any, Iterable, Union
+
+ from referencing import Anchor, Registry, Resource, Specification, exceptions
+ from referencing._attrs import frozen
+ from referencing._core import (
+     _UNSET,  # type: ignore[reportPrivateUsage]
+     Resolved as _Resolved,
+     Resolver as _Resolver,
+     _Unset,  # type: ignore[reportPrivateUsage]
+ )
+ from referencing.typing import URI, Anchor as AnchorType, Mapping
+
+ #: A JSON Schema which is a JSON object
+ ObjectSchema = Mapping[str, Any]
+
+ #: A JSON Schema of any kind
+ Schema = Union[bool, ObjectSchema]
+
+ #: A Resource whose contents are JSON Schemas
+ SchemaResource = Resource[Schema]
+
+ #: A JSON Schema Registry
+ SchemaRegistry = Registry[Schema]
+
+ #: The empty JSON Schema Registry
+ EMPTY_REGISTRY: SchemaRegistry = Registry()
+
+
+ @frozen
+ class UnknownDialect(Exception):
+     """
+     A dialect identifier was found for a dialect unknown by this library.
+
+     If it's a custom ("unofficial") dialect, be sure you've registered it.
+     """
+
+     uri: URI
+
+
+ def _dollar_id(contents: Schema) -> URI | None:
+     if isinstance(contents, bool):
+         return
+     return contents.get("$id")
+
+
+ def _legacy_dollar_id(contents: Schema) -> URI | None:
+     if isinstance(contents, bool) or "$ref" in contents:
+         return
+     id = contents.get("$id")
+     if id is not None and not id.startswith("#"):
+         return id
+
+
+ def _legacy_id(contents: ObjectSchema) -> URI | None:
+     if "$ref" in contents:
+         return
+     id = contents.get("id")
+     if id is not None and not id.startswith("#"):
+         return id
+
+
+ def _anchor(
+     specification: Specification[Schema],
+     contents: Schema,
+ ) -> Iterable[AnchorType[Schema]]:
+     if isinstance(contents, bool):
+         return
+     anchor = contents.get("$anchor")
+     if anchor is not None:
+         yield Anchor(
+             name=anchor,
+             resource=specification.create_resource(contents),
+         )
+
+     dynamic_anchor = contents.get("$dynamicAnchor")
+     if dynamic_anchor is not None:
+         yield DynamicAnchor(
+             name=dynamic_anchor,
+             resource=specification.create_resource(contents),
+         )
+
+
+ def _anchor_2019(
+     specification: Specification[Schema],
+     contents: Schema,
+ ) -> Iterable[Anchor[Schema]]:
+     if isinstance(contents, bool):
+         return []
+     anchor = contents.get("$anchor")
+     if anchor is None:
+         return []
+     return [
+         Anchor(
+             name=anchor,
+             resource=specification.create_resource(contents),
+         ),
+     ]
+
+
+ def _legacy_anchor_in_dollar_id(
+     specification: Specification[Schema],
+     contents: Schema,
+ ) -> Iterable[Anchor[Schema]]:
+     if isinstance(contents, bool):
+         return []
+     id = contents.get("$id", "")
+     if not id.startswith("#"):
+         return []
+     return [
+         Anchor(
+             name=id[1:],
+             resource=specification.create_resource(contents),
+         ),
+     ]
+
+
+ def _legacy_anchor_in_id(
+     specification: Specification[ObjectSchema],
+     contents: ObjectSchema,
+ ) -> Iterable[Anchor[ObjectSchema]]:
+     id = contents.get("id", "")
+     if not id.startswith("#"):
+         return []
+     return [
+         Anchor(
+             name=id[1:],
+             resource=specification.create_resource(contents),
+         ),
+     ]
+
+
+ def _subresources_of(
+     in_value: Set[str] = frozenset(),
+     in_subvalues: Set[str] = frozenset(),
+     in_subarray: Set[str] = frozenset(),
+ ):
+     """
+     Create a callable returning JSON Schema specification-style subschemas.
+
+     Relies on specifying the set of keywords containing subschemas in their
+     values, in a subobject's values, or in a subarray.
+     """
+
+     def subresources_of(contents: Schema) -> Iterable[ObjectSchema]:
+         if isinstance(contents, bool):
+             return
+         for each in in_value:
+             if each in contents:
+                 yield contents[each]
+         for each in in_subarray:
+             if each in contents:
+                 yield from contents[each]
+         for each in in_subvalues:
+             if each in contents:
+                 yield from contents[each].values()
+
+     return subresources_of
+
+
+ def _subresources_of_with_crazy_items(
+     in_value: Set[str] = frozenset(),
+     in_subvalues: Set[str] = frozenset(),
+     in_subarray: Set[str] = frozenset(),
+ ):
+     """
+     Specifically handle older drafts where there are some funky keywords.
+     """
+
+     def subresources_of(contents: Schema) -> Iterable[ObjectSchema]:
+         if isinstance(contents, bool):
+             return
+         for each in in_value:
+             if each in contents:
+                 yield contents[each]
+         for each in in_subarray:
+             if each in contents:
+                 yield from contents[each]
+         for each in in_subvalues:
+             if each in contents:
+                 yield from contents[each].values()
+
+         items = contents.get("items")
+         if items is not None:
+             if isinstance(items, Sequence):
+                 yield from items
+             else:
+                 yield items
+
+     return subresources_of
+
+
+ def _subresources_of_with_crazy_items_dependencies(
+     in_value: Set[str] = frozenset(),
+     in_subvalues: Set[str] = frozenset(),
+     in_subarray: Set[str] = frozenset(),
+ ):
+     """
+     Specifically handle older drafts where there are some funky keywords.
+     """
+
+     def subresources_of(contents: Schema) -> Iterable[ObjectSchema]:
+         if isinstance(contents, bool):
+             return
+         for each in in_value:
+             if each in contents:
+                 yield contents[each]
+         for each in in_subarray:
+             if each in contents:
+                 yield from contents[each]
+         for each in in_subvalues:
+             if each in contents:
+                 yield from contents[each].values()
+
+         items = contents.get("items")
+         if items is not None:
+             if isinstance(items, Sequence):
+                 yield from items
+             else:
+                 yield items
+         dependencies = contents.get("dependencies")
+         if dependencies is not None:
+             values = iter(dependencies.values())
+             value = next(values, None)
+             if isinstance(value, Mapping):
+                 yield value
+                 yield from values
+
+     return subresources_of
+
+
+ def _subresources_of_with_crazy_aP_items_dependencies(
+     in_value: Set[str] = frozenset(),
+     in_subvalues: Set[str] = frozenset(),
+     in_subarray: Set[str] = frozenset(),
+ ):
+     """
+     Specifically handle even older drafts where there are some funky keywords.
+     """
+
+     def subresources_of(contents: ObjectSchema) -> Iterable[ObjectSchema]:
+         for each in in_value:
+             if each in contents:
+                 yield contents[each]
+         for each in in_subarray:
+             if each in contents:
+                 yield from contents[each]
+         for each in in_subvalues:
+             if each in contents:
+                 yield from contents[each].values()
+
+         items = contents.get("items")
+         if items is not None:
+             if isinstance(items, Sequence):
+                 yield from items
+             else:
+                 yield items
+         dependencies = contents.get("dependencies")
+         if dependencies is not None:
+             values = iter(dependencies.values())
+             value = next(values, None)
+             if isinstance(value, Mapping):
+                 yield value
+                 yield from values
+
+         for each in "additionalItems", "additionalProperties":
+             value = contents.get(each)
+             if isinstance(value, Mapping):
+                 yield value
+
+     return subresources_of
+
+
+ def _maybe_in_subresource(
+     in_value: Set[str] = frozenset(),
+     in_subvalues: Set[str] = frozenset(),
+     in_subarray: Set[str] = frozenset(),
+ ):
+     in_child = in_subvalues | in_subarray
+
+     def maybe_in_subresource(
+         segments: Sequence[int | str],
+         resolver: _Resolver[Any],
+         subresource: Resource[Any],
+     ) -> _Resolver[Any]:
+         _segments = iter(segments)
+         for segment in _segments:
+             if segment not in in_value and (
+                 segment not in in_child or next(_segments, None) is None
+             ):
+                 return resolver
+         return resolver.in_subresource(subresource)
+
+     return maybe_in_subresource
+
+
+ def _maybe_in_subresource_crazy_items(
+     in_value: Set[str] = frozenset(),
+     in_subvalues: Set[str] = frozenset(),
+     in_subarray: Set[str] = frozenset(),
+ ):
+     in_child = in_subvalues | in_subarray
+
+     def maybe_in_subresource(
+         segments: Sequence[int | str],
+         resolver: _Resolver[Any],
+         subresource: Resource[Any],
+     ) -> _Resolver[Any]:
+         _segments = iter(segments)
+         for segment in _segments:
+             if segment == "items" and isinstance(
+                 subresource.contents,
+                 Mapping,
+             ):
+                 return resolver.in_subresource(subresource)
+             if segment not in in_value and (
+                 segment not in in_child or next(_segments, None) is None
+             ):
+                 return resolver
+         return resolver.in_subresource(subresource)
+
+     return maybe_in_subresource
+
+
+ def _maybe_in_subresource_crazy_items_dependencies(
+     in_value: Set[str] = frozenset(),
+     in_subvalues: Set[str] = frozenset(),
+     in_subarray: Set[str] = frozenset(),
+ ):
+     in_child = in_subvalues | in_subarray
+
+     def maybe_in_subresource(
+         segments: Sequence[int | str],
+         resolver: _Resolver[Any],
+         subresource: Resource[Any],
+     ) -> _Resolver[Any]:
+         _segments = iter(segments)
+         for segment in _segments:
+             if segment in {"items", "dependencies"} and isinstance(
+                 subresource.contents,
+                 Mapping,
+             ):
+                 return resolver.in_subresource(subresource)
+             if segment not in in_value and (
+                 segment not in in_child or next(_segments, None) is None
+             ):
+                 return resolver
+         return resolver.in_subresource(subresource)
+
+     return maybe_in_subresource
+
+
+ #: JSON Schema draft 2020-12
+ DRAFT202012 = Specification(
+     name="draft2020-12",
+     id_of=_dollar_id,
+     subresources_of=_subresources_of(
+         in_value={
+             "additionalProperties",
+             "contains",
+             "contentSchema",
+             "else",
+             "if",
+             "items",
+             "not",
+             "propertyNames",
+             "then",
+             "unevaluatedItems",
+             "unevaluatedProperties",
+         },
+         in_subarray={"allOf", "anyOf", "oneOf", "prefixItems"},
+         in_subvalues={
+             "$defs",
+             "definitions",
+             "dependentSchemas",
+             "patternProperties",
+             "properties",
+         },
+     ),
+     anchors_in=_anchor,
+     maybe_in_subresource=_maybe_in_subresource(
+         in_value={
+             "additionalProperties",
+             "contains",
+             "contentSchema",
+             "else",
+             "if",
+             "items",
+             "not",
+             "propertyNames",
+             "then",
+             "unevaluatedItems",
+             "unevaluatedProperties",
+         },
+         in_subarray={"allOf", "anyOf", "oneOf", "prefixItems"},
+         in_subvalues={
+             "$defs",
+             "definitions",
+             "dependentSchemas",
+             "patternProperties",
+             "properties",
+         },
+     ),
+ )
+ #: JSON Schema draft 2019-09
+ DRAFT201909 = Specification(
+     name="draft2019-09",
+     id_of=_dollar_id,
+     subresources_of=_subresources_of_with_crazy_items(
+         in_value={
+             "additionalItems",
+             "additionalProperties",
+             "contains",
+             "contentSchema",
+             "else",
+             "if",
+             "not",
+             "propertyNames",
+             "then",
+             "unevaluatedItems",
+             "unevaluatedProperties",
+         },
+         in_subarray={"allOf", "anyOf", "oneOf"},
+         in_subvalues={
+             "$defs",
+             "definitions",
+             "dependentSchemas",
+             "patternProperties",
+             "properties",
+         },
+     ),
+     anchors_in=_anchor_2019,  # type: ignore[reportGeneralTypeIssues]  # TODO: check whether this is real
+     maybe_in_subresource=_maybe_in_subresource_crazy_items(
+         in_value={
+             "additionalItems",
+             "additionalProperties",
+             "contains",
+             "contentSchema",
+             "else",
+             "if",
+             "not",
+             "propertyNames",
+             "then",
+             "unevaluatedItems",
+             "unevaluatedProperties",
+         },
+         in_subarray={"allOf", "anyOf", "oneOf"},
+         in_subvalues={
+             "$defs",
+             "definitions",
+             "dependentSchemas",
+             "patternProperties",
+             "properties",
+         },
+     ),
+ )
+ #: JSON Schema draft 7
+ DRAFT7 = Specification(
+     name="draft-07",
+     id_of=_legacy_dollar_id,
+     subresources_of=_subresources_of_with_crazy_items_dependencies(
+         in_value={
+             "additionalItems",
+             "additionalProperties",
+             "contains",
+             "else",
+             "if",
+             "not",
+             "propertyNames",
+             "then",
+         },
+         in_subarray={"allOf", "anyOf", "oneOf"},
+         in_subvalues={"definitions", "patternProperties", "properties"},
+     ),
+     anchors_in=_legacy_anchor_in_dollar_id,  # type: ignore[reportGeneralTypeIssues]  # TODO: check whether this is real
+     maybe_in_subresource=_maybe_in_subresource_crazy_items_dependencies(
+         in_value={
+             "additionalItems",
+             "additionalProperties",
+             "contains",
+             "else",
+             "if",
+             "not",
+             "propertyNames",
+             "then",
+         },
+         in_subarray={"allOf", "anyOf", "oneOf"},
+         in_subvalues={"definitions", "patternProperties", "properties"},
+     ),
+ )
+ #: JSON Schema draft 6
+ DRAFT6 = Specification(
+     name="draft-06",
+     id_of=_legacy_dollar_id,
+     subresources_of=_subresources_of_with_crazy_items_dependencies(
+         in_value={
+             "additionalItems",
+             "additionalProperties",
+             "contains",
+             "not",
+             "propertyNames",
+         },
+         in_subarray={"allOf", "anyOf", "oneOf"},
+         in_subvalues={"definitions", "patternProperties", "properties"},
+     ),
+     anchors_in=_legacy_anchor_in_dollar_id,  # type: ignore[reportGeneralTypeIssues]  # TODO: check whether this is real
+     maybe_in_subresource=_maybe_in_subresource_crazy_items_dependencies(
+         in_value={
+             "additionalItems",
+             "additionalProperties",
+             "contains",
+             "not",
+             "propertyNames",
+         },
+         in_subarray={"allOf", "anyOf", "oneOf"},
+         in_subvalues={"definitions", "patternProperties", "properties"},
+     ),
+ )
+ #: JSON Schema draft 4
+ DRAFT4 = Specification(
+     name="draft-04",
+     id_of=_legacy_id,
+     subresources_of=_subresources_of_with_crazy_aP_items_dependencies(
+         in_value={"not"},
+         in_subarray={"allOf", "anyOf", "oneOf"},
+         in_subvalues={"definitions", "patternProperties", "properties"},
+     ),
+     anchors_in=_legacy_anchor_in_id,
+     maybe_in_subresource=_maybe_in_subresource_crazy_items_dependencies(
+         in_value={"additionalItems", "additionalProperties", "not"},
+         in_subarray={"allOf", "anyOf", "oneOf"},
+         in_subvalues={"definitions", "patternProperties", "properties"},
+     ),
+ )
+ #: JSON Schema draft 3
+ DRAFT3 = Specification(
+     name="draft-03",
+     id_of=_legacy_id,
+     subresources_of=_subresources_of_with_crazy_aP_items_dependencies(
+         in_subarray={"extends"},
+         in_subvalues={"definitions", "patternProperties", "properties"},
+     ),
+     anchors_in=_legacy_anchor_in_id,
+     maybe_in_subresource=_maybe_in_subresource_crazy_items_dependencies(
+         in_value={"additionalItems", "additionalProperties"},
+         in_subarray={"extends"},
+         in_subvalues={"definitions", "patternProperties", "properties"},
+     ),
+ )
+
+
+ _SPECIFICATIONS: Registry[Specification[Schema]] = Registry(
+     {  # type: ignore[reportGeneralTypeIssues]  # :/ internal vs external types
+         dialect_id: Resource.opaque(specification)
+         for dialect_id, specification in [
+             ("https://json-schema.org/draft/2020-12/schema", DRAFT202012),
+             ("https://json-schema.org/draft/2019-09/schema", DRAFT201909),
+             ("http://json-schema.org/draft-07/schema", DRAFT7),
+             ("http://json-schema.org/draft-06/schema", DRAFT6),
+             ("http://json-schema.org/draft-04/schema", DRAFT4),
+             ("http://json-schema.org/draft-03/schema", DRAFT3),
+         ]
+     },
+ )
+
+
+ def specification_with(
+     dialect_id: URI,
+     default: Specification[Any] | _Unset = _UNSET,
+ ) -> Specification[Any]:
+     """
+     Retrieve the `Specification` with the given dialect identifier.
+
+     Raises:
+
+         `UnknownDialect`
+
+             if the given ``dialect_id`` isn't known
+
+     """
+     resource = _SPECIFICATIONS.get(dialect_id.rstrip("#"))
+     if resource is not None:
+         return resource.contents
+     if default is _UNSET:
+         raise UnknownDialect(dialect_id)
+     return default
+
+
+ @frozen
+ class DynamicAnchor:
+     """
+     Dynamic anchors, introduced in draft 2020.
+     """
+
+     name: str
+     resource: SchemaResource
+
+     def resolve(self, resolver: _Resolver[Schema]) -> _Resolved[Schema]:
+         """
+         Resolve this anchor dynamically.
+         """
+         last = self.resource
+         for uri, registry in resolver.dynamic_scope():
+             try:
+                 anchor = registry.anchor(uri, self.name).value
+             except exceptions.NoSuchAnchor:
+                 continue
+             if isinstance(anchor, DynamicAnchor):
+                 last = anchor.resource
+         return _Resolved(
+             contents=last.contents,
+             resolver=resolver.in_subresource(last),
+         )
+
+
+ def lookup_recursive_ref(resolver: _Resolver[Schema]) -> _Resolved[Schema]:
+     """
+     Recursive references (via recursive anchors), present only in draft 2019.
+
+     As per the 2019 specification (§ 8.2.4.2.1), only the ``#`` recursive
+     reference is supported (and is therefore assumed to be the relevant
+     reference).
+     """
+     resolved = resolver.lookup("#")
+     if isinstance(resolved.contents, Mapping) and resolved.contents.get(
+         "$recursiveAnchor",
+     ):
+         for uri, _ in resolver.dynamic_scope():
+             next_resolved = resolver.lookup(uri)
+             if not isinstance(
+                 next_resolved.contents,
+                 Mapping,
+             ) or not next_resolved.contents.get("$recursiveAnchor"):
+                 break
+             resolved = next_resolved
+     return resolved
valley/lib/python3.10/site-packages/referencing/py.typed ADDED
File without changes
valley/lib/python3.10/site-packages/referencing/retrieval.py ADDED
@@ -0,0 +1,87 @@
+"""
+Helpers related to (dynamic) resource retrieval.
+"""
+
+from __future__ import annotations
+
+from functools import lru_cache
+from typing import TYPE_CHECKING, Callable, TypeVar
+import json
+
+from referencing import Resource
+
+if TYPE_CHECKING:
+    from referencing.typing import URI, D, Retrieve
+
+#: A serialized document (e.g. a JSON string)
+_T = TypeVar("_T")
+
+
+def to_cached_resource(
+    cache: Callable[[Retrieve[D]], Retrieve[D]] | None = None,
+    loads: Callable[[_T], D] = json.loads,
+    from_contents: Callable[[D], Resource[D]] = Resource.from_contents,
+) -> Callable[[Callable[[URI], _T]], Retrieve[D]]:
+    """
+    Create a retriever which caches its return values from a simpler callable.
+
+    Takes a function which returns things like serialized JSON (strings) and
+    returns something suitable for passing to `Registry` as a retrieve
+    function.
+
+    This decorator both reduces a small bit of boilerplate for a common case
+    (deserializing JSON from strings and creating `Resource` objects from the
+    result) as well as makes the probable need for caching a bit easier.
+    Retrievers which otherwise do expensive operations (like hitting the
+    network) might otherwise be called repeatedly.
+
+    Examples
+    --------
+
+    .. testcode::
+
+        from referencing import Registry
+        from referencing.typing import URI
+        import referencing.retrieval
+
+
+        @referencing.retrieval.to_cached_resource()
+        def retrieve(uri: URI):
+            print(f"Retrieved {uri}")
+
+            # Normally, go get some expensive JSON from the network, a file ...
+            return '''
+                {
+                    "$schema": "https://json-schema.org/draft/2020-12/schema",
+                    "foo": "bar"
+                }
+            '''
+
+        one = Registry(retrieve=retrieve).get_or_retrieve("urn:example:foo")
+        print(one.value.contents["foo"])
+
+        # Retrieving the same URI again reuses the same value (and thus doesn't
+        # print another retrieval message here)
+        two = Registry(retrieve=retrieve).get_or_retrieve("urn:example:foo")
+        print(two.value.contents["foo"])
+
+    .. testoutput::
+
+        Retrieved urn:example:foo
+        bar
+        bar
+
+    """
+    if cache is None:
+        cache = lru_cache(maxsize=None)
+
+    def decorator(retrieve: Callable[[URI], _T]):
+        @cache
+        def cached_retrieve(uri: URI):
+            response = retrieve(uri)
+            contents = loads(response)
+            return from_contents(contents)
+
+        return cached_retrieve
+
+    return decorator
valley/lib/python3.10/site-packages/referencing/tests/__init__.py ADDED
File without changes
valley/lib/python3.10/site-packages/referencing/tests/__pycache__/test_core.cpython-310.pyc ADDED
Binary file (36.7 kB). View file
 
valley/lib/python3.10/site-packages/referencing/tests/__pycache__/test_jsonschema.cpython-310.pyc ADDED
Binary file (6.72 kB). View file
 
valley/lib/python3.10/site-packages/referencing/tests/test_core.py ADDED
@@ -0,0 +1,1057 @@
+from rpds import HashTrieMap
+import pytest
+
+from referencing import Anchor, Registry, Resource, Specification, exceptions
+from referencing.jsonschema import DRAFT202012
+
+ID_AND_CHILDREN = Specification(
+    name="id-and-children",
+    id_of=lambda contents: contents.get("ID"),
+    subresources_of=lambda contents: contents.get("children", []),
+    anchors_in=lambda specification, contents: [
+        Anchor(
+            name=name,
+            resource=specification.create_resource(contents=each),
+        )
+        for name, each in contents.get("anchors", {}).items()
+    ],
+    maybe_in_subresource=lambda segments, resolver, subresource: (
+        resolver.in_subresource(subresource)
+        if not len(segments) % 2
+        and all(each == "children" for each in segments[::2])
+        else resolver
+    ),
+)
+
+
+def blow_up(uri):  # pragma: no cover
+    """
+    A retriever suitable for use in tests which expect it never to be used.
+    """
+    raise RuntimeError("This retrieve function expects to never be called!")
+
+
+class TestRegistry:
+    def test_with_resource(self):
+        """
+        Adding a resource to the registry then allows re-retrieving it.
+        """
+
+        resource = Resource.opaque(contents={"foo": "bar"})
+        uri = "urn:example"
+        registry = Registry().with_resource(uri=uri, resource=resource)
+        assert registry[uri] is resource
+
+    def test_with_resources(self):
+        """
+        Adding multiple resources to the registry is like adding each one.
+        """
+
+        one = Resource.opaque(contents={})
+        two = Resource(contents={"foo": "bar"}, specification=ID_AND_CHILDREN)
+        registry = Registry().with_resources(
+            [
+                ("http://example.com/1", one),
+                ("http://example.com/foo/bar", two),
+            ],
+        )
+        assert registry == Registry().with_resource(
+            uri="http://example.com/1",
+            resource=one,
+        ).with_resource(
+            uri="http://example.com/foo/bar",
+            resource=two,
+        )
+
+    def test_matmul_resource(self):
+        uri = "urn:example:resource"
+        resource = ID_AND_CHILDREN.create_resource({"ID": uri, "foo": 12})
+        registry = resource @ Registry()
+        assert registry == Registry().with_resource(uri, resource)
+
+    def test_matmul_many_resources(self):
+        one_uri = "urn:example:one"
+        one = ID_AND_CHILDREN.create_resource({"ID": one_uri, "foo": 12})
+
+        two_uri = "urn:example:two"
+        two = ID_AND_CHILDREN.create_resource({"ID": two_uri, "foo": 12})
+
+        registry = [one, two] @ Registry()
+        assert registry == Registry().with_resources(
+            [(one_uri, one), (two_uri, two)],
+        )
+
+    def test_matmul_resource_without_id(self):
+        resource = Resource.opaque(contents={"foo": "bar"})
+        with pytest.raises(exceptions.NoInternalID) as e:
+            resource @ Registry()
+        assert e.value == exceptions.NoInternalID(resource=resource)
+
+    def test_with_contents_from_json_schema(self):
+        uri = "urn:example"
+        schema = {"$schema": "https://json-schema.org/draft/2020-12/schema"}
+        registry = Registry().with_contents([(uri, schema)])
+
+        expected = Resource(contents=schema, specification=DRAFT202012)
+        assert registry[uri] == expected
+
+    def test_with_contents_and_default_specification(self):
+        uri = "urn:example"
+        registry = Registry().with_contents(
+            [(uri, {"foo": "bar"})],
+            default_specification=Specification.OPAQUE,
+        )
+        assert registry[uri] == Resource.opaque({"foo": "bar"})
+
+    def test_len(self):
+        total = 5
+        registry = Registry().with_contents(
+            [(str(i), {"foo": "bar"}) for i in range(total)],
+            default_specification=Specification.OPAQUE,
+        )
+        assert len(registry) == total
+
+    def test_bool_empty(self):
+        assert not Registry()
+
+    def test_bool_not_empty(self):
+        registry = Registry().with_contents(
+            [(str(i), {"foo": "bar"}) for i in range(3)],
+            default_specification=Specification.OPAQUE,
+        )
+        assert registry
+
+    def test_iter(self):
+        registry = Registry().with_contents(
+            [(str(i), {"foo": "bar"}) for i in range(8)],
+            default_specification=Specification.OPAQUE,
+        )
+        assert set(registry) == {str(i) for i in range(8)}
+
+    def test_crawl_still_has_top_level_resource(self):
+        resource = Resource.opaque({"foo": "bar"})
+        uri = "urn:example"
+        registry = Registry({uri: resource}).crawl()
+        assert registry[uri] is resource
+
+    def test_crawl_finds_a_subresource(self):
+        child_id = "urn:child"
+        root = ID_AND_CHILDREN.create_resource(
+            {"ID": "urn:root", "children": [{"ID": child_id, "foo": 12}]},
+        )
+        registry = root @ Registry()
+        with pytest.raises(LookupError):
+            registry[child_id]
+
+        expected = ID_AND_CHILDREN.create_resource({"ID": child_id, "foo": 12})
+        assert registry.crawl()[child_id] == expected
+
+    def test_crawl_finds_anchors_with_id(self):
+        resource = ID_AND_CHILDREN.create_resource(
+            {"ID": "urn:bar", "anchors": {"foo": 12}},
+        )
+        registry = resource @ Registry()
+
+        assert registry.crawl().anchor(resource.id(), "foo").value == Anchor(
+            name="foo",
+            resource=ID_AND_CHILDREN.create_resource(12),
+        )
+
+    def test_crawl_finds_anchors_no_id(self):
+        resource = ID_AND_CHILDREN.create_resource({"anchors": {"foo": 12}})
+        registry = Registry().with_resource("urn:root", resource)
+
+        assert registry.crawl().anchor("urn:root", "foo").value == Anchor(
+            name="foo",
+            resource=ID_AND_CHILDREN.create_resource(12),
+        )
+
+    def test_contents(self):
+        resource = Resource.opaque({"foo": "bar"})
+        uri = "urn:example"
+        registry = Registry().with_resource(uri, resource)
+        assert registry.contents(uri) == {"foo": "bar"}
+
+    def test_getitem_strips_empty_fragments(self):
+        uri = "http://example.com/"
+        resource = ID_AND_CHILDREN.create_resource({"ID": uri + "#"})
+        registry = resource @ Registry()
+        assert registry[uri] == registry[uri + "#"] == resource
+
+    def test_contents_strips_empty_fragments(self):
+        uri = "http://example.com/"
+        resource = ID_AND_CHILDREN.create_resource({"ID": uri + "#"})
+        registry = resource @ Registry()
+        assert (
+            registry.contents(uri)
+            == registry.contents(uri + "#")
+            == {"ID": uri + "#"}
+        )
+
+    def test_contents_nonexistent_resource(self):
+        registry = Registry()
+        with pytest.raises(exceptions.NoSuchResource) as e:
+            registry.contents("urn:example")
+        assert e.value == exceptions.NoSuchResource(ref="urn:example")
+
+    def test_crawled_anchor(self):
+        resource = ID_AND_CHILDREN.create_resource({"anchors": {"foo": "bar"}})
+        registry = Registry().with_resource("urn:example", resource)
+        retrieved = registry.anchor("urn:example", "foo")
+        assert retrieved.value == Anchor(
+            name="foo",
+            resource=ID_AND_CHILDREN.create_resource("bar"),
+        )
+        assert retrieved.registry == registry.crawl()
+
+    def test_anchor_in_nonexistent_resource(self):
+        registry = Registry()
+        with pytest.raises(exceptions.NoSuchResource) as e:
+            registry.anchor("urn:example", "foo")
+        assert e.value == exceptions.NoSuchResource(ref="urn:example")
+
+    def test_init(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        registry = Registry(
+            {
+                "http://example.com/1": one,
+                "http://example.com/foo/bar": two,
+            },
+        )
+        assert (
+            registry
+            == Registry()
+            .with_resources(
+                [
+                    ("http://example.com/1", one),
+                    ("http://example.com/foo/bar", two),
+                ],
+            )
+            .crawl()
+        )
+
+    def test_dict_conversion(self):
+        """
+        Passing a `dict` to `Registry` gets converted to a `HashTrieMap`.
+
+        So continuing to use the registry works.
+        """
+
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        registry = Registry(
+            {"http://example.com/1": one},
+        ).with_resource("http://example.com/foo/bar", two)
+        assert (
+            registry.crawl()
+            == Registry()
+            .with_resources(
+                [
+                    ("http://example.com/1", one),
+                    ("http://example.com/foo/bar", two),
+                ],
+            )
+            .crawl()
+        )
+
+    def test_no_such_resource(self):
+        registry = Registry()
+        with pytest.raises(exceptions.NoSuchResource) as e:
+            registry["urn:bigboom"]
+        assert e.value == exceptions.NoSuchResource(ref="urn:bigboom")
+
+    def test_combine(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        three = ID_AND_CHILDREN.create_resource({"baz": "quux"})
+        four = ID_AND_CHILDREN.create_resource({"anchors": {"foo": 12}})
+
+        first = Registry({"http://example.com/1": one})
+        second = Registry().with_resource("http://example.com/foo/bar", two)
+        third = Registry(
+            {
+                "http://example.com/1": one,
+                "http://example.com/baz": three,
+            },
+        )
+        fourth = (
+            Registry()
+            .with_resource(
+                "http://example.com/foo/quux",
+                four,
+            )
+            .crawl()
+        )
+        assert first.combine(second, third, fourth) == Registry(
+            [
+                ("http://example.com/1", one),
+                ("http://example.com/baz", three),
+                ("http://example.com/foo/quux", four),
+            ],
+            anchors=HashTrieMap(
+                {
+                    ("http://example.com/foo/quux", "foo"): Anchor(
+                        name="foo",
+                        resource=ID_AND_CHILDREN.create_resource(12),
+                    ),
+                },
+            ),
+        ).with_resource("http://example.com/foo/bar", two)
+
+    def test_combine_self(self):
+        """
+        Combining a registry with itself short-circuits.
+
+        This is a performance optimization -- otherwise we do lots more work
+        (in jsonschema this seems to correspond to making the test suite take
+        *3x* longer).
+        """
+
+        registry = Registry({"urn:foo": "bar"})
+        assert registry.combine(registry) is registry
+
+    def test_combine_with_uncrawled_resources(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        three = ID_AND_CHILDREN.create_resource({"baz": "quux"})
+
+        first = Registry().with_resource("http://example.com/1", one)
+        second = Registry().with_resource("http://example.com/foo/bar", two)
+        third = Registry(
+            {
+                "http://example.com/1": one,
+                "http://example.com/baz": three,
+            },
+        )
+        expected = Registry(
+            [
+                ("http://example.com/1", one),
+                ("http://example.com/foo/bar", two),
+                ("http://example.com/baz", three),
+            ],
+        )
+        combined = first.combine(second, third)
+        assert combined != expected
+        assert combined.crawl() == expected
+
+    def test_combine_with_single_retrieve(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        three = ID_AND_CHILDREN.create_resource({"baz": "quux"})
+
+        def retrieve(uri):  # pragma: no cover
+            pass
+
+        first = Registry().with_resource("http://example.com/1", one)
+        second = Registry(
+            retrieve=retrieve,
+        ).with_resource("http://example.com/2", two)
+        third = Registry().with_resource("http://example.com/3", three)
+
+        assert first.combine(second, third) == Registry(
+            retrieve=retrieve,
+        ).with_resources(
+            [
+                ("http://example.com/1", one),
+                ("http://example.com/2", two),
+                ("http://example.com/3", three),
+            ],
+        )
+        assert second.combine(first, third) == Registry(
+            retrieve=retrieve,
+        ).with_resources(
+            [
+                ("http://example.com/1", one),
+                ("http://example.com/2", two),
+                ("http://example.com/3", three),
+            ],
+        )
+
+    def test_combine_with_common_retrieve(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        three = ID_AND_CHILDREN.create_resource({"baz": "quux"})
+
+        def retrieve(uri):  # pragma: no cover
+            pass
+
+        first = Registry(retrieve=retrieve).with_resource(
+            "http://example.com/1",
+            one,
+        )
+        second = Registry(
+            retrieve=retrieve,
+        ).with_resource("http://example.com/2", two)
+        third = Registry(retrieve=retrieve).with_resource(
+            "http://example.com/3",
+            three,
+        )
+
+        assert first.combine(second, third) == Registry(
+            retrieve=retrieve,
+        ).with_resources(
+            [
+                ("http://example.com/1", one),
+                ("http://example.com/2", two),
+                ("http://example.com/3", three),
+            ],
+        )
+        assert second.combine(first, third) == Registry(
+            retrieve=retrieve,
+        ).with_resources(
+            [
+                ("http://example.com/1", one),
+                ("http://example.com/2", two),
+                ("http://example.com/3", three),
+            ],
+        )
+
+    def test_combine_conflicting_retrieve(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        three = ID_AND_CHILDREN.create_resource({"baz": "quux"})
+
+        def foo_retrieve(uri):  # pragma: no cover
+            pass
+
+        def bar_retrieve(uri):  # pragma: no cover
+            pass
+
+        first = Registry(retrieve=foo_retrieve).with_resource(
+            "http://example.com/1",
+            one,
+        )
+        second = Registry().with_resource("http://example.com/2", two)
+        third = Registry(retrieve=bar_retrieve).with_resource(
+            "http://example.com/3",
+            three,
+        )
+
+        with pytest.raises(Exception, match="conflict.*retriev"):
+            first.combine(second, third)
+
+    def test_remove(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        registry = Registry({"urn:foo": one, "urn:bar": two})
+        assert registry.remove("urn:foo") == Registry({"urn:bar": two})
+
+    def test_remove_uncrawled(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        registry = Registry().with_resources(
+            [("urn:foo", one), ("urn:bar", two)],
+        )
+        assert registry.remove("urn:foo") == Registry().with_resource(
+            "urn:bar",
+            two,
+        )
+
+    def test_remove_with_anchors(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"anchors": {"foo": "bar"}})
+        registry = (
+            Registry()
+            .with_resources(
+                [("urn:foo", one), ("urn:bar", two)],
+            )
+            .crawl()
+        )
+        assert (
+            registry.remove("urn:bar")
+            == Registry()
+            .with_resource(
+                "urn:foo",
+                one,
+            )
+            .crawl()
+        )
+
+    def test_remove_nonexistent_uri(self):
+        with pytest.raises(exceptions.NoSuchResource) as e:
+            Registry().remove("urn:doesNotExist")
+        assert e.value == exceptions.NoSuchResource(ref="urn:doesNotExist")
+
+    def test_retrieve(self):
+        foo = Resource.opaque({"foo": "bar"})
+        registry = Registry(retrieve=lambda uri: foo)
+        assert registry.get_or_retrieve("urn:example").value == foo
+
+    def test_retrieve_arbitrary_exception(self):
+        foo = Resource.opaque({"foo": "bar"})
+
+        def retrieve(uri):
+            if uri == "urn:succeed":
+                return foo
+            raise Exception("Oh no!")
+
+        registry = Registry(retrieve=retrieve)
+        assert registry.get_or_retrieve("urn:succeed").value == foo
+        with pytest.raises(exceptions.Unretrievable):
+            registry.get_or_retrieve("urn:uhoh")
+
+    def test_retrieve_no_such_resource(self):
+        foo = Resource.opaque({"foo": "bar"})
+
+        def retrieve(uri):
+            if uri == "urn:succeed":
+                return foo
+            raise exceptions.NoSuchResource(ref=uri)
+
+        registry = Registry(retrieve=retrieve)
+        assert registry.get_or_retrieve("urn:succeed").value == foo
+        with pytest.raises(exceptions.NoSuchResource):
+            registry.get_or_retrieve("urn:uhoh")
+
+    def test_retrieve_cannot_determine_specification(self):
+        def retrieve(uri):
+            return Resource.from_contents({})
+
+        registry = Registry(retrieve=retrieve)
+        with pytest.raises(exceptions.CannotDetermineSpecification):
+            registry.get_or_retrieve("urn:uhoh")
+
+    def test_retrieve_already_available_resource(self):
+        foo = Resource.opaque({"foo": "bar"})
+        registry = Registry({"urn:example": foo}, retrieve=blow_up)
+        assert registry["urn:example"] == foo
+        assert registry.get_or_retrieve("urn:example").value == foo
+
+    def test_retrieve_first_checks_crawlable_resource(self):
+        child = ID_AND_CHILDREN.create_resource({"ID": "urn:child", "foo": 12})
+        root = ID_AND_CHILDREN.create_resource({"children": [child.contents]})
+        registry = Registry(retrieve=blow_up).with_resource("urn:root", root)
+        assert registry.crawl()["urn:child"] == child
+
+    def test_resolver(self):
+        one = Resource.opaque(contents={})
+        registry = Registry({"http://example.com": one})
+        resolver = registry.resolver(base_uri="http://example.com")
+        assert resolver.lookup("#").contents == {}
+
+    def test_resolver_with_root_identified(self):
+        root = ID_AND_CHILDREN.create_resource({"ID": "http://example.com"})
+        resolver = Registry().resolver_with_root(root)
+        assert resolver.lookup("http://example.com").contents == root.contents
+        assert resolver.lookup("#").contents == root.contents
+
+    def test_resolver_with_root_unidentified(self):
+        root = Resource.opaque(contents={})
+        resolver = Registry().resolver_with_root(root)
+        assert resolver.lookup("#").contents == root.contents
+
+    def test_repr(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        registry = Registry().with_resources(
+            [
+                ("http://example.com/1", one),
+                ("http://example.com/foo/bar", two),
+            ],
+        )
+        assert repr(registry) == "<Registry (2 uncrawled resources)>"
+        assert repr(registry.crawl()) == "<Registry (2 resources)>"
+
+    def test_repr_mixed_crawled(self):
+        one = Resource.opaque(contents={})
+        two = ID_AND_CHILDREN.create_resource({"foo": "bar"})
+        registry = (
+            Registry(
+                {"http://example.com/1": one},
+            )
+            .crawl()
+            .with_resource(uri="http://example.com/foo/bar", resource=two)
+        )
+        assert repr(registry) == "<Registry (2 resources, 1 uncrawled)>"
+
+    def test_repr_one_resource(self):
+        registry = Registry().with_resource(
+            uri="http://example.com/1",
+            resource=Resource.opaque(contents={}),
+        )
+        assert repr(registry) == "<Registry (1 uncrawled resource)>"
+
+    def test_repr_empty(self):
+        assert repr(Registry()) == "<Registry (0 resources)>"
+
+
+class TestResource:
+    def test_from_contents_from_json_schema(self):
+        schema = {"$schema": "https://json-schema.org/draft/2020-12/schema"}
+        resource = Resource.from_contents(schema)
+        assert resource == Resource(contents=schema, specification=DRAFT202012)
+
+    def test_from_contents_with_no_discernible_information(self):
+        """
+        Creating a resource with no discernible way to see what
+        specification it belongs to (e.g. no ``$schema`` keyword for JSON
+        Schema) raises an error.
+        """
+
+        with pytest.raises(exceptions.CannotDetermineSpecification):
+            Resource.from_contents({"foo": "bar"})
+
+    def test_from_contents_with_no_discernible_information_and_default(self):
+        resource = Resource.from_contents(
+            {"foo": "bar"},
+            default_specification=Specification.OPAQUE,
+        )
+        assert resource == Resource.opaque(contents={"foo": "bar"})
+
+    def test_from_contents_unneeded_default(self):
+        schema = {"$schema": "https://json-schema.org/draft/2020-12/schema"}
+        resource = Resource.from_contents(
+            schema,
+            default_specification=Specification.OPAQUE,
+        )
+        assert resource == Resource(
+            contents=schema,
+            specification=DRAFT202012,
+        )
+
+    def test_non_mapping_from_contents(self):
+        resource = Resource.from_contents(
+            True,
+            default_specification=ID_AND_CHILDREN,
+        )
+        assert resource == Resource(
+            contents=True,
+            specification=ID_AND_CHILDREN,
+        )
+
+    def test_from_contents_with_fallback(self):
+        resource = Resource.from_contents(
+            {"foo": "bar"},
+            default_specification=Specification.OPAQUE,
+        )
+        assert resource == Resource.opaque(contents={"foo": "bar"})
+
+    def test_id_delegates_to_specification(self):
+        specification = Specification(
+            name="",
+            id_of=lambda contents: "urn:fixedID",
+            subresources_of=lambda contents: [],
+            anchors_in=lambda specification, contents: [],
+            maybe_in_subresource=(
+                lambda segments, resolver, subresource: resolver
+            ),
+        )
+        resource = Resource(
+            contents={"foo": "baz"},
+            specification=specification,
+        )
+        assert resource.id() == "urn:fixedID"
+
+    def test_id_strips_empty_fragment(self):
+        uri = "http://example.com/"
+        root = ID_AND_CHILDREN.create_resource({"ID": uri + "#"})
+        assert root.id() == uri
+
+    def test_subresources_delegates_to_specification(self):
+        resource = ID_AND_CHILDREN.create_resource({"children": [{}, 12]})
+        assert list(resource.subresources()) == [
+            ID_AND_CHILDREN.create_resource(each) for each in [{}, 12]
+        ]
+
+    def test_subresource_with_different_specification(self):
+        schema = {"$schema": "https://json-schema.org/draft/2020-12/schema"}
+        resource = ID_AND_CHILDREN.create_resource({"children": [schema]})
+        assert list(resource.subresources()) == [
+            DRAFT202012.create_resource(schema),
+        ]
+
+    def test_anchors_delegates_to_specification(self):
+        resource = ID_AND_CHILDREN.create_resource(
+            {"anchors": {"foo": {}, "bar": 1, "baz": ""}},
+        )
+        assert list(resource.anchors()) == [
+            Anchor(name="foo", resource=ID_AND_CHILDREN.create_resource({})),
+            Anchor(name="bar", resource=ID_AND_CHILDREN.create_resource(1)),
+            Anchor(name="baz", resource=ID_AND_CHILDREN.create_resource("")),
+        ]
+
+    def test_pointer_to_mapping(self):
+        resource = Resource.opaque(contents={"foo": "baz"})
+        resolver = Registry().resolver()
+        assert resource.pointer("/foo", resolver=resolver).contents == "baz"
+
+    def test_pointer_to_array(self):
+        resource = Resource.opaque(contents={"foo": {"bar": [3]}})
+        resolver = Registry().resolver()
+        assert resource.pointer("/foo/bar/0", resolver=resolver).contents == 3
+
+    def test_root_pointer(self):
+        contents = {"foo": "baz"}
+        resource = Resource.opaque(contents=contents)
+        resolver = Registry().resolver()
+        assert resource.pointer("", resolver=resolver).contents == contents
+
+    def test_opaque(self):
+        contents = {"foo": "bar"}
+        assert Resource.opaque(contents) == Resource(
+            contents=contents,
+            specification=Specification.OPAQUE,
+        )
+
+
+class TestResolver:
+    def test_lookup_exact_uri(self):
+        resource = Resource.opaque(contents={"foo": "baz"})
+        resolver = Registry({"http://example.com/1": resource}).resolver()
+        resolved = resolver.lookup("http://example.com/1")
+        assert resolved.contents == resource.contents
+
+    def test_lookup_subresource(self):
+        root = ID_AND_CHILDREN.create_resource(
+            {
+                "ID": "http://example.com/",
+                "children": [
+                    {"ID": "http://example.com/a", "foo": 12},
+                ],
+            },
+        )
+        registry = root @ Registry()
+        resolved = registry.resolver().lookup("http://example.com/a")
+        assert resolved.contents == {"ID": "http://example.com/a", "foo": 12}
+
+    def test_lookup_anchor_with_id(self):
+        root = ID_AND_CHILDREN.create_resource(
+            {
+                "ID": "http://example.com/",
+                "anchors": {"foo": 12},
+            },
+        )
+        registry = root @ Registry()
+        resolved = registry.resolver().lookup("http://example.com/#foo")
+        assert resolved.contents == 12
+
+    def test_lookup_anchor_without_id(self):
+        root = ID_AND_CHILDREN.create_resource({"anchors": {"foo": 12}})
+        resolver = Registry().with_resource("urn:example", root).resolver()
+        resolved = resolver.lookup("urn:example#foo")
+        assert resolved.contents == 12
+
+    def test_lookup_unknown_reference(self):
+        resolver = Registry().resolver()
+        ref = "http://example.com/does/not/exist"
+        with pytest.raises(exceptions.Unresolvable) as e:
+            resolver.lookup(ref)
+        assert e.value == exceptions.Unresolvable(ref=ref)
+
+    def test_lookup_non_existent_pointer(self):
+        resource = Resource.opaque({"foo": {}})
+        resolver = Registry({"http://example.com/1": resource}).resolver()
+        ref = "http://example.com/1#/foo/bar"
+        with pytest.raises(exceptions.Unresolvable) as e:
+            resolver.lookup(ref)
+        assert e.value == exceptions.PointerToNowhere(
+            ref="/foo/bar",
+            resource=resource,
+        )
+        assert str(e.value) == "'/foo/bar' does not exist within {'foo': {}}"
+
+    def test_lookup_non_existent_pointer_to_array_index(self):
+        resource = Resource.opaque([1, 2, 4, 8])
+        resolver = Registry({"http://example.com/1": resource}).resolver()
+        ref = "http://example.com/1#/10"
+        with pytest.raises(exceptions.Unresolvable) as e:
+            resolver.lookup(ref)
+        assert e.value == exceptions.PointerToNowhere(
+            ref="/10",
+            resource=resource,
+        )
+
+    def test_lookup_pointer_to_empty_string(self):
+        resolver = Registry().resolver_with_root(Resource.opaque({"": {}}))
+        assert resolver.lookup("#/").contents == {}
+
+    def test_lookup_non_existent_pointer_to_empty_string(self):
+        resource = Resource.opaque({"foo": {}})
+        resolver = Registry().resolver_with_root(resource)
+        with pytest.raises(
+            exceptions.Unresolvable,
+            match="^'/' does not exist within {'foo': {}}.*'#'",
+        ) as e:
+            resolver.lookup("#/")
+        assert e.value == exceptions.PointerToNowhere(
+            ref="/",
+            resource=resource,
+        )
+
+    def test_lookup_non_existent_anchor(self):
+        root = ID_AND_CHILDREN.create_resource({"anchors": {}})
+        resolver = Registry().with_resource("urn:example", root).resolver()
+        resolved = resolver.lookup("urn:example")
+        assert resolved.contents == root.contents
+
+        ref = "urn:example#noSuchAnchor"
+        with pytest.raises(exceptions.Unresolvable) as e:
+            resolver.lookup(ref)
+        assert "'noSuchAnchor' does not exist" in str(e.value)
+        assert e.value == exceptions.NoSuchAnchor(
+            ref="urn:example",
+            resource=root,
+            anchor="noSuchAnchor",
+        )
+
+    def test_lookup_invalid_JSON_pointerish_anchor(self):
+        resolver = Registry().resolver_with_root(
800
+ ID_AND_CHILDREN.create_resource(
801
+ {
802
+ "ID": "http://example.com/",
803
+ "foo": {"bar": 12},
804
+ },
805
+ ),
806
+ )
807
+
808
+ valid = resolver.lookup("#/foo/bar")
809
+ assert valid.contents == 12
810
+
811
+ with pytest.raises(exceptions.InvalidAnchor) as e:
812
+ resolver.lookup("#foo/bar")
813
+ assert " '#/foo/bar'" in str(e.value)
814
+
815
+ def test_lookup_retrieved_resource(self):
816
+ resource = Resource.opaque(contents={"foo": "baz"})
817
+ resolver = Registry(retrieve=lambda uri: resource).resolver()
818
+ resolved = resolver.lookup("http://example.com/")
819
+ assert resolved.contents == resource.contents
820
+
821
+ def test_lookup_failed_retrieved_resource(self):
822
+ """
823
+ Unretrievable exceptions are also wrapped in Unresolvable.
824
+ """
825
+
826
+ uri = "http://example.com/"
827
+
828
+ registry = Registry(retrieve=blow_up)
829
+ with pytest.raises(exceptions.Unretrievable):
830
+ registry.get_or_retrieve(uri)
831
+
832
+ resolver = registry.resolver()
833
+ with pytest.raises(exceptions.Unresolvable):
834
+ resolver.lookup(uri)
835
+
836
+ def test_repeated_lookup_from_retrieved_resource(self):
837
+ """
838
+ A (custom-)retrieved resource is added to the registry returned by
839
+ looking it up.
840
+ """
841
+ resource = Resource.opaque(contents={"foo": "baz"})
842
+ once = [resource]
843
+
844
+ def retrieve(uri):
845
+ return once.pop()
846
+
847
+ resolver = Registry(retrieve=retrieve).resolver()
848
+ resolved = resolver.lookup("http://example.com/")
849
+ assert resolved.contents == resource.contents
850
+
851
+ resolved = resolved.resolver.lookup("http://example.com/")
852
+ assert resolved.contents == resource.contents
853
+
854
+ def test_repeated_anchor_lookup_from_retrieved_resource(self):
855
+ resource = Resource.opaque(contents={"foo": "baz"})
856
+ once = [resource]
857
+
858
+ def retrieve(uri):
859
+ return once.pop()
860
+
861
+ resolver = Registry(retrieve=retrieve).resolver()
862
+ resolved = resolver.lookup("http://example.com/")
863
+ assert resolved.contents == resource.contents
864
+
865
+ resolved = resolved.resolver.lookup("#")
866
+ assert resolved.contents == resource.contents
867
+
868
+ # FIXME: The tests below aren't really representable in the current
869
+ # suite, though we should probably think of ways to do so.
870
+
871
+ def test_in_subresource(self):
872
+ root = ID_AND_CHILDREN.create_resource(
873
+ {
874
+ "ID": "http://example.com/",
875
+ "children": [
876
+ {
877
+ "ID": "child/",
878
+ "children": [{"ID": "grandchild"}],
879
+ },
880
+ ],
881
+ },
882
+ )
883
+ registry = root @ Registry()
884
+
885
+ resolver = registry.resolver()
886
+ first = resolver.lookup("http://example.com/")
887
+ assert first.contents == root.contents
888
+
889
+ with pytest.raises(exceptions.Unresolvable):
890
+ first.resolver.lookup("grandchild")
891
+
892
+ sub = first.resolver.in_subresource(
893
+ ID_AND_CHILDREN.create_resource(first.contents["children"][0]),
894
+ )
895
+ second = sub.lookup("grandchild")
896
+ assert second.contents == {"ID": "grandchild"}
897
+
898
+ def test_in_pointer_subresource(self):
899
+ root = ID_AND_CHILDREN.create_resource(
900
+ {
901
+ "ID": "http://example.com/",
902
+ "children": [
903
+ {
904
+ "ID": "child/",
905
+ "children": [{"ID": "grandchild"}],
906
+ },
907
+ ],
908
+ },
909
+ )
910
+ registry = root @ Registry()
911
+
912
+ resolver = registry.resolver()
913
+ first = resolver.lookup("http://example.com/")
914
+ assert first.contents == root.contents
915
+
916
+ with pytest.raises(exceptions.Unresolvable):
917
+ first.resolver.lookup("grandchild")
918
+
919
+ second = first.resolver.lookup("#/children/0")
920
+ third = second.resolver.lookup("grandchild")
921
+ assert third.contents == {"ID": "grandchild"}
922
+
923
+ def test_dynamic_scope(self):
924
+ one = ID_AND_CHILDREN.create_resource(
925
+ {
926
+ "ID": "http://example.com/",
927
+ "children": [
928
+ {
929
+ "ID": "child/",
930
+ "children": [{"ID": "grandchild"}],
931
+ },
932
+ ],
933
+ },
934
+ )
935
+ two = ID_AND_CHILDREN.create_resource(
936
+ {
937
+ "ID": "http://example.com/two",
938
+ "children": [{"ID": "two-child/"}],
939
+ },
940
+ )
941
+ registry = [one, two] @ Registry()
942
+
943
+ resolver = registry.resolver()
944
+ first = resolver.lookup("http://example.com/")
945
+ second = first.resolver.lookup("#/children/0")
946
+ third = second.resolver.lookup("grandchild")
947
+ fourth = third.resolver.lookup("http://example.com/two")
948
+ assert list(fourth.resolver.dynamic_scope()) == [
949
+ ("http://example.com/child/grandchild", fourth.resolver._registry),
950
+ ("http://example.com/child/", fourth.resolver._registry),
951
+ ("http://example.com/", fourth.resolver._registry),
952
+ ]
953
+ assert list(third.resolver.dynamic_scope()) == [
954
+ ("http://example.com/child/", third.resolver._registry),
955
+ ("http://example.com/", third.resolver._registry),
956
+ ]
957
+ assert list(second.resolver.dynamic_scope()) == [
958
+ ("http://example.com/", second.resolver._registry),
959
+ ]
960
+ assert list(first.resolver.dynamic_scope()) == []
961
+
962
+
963
+ class TestSpecification:
964
+ def test_create_resource(self):
965
+ specification = Specification(
966
+ name="",
967
+ id_of=lambda contents: "urn:fixedID",
968
+ subresources_of=lambda contents: [],
969
+ anchors_in=lambda specification, contents: [],
970
+ maybe_in_subresource=(
971
+ lambda segments, resolver, subresource: resolver
972
+ ),
973
+ )
974
+ resource = specification.create_resource(contents={"foo": "baz"})
975
+ assert resource == Resource(
976
+ contents={"foo": "baz"},
977
+ specification=specification,
978
+ )
979
+ assert resource.id() == "urn:fixedID"
980
+
981
+ def test_detect_from_json_schema(self):
982
+ schema = {"$schema": "https://json-schema.org/draft/2020-12/schema"}
983
+ specification = Specification.detect(schema)
984
+ assert specification == DRAFT202012
985
+
986
+ def test_detect_with_no_discernible_information(self):
987
+ with pytest.raises(exceptions.CannotDetermineSpecification):
988
+ Specification.detect({"foo": "bar"})
989
+
990
+ def test_detect_with_non_URI_schema(self):
991
+ with pytest.raises(exceptions.CannotDetermineSpecification):
992
+ Specification.detect({"$schema": 37})
993
+
994
+ def test_detect_with_no_discernible_information_and_default(self):
995
+ specification = Specification.OPAQUE.detect({"foo": "bar"})
996
+ assert specification is Specification.OPAQUE
997
+
998
+ def test_detect_unneeded_default(self):
999
+ schema = {"$schema": "https://json-schema.org/draft/2020-12/schema"}
1000
+ specification = Specification.OPAQUE.detect(schema)
1001
+ assert specification == DRAFT202012
1002
+
1003
+ def test_non_mapping_detect(self):
1004
+ with pytest.raises(exceptions.CannotDetermineSpecification):
1005
+ Specification.detect(True)
1006
+
1007
+ def test_non_mapping_detect_with_default(self):
1008
+ specification = ID_AND_CHILDREN.detect(True)
1009
+ assert specification is ID_AND_CHILDREN
1010
+
1011
+ def test_detect_with_fallback(self):
1012
+ specification = Specification.OPAQUE.detect({"foo": "bar"})
1013
+ assert specification is Specification.OPAQUE
1014
+
1015
+ def test_repr(self):
1016
+ assert (
1017
+ repr(ID_AND_CHILDREN) == "<Specification name='id-and-children'>"
1018
+ )
1019
+
1020
+
1021
+ class TestOpaqueSpecification:
1022
+ THINGS = [{"foo": "bar"}, True, 37, "foo", object()]
1023
+
1024
+ @pytest.mark.parametrize("thing", THINGS)
1025
+ def test_no_id(self, thing):
1026
+ """
1027
+ An arbitrary thing has no ID.
1028
+ """
1029
+
1030
+ assert Specification.OPAQUE.id_of(thing) is None
1031
+
1032
+ @pytest.mark.parametrize("thing", THINGS)
1033
+ def test_no_subresources(self, thing):
1034
+ """
1035
+ An arbitrary thing has no subresources.
1036
+ """
1037
+
1038
+ assert list(Specification.OPAQUE.subresources_of(thing)) == []
1039
+
1040
+ @pytest.mark.parametrize("thing", THINGS)
1041
+ def test_no_anchors(self, thing):
1042
+ """
1043
+ An arbitrary thing has no anchors.
1044
+ """
1045
+
1046
+ assert list(Specification.OPAQUE.anchors_in(thing)) == []
1047
+
1048
+
1049
+ @pytest.mark.parametrize(
1050
+ "cls",
1051
+ [Anchor, Registry, Resource, Specification, exceptions.PointerToNowhere],
1052
+ )
1053
+ def test_nonsubclassable(cls):
1054
+ with pytest.raises(Exception, match="(?i)subclassing"):
1055
+
1056
+ class Boom(cls): # pragma: no cover
1057
+ pass
valley/lib/python3.10/site-packages/referencing/tests/test_exceptions.py ADDED
@@ -0,0 +1,34 @@
+import itertools
+
+import pytest
+
+from referencing import Resource, exceptions
+
+
+def pairs(choices):
+    return itertools.combinations(choices, 2)
+
+
+TRUE = Resource.opaque(True)
+
+
+thunks = (
+    lambda: exceptions.CannotDetermineSpecification(TRUE),
+    lambda: exceptions.NoSuchResource("urn:example:foo"),
+    lambda: exceptions.NoInternalID(TRUE),
+    lambda: exceptions.InvalidAnchor(resource=TRUE, anchor="foo", ref="a#b"),
+    lambda: exceptions.NoSuchAnchor(resource=TRUE, anchor="foo", ref="a#b"),
+    lambda: exceptions.PointerToNowhere(resource=TRUE, ref="urn:example:foo"),
+    lambda: exceptions.Unresolvable("urn:example:foo"),
+    lambda: exceptions.Unretrievable("urn:example:foo"),
+)
+
+
+@pytest.mark.parametrize("one, two", pairs(each() for each in thunks))
+def test_eq_incompatible_types(one, two):
+    assert one != two
+
+
+@pytest.mark.parametrize("thunk", thunks)
+def test_hash(thunk):
+    assert thunk() in {thunk()}
valley/lib/python3.10/site-packages/referencing/tests/test_jsonschema.py ADDED
@@ -0,0 +1,382 @@
+import pytest
+
+from referencing import Registry, Resource, Specification
+import referencing.jsonschema
+
+
+@pytest.mark.parametrize(
+    "uri, expected",
+    [
+        (
+            "https://json-schema.org/draft/2020-12/schema",
+            referencing.jsonschema.DRAFT202012,
+        ),
+        (
+            "https://json-schema.org/draft/2019-09/schema",
+            referencing.jsonschema.DRAFT201909,
+        ),
+        (
+            "http://json-schema.org/draft-07/schema#",
+            referencing.jsonschema.DRAFT7,
+        ),
+        (
+            "http://json-schema.org/draft-06/schema#",
+            referencing.jsonschema.DRAFT6,
+        ),
+        (
+            "http://json-schema.org/draft-04/schema#",
+            referencing.jsonschema.DRAFT4,
+        ),
+        (
+            "http://json-schema.org/draft-03/schema#",
+            referencing.jsonschema.DRAFT3,
+        ),
+    ],
+)
+def test_schemas_with_explicit_schema_keywords_are_detected(uri, expected):
+    """
+    The $schema keyword in JSON Schema is a dialect identifier.
+    """
+    contents = {"$schema": uri}
+    resource = Resource.from_contents(contents)
+    assert resource == Resource(contents=contents, specification=expected)
+
+
+def test_unknown_dialect():
+    dialect_id = "http://example.com/unknown-json-schema-dialect-id"
+    with pytest.raises(referencing.jsonschema.UnknownDialect) as excinfo:
+        Resource.from_contents({"$schema": dialect_id})
+    assert excinfo.value.uri == dialect_id
+
+
+@pytest.mark.parametrize(
+    "id, specification",
+    [
+        ("$id", referencing.jsonschema.DRAFT202012),
+        ("$id", referencing.jsonschema.DRAFT201909),
+        ("$id", referencing.jsonschema.DRAFT7),
+        ("$id", referencing.jsonschema.DRAFT6),
+        ("id", referencing.jsonschema.DRAFT4),
+        ("id", referencing.jsonschema.DRAFT3),
+    ],
+)
+def test_id_of_mapping(id, specification):
+    uri = "http://example.com/some-schema"
+    assert specification.id_of({id: uri}) == uri
+
+
+@pytest.mark.parametrize(
+    "specification",
+    [
+        referencing.jsonschema.DRAFT202012,
+        referencing.jsonschema.DRAFT201909,
+        referencing.jsonschema.DRAFT7,
+        referencing.jsonschema.DRAFT6,
+    ],
+)
+@pytest.mark.parametrize("value", [True, False])
+def test_id_of_bool(specification, value):
+    assert specification.id_of(value) is None
+
+
+@pytest.mark.parametrize(
+    "specification",
+    [
+        referencing.jsonschema.DRAFT202012,
+        referencing.jsonschema.DRAFT201909,
+        referencing.jsonschema.DRAFT7,
+        referencing.jsonschema.DRAFT6,
+    ],
+)
+@pytest.mark.parametrize("value", [True, False])
+def test_anchors_in_bool(specification, value):
+    assert list(specification.anchors_in(value)) == []
+
+
+@pytest.mark.parametrize(
+    "specification",
+    [
+        referencing.jsonschema.DRAFT202012,
+        referencing.jsonschema.DRAFT201909,
+        referencing.jsonschema.DRAFT7,
+        referencing.jsonschema.DRAFT6,
+    ],
+)
+@pytest.mark.parametrize("value", [True, False])
+def test_subresources_of_bool(specification, value):
+    assert list(specification.subresources_of(value)) == []
+
+
+@pytest.mark.parametrize(
+    "uri, expected",
+    [
+        (
+            "https://json-schema.org/draft/2020-12/schema",
+            referencing.jsonschema.DRAFT202012,
+        ),
+        (
+            "https://json-schema.org/draft/2019-09/schema",
+            referencing.jsonschema.DRAFT201909,
+        ),
+        (
+            "http://json-schema.org/draft-07/schema#",
+            referencing.jsonschema.DRAFT7,
+        ),
+        (
+            "http://json-schema.org/draft-06/schema#",
+            referencing.jsonschema.DRAFT6,
+        ),
+        (
+            "http://json-schema.org/draft-04/schema#",
+            referencing.jsonschema.DRAFT4,
+        ),
+        (
+            "http://json-schema.org/draft-03/schema#",
+            referencing.jsonschema.DRAFT3,
+        ),
+    ],
+)
+def test_specification_with(uri, expected):
+    assert referencing.jsonschema.specification_with(uri) == expected
+
+
+@pytest.mark.parametrize(
+    "uri, expected",
+    [
+        (
+            "http://json-schema.org/draft-07/schema",
+            referencing.jsonschema.DRAFT7,
+        ),
+        (
+            "http://json-schema.org/draft-06/schema",
+            referencing.jsonschema.DRAFT6,
+        ),
+        (
+            "http://json-schema.org/draft-04/schema",
+            referencing.jsonschema.DRAFT4,
+        ),
+        (
+            "http://json-schema.org/draft-03/schema",
+            referencing.jsonschema.DRAFT3,
+        ),
+    ],
+)
+def test_specification_with_no_empty_fragment(uri, expected):
+    assert referencing.jsonschema.specification_with(uri) == expected
+
+
+def test_specification_with_unknown_dialect():
+    dialect_id = "http://example.com/unknown-json-schema-dialect-id"
+    with pytest.raises(referencing.jsonschema.UnknownDialect) as excinfo:
+        referencing.jsonschema.specification_with(dialect_id)
+    assert excinfo.value.uri == dialect_id
+
+
+def test_specification_with_default():
+    dialect_id = "http://example.com/unknown-json-schema-dialect-id"
+    specification = referencing.jsonschema.specification_with(
+        dialect_id,
+        default=Specification.OPAQUE,
+    )
+    assert specification is Specification.OPAQUE
+
+
+# FIXME: The tests below should move to the referencing suite but I haven't yet
+# figured out how to represent dynamic (& recursive) ref lookups in it.
+def test_lookup_trivial_dynamic_ref():
+    one = referencing.jsonschema.DRAFT202012.create_resource(
+        {"$dynamicAnchor": "foo"},
+    )
+    resolver = Registry().with_resource("http://example.com", one).resolver()
+    resolved = resolver.lookup("http://example.com#foo")
+    assert resolved.contents == one.contents
+
+
+def test_multiple_lookup_trivial_dynamic_ref():
+    TRUE = referencing.jsonschema.DRAFT202012.create_resource(True)
+    root = referencing.jsonschema.DRAFT202012.create_resource(
+        {
+            "$id": "http://example.com",
+            "$dynamicAnchor": "fooAnchor",
+            "$defs": {
+                "foo": {
+                    "$id": "foo",
+                    "$dynamicAnchor": "fooAnchor",
+                    "$defs": {
+                        "bar": True,
+                        "baz": {
+                            "$dynamicAnchor": "fooAnchor",
+                        },
+                    },
+                },
+            },
+        },
+    )
+    resolver = (
+        Registry()
+        .with_resources(
+            [
+                ("http://example.com", root),
+                ("http://example.com/foo/", TRUE),
+                ("http://example.com/foo/bar", root),
+            ],
+        )
+        .resolver()
+    )
+
+    first = resolver.lookup("http://example.com")
+    second = first.resolver.lookup("foo/")
+    resolver = second.resolver.lookup("bar").resolver
+    fourth = resolver.lookup("#fooAnchor")
+    assert fourth.contents == root.contents
+
+
+def test_multiple_lookup_dynamic_ref_to_nondynamic_ref():
+    one = referencing.jsonschema.DRAFT202012.create_resource(
+        {"$anchor": "fooAnchor"},
+    )
+    two = referencing.jsonschema.DRAFT202012.create_resource(
+        {
+            "$id": "http://example.com",
+            "$dynamicAnchor": "fooAnchor",
+            "$defs": {
+                "foo": {
+                    "$id": "foo",
+                    "$dynamicAnchor": "fooAnchor",
+                    "$defs": {
+                        "bar": True,
+                        "baz": {
+                            "$dynamicAnchor": "fooAnchor",
+                        },
+                    },
+                },
+            },
+        },
+    )
+    resolver = (
+        Registry()
+        .with_resources(
+            [
+                ("http://example.com", two),
+                ("http://example.com/foo/", one),
+                ("http://example.com/foo/bar", two),
+            ],
+        )
+        .resolver()
+    )
+
+    first = resolver.lookup("http://example.com")
+    second = first.resolver.lookup("foo/")
+    resolver = second.resolver.lookup("bar").resolver
+    fourth = resolver.lookup("#fooAnchor")
+    assert fourth.contents == two.contents
+
+
+def test_lookup_trivial_recursive_ref():
+    one = referencing.jsonschema.DRAFT201909.create_resource(
+        {"$recursiveAnchor": True},
+    )
+    resolver = Registry().with_resource("http://example.com", one).resolver()
+    first = resolver.lookup("http://example.com")
+    resolved = referencing.jsonschema.lookup_recursive_ref(
+        resolver=first.resolver,
+    )
+    assert resolved.contents == one.contents
+
+
+def test_lookup_recursive_ref_to_bool():
+    TRUE = referencing.jsonschema.DRAFT201909.create_resource(True)
+    registry = Registry({"http://example.com": TRUE})
+    resolved = referencing.jsonschema.lookup_recursive_ref(
+        resolver=registry.resolver(base_uri="http://example.com"),
+    )
+    assert resolved.contents == TRUE.contents
+
+
+def test_multiple_lookup_recursive_ref_to_bool():
+    TRUE = referencing.jsonschema.DRAFT201909.create_resource(True)
+    root = referencing.jsonschema.DRAFT201909.create_resource(
+        {
+            "$id": "http://example.com",
+            "$recursiveAnchor": True,
+            "$defs": {
+                "foo": {
+                    "$id": "foo",
+                    "$recursiveAnchor": True,
+                    "$defs": {
+                        "bar": True,
+                        "baz": {
+                            "$recursiveAnchor": True,
+                            "$anchor": "fooAnchor",
+                        },
+                    },
+                },
+            },
+        },
+    )
+    resolver = (
+        Registry()
+        .with_resources(
+            [
+                ("http://example.com", root),
+                ("http://example.com/foo/", TRUE),
+                ("http://example.com/foo/bar", root),
+            ],
+        )
+        .resolver()
+    )
+
+    first = resolver.lookup("http://example.com")
+    second = first.resolver.lookup("foo/")
+    resolver = second.resolver.lookup("bar").resolver
+    fourth = referencing.jsonschema.lookup_recursive_ref(resolver=resolver)
+    assert fourth.contents == root.contents
+
+
+def test_multiple_lookup_recursive_ref_with_nonrecursive_ref():
+    one = referencing.jsonschema.DRAFT201909.create_resource(
+        {"$recursiveAnchor": True},
+    )
+    two = referencing.jsonschema.DRAFT201909.create_resource(
+        {
+            "$id": "http://example.com",
+            "$recursiveAnchor": True,
+            "$defs": {
+                "foo": {
+                    "$id": "foo",
+                    "$recursiveAnchor": True,
+                    "$defs": {
+                        "bar": True,
+                        "baz": {
+                            "$recursiveAnchor": True,
+                            "$anchor": "fooAnchor",
+                        },
+                    },
+                },
+            },
+        },
+    )
+    three = referencing.jsonschema.DRAFT201909.create_resource(
+        {"$recursiveAnchor": False},
+    )
+    resolver = (
+        Registry()
+        .with_resources(
+            [
+                ("http://example.com", three),
+                ("http://example.com/foo/", two),
+                ("http://example.com/foo/bar", one),
+            ],
+        )
+        .resolver()
+    )
+
+    first = resolver.lookup("http://example.com")
+    second = first.resolver.lookup("foo/")
+    resolver = second.resolver.lookup("bar").resolver
+    fourth = referencing.jsonschema.lookup_recursive_ref(resolver=resolver)
+    assert fourth.contents == two.contents
+
+
+def test_empty_registry():
+    assert referencing.jsonschema.EMPTY_REGISTRY == Registry()
valley/lib/python3.10/site-packages/referencing/tests/test_referencing_suite.py ADDED
@@ -0,0 +1,66 @@
+from pathlib import Path
+import json
+import os
+
+import pytest
+
+from referencing import Registry
+from referencing.exceptions import Unresolvable
+import referencing.jsonschema
+
+
+class SuiteNotFound(Exception):
+    def __str__(self):  # pragma: no cover
+        return (
+            "Cannot find the referencing suite. "
+            "Set the REFERENCING_SUITE environment variable to the path to "
+            "the suite, or run the test suite from alongside a full checkout "
+            "of the git repository."
+        )
+
+
+if "REFERENCING_SUITE" in os.environ:  # pragma: no cover
+    SUITE = Path(os.environ["REFERENCING_SUITE"]) / "tests"
+else:
+    SUITE = Path(__file__).parent.parent.parent / "suite/tests"
+if not SUITE.is_dir():  # pragma: no cover
+    raise SuiteNotFound()
+DIALECT_IDS = json.loads(SUITE.joinpath("specifications.json").read_text())
+
+
+@pytest.mark.parametrize(
+    "test_path",
+    [
+        pytest.param(each, id=f"{each.parent.name}-{each.stem}")
+        for each in SUITE.glob("*/**/*.json")
+    ],
+)
+def test_referencing_suite(test_path, subtests):
+    dialect_id = DIALECT_IDS[test_path.relative_to(SUITE).parts[0]]
+    specification = referencing.jsonschema.specification_with(dialect_id)
+    loaded = json.loads(test_path.read_text())
+    registry = loaded["registry"]
+    registry = Registry().with_resources(
+        (uri, specification.create_resource(contents))
+        for uri, contents in loaded["registry"].items()
+    )
+    for test in loaded["tests"]:
+        with subtests.test(test=test):
+            if "normalization" in test_path.stem:
+                pytest.xfail("APIs need to change for proper URL support.")
+
+            resolver = registry.resolver(base_uri=test.get("base_uri", ""))
+
+            if test.get("error"):
+                with pytest.raises(Unresolvable):
+                    resolver.lookup(test["ref"])
+            else:
+                resolved = resolver.lookup(test["ref"])
+                assert resolved.contents == test["target"]
+
+                then = test.get("then")
+                while then:  # pragma: no cover
+                    with subtests.test(test=test, then=then):
+                        resolved = resolved.resolver.lookup(then["ref"])
+                        assert resolved.contents == then["target"]
+                    then = then.get("then")
valley/lib/python3.10/site-packages/referencing/tests/test_retrieval.py ADDED
@@ -0,0 +1,106 @@
+from functools import lru_cache
+import json
+
+import pytest
+
+from referencing import Registry, Resource, exceptions
+from referencing.jsonschema import DRAFT202012
+from referencing.retrieval import to_cached_resource
+
+
+class TestToCachedResource:
+    def test_it_caches_retrieved_resources(self):
+        contents = {"$schema": "https://json-schema.org/draft/2020-12/schema"}
+        stack = [json.dumps(contents)]
+
+        @to_cached_resource()
+        def retrieve(uri):
+            return stack.pop()
+
+        registry = Registry(retrieve=retrieve)
+
+        expected = Resource.from_contents(contents)
+
+        got = registry.get_or_retrieve("urn:example:schema")
+        assert got.value == expected
+
+        # And a second time we get the same value.
+        again = registry.get_or_retrieve("urn:example:schema")
+        assert again.value is got.value
+
+    def test_custom_loader(self):
+        contents = {"$schema": "https://json-schema.org/draft/2020-12/schema"}
+        stack = [json.dumps(contents)[::-1]]
+
+        @to_cached_resource(loads=lambda s: json.loads(s[::-1]))
+        def retrieve(uri):
+            return stack.pop()
+
+        registry = Registry(retrieve=retrieve)
+
+        expected = Resource.from_contents(contents)
+
+        got = registry.get_or_retrieve("urn:example:schema")
+        assert got.value == expected
+
+        # And a second time we get the same value.
+        again = registry.get_or_retrieve("urn:example:schema")
+        assert again.value is got.value
+
+    def test_custom_from_contents(self):
+        contents = {}
+        stack = [json.dumps(contents)]
+
+        @to_cached_resource(from_contents=DRAFT202012.create_resource)
+        def retrieve(uri):
+            return stack.pop()
+
+        registry = Registry(retrieve=retrieve)
+
+        expected = DRAFT202012.create_resource(contents)
+
+        got = registry.get_or_retrieve("urn:example:schema")
+        assert got.value == expected
+
+        # And a second time we get the same value.
+        again = registry.get_or_retrieve("urn:example:schema")
+        assert again.value is got.value
+
+    def test_custom_cache(self):
+        schema = {"$schema": "https://json-schema.org/draft/2020-12/schema"}
+        mapping = {
+            "urn:example:1": dict(schema, foo=1),
+            "urn:example:2": dict(schema, foo=2),
+            "urn:example:3": dict(schema, foo=3),
+        }
+
+        resources = {
+            uri: Resource.from_contents(contents)
+            for uri, contents in mapping.items()
+        }
+
+        @to_cached_resource(cache=lru_cache(maxsize=2))
+        def retrieve(uri):
+            return json.dumps(mapping.pop(uri))
+
+        registry = Registry(retrieve=retrieve)
+
+        got = registry.get_or_retrieve("urn:example:1")
+        assert got.value == resources["urn:example:1"]
+        assert registry.get_or_retrieve("urn:example:1").value is got.value
+        assert registry.get_or_retrieve("urn:example:1").value is got.value
+
+        got = registry.get_or_retrieve("urn:example:2")
+        assert got.value == resources["urn:example:2"]
+        assert registry.get_or_retrieve("urn:example:2").value is got.value
+        assert registry.get_or_retrieve("urn:example:2").value is got.value
+
+        # This still succeeds, but evicts the first URI
+        got = registry.get_or_retrieve("urn:example:3")
+        assert got.value == resources["urn:example:3"]
+        assert registry.get_or_retrieve("urn:example:3").value is got.value
+        assert registry.get_or_retrieve("urn:example:3").value is got.value
+
+        # And now this fails (as we popped the value out of `mapping`)
+        with pytest.raises(exceptions.Unretrievable):
+            registry.get_or_retrieve("urn:example:1")
valley/lib/python3.10/site-packages/referencing/typing.py ADDED
@@ -0,0 +1,63 @@
+"""
+Type-annotation related support for the referencing library.
+"""
+
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Protocol, TypeVar
+
+try:
+    from collections.abc import Mapping as Mapping
+
+    Mapping[str, str]
+except TypeError:  # pragma: no cover
+    from typing import Mapping as Mapping
+
+
+if TYPE_CHECKING:
+    from referencing._core import Resolved, Resolver, Resource
+
+#: A URI which identifies a `Resource`.
+URI = str
+
+#: The type of documents within a registry.
+D = TypeVar("D")
+
+
+class Retrieve(Protocol[D]):
+    """
+    A retrieval callable, usable within a `Registry` for resource retrieval.
+
+    Does not make assumptions about where the resource might be coming from.
+    """
+
+    def __call__(self, uri: URI) -> Resource[D]:
+        """
+        Retrieve the resource with the given URI.
+
+        Raise `referencing.exceptions.NoSuchResource` if you wish to indicate
+        the retriever cannot lookup the given URI.
+        """
+        ...
+
+
+class Anchor(Protocol[D]):
+    """
+    An anchor within a `Resource`.
+
+    Beyond "simple" anchors, some specifications like JSON Schema's 2020
+    version have dynamic anchors.
+    """
+
+    @property
+    def name(self) -> str:
+        """
+        Return the name of this anchor.
+        """
+        ...
+
+    def resolve(self, resolver: Resolver[D]) -> Resolved[D]:
+        """
+        Return the resource for this anchor.
+        """
+        ...
valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+pip
valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/LICENSE ADDED
@@ -0,0 +1,3 @@
+This software is made available under the terms of *either* of the
+licenses found in LICENSE.APACHE2 or LICENSE.MIT. Contributions to are
+made under the terms of *both* these licenses.
valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/LICENSE.APACHE2 ADDED
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/LICENSE.MIT ADDED
@@ -0,0 +1,20 @@
+ The MIT License (MIT)
+
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of this software and associated documentation files (the
+ "Software"), to deal in the Software without restriction, including
+ without limitation the rights to use, copy, modify, merge, publish,
+ distribute, sublicense, and/or sell copies of the Software, and to
+ permit persons to whom the Software is furnished to do so, subject to
+ the following conditions:
+
+ The above copyright notice and this permission notice shall be
+ included in all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+ NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+ LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+ WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/METADATA ADDED
@@ -0,0 +1,104 @@
+ Metadata-Version: 2.1
+ Name: sniffio
+ Version: 1.3.1
+ Summary: Sniff out which async library your code is running under
+ Author-email: "Nathaniel J. Smith" <njs@pobox.com>
+ License: MIT OR Apache-2.0
+ Project-URL: Homepage, https://github.com/python-trio/sniffio
+ Project-URL: Documentation, https://sniffio.readthedocs.io/
+ Project-URL: Changelog, https://sniffio.readthedocs.io/en/latest/history.html
+ Keywords: async,trio,asyncio
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: License :: OSI Approved :: Apache Software License
+ Classifier: Framework :: Trio
+ Classifier: Framework :: AsyncIO
+ Classifier: Operating System :: POSIX :: Linux
+ Classifier: Operating System :: MacOS :: MacOS X
+ Classifier: Operating System :: Microsoft :: Windows
+ Classifier: Programming Language :: Python :: 3 :: Only
+ Classifier: Programming Language :: Python :: Implementation :: CPython
+ Classifier: Programming Language :: Python :: Implementation :: PyPy
+ Classifier: Intended Audience :: Developers
+ Classifier: Development Status :: 5 - Production/Stable
+ Requires-Python: >=3.7
+ Description-Content-Type: text/x-rst
+ License-File: LICENSE
+ License-File: LICENSE.APACHE2
+ License-File: LICENSE.MIT
+
+ .. image:: https://img.shields.io/badge/chat-join%20now-blue.svg
+    :target: https://gitter.im/python-trio/general
+    :alt: Join chatroom
+
+ .. image:: https://img.shields.io/badge/docs-read%20now-blue.svg
+    :target: https://sniffio.readthedocs.io/en/latest/?badge=latest
+    :alt: Documentation Status
+
+ .. image:: https://img.shields.io/pypi/v/sniffio.svg
+    :target: https://pypi.org/project/sniffio
+    :alt: Latest PyPi version
+
+ .. image:: https://img.shields.io/conda/vn/conda-forge/sniffio.svg
+    :target: https://anaconda.org/conda-forge/sniffio
+    :alt: Latest conda-forge version
+
+ .. image:: https://travis-ci.org/python-trio/sniffio.svg?branch=master
+    :target: https://travis-ci.org/python-trio/sniffio
+    :alt: Automated test status
+
+ .. image:: https://codecov.io/gh/python-trio/sniffio/branch/master/graph/badge.svg
+    :target: https://codecov.io/gh/python-trio/sniffio
+    :alt: Test coverage
+
+ =================================================================
+ sniffio: Sniff out which async library your code is running under
+ =================================================================
+
+ You're writing a library. You've decided to be ambitious, and support
+ multiple async I/O packages, like `Trio
+ <https://trio.readthedocs.io>`__, and `asyncio
+ <https://docs.python.org/3/library/asyncio.html>`__, and ... You've
+ written a bunch of clever code to handle all the differences. But...
+ how do you know *which* piece of clever code to run?
+
+ This is a tiny package whose only purpose is to let you detect which
+ async library your code is running under.
+
+ * Documentation: https://sniffio.readthedocs.io
+
+ * Bug tracker and source code: https://github.com/python-trio/sniffio
+
+ * License: MIT or Apache License 2.0, your choice
+
+ * Contributor guide: https://trio.readthedocs.io/en/latest/contributing.html
+
+ * Code of conduct: Contributors are requested to follow our `code of
+   conduct
+   <https://trio.readthedocs.io/en/latest/code-of-conduct.html>`_
+   in all project spaces.
+
+ This library is maintained by the Trio project, as a service to the
+ async Python community as a whole.
+
+
+ Quickstart
+ ----------
+
+ .. code-block:: python3
+
+    from sniffio import current_async_library
+    import trio
+    import asyncio
+
+    async def print_library():
+        library = current_async_library()
+        print("This is:", library)
+
+    # Prints "This is trio"
+    trio.run(print_library)
+
+    # Prints "This is asyncio"
+    asyncio.run(print_library())
+
+ For more details, including how to add support to new async libraries,
+ `please peruse our fine manual <https://sniffio.readthedocs.io>`__.
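The quickstart above calls `current_async_library()` to ask which event loop is driving the current task. As a rough illustration of the underlying idea only (this is not sniffio's actual implementation, and `detect_async_library` is a hypothetical name), a detector limited to asyncio can be sketched with the standard library alone:

```python
import asyncio


def detect_async_library() -> str:
    # Simplified stand-in for sniffio.current_async_library():
    # asyncio exposes the running loop via get_running_loop(), which
    # raises RuntimeError when no loop is running in this thread.
    try:
        asyncio.get_running_loop()
        return "asyncio"
    except RuntimeError:
        raise RuntimeError("not running under a recognized async library")


async def main() -> str:
    return detect_async_library()


print(asyncio.run(main()))  # prints: asyncio
```

sniffio itself goes further, consulting a thread-local hook so that other frameworks (Trio, curio, ...) can register themselves, but the loop-probe above captures the asyncio branch of the idea.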
valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/RECORD ADDED
@@ -0,0 +1,20 @@
+ sniffio-1.3.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+ sniffio-1.3.1.dist-info/LICENSE,sha256=ZSyHhIjRRWNh4Iw_hgf9e6WYkqFBA9Fczk_5PIW1zIs,185
+ sniffio-1.3.1.dist-info/LICENSE.APACHE2,sha256=z8d0m5b2O9McPEK1xHG_dWgUBT6EfBDz6wA0F7xSPTA,11358
+ sniffio-1.3.1.dist-info/LICENSE.MIT,sha256=Pm2uVV65J4f8gtHUg1Vnf0VMf2Wus40_nnK_mj2vA0s,1046
+ sniffio-1.3.1.dist-info/METADATA,sha256=CzGLVwmO3sz1heYKiJprantcQIbzqapi7_dqHTzuEtk,3875
+ sniffio-1.3.1.dist-info/RECORD,,
+ sniffio-1.3.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ sniffio-1.3.1.dist-info/WHEEL,sha256=oiQVh_5PnQM0E3gPdiz09WCNmwiHDMaGer_elqB3coM,92
+ sniffio-1.3.1.dist-info/top_level.txt,sha256=v9UJXGs5CyddCVeAqXkQiWOrpp6Wtx6GeRrPt9-jjHg,8
+ sniffio/__init__.py,sha256=9WJEJlXu7yluP0YtI5SQ9M9OTQfbNHkadarK1vXGDPM,335
+ sniffio/__pycache__/__init__.cpython-310.pyc,,
+ sniffio/__pycache__/_impl.cpython-310.pyc,,
+ sniffio/__pycache__/_version.cpython-310.pyc,,
+ sniffio/_impl.py,sha256=UmUFMZpiuOrcjnuHhuYiYMxeCNWfqu9kBlaPf0xk6X8,2843
+ sniffio/_tests/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ sniffio/_tests/__pycache__/__init__.cpython-310.pyc,,
+ sniffio/_tests/__pycache__/test_sniffio.cpython-310.pyc,,
+ sniffio/_tests/test_sniffio.py,sha256=MMJZZJjQrUi95RANNM-a_55BZquA_gv4rHU1pevcTCM,2058
+ sniffio/_version.py,sha256=iVes5xwsHeRzQDexBaAhyx_taNt2ucfA7CWAo4QDt6Q,89
+ sniffio/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/REQUESTED ADDED
File without changes
valley/lib/python3.10/site-packages/sniffio-1.3.1.dist-info/WHEEL ADDED
@@ -0,0 +1,5 @@
+ Wheel-Version: 1.0
+ Generator: bdist_wheel (0.42.0)
+ Root-Is-Purelib: true
+ Tag: py3-none-any
+