michaelwaves committed on
Commit a8667ba · verified · 1 Parent(s): 48e124a

Add files using upload-large-folder tool

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/INSTALLER +1 -0
  2. lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/LICENSE +279 -0
  3. lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/METADATA +123 -0
  4. lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/RECORD +12 -0
  5. lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/REQUESTED +0 -0
  6. lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/WHEEL +4 -0
  7. lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/INSTALLER +1 -0
  8. lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/METADATA +295 -0
  9. lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/RECORD +9 -0
  10. lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/REQUESTED +0 -0
  11. lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/WHEEL +4 -0
  12. lib/python3.13/site-packages/attrs-25.4.0.dist-info/RECORD +37 -0
  13. lib/python3.13/site-packages/blinker/__init__.py +17 -0
  14. lib/python3.13/site-packages/blinker/_utilities.py +64 -0
  15. lib/python3.13/site-packages/blinker/base.py +512 -0
  16. lib/python3.13/site-packages/blinker/py.typed +0 -0
  17. lib/python3.13/site-packages/cachetools/__init__.py +718 -0
  18. lib/python3.13/site-packages/cachetools/_cached.py +247 -0
  19. lib/python3.13/site-packages/cachetools/_cachedmethod.py +128 -0
  20. lib/python3.13/site-packages/cachetools/func.py +102 -0
  21. lib/python3.13/site-packages/cachetools/keys.py +62 -0
  22. lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/INSTALLER +1 -0
  23. lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/METADATA +47 -0
  24. lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/RECORD +22 -0
  25. lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/REQUESTED +0 -0
  26. lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/WHEEL +5 -0
  27. lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/top_level.txt +1 -0
  28. lib/python3.13/site-packages/cycler-0.12.1.dist-info/LICENSE +27 -0
  29. lib/python3.13/site-packages/cycler-0.12.1.dist-info/METADATA +78 -0
  30. lib/python3.13/site-packages/cycler-0.12.1.dist-info/RECORD +9 -0
  31. lib/python3.13/site-packages/cycler-0.12.1.dist-info/REQUESTED +0 -0
  32. lib/python3.13/site-packages/cycler-0.12.1.dist-info/WHEEL +5 -0
  33. lib/python3.13/site-packages/datasets-4.4.1.dist-info/AUTHORS +8 -0
  34. lib/python3.13/site-packages/datasets-4.4.1.dist-info/INSTALLER +1 -0
  35. lib/python3.13/site-packages/datasets-4.4.1.dist-info/LICENSE +202 -0
  36. lib/python3.13/site-packages/datasets-4.4.1.dist-info/METADATA +375 -0
  37. lib/python3.13/site-packages/datasets-4.4.1.dist-info/RECORD +140 -0
  38. lib/python3.13/site-packages/datasets-4.4.1.dist-info/REQUESTED +0 -0
  39. lib/python3.13/site-packages/datasets-4.4.1.dist-info/WHEEL +5 -0
  40. lib/python3.13/site-packages/datasets-4.4.1.dist-info/entry_points.txt +2 -0
  41. lib/python3.13/site-packages/datasets-4.4.1.dist-info/top_level.txt +1 -0
  42. lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/INSTALLER +1 -0
  43. lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/LICENSE +21 -0
  44. lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/METADATA +226 -0
  45. lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/RECORD +13 -0
  46. lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/REQUESTED +0 -0
  47. lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/WHEEL +7 -0
  48. lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/top_level.txt +1 -0
  49. lib/python3.13/site-packages/flashinfer_python-0.4.1.dist-info/INSTALLER +1 -0
  50. lib/python3.13/site-packages/flashinfer_python-0.4.1.dist-info/METADATA +243 -0
lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ uv
lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/LICENSE ADDED
@@ -0,0 +1,279 @@
+ A. HISTORY OF THE SOFTWARE
+ ==========================
+
+ Python was created in the early 1990s by Guido van Rossum at Stichting
+ Mathematisch Centrum (CWI, see https://www.cwi.nl) in the Netherlands
+ as a successor of a language called ABC. Guido remains Python's
+ principal author, although it includes many contributions from others.
+
+ In 1995, Guido continued his work on Python at the Corporation for
+ National Research Initiatives (CNRI, see https://www.cnri.reston.va.us)
+ in Reston, Virginia where he released several versions of the
+ software.
+
+ In May 2000, Guido and the Python core development team moved to
+ BeOpen.com to form the BeOpen PythonLabs team. In October of the same
+ year, the PythonLabs team moved to Digital Creations, which became
+ Zope Corporation. In 2001, the Python Software Foundation (PSF, see
+ https://www.python.org/psf/) was formed, a non-profit organization
+ created specifically to own Python-related Intellectual Property.
+ Zope Corporation was a sponsoring member of the PSF.
+
+ All Python releases are Open Source (see https://opensource.org for
+ the Open Source Definition). Historically, most, but not all, Python
+ releases have also been GPL-compatible; the table below summarizes
+ the various releases.
+
+     Release         Derived     Year      Owner       GPL-
+                     from                              compatible? (1)
+
+     0.9.0 thru 1.2              1991-1995 CWI         yes
+     1.3 thru 1.5.2  1.2         1995-1999 CNRI        yes
+     1.6             1.5.2       2000      CNRI        no
+     2.0             1.6         2000      BeOpen.com  no
+     1.6.1           1.6         2001      CNRI        yes (2)
+     2.1             2.0+1.6.1   2001      PSF         no
+     2.0.1           2.0+1.6.1   2001      PSF         yes
+     2.1.1           2.1+2.0.1   2001      PSF         yes
+     2.1.2           2.1.1       2002      PSF         yes
+     2.1.3           2.1.2       2002      PSF         yes
+     2.2 and above   2.1.1       2001-now  PSF         yes
+
+ Footnotes:
+
+ (1) GPL-compatible doesn't mean that we're distributing Python under
+     the GPL. All Python licenses, unlike the GPL, let you distribute
+     a modified version without making your changes open source. The
+     GPL-compatible licenses make it possible to combine Python with
+     other software that is released under the GPL; the others don't.
+
+ (2) According to Richard Stallman, 1.6.1 is not GPL-compatible,
+     because its license has a choice of law clause. According to
+     CNRI, however, Stallman's lawyer has told CNRI's lawyer that 1.6.1
+     is "not incompatible" with the GPL.
+
+ Thanks to the many outside volunteers who have worked under Guido's
+ direction to make these releases possible.
+
+
+ B. TERMS AND CONDITIONS FOR ACCESSING OR OTHERWISE USING PYTHON
+ ===============================================================
+
+ Python software and documentation are licensed under the
+ Python Software Foundation License Version 2.
+
+ Starting with Python 3.8.6, examples, recipes, and other code in
+ the documentation are dual licensed under the PSF License Version 2
+ and the Zero-Clause BSD license.
+
+ Some software incorporated into Python is under different licenses.
+ The licenses are listed with code falling under that license.
+
+
+ PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
+ --------------------------------------------
+
+ 1. This LICENSE AGREEMENT is between the Python Software Foundation
+ ("PSF"), and the Individual or Organization ("Licensee") accessing and
+ otherwise using this software ("Python") in source or binary form and
+ its associated documentation.
+
+ 2. Subject to the terms and conditions of this License Agreement, PSF hereby
+ grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
+ analyze, test, perform and/or display publicly, prepare derivative works,
+ distribute, and otherwise use Python alone or in any derivative version,
+ provided, however, that PSF's License Agreement and PSF's notice of copyright,
+ i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
+ 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023 Python Software Foundation;
+ All Rights Reserved" are retained in Python alone or in any derivative version
+ prepared by Licensee.
+
+ 3. In the event Licensee prepares a derivative work that is based on
+ or incorporates Python or any part thereof, and wants to make
+ the derivative work available to others as provided herein, then
+ Licensee hereby agrees to include in any such work a brief summary of
+ the changes made to Python.
+
+ 4. PSF is making Python available to Licensee on an "AS IS"
+ basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
+ IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
+ DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
+ FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
+ INFRINGE ANY THIRD PARTY RIGHTS.
+
+ 5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
+ FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
+ A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
+ OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
+
+ 6. This License Agreement will automatically terminate upon a material
+ breach of its terms and conditions.
+
+ 7. Nothing in this License Agreement shall be deemed to create any
+ relationship of agency, partnership, or joint venture between PSF and
+ Licensee. This License Agreement does not grant permission to use PSF
+ trademarks or trade name in a trademark sense to endorse or promote
+ products or services of Licensee, or any third party.
+
+ 8. By copying, installing or otherwise using Python, Licensee
+ agrees to be bound by the terms and conditions of this License
+ Agreement.
+
+
+ BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0
+ -------------------------------------------
+
+ BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1
+
+ 1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an
+ office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the
+ Individual or Organization ("Licensee") accessing and otherwise using
+ this software in source or binary form and its associated
+ documentation ("the Software").
+
+ 2. Subject to the terms and conditions of this BeOpen Python License
+ Agreement, BeOpen hereby grants Licensee a non-exclusive,
+ royalty-free, world-wide license to reproduce, analyze, test, perform
+ and/or display publicly, prepare derivative works, distribute, and
+ otherwise use the Software alone or in any derivative version,
+ provided, however, that the BeOpen Python License is retained in the
+ Software, alone or in any derivative version prepared by Licensee.
+
+ 3. BeOpen is making the Software available to Licensee on an "AS IS"
+ basis. BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
+ IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND
+ DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
+ FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT
+ INFRINGE ANY THIRD PARTY RIGHTS.
+
+ 4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE
+ SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS
+ AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY
+ DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
+
+ 5. This License Agreement will automatically terminate upon a material
+ breach of its terms and conditions.
+
+ 6. This License Agreement shall be governed by and interpreted in all
+ respects by the law of the State of California, excluding conflict of
+ law provisions. Nothing in this License Agreement shall be deemed to
+ create any relationship of agency, partnership, or joint venture
+ between BeOpen and Licensee. This License Agreement does not grant
+ permission to use BeOpen trademarks or trade names in a trademark
+ sense to endorse or promote products or services of Licensee, or any
+ third party. As an exception, the "BeOpen Python" logos available at
+ http://www.pythonlabs.com/logos.html may be used according to the
+ permissions granted on that web page.
+
+ 7. By copying, installing or otherwise using the software, Licensee
+ agrees to be bound by the terms and conditions of this License
+ Agreement.
+
+
+ CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1
+ ---------------------------------------
+
+ 1. This LICENSE AGREEMENT is between the Corporation for National
+ Research Initiatives, having an office at 1895 Preston White Drive,
+ Reston, VA 20191 ("CNRI"), and the Individual or Organization
+ ("Licensee") accessing and otherwise using Python 1.6.1 software in
+ source or binary form and its associated documentation.
+
+ 2. Subject to the terms and conditions of this License Agreement, CNRI
+ hereby grants Licensee a nonexclusive, royalty-free, world-wide
+ license to reproduce, analyze, test, perform and/or display publicly,
+ prepare derivative works, distribute, and otherwise use Python 1.6.1
+ alone or in any derivative version, provided, however, that CNRI's
+ License Agreement and CNRI's notice of copyright, i.e., "Copyright (c)
+ 1995-2001 Corporation for National Research Initiatives; All Rights
+ Reserved" are retained in Python 1.6.1 alone or in any derivative
+ version prepared by Licensee. Alternately, in lieu of CNRI's License
+ Agreement, Licensee may substitute the following text (omitting the
+ quotes): "Python 1.6.1 is made available subject to the terms and
+ conditions in CNRI's License Agreement. This Agreement together with
+ Python 1.6.1 may be located on the internet using the following
+ unique, persistent identifier (known as a handle): 1895.22/1013. This
+ Agreement may also be obtained from a proxy server on the internet
+ using the following URL: http://hdl.handle.net/1895.22/1013".
+
+ 3. In the event Licensee prepares a derivative work that is based on
+ or incorporates Python 1.6.1 or any part thereof, and wants to make
+ the derivative work available to others as provided herein, then
+ Licensee hereby agrees to include in any such work a brief summary of
+ the changes made to Python 1.6.1.
+
+ 4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS"
+ basis. CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
+ IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND
+ DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
+ FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT
+ INFRINGE ANY THIRD PARTY RIGHTS.
+
+ 5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
+ 1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
+ A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1,
+ OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
+
+ 6. This License Agreement will automatically terminate upon a material
+ breach of its terms and conditions.
+
+ 7. This License Agreement shall be governed by the federal
+ intellectual property law of the United States, including without
+ limitation the federal copyright law, and, to the extent such
+ U.S. federal law does not apply, by the law of the Commonwealth of
+ Virginia, excluding Virginia's conflict of law provisions.
+ Notwithstanding the foregoing, with regard to derivative works based
+ on Python 1.6.1 that incorporate non-separable material that was
+ previously distributed under the GNU General Public License (GPL), the
+ law of the Commonwealth of Virginia shall govern this License
+ Agreement only as to issues arising under or with respect to
+ Paragraphs 4, 5, and 7 of this License Agreement. Nothing in this
+ License Agreement shall be deemed to create any relationship of
+ agency, partnership, or joint venture between CNRI and Licensee. This
+ License Agreement does not grant permission to use CNRI trademarks or
+ trade name in a trademark sense to endorse or promote products or
+ services of Licensee, or any third party.
+
+ 8. By clicking on the "ACCEPT" button where indicated, or by copying,
+ installing or otherwise using Python 1.6.1, Licensee agrees to be
+ bound by the terms and conditions of this License Agreement.
+
+ ACCEPT
+
+
+ CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2
+ --------------------------------------------------
+
+ Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam,
+ The Netherlands. All rights reserved.
+
+ Permission to use, copy, modify, and distribute this software and its
+ documentation for any purpose and without fee is hereby granted,
+ provided that the above copyright notice appear in all copies and that
+ both that copyright notice and this permission notice appear in
+ supporting documentation, and that the name of Stichting Mathematisch
+ Centrum or CWI not be used in advertising or publicity pertaining to
+ distribution of the software without specific, written prior
+ permission.
+
+ STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
+ THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
+ FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
+ FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
+ OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+
+ ZERO-CLAUSE BSD LICENSE FOR CODE IN THE PYTHON DOCUMENTATION
+ ----------------------------------------------------------------------
+
+ Permission to use, copy, modify, and/or distribute this software for any
+ purpose with or without fee is hereby granted.
+
+ THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
+ REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
+ AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
+ INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
+ LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
+ OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
+ PERFORMANCE OF THIS SOFTWARE.
lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/METADATA ADDED
@@ -0,0 +1,123 @@
+ Metadata-Version: 2.3
+ Name: aiohappyeyeballs
+ Version: 2.6.1
+ Summary: Happy Eyeballs for asyncio
+ License: PSF-2.0
+ Author: J. Nick Koston
+ Author-email: nick@koston.org
+ Requires-Python: >=3.9
+ Classifier: Development Status :: 5 - Production/Stable
+ Classifier: Intended Audience :: Developers
+ Classifier: Natural Language :: English
+ Classifier: Operating System :: OS Independent
+ Classifier: Topic :: Software Development :: Libraries
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
+ Classifier: License :: OSI Approved :: Python Software Foundation License
+ Project-URL: Bug Tracker, https://github.com/aio-libs/aiohappyeyeballs/issues
+ Project-URL: Changelog, https://github.com/aio-libs/aiohappyeyeballs/blob/main/CHANGELOG.md
+ Project-URL: Documentation, https://aiohappyeyeballs.readthedocs.io
+ Project-URL: Repository, https://github.com/aio-libs/aiohappyeyeballs
+ Description-Content-Type: text/markdown
+
+ # aiohappyeyeballs
+
+ <p align="center">
+ <a href="https://github.com/aio-libs/aiohappyeyeballs/actions/workflows/ci.yml?query=branch%3Amain">
+ <img src="https://img.shields.io/github/actions/workflow/status/aio-libs/aiohappyeyeballs/ci-cd.yml?branch=main&label=CI&logo=github&style=flat-square" alt="CI Status" >
+ </a>
+ <a href="https://aiohappyeyeballs.readthedocs.io">
+ <img src="https://img.shields.io/readthedocs/aiohappyeyeballs.svg?logo=read-the-docs&logoColor=fff&style=flat-square" alt="Documentation Status">
+ </a>
+ <a href="https://codecov.io/gh/aio-libs/aiohappyeyeballs">
+ <img src="https://img.shields.io/codecov/c/github/aio-libs/aiohappyeyeballs.svg?logo=codecov&logoColor=fff&style=flat-square" alt="Test coverage percentage">
+ </a>
+ </p>
+ <p align="center">
+ <a href="https://python-poetry.org/">
+ <img src="https://img.shields.io/badge/packaging-poetry-299bd7?style=flat-square&logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAA4AAAASCAYAAABrXO8xAAAACXBIWXMAAAsTAAALEwEAmpwYAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAJJSURBVHgBfZLPa1NBEMe/s7tNXoxW1KJQKaUHkXhQvHgW6UHQQ09CBS/6V3hKc/AP8CqCrUcpmop3Cx48eDB4yEECjVQrlZb80CRN8t6OM/teagVxYZi38+Yz853dJbzoMV3MM8cJUcLMSUKIE8AzQ2PieZzFxEJOHMOgMQQ+dUgSAckNXhapU/NMhDSWLs1B24A8sO1xrN4NECkcAC9ASkiIJc6k5TRiUDPhnyMMdhKc+Zx19l6SgyeW76BEONY9exVQMzKExGKwwPsCzza7KGSSWRWEQhyEaDXp6ZHEr416ygbiKYOd7TEWvvcQIeusHYMJGhTwF9y7sGnSwaWyFAiyoxzqW0PM/RjghPxF2pWReAowTEXnDh0xgcLs8l2YQmOrj3N7ByiqEoH0cARs4u78WgAVkoEDIDoOi3AkcLOHU60RIg5wC4ZuTC7FaHKQm8Hq1fQuSOBvX/sodmNJSB5geaF5CPIkUeecdMxieoRO5jz9bheL6/tXjrwCyX/UYBUcjCaWHljx1xiX6z9xEjkYAzbGVnB8pvLmyXm9ep+W8CmsSHQQY77Zx1zboxAV0w7ybMhQmfqdmmw3nEp1I0Z+FGO6M8LZdoyZnuzzBdjISicKRnpxzI9fPb+0oYXsNdyi+d3h9bm9MWYHFtPeIZfLwzmFDKy1ai3p+PDls1Llz4yyFpferxjnyjJDSEy9CaCx5m2cJPerq6Xm34eTrZt3PqxYO1XOwDYZrFlH1fWnpU38Y9HRze3lj0vOujZcXKuuXm3jP+s3KbZVra7y2EAAAAAASUVORK5CYII=" alt="Poetry">
+ </a>
+ <a href="https://github.com/astral-sh/ruff">
+ <img src="https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json" alt="Ruff">
+ </a>
+ <a href="https://github.com/pre-commit/pre-commit">
+ <img src="https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white&style=flat-square" alt="pre-commit">
+ </a>
+ </p>
+ <p align="center">
+ <a href="https://pypi.org/project/aiohappyeyeballs/">
+ <img src="https://img.shields.io/pypi/v/aiohappyeyeballs.svg?logo=python&logoColor=fff&style=flat-square" alt="PyPI Version">
+ </a>
+ <img src="https://img.shields.io/pypi/pyversions/aiohappyeyeballs.svg?style=flat-square&logo=python&amp;logoColor=fff" alt="Supported Python versions">
+ <img src="https://img.shields.io/pypi/l/aiohappyeyeballs.svg?style=flat-square" alt="License">
+ </p>
+
+ ---
+
+ **Documentation**: <a href="https://aiohappyeyeballs.readthedocs.io" target="_blank">https://aiohappyeyeballs.readthedocs.io </a>
+
+ **Source Code**: <a href="https://github.com/aio-libs/aiohappyeyeballs" target="_blank">https://github.com/aio-libs/aiohappyeyeballs </a>
+
+ ---
+
+ [Happy Eyeballs](https://en.wikipedia.org/wiki/Happy_Eyeballs)
+ ([RFC 8305](https://www.rfc-editor.org/rfc/rfc8305.html))
+
+ ## Use case
+
+ This library exists to allow connecting with
+ [Happy Eyeballs](https://en.wikipedia.org/wiki/Happy_Eyeballs)
+ ([RFC 8305](https://www.rfc-editor.org/rfc/rfc8305.html))
+ when you
+ already have a list of addrinfo and not a DNS name.
+
+ The stdlib version of `loop.create_connection()`
+ will only work when you pass in an unresolved name which
+ is not a good fit when using DNS caching or resolving
+ names via another method such as `zeroconf`.
+
+ ## Installation
+
+ Install this via pip (or your favourite package manager):
+
+ `pip install aiohappyeyeballs`
+
+ ## License
+
+ [aiohappyeyeballs is licensed under the same terms as cpython itself.](https://github.com/python/cpython/blob/main/LICENSE)
+
+ ## Example usage
+
+ ```python
+
+ addr_infos = await loop.getaddrinfo("example.org", 80)
+
+ socket = await start_connection(addr_infos)
+ socket = await start_connection(addr_infos, local_addr_infos=local_addr_infos, happy_eyeballs_delay=0.2)
+
+ transport, protocol = await loop.create_connection(
+ MyProtocol, sock=socket, ...)
+
+ # Remove the first address for each family from addr_info
+ pop_addr_infos_interleave(addr_info, 1)
+
+ # Remove all matching address from addr_info
+ remove_addr_infos(addr_info, "dead::beef::")
+
+ # Convert a local_addr to local_addr_infos
+ local_addr_infos = addr_to_addr_infos(("127.0.0.1",0))
+ ```
+
+ ## Credits
+
+ This package contains code from cpython and is licensed under the same terms as cpython itself.
+
+ This package was created with
+ [Copier](https://copier.readthedocs.io/) and the
+ [browniebroke/pypackage-template](https://github.com/browniebroke/pypackage-template)
+ project template.
+
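The package's README example above assumes an already-running event loop and the library's own `start_connection()`. As an illustrative aside, the core idea the library implements — the staggered-attempt race of Happy Eyeballs (RFC 8305) — can be sketched with stdlib `asyncio` alone. This is a simplified sketch, not the library's actual implementation: the fake `attempt()` coroutine stands in for a real TCP connect.

```python
import asyncio

async def attempt(name: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stand-in for a real TCP connection attempt
    return name

async def staggered_race(attempts, stagger: float):
    """Start each attempt `stagger` seconds after the previous one;
    return the first to finish and cancel the rest (the RFC 8305 idea)."""
    tasks = []
    try:
        for name, delay in attempts:
            # Launch the next candidate without waiting for earlier ones to fail.
            tasks.append(asyncio.ensure_future(attempt(name, delay)))
            done, _ = await asyncio.wait(
                tasks, timeout=stagger, return_when=asyncio.FIRST_COMPLETED)
            if done:
                return done.pop().result()
        # All candidates launched; wait for whichever finishes first.
        done, _ = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
        return done.pop().result()
    finally:
        for t in tasks:
            t.cancel()

# A slow "IPv6" attempt is started first, but the quicker "IPv4" attempt,
# started 0.1 s later, finishes first and wins the race.
winner = asyncio.run(staggered_race(
    [("ipv6-addr", 0.5), ("ipv4-addr", 0.05)], stagger=0.1))
print(winner)
```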
lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/RECORD ADDED
@@ -0,0 +1,12 @@
+ aiohappyeyeballs-2.6.1.dist-info/INSTALLER,sha256=5hhM4Q4mYTT9z6QB6PGpUAW81PGNFrYrdXMj4oM_6ak,2
+ aiohappyeyeballs-2.6.1.dist-info/LICENSE,sha256=Oy-B_iHRgcSZxZolbI4ZaEVdZonSaaqFNzv7avQdo78,13936
+ aiohappyeyeballs-2.6.1.dist-info/METADATA,sha256=NSXlhJwAfi380eEjAo7BQ4P_TVal9xi0qkyZWibMsVM,5915
+ aiohappyeyeballs-2.6.1.dist-info/RECORD,,
+ aiohappyeyeballs-2.6.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ aiohappyeyeballs-2.6.1.dist-info/WHEEL,sha256=XbeZDeTWKc1w7CSIyre5aMDU_-PohRwTQceYnisIYYY,88
+ aiohappyeyeballs/__init__.py,sha256=x7kktHEtaD9quBcWDJPuLeKyjuVAI-Jj14S9B_5hcTs,361
+ aiohappyeyeballs/_staggered.py,sha256=edfVowFx-P-ywJjIEF3MdPtEMVODujV6CeMYr65otac,6900
+ aiohappyeyeballs/impl.py,sha256=Dlcm2mTJ28ucrGnxkb_fo9CZzLAkOOBizOt7dreBbXE,9681
+ aiohappyeyeballs/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ aiohappyeyeballs/types.py,sha256=YZJIAnyoV4Dz0WFtlaf_OyE4EW7Xus1z7aIfNI6tDDQ,425
+ aiohappyeyeballs/utils.py,sha256=on9GxIR0LhEfZu8P6Twi9hepX9zDanuZM20MWsb3xlQ,3028
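Each RECORD entry above encodes the file's path, its SHA-256 digest as URL-safe base64 with padding stripped, and its size in bytes (per the wheel RECORD format). A minimal sketch of that encoding, which reproduces the well-known empty-file hash seen on the `REQUESTED` and `py.typed` entries:

```python
import base64
import hashlib

def record_hash(data: bytes) -> str:
    """Encode data's SHA-256 digest the way wheel RECORD files do:
    URL-safe base64 with the trailing '=' padding removed."""
    digest = hashlib.sha256(data).digest()
    return "sha256=" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# An empty file (size 0) always yields this digest, matching the
# REQUESTED and py.typed lines in the RECORD above.
print(record_hash(b""))  # sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU
```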
lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/REQUESTED ADDED
File without changes
lib/python3.13/site-packages/aiohappyeyeballs-2.6.1.dist-info/WHEEL ADDED
@@ -0,0 +1,4 @@
+ Wheel-Version: 1.0
+ Generator: poetry-core 2.1.1
+ Root-Is-Purelib: true
+ Tag: py3-none-any
lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ uv
lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/METADATA ADDED
@@ -0,0 +1,295 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ Metadata-Version: 2.3
2
+ Name: annotated-types
3
+ Version: 0.7.0
4
+ Summary: Reusable constraint types to use with typing.Annotated
5
+ Project-URL: Homepage, https://github.com/annotated-types/annotated-types
6
+ Project-URL: Source, https://github.com/annotated-types/annotated-types
7
+ Project-URL: Changelog, https://github.com/annotated-types/annotated-types/releases
8
+ Author-email: Adrian Garcia Badaracco <1755071+adriangb@users.noreply.github.com>, Samuel Colvin <s@muelcolvin.com>, Zac Hatfield-Dodds <zac@zhd.dev>
9
+ License-File: LICENSE
10
+ Classifier: Development Status :: 4 - Beta
11
+ Classifier: Environment :: Console
12
+ Classifier: Environment :: MacOS X
13
+ Classifier: Intended Audience :: Developers
14
+ Classifier: Intended Audience :: Information Technology
15
+ Classifier: License :: OSI Approved :: MIT License
16
+ Classifier: Operating System :: POSIX :: Linux
17
+ Classifier: Operating System :: Unix
18
+ Classifier: Programming Language :: Python :: 3 :: Only
19
+ Classifier: Programming Language :: Python :: 3.8
20
+ Classifier: Programming Language :: Python :: 3.9
21
+ Classifier: Programming Language :: Python :: 3.10
22
+ Classifier: Programming Language :: Python :: 3.11
23
+ Classifier: Programming Language :: Python :: 3.12
24
+ Classifier: Topic :: Software Development :: Libraries :: Python Modules
25
+ Classifier: Typing :: Typed
26
+ Requires-Python: >=3.8
27
+ Requires-Dist: typing-extensions>=4.0.0; python_version < '3.9'
28
+ Description-Content-Type: text/markdown
29
+
30
+ # annotated-types
31
+
32
+ [![CI](https://github.com/annotated-types/annotated-types/workflows/CI/badge.svg?event=push)](https://github.com/annotated-types/annotated-types/actions?query=event%3Apush+branch%3Amain+workflow%3ACI)
33
+ [![pypi](https://img.shields.io/pypi/v/annotated-types.svg)](https://pypi.python.org/pypi/annotated-types)
34
+ [![versions](https://img.shields.io/pypi/pyversions/annotated-types.svg)](https://github.com/annotated-types/annotated-types)
35
+ [![license](https://img.shields.io/github/license/annotated-types/annotated-types.svg)](https://github.com/annotated-types/annotated-types/blob/main/LICENSE)
36
+
37
+ [PEP-593](https://peps.python.org/pep-0593/) added `typing.Annotated` as a way of
38
+ adding context-specific metadata to existing types, and specifies that
39
+ `Annotated[T, x]` _should_ be treated as `T` by any tool or library without special
40
+ logic for `x`.
41
+
42
+ This package provides metadata objects which can be used to represent common
43
+ constraints such as upper and lower bounds on scalar values and collection sizes,
44
+ a `Predicate` marker for runtime checks, and
45
+ descriptions of how we intend these metadata to be interpreted. In some cases,
46
+ we also note alternative representations which do not require this package.
47
+
48
+ ## Install
49
+
50
+ ```bash
51
+ pip install annotated-types
52
+ ```
53
+
54
+ ## Examples
55
+
56
+ ```python
57
+ from typing import Annotated
58
+ from annotated_types import Gt, Len, Predicate
59
+
60
+ class MyClass:
61
+ age: Annotated[int, Gt(18)] # Valid: 19, 20, ...
62
+ # Invalid: 17, 18, "19", 19.0, ...
63
+ factors: list[Annotated[int, Predicate(is_prime)]] # Valid: 2, 3, 5, 7, 11, ...
64
+ # Invalid: 4, 8, -2, 5.0, "prime", ...
65
+
66
+ my_list: Annotated[list[int], Len(0, 10)] # Valid: [], [10, 20, 30, 40, 50]
67
+ # Invalid: (1, 2), ["abc"], [0] * 20
68
+ ```
69
+
70
+ ## Documentation
71
+
72
+ _While `annotated-types` avoids runtime checks for performance, users should not
73
+ construct invalid combinations such as `MultipleOf("non-numeric")` or `Annotated[int, Len(3)]`.
74
+ Downstream implementors may choose to raise an error, emit a warning, silently ignore
75
+ a metadata item, etc., if the metadata objects described below are used with an
76
+ incompatible type - or for any other reason!_
77
+
78
+ ### Gt, Ge, Lt, Le
79
+
80
+ Express inclusive and/or exclusive bounds on orderable values - which may be numbers,
81
+ dates, times, strings, sets, etc. Note that the boundary value need not be of the
82
+ same type that was annotated, so long as they can be compared: `Annotated[int, Gt(1.5)]`
83
+ is fine, for example, and implies that the value is an integer x such that `x > 1.5`.
84
+
85
+ We suggest that implementors may also interpret `functools.partial(operator.le, 1.5)`
86
+ as being equivalent to `Ge(1.5)`, for users who wish to avoid a runtime dependency on
87
+ the `annotated-types` package.
88
+
89
+ To be explicit, these types have the following meanings:
90
+
91
+ * `Gt(x)` - value must be "Greater Than" `x` - equivalent to exclusive minimum
92
+ * `Ge(x)` - value must be "Greater than or Equal" to `x` - equivalent to inclusive minimum
93
+ * `Lt(x)` - value must be "Less Than" `x` - equivalent to exclusive maximum
94
+ * `Le(x)` - value must be "Less than or Equal" to `x` - equivalent to inclusive maximum
95
+
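As a sketch of how a downstream library might consume these bounds, the illustrative `check_bounds` helper below (not part of this package) inspects `Annotated` metadata and applies any of the four constraints it finds:

```python
from typing import Annotated, get_args, get_origin

from annotated_types import Ge, Gt, Le, Lt


def check_bounds(tp: object, value) -> bool:
    """Return True if `value` satisfies every Gt/Ge/Lt/Le item on `tp`."""
    if get_origin(tp) is not Annotated:
        return True
    for meta in get_args(tp)[1:]:
        if isinstance(meta, Gt) and not value > meta.gt:
            return False
        if isinstance(meta, Ge) and not value >= meta.ge:
            return False
        if isinstance(meta, Lt) and not value < meta.lt:
            return False
        if isinstance(meta, Le) and not value <= meta.le:
            return False
    return True


Age = Annotated[int, Gt(18)]
```

Here `check_bounds(Age, 19)` is `True` while `check_bounds(Age, 18)` is `False`, matching the exclusive-minimum semantics of `Gt`.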
96
+ ### Interval
97
+
98
+ `Interval(gt, ge, lt, le)` allows you to specify an upper and lower bound with a single
99
+ metadata object. `None` attributes should be ignored, and non-`None` attributes
100
+ treated as per the single bounds above.
101
+
102
+ ### MultipleOf
103
+
104
+ `MultipleOf(multiple_of=x)` might be interpreted in two ways:
105
+
106
+ 1. Python semantics, implying `value % multiple_of == 0`, or
107
+ 2. [JSONschema semantics](https://json-schema.org/draft/2020-12/json-schema-validation.html#rfc.section.6.2.1),
108
+ where `int(value / multiple_of) == value / multiple_of`.
109
+
110
+ We encourage users to be aware of these two common interpretations and their
111
+ distinct behaviours, especially since very large or non-integer numbers make
112
+ it easy to cause silent data corruption due to floating-point imprecision.
113
+
114
+ We encourage libraries to carefully document which interpretation they implement.
115
+
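The two interpretations really do diverge under floating-point arithmetic. For example, with `value = 1.0` and `multiple_of = 0.1`, Python's `%` reports a nonzero remainder while the JSON Schema division test passes:

```python
from annotated_types import MultipleOf

m = MultipleOf(multiple_of=0.1)
value = 1.0

# Interpretation 1: Python semantics. 1.0 % 0.1 is ~0.0999..., not 0.0,
# because 0.1 cannot be represented exactly in binary floating point.
python_ok = value % m.multiple_of == 0

# Interpretation 2: JSON Schema semantics. 1.0 / 0.1 rounds to exactly
# 10.0, so the truncation test passes.
jsonschema_ok = int(value / m.multiple_of) == value / m.multiple_of

print(python_ok, jsonschema_ok)  # False True
```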
116
+ ### MinLen, MaxLen, Len
117
+
118
+ `Len()` implies that `min_length <= len(value) <= max_length` - lower and upper bounds are inclusive.
119
+
120
+ As well as `Len()` which can optionally include upper and lower bounds, we also
121
+ provide `MinLen(x)` and `MaxLen(y)` which are equivalent to `Len(min_length=x)`
122
+ and `Len(max_length=y)` respectively.
123
+
124
+ `Len`, `MinLen`, and `MaxLen` may be used with any type which supports `len(value)`.
125
+
126
+ Examples of usage:
127
+
128
+ * `Annotated[list, MaxLen(10)]` (or `Annotated[list, Len(max_length=10)]`) - list must have a length of 10 or less
129
+ * `Annotated[str, MaxLen(10)]` - string must have a length of 10 or less
130
+ * `Annotated[list, MinLen(3)]` (or `Annotated[list, Len(min_length=3)]`) - list must have a length of 3 or more
131
+ * `Annotated[list, Len(4, 6)]` - list must have a length of 4, 5, or 6
132
+ * `Annotated[list, Len(8, 8)]` - list must have a length of exactly 8
133
+
134
+ #### Changed in v0.4.0
135
+
136
+ * `min_inclusive` has been renamed to `min_length`, no change in meaning
137
+ * `max_exclusive` has been renamed to `max_length`, upper bound is now **inclusive** instead of **exclusive**
138
+ * The recommendation that slices are interpreted as `Len` has been removed due to ambiguity and different semantic
139
+ meaning of the upper bound in slices vs. `Len`
140
+
141
+ See [issue #23](https://github.com/annotated-types/annotated-types/issues/23) for discussion.
142
+
143
+ ### Timezone
144
+
145
+ `Timezone` can be used with a `datetime` or a `time` to express which timezones
146
+ are allowed. `Annotated[datetime, Timezone(None)]` must be a naive datetime.
147
+ `Timezone[...]` ([literal ellipsis](https://docs.python.org/3/library/constants.html#Ellipsis))
148
+ expresses that any timezone-aware datetime is allowed. You may also pass a specific
149
+ timezone string or [`tzinfo`](https://docs.python.org/3/library/datetime.html#tzinfo-objects)
150
+ object such as `Timezone(timezone.utc)` or `Timezone("Africa/Abidjan")` to express that you only
151
+ allow a specific timezone, though we note that this is often a symptom of fragile design.
152
+
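A sketch of how the three forms might be interpreted. The `check_timezone` helper is illustrative; it assumes `Timezone` stores its argument on a `tz` attribute:

```python
from datetime import datetime, timezone
from typing import Annotated, get_args

from annotated_types import Timezone

NaiveDatetime = Annotated[datetime, Timezone(None)]
AwareDatetime = Annotated[datetime, Timezone(...)]


def check_timezone(tp: object, value: datetime) -> bool:
    """Return True if `value` satisfies every Timezone item on `tp`."""
    for meta in get_args(tp)[1:]:
        if isinstance(meta, Timezone):
            if meta.tz is None and value.tzinfo is not None:
                return False  # naive required, aware value given
            if meta.tz is ... and value.tzinfo is None:
                return False  # aware required, naive value given
    return True
```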
153
+ #### Changed in v0.x.x
154
+
155
+ * `Timezone` accepts [`tzinfo`](https://docs.python.org/3/library/datetime.html#tzinfo-objects) objects instead of
156
+ `timezone`, extending compatibility to [`zoneinfo`](https://docs.python.org/3/library/zoneinfo.html) and third party libraries.
157
+
158
+ ### Unit
159
+
160
+ `Unit(unit: str)` expresses that the annotated numeric value is the magnitude of
161
+ a quantity with the specified unit. For example, `Annotated[float, Unit("m/s")]`
162
+ would be a float representing a velocity in meters per second.
163
+
164
+ Please note that `annotated_types` itself makes no attempt to parse or validate
165
+ the unit string in any way. That is left entirely to downstream libraries,
166
+ such as [`pint`](https://pint.readthedocs.io) or
167
+ [`astropy.units`](https://docs.astropy.org/en/stable/units/).
168
+
169
+ An example of how a library might use this metadata:
170
+
171
+ ```python
172
+ from annotated_types import Unit
173
+ from typing import Annotated, TypeVar, Callable, Any, get_origin, get_args
174
+
175
+ # given a type annotated with a unit:
176
+ Meters = Annotated[float, Unit("m")]
177
+
178
+
179
+ # you can cast the annotation to a specific unit type with any
180
+ # callable that accepts a string and returns the desired type
181
+ T = TypeVar("T")
182
+ def cast_unit(tp: Any, unit_cls: Callable[[str], T]) -> T | None:
183
+ if get_origin(tp) is Annotated:
184
+ for arg in get_args(tp):
185
+ if isinstance(arg, Unit):
186
+ return unit_cls(arg.unit)
187
+ return None
188
+
189
+
190
+ # using `pint`
191
+ import pint
192
+ pint_unit = cast_unit(Meters, pint.Unit)
193
+
194
+
195
+ # using `astropy.units`
196
+ import astropy.units as u
197
+ astropy_unit = cast_unit(Meters, u.Unit)
198
+ ```
199
+
200
+ ### Predicate
201
+
202
+ `Predicate(func: Callable)` expresses that `func(value)` is truthy for valid values.
203
+ Users should prefer the statically inspectable metadata above, but if you need
204
+ the full power and flexibility of arbitrary runtime predicates... here it is.
205
+
206
+ For some common constraints, we provide generic types:
207
+
208
+ * `IsLower = Annotated[T, Predicate(str.islower)]`
209
+ * `IsUpper = Annotated[T, Predicate(str.isupper)]`
210
+ * `IsDigit = Annotated[T, Predicate(str.isdigit)]`
211
+ * `IsFinite = Annotated[T, Predicate(math.isfinite)]`
212
+ * `IsNotFinite = Annotated[T, Predicate(Not(math.isfinite))]`
213
+ * `IsNan = Annotated[T, Predicate(math.isnan)]`
214
+ * `IsNotNan = Annotated[T, Predicate(Not(math.isnan))]`
215
+ * `IsInfinite = Annotated[T, Predicate(math.isinf)]`
216
+ * `IsNotInfinite = Annotated[T, Predicate(Not(math.isinf))]`
217
+
218
+ so that you can write e.g. `x: IsFinite[float] = 2.0` instead of the longer
219
+ (but exactly equivalent) `x: Annotated[float, Predicate(math.isfinite)] = 2.0`.
220
+
221
+ Some libraries might have special logic to handle known or understandable predicates,
222
+ for example by checking for `str.isdigit` and using its presence to both call custom
223
+ logic to enforce digit-only strings, and customise some generated external schema.
224
+ Users are therefore encouraged to avoid indirection like `lambda s: s.lower()`, in
225
+ favor of introspectable methods such as `str.lower` or `re.compile("pattern").search`.
226
+
227
+ To enable basic negation of commonly used predicates like `math.isnan` without resorting to opaque wrappers that implementers cannot introspect, we provide a `Not` wrapper that negates the predicate in an introspectable manner. Several of the predicates listed above are created in this way.
228
+
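For example, `Not` keeps the wrapped callable reachable on its `func` attribute, so an implementer can still recognise the underlying predicate:

```python
import math

from annotated_types import Not, Predicate

p = Predicate(Not(math.isnan))

# Unlike `lambda x: not math.isnan(x)`, the wrapper is inspectable:
# implementers can see both the negation and the underlying function.
print(isinstance(p.func, Not), p.func.func is math.isnan)

# Calling the wrapper applies the negation.
print(p.func(2.0), p.func(math.nan))
```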
229
+ We do not specify what behaviour should be expected for predicates that raise
230
+ an exception. For example `Annotated[int, Predicate(str.isdigit)]` might silently
231
+ skip invalid constraints, or statically raise an error; or it might try calling it
232
+ and then propagate or discard the resulting
233
+ `TypeError: descriptor 'isdigit' for 'str' objects doesn't apply to a 'int' object`
234
+ exception. We encourage libraries to document the behaviour they choose.
235
+
236
+ ### Doc
237
+
238
+ `doc()` can be used to add documentation information in `Annotated`, for function and method parameters, variables, class attributes, return types, and any place where `Annotated` can be used.
239
+
240
+ It expects a value that can be statically analyzed, as the main use case is for static analysis, editors, documentation generators, and similar tools.
241
+
242
+ It returns a `DocInfo` class with a single attribute `documentation` containing the value passed to `doc()`.
243
+
244
+ This is the early adopter's alternative form of the [`typing-doc` proposal](https://github.com/tiangolo/fastapi/blob/typing-doc/typing_doc.md).
245
+
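A minimal example, assuming `doc` and `DocInfo` are imported from this package:

```python
from typing import Annotated

from annotated_types import DocInfo, doc

UserName = Annotated[str, doc("The user's unique handle.")]

# The DocInfo object ends up in the Annotated metadata, where tools such
# as editors or schema generators can find it.
info = UserName.__metadata__[0]
print(isinstance(info, DocInfo), info.documentation)
```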
246
+ ### Integrating downstream types with `GroupedMetadata`
247
+
248
+ Implementers may choose to provide a convenience wrapper that groups multiple pieces of metadata.
249
+ This can help reduce verbosity and cognitive overhead for users.
250
+ For example, an implementer like Pydantic might provide a `Field` or `Meta` type that accepts keyword arguments and transforms these into low-level metadata:
251
+
252
+ ```python
253
+ from dataclasses import dataclass
254
+ from typing import Iterator
255
+ from annotated_types import GroupedMetadata, Ge
256
+
257
+ @dataclass
258
+ class Field(GroupedMetadata):
259
+ ge: int | None = None
260
+ description: str | None = None
261
+
262
+ def __iter__(self) -> Iterator[object]:
263
+ # Iterating over a GroupedMetadata object should yield annotated-types
264
+ # constraint metadata objects which describe it as fully as possible,
265
+ # and may include other unknown objects too.
266
+ if self.ge is not None:
267
+ yield Ge(self.ge)
268
+ if self.description is not None:
+ # `Description` is a hypothetical downstream metadata type
+ yield Description(self.description)
270
+ ```
271
+
272
+ Libraries consuming annotated-types constraints should check for `GroupedMetadata` and unpack it by iterating over the object and treating the results as if they had been "unpacked" in the `Annotated` type. The same logic should be applied to the [PEP 646 `Unpack` type](https://peps.python.org/pep-0646/), so that `Annotated[T, Field(...)]`, `Annotated[T, Unpack[Field(...)]]` and `Annotated[T, *Field(...)]` are all treated consistently.
273
+
274
+ Libraries consuming annotated-types should also ignore any metadata they do not recognize that came from unpacking a `GroupedMetadata`, just like they ignore unrecognized metadata in `Annotated` itself.
275
+
276
+ Our own `annotated_types.Interval` class is a `GroupedMetadata` which unpacks itself into `Gt`, `Lt`, etc., so this is not an abstract concern. Similarly, `annotated_types.Len` is a `GroupedMetadata` which unpacks itself into `MinLen` (optionally) and `MaxLen`.
277
+
278
+ ### Consuming metadata
279
+
280
+ We do not intend to be prescriptive as to _how_ the metadata and constraints are used, but as an example of how one might parse constraints from type annotations see our [implementation in `test_main.py`](https://github.com/annotated-types/annotated-types/blob/f59cf6d1b5255a0fe359b93896759a180bec30ae/tests/test_main.py#L94-L103).
281
+
282
+ It is up to the implementer to determine how this metadata is used.
283
+ You could use the metadata for runtime type checking, for generating schemas, or for generating example data, amongst other use cases.
284
+
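For instance, a schema generator might translate a few of these constraints into JSON Schema keywords (a sketch; the mapping shown is illustrative, not normative):

```python
from typing import Annotated, get_args

from annotated_types import Ge, Lt, MultipleOf

Price = Annotated[float, Ge(0), MultipleOf(0.01)]


def to_json_schema(tp: object) -> dict:
    """Map a few annotated-types constraints onto JSON Schema keywords."""
    base, *metadata = get_args(tp)
    schema = {"type": "number" if base in (int, float) else "string"}
    for meta in metadata:
        if isinstance(meta, Ge):
            schema["minimum"] = meta.ge
        elif isinstance(meta, Lt):
            schema["exclusiveMaximum"] = meta.lt
        elif isinstance(meta, MultipleOf):
            schema["multipleOf"] = meta.multiple_of
    return schema


print(to_json_schema(Price))
```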
285
+ ## Design & History
286
+
287
+ This package was designed at the PyCon 2022 sprints by the maintainers of Pydantic
288
+ and Hypothesis, with the goal of making it as easy as possible for end-users to
289
+ provide more informative annotations for use by runtime libraries.
290
+
291
+ It is deliberately minimal, and following PEP-593 it allows downstream tools considerable
292
+ discretion in what (if anything!) they choose to support. Nonetheless, we expect
293
+ that staying simple and covering _only_ the most common use-cases will give users
294
+ and maintainers the best experience we can. If you'd like more constraints for your
295
+ types - follow our lead, by defining them and documenting them downstream!
lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/RECORD ADDED
@@ -0,0 +1,9 @@
1
+ annotated_types-0.7.0.dist-info/INSTALLER,sha256=5hhM4Q4mYTT9z6QB6PGpUAW81PGNFrYrdXMj4oM_6ak,2
2
+ annotated_types-0.7.0.dist-info/METADATA,sha256=7ltqxksJJ0wCYFGBNIQCWTlWQGeAH0hRFdnK3CB895E,15046
3
+ annotated_types-0.7.0.dist-info/RECORD,,
4
+ annotated_types-0.7.0.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
5
+ annotated_types-0.7.0.dist-info/WHEEL,sha256=zEMcRr9Kr03x1ozGwg5v9NQBKn3kndp6LSoSlVg-jhU,87
6
+ annotated_types-0.7.0.dist-info/licenses/LICENSE,sha256=_hBJiEsaDZNCkB6I4H8ykl0ksxIdmXK2poBfuYJLCV0,1083
7
+ annotated_types/__init__.py,sha256=RynLsRKUEGI0KimXydlD1fZEfEzWwDo0Uon3zOKhG1Q,13819
8
+ annotated_types/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
9
+ annotated_types/test_cases.py,sha256=zHFX6EpcMbGJ8FzBYDbO56bPwx_DYIVSKbZM-4B3_lg,6421
lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/REQUESTED ADDED
File without changes
lib/python3.13/site-packages/annotated_types-0.7.0.dist-info/WHEEL ADDED
@@ -0,0 +1,4 @@
1
+ Wheel-Version: 1.0
2
+ Generator: hatchling 1.24.2
3
+ Root-Is-Purelib: true
4
+ Tag: py3-none-any
lib/python3.13/site-packages/attrs-25.4.0.dist-info/RECORD ADDED
@@ -0,0 +1,37 @@
1
+ attr/__init__.py,sha256=fOYIvt1eGSqQre4uCS3sJWKZ0mwAuC8UD6qba5OS9_U,2057
2
+ attr/__init__.pyi,sha256=IZkzIjvtbRqDWGkDBIF9dd12FgDa379JYq3GHnVOvFQ,11309
3
+ attr/_cmp.py,sha256=3Nn1TjxllUYiX_nJoVnEkXoDk0hM1DYKj5DE7GZe4i0,4117
4
+ attr/_cmp.pyi,sha256=U-_RU_UZOyPUEQzXE6RMYQQcjkZRY25wTH99sN0s7MM,368
5
+ attr/_compat.py,sha256=x0g7iEUOnBVJC72zyFCgb1eKqyxS-7f2LGnNyZ_r95s,2829
6
+ attr/_config.py,sha256=dGq3xR6fgZEF6UBt_L0T-eUHIB4i43kRmH0P28sJVw8,843
7
+ attr/_funcs.py,sha256=Ix5IETTfz5F01F-12MF_CSFomIn2h8b67EVVz2gCtBE,16479
8
+ attr/_make.py,sha256=NRJDGS8syg2h3YNflVNoK2FwR3CpdSZxx8M6lacwljA,104141
9
+ attr/_next_gen.py,sha256=BQtCUlzwg2gWHTYXBQvrEYBnzBUrDvO57u0Py6UCPhc,26274
10
+ attr/_typing_compat.pyi,sha256=XDP54TUn-ZKhD62TOQebmzrwFyomhUCoGRpclb6alRA,469
11
+ attr/_version_info.py,sha256=w4R-FYC3NK_kMkGUWJlYP4cVAlH9HRaC-um3fcjYkHM,2222
12
+ attr/_version_info.pyi,sha256=x_M3L3WuB7r_ULXAWjx959udKQ4HLB8l-hsc1FDGNvk,209
13
+ attr/converters.py,sha256=GlDeOzPeTFgeBBLbj9G57Ez5lAk68uhSALRYJ_exe84,3861
14
+ attr/converters.pyi,sha256=orU2bff-VjQa2kMDyvnMQV73oJT2WRyQuw4ZR1ym1bE,643
15
+ attr/exceptions.py,sha256=HRFq4iybmv7-DcZwyjl6M1euM2YeJVK_hFxuaBGAngI,1977
16
+ attr/exceptions.pyi,sha256=zZq8bCUnKAy9mDtBEw42ZhPhAUIHoTKedDQInJD883M,539
17
+ attr/filters.py,sha256=ZBiKWLp3R0LfCZsq7X11pn9WX8NslS2wXM4jsnLOGc8,1795
18
+ attr/filters.pyi,sha256=3J5BG-dTxltBk1_-RuNRUHrv2qu1v8v4aDNAQ7_mifA,208
19
+ attr/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
20
+ attr/setters.py,sha256=5-dcT63GQK35ONEzSgfXCkbB7pPkaR-qv15mm4PVSzQ,1617
21
+ attr/setters.pyi,sha256=NnVkaFU1BB4JB8E4JuXyrzTUgvtMpj8p3wBdJY7uix4,584
22
+ attr/validators.py,sha256=1BnYGTuYvSucGEI4ju-RPNJteVzG0ZlfWpJiWoSFHQ8,21458
23
+ attr/validators.pyi,sha256=ftmW3m4KJ3pQcIXAj-BejT7BY4ZfqrC1G-5W7XvoPds,4082
24
+ attrs-25.4.0.dist-info/INSTALLER,sha256=5hhM4Q4mYTT9z6QB6PGpUAW81PGNFrYrdXMj4oM_6ak,2
25
+ attrs-25.4.0.dist-info/METADATA,sha256=2Rerxj7agcMRxiwdkt6lC2guqHAmkGKCH13nWWK7ZoQ,10473
26
+ attrs-25.4.0.dist-info/RECORD,,
27
+ attrs-25.4.0.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
28
+ attrs-25.4.0.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
29
+ attrs-25.4.0.dist-info/licenses/LICENSE,sha256=iCEVyV38KvHutnFPjsbVy8q_Znyv-HKfQkINpj9xTp8,1109
30
+ attrs/__init__.py,sha256=RxaAZNwYiEh-fcvHLZNpQ_DWKni73M_jxEPEftiq1Zc,1183
31
+ attrs/__init__.pyi,sha256=2gV79g9UxJppGSM48hAZJ6h_MHb70dZoJL31ZNJeZYI,9416
32
+ attrs/converters.py,sha256=8kQljrVwfSTRu8INwEk8SI0eGrzmWftsT7rM0EqyohM,76
33
+ attrs/exceptions.py,sha256=ACCCmg19-vDFaDPY9vFl199SPXCQMN_bENs4DALjzms,76
34
+ attrs/filters.py,sha256=VOUMZug9uEU6dUuA0dF1jInUK0PL3fLgP0VBS5d-CDE,73
35
+ attrs/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
36
+ attrs/setters.py,sha256=eL1YidYQV3T2h9_SYIZSZR1FAcHGb1TuCTy0E0Lv2SU,73
37
+ attrs/validators.py,sha256=xcy6wD5TtTkdCG1f4XWbocPSO0faBjk5IfVJfP6SUj0,76
lib/python3.13/site-packages/blinker/__init__.py ADDED
@@ -0,0 +1,17 @@
1
+ from __future__ import annotations
2
+
3
+ from .base import ANY
4
+ from .base import default_namespace
5
+ from .base import NamedSignal
6
+ from .base import Namespace
7
+ from .base import Signal
8
+ from .base import signal
9
+
10
+ __all__ = [
11
+ "ANY",
12
+ "default_namespace",
13
+ "NamedSignal",
14
+ "Namespace",
15
+ "Signal",
16
+ "signal",
17
+ ]
lib/python3.13/site-packages/blinker/_utilities.py ADDED
@@ -0,0 +1,64 @@
1
+ from __future__ import annotations
2
+
3
+ import collections.abc as c
4
+ import inspect
5
+ import typing as t
6
+ from weakref import ref
7
+ from weakref import WeakMethod
8
+
9
+ T = t.TypeVar("T")
10
+
11
+
12
+ class Symbol:
13
+ """A constant symbol, nicer than ``object()``. Repeated calls return the
14
+ same instance.
15
+
16
+ >>> Symbol('foo') is Symbol('foo')
17
+ True
18
+ >>> Symbol('foo')
19
+ foo
20
+ """
21
+
22
+ symbols: t.ClassVar[dict[str, Symbol]] = {}
23
+
24
+ def __new__(cls, name: str) -> Symbol:
25
+ if name in cls.symbols:
26
+ return cls.symbols[name]
27
+
28
+ obj = super().__new__(cls)
29
+ cls.symbols[name] = obj
30
+ return obj
31
+
32
+ def __init__(self, name: str) -> None:
33
+ self.name = name
34
+
35
+ def __repr__(self) -> str:
36
+ return self.name
37
+
38
+ def __getnewargs__(self) -> tuple[t.Any, ...]:
39
+ return (self.name,)
40
+
41
+
42
+ def make_id(obj: object) -> c.Hashable:
43
+ """Get a stable identifier for a receiver or sender, to be used as a dict
44
+ key or in a set.
45
+ """
46
+ if inspect.ismethod(obj):
47
+ # The id of a bound method is not stable, but the id of the unbound
48
+ # function and instance are.
49
+ return id(obj.__func__), id(obj.__self__)
50
+
51
+ if isinstance(obj, (str, int)):
52
+ # Instances with the same value always compare equal and have the same
53
+ # hash, even if the id may change.
54
+ return obj
55
+
56
+ # Assume other types are not hashable but will always be the same instance.
57
+ return id(obj)
58
+
59
+
60
+ def make_ref(obj: T, callback: c.Callable[[ref[T]], None] | None = None) -> ref[T]:
61
+ if inspect.ismethod(obj):
62
+ return WeakMethod(obj, callback) # type: ignore[arg-type, return-value]
63
+
64
+ return ref(obj, callback)
lib/python3.13/site-packages/blinker/base.py ADDED
@@ -0,0 +1,512 @@
1
+ from __future__ import annotations
2
+
3
+ import collections.abc as c
4
+ import sys
5
+ import typing as t
6
+ import weakref
7
+ from collections import defaultdict
8
+ from contextlib import contextmanager
9
+ from functools import cached_property
10
+ from inspect import iscoroutinefunction
11
+
12
+ from ._utilities import make_id
13
+ from ._utilities import make_ref
14
+ from ._utilities import Symbol
15
+
16
+ F = t.TypeVar("F", bound=c.Callable[..., t.Any])
17
+
18
+ ANY = Symbol("ANY")
19
+ """Symbol for "any sender"."""
20
+
21
+ ANY_ID = 0
22
+
23
+
24
+ class Signal:
25
+ """A notification emitter.
26
+
27
+ :param doc: The docstring for the signal.
28
+ """
29
+
30
+ ANY = ANY
31
+ """An alias for the :data:`~blinker.ANY` sender symbol."""
32
+
33
+ set_class: type[set[t.Any]] = set
34
+ """The set class to use for tracking connected receivers and senders.
35
+ Python's ``set`` is unordered. If receivers must be dispatched in the order
36
+ they were connected, an ordered set implementation can be used.
37
+
38
+ .. versionadded:: 1.7
39
+ """
40
+
41
+ @cached_property
42
+ def receiver_connected(self) -> Signal:
43
+ """Emitted at the end of each :meth:`connect` call.
44
+
45
+ The signal sender is the signal instance, and the :meth:`connect`
46
+ arguments are passed through: ``receiver``, ``sender``, and ``weak``.
47
+
48
+ .. versionadded:: 1.2
49
+ """
50
+ return Signal(doc="Emitted after a receiver connects.")
51
+
52
+ @cached_property
53
+ def receiver_disconnected(self) -> Signal:
54
+ """Emitted at the end of each :meth:`disconnect` call.
55
+
56
+ The sender is the signal instance, and the :meth:`disconnect` arguments
57
+ are passed through: ``receiver`` and ``sender``.
58
+
59
+ This signal is emitted **only** when :meth:`disconnect` is called
60
+ explicitly. This signal cannot be emitted by an automatic disconnect
61
+ when a weakly referenced receiver or sender goes out of scope, as the
62
+ instance is no longer be available to be used as the sender for this
63
+ signal.
64
+
65
+ An alternative approach is available by subscribing to
66
+ :attr:`receiver_connected` and setting up a custom weakref cleanup
67
+ callback on weak receivers and senders.
68
+
69
+ .. versionadded:: 1.2
70
+ """
71
+ return Signal(doc="Emitted after a receiver disconnects.")
72
+
73
+ def __init__(self, doc: str | None = None) -> None:
74
+ if doc:
75
+ self.__doc__ = doc
76
+
77
+ self.receivers: dict[
78
+ t.Any, weakref.ref[c.Callable[..., t.Any]] | c.Callable[..., t.Any]
79
+ ] = {}
80
+ """The map of connected receivers. Useful to quickly check if any
81
+ receivers are connected to the signal: ``if s.receivers:``. The
82
+ structure and data is not part of the public API, but checking its
83
+ boolean value is.
84
+ """
85
+
86
+ self.is_muted: bool = False
87
+ self._by_receiver: dict[t.Any, set[t.Any]] = defaultdict(self.set_class)
88
+ self._by_sender: dict[t.Any, set[t.Any]] = defaultdict(self.set_class)
89
+ self._weak_senders: dict[t.Any, weakref.ref[t.Any]] = {}
90
+
91
+ def connect(self, receiver: F, sender: t.Any = ANY, weak: bool = True) -> F:
92
+ """Connect ``receiver`` to be called when the signal is sent by
93
+ ``sender``.
94
+
95
+ :param receiver: The callable to call when :meth:`send` is called with
96
+ the given ``sender``, passing ``sender`` as a positional argument
97
+ along with any extra keyword arguments.
98
+ :param sender: Any object or :data:`ANY`. ``receiver`` will only be
99
+ called when :meth:`send` is called with this sender. If ``ANY``, the
100
+ receiver will be called for any sender. A receiver may be connected
101
+ to multiple senders by calling :meth:`connect` multiple times.
102
+ :param weak: Track the receiver with a :mod:`weakref`. The receiver will
103
+ be automatically disconnected when it is garbage collected. When
104
+ connecting a receiver defined within a function, set to ``False``,
105
+ otherwise it will be disconnected when the function scope ends.
106
+ """
107
+ receiver_id = make_id(receiver)
108
+ sender_id = ANY_ID if sender is ANY else make_id(sender)
109
+
110
+ if weak:
111
+ self.receivers[receiver_id] = make_ref(
112
+ receiver, self._make_cleanup_receiver(receiver_id)
113
+ )
114
+ else:
115
+ self.receivers[receiver_id] = receiver
116
+
117
+ self._by_sender[sender_id].add(receiver_id)
118
+ self._by_receiver[receiver_id].add(sender_id)
119
+
120
+ if sender is not ANY and sender_id not in self._weak_senders:
121
+ # store a cleanup for weakref-able senders
122
+ try:
123
+ self._weak_senders[sender_id] = make_ref(
124
+ sender, self._make_cleanup_sender(sender_id)
125
+ )
126
+ except TypeError:
127
+ pass
128
+
129
+ if "receiver_connected" in self.__dict__ and self.receiver_connected.receivers:
130
+ try:
131
+ self.receiver_connected.send(
132
+ self, receiver=receiver, sender=sender, weak=weak
133
+ )
134
+ except TypeError:
135
+ # TODO no explanation or test for this
136
+ self.disconnect(receiver, sender)
137
+ raise
138
+
139
+ return receiver
140
+
141
+ def connect_via(self, sender: t.Any, weak: bool = False) -> c.Callable[[F], F]:
142
+ """Connect the decorated function to be called when the signal is sent
143
+ by ``sender``.
144
+
145
+ The decorated function will be called when :meth:`send` is called with
146
+ the given ``sender``, passing ``sender`` as a positional argument along
147
+ with any extra keyword arguments.
148
+
149
+ :param sender: Any object or :data:`ANY`. ``receiver`` will only be
150
+ called when :meth:`send` is called with this sender. If ``ANY``, the
151
+ receiver will be called for any sender. A receiver may be connected
152
+ to multiple senders by calling :meth:`connect` multiple times.
153
+ :param weak: Track the receiver with a :mod:`weakref`. The receiver will
154
+ be automatically disconnected when it is garbage collected. When
155
+ connecting a receiver defined within a function, set to ``False``,
156
+ otherwise it will be disconnected when the function scope ends.=
157
+
158
+ .. versionadded:: 1.1
159
+ """
160
+
161
+ def decorator(fn: F) -> F:
162
+ self.connect(fn, sender, weak)
163
+ return fn
164
+
165
+ return decorator
166
+
167
+ @contextmanager
168
+ def connected_to(
169
+ self, receiver: c.Callable[..., t.Any], sender: t.Any = ANY
170
+ ) -> c.Generator[None, None, None]:
171
+ """A context manager that temporarily connects ``receiver`` to the
172
+ signal while a ``with`` block executes. When the block exits, the
173
+ receiver is disconnected. Useful for tests.
174
+
175
+ :param receiver: The callable to call when :meth:`send` is called with
176
+ the given ``sender``, passing ``sender`` as a positional argument
177
+ along with any extra keyword arguments.
178
+ :param sender: Any object or :data:`ANY`. ``receiver`` will only be
179
+ called when :meth:`send` is called with this sender. If ``ANY``, the
180
+ receiver will be called for any sender.
181
+
182
+ .. versionadded:: 1.1
183
+ """
184
+ self.connect(receiver, sender=sender, weak=False)
185
+
186
+ try:
187
+ yield None
188
+ finally:
189
+ self.disconnect(receiver)
190
+
191
+ @contextmanager
192
+ def muted(self) -> c.Generator[None, None, None]:
193
+ """A context manager that temporarily disables the signal. No receivers
194
+ will be called if the signal is sent, until the ``with`` block exits.
195
+ Useful for tests.
196
+ """
197
+ self.is_muted = True
198
+
199
+ try:
200
+ yield None
201
+ finally:
202
+ self.is_muted = False
203
+
204
+ def send(
205
+ self,
206
+ sender: t.Any | None = None,
207
+ /,
208
+ *,
209
+ _async_wrapper: c.Callable[
210
+ [c.Callable[..., c.Coroutine[t.Any, t.Any, t.Any]]], c.Callable[..., t.Any]
211
+ ]
212
+ | None = None,
213
+ **kwargs: t.Any,
214
+ ) -> list[tuple[c.Callable[..., t.Any], t.Any]]:
215
+ """Call all receivers that are connected to the given ``sender``
216
+ or :data:`ANY`. Each receiver is called with ``sender`` as a positional
217
+ argument along with any extra keyword arguments. Return a list of
218
+ ``(receiver, return value)`` tuples.
219
+
220
+ The order receivers are called is undefined, but can be influenced by
221
+ setting :attr:`set_class`.
222
+
223
+ If a receiver raises an exception, that exception will propagate up.
224
+ This makes debugging straightforward, with an assumption that correctly
225
+ implemented receivers will not raise.
226
+
227
+ :param sender: Call receivers connected to this sender, in addition to
228
+ those connected to :data:`ANY`.
229
+ :param _async_wrapper: Will be called on any receivers that are async
230
+ coroutines to turn them into sync callables. For example, could run
231
+ the receiver with an event loop.
232
+ :param kwargs: Extra keyword arguments to pass to each receiver.
233
+
234
+ .. versionchanged:: 1.7
235
+ Added the ``_async_wrapper`` argument.
236
+ """
237
+ if self.is_muted:
238
+ return []
239
+
240
+ results = []
241
+
242
+ for receiver in self.receivers_for(sender):
243
+ if iscoroutinefunction(receiver):
244
+ if _async_wrapper is None:
245
+                     raise RuntimeError("Cannot send to a coroutine function.")
+
+                 result = _async_wrapper(receiver)(sender, **kwargs)
+             else:
+                 result = receiver(sender, **kwargs)
+
+             results.append((receiver, result))
+
+         return results
+
+     async def send_async(
+         self,
+         sender: t.Any | None = None,
+         /,
+         *,
+         _sync_wrapper: c.Callable[
+             [c.Callable[..., t.Any]], c.Callable[..., c.Coroutine[t.Any, t.Any, t.Any]]
+         ]
+         | None = None,
+         **kwargs: t.Any,
+     ) -> list[tuple[c.Callable[..., t.Any], t.Any]]:
+         """Await all receivers that are connected to the given ``sender``
+         or :data:`ANY`. Each receiver is called with ``sender`` as a positional
+         argument along with any extra keyword arguments. Return a list of
+         ``(receiver, return value)`` tuples.
+
+         The order receivers are called is undefined, but can be influenced by
+         setting :attr:`set_class`.
+
+         If a receiver raises an exception, that exception will propagate up.
+         This makes debugging straightforward, with an assumption that correctly
+         implemented receivers will not raise.
+
+         :param sender: Call receivers connected to this sender, in addition to
+             those connected to :data:`ANY`.
+         :param _sync_wrapper: Will be called on any receivers that are sync
+             callables to turn them into async coroutines. For example,
+             could call the receiver in a thread.
+         :param kwargs: Extra keyword arguments to pass to each receiver.
+
+         .. versionadded:: 1.7
+         """
+         if self.is_muted:
+             return []
+
+         results = []
+
+         for receiver in self.receivers_for(sender):
+             if not iscoroutinefunction(receiver):
+                 if _sync_wrapper is None:
+                     raise RuntimeError("Cannot send to a non-coroutine function.")
+
+                 result = await _sync_wrapper(receiver)(sender, **kwargs)
+             else:
+                 result = await receiver(sender, **kwargs)
+
+             results.append((receiver, result))
+
+         return results
+
+     def has_receivers_for(self, sender: t.Any) -> bool:
+         """Check if there is at least one receiver that will be called with the
+         given ``sender``. A receiver connected to :data:`ANY` will always be
+         called, regardless of sender. Does not check if weakly referenced
+         receivers are still live. See :meth:`receivers_for` for a stronger
+         search.
+
+         :param sender: Check for receivers connected to this sender, in addition
+             to those connected to :data:`ANY`.
+         """
+         if not self.receivers:
+             return False
+
+         if self._by_sender[ANY_ID]:
+             return True
+
+         if sender is ANY:
+             return False
+
+         return make_id(sender) in self._by_sender
+
+     def receivers_for(
+         self, sender: t.Any
+     ) -> c.Generator[c.Callable[..., t.Any], None, None]:
+         """Yield each receiver to be called for ``sender``, in addition to those
+         to be called for :data:`ANY`. Weakly referenced receivers that are not
+         live will be disconnected and skipped.
+
+         :param sender: Yield receivers connected to this sender, in addition
+             to those connected to :data:`ANY`.
+         """
+         # TODO: test receivers_for(ANY)
+         if not self.receivers:
+             return
+
+         sender_id = make_id(sender)
+
+         if sender_id in self._by_sender:
+             ids = self._by_sender[ANY_ID] | self._by_sender[sender_id]
+         else:
+             ids = self._by_sender[ANY_ID].copy()
+
+         for receiver_id in ids:
+             receiver = self.receivers.get(receiver_id)
+
+             if receiver is None:
+                 continue
+
+             if isinstance(receiver, weakref.ref):
+                 strong = receiver()
+
+                 if strong is None:
+                     self._disconnect(receiver_id, ANY_ID)
+                     continue
+
+                 yield strong
+             else:
+                 yield receiver
+
+     def disconnect(self, receiver: c.Callable[..., t.Any], sender: t.Any = ANY) -> None:
+         """Disconnect ``receiver`` from being called when the signal is sent by
+         ``sender``.
+
+         :param receiver: A connected receiver callable.
+         :param sender: Disconnect from only this sender. By default, disconnect
+             from all senders.
+         """
+         sender_id: c.Hashable
+
+         if sender is ANY:
+             sender_id = ANY_ID
+         else:
+             sender_id = make_id(sender)
+
+         receiver_id = make_id(receiver)
+         self._disconnect(receiver_id, sender_id)
+
+         if (
+             "receiver_disconnected" in self.__dict__
+             and self.receiver_disconnected.receivers
+         ):
+             self.receiver_disconnected.send(self, receiver=receiver, sender=sender)
+
+     def _disconnect(self, receiver_id: c.Hashable, sender_id: c.Hashable) -> None:
+         if sender_id == ANY_ID:
+             if self._by_receiver.pop(receiver_id, None) is not None:
+                 for bucket in self._by_sender.values():
+                     bucket.discard(receiver_id)
+
+                 self.receivers.pop(receiver_id, None)
+         else:
+             self._by_sender[sender_id].discard(receiver_id)
+             self._by_receiver[receiver_id].discard(sender_id)
+
+     def _make_cleanup_receiver(
+         self, receiver_id: c.Hashable
+     ) -> c.Callable[[weakref.ref[c.Callable[..., t.Any]]], None]:
+         """Create a callback function to disconnect a weakly referenced
+         receiver when it is garbage collected.
+         """
+
+         def cleanup(ref: weakref.ref[c.Callable[..., t.Any]]) -> None:
+             # If the interpreter is shutting down, disconnecting can result in a
+             # weird ignored exception. Don't call it in that case.
+             if not sys.is_finalizing():
+                 self._disconnect(receiver_id, ANY_ID)
+
+         return cleanup
+
+     def _make_cleanup_sender(
+         self, sender_id: c.Hashable
+     ) -> c.Callable[[weakref.ref[t.Any]], None]:
+         """Create a callback function to disconnect all receivers for a weakly
+         referenced sender when it is garbage collected.
+         """
+         assert sender_id != ANY_ID
+
+         def cleanup(ref: weakref.ref[t.Any]) -> None:
+             self._weak_senders.pop(sender_id, None)
+
+             for receiver_id in self._by_sender.pop(sender_id, ()):
+                 self._by_receiver[receiver_id].discard(sender_id)
+
+         return cleanup
+
+     def _cleanup_bookkeeping(self) -> None:
+         """Prune unused sender/receiver bookkeeping. Not threadsafe.
+
+         Connecting & disconnecting leaves behind a small amount of bookkeeping
+         data. Typical workloads using Blinker, for example in most web apps,
+         Flask, CLI scripts, etc., are not adversely affected by this
+         bookkeeping.
+
+         With a long-running process performing dynamic signal routing with high
+         volume, e.g. connecting to function closures, senders are all unique
+         object instances. Doing all of this over and over may cause memory usage
+         to grow due to extraneous bookkeeping. (An empty ``set`` for each stale
+         sender/receiver pair.)
+
+         This method will prune that bookkeeping away, with the caveat that such
+         pruning is not threadsafe. The risk is that cleanup of a fully
+         disconnected receiver/sender pair occurs while another thread is
+         connecting that same pair. If you are in the highly dynamic, unique
+         receiver/sender situation that has led you to this method, that failure
+         mode is perhaps not a big deal for you.
+         """
+         for mapping in (self._by_sender, self._by_receiver):
+             for ident, bucket in list(mapping.items()):
+                 if not bucket:
+                     mapping.pop(ident, None)
+
+     def _clear_state(self) -> None:
+         """Disconnect all receivers and senders. Useful for tests."""
+         self._weak_senders.clear()
+         self.receivers.clear()
+         self._by_sender.clear()
+         self._by_receiver.clear()
+
+
+ class NamedSignal(Signal):
+     """A named generic notification emitter. The name is not used by the signal
+     itself, but matches the key in the :class:`Namespace` that it belongs to.
+
+     :param name: The name of the signal within the namespace.
+     :param doc: The docstring for the signal.
+     """
+
+     def __init__(self, name: str, doc: str | None = None) -> None:
+         super().__init__(doc)
+
+         #: The name of this signal.
+         self.name: str = name
+
+     def __repr__(self) -> str:
+         base = super().__repr__()
+         return f"{base[:-1]}; {self.name!r}>"  # noqa: E702
+
+
+ class Namespace(dict[str, NamedSignal]):
+     """A dict mapping names to signals."""
+
+     def signal(self, name: str, doc: str | None = None) -> NamedSignal:
+         """Return the :class:`NamedSignal` for the given ``name``, creating it
+         if required. Repeated calls with the same name return the same signal.
+
+         :param name: The name of the signal.
+         :param doc: The docstring of the signal.
+         """
+         if name not in self:
+             self[name] = NamedSignal(name, doc)
+
+         return self[name]
+
+
+ class _PNamespaceSignal(t.Protocol):
+     def __call__(self, name: str, doc: str | None = None) -> NamedSignal: ...
+
+
+ default_namespace: Namespace = Namespace()
+ """A default :class:`Namespace` for creating named signals. :func:`signal`
+ creates a :class:`NamedSignal` in this namespace.
+ """
+
+ signal: _PNamespaceSignal = default_namespace.signal
+ """Return a :class:`NamedSignal` in :data:`default_namespace` with the given
+ ``name``, creating it if required. Repeated calls with the same name return the
+ same signal.
+ """
lib/python3.13/site-packages/blinker/py.typed ADDED
File without changes
lib/python3.13/site-packages/cachetools/__init__.py ADDED
@@ -0,0 +1,718 @@
+ """Extensible memoizing collections and decorators."""
+
+ __all__ = (
+     "Cache",
+     "FIFOCache",
+     "LFUCache",
+     "LRUCache",
+     "RRCache",
+     "TLRUCache",
+     "TTLCache",
+     "cached",
+     "cachedmethod",
+ )
+
+ __version__ = "6.2.2"
+
+ import collections
+ import collections.abc
+ import functools
+ import heapq
+ import random
+ import time
+
+ from . import keys
+
+
+ class _DefaultSize:
+     __slots__ = ()
+
+     def __getitem__(self, _key):
+         return 1
+
+     def __setitem__(self, _key, _value):
+         pass
+
+     def pop(self, _key):
+         return 1
+
+
+ class Cache(collections.abc.MutableMapping):
+     """Mutable mapping to serve as a simple cache or cache base class."""
+
+     __marker = object()
+
+     __size = _DefaultSize()
+
+     def __init__(self, maxsize, getsizeof=None):
+         if getsizeof:
+             self.getsizeof = getsizeof
+         if self.getsizeof is not Cache.getsizeof:
+             self.__size = dict()
+         self.__data = dict()
+         self.__currsize = 0
+         self.__maxsize = maxsize
+
+     def __repr__(self):
+         return "%s(%s, maxsize=%r, currsize=%r)" % (
+             type(self).__name__,
+             repr(self.__data),
+             self.__maxsize,
+             self.__currsize,
+         )
+
+     def __getitem__(self, key):
+         try:
+             return self.__data[key]
+         except KeyError:
+             return self.__missing__(key)
+
+     def __setitem__(self, key, value):
+         maxsize = self.__maxsize
+         size = self.getsizeof(value)
+         if size > maxsize:
+             raise ValueError("value too large")
+         if key not in self.__data or self.__size[key] < size:
+             while self.__currsize + size > maxsize:
+                 self.popitem()
+         if key in self.__data:
+             diffsize = size - self.__size[key]
+         else:
+             diffsize = size
+         self.__data[key] = value
+         self.__size[key] = size
+         self.__currsize += diffsize
+
+     def __delitem__(self, key):
+         size = self.__size.pop(key)
+         del self.__data[key]
+         self.__currsize -= size
+
+     def __contains__(self, key):
+         return key in self.__data
+
+     def __missing__(self, key):
+         raise KeyError(key)
+
+     def __iter__(self):
+         return iter(self.__data)
+
+     def __len__(self):
+         return len(self.__data)
+
+     # Note that we cannot simply inherit get(), pop() and setdefault()
+     # from MutableMapping, since these rely on __getitem__ throwing a
+     # KeyError on cache miss. This is not the case if __missing__ is
+     # implemented for a Cache subclass, so we have to roll our own,
+     # somewhat less elegant versions.
+
+     def get(self, key, default=None):
+         if key in self:
+             return self[key]
+         else:
+             return default
+
+     def pop(self, key, default=__marker):
+         if key in self:
+             value = self[key]
+             del self[key]
+         elif default is self.__marker:
+             raise KeyError(key)
+         else:
+             value = default
+         return value
+
+     def setdefault(self, key, default=None):
+         if key in self:
+             value = self[key]
+         else:
+             self[key] = value = default
+         return value
+
+     @property
+     def maxsize(self):
+         """The maximum size of the cache."""
+         return self.__maxsize
+
+     @property
+     def currsize(self):
+         """The current size of the cache."""
+         return self.__currsize
+
+     @staticmethod
+     def getsizeof(value):
+         """Return the size of a cache element's value."""
+         return 1
+
+
+ class FIFOCache(Cache):
+     """First In First Out (FIFO) cache implementation."""
+
+     def __init__(self, maxsize, getsizeof=None):
+         Cache.__init__(self, maxsize, getsizeof)
+         self.__order = collections.OrderedDict()
+
+     def __setitem__(self, key, value, cache_setitem=Cache.__setitem__):
+         cache_setitem(self, key, value)
+         try:
+             self.__order.move_to_end(key)
+         except KeyError:
+             self.__order[key] = None
+
+     def __delitem__(self, key, cache_delitem=Cache.__delitem__):
+         cache_delitem(self, key)
+         del self.__order[key]
+
+     def popitem(self):
+         """Remove and return the `(key, value)` pair first inserted."""
+         try:
+             key = next(iter(self.__order))
+         except StopIteration:
+             raise KeyError("%s is empty" % type(self).__name__) from None
+         else:
+             return (key, self.pop(key))
+
+
+ class LFUCache(Cache):
+     """Least Frequently Used (LFU) cache implementation."""
+
+     class _Link:
+         __slots__ = ("count", "keys", "next", "prev")
+
+         def __init__(self, count):
+             self.count = count
+             self.keys = set()
+
+         def unlink(self):
+             next = self.next
+             prev = self.prev
+             prev.next = next
+             next.prev = prev
+
+     def __init__(self, maxsize, getsizeof=None):
+         Cache.__init__(self, maxsize, getsizeof)
+         self.__root = root = LFUCache._Link(0)  # sentinel
+         root.prev = root.next = root
+         self.__links = {}
+
+     def __getitem__(self, key, cache_getitem=Cache.__getitem__):
+         value = cache_getitem(self, key)
+         if key in self:  # __missing__ may not store item
+             self.__touch(key)
+         return value
+
+     def __setitem__(self, key, value, cache_setitem=Cache.__setitem__):
+         cache_setitem(self, key, value)
+         if key in self.__links:
+             return self.__touch(key)
+         root = self.__root
+         link = root.next
+         if link.count != 1:
+             link = LFUCache._Link(1)
+             link.next = root.next
+             root.next = link.next.prev = link
+             link.prev = root
+         link.keys.add(key)
+         self.__links[key] = link
+
+     def __delitem__(self, key, cache_delitem=Cache.__delitem__):
+         cache_delitem(self, key)
+         link = self.__links.pop(key)
+         link.keys.remove(key)
+         if not link.keys:
+             link.unlink()
+
+     def popitem(self):
+         """Remove and return the `(key, value)` pair least frequently used."""
+         root = self.__root
+         curr = root.next
+         if curr is root:
+             raise KeyError("%s is empty" % type(self).__name__) from None
+         key = next(iter(curr.keys))  # remove an arbitrary element
+         return (key, self.pop(key))
+
+     def __touch(self, key):
+         """Increment use count"""
+         link = self.__links[key]
+         curr = link.next
+         if curr.count != link.count + 1:
+             if len(link.keys) == 1:
+                 link.count += 1
+                 return
+             curr = LFUCache._Link(link.count + 1)
+             curr.next = link.next
+             link.next = curr.next.prev = curr
+             curr.prev = link
+         curr.keys.add(key)
+         link.keys.remove(key)
+         if not link.keys:
+             link.unlink()
+         self.__links[key] = curr
+
+
+ class LRUCache(Cache):
+     """Least Recently Used (LRU) cache implementation."""
+
+     def __init__(self, maxsize, getsizeof=None):
+         Cache.__init__(self, maxsize, getsizeof)
+         self.__order = collections.OrderedDict()
+
+     def __getitem__(self, key, cache_getitem=Cache.__getitem__):
+         value = cache_getitem(self, key)
+         if key in self:  # __missing__ may not store item
+             self.__touch(key)
+         return value
+
+     def __setitem__(self, key, value, cache_setitem=Cache.__setitem__):
+         cache_setitem(self, key, value)
+         self.__touch(key)
+
+     def __delitem__(self, key, cache_delitem=Cache.__delitem__):
+         cache_delitem(self, key)
+         del self.__order[key]
+
+     def popitem(self):
+         """Remove and return the `(key, value)` pair least recently used."""
+         try:
+             key = next(iter(self.__order))
+         except StopIteration:
+             raise KeyError("%s is empty" % type(self).__name__) from None
+         else:
+             return (key, self.pop(key))
+
+     def __touch(self, key):
+         """Mark as recently used"""
+         try:
+             self.__order.move_to_end(key)
+         except KeyError:
+             self.__order[key] = None
+
+
+ class RRCache(Cache):
+     """Random Replacement (RR) cache implementation."""
+
+     def __init__(self, maxsize, choice=random.choice, getsizeof=None):
+         Cache.__init__(self, maxsize, getsizeof)
+         self.__choice = choice
+         self.__index = {}
+         self.__keys = []
+
+     @property
+     def choice(self):
+         """The `choice` function used by the cache."""
+         return self.__choice
+
+     def __setitem__(self, key, value, cache_setitem=Cache.__setitem__):
+         cache_setitem(self, key, value)
+         if key not in self.__index:
+             self.__index[key] = len(self.__keys)
+             self.__keys.append(key)
+
+     def __delitem__(self, key, cache_delitem=Cache.__delitem__):
+         cache_delitem(self, key)
+         index = self.__index.pop(key)
+         if index != len(self.__keys) - 1:
+             last = self.__keys[-1]
+             self.__keys[index] = last
+             self.__index[last] = index
+         self.__keys.pop()
+
+     def popitem(self):
+         """Remove and return a random `(key, value)` pair."""
+         try:
+             key = self.__choice(self.__keys)
+         except IndexError:
+             raise KeyError("%s is empty" % type(self).__name__) from None
+         else:
+             return (key, self.pop(key))
+
+
+ class _TimedCache(Cache):
+     """Base class for time aware cache implementations."""
+
+     class _Timer:
+         def __init__(self, timer):
+             self.__timer = timer
+             self.__nesting = 0
+
+         def __call__(self):
+             if self.__nesting == 0:
+                 return self.__timer()
+             else:
+                 return self.__time
+
+         def __enter__(self):
+             if self.__nesting == 0:
+                 self.__time = time = self.__timer()
+             else:
+                 time = self.__time
+             self.__nesting += 1
+             return time
+
+         def __exit__(self, *exc):
+             self.__nesting -= 1
+
+         def __reduce__(self):
+             return _TimedCache._Timer, (self.__timer,)
+
+         def __getattr__(self, name):
+             return getattr(self.__timer, name)
+
+     def __init__(self, maxsize, timer=time.monotonic, getsizeof=None):
+         Cache.__init__(self, maxsize, getsizeof)
+         self.__timer = _TimedCache._Timer(timer)
+
+     def __repr__(self, cache_repr=Cache.__repr__):
+         with self.__timer as time:
+             self.expire(time)
+             return cache_repr(self)
+
+     def __len__(self, cache_len=Cache.__len__):
+         with self.__timer as time:
+             self.expire(time)
+             return cache_len(self)
+
+     @property
+     def currsize(self):
+         with self.__timer as time:
+             self.expire(time)
+             return super().currsize
+
+     @property
+     def timer(self):
+         """The timer function used by the cache."""
+         return self.__timer
+
+     def clear(self):
+         with self.__timer as time:
+             self.expire(time)
+             Cache.clear(self)
+
+     def get(self, *args, **kwargs):
+         with self.__timer:
+             return Cache.get(self, *args, **kwargs)
+
+     def pop(self, *args, **kwargs):
+         with self.__timer:
+             return Cache.pop(self, *args, **kwargs)
+
+     def setdefault(self, *args, **kwargs):
+         with self.__timer:
+             return Cache.setdefault(self, *args, **kwargs)
+
+
+ class TTLCache(_TimedCache):
+     """LRU Cache implementation with per-item time-to-live (TTL) value."""
+
+     class _Link:
+         __slots__ = ("key", "expires", "next", "prev")
+
+         def __init__(self, key=None, expires=None):
+             self.key = key
+             self.expires = expires
+
+         def __reduce__(self):
+             return TTLCache._Link, (self.key, self.expires)
+
+         def unlink(self):
+             next = self.next
+             prev = self.prev
+             prev.next = next
+             next.prev = prev
+
+     def __init__(self, maxsize, ttl, timer=time.monotonic, getsizeof=None):
+         _TimedCache.__init__(self, maxsize, timer, getsizeof)
+         self.__root = root = TTLCache._Link()
+         root.prev = root.next = root
+         self.__links = collections.OrderedDict()
+         self.__ttl = ttl
+
+     def __contains__(self, key):
+         try:
+             link = self.__links[key]  # no reordering
+         except KeyError:
+             return False
+         else:
+             return self.timer() < link.expires
+
+     def __getitem__(self, key, cache_getitem=Cache.__getitem__):
+         try:
+             link = self.__getlink(key)
+         except KeyError:
+             expired = False
+         else:
+             expired = not (self.timer() < link.expires)
+         if expired:
+             return self.__missing__(key)
+         else:
+             return cache_getitem(self, key)
+
+     def __setitem__(self, key, value, cache_setitem=Cache.__setitem__):
+         with self.timer as time:
+             self.expire(time)
+             cache_setitem(self, key, value)
+         try:
+             link = self.__getlink(key)
+         except KeyError:
+             self.__links[key] = link = TTLCache._Link(key)
+         else:
+             link.unlink()
+         link.expires = time + self.__ttl
+         link.next = root = self.__root
+         link.prev = prev = root.prev
+         prev.next = root.prev = link
+
+     def __delitem__(self, key, cache_delitem=Cache.__delitem__):
+         cache_delitem(self, key)
+         link = self.__links.pop(key)
+         link.unlink()
+         if not (self.timer() < link.expires):
+             raise KeyError(key)
+
+     def __iter__(self):
+         root = self.__root
+         curr = root.next
+         while curr is not root:
+             # "freeze" time for iterator access
+             with self.timer as time:
+                 if time < curr.expires:
+                     yield curr.key
+             curr = curr.next
+
+     def __setstate__(self, state):
+         self.__dict__.update(state)
+         root = self.__root
+         root.prev = root.next = root
+         for link in sorted(self.__links.values(), key=lambda obj: obj.expires):
+             link.next = root
+             link.prev = prev = root.prev
+             prev.next = root.prev = link
+         self.expire(self.timer())
+
+     @property
+     def ttl(self):
+         """The time-to-live value of the cache's items."""
+         return self.__ttl
+
+     def expire(self, time=None):
+         """Remove expired items from the cache and return an iterable of the
+         expired `(key, value)` pairs.
+
+         """
+         if time is None:
+             time = self.timer()
+         root = self.__root
+         curr = root.next
+         links = self.__links
+         expired = []
+         cache_delitem = Cache.__delitem__
+         cache_getitem = Cache.__getitem__
+         while curr is not root and not (time < curr.expires):
+             expired.append((curr.key, cache_getitem(self, curr.key)))
+             cache_delitem(self, curr.key)
+             del links[curr.key]
+             next = curr.next
+             curr.unlink()
+             curr = next
+         return expired
+
+     def popitem(self):
+         """Remove and return the `(key, value)` pair least recently used that
+         has not already expired.
+
+         """
+         with self.timer as time:
+             self.expire(time)
+             try:
+                 key = next(iter(self.__links))
+             except StopIteration:
+                 raise KeyError("%s is empty" % type(self).__name__) from None
+             else:
+                 return (key, self.pop(key))
+
+     def __getlink(self, key):
+         value = self.__links[key]
+         self.__links.move_to_end(key)
+         return value
+
+
+ class TLRUCache(_TimedCache):
+     """Time aware Least Recently Used (TLRU) cache implementation."""
+
+     @functools.total_ordering
+     class _Item:
+         __slots__ = ("key", "expires", "removed")
+
+         def __init__(self, key=None, expires=None):
+             self.key = key
+             self.expires = expires
+             self.removed = False
+
+         def __lt__(self, other):
+             return self.expires < other.expires
+
+     def __init__(self, maxsize, ttu, timer=time.monotonic, getsizeof=None):
+         _TimedCache.__init__(self, maxsize, timer, getsizeof)
+         self.__items = collections.OrderedDict()
+         self.__order = []
+         self.__ttu = ttu
+
+     def __contains__(self, key):
+         try:
+             item = self.__items[key]  # no reordering
+         except KeyError:
+             return False
+         else:
+             return self.timer() < item.expires
+
+     def __getitem__(self, key, cache_getitem=Cache.__getitem__):
+         try:
+             item = self.__getitem(key)
+         except KeyError:
+             expired = False
+         else:
+             expired = not (self.timer() < item.expires)
+         if expired:
+             return self.__missing__(key)
+         else:
+             return cache_getitem(self, key)
+
+     def __setitem__(self, key, value, cache_setitem=Cache.__setitem__):
+         with self.timer as time:
+             expires = self.__ttu(key, value, time)
+             if not (time < expires):
+                 return  # skip expired items
+             self.expire(time)
+             cache_setitem(self, key, value)
+         # removing an existing item would break the heap structure, so
+         # only mark it as removed for now
+         try:
+             self.__getitem(key).removed = True
+         except KeyError:
+             pass
+         self.__items[key] = item = TLRUCache._Item(key, expires)
+         heapq.heappush(self.__order, item)
+
+     def __delitem__(self, key, cache_delitem=Cache.__delitem__):
+         with self.timer as time:
+             # no self.expire() for performance reasons, e.g. self.clear() [#67]
+             cache_delitem(self, key)
+         item = self.__items.pop(key)
+         item.removed = True
+         if not (time < item.expires):
+             raise KeyError(key)
+
+     def __iter__(self):
+         for curr in self.__order:
+             # "freeze" time for iterator access
+             with self.timer as time:
+                 if time < curr.expires and not curr.removed:
+                     yield curr.key
+
+     @property
+     def ttu(self):
+         """The local time-to-use function used by the cache."""
+         return self.__ttu
+
+     def expire(self, time=None):
+         """Remove expired items from the cache and return an iterable of the
+         expired `(key, value)` pairs.
+
+         """
+         if time is None:
+             time = self.timer()
+         items = self.__items
+         order = self.__order
+         # clean up the heap if too many items are marked as removed
+         if len(order) > len(items) * 2:
+             self.__order = order = [item for item in order if not item.removed]
+             heapq.heapify(order)
+         expired = []
+         cache_delitem = Cache.__delitem__
+         cache_getitem = Cache.__getitem__
+         while order and (order[0].removed or not (time < order[0].expires)):
+             item = heapq.heappop(order)
+             if not item.removed:
+                 expired.append((item.key, cache_getitem(self, item.key)))
+                 cache_delitem(self, item.key)
+                 del items[item.key]
+         return expired
+
+     def popitem(self):
+         """Remove and return the `(key, value)` pair least recently used that
+         has not already expired.
+
+         """
+         with self.timer as time:
+             self.expire(time)
+             try:
+                 key = next(iter(self.__items))
+             except StopIteration:
+                 raise KeyError("%s is empty" % type(self).__name__) from None
+             else:
+                 return (key, self.pop(key))
+
+     def __getitem(self, key):
+         value = self.__items[key]
+         self.__items.move_to_end(key)
+         return value
+
+
+ _CacheInfo = collections.namedtuple(
+     "CacheInfo", ["hits", "misses", "maxsize", "currsize"]
+ )
+
+
+ def cached(cache, key=keys.hashkey, lock=None, condition=None, info=False):
+     """Decorator to wrap a function with a memoizing callable that saves
+     results in a cache.
+
+     """
+     from ._cached import _wrapper
+
+     if isinstance(condition, bool):
+         from warnings import warn
+
+         warn(
+             "passing `info` as positional parameter is deprecated",
+             DeprecationWarning,
+             stacklevel=2,
+         )
+         info = condition
+         condition = None
+
+     def decorator(func):
+         if info:
+             if isinstance(cache, Cache):
+
+                 def make_info(hits, misses):
+                     return _CacheInfo(hits, misses, cache.maxsize, cache.currsize)
+
+             elif isinstance(cache, collections.abc.Mapping):
+
+                 def make_info(hits, misses):
+                     return _CacheInfo(hits, misses, None, len(cache))
+
+             else:
+
+                 def make_info(hits, misses):
+                     return _CacheInfo(hits, misses, 0, 0)
+
+             return _wrapper(func, cache, key, lock, condition, info=make_info)
+         else:
+             return _wrapper(func, cache, key, lock, condition)
+
+     return decorator
+
+
+ def cachedmethod(cache, key=keys.methodkey, lock=None, condition=None):
+     """Decorator to wrap a class or instance method with a memoizing
+     callable that saves results in a cache.
+
+     """
+     from ._cachedmethod import _wrapper
+
+     def decorator(method):
+         return _wrapper(method, cache, key, lock, condition)
+
+     return decorator
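The `LRUCache` in the file above tracks recency with an `OrderedDict` (`__touch` moves a key to the end; `popitem` evicts from the front). A minimal stand-in sketch of that eviction behavior, without importing cachetools (`MiniLRU` is an illustrative name only):

```python
from collections import OrderedDict

# Illustrative stand-in for LRUCache's recency bookkeeping: reads and
# writes move a key to the end of an OrderedDict, and eviction pops the
# front, i.e. the least recently used entry.

class MiniLRU:
    def __init__(self, maxsize):
        self.maxsize = maxsize
        self._data = OrderedDict()

    def __setitem__(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)          # mark as recently used
        elif len(self._data) >= self.maxsize:
            self._data.popitem(last=False)       # evict least recently used
        self._data[key] = value

    def __getitem__(self, key):
        self._data.move_to_end(key)              # a hit also counts as use
        return self._data[key]

    def __contains__(self, key):
        return key in self._data


c = MiniLRU(2)
c["a"] = 1
c["b"] = 2
_ = c["a"]   # touch "a"; "b" becomes least recently used
c["c"] = 3   # evicts "b", not "a"
```

The real `LRUCache` layers the same idea over `Cache`'s size accounting, so eviction also runs whenever `getsizeof` makes an insert exceed `maxsize`.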
lib/python3.13/site-packages/cachetools/_cached.py ADDED
@@ -0,0 +1,247 @@
+ """Function decorator helpers."""
+
+ import functools
+
+
+ def _condition_info(func, cache, key, lock, cond, info):
+     hits = misses = 0
+     pending = set()
+
+     def wrapper(*args, **kwargs):
+         nonlocal hits, misses
+         k = key(*args, **kwargs)
+         with lock:
+             cond.wait_for(lambda: k not in pending)
+             try:
+                 result = cache[k]
+                 hits += 1
+                 return result
+             except KeyError:
+                 pending.add(k)
+                 misses += 1
+         try:
+             v = func(*args, **kwargs)
+             with lock:
+                 try:
+                     cache[k] = v
+                 except ValueError:
+                     pass  # value too large
+             return v
+         finally:
+             with lock:
+                 pending.remove(k)
+                 cond.notify_all()
+
+     def cache_clear():
+         nonlocal hits, misses
+         with lock:
+             cache.clear()
+             hits = misses = 0
+
+     def cache_info():
+         with lock:
+             return info(hits, misses)
+
+     wrapper.cache_clear = cache_clear
+     wrapper.cache_info = cache_info
+     return wrapper
+
+
+ def _locked_info(func, cache, key, lock, info):
+     hits = misses = 0
+
+     def wrapper(*args, **kwargs):
+         nonlocal hits, misses
+         k = key(*args, **kwargs)
+         with lock:
+             try:
+                 result = cache[k]
+                 hits += 1
+                 return result
+             except KeyError:
+                 misses += 1
+         v = func(*args, **kwargs)
+         with lock:
+             try:
+                 # In case of a race condition, i.e. if another thread
+                 # stored a value for this key while we were calling
+                 # func(), prefer the cached value.
+                 return cache.setdefault(k, v)
+             except ValueError:
+                 return v  # value too large
+
+     def cache_clear():
+         nonlocal hits, misses
+         with lock:
+             cache.clear()
+             hits = misses = 0
+
+     def cache_info():
+         with lock:
+             return info(hits, misses)
+
+     wrapper.cache_clear = cache_clear
+     wrapper.cache_info = cache_info
+     return wrapper
+
+
+ def _unlocked_info(func, cache, key, info):
+     hits = misses = 0
+
+     def wrapper(*args, **kwargs):
+         nonlocal hits, misses
+         k = key(*args, **kwargs)
+         try:
+             result = cache[k]
+             hits += 1
+             return result
+         except KeyError:
+             misses += 1
+         v = func(*args, **kwargs)
+         try:
+             cache[k] = v
+         except ValueError:
+             pass  # value too large
+         return v
+
+     def cache_clear():
+         nonlocal hits, misses
+         cache.clear()
+         hits = misses = 0
+
+     wrapper.cache_clear = cache_clear
+     wrapper.cache_info = lambda: info(hits, misses)
+     return wrapper
+
+
+ def _uncached_info(func, info):
+     misses = 0
+
+     def wrapper(*args, **kwargs):
+         nonlocal misses
+         misses += 1
+         return func(*args, **kwargs)
+
+     def cache_clear():
+         nonlocal misses
+         misses = 0
+
+     wrapper.cache_clear = cache_clear
+     wrapper.cache_info = lambda: info(0, misses)
+     return wrapper
+
+
+ def _condition(func, cache, key, lock, cond):
+     pending = set()
+
+     def wrapper(*args, **kwargs):
+         k = key(*args, **kwargs)
+         with lock:
+             cond.wait_for(lambda: k not in pending)
+             try:
+                 result = cache[k]
+                 return result
+             except KeyError:
+                 pending.add(k)
+         try:
+             v = func(*args, **kwargs)
+             with lock:
+                 try:
+                     cache[k] = v
+                 except ValueError:
+                     pass  # value too large
+             return v
+         finally:
+             with lock:
+                 pending.remove(k)
+                 cond.notify_all()
+
+     def cache_clear():
+         with lock:
+             cache.clear()
+
+     wrapper.cache_clear = cache_clear
+     return wrapper
+
+
+ def _locked(func, cache, key, lock):
+     def wrapper(*args, **kwargs):
+         k = key(*args, **kwargs)
+         with lock:
+             try:
+                 return cache[k]
+             except KeyError:
+                 pass  # key not found
+         v = func(*args, **kwargs)
+         with lock:
+             try:
+                 # possible race condition: see above
+                 return cache.setdefault(k, v)
+             except ValueError:
+                 return v  # value too large
+
+     def cache_clear():
+         with lock:
+             cache.clear()
+
+     wrapper.cache_clear = cache_clear
+     return wrapper
+
+
+ def _unlocked(func, cache, key):
+     def wrapper(*args, **kwargs):
+         k = key(*args, **kwargs)
+         try:
+             return cache[k]
+         except KeyError:
+             pass  # key not found
+         v = func(*args, **kwargs)
+         try:
+             cache[k] = v
+         except ValueError:
+             pass  # value too large
+         return v
+
+     wrapper.cache_clear = lambda: cache.clear()
+     return wrapper
+
+
+ def _uncached(func):
+     def wrapper(*args, **kwargs):
+         return func(*args, **kwargs)
+
+     wrapper.cache_clear = lambda: None
+     return wrapper
+
+
+ def _wrapper(func, cache, key, lock=None, cond=None, info=None):
+     if info is not None:
+         if cache is None:
+             wrapper = _uncached_info(func, info)
+         elif cond is not None and lock is not None:
+             wrapper = _condition_info(func, cache, key, lock, cond, info)
+         elif cond is not None:
+             wrapper = _condition_info(func, cache, key, cond, cond, info)
+         elif lock is not None:
+             wrapper = _locked_info(func, cache, key, lock, info)
+         else:
+             wrapper = _unlocked_info(func, cache, key, info)
+     else:
+         if cache is None:
+             wrapper = _uncached(func)
+         elif cond is not None and lock is not None:
+             wrapper = _condition(func, cache, key, lock, cond)
+         elif cond is not None:
+             wrapper = _condition(func, cache, key, cond, cond)
+         elif lock is not None:
+             wrapper = _locked(func, cache, key, lock)
+         else:
+             wrapper = _unlocked(func, cache, key)
+         wrapper.cache_info = None
+
+     wrapper.cache = cache
+     wrapper.cache_key = key
+     wrapper.cache_lock = lock if lock is not None else cond
+     wrapper.cache_condition = cond
+
+     return functools.update_wrapper(wrapper, func)
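Stripped of the hit/miss bookkeeping, the `_unlocked` wrapper above is a small look-up/compute/store decorator. Here is a minimal standalone sketch of that pattern; the `memoize` helper, `store` dict, and `square` function are illustrative names, not part of cachetools:

```python
import functools

def memoize(cache, key=lambda *args: args):
    # Minimal sketch of the _unlocked wrapper pattern: look the key up,
    # call the wrapped function on a miss, store the result.
    def decorator(func):
        def wrapper(*args):
            k = key(*args)
            try:
                return cache[k]      # cache hit
            except KeyError:
                pass                 # key not found
            v = func(*args)
            cache[k] = v             # cache miss: store the computed value
            return v
        wrapper.cache_clear = cache.clear
        return functools.update_wrapper(wrapper, func)
    return decorator

calls = []
store = {}

@memoize(store)
def square(x):
    calls.append(x)
    return x * x

assert square(3) == 9
assert square(3) == 9    # second call is served from the cache
assert calls == [3]      # the underlying function ran only once
```

The `try`/`except KeyError` lookup (rather than `k in cache`) matters for real cachetools caches, where a lookup can also update eviction state (e.g. LRU order) and should happen exactly once per call.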
lib/python3.13/site-packages/cachetools/_cachedmethod.py ADDED
@@ -0,0 +1,128 @@
+ """Method decorator helpers."""
+
+ import functools
+ import weakref
+
+
+ def warn_cache_none():
+     from warnings import warn
+
+     warn(
+         "returning `None` from `cache(self)` is deprecated",
+         DeprecationWarning,
+         stacklevel=3,
+     )
+
+
+ def _condition(method, cache, key, lock, cond):
+     pending = weakref.WeakKeyDictionary()
+
+     def wrapper(self, *args, **kwargs):
+         c = cache(self)
+         if c is None:
+             warn_cache_none()
+             return method(self, *args, **kwargs)
+         k = key(self, *args, **kwargs)
+         with lock(self):
+             p = pending.setdefault(self, set())
+             cond(self).wait_for(lambda: k not in p)
+             try:
+                 return c[k]
+             except KeyError:
+                 p.add(k)
+         try:
+             v = method(self, *args, **kwargs)
+             with lock(self):
+                 try:
+                     c[k] = v
+                 except ValueError:
+                     pass  # value too large
+             return v
+         finally:
+             with lock(self):
+                 pending[self].remove(k)
+                 cond(self).notify_all()
+
+     def cache_clear(self):
+         c = cache(self)
+         if c is not None:
+             with lock(self):
+                 c.clear()
+
+     wrapper.cache_clear = cache_clear
+     return wrapper
+
+
+ def _locked(method, cache, key, lock):
+     def wrapper(self, *args, **kwargs):
+         c = cache(self)
+         if c is None:
+             warn_cache_none()
+             return method(self, *args, **kwargs)
+         k = key(self, *args, **kwargs)
+         with lock(self):
+             try:
+                 return c[k]
+             except KeyError:
+                 pass  # key not found
+         v = method(self, *args, **kwargs)
+         # in case of a race, prefer the item already in the cache
+         with lock(self):
+             try:
+                 return c.setdefault(k, v)
+             except ValueError:
+                 return v  # value too large
+
+     def cache_clear(self):
+         c = cache(self)
+         if c is not None:
+             with lock(self):
+                 c.clear()
+
+     wrapper.cache_clear = cache_clear
+     return wrapper
+
+
+ def _unlocked(method, cache, key):
+     def wrapper(self, *args, **kwargs):
+         c = cache(self)
+         if c is None:
+             warn_cache_none()
+             return method(self, *args, **kwargs)
+         k = key(self, *args, **kwargs)
+         try:
+             return c[k]
+         except KeyError:
+             pass  # key not found
+         v = method(self, *args, **kwargs)
+         try:
+             c[k] = v
+         except ValueError:
+             pass  # value too large
+         return v
+
+     def cache_clear(self):
+         c = cache(self)
+         if c is not None:
+             c.clear()
+
+     wrapper.cache_clear = cache_clear
+     return wrapper
+
+
+ def _wrapper(method, cache, key, lock=None, cond=None):
+     if cond is not None and lock is not None:
+         wrapper = _condition(method, cache, key, lock, cond)
+     elif cond is not None:
+         wrapper = _condition(method, cache, key, cond, cond)
+     elif lock is not None:
+         wrapper = _locked(method, cache, key, lock)
+     else:
+         wrapper = _unlocked(method, cache, key)
+
+     wrapper.cache = cache
+     wrapper.cache_key = key
+     wrapper.cache_lock = lock if lock is not None else cond
+     wrapper.cache_condition = cond
+
+     return functools.update_wrapper(wrapper, method)
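The distinguishing feature of the method wrappers above is that the cache is not fixed at decoration time: it is fetched per call via `cache(self)`, so each instance can carry its own cache. A minimal standalone sketch of that idea (the `cachedmethod` helper and `Fib` class are illustrative, not the cachetools API):

```python
import functools

def cachedmethod(cache, key=lambda self, *args: args):
    # Sketch of the _unlocked method wrapper: the cache is obtained from
    # the instance via cache(self), so every object caches separately.
    def decorator(method):
        def wrapper(self, *args):
            c = cache(self)
            k = key(self, *args)
            try:
                return c[k]          # per-instance cache hit
            except KeyError:
                pass                 # key not found
            v = method(self, *args)
            c[k] = v
            return v
        return functools.update_wrapper(wrapper, method)
    return decorator

class Fib:
    def __init__(self):
        self.cache = {}   # one cache per instance
        self.calls = 0

    @cachedmethod(lambda self: self.cache)
    def fib(self, n):
        self.calls += 1
        return n if n < 2 else self.fib(n - 1) + self.fib(n - 2)

f = Fib()
assert f.fib(10) == 55
assert f.calls == 11  # each n in 0..10 was computed exactly once
```

Note that `self` is deliberately excluded from the default key, matching `methodkey` in `keys.py`: the instance is already implied by which cache is used.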
lib/python3.13/site-packages/cachetools/func.py ADDED
@@ -0,0 +1,102 @@
+ """`functools.lru_cache` compatible memoizing function decorators."""
+
+ __all__ = ("fifo_cache", "lfu_cache", "lru_cache", "rr_cache", "ttl_cache")
+
+ import functools
+ import math
+ import random
+ import time
+ from threading import Condition
+
+ from . import FIFOCache, LFUCache, LRUCache, RRCache, TTLCache
+ from . import cached
+ from . import keys
+
+
+ class _UnboundTTLCache(TTLCache):
+     def __init__(self, ttl, timer):
+         TTLCache.__init__(self, math.inf, ttl, timer)
+
+     @property
+     def maxsize(self):
+         return None
+
+
+ def _cache(cache, maxsize, typed):
+     def decorator(func):
+         key = keys.typedkey if typed else keys.hashkey
+         wrapper = cached(cache=cache, key=key, condition=Condition(), info=True)(func)
+         wrapper.cache_parameters = lambda: {"maxsize": maxsize, "typed": typed}
+         return wrapper
+
+     return decorator
+
+
+ def fifo_cache(maxsize=128, typed=False):
+     """Decorator to wrap a function with a memoizing callable that saves
+     up to `maxsize` results based on a First In First Out (FIFO)
+     algorithm.
+
+     """
+     if maxsize is None:
+         return _cache({}, None, typed)
+     elif callable(maxsize):
+         return _cache(FIFOCache(128), 128, typed)(maxsize)
+     else:
+         return _cache(FIFOCache(maxsize), maxsize, typed)
+
+
+ def lfu_cache(maxsize=128, typed=False):
+     """Decorator to wrap a function with a memoizing callable that saves
+     up to `maxsize` results based on a Least Frequently Used (LFU)
+     algorithm.
+
+     """
+     if maxsize is None:
+         return _cache({}, None, typed)
+     elif callable(maxsize):
+         return _cache(LFUCache(128), 128, typed)(maxsize)
+     else:
+         return _cache(LFUCache(maxsize), maxsize, typed)
+
+
+ def lru_cache(maxsize=128, typed=False):
+     """Decorator to wrap a function with a memoizing callable that saves
+     up to `maxsize` results based on a Least Recently Used (LRU)
+     algorithm.
+
+     """
+     if maxsize is None:
+         return _cache({}, None, typed)
+     elif callable(maxsize):
+         return _cache(LRUCache(128), 128, typed)(maxsize)
+     else:
+         return _cache(LRUCache(maxsize), maxsize, typed)
+
+
+ def rr_cache(maxsize=128, choice=random.choice, typed=False):
+     """Decorator to wrap a function with a memoizing callable that saves
+     up to `maxsize` results based on a Random Replacement (RR)
+     algorithm.
+
+     """
+     if maxsize is None:
+         return _cache({}, None, typed)
+     elif callable(maxsize):
+         return _cache(RRCache(128, choice), 128, typed)(maxsize)
+     else:
+         return _cache(RRCache(maxsize, choice), maxsize, typed)
+
+
+ def ttl_cache(maxsize=128, ttl=600, timer=time.monotonic, typed=False):
+     """Decorator to wrap a function with a memoizing callable that saves
+     up to `maxsize` results based on a Least Recently Used (LRU)
+     algorithm with a per-item time-to-live (TTL) value.
+
+     """
+     if maxsize is None:
+         return _cache(_UnboundTTLCache(ttl, timer), None, typed)
+     elif callable(maxsize):
+         return _cache(TTLCache(128, ttl, timer), 128, typed)(maxsize)
+     else:
+         return _cache(TTLCache(maxsize, ttl, timer), maxsize, typed)
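Each decorator above handles three cases for `maxsize`: `None` (unbounded), an integer, or, via the `callable(maxsize)` branch, the decorated function itself, which is what makes bare `@lru_cache` (no parentheses) work just like `functools.lru_cache`. A standalone sketch of that dispatch with a simple FIFO eviction stand-in; `sized_cache`, `double`, and `triple` are illustrative names only:

```python
import functools

def sized_cache(maxsize=128):
    # Sketch of the maxsize dispatch used by fifo_cache & co.: the argument
    # may be an int, None (unbounded), or the function being decorated.
    def _make(size):
        def decorator(func):
            cache = {}
            def wrapper(*args):
                if args not in cache:
                    if size is not None and len(cache) >= size:
                        cache.pop(next(iter(cache)))  # evict oldest entry (FIFO)
                    cache[args] = func(*args)
                return cache[args]
            wrapper.cache_parameters = lambda: {"maxsize": size}
            return functools.update_wrapper(wrapper, func)
        return decorator
    if callable(maxsize):       # used bare: @sized_cache
        return _make(128)(maxsize)
    return _make(maxsize)       # used with arguments: @sized_cache(maxsize=2)

@sized_cache
def double(x):
    return 2 * x

@sized_cache(maxsize=2)
def triple(x):
    return 3 * x

assert double(5) == 10
assert triple(5) == 15
assert triple.cache_parameters() == {"maxsize": 2}
```

The hard-coded `128` in the bare-decorator branch mirrors the module above, which reuses the default size rather than introspecting anything from the function.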
lib/python3.13/site-packages/cachetools/keys.py ADDED
@@ -0,0 +1,62 @@
+ """Key functions for memoizing decorators."""
+
+ __all__ = ("hashkey", "methodkey", "typedkey", "typedmethodkey")
+
+
+ class _HashedTuple(tuple):
+     """A tuple that ensures that hash() will be called no more than once
+     per element, since cache decorators will hash the key multiple
+     times on a cache miss. See also _HashedSeq in the standard
+     library functools implementation.
+
+     """
+
+     __hashvalue = None
+
+     def __hash__(self, hash=tuple.__hash__):
+         hashvalue = self.__hashvalue
+         if hashvalue is None:
+             self.__hashvalue = hashvalue = hash(self)
+         return hashvalue
+
+     def __add__(self, other, add=tuple.__add__):
+         return _HashedTuple(add(self, other))
+
+     def __radd__(self, other, add=tuple.__add__):
+         return _HashedTuple(add(other, self))
+
+     def __getstate__(self):
+         return {}
+
+
+ # A sentinel for separating args from kwargs. Using the class itself
+ # ensures uniqueness and preserves identity when pickling/unpickling.
+ _kwmark = (_HashedTuple,)
+
+
+ def hashkey(*args, **kwargs):
+     """Return a cache key for the specified hashable arguments."""
+
+     if kwargs:
+         return _HashedTuple(args + sum(sorted(kwargs.items()), _kwmark))
+     else:
+         return _HashedTuple(args)
+
+
+ def methodkey(self, *args, **kwargs):
+     """Return a cache key for use with cached methods."""
+     return hashkey(*args, **kwargs)
+
+
+ def typedkey(*args, **kwargs):
+     """Return a typed cache key for the specified hashable arguments."""
+
+     key = hashkey(*args, **kwargs)
+     key += tuple(type(v) for v in args)
+     key += tuple(type(v) for _, v in sorted(kwargs.items()))
+     return key
+
+
+ def typedmethodkey(self, *args, **kwargs):
+     """Return a typed cache key for use with cached methods."""
+     return typedkey(*args, **kwargs)
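The point of `_HashedTuple` above is that a cache key gets hashed several times per miss (lookup, store, eviction bookkeeping), so the hash is computed lazily once and memoized on the instance; keyword arguments are flattened into the tuple behind a sentinel so `f(1, 2)` and `f(1, b=2)` never collide. A standalone sketch of both tricks (class and function names here mirror the module but are re-declared for illustration):

```python
class HashedTuple(tuple):
    # Caches the tuple's hash so repeated lookups hash the key only once,
    # mirroring _HashedTuple above. tuple subclasses have a __dict__,
    # so the instance attribute assignment works.
    __hashvalue = None

    def __hash__(self, hash=tuple.__hash__):
        hv = self.__hashvalue
        if hv is None:
            self.__hashvalue = hv = hash(self)
        return hv

# Sentinel separating positional args from keyword items in the flat key.
_kwmark = (HashedTuple,)

def hashkey(*args, **kwargs):
    if kwargs:
        # sum(..., _kwmark) concatenates the sorted (name, value) pairs
        # onto the sentinel, then appends them after the positional args.
        return HashedTuple(args + sum(sorted(kwargs.items()), _kwmark))
    return HashedTuple(args)

assert hashkey(1, 2) == hashkey(1, 2)
assert hashkey(1, 2) != hashkey(1, 2, a=3)   # kwargs change the key
assert hashkey(1, 2) == (1, 2)               # still compares like a tuple
assert hash(hashkey(1, 2)) == hash(hashkey(1, 2))
```

Sorting `kwargs.items()` makes the key independent of keyword order, so `f(a=1, b=2)` and `f(b=2, a=1)` share one cache entry.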
lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ uv
lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/METADATA ADDED
@@ -0,0 +1,47 @@
+ Metadata-Version: 2.4
+ Name: cuda-pathfinder
+ Version: 1.3.2
+ Summary: Pathfinder for CUDA components
+ Author-email: NVIDIA Corporation <cuda-python-conduct@nvidia.com>
+ License-Expression: Apache-2.0
+ Project-URL: Repository, https://github.com/NVIDIA/cuda-python
+ Project-URL: Documentation, https://nvidia.github.io/cuda-python/
+ Requires-Python: >=3.9
+ Description-Content-Type: text/x-rst
+ License-File: LICENSE
+ Dynamic: license-file
+
+ .. SPDX-FileCopyrightText: Copyright (c) 2025 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
+ .. SPDX-License-Identifier: Apache-2.0
+
+ *******************************************************
+ cuda-pathfinder: Utilities for locating CUDA components
+ *******************************************************
+
+ .. image:: https://img.shields.io/badge/NVIDIA-black?logo=nvidia
+    :target: https://www.nvidia.com/
+    :alt: NVIDIA
+
+ `cuda.pathfinder <https://nvidia.github.io/cuda-python/cuda-pathfinder/>`_
+ aims to be a one-stop solution for locating CUDA components. Currently
+ it supports locating and loading dynamic libraries (``.so``, ``.dll``), and
+ locating CTK header directories. Support for other artifacts is in progress.
+
+ * `Documentation <https://nvidia.github.io/cuda-python/cuda-pathfinder/>`_
+ * `Releases <https://nvidia.github.io/cuda-python/cuda-pathfinder/latest/release.html>`_
+ * `Repository <https://github.com/NVIDIA/cuda-python/tree/main/cuda_pathfinder/>`_
+ * `Issue tracker <https://github.com/NVIDIA/cuda-python/issues/>`_ (select component ``cuda.pathfinder``)
+
+ ``cuda.pathfinder`` is under active development. Feedback and suggestions are welcome.
+
+
+ Installation
+ ============
+
+ .. code-block:: bash
+
+     pip install cuda-pathfinder
+
+ ``cuda-pathfinder`` is `CUDA Toolkit (CTK) <https://developer.nvidia.com/cuda-toolkit>`_
+ version-agnostic. It follows the general CUDA Toolkit support policy: the
+ two most recent major versions are supported simultaneously.
lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/RECORD ADDED
@@ -0,0 +1,22 @@
+ cuda/pathfinder/__init__.py,sha256=0wYh03rQ052PLO_Tx29ns78ZvujcFB3w4lP5vqKBEQo,1446
+ cuda/pathfinder/_dynamic_libs/find_nvidia_dynamic_lib.py,sha256=OZiFguL6h1wWYgChXn2Q_o1qey2NabAC_8JDvBqE-oE,7700
+ cuda/pathfinder/_dynamic_libs/load_dl_common.py,sha256=GNxt26-3sIEVszIDhbASUgG5eSYeQ27wA8w09eRGVIQ,698
+ cuda/pathfinder/_dynamic_libs/load_dl_linux.py,sha256=cn6ohzxVewAQm7YjiWHRom2VkpdTt90hP1v8Xxtkdsk,8035
+ cuda/pathfinder/_dynamic_libs/load_dl_windows.py,sha256=neDkJyIjlxRQdyfwPqAoFd-TfHc_KF196q-iLCkBftw,6114
+ cuda/pathfinder/_dynamic_libs/load_nvidia_dynamic_lib.py,sha256=g4BIea4FsWgrNT3BvnFcQRorzQEJkolwE3hFi1w-LZw,4971
+ cuda/pathfinder/_dynamic_libs/supported_nvidia_libs.py,sha256=ZtgK9JXYm8hN0lZsVFTAQUI4DyiG0sfcrLTx5ohDe98,15446
+ cuda/pathfinder/_headers/find_nvidia_headers.py,sha256=6ff20zQgESXPAWBitwQqhqql08dCjbKULfXZb1cApgc,5511
+ cuda/pathfinder/_headers/supported_nvidia_headers.py,sha256=_XK_PWvXniXj465Uopm3Kx0NrxD2dFbh9mSnUyn1C0s,3290
+ cuda/pathfinder/_utils/env_vars.py,sha256=wMPynYwhzs2Omf3zDRcedqaxFlMJFsM0r1gkwNIqC4w,1787
+ cuda/pathfinder/_utils/find_site_packages_dll.py,sha256=rZV7ylufarBmqdDk9BW1vEvuz-wYj10mISJCXzPrGU0,1002
+ cuda/pathfinder/_utils/find_site_packages_so.py,sha256=pSWtfnJiToQddFBC5hzaYQK_YBZwTQ8oMQLV3_p9A_E,1498
+ cuda/pathfinder/_utils/find_sub_dirs.py,sha256=2PwRkBcJpARP8r-p0iNpeZmijCJnSHudfzRSuKp0qr0,1883
+ cuda/pathfinder/_utils/platform_aware.py,sha256=oOCqy_VyNdEdISVM2ilde8Kl6rOeElItvsAnw0WD9b8,463
+ cuda/pathfinder/_version.py,sha256=F68ShSCFLZTYqos0Vv_q6vWAzK3RZJcQ1SaxpVqFwNM,160
+ cuda_pathfinder-1.3.2.dist-info/INSTALLER,sha256=5hhM4Q4mYTT9z6QB6PGpUAW81PGNFrYrdXMj4oM_6ak,2
+ cuda_pathfinder-1.3.2.dist-info/METADATA,sha256=poxeLLs7bHM7_lIrMY9lLgn96XwtgFOyRVarSphdoWo,1905
+ cuda_pathfinder-1.3.2.dist-info/RECORD,,
+ cuda_pathfinder-1.3.2.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ cuda_pathfinder-1.3.2.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+ cuda_pathfinder-1.3.2.dist-info/licenses/LICENSE,sha256=DVQuDIgE45qn836wDaWnYhSdxoLXgpRRKH4RuTjpRZQ,10174
+ cuda_pathfinder-1.3.2.dist-info/top_level.txt,sha256=U5vpnvwNpaJF8bl4KnoUluMDTRt0J972FipwXjgNQ3A,5
lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/REQUESTED ADDED
File without changes
lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/WHEEL ADDED
@@ -0,0 +1,5 @@
+ Wheel-Version: 1.0
+ Generator: setuptools (80.9.0)
+ Root-Is-Purelib: true
+ Tag: py3-none-any
+
lib/python3.13/site-packages/cuda_pathfinder-1.3.2.dist-info/top_level.txt ADDED
@@ -0,0 +1 @@
+ cuda
lib/python3.13/site-packages/cycler-0.12.1.dist-info/LICENSE ADDED
@@ -0,0 +1,27 @@
+ Copyright (c) 2015, matplotlib project
+ All rights reserved.
+
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are met:
+
+ * Redistributions of source code must retain the above copyright notice, this
+   list of conditions and the following disclaimer.
+
+ * Redistributions in binary form must reproduce the above copyright notice,
+   this list of conditions and the following disclaimer in the documentation
+   and/or other materials provided with the distribution.
+
+ * Neither the name of the matplotlib project nor the names of its
+   contributors may be used to endorse or promote products derived from
+   this software without specific prior written permission.
+
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+ DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+ SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+ CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+ OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
lib/python3.13/site-packages/cycler-0.12.1.dist-info/METADATA ADDED
@@ -0,0 +1,78 @@
+ Metadata-Version: 2.1
+ Name: cycler
+ Version: 0.12.1
+ Summary: Composable style cycles
+ Author-email: Thomas A Caswell <matplotlib-users@python.org>
+ License: Copyright (c) 2015, matplotlib project
+         All rights reserved.
+
+         Redistribution and use in source and binary forms, with or without
+         modification, are permitted provided that the following conditions are met:
+
+         * Redistributions of source code must retain the above copyright notice, this
+           list of conditions and the following disclaimer.
+
+         * Redistributions in binary form must reproduce the above copyright notice,
+           this list of conditions and the following disclaimer in the documentation
+           and/or other materials provided with the distribution.
+
+         * Neither the name of the matplotlib project nor the names of its
+           contributors may be used to endorse or promote products derived from
+           this software without specific prior written permission.
+
+         THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+         AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+         IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+         DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+         FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+         DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+         SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+         CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+         OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+         OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ Project-URL: homepage, https://matplotlib.org/cycler/
+ Project-URL: repository, https://github.com/matplotlib/cycler
+ Keywords: cycle kwargs
+ Classifier: License :: OSI Approved :: BSD License
+ Classifier: Development Status :: 4 - Beta
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.8
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3 :: Only
+ Requires-Python: >=3.8
+ Description-Content-Type: text/x-rst
+ License-File: LICENSE
+ Provides-Extra: docs
+ Requires-Dist: ipython ; extra == 'docs'
+ Requires-Dist: matplotlib ; extra == 'docs'
+ Requires-Dist: numpydoc ; extra == 'docs'
+ Requires-Dist: sphinx ; extra == 'docs'
+ Provides-Extra: tests
+ Requires-Dist: pytest ; extra == 'tests'
+ Requires-Dist: pytest-cov ; extra == 'tests'
+ Requires-Dist: pytest-xdist ; extra == 'tests'
+
+ |PyPi|_ |Conda|_ |Supported Python versions|_ |GitHub Actions|_ |Codecov|_
+
+ .. |PyPi| image:: https://img.shields.io/pypi/v/cycler.svg?style=flat
+ .. _PyPi: https://pypi.python.org/pypi/cycler
+
+ .. |Conda| image:: https://img.shields.io/conda/v/conda-forge/cycler
+ .. _Conda: https://anaconda.org/conda-forge/cycler
+
+ .. |Supported Python versions| image:: https://img.shields.io/pypi/pyversions/cycler.svg
+ .. _Supported Python versions: https://pypi.python.org/pypi/cycler
+
+ .. |GitHub Actions| image:: https://github.com/matplotlib/cycler/actions/workflows/tests.yml/badge.svg
+ .. _GitHub Actions: https://github.com/matplotlib/cycler/actions
+
+ .. |Codecov| image:: https://codecov.io/github/matplotlib/cycler/badge.svg?branch=main&service=github
+ .. _Codecov: https://codecov.io/github/matplotlib/cycler?branch=main
+
+ cycler: composable cycles
+ =========================
+
+ Docs: https://matplotlib.org/cycler/
lib/python3.13/site-packages/cycler-0.12.1.dist-info/RECORD ADDED
@@ -0,0 +1,9 @@
+ cycler-0.12.1.dist-info/INSTALLER,sha256=5hhM4Q4mYTT9z6QB6PGpUAW81PGNFrYrdXMj4oM_6ak,2
+ cycler-0.12.1.dist-info/LICENSE,sha256=8SGBQ9dm2j_qZvEzlrfxXfRqgzA_Kb-Wum6Y601C9Ag,1497
+ cycler-0.12.1.dist-info/METADATA,sha256=IyieGbdvHgE5Qidpbmryts0c556JcxIJv5GVFIsY7TY,3779
+ cycler-0.12.1.dist-info/RECORD,,
+ cycler-0.12.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ cycler-0.12.1.dist-info/WHEEL,sha256=yQN5g4mg4AybRjkgi-9yy4iQEFibGQmlz78Pik5Or-A,92
+ cycler-0.12.1.dist-info/top_level.txt,sha256=D8BVVDdAAelLb2FOEz7lDpc6-AL21ylKPrMhtG6yzyE,7
+ cycler/__init__.py,sha256=1JdRgv5Zzxo-W1ev7B_LWquysWP6LZH6CHk_COtIaXE,16709
+ cycler/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
lib/python3.13/site-packages/cycler-0.12.1.dist-info/REQUESTED ADDED
File without changes
lib/python3.13/site-packages/cycler-0.12.1.dist-info/WHEEL ADDED
@@ -0,0 +1,5 @@
+ Wheel-Version: 1.0
+ Generator: bdist_wheel (0.41.2)
+ Root-Is-Purelib: true
+ Tag: py3-none-any
+
lib/python3.13/site-packages/datasets-4.4.1.dist-info/AUTHORS ADDED
@@ -0,0 +1,8 @@
+ # This is the list of HuggingFace Datasets authors for copyright purposes.
+ #
+ # This does not necessarily list everyone who has contributed code, since in
+ # some cases, their employer may be the copyright holder. To see the full list
+ # of contributors, see the revision history in source control.
+
+ Google Inc.
+ HuggingFace Inc.
lib/python3.13/site-packages/datasets-4.4.1.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ uv
lib/python3.13/site-packages/datasets-4.4.1.dist-info/LICENSE ADDED
@@ -0,0 +1,202 @@
+
+                                  Apache License
+                            Version 2.0, January 2004
+                         http://www.apache.org/licenses/
+
+    TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+    1. Definitions.
+
+       "License" shall mean the terms and conditions for use, reproduction,
+       and distribution as defined by Sections 1 through 9 of this document.
+
+       "Licensor" shall mean the copyright owner or entity authorized by
+       the copyright owner that is granting the License.
+
+       "Legal Entity" shall mean the union of the acting entity and all
+       other entities that control, are controlled by, or are under common
+       control with that entity. For the purposes of this definition,
+       "control" means (i) the power, direct or indirect, to cause the
+       direction or management of such entity, whether by contract or
+       otherwise, or (ii) ownership of fifty percent (50%) or more of the
+       outstanding shares, or (iii) beneficial ownership of such entity.
+
+       "You" (or "Your") shall mean an individual or Legal Entity
+       exercising permissions granted by this License.
+
+       "Source" form shall mean the preferred form for making modifications,
+       including but not limited to software source code, documentation
+       source, and configuration files.
+
+       "Object" form shall mean any form resulting from mechanical
+       transformation or translation of a Source form, including but
+       not limited to compiled object code, generated documentation,
+       and conversions to other media types.
+
+       "Work" shall mean the work of authorship, whether in Source or
+       Object form, made available under the License, as indicated by a
+       copyright notice that is included in or attached to the work
+       (an example is provided in the Appendix below).
+
+       "Derivative Works" shall mean any work, whether in Source or Object
+       form, that is based on (or derived from) the Work and for which the
+       editorial revisions, annotations, elaborations, or other modifications
+       represent, as a whole, an original work of authorship. For the purposes
+       of this License, Derivative Works shall not include works that remain
+       separable from, or merely link (or bind by name) to the interfaces of,
+       the Work and Derivative Works thereof.
+
+       "Contribution" shall mean any work of authorship, including
+       the original version of the Work and any modifications or additions
+       to that Work or Derivative Works thereof, that is intentionally
+       submitted to Licensor for inclusion in the Work by the copyright owner
+       or by an individual or Legal Entity authorized to submit on behalf of
+       the copyright owner. For the purposes of this definition, "submitted"
+       means any form of electronic, verbal, or written communication sent
+       to the Licensor or its representatives, including but not limited to
+       communication on electronic mailing lists, source code control systems,
+       and issue tracking systems that are managed by, or on behalf of, the
+       Licensor for the purpose of discussing and improving the Work, but
+       excluding communication that is conspicuously marked or otherwise
+       designated in writing by the copyright owner as "Not a Contribution."
+
+       "Contributor" shall mean Licensor and any individual or Legal Entity
+       on behalf of whom a Contribution has been received by Licensor and
+       subsequently incorporated within the Work.
+
+    2. Grant of Copyright License. Subject to the terms and conditions of
+       this License, each Contributor hereby grants to You a perpetual,
+       worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+       copyright license to reproduce, prepare Derivative Works of,
+       publicly display, publicly perform, sublicense, and distribute the
+       Work and such Derivative Works in Source or Object form.
+
+    3. Grant of Patent License. Subject to the terms and conditions of
+       this License, each Contributor hereby grants to You a perpetual,
+       worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+       (except as stated in this section) patent license to make, have made,
+       use, offer to sell, sell, import, and otherwise transfer the Work,
+       where such license applies only to those patent claims licensable
+       by such Contributor that are necessarily infringed by their
+       Contribution(s) alone or by combination of their Contribution(s)
+       with the Work to which such Contribution(s) was submitted. If You
+       institute patent litigation against any entity (including a
+       cross-claim or counterclaim in a lawsuit) alleging that the Work
+       or a Contribution incorporated within the Work constitutes direct
+       or contributory patent infringement, then any patent licenses
+       granted to You under this License for that Work shall terminate
+       as of the date such litigation is filed.
+
+    4. Redistribution. You may reproduce and distribute copies of the
+       Work or Derivative Works thereof in any medium, with or without
+       modifications, and in Source or Object form, provided that You
+       meet the following conditions:
+
+       (a) You must give any other recipients of the Work or
+           Derivative Works a copy of this License; and
+
+       (b) You must cause any modified files to carry prominent notices
+           stating that You changed the files; and
+
+       (c) You must retain, in the Source form of any Derivative Works
+           that You distribute, all copyright, patent, trademark, and
+           attribution notices from the Source form of the Work,
+           excluding those notices that do not pertain to any part of
+           the Derivative Works; and
+
+       (d) If the Work includes a "NOTICE" text file as part of its
+           distribution, then any Derivative Works that You distribute must
+           include a readable copy of the attribution notices contained
+           within such NOTICE file, excluding those notices that do not
+           pertain to any part of the Derivative Works, in at least one
+           of the following places: within a NOTICE text file distributed
+           as part of the Derivative Works; within the Source form or
+           documentation, if provided along with the Derivative Works; or,
+           within a display generated by the Derivative Works, if and
+           wherever such third-party notices normally appear. The contents
+           of the NOTICE file are for informational purposes only and
+           do not modify the License. You may add Your own attribution
+           notices within Derivative Works that You distribute, alongside
+           or as an addendum to the NOTICE text from the Work, provided
+           that such additional attribution notices cannot be construed
+           as modifying the License.
+
+       You may add Your own copyright statement to Your modifications and
+       may provide additional or different license terms and conditions
+       for use, reproduction, or distribution of Your modifications, or
+       for any such Derivative Works as a whole, provided Your use,
+       reproduction, and distribution of the Work otherwise complies with
+       the conditions stated in this License.
+
+    5. Submission of Contributions. Unless You explicitly state otherwise,
+       any Contribution intentionally submitted for inclusion in the Work
+       by You to the Licensor shall be under the terms and conditions of
+       this License, without any additional terms or conditions.
+       Notwithstanding the above, nothing herein shall supersede or modify
+       the terms of any separate license agreement you may have executed
+       with Licensor regarding such Contributions.
+
+    6. Trademarks. This License does not grant permission to use the trade
140
+ names, trademarks, service marks, or product names of the Licensor,
141
+ except as required for reasonable and customary use in describing the
142
+ origin of the Work and reproducing the content of the NOTICE file.
143
+
144
+ 7. Disclaimer of Warranty. Unless required by applicable law or
145
+ agreed to in writing, Licensor provides the Work (and each
146
+ Contributor provides its Contributions) on an "AS IS" BASIS,
147
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
148
+ implied, including, without limitation, any warranties or conditions
149
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
150
+ PARTICULAR PURPOSE. You are solely responsible for determining the
151
+ appropriateness of using or redistributing the Work and assume any
152
+ risks associated with Your exercise of permissions under this License.
153
+
154
+ 8. Limitation of Liability. In no event and under no legal theory,
155
+ whether in tort (including negligence), contract, or otherwise,
156
+ unless required by applicable law (such as deliberate and grossly
157
+ negligent acts) or agreed to in writing, shall any Contributor be
158
+ liable to You for damages, including any direct, indirect, special,
159
+ incidental, or consequential damages of any character arising as a
160
+ result of this License or out of the use or inability to use the
161
+ Work (including but not limited to damages for loss of goodwill,
162
+ work stoppage, computer failure or malfunction, or any and all
163
+ other commercial damages or losses), even if such Contributor
164
+ has been advised of the possibility of such damages.
165
+
166
+ 9. Accepting Warranty or Additional Liability. While redistributing
167
+ the Work or Derivative Works thereof, You may choose to offer,
168
+ and charge a fee for, acceptance of support, warranty, indemnity,
169
+ or other liability obligations and/or rights consistent with this
170
+ License. However, in accepting such obligations, You may act only
171
+ on Your own behalf and on Your sole responsibility, not on behalf
172
+ of any other Contributor, and only if You agree to indemnify,
173
+ defend, and hold each Contributor harmless for any liability
174
+ incurred by, or claims asserted against, such Contributor by reason
175
+ of your accepting any such warranty or additional liability.
176
+
177
+ END OF TERMS AND CONDITIONS
178
+
179
+ APPENDIX: How to apply the Apache License to your work.
180
+
181
+ To apply the Apache License to your work, attach the following
182
+ boilerplate notice, with the fields enclosed by brackets "[]"
183
+ replaced with your own identifying information. (Don't include
184
+ the brackets!) The text should be enclosed in the appropriate
185
+ comment syntax for the file format. We also recommend that a
186
+ file or class name and description of purpose be included on the
187
+ same "printed page" as the copyright notice for easier
188
+ identification within third-party archives.
189
+
190
+ Copyright [yyyy] [name of copyright owner]
191
+
192
+ Licensed under the Apache License, Version 2.0 (the "License");
193
+ you may not use this file except in compliance with the License.
194
+ You may obtain a copy of the License at
195
+
196
+ http://www.apache.org/licenses/LICENSE-2.0
197
+
198
+ Unless required by applicable law or agreed to in writing, software
199
+ distributed under the License is distributed on an "AS IS" BASIS,
200
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
201
+ See the License for the specific language governing permissions and
202
+ limitations under the License.
lib/python3.13/site-packages/datasets-4.4.1.dist-info/METADATA ADDED
@@ -0,0 +1,375 @@
+ Metadata-Version: 2.2
+ Name: datasets
+ Version: 4.4.1
+ Summary: HuggingFace community-driven open-source library of datasets
+ Home-page: https://github.com/huggingface/datasets
+ Download-URL: https://github.com/huggingface/datasets/tags
+ Author: HuggingFace Inc.
+ Author-email: thomas@huggingface.co
+ License: Apache 2.0
+ Keywords: datasets machine learning datasets
+ Classifier: Development Status :: 5 - Production/Stable
+ Classifier: Intended Audience :: Developers
+ Classifier: Intended Audience :: Education
+ Classifier: Intended Audience :: Science/Research
+ Classifier: License :: OSI Approved :: Apache Software License
+ Classifier: Operating System :: OS Independent
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
+ Classifier: Programming Language :: Python :: 3.14
+ Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
+ Requires-Python: >=3.9.0
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ License-File: AUTHORS
+ Requires-Dist: filelock
+ Requires-Dist: numpy>=1.17
+ Requires-Dist: pyarrow>=21.0.0
+ Requires-Dist: dill<0.4.1,>=0.3.0
+ Requires-Dist: pandas
+ Requires-Dist: requests>=2.32.2
+ Requires-Dist: httpx<1.0.0
+ Requires-Dist: tqdm>=4.66.3
+ Requires-Dist: xxhash
+ Requires-Dist: multiprocess<0.70.19
+ Requires-Dist: fsspec[http]<=2025.10.0,>=2023.1.0
+ Requires-Dist: huggingface-hub<2.0,>=0.25.0
+ Requires-Dist: packaging
+ Requires-Dist: pyyaml>=5.1
+ Provides-Extra: audio
+ Requires-Dist: torchcodec>=0.6.0; extra == "audio"
+ Requires-Dist: torch>=2.8.0; extra == "audio"
+ Provides-Extra: vision
+ Requires-Dist: Pillow>=9.4.0; extra == "vision"
+ Provides-Extra: tensorflow
+ Requires-Dist: tensorflow>=2.6.0; extra == "tensorflow"
+ Provides-Extra: tensorflow-gpu
+ Requires-Dist: tensorflow>=2.6.0; extra == "tensorflow-gpu"
+ Provides-Extra: torch
+ Requires-Dist: torch; extra == "torch"
+ Provides-Extra: jax
+ Requires-Dist: jax>=0.3.14; extra == "jax"
+ Requires-Dist: jaxlib>=0.3.14; extra == "jax"
+ Provides-Extra: streaming
+ Provides-Extra: dev
+ Requires-Dist: numba>=0.56.4; python_version < "3.14" and extra == "dev"
+ Requires-Dist: absl-py; extra == "dev"
+ Requires-Dist: decorator; extra == "dev"
+ Requires-Dist: joblib<1.3.0; extra == "dev"
+ Requires-Dist: joblibspark; python_version < "3.14" and extra == "dev"
+ Requires-Dist: pytest; extra == "dev"
+ Requires-Dist: pytest-datadir; extra == "dev"
+ Requires-Dist: pytest-xdist; extra == "dev"
+ Requires-Dist: aiohttp; extra == "dev"
+ Requires-Dist: elasticsearch<8.0.0,>=7.17.12; extra == "dev"
+ Requires-Dist: faiss-cpu>=1.8.0.post1; extra == "dev"
+ Requires-Dist: h5py; extra == "dev"
+ Requires-Dist: jax>=0.3.14; sys_platform != "win32" and extra == "dev"
+ Requires-Dist: jaxlib>=0.3.14; sys_platform != "win32" and extra == "dev"
+ Requires-Dist: lz4; python_version < "3.14" and extra == "dev"
+ Requires-Dist: moto[server]; extra == "dev"
+ Requires-Dist: pyspark>=3.4; extra == "dev"
+ Requires-Dist: py7zr; extra == "dev"
+ Requires-Dist: rarfile>=4.0; extra == "dev"
+ Requires-Dist: sqlalchemy; extra == "dev"
+ Requires-Dist: protobuf<4.0.0; extra == "dev"
+ Requires-Dist: tensorflow>=2.6.0; (python_version < "3.10" and sys_platform != "win32") and extra == "dev"
+ Requires-Dist: tensorflow>=2.16.0; (python_version >= "3.10" and sys_platform != "win32" and python_version < "3.14") and extra == "dev"
+ Requires-Dist: tiktoken; extra == "dev"
+ Requires-Dist: torch>=2.8.0; extra == "dev"
+ Requires-Dist: torchdata; extra == "dev"
+ Requires-Dist: transformers>=4.42.0; extra == "dev"
+ Requires-Dist: zstandard; extra == "dev"
+ Requires-Dist: polars[timezone]>=0.20.0; extra == "dev"
+ Requires-Dist: Pillow>=9.4.0; extra == "dev"
+ Requires-Dist: torchcodec>=0.7.0; python_version < "3.14" and extra == "dev"
+ Requires-Dist: nibabel>=5.3.1; extra == "dev"
+ Requires-Dist: ruff>=0.3.0; extra == "dev"
+ Requires-Dist: transformers; extra == "dev"
+ Requires-Dist: torch; extra == "dev"
+ Requires-Dist: tensorflow>=2.6.0; extra == "dev"
+ Provides-Extra: tests
+ Requires-Dist: numba>=0.56.4; python_version < "3.14" and extra == "tests"
+ Requires-Dist: absl-py; extra == "tests"
+ Requires-Dist: decorator; extra == "tests"
+ Requires-Dist: joblib<1.3.0; extra == "tests"
+ Requires-Dist: joblibspark; python_version < "3.14" and extra == "tests"
+ Requires-Dist: pytest; extra == "tests"
+ Requires-Dist: pytest-datadir; extra == "tests"
+ Requires-Dist: pytest-xdist; extra == "tests"
+ Requires-Dist: aiohttp; extra == "tests"
+ Requires-Dist: elasticsearch<8.0.0,>=7.17.12; extra == "tests"
+ Requires-Dist: faiss-cpu>=1.8.0.post1; extra == "tests"
+ Requires-Dist: h5py; extra == "tests"
+ Requires-Dist: jax>=0.3.14; sys_platform != "win32" and extra == "tests"
+ Requires-Dist: jaxlib>=0.3.14; sys_platform != "win32" and extra == "tests"
+ Requires-Dist: lz4; python_version < "3.14" and extra == "tests"
+ Requires-Dist: moto[server]; extra == "tests"
+ Requires-Dist: pyspark>=3.4; extra == "tests"
+ Requires-Dist: py7zr; extra == "tests"
+ Requires-Dist: rarfile>=4.0; extra == "tests"
+ Requires-Dist: sqlalchemy; extra == "tests"
+ Requires-Dist: protobuf<4.0.0; extra == "tests"
+ Requires-Dist: tensorflow>=2.6.0; (python_version < "3.10" and sys_platform != "win32") and extra == "tests"
+ Requires-Dist: tensorflow>=2.16.0; (python_version >= "3.10" and sys_platform != "win32" and python_version < "3.14") and extra == "tests"
+ Requires-Dist: tiktoken; extra == "tests"
+ Requires-Dist: torch>=2.8.0; extra == "tests"
+ Requires-Dist: torchdata; extra == "tests"
+ Requires-Dist: transformers>=4.42.0; extra == "tests"
+ Requires-Dist: zstandard; extra == "tests"
+ Requires-Dist: polars[timezone]>=0.20.0; extra == "tests"
+ Requires-Dist: Pillow>=9.4.0; extra == "tests"
+ Requires-Dist: torchcodec>=0.7.0; python_version < "3.14" and extra == "tests"
+ Requires-Dist: nibabel>=5.3.1; extra == "tests"
+ Provides-Extra: tests-numpy2
+ Requires-Dist: numba>=0.56.4; python_version < "3.14" and extra == "tests-numpy2"
+ Requires-Dist: absl-py; extra == "tests-numpy2"
+ Requires-Dist: decorator; extra == "tests-numpy2"
+ Requires-Dist: joblib<1.3.0; extra == "tests-numpy2"
+ Requires-Dist: joblibspark; python_version < "3.14" and extra == "tests-numpy2"
+ Requires-Dist: pytest; extra == "tests-numpy2"
+ Requires-Dist: pytest-datadir; extra == "tests-numpy2"
+ Requires-Dist: pytest-xdist; extra == "tests-numpy2"
+ Requires-Dist: aiohttp; extra == "tests-numpy2"
+ Requires-Dist: elasticsearch<8.0.0,>=7.17.12; extra == "tests-numpy2"
+ Requires-Dist: h5py; extra == "tests-numpy2"
+ Requires-Dist: jax>=0.3.14; sys_platform != "win32" and extra == "tests-numpy2"
+ Requires-Dist: jaxlib>=0.3.14; sys_platform != "win32" and extra == "tests-numpy2"
+ Requires-Dist: lz4; python_version < "3.14" and extra == "tests-numpy2"
+ Requires-Dist: moto[server]; extra == "tests-numpy2"
+ Requires-Dist: pyspark>=3.4; extra == "tests-numpy2"
+ Requires-Dist: py7zr; extra == "tests-numpy2"
+ Requires-Dist: rarfile>=4.0; extra == "tests-numpy2"
+ Requires-Dist: sqlalchemy; extra == "tests-numpy2"
+ Requires-Dist: protobuf<4.0.0; extra == "tests-numpy2"
+ Requires-Dist: tiktoken; extra == "tests-numpy2"
+ Requires-Dist: torch>=2.8.0; extra == "tests-numpy2"
+ Requires-Dist: torchdata; extra == "tests-numpy2"
+ Requires-Dist: transformers>=4.42.0; extra == "tests-numpy2"
+ Requires-Dist: zstandard; extra == "tests-numpy2"
+ Requires-Dist: polars[timezone]>=0.20.0; extra == "tests-numpy2"
+ Requires-Dist: Pillow>=9.4.0; extra == "tests-numpy2"
+ Requires-Dist: torchcodec>=0.7.0; python_version < "3.14" and extra == "tests-numpy2"
+ Requires-Dist: nibabel>=5.3.1; extra == "tests-numpy2"
+ Provides-Extra: quality
+ Requires-Dist: ruff>=0.3.0; extra == "quality"
+ Provides-Extra: benchmarks
+ Requires-Dist: tensorflow==2.12.0; extra == "benchmarks"
+ Requires-Dist: torch==2.0.1; extra == "benchmarks"
+ Requires-Dist: transformers==4.30.1; extra == "benchmarks"
+ Provides-Extra: docs
+ Requires-Dist: transformers; extra == "docs"
+ Requires-Dist: torch; extra == "docs"
+ Requires-Dist: tensorflow>=2.6.0; extra == "docs"
+ Provides-Extra: pdfs
+ Requires-Dist: pdfplumber>=0.11.4; extra == "pdfs"
+ Provides-Extra: nibabel
+ Requires-Dist: nibabel>=5.3.2; extra == "nibabel"
+ Dynamic: author
+ Dynamic: author-email
+ Dynamic: classifier
+ Dynamic: description
+ Dynamic: description-content-type
+ Dynamic: download-url
+ Dynamic: home-page
+ Dynamic: keywords
+ Dynamic: license
+ Dynamic: provides-extra
+ Dynamic: requires-dist
+ Dynamic: requires-python
+ Dynamic: summary
+
+ <p align="center">
+ <picture>
+ <source media="(prefers-color-scheme: dark)" srcset="https://huggingface.co/datasets/huggingface/documentation-images/raw/main/datasets-logo-dark.svg">
+ <source media="(prefers-color-scheme: light)" srcset="https://huggingface.co/datasets/huggingface/documentation-images/raw/main/datasets-logo-light.svg">
+ <img alt="Hugging Face Datasets Library" src="https://huggingface.co/datasets/huggingface/documentation-images/raw/main/datasets-logo-light.svg" width="352" height="59" style="max-width: 100%;">
+ </picture>
+ <br/>
+ <br/>
+ </p>
+
+ <p align="center">
+ <a href="https://github.com/huggingface/datasets/actions/workflows/ci.yml?query=branch%3Amain"><img alt="Build" src="https://github.com/huggingface/datasets/actions/workflows/ci.yml/badge.svg?branch=main"></a>
+ <a href="https://github.com/huggingface/datasets/blob/main/LICENSE"><img alt="GitHub" src="https://img.shields.io/github/license/huggingface/datasets.svg?color=blue"></a>
+ <a href="https://huggingface.co/docs/datasets/index.html"><img alt="Documentation" src="https://img.shields.io/website/http/huggingface.co/docs/datasets/index.html.svg?down_color=red&down_message=offline&up_message=online"></a>
+ <a href="https://github.com/huggingface/datasets/releases"><img alt="GitHub release" src="https://img.shields.io/github/release/huggingface/datasets.svg"></a>
+ <a href="https://huggingface.co/datasets/"><img alt="Number of datasets" src="https://img.shields.io/endpoint?url=https://huggingface.co/api/shields/datasets&color=brightgreen"></a>
+ <a href="CODE_OF_CONDUCT.md"><img alt="Contributor Covenant" src="https://img.shields.io/badge/Contributor%20Covenant-2.0-4baaaa.svg"></a>
+ <a href="https://zenodo.org/badge/latestdoi/250213286"><img src="https://zenodo.org/badge/250213286.svg" alt="DOI"></a>
+ </p>
+
+ 🤗 Datasets is a lightweight library providing **two** main features:
+
+ - **one-line dataloaders for many public datasets**: one-liners to download and pre-process any of the ![number of datasets](https://img.shields.io/endpoint?url=https://huggingface.co/api/shields/datasets&color=brightgreen) major public datasets (image datasets, audio datasets, text datasets in 467 languages and dialects, etc.) provided on the [HuggingFace Datasets Hub](https://huggingface.co/datasets). With a simple command like `squad_dataset = load_dataset("rajpurkar/squad")`, get any of these datasets ready to use in a dataloader for training/evaluating an ML model (Numpy/Pandas/PyTorch/TensorFlow/JAX),
+ - **efficient data pre-processing**: simple, fast and reproducible data pre-processing for the public datasets as well as your own local datasets in CSV, JSON, text, PNG, JPEG, WAV, MP3, Parquet, HDF5, etc. With simple commands like `processed_dataset = dataset.map(process_example)`, efficiently prepare the dataset for inspection and ML model evaluation and training.
+
+ [🎓 **Documentation**](https://huggingface.co/docs/datasets/) [🔎 **Find a dataset in the Hub**](https://huggingface.co/datasets) [🌟 **Share a dataset on the Hub**](https://huggingface.co/docs/datasets/share)
+
+ <h3 align="center">
+ <a href="https://hf.co/course"><img src="https://raw.githubusercontent.com/huggingface/datasets/main/docs/source/imgs/course_banner.png"></a>
+ </h3>
+
+ 🤗 Datasets is designed to let the community easily add and share new datasets.
+
+ 🤗 Datasets has many additional interesting features:
+
+ - Thrive on large datasets: 🤗 Datasets naturally frees the user from RAM limitations; all datasets are memory-mapped using an efficient zero-serialization-cost backend (Apache Arrow).
+ - Smart caching: never wait for your data to be processed several times.
+ - Lightweight and fast with a transparent and pythonic API (multi-processing/caching/memory-mapping).
+ - Built-in interoperability with NumPy, PyTorch, TensorFlow 2, JAX, Pandas, Polars and more.
+ - Native support for audio, image and video data.
+ - Enable streaming mode to save disk space and start iterating over the dataset immediately.
+
+ 🤗 Datasets originated from a fork of the awesome [TensorFlow Datasets](https://github.com/tensorflow/datasets) and the HuggingFace team wants to deeply thank the TensorFlow Datasets team for building this amazing library.
+
+ # Installation
+
+ ## With pip
+
+ 🤗 Datasets can be installed from PyPI and should be installed in a virtual environment (venv or conda, for instance):
+
+ ```bash
+ pip install datasets
+ ```
+
+ ## With conda
+
+ 🤗 Datasets can be installed using conda as follows:
+
+ ```bash
+ conda install -c huggingface -c conda-forge datasets
+ ```
+
+ Follow the installation pages of TensorFlow and PyTorch to see how to install them with conda.
+
+ For more details on installation, check the installation page in the documentation: https://huggingface.co/docs/datasets/installation
+
+ ## Installation to use with Machine Learning & Data frameworks
+
+ If you plan to use 🤗 Datasets with PyTorch (2.0+), TensorFlow (2.6+) or JAX (3.14+), you should also install PyTorch, TensorFlow or JAX.
+ 🤗 Datasets is also well integrated with data frameworks like PyArrow, Pandas, Polars and Spark, which should be installed separately.
+
+ For more details on using the library with these frameworks, check the quick start page in the documentation: https://huggingface.co/docs/datasets/quickstart
+
+ # Usage
+
+ 🤗 Datasets is made to be very simple to use - the API is centered around a single function, `datasets.load_dataset(dataset_name, **kwargs)`, that instantiates a dataset.
+
+ This library can be used for text/image/audio/etc. datasets. Here is a quick example loading a text dataset:
+
+ ```python
+ from datasets import load_dataset
+
+ # Print all the available datasets
+ from huggingface_hub import list_datasets
+ print([dataset.id for dataset in list_datasets()])
+
+ # Load a dataset and print the first example in the training set
+ squad_dataset = load_dataset('rajpurkar/squad')
+ print(squad_dataset['train'][0])
+
+ # Process the dataset - add a column with the length of the context texts
+ dataset_with_length = squad_dataset.map(lambda x: {"length": len(x["context"])})
+
+ # Process the dataset - tokenize the context texts (using a tokenizer from the 🤗 Transformers library)
+ from transformers import AutoTokenizer
+ tokenizer = AutoTokenizer.from_pretrained('bert-base-cased')
+
+ tokenized_dataset = squad_dataset.map(lambda x: tokenizer(x['context']), batched=True)
+ ```
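Conceptually, `map` applies a function to every example and merges the returned columns back in. A minimal plain-Python sketch of that pattern (using a hypothetical `map_examples` helper and toy data, not the real Arrow-backed `Dataset` class):

```python
# Sketch of the map pattern: apply a function to each example dict and
# merge the returned columns in. The real datasets.Dataset does this on
# Arrow-backed, memory-mapped storage with result caching.
def map_examples(examples, fn):
    return [{**ex, **fn(ex)} for ex in examples]

# Hypothetical toy data standing in for a loaded dataset split
toy_split = [
    {"context": "The quick brown fox"},
    {"context": "jumps"},
]
with_length = map_examples(toy_split, lambda ex: {"length": len(ex["context"])})
print(with_length[0]["length"])  # -> 19
```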
+
+ If your dataset is bigger than your disk or if you don't want to wait to download the data, you can use streaming:
+
+ ```python
+ # If you want to use the dataset immediately and efficiently stream the data as you iterate over the dataset
+ image_dataset = load_dataset('timm/imagenet-1k-wds', streaming=True)
+ for example in image_dataset["train"]:
+     break
+ ```
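A streaming dataset behaves like a lazy iterator: examples are produced on demand instead of being downloaded and materialized up front. A plain-Python sketch of that idea with a generator (the `stream_examples` helper is hypothetical, not the datasets API):

```python
def stream_examples(n):
    # Yield examples one at a time, simulating on-demand production:
    # nothing is materialized until the consumer asks for it.
    for i in range(n):
        yield {"id": i}

stream = stream_examples(10**9)   # cheap: no data produced yet
first = next(iter(stream))        # only the first example is computed
print(first)  # -> {'id': 0}
```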
+
+ For more details on using the library, check the quick start page in the documentation: https://huggingface.co/docs/datasets/quickstart and the specific pages on:
+
+ - Loading a dataset: https://huggingface.co/docs/datasets/loading
+ - What's in a Dataset: https://huggingface.co/docs/datasets/access
+ - Processing data with 🤗 Datasets: https://huggingface.co/docs/datasets/process
+ - Processing audio data: https://huggingface.co/docs/datasets/audio_process
+ - Processing image data: https://huggingface.co/docs/datasets/image_process
+ - Processing text data: https://huggingface.co/docs/datasets/nlp_process
+ - Streaming a dataset: https://huggingface.co/docs/datasets/stream
+ - etc.
+
+ # Add a new dataset to the Hub
+
+ We have a very detailed step-by-step guide to add a new dataset to the ![number of datasets](https://img.shields.io/endpoint?url=https://huggingface.co/api/shields/datasets&color=brightgreen) datasets already provided on the [HuggingFace Datasets Hub](https://huggingface.co/datasets).
+
+ You can find:
+ - [how to upload a dataset to the Hub using your web browser or Python](https://huggingface.co/docs/datasets/upload_dataset) and also
+ - [how to upload it using Git](https://huggingface.co/docs/datasets/share).
+
+ # Disclaimers
+
+ You can use 🤗 Datasets to load datasets based on versioned git repositories maintained by the dataset authors. For reproducibility reasons, we ask users to pin the `revision` of the repositories they use.
+
+ If you're a dataset owner and wish to update any part of it (description, citation, license, etc.), or do not want your dataset to be included in the Hugging Face Hub, please get in touch by opening a discussion or a pull request in the Community tab of the dataset page. Thanks for your contribution to the ML community!
+
+ ## BibTeX
+
+ If you want to cite our 🤗 Datasets library, you can use our [paper](https://arxiv.org/abs/2109.02846):
+
+ ```bibtex
+ @inproceedings{lhoest-etal-2021-datasets,
+     title = "Datasets: A Community Library for Natural Language Processing",
+     author = "Lhoest, Quentin and
+       Villanova del Moral, Albert and
+       Jernite, Yacine and
+       Thakur, Abhishek and
+       von Platen, Patrick and
+       Patil, Suraj and
+       Chaumond, Julien and
+       Drame, Mariama and
+       Plu, Julien and
+       Tunstall, Lewis and
+       Davison, Joe and
+       {\v{S}}a{\v{s}}ko, Mario and
+       Chhablani, Gunjan and
+       Malik, Bhavitvya and
+       Brandeis, Simon and
+       Le Scao, Teven and
+       Sanh, Victor and
+       Xu, Canwen and
+       Patry, Nicolas and
+       McMillan-Major, Angelina and
+       Schmid, Philipp and
+       Gugger, Sylvain and
+       Delangue, Cl{\'e}ment and
+       Matussi{\`e}re, Th{\'e}o and
+       Debut, Lysandre and
+       Bekman, Stas and
+       Cistac, Pierric and
+       Goehringer, Thibault and
+       Mustar, Victor and
+       Lagunas, Fran{\c{c}}ois and
+       Rush, Alexander and
+       Wolf, Thomas",
+     booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
+     month = nov,
+     year = "2021",
+     address = "Online and Punta Cana, Dominican Republic",
+     publisher = "Association for Computational Linguistics",
+     url = "https://aclanthology.org/2021.emnlp-demo.21",
+     pages = "175--184",
+     abstract = "The scale, variety, and quantity of publicly-available NLP datasets has grown rapidly as researchers propose new tasks, larger models, and novel benchmarks. Datasets is a community library for contemporary NLP designed to support this ecosystem. Datasets aims to standardize end-user interfaces, versioning, and documentation, while providing a lightweight front-end that behaves similarly for small datasets as for internet-scale corpora. The design of the library incorporates a distributed, community-driven approach to adding datasets and documenting usage. After a year of development, the library now includes more than 650 unique datasets, has more than 250 contributors, and has helped support a variety of novel cross-dataset research projects and shared tasks. The library is available at https://github.com/huggingface/datasets.",
+     eprint = {2109.02846},
+     archivePrefix = {arXiv},
+     primaryClass = {cs.CL},
+ }
+ ```
+
+ If you need to cite a specific version of our 🤗 Datasets library for reproducibility, you can use the corresponding version Zenodo DOI from this [list](https://zenodo.org/search?q=conceptrecid:%224817768%22&sort=-version&all_versions=True).
lib/python3.13/site-packages/datasets-4.4.1.dist-info/RECORD ADDED
@@ -0,0 +1,140 @@
+ ../../../bin/datasets-cli,sha256=fPSBmke1bkm9nO1ckm0Z4Lr0DvLglPXHg3hWDZpJvwA,347
+ datasets-4.4.1.dist-info/AUTHORS,sha256=L0FBY23tCNHLmvsOKAbumHn8WZZIK98sH53JYxhAchU,327
+ datasets-4.4.1.dist-info/INSTALLER,sha256=5hhM4Q4mYTT9z6QB6PGpUAW81PGNFrYrdXMj4oM_6ak,2
+ datasets-4.4.1.dist-info/LICENSE,sha256=z8d0m5b2O9McPEK1xHG_dWgUBT6EfBDz6wA0F7xSPTA,11358
+ datasets-4.4.1.dist-info/METADATA,sha256=RUEdOKdx7SFIYVBiCtIVbyHyeXZ6vxaKjYL08llFZaw,19764
+ datasets-4.4.1.dist-info/RECORD,,
+ datasets-4.4.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ datasets-4.4.1.dist-info/WHEEL,sha256=In9FTNxeP60KnTkGw7wk6mJPYd_dQSjEZmXdBdMCI-8,91
+ datasets-4.4.1.dist-info/entry_points.txt,sha256=iM-h4A7OQCrZqr3L2mwiyMtPeFj8w4HAHzmI45y3tg0,69
+ datasets-4.4.1.dist-info/top_level.txt,sha256=9A857YvCQm_Dg3UjeKkWPz9sDBos0t3zN2pf5krTemQ,9
+ datasets/__init__.py,sha256=uQHgHsFq7zM2-XawTxvpiLFh7USPy8cRDgR2DlnVT4M,1630
+ datasets/arrow_dataset.py,sha256=WW5Auw2jgj-Zy4rZalRCgOsevASc7fqYeVfKNLemNgI,316758
+ datasets/arrow_reader.py,sha256=byEDpH_SwzjbnVqW0pXjBaE1NoXFE4Cmwoz7zTB_IvA,25131
+ datasets/arrow_writer.py,sha256=5f0mgbIbvFYBKAP3knoFaIQTk6WD-gDp84d2mPwjs1w,34913
+ datasets/builder.py,sha256=s_mYd7dVd9Fm5xcwNEOWbvXUKpJrDeoryOqDivXsN_8,87822
+ datasets/combine.py,sha256=HurdKaOMz8bQMNkmXxBkBgVALA8EpNQUAqR7iVkLnQk,11213
+ datasets/commands/__init__.py,sha256=rujbQtxJbwHhF9WQqp2DD9tfVTghDMJdl0v6H551Pcs,312
+ datasets/commands/datasets_cli.py,sha256=CRy2H60h2sxJDpfa4LZ4dxipdGIZlTL47pEXLV6tfwQ,1175
+ datasets/commands/delete_from_hub.py,sha256=o0wdolb1r1Jnl6F0KdqKn3u0l8VR2od6KzbRoqrSNPM,1396
+ datasets/commands/env.py,sha256=8qg-hpXSXXsHvtYFvJkn5rn9IncqPsjjx3nR8no4a2I,1239
+ datasets/commands/test.py,sha256=LEcbsx_zEl15679i2BPXFuvfdPsFGOogVR0rjBkf3_k,7820
+ datasets/config.py,sha256=fNBmbPUt0xiJf3VRF9WzgZtUfC4YWALNUHb8gnQDF-w,10408
+ datasets/data_files.py,sha256=vDMrILEt5OgoMyTW_m0XlwJum0DSlnewuxwzRzbHAFI,32163
+ datasets/dataset_dict.py,sha256=J8LQHfUlkU53Nn_pGxPjCrLHks8xWcIyRuTMLfC-qj0,135158
+ datasets/distributed.py,sha256=pulXFluRCmo69KeDqblPz32avS6LCHTGycS77XgI2mY,1562
+ datasets/download/__init__.py,sha256=lbFOtITDaR7PHrhzJ8VfRnpaOT6NYozSxUcLv_GVfTg,281
+ datasets/download/download_config.py,sha256=ODHFej2H-JIqFllklUFkiF4ILJ62l8clAGk13b8Ru3U,3796
+ datasets/download/download_manager.py,sha256=44VSuSzIMJoZ-bDa3uF494jio5JmZFMeGAPzuXYRA7Q,12762
+ datasets/download/streaming_download_manager.py,sha256=qvcoVsXnAGNi2lzKRktck_DJrIx1fQ7xedm881s0IQw,7537
+ datasets/exceptions.py,sha256=B93GwElhEvlhHPU9GBSY8if27jhRwu875-gL6B2CL6c,4185
+ datasets/features/__init__.py,sha256=cjbWpEW5nzjLYtCsTbICeVtthvAlrikobbeEJ_X85Mc,585
+ datasets/features/_torchcodec.py,sha256=Ws7JMYlUlPa7NHh1ZgxWQNrJV0c14kfYYCdrLjCkmjA,627
+ datasets/features/audio.py,sha256=led2AAsM9P7mZtEIKmzNwtFYdIBD35-wOowRTrgnOu4,14647
+ datasets/features/features.py,sha256=aYDYzFi1f0QH_nYmbBc_CXUvTFUYs0ie51jglEfnTGQ,94039
+ datasets/features/image.py,sha256=Hgy5DL-IgX0sHl_n9sNWphrwJJQA9z-ULNw4Uu9OOuk,16940
+ datasets/features/nifti.py,sha256=XByqtyA1CNC7UTgDlIgGrg7cRflnvgZhkr9o5LQUSEs,9768
+ datasets/features/pdf.py,sha256=3gWOD9kDsICWuKU1rSGQnS_l1T6cNfira5hLDI7GfKw,11141
+ datasets/features/translation.py,sha256=aIJfNMXTTQLamEk4L8mfTDDdyzscZUnhSPAor8RjE_8,4490
+ datasets/features/video.py,sha256=FgUOjqQT27ZD2-z13j6i6PEDkW__p8bePfMoYZb1hQ0,14053
+ datasets/filesystems/__init__.py,sha256=jBDUQosQqEFIXUDLZwRWaTgNomwL6Fq2qiYPvvxuae0,1523
+ datasets/filesystems/compression.py,sha256=J16G_P0F9SxVfsLy9XLesyQJgvp4kpaO9MAihokEUvY,4528
+ datasets/fingerprint.py,sha256=rPZzuQA9O3xe9pqxR13H1dgOGg7mbWVTiOoBYPDgfgs,20330
+ datasets/formatting/__init__.py,sha256=-pM10fSzw4MVj_L3NFWEv2sUyBh4mbnvCkfXgfS6WII,5412
+ datasets/formatting/formatting.py,sha256=8zMv7dfTyyEDvzNjKDzT7Eb2_fn2n6OknfGmYK_0-MM,26626
+ datasets/formatting/jax_formatter.py,sha256=uwckTeHc5DMt8CuqieBzSrfWQuhqaW3X076HaPxxKoY,7412
+ datasets/formatting/np_formatter.py,sha256=gmp76JnzjCaIZTvcZUzsGp-vvIjFMqXEUArMU-JisCw,5102
+ datasets/formatting/polars_formatter.py,sha256=oTm4l30SgGha-Oku42C0dA91Y8f2oifF9aWvi3QITDk,4744
+ datasets/formatting/tf_formatter.py,sha256=_wnRRFH1Q5uzCK8mG8qAfdAVybGGuljbGUe46hmMTrU,5236
+ datasets/formatting/torch_formatter.py,sha256=0VnHF8jC7KNjr7Ww9NHEnZDqeKKODiZBs8e6eMOzLDA,11623
+ datasets/hub.py,sha256=6qnJVVTdEIShaUGlHrRjFJwvnJiNQuzsU01C8ID-8lA,4822
+ datasets/info.py,sha256=AdNB1CcWOag6SfoR0IM7-grZbIYNPG_N1msl5ccJnq8,19642
+ datasets/inspect.py,sha256=nC0X--w_RXZflkhhr729MJoP_4i929ith9QwfoaDF4M,15647
+ datasets/io/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ datasets/io/abc.py,sha256=LwDMXYs6YkhZuz1JiMK4PDIqgNjv7I8xH3UMUELW2ys,1672
+ datasets/io/csv.py,sha256=v4zaWehHb9U3njbdhy7wQnb8qO_c_58XOUC9JgBBVwI,5265
+ datasets/io/generator.py,sha256=GLagjjxYSlfxWVHK5KZdEW6V3izo9KkQQkf03qpjkfs,2165
+ datasets/io/json.py,sha256=vQZT9vhTbKX5Nyob4zQZR1NXWCft7bT5_6_8DD4XZyo,6697
+ datasets/io/parquet.py,sha256=foFbhZzJr8VJH2Mctxz0xHR72BTOMyQcTJDPryw5qng,5388
+ datasets/io/spark.py,sha256=VUIODLHgIbiK0CI0UvthQ_gUO0MQDtHUozvw7Dfs8FI,1797
+ datasets/io/sql.py,sha256=4Zjw7peVEhhzoDtz2VTCFPqt2Tpy4zMB7T7ajb2GVTY,4234
+ datasets/io/text.py,sha256=bebEzXBSGC40_Gy94j9ZTJ7Hg0IfrV_4pnIUEhQZVig,1975
+ datasets/iterable_dataset.py,sha256=g-EcofKnrIfq94AR_F8LS3SI4FpivzJ2GLE514_nZ4U,211812
+ datasets/keyhash.py,sha256=4bqtuEHHlof2BBJIydN2s6N7--wJg54DXgsgzbtbNzA,3896
+ datasets/load.py,sha256=zarzIfFh0UXAw69z0qov4QWGfOicNNLugiHvWtxaIYg,66949
+ datasets/naming.py,sha256=aqQqYG4QR8YoxJJMAUyVv_oQyudm4WAApsEHvcozpNg,3001
+ datasets/naming.py,sha256=aqQqYG4QR8YoxJJMAUyVv_oQyudm4WAApsEHvcozpNg,3001
66
+ datasets/packaged_modules/__init__.py,sha256=GRDBfCfxpVsJ_4WHVE4aC9t5Ws8Uo_t1FEhs5qqsclM,5822
67
+ datasets/packaged_modules/arrow/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
68
+ datasets/packaged_modules/arrow/arrow.py,sha256=lkadNXfBbJMQNDw-tK4B4Y1KJR5G-J6aAn9I9jHiLWY,3494
69
+ datasets/packaged_modules/audiofolder/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
70
+ datasets/packaged_modules/audiofolder/audiofolder.py,sha256=N4mOZypp8oTI-9FBSeEFE-oQ23U6ZmqPlFcqbUkviA8,1744
71
+ datasets/packaged_modules/cache/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
72
+ datasets/packaged_modules/cache/cache.py,sha256=sjQDBHJUeLU1U9PUK179BHfn8dHNA2RoudCWeIAv8p8,8196
73
+ datasets/packaged_modules/csv/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
74
+ datasets/packaged_modules/csv/csv.py,sha256=4LShCsr9o4YY0C-n4V37L01u2_2qithYrswSp1WMsRU,8568
75
+ datasets/packaged_modules/folder_based_builder/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
76
+ datasets/packaged_modules/folder_based_builder/folder_based_builder.py,sha256=ryTimqZbh9_reC9sjR0Hl6Ww6AbJy9RrX1ijB-qPnGU,21509
77
+ datasets/packaged_modules/generator/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
78
+ datasets/packaged_modules/generator/generator.py,sha256=Oke-26QOyDRkGfmIARqSXDqOJW0sIDjboYCwWSHsbdQ,1002
79
+ datasets/packaged_modules/hdf5/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
80
+ datasets/packaged_modules/hdf5/hdf5.py,sha256=QqzKr53qsD9XwqCGjrpWcRBnnwmpmtI-efaRV0DPSMY,12761
81
+ datasets/packaged_modules/imagefolder/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
82
+ datasets/packaged_modules/imagefolder/imagefolder.py,sha256=UpMVe8TUyayzHsVSfKN5wiXcc94QdamMvxauI4oFdw4,1956
83
+ datasets/packaged_modules/json/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
84
+ datasets/packaged_modules/json/json.py,sha256=xGcVS6AlQTTsjqyVC5bF_yKj012eqNnzudOYTXx2Ixw,9466
85
+ datasets/packaged_modules/niftifolder/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
86
+ datasets/packaged_modules/niftifolder/niftifolder.py,sha256=b57h90uCGJ4GWelGbAvSonSi8TnEM2t-tdCEFIa_c6k,586
87
+ datasets/packaged_modules/pandas/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
88
+ datasets/packaged_modules/pandas/pandas.py,sha256=eR0B5iGOHZ1owzezYmlvx5U_rWblmlpCt_PdC5Ax59E,2547
89
+ datasets/packaged_modules/parquet/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
90
+ datasets/packaged_modules/parquet/parquet.py,sha256=gjLURvIpfIEuWUB-9z0Ago9yPdGRyJSOTD9W25X6OqA,9015
91
+ datasets/packaged_modules/pdffolder/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
92
+ datasets/packaged_modules/pdffolder/pdffolder.py,sha256=bPYBh9-XOr2C-gg_Fl8h-UKhsVQ7VXjBL2FfW8abiGU,565
93
+ datasets/packaged_modules/spark/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
94
+ datasets/packaged_modules/spark/spark.py,sha256=UKu4mRB3k0EFb-Ij83eXpzr7VjCYn_TohQconF8Npag,14689
95
+ datasets/packaged_modules/sql/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
96
+ datasets/packaged_modules/sql/sql.py,sha256=0WWm-Xfputk2_QRCVrbKDbZAqZNHxOGdUwfX__4F5E0,4495
97
+ datasets/packaged_modules/text/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
98
+ datasets/packaged_modules/text/text.py,sha256=VOJVHkmy4Vm53nspW7QboCkPxd1S0M0uEzun5v8rzUE,5516
99
+ datasets/packaged_modules/videofolder/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
100
+ datasets/packaged_modules/videofolder/videofolder.py,sha256=HLTMldDZ3WfK8OAbI2wssBuNCP6ucRBpNLpCoJVDL10,807
101
+ datasets/packaged_modules/webdataset/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
102
+ datasets/packaged_modules/webdataset/_tenbin.py,sha256=oovYsgR2R3eXSn1xSCLG3oTly1szKDP4UOiRp4ORdIk,8533
103
+ datasets/packaged_modules/webdataset/webdataset.py,sha256=CNoJjGkQvC457cg6N-eeao0myKMiRzi-x087XZzoeBE,10464
104
+ datasets/packaged_modules/xml/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
105
+ datasets/packaged_modules/xml/xml.py,sha256=av0HcLQnKl5d1yM0jfBqVhw9EbzqmO_RsHDfa5pkvx4,2822
106
+ datasets/parallel/__init__.py,sha256=wiRFK4x67ez2vvmjwM2Sb9R1yFdf38laSarU9y0Bido,76
107
+ datasets/parallel/parallel.py,sha256=E-oOQ6zwKrkLFPwZ-3EOcr_aANJDhE-d6QTq7Mp7WvA,4738
108
+ datasets/search.py,sha256=PxveDc1hy2Yux7Gtvat1_Qe_0D6aIKQBWI5KbV0gS6I,35592
109
+ datasets/splits.py,sha256=zZO9vPnbfzfxXQG8LSAQkajXV7TGB2kEwOWrQxPFQbI,23430
110
+ datasets/streaming.py,sha256=7MIamZ2NmReUmJ_2pxgdSopIf7Oh5nFFAbb7WHLuW7E,5772
111
+ datasets/table.py,sha256=8EpNo6Q6HMp2kkKITHUobwaZIJTfXZ4o4e47fzHXij0,93972
112
+ datasets/utils/__init__.py,sha256=PuZtB9YTbRyvdwubnsx-JGdHuMA7p0I0Rmh0E_uxYF0,999
113
+ datasets/utils/_dataset_viewer.py,sha256=SrE1N18S5yCoCx0rAhwaHNDVS9uhxjspA84iNT4TFRw,4397
114
+ datasets/utils/_dill.py,sha256=fPBTvK8yif0Yoxdp-a6vssExfLm7-3usD-xA3ai-N_g,17550
115
+ datasets/utils/_filelock.py,sha256=iXW3bxsIr5JWNemhKtF_-q_0ysajkUTItzMm8LY9LBY,2355
116
+ datasets/utils/deprecation_utils.py,sha256=hTHwlzRs92NfNVudH71LMpW70sjbsP5amebrIgi3A-U,3452
117
+ datasets/utils/doc_utils.py,sha256=HoSm0TFaQaCYGfDgNhpBJ4Xc2WQZuOD6dTxLd9D87fs,407
118
+ datasets/utils/experimental.py,sha256=JgOjaEY3RWZ--3u0-ry82gLCDUpudfBfl-hWZ46SyS4,1097
119
+ datasets/utils/extract.py,sha256=kKMAujtg5FOK91MBXyWl6FFHZStEPn8WkOE7Jmo2Iq4,13021
120
+ datasets/utils/file_utils.py,sha256=DcNkvq-UtK6UvxAOeRDFA3XI-isx6KIKaUMxI7QDKNA,55060
121
+ datasets/utils/filelock.py,sha256=H6C5dQGFCzVKyeDRRY8fZ4YGTEvvNd-MTjpL_sWYb5k,352
122
+ datasets/utils/hub.py,sha256=V2JGolL5VjFT0YiEhI0sxJED_9tGdvma7lH22d64S9I,130
123
+ datasets/utils/info_utils.py,sha256=gAzubjnQbE0YTzB3hf3Cipmx5wCBtOje3fPwjYdzVBE,4330
124
+ datasets/utils/logging.py,sha256=c2g1gl3IV4C7A2-ky0yfrHZuYi4P41HRTdS3XCmdMew,5381
125
+ datasets/utils/metadata.py,sha256=Hrmn8xUoEzwpJKG3Y6tfJt5t7nW1OCxNjfLTlEaxsrI,9367
126
+ datasets/utils/patching.py,sha256=iTeb7XG4faLJKNylq55EcZyCndUXU_XBDvOOkuDz_sc,4955
127
+ datasets/utils/py_utils.py,sha256=v-nq7bKydxCDPiDiaRq1ssEF3pkRTpQn4NV4BxmO-2s,23375
128
+ datasets/utils/resources/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
129
+ datasets/utils/resources/creators.json,sha256=XtIpMZefgBOdTevRrQTkFiufbgCbp_iyxseyphYQkn0,257
130
+ datasets/utils/resources/languages.json,sha256=Z0rQNPsfje8zMi8KdvvwxF4APwwqcskJFUvhNiLAgPM,199138
131
+ datasets/utils/resources/multilingualities.json,sha256=02Uc8RtRzfl13l98Y_alZm5HuMYwPzL78B0S5a1X-8c,205
132
+ datasets/utils/resources/readme_structure.yaml,sha256=hNf9msoBZw5jfakQrDb0Af8T325TXdcaHsAO2MUcZvY,3877
133
+ datasets/utils/resources/size_categories.json,sha256=_5nAP7z8R6t7_GfER81QudFO6Y1tqYu4AWrr4Aot8S8,171
134
+ datasets/utils/sharding.py,sha256=VBQ4bRJQijMNDQTgFb1_ddlQ28wAcA0aQp4e-1jFIAk,4215
135
+ datasets/utils/stratify.py,sha256=-MVaLmijYhGyKDpnZS9A8SiHekaIyVm84HVyIIQOmfg,4085
136
+ datasets/utils/tf_utils.py,sha256=T3OysLGbkO7y-J-o9OVGyn9l-l-A3ruj-24JM_UULm8,24448
137
+ datasets/utils/tqdm.py,sha256=44F0g2fBpJwShh1l88PP7Z8kBihFWA_Yee4sjiQSxes,4303
138
+ datasets/utils/track.py,sha256=M81CGLn3MyJzHm98CQkbF3_1DG7evQsw-V52_Bp2paI,1838
139
+ datasets/utils/typing.py,sha256=G11ytWmwjqVia2IdziRDIWvQ4mLJee-sKzgJfHqU16E,205
140
+ datasets/utils/version.py,sha256=Z82cHpjTbQVJyWgnwSU8DsW2G0y-sSbSoOVeQrAds9k,3281
lib/python3.13/site-packages/datasets-4.4.1.dist-info/REQUESTED ADDED
File without changes
lib/python3.13/site-packages/datasets-4.4.1.dist-info/WHEEL ADDED
@@ -0,0 +1,5 @@
+ Wheel-Version: 1.0
+ Generator: setuptools (75.8.0)
+ Root-Is-Purelib: true
+ Tag: py3-none-any
+
lib/python3.13/site-packages/datasets-4.4.1.dist-info/entry_points.txt ADDED
@@ -0,0 +1,2 @@
+ [console_scripts]
+ datasets-cli = datasets.commands.datasets_cli:main
lib/python3.13/site-packages/datasets-4.4.1.dist-info/top_level.txt ADDED
@@ -0,0 +1 @@
+ datasets
lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ uv
lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2017 scoder
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/METADATA ADDED
@@ -0,0 +1,226 @@
+ Metadata-Version: 2.1
+ Name: fastrlock
+ Version: 0.8.3
+ Summary: Fast, re-entrant optimistic lock implemented in Cython
+ Home-page: https://github.com/scoder/fastrlock
+ Author: Stefan Behnel
+ Author-email: stefan_ml@behnel.de
+ License: MIT style
+ Classifier: Development Status :: 5 - Production/Stable
+ Classifier: Intended Audience :: Developers
+ Classifier: Intended Audience :: Information Technology
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Programming Language :: Cython
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Operating System :: OS Independent
+ Classifier: Topic :: Software Development
+ License-File: LICENSE
+
+ FastRLock
+ ---------
+
+ This is a C-level implementation of a fast, re-entrant, optimistic lock for CPython.
+ It is a drop-in replacement for
+ `threading.RLock <https://docs.python.org/3/library/threading.html#threading.RLock>`_.
+ FastRLock is implemented in `Cython <https://cython.org>`_ and also provides a C-API
+ for direct use from Cython code via ``from fastrlock cimport rlock`` or
+ ``from cython.cimports.fastrlock import rlock``.
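Because it is a drop-in replacement, switching usually means changing one import. A minimal sketch (the ``fastrlock.rlock`` import path is inferred from the package layout and is an assumption here; the ``try``/``except`` fallback keeps the example runnable even without fastrlock installed):

```python
import threading

# Assumed import path, based on the package layout (fastrlock/rlock.pyx);
# fall back to threading.RLock so the example runs either way.
try:
    from fastrlock.rlock import FastRLock
except ImportError:
    FastRLock = threading.RLock

lock = FastRLock()
shared = []

def worker(n):
    with lock:          # same context-manager API as threading.RLock
        with lock:      # re-entrant: the owning thread may acquire again
            shared.append(n)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(shared))   # -> [0, 1, 2, 3]
```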
+
+ Under normal conditions, it is about 10x faster than ``threading.RLock`` in Python 2.7
+ because it avoids all locking unless two or more threads try to acquire it at the
+ same time. Under congestion, it is still about 10% faster than RLock due to being
+ implemented in Cython.
+
+ This is mostly equivalent to the revised RLock implementation in Python 3.2,
+ but still faster due to being implemented in Cython. However, in Python 3.4 and
+ later, the ``threading.RLock`` implementation in the stdlib tends to be as fast
+ or even faster than the lock provided by this package, when called through the
+ Python API. ``FastRLock`` remains faster on these systems when called
+ through its Cython API from other Cython modules.
+
+ It was initially published as a code recipe here:
+ https://code.activestate.com/recipes/577336-fast-re-entrant-optimistic-lock-implemented-in-cyt/
+
+ FastRLock has been used and tested in `Lupa <https://github.com/scoder/lupa>`_ for several years.
+
+
+ How does it work?
+ -----------------
+
+ The FastRLock implementation optimises for the non-congested case. It works by
+ exploiting the availability of the GIL. Since it knows that it holds the GIL when
+ the acquire()/release() methods are called, it can safely check the lock for being
+ held by other threads and just count any re-entries as long as it is always the
+ same thread that acquires it. This is a lot faster than actually acquiring the
+ underlying lock.
+
+ When a second thread wants to acquire the lock as well, it first checks the lock
+ count and finds out that the lock is already owned. If the underlying lock is also
+ held by another thread already, it then just frees the GIL and asks to acquire
+ the lock, just like RLock does. If the underlying lock is not held, however, it
+ acquires it immediately and basically hands over the ownership by telling the
+ current owner to free it when it's done. Then, it falls back to the normal
+ non-owner behaviour that asks for the lock and will eventually acquire it when it
+ gets released. This makes sure that the real lock is only acquired when at least
+ two threads want it.
+
+ All of these operations are basically atomic because any thread that modifies the
+ lock state always holds the GIL. Note that the implementation must not call any
+ Python code while handling the lock, as calling into Python may lead to a context
+ switch which hands over the GIL to another thread and thus breaks atomicity.
+ Therefore, the code misuses Cython's 'nogil' annotation to make sure that no Python
+ code slips in accidentally.
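The ownership check and re-entry counting described above can be sketched in pure Python (illustration only, names are ours): FastRLock does this at C level, where the GIL makes the owner check and counter update atomic and lets the uncontended path skip the underlying lock entirely. A pure-Python version can be preempted between bytecodes, so this sketch still takes the lock on first entry.

```python
import threading

class MiniRLock:
    """Sketch of the re-entry counting idea; not the full FastRLock
    algorithm (no optimistic lock-free fast path, no ownership handover)."""

    def __init__(self):
        self._lock = threading.Lock()
        self._owner = None   # ident of the thread currently holding the lock
        self._count = 0      # nesting depth of the owner's acquisitions

    def acquire(self, blocking=True):
        me = threading.get_ident()
        if self._owner == me:             # re-entry by the owner: just count
            self._count += 1
            return True
        if not self._lock.acquire(blocking):
            return False                  # contended, non-blocking call
        self._owner = me                  # first acquisition by this thread
        self._count = 1
        return True

    def release(self):
        if self._owner != threading.get_ident():
            raise RuntimeError("cannot release un-acquired lock")
        self._count -= 1
        if self._count == 0:              # outermost release frees the lock
            self._owner = None
            self._lock.release()

lock = MiniRLock()
lock.acquire(); lock.acquire()            # nested acquisition, one thread
lock.release(); lock.release()
```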
+
+
+ How fast is it?
+ ---------------
+
+ Here are some timings for the following scenarios:
+
+ 1) five acquire-release cycles ('lock_unlock')
+ 2) five acquire calls followed by five release calls (nested locking, 'reentrant_lock_unlock')
+ 3) a mixed and partly nested sequence of acquire and release calls ('mixed_lock_unlock')
+ 4) five acquire-release cycles that do not block ('lock_unlock_nonblocking')
+
+ All four are benchmarked for the single-threaded case and the multi-threaded case
+ with 10 threads. I also tested it with 20 threads, only to see that it then takes
+ about twice the time for both versions. Note also that the congested case is
+ substantially slower for both locks, and the benchmark includes the thread
+ creation time, so I only looped 1000x here to get useful
+ timings instead of 100000x for the single-threaded case.
+
+ The results here are mixed. Depending on the optimisation of the CPython
+ installation, it can be faster, about the same speed, or somewhat slower.
+ In any case, the direct Cython interface is always faster than going through
+ the Python API, because it avoids the Python call overhead and executes
+ a C call instead.
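The sequential scenarios can be reproduced with a small timing harness (a sketch with reduced loop counts so it finishes quickly; absolute numbers will differ from the table that follows):

```python
import threading
import timeit

lock = threading.RLock()   # swap in a FastRLock instance to compare

def lock_unlock():                 # scenario 1: five acquire-release cycles
    for _ in range(5):
        lock.acquire()
        lock.release()

def reentrant_lock_unlock():       # scenario 2: five nested acquires, then releases
    for _ in range(5):
        lock.acquire()
    for _ in range(5):
        lock.release()

def lock_unlock_nonblocking():     # scenario 4: non-blocking acquire-release cycles
    for _ in range(5):
        if lock.acquire(False):
            lock.release()

for fn in (lock_unlock, reentrant_lock_unlock, lock_unlock_nonblocking):
    msec = timeit.timeit(fn, number=10_000) * 1000.0
    print(f"{fn.__name__:28s}: {msec:8.2f} msec")
```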
+
+ ::
+
+     Testing RLock (3.10.1)
+
+     sequential (x100000):
+       lock_unlock              :  138.36 msec
+       reentrant_lock_unlock    :   95.35 msec
+       mixed_lock_unlock        :  102.05 msec
+       lock_unlock_nonblocking  :  131.44 msec
+       context_manager          :  616.83 msec
+
+     threaded 10T (x1000):
+       lock_unlock              : 1386.60 msec
+       reentrant_lock_unlock    : 1207.75 msec
+       mixed_lock_unlock        : 1319.62 msec
+       lock_unlock_nonblocking  : 1325.07 msec
+       context_manager          : 1357.93 msec
+
+     Testing FastRLock (0.8.1)
+
+     sequential (x100000):
+       lock_unlock              :   77.47 msec
+       reentrant_lock_unlock    :   64.14 msec
+       mixed_lock_unlock        :   73.51 msec
+       lock_unlock_nonblocking  :   70.31 msec
+       context_manager          :  393.34 msec
+
+     threaded 10T (x1000):
+       lock_unlock              : 1214.13 msec
+       reentrant_lock_unlock    : 1171.75 msec
+       mixed_lock_unlock        : 1184.33 msec
+       lock_unlock_nonblocking  : 1207.42 msec
+       context_manager          : 1232.20 msec
+
+     Testing Cython interface of FastRLock (0.8.1)
+
+     sequential (x100000):
+       lock_unlock              :   18.70 msec
+       reentrant_lock_unlock    :   15.88 msec
+       mixed_lock_unlock        :   14.96 msec
+       lock_unlock_nonblocking  :   13.47 msec
+
+     threaded 10T (x1000):
+       lock_unlock              : 1236.21 msec
+       reentrant_lock_unlock    : 1245.77 msec
+       mixed_lock_unlock        : 1194.25 msec
+       lock_unlock_nonblocking  : 1206.96 msec
+
+
+ ===================
+ fastrlock changelog
+ ===================
+
+ 0.8.3 (2024-12-17)
+ ==================
+
+ * Rebuilt with Cython 3.0.11 to add Python 3.13 support.
+
+
+ 0.8.2 (2023-08-27)
+ ==================
+
+ * Rebuilt with Cython 3.0.2 to add Python 3.12 support.
+
+
+ 0.8.1 (2022-11-02)
+ ==================
+
+ * Rebuilt with Cython 3.0.0a11 to add Python 3.11 support.
+
+
+ 0.8 (2021-10-22)
+ ================
+
+ * Rebuilt with Cython 3.0.0a9 to improve the performance in recent
+   Python 3.x versions.
+
+
+ 0.7 (2021-10-21)
+ ================
+
+ * Adapted for unsigned thread IDs, as used by Py3.7+.
+   (original patch by Guilherme Dantas)
+
+ * Build with Cython 0.29.24 to support Py3.10 and later.
+
+
+ 0.6 (2021-03-21)
+ ================
+
+ * Rebuild with Cython 0.29.22 to support Py3.9 and later.
+
+
+ 0.5 (2020-06-05)
+ ================
+
+ * Rebuild with Cython 0.29.20 to support Py3.8 and later.
+
+
+ 0.4 (2018-08-24)
+ ================
+
+ * Rebuild with Cython 0.28.5.
+
+ * Linux wheels are faster through profile guided optimisation.
+
+ * Add missing file to sdist.
+   (patch by Mark Harfouche, Github issue #5)
+
+
+ 0.3 (2017-08-10)
+ ================
+
+ * improve cimport support of C-API
+   (patch by Naotoshi Seo, Github issue #3)
+
+ * provide ``fastrlock.__version__``
+
+
+ 0.2 (2017-08-09)
+ ================
+
+ * add missing readme file to sdist
+
+
+ 0.1 (2017-06-04)
+ ================
+
+ * initial release
lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/RECORD ADDED
@@ -0,0 +1,13 @@
+ fastrlock-0.8.3.dist-info/INSTALLER,sha256=5hhM4Q4mYTT9z6QB6PGpUAW81PGNFrYrdXMj4oM_6ak,2
+ fastrlock-0.8.3.dist-info/LICENSE,sha256=edWWCQqdGaUaEXXL0SQGCy8j1Pa-vqeYIkHSMRdRljA,1063
+ fastrlock-0.8.3.dist-info/METADATA,sha256=CSkdXG1Tg_Nn1ar1AXfaqMPqOzGI3Er9xl1ed3brFQo,7664
+ fastrlock-0.8.3.dist-info/RECORD,,
+ fastrlock-0.8.3.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ fastrlock-0.8.3.dist-info/WHEEL,sha256=DhVnjUrgFTXtsoVOoKWdzDrhNCSUWwR0S5N1r3Zukh4,186
+ fastrlock-0.8.3.dist-info/top_level.txt,sha256=QMLNNCjoisR1NTxtzPxl2Zyih9n6sFxd8VCUQzIJHOA,10
+ fastrlock/__init__.pxd,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ fastrlock/__init__.py,sha256=lYDBBV0R1dtMBmWKorNXKhEma8Fo0OswJJW6zCSGmtU,169
+ fastrlock/_lock.pxi,sha256=tPIg2qyMZbCZDEXQsp_tb_Em2J0podo3iU3-XEBdnTQ,2608
+ fastrlock/rlock.cpython-313-x86_64-linux-gnu.so,sha256=gff9iX1lYMZf6lyvs5WwBdTKD3JFTHcFxdYaxQOqmf4,118912
+ fastrlock/rlock.pxd,sha256=slrtTC9yStpzsL9FUgoyU69D_YsJAe036GEfH6Z9a0c,313
+ fastrlock/rlock.pyx,sha256=YZfaVup-Tkqb42IcNlunf4Vtt2vXVQfZPG4l9BmQlAY,3599
lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/REQUESTED ADDED
File without changes
lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/WHEEL ADDED
@@ -0,0 +1,7 @@
+ Wheel-Version: 1.0
+ Generator: setuptools (75.6.0)
+ Root-Is-Purelib: false
+ Tag: cp313-cp313-manylinux_2_5_x86_64
+ Tag: cp313-cp313-manylinux1_x86_64
+ Tag: cp313-cp313-manylinux_2_28_x86_64
+
lib/python3.13/site-packages/fastrlock-0.8.3.dist-info/top_level.txt ADDED
@@ -0,0 +1 @@
+ fastrlock
lib/python3.13/site-packages/flashinfer_python-0.4.1.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ uv
lib/python3.13/site-packages/flashinfer_python-0.4.1.dist-info/METADATA ADDED
@@ -0,0 +1,243 @@
+ Metadata-Version: 2.4
+ Name: flashinfer-python
+ Version: 0.4.1
+ Summary: FlashInfer: Kernel Library for LLM Serving
+ Author: FlashInfer team
+ License-Expression: Apache-2.0
+ Project-URL: Homepage, https://github.com/flashinfer-ai/flashinfer
+ Requires-Python: <4.0,>=3.9
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ License-File: licenses/LICENSE.cutlass.txt
+ License-File: licenses/LICENSE.flashattention3.txt
+ License-File: licenses/LICENSE.fmt.txt
+ License-File: licenses/LICENSE.spdlog.txt
+ Requires-Dist: apache-tvm-ffi==0.1.0b15
+ Requires-Dist: click
+ Requires-Dist: einops
+ Requires-Dist: ninja
+ Requires-Dist: numpy
+ Requires-Dist: nvidia-cudnn-frontend>=1.13.0
+ Requires-Dist: nvidia-cutlass-dsl>=4.2.1
+ Requires-Dist: nvidia-ml-py
+ Requires-Dist: packaging>=24.2
+ Requires-Dist: requests
+ Requires-Dist: tabulate
+ Requires-Dist: torch
+ Requires-Dist: tqdm
+ Dynamic: license-file
+
+ <p align="center">
+   <picture>
+     <source media="(prefers-color-scheme: dark)" srcset="https://github.com/flashinfer-ai/web-data/blob/main/logo/FlashInfer-black-background.png?raw=true">
+     <img alt="FlashInfer" src="https://github.com/flashinfer-ai/web-data/blob/main/logo/FlashInfer-white-background.png?raw=true" width=55%>
+   </picture>
+ </p>
+ <h1 align="center">
+ Kernel Library for LLM Serving
+ </h1>
+
+ <p align="center">
+ | <a href="https://flashinfer.ai"><b>Blog</b></a> | <a href="https://docs.flashinfer.ai"><b>Documentation</b></a> | <a href="https://join.slack.com/t/flashinfer/shared_invite/zt-379wct3hc-D5jR~1ZKQcU00WHsXhgvtA"><b>Slack</b></a> | <a href="https://github.com/orgs/flashinfer-ai/discussions"><b>Discussion Forum</b></a> |
+ </p>
+
+ [![Build Status](https://ci.tlcpack.ai/job/flashinfer-ci/job/main/badge/icon)](https://ci.tlcpack.ai/job/flashinfer-ci/job/main/)
+ [![Documentation](https://github.com/flashinfer-ai/flashinfer/actions/workflows/build-doc.yml/badge.svg)](https://github.com/flashinfer-ai/flashinfer/actions/workflows/build-doc.yml)
+
+
+ FlashInfer is a library and kernel generator for Large Language Models that provides high-performance implementations of LLM GPU kernels such as FlashAttention, SparseAttention, PageAttention, Sampling, and more. FlashInfer focuses on LLM serving and inference, and delivers state-of-the-art performance across diverse scenarios.
+
+ Check out our [v0.2 release blog](https://flashinfer.ai/2024/12/16/flashinfer-v02-release.html) for new features!
+
+ The core features of FlashInfer include:
+ 1. **Efficient Sparse/Dense Attention Kernels**: Efficient single/batch attention for sparse (paged)/dense KV storage on CUDA Cores and Tensor Cores (both FA2 and FA3 templates). The vector-sparse attention can achieve 90% of the bandwidth of dense kernels with the same problem size.
+ 2. **Load-Balanced Scheduling**: FlashInfer decouples the `plan`/`run` stages of attention computation, scheduling the computation of variable-length inputs in the `plan` stage to alleviate the load-imbalance issue.
+ 3. **Memory Efficiency**: FlashInfer offers [Cascade Attention](https://docs.flashinfer.ai/api/cascade.html#flashinfer.cascade.MultiLevelCascadeAttentionWrapper) for hierarchical KV-Cache, implements Head-Query fusion for accelerating Grouped-Query Attention, and provides efficient kernels for low-precision attention and fused-RoPE attention for compressed KV-Cache.
+ 4. **Customizable Attention**: Bring your own attention variants through JIT-compilation.
+ 5. **CUDAGraph and torch.compile Compatibility**: FlashInfer kernels can be captured by CUDAGraphs and torch.compile for low-latency inference.
+ 6. **Efficient LLM-specific Operators**: High-performance [fused kernels for Top-P, Top-K/Min-P sampling](https://docs.flashinfer.ai/api/sampling.html) without the need for sorting.
+
+ FlashInfer supports PyTorch, TVM and C++ (header-only) APIs, and can be easily integrated into existing projects.
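The "without the need for sorting" point in feature 6 can be illustrated with a rejection-style top-p sampler. This is a rough CPU sketch of the idea (function name and structure are ours, not FlashInfer's CUDA kernel): draw from the full distribution and only accept a token if it lies inside the nucleus, raising a pivot on every rejection so rarer tiers are excluded; no sort is ever performed.

```python
import random

def top_p_sample(probs, p, rng=random):
    """Sorting-free top-p (nucleus) sampling by rejection (illustrative).

    A token is in the top-p nucleus iff the total mass of strictly
    more-likely tokens is still below p. On rejection, raise the pivot so
    that tier (and everything rarer) can never be drawn again.
    """
    pivot = 0.0
    while True:
        masked = [q if q > pivot else 0.0 for q in probs]
        idx = rng.choices(range(len(probs)), weights=masked)[0]
        if sum(q for q in probs if q > probs[idx]) < p:
            return idx          # accepted: idx is inside the nucleus
        pivot = probs[idx]      # rejected: exclude this tier from now on

random.seed(0)
counts = {}
for _ in range(1000):
    idx = top_p_sample([0.5, 0.3, 0.15, 0.05], p=0.6)
    counts[idx] = counts.get(idx, 0) + 1
print(counts)   # only tokens 0 and 1 (the 0.6-nucleus) are ever returned
```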
61
+
62
+ ## News
63
+ - [Mar 10, 2025] [Blog Post](https://flashinfer.ai/2025/03/10/sampling.html) Sorting-Free GPU Kernels for LLM Sampling, which explains the design of sampling kernels in FlashInfer.
64
+ - [Mar 1, 2025] Checkout flashinfer's [intra-kernel profiler](https://github.com/flashinfer-ai/flashinfer/tree/main/profiler) for visualizing the timeline of each threadblock in GPU kernels.
65
+ - [Dec 16, 2024] [Blog Post](https://flashinfer.ai/2024/12/16/flashinfer-v02-release.html) FlashInfer 0.2 - Efficient and Customizable Kernels for LLM Inference Serving
66
+ - [Sept 2024] We've launched a [Slack](https://join.slack.com/t/flashinfer/shared_invite/zt-2r93kj2aq-wZnC2n_Z2~mf73N5qnVGGA) workspace for Flashinfer users and developers. Join us for timely support, discussions, updates and knowledge sharing!
67
+ - [Jan 31, 2024] [Blog Post](https://flashinfer.ai/2024/01/08/cascade-inference.html) Cascade Inference: Memory-Efficient Shared Prefix Batch Decoding
68
+ - [Jan 31, 2024] [Blog Post](https://flashinfer.ai/2024/01/03/introduce-flashinfer.html) Accelerating Self-Attentions for LLM Serving with FlashInfer
69
+
70
+ ## Getting Started
71
+
72
+ Using our PyTorch API is the easiest way to get started:
73
+
74
+ ### Install from PyPI
75
+
76
+ FlashInfer is available as a Python package for Linux. Install the core package with:
77
+
78
+ ```bash
79
+ pip install flashinfer-python
80
+ ```
81
+
82
+ **Package Options:**
83
+ - **flashinfer-python**: Core package that compiles/downloads kernels on first use
84
+ - **flashinfer-cubin**: Pre-compiled kernel binaries for all supported GPU architectures
85
+ - **flashinfer-jit-cache**: Pre-built kernel cache for specific CUDA versions
86
+
87
+ **For faster initialization and offline usage**, install the optional packages to have most kernels pre-compiled:
88
+ ```bash
89
+ pip install flashinfer-python flashinfer-cubin
90
+ # JIT cache package (replace cu129 with your CUDA version: cu128, cu129, or cu130)
91
+ pip install flashinfer-jit-cache --index-url https://flashinfer.ai/whl/cu129
92
+ ```
93
+
94
+ This eliminates compilation and downloading overhead at runtime.
95
+
96
+ ### Install from Source
97
+
98
+ Build the core package from source:
99
+
100
+ ```bash
101
+ git clone https://github.com/flashinfer-ai/flashinfer.git --recursive
102
+ cd flashinfer
103
+ python -m pip install -v .
104
+ ```
105
+
106
+ **For development**, install in editable mode:
107
+ ```bash
108
+ python -m pip install --no-build-isolation -e . -v
109
+ ```
110
+
111
+ **Build optional packages:**
112
+
113
+ `flashinfer-cubin`:
114
+ ```bash
115
+ cd flashinfer-cubin
116
+ python -m build --no-isolation --wheel
117
+ python -m pip install dist/*.whl
118
+ ```
119
+
120
+ `flashinfer-jit-cache` (customize `FLASHINFER_CUDA_ARCH_LIST` for your target GPUs):
121
+ ```bash
122
+ export FLASHINFER_CUDA_ARCH_LIST="7.5 8.0 8.9 10.0a 10.3a 12.0a"
123
+ cd flashinfer-jit-cache
124
+ python -m build --no-isolation --wheel
125
+ python -m pip install dist/*.whl
126
+ ```
127
+
128
+ For more details, see the [Install from Source documentation](https://docs.flashinfer.ai/installation.html#install-from-source).
129
+
130
+ ### Install Nightly Build
131
+
132
+ Nightly builds are available for testing the latest features:
133
+
134
+ ```bash
135
+ # Core and cubin packages
136
+ pip install -U --pre flashinfer-python --index-url https://flashinfer.ai/whl/nightly/ --no-deps # Install the nightly package from custom index, without installing dependencies
137
+ pip install flashinfer-python # Install flashinfer-python's dependencies from PyPI
138
+ pip install -U --pre flashinfer-cubin --index-url https://flashinfer.ai/whl/nightly/
139
+ # JIT cache package (replace cu129 with your CUDA version: cu128, cu129, or cu130)
140
+ pip install -U --pre flashinfer-jit-cache --index-url https://flashinfer.ai/whl/nightly/cu129
141
+ ```
142
+
143
+ ### Verify Installation
144
+
145
+ After installation, verify that FlashInfer is correctly installed and configured:
146
+
147
+ ```bash
148
+ flashinfer show-config
149
+ ```
150
+
151
+ This command displays:
152
+ - FlashInfer version and installed packages (flashinfer-python, flashinfer-cubin, flashinfer-jit-cache)
153
+ - PyTorch and CUDA version information
154
+ - Environment variables and artifact paths
155
+ - Downloaded cubin status and module compilation status
156
+
### Trying it out

Below is a minimal example of using FlashInfer's single-request decode/append/prefill attention kernels:

```python
import torch
import flashinfer

kv_len = 2048
num_kv_heads = 32
head_dim = 128

k = torch.randn(kv_len, num_kv_heads, head_dim).half().to(0)
v = torch.randn(kv_len, num_kv_heads, head_dim).half().to(0)

# decode attention

num_qo_heads = 32
q = torch.randn(num_qo_heads, head_dim).half().to(0)

# decode attention without RoPE on-the-fly
o = flashinfer.single_decode_with_kv_cache(q, k, v)
# decode attention with LLaMA-style RoPE applied on-the-fly
o_rope_on_the_fly = flashinfer.single_decode_with_kv_cache(q, k, v, pos_encoding_mode="ROPE_LLAMA")

# append attention: the last 128 tokens in the KV cache are the new tokens
append_qo_len = 128
q = torch.randn(append_qo_len, num_qo_heads, head_dim).half().to(0)
# append attention with causal mask, without RoPE on-the-fly
o = flashinfer.single_prefill_with_kv_cache(q, k, v, causal=True)
# append attention with causal mask and LLaMA-style RoPE on-the-fly
o_rope_on_the_fly = flashinfer.single_prefill_with_kv_cache(q, k, v, causal=True, pos_encoding_mode="ROPE_LLAMA")

# prefill attention without causal mask
qo_len = 2048
q = torch.randn(qo_len, num_qo_heads, head_dim).half().to(0)
o = flashinfer.single_prefill_with_kv_cache(q, k, v, causal=False)
```

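For intuition about what these kernels compute: each query attends over the cached keys and values via o = softmax(q·Kᵀ/√d)·V, and `causal=True` masks each query so it only sees keys up to its own position (with the last query aligned to the last key in the append case). Below is a dependency-free, single-head reference sketch in pure Python with toy sizes; it is an illustration of the math, not the FlashInfer implementation:

```python
import math

def attention(Q, K, V, causal=False):
    """Reference o = softmax(Q K^T / sqrt(d)) V for one head; tensors as lists of lists."""
    d = len(Q[0])
    offset = len(K) - len(Q)  # align last query with last key (append case)
    out = []
    for i, q in enumerate(Q):
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d) for k in K]
        if causal:  # query i may only attend to kv positions <= i + offset
            scores = [s if j <= i + offset else float("-inf")
                      for j, s in enumerate(scores)]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]  # numerically stable softmax
        denom = sum(exps)
        probs = [e / denom for e in exps]
        out.append([sum(p * v[j] for p, v in zip(probs, V))
                    for j in range(len(V[0]))])
    return out

# decode is the special case of a single query over the whole KV cache
q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(attention(q, K, V))
```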
Check out the [documentation](https://docs.flashinfer.ai/) for usage of batch decode/append/prefill kernels and shared-prefix cascading kernels.

## Custom Attention Variants

Starting from FlashInfer v0.2, users can customize their own attention variants with additional parameters. For more details, refer to our [JIT examples](https://github.com/flashinfer-ai/flashinfer/blob/main/tests/utils/test_jit_example.py).

## GPU Support

FlashInfer currently supports NVIDIA SM architectures 75 and higher, with beta support for SM 103, 110, 120, and 121.
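The support tiers above can be summarized programmatically. The helper below is hypothetical and simply mirrors the sentence; it is not a FlashInfer API:

```python
def support_tier(sm: int) -> str:
    """Classify an NVIDIA SM architecture per the support note above."""
    if sm in {103, 110, 120, 121}:
        return "beta"
    return "supported" if sm >= 75 else "unsupported"

print(support_tier(80), support_tier(120), support_tier(70))
```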

## Adoption

We are thrilled to share that FlashInfer is being adopted by many cutting-edge projects, including but not limited to:
- [MLC-LLM](https://github.com/mlc-ai/mlc-llm)
- [Punica](https://github.com/punica-ai/punica)
- [SGLang](https://github.com/sgl-project/sglang)
- [ScaleLLM](https://github.com/vectorch-ai/ScaleLLM)
- [vLLM](https://github.com/vllm-project/vllm)
- [TGI](https://github.com/huggingface/text-generation-inference)
- [lorax](https://github.com/predibase/lorax)
- [TensorRT-LLM](https://github.com/NVIDIA/TensorRT-LLM)
- [LightLLM](https://github.com/ModelTC/lightllm)

## Acknowledgement

FlashInfer is inspired by the [FlashAttention 1&2](https://github.com/dao-AILab/flash-attention/), [vLLM](https://github.com/vllm-project/vllm), [stream-K](https://arxiv.org/abs/2301.03598), [cutlass](https://github.com/nvidia/cutlass), and [AITemplate](https://github.com/facebookincubator/AITemplate) projects.

## Citation

If you find FlashInfer helpful in your project or research, please consider citing our [paper](https://arxiv.org/abs/2501.01005):

```bibtex
@article{ye2025flashinfer,
  title   = {FlashInfer: Efficient and Customizable Attention Engine for LLM Inference Serving},
  author  = {Ye, Zihao and Chen, Lequn and Lai, Ruihang and Lin, Wuwei and
             Zhang, Yineng and Wang, Stephanie and Chen, Tianqi and Kasikci, Baris and
             Grover, Vinod and Krishnamurthy, Arvind and Ceze, Luis},
  journal = {arXiv preprint arXiv:2501.01005},
  year    = {2025},
  url     = {https://arxiv.org/abs/2501.01005}
}
```