+
+> Markdown parser done right.
+
+- Follows the __[CommonMark spec](http://spec.commonmark.org/)__ for baseline parsing
+- Configurable syntax: you can add new rules and even replace existing ones
+- Pluggable: add syntax extensions to extend the parser (see the [plugin list][md-plugins])
+- High speed (see our [benchmarking tests][md-performance])
+- Easy to configure for [security][md-security]
+- Member of [Google's Assured Open Source Software](https://cloud.google.com/assured-open-source-software/docs/supported-packages)
+
+This is a Python port of [markdown-it], and some of its associated plugins.
+For more details see: <https://markdown-it-py.readthedocs.io/>.
+
+For details on [markdown-it] itself, see:
+
+- The __[Live demo](https://markdown-it.github.io)__
+- [The markdown-it README][markdown-it-readme]
+
+**See also:** [markdown-it-pyrs](https://github.com/chrisjsewell/markdown-it-pyrs) for an experimental Rust binding,
+for even more speed!
+
+## Installation
+
+### Pip
+
+```bash
+pip install markdown-it-py[plugins]
+```
+
+or with extras
+
+```bash
+pip install markdown-it-py[linkify,plugins]
+```
+
+### Conda
+
+```bash
+conda install -c conda-forge markdown-it-py
+```
+
+or with extras
+
+```bash
+conda install -c conda-forge markdown-it-py linkify-it-py mdit-py-plugins
+```
+
+## Usage
+
+### Python API Usage
+
+Render markdown to HTML with markdown-it-py, using a custom configuration
+together with plugins and optional syntax features:
+
+```python
+from markdown_it import MarkdownIt
+from mdit_py_plugins.front_matter import front_matter_plugin
+from mdit_py_plugins.footnote import footnote_plugin
+
+md = (
+ MarkdownIt('commonmark', {'breaks':True,'html':True})
+ .use(front_matter_plugin)
+ .use(footnote_plugin)
+ .enable('table')
+)
+text = ("""
+---
+a: 1
+---
+
+a | b
+- | -
+1 | 2
+
+A footnote [^1]
+
+[^1]: some details
+""")
+tokens = md.parse(text)
+html_text = md.render(text)
+
+## To export the html to a file, uncomment the lines below:
+# from pathlib import Path
+# Path("output.html").write_text(html_text)
+```
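+
+The `tokens` returned by `parse` are what `render` consumes. As a minimal
+illustrative sketch (using the `SyntaxTreeNode` helper from `markdown_it.tree`;
+the printed token types are indicative only), they can be inspected directly
+or viewed as a nested tree:
+
+```python
+from markdown_it.tree import SyntaxTreeNode
+
+# Walk the flat token stream produced by md.parse(text) above
+for token in tokens:
+    print(token.type)  # e.g. front_matter, table_open, paragraph_open, ...
+
+# Wrap the same tokens in a tree view for nested traversal and pretty printing
+node = SyntaxTreeNode(tokens)
+print(node.pretty(indent=2, show_text=True))
+```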
+
+### Command-line Usage
+
+Render markdown to HTML with markdown-it-py from the
+command-line:
+
+```console
+usage: markdown-it [-h] [-v] [filenames [filenames ...]]
+
+Parse one or more markdown files, convert each to HTML, and print to stdout
+
+positional arguments:
+ filenames specify an optional list of files to convert
+
+optional arguments:
+ -h, --help show this help message and exit
+ -v, --version show program's version number and exit
+
+Interactive:
+
+ $ markdown-it
+ markdown-it-py [version 0.0.0] (interactive)
+ Type Ctrl-D to complete input, or Ctrl-C to exit.
+ >>> # Example
+ ... > markdown *input*
+ ...
+ <h1>Example</h1>
+ <blockquote>
+ <p>markdown <em>input</em></p>
+ </blockquote>
+
+Batch:
+
+ $ markdown-it README.md README.footer.md > index.html
+
+```
+
+## References / Thanks
+
+Big thanks to the authors of [markdown-it]:
+
+- Alex Kocharin [github/rlidwka](https://github.com/rlidwka)
+- Vitaly Puzrin [github/puzrin](https://github.com/puzrin)
+
+Also [John MacFarlane](https://github.com/jgm) for his work on the CommonMark spec and reference implementations.
+
+[github-ci]: https://github.com/executablebooks/markdown-it-py/actions/workflows/tests.yml/badge.svg?branch=master
+[github-link]: https://github.com/executablebooks/markdown-it-py
+[pypi-badge]: https://img.shields.io/pypi/v/markdown-it-py.svg
+[pypi-link]: https://pypi.org/project/markdown-it-py
+[conda-badge]: https://anaconda.org/conda-forge/markdown-it-py/badges/version.svg
+[conda-link]: https://anaconda.org/conda-forge/markdown-it-py
+[codecov-badge]: https://codecov.io/gh/executablebooks/markdown-it-py/branch/master/graph/badge.svg
+[codecov-link]: https://codecov.io/gh/executablebooks/markdown-it-py
+[install-badge]: https://img.shields.io/pypi/dw/markdown-it-py?label=pypi%20installs
+[install-link]: https://pypistats.org/packages/markdown-it-py
+
+[CommonMark spec]: http://spec.commonmark.org/
+[markdown-it]: https://github.com/markdown-it/markdown-it
+[markdown-it-readme]: https://github.com/markdown-it/markdown-it/blob/master/README.md
+[md-security]: https://markdown-it-py.readthedocs.io/en/latest/security.html
+[md-performance]: https://markdown-it-py.readthedocs.io/en/latest/performance.html
+[md-plugins]: https://markdown-it-py.readthedocs.io/en/latest/plugins.html
+
diff --git a/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/RECORD b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/RECORD
new file mode 100644
index 0000000000000000000000000000000000000000..c5546ad93f519f654ff2bc6a4534026faef15079
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/RECORD
@@ -0,0 +1,142 @@
+../../../bin/markdown-it,sha256=OJvNJGZt5O0PgS-ONRKjcT8iGz3Wz_yU7W2FjDS_AZQ,385
+markdown_it/__init__.py,sha256=R7fMvDxageYJ4Q6doBcimogy1ctcV1eBuCFu5Pr8bbA,114
+markdown_it/__pycache__/__init__.cpython-310.pyc,,
+markdown_it/__pycache__/_compat.cpython-310.pyc,,
+markdown_it/__pycache__/_punycode.cpython-310.pyc,,
+markdown_it/__pycache__/main.cpython-310.pyc,,
+markdown_it/__pycache__/parser_block.cpython-310.pyc,,
+markdown_it/__pycache__/parser_core.cpython-310.pyc,,
+markdown_it/__pycache__/parser_inline.cpython-310.pyc,,
+markdown_it/__pycache__/renderer.cpython-310.pyc,,
+markdown_it/__pycache__/ruler.cpython-310.pyc,,
+markdown_it/__pycache__/token.cpython-310.pyc,,
+markdown_it/__pycache__/tree.cpython-310.pyc,,
+markdown_it/__pycache__/utils.cpython-310.pyc,,
+markdown_it/_compat.py,sha256=U4S_2y3zgLZVfMenHRaJFBW8yqh2mUBuI291LGQVOJ8,35
+markdown_it/_punycode.py,sha256=JvSOZJ4VKr58z7unFGM0KhfTxqHMk2w8gglxae2QszM,2373
+markdown_it/cli/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+markdown_it/cli/__pycache__/__init__.cpython-310.pyc,,
+markdown_it/cli/__pycache__/parse.cpython-310.pyc,,
+markdown_it/cli/parse.py,sha256=Un3N7fyGHhZAQouGVnRx-WZcpKwEK2OF08rzVAEBie8,2881
+markdown_it/common/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+markdown_it/common/__pycache__/__init__.cpython-310.pyc,,
+markdown_it/common/__pycache__/entities.cpython-310.pyc,,
+markdown_it/common/__pycache__/html_blocks.cpython-310.pyc,,
+markdown_it/common/__pycache__/html_re.cpython-310.pyc,,
+markdown_it/common/__pycache__/normalize_url.cpython-310.pyc,,
+markdown_it/common/__pycache__/utils.cpython-310.pyc,,
+markdown_it/common/entities.py,sha256=EYRCmUL7ZU1FRGLSXQlPx356lY8EUBdFyx96eSGc6d0,157
+markdown_it/common/html_blocks.py,sha256=QXbUDMoN9lXLgYFk2DBYllnLiFukL6dHn2X98Y6Wews,986
+markdown_it/common/html_re.py,sha256=FggAEv9IL8gHQqsGTkHcf333rTojwG0DQJMH9oVu0fU,926
+markdown_it/common/normalize_url.py,sha256=avOXnLd9xw5jU1q5PLftjAM9pvGx8l9QDEkmZSyrMgg,2568
+markdown_it/common/utils.py,sha256=pMgvMOE3ZW-BdJ7HfuzlXNKyD1Ivk7jHErc2J_B8J5M,8734
+markdown_it/helpers/__init__.py,sha256=YH2z7dS0WUc_9l51MWPvrLtFoBPh4JLGw58OuhGRCK0,253
+markdown_it/helpers/__pycache__/__init__.cpython-310.pyc,,
+markdown_it/helpers/__pycache__/parse_link_destination.cpython-310.pyc,,
+markdown_it/helpers/__pycache__/parse_link_label.cpython-310.pyc,,
+markdown_it/helpers/__pycache__/parse_link_title.cpython-310.pyc,,
+markdown_it/helpers/parse_link_destination.py,sha256=u-xxWVP3g1s7C1bQuQItiYyDrYoYHJzXaZXPgr-o6mY,1906
+markdown_it/helpers/parse_link_label.py,sha256=PIHG6ZMm3BUw0a2m17lCGqNrl3vaz911tuoGviWD3I4,1037
+markdown_it/helpers/parse_link_title.py,sha256=jkLoYQMKNeX9bvWQHkaSroiEo27HylkEUNmj8xBRlp4,2273
+markdown_it/main.py,sha256=vzuT23LJyKrPKNyHKKAbOHkNWpwIldOGUM-IGsv2DHM,12732
+markdown_it/parser_block.py,sha256=-MyugXB63Te71s4NcSQZiK5bE6BHkdFyZv_bviuatdI,3939
+markdown_it/parser_core.py,sha256=SRmJjqe8dC6GWzEARpWba59cBmxjCr3Gsg8h29O8sQk,1016
+markdown_it/parser_inline.py,sha256=y0jCig8CJxQO7hBz0ZY3sGvPlAKTohOwIgaqnlSaS5A,5024
+markdown_it/port.yaml,sha256=jt_rdwOnfocOV5nc35revTybAAQMIp_-1fla_527sVE,2447
+markdown_it/presets/__init__.py,sha256=22vFtwJEY7iqFRtgVZ-pJthcetfpr1Oig8XOF9x1328,970
+markdown_it/presets/__pycache__/__init__.cpython-310.pyc,,
+markdown_it/presets/__pycache__/commonmark.cpython-310.pyc,,
+markdown_it/presets/__pycache__/default.cpython-310.pyc,,
+markdown_it/presets/__pycache__/zero.cpython-310.pyc,,
+markdown_it/presets/commonmark.py,sha256=ygfb0R7WQ_ZoyQP3df-B0EnYMqNXCVOSw9SAdMjsGow,2869
+markdown_it/presets/default.py,sha256=FfKVUI0HH3M-_qy6RwotLStdC4PAaAxE7Dq0_KQtRtc,1811
+markdown_it/presets/zero.py,sha256=okXWTBEI-2nmwx5XKeCjxInRf65oC11gahtRl-QNtHM,2113
+markdown_it/py.typed,sha256=8PjyZ1aVoQpRVvt71muvuq5qE-jTFZkK-GLHkhdebmc,26
+markdown_it/renderer.py,sha256=Lzr0glqd5oxFL10DOfjjW8kg4Gp41idQ4viEQaE47oA,9947
+markdown_it/ruler.py,sha256=eMAtWGRAfSM33aiJed0k5923BEkuMVsMq1ct8vU-ql4,9142
+markdown_it/rules_block/__init__.py,sha256=SQpg0ocmsHeILPAWRHhzgLgJMKIcNkQyELH13o_6Ktc,553
+markdown_it/rules_block/__pycache__/__init__.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/blockquote.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/code.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/fence.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/heading.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/hr.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/html_block.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/lheading.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/list.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/paragraph.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/reference.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/state_block.cpython-310.pyc,,
+markdown_it/rules_block/__pycache__/table.cpython-310.pyc,,
+markdown_it/rules_block/blockquote.py,sha256=7uymS36dcrned3DsIaRcqcbFU1NlymhvsZpEXTD3_n8,8887
+markdown_it/rules_block/code.py,sha256=iTAxv0U1-MDhz88M1m1pi2vzOhEMSEROsXMo2Qq--kU,860
+markdown_it/rules_block/fence.py,sha256=BJgU-PqZ4vAlCqGcrc8UtdLpJJyMeRWN-G-Op-zxrMc,2537
+markdown_it/rules_block/heading.py,sha256=4Lh15rwoVsQjE1hVhpbhidQ0k9xKHihgjAeYSbwgO5k,1745
+markdown_it/rules_block/hr.py,sha256=QCoY5kImaQRvF7PyP8OoWft6A8JVH1v6MN-0HR9Ikpg,1227
+markdown_it/rules_block/html_block.py,sha256=wA8pb34LtZr1BkIATgGKQBIGX5jQNOkwZl9UGEqvb5M,2721
+markdown_it/rules_block/lheading.py,sha256=fWoEuUo7S2svr5UMKmyQMkh0hheYAHg2gMM266Mogs4,2625
+markdown_it/rules_block/list.py,sha256=gIodkAJFyOIyKCZCj5lAlL7jIj5kAzrDb-K-2MFNplY,9668
+markdown_it/rules_block/paragraph.py,sha256=9pmCwA7eMu4LBdV4fWKzC4EdwaOoaGw2kfeYSQiLye8,1819
+markdown_it/rules_block/reference.py,sha256=ue1qZbUaUP0GIvwTjh6nD1UtCij8uwsIMuYW1xBkckc,6983
+markdown_it/rules_block/state_block.py,sha256=HowsQyy5hGUibH4HRZWKfLIlXeDUnuWL7kpF0-rSwoM,8422
+markdown_it/rules_block/table.py,sha256=8nMd9ONGOffER7BXmc9kbbhxkLjtpX79dVLR0iatGnM,7682
+markdown_it/rules_core/__init__.py,sha256=QFGBe9TUjnRQJDU7xY4SQYpxyTHNwg8beTSwXpNGRjE,394
+markdown_it/rules_core/__pycache__/__init__.cpython-310.pyc,,
+markdown_it/rules_core/__pycache__/block.cpython-310.pyc,,
+markdown_it/rules_core/__pycache__/inline.cpython-310.pyc,,
+markdown_it/rules_core/__pycache__/linkify.cpython-310.pyc,,
+markdown_it/rules_core/__pycache__/normalize.cpython-310.pyc,,
+markdown_it/rules_core/__pycache__/replacements.cpython-310.pyc,,
+markdown_it/rules_core/__pycache__/smartquotes.cpython-310.pyc,,
+markdown_it/rules_core/__pycache__/state_core.cpython-310.pyc,,
+markdown_it/rules_core/__pycache__/text_join.cpython-310.pyc,,
+markdown_it/rules_core/block.py,sha256=0_JY1CUy-H2OooFtIEZAACtuoGUMohgxo4Z6A_UinSg,372
+markdown_it/rules_core/inline.py,sha256=9oWmeBhJHE7x47oJcN9yp6UsAZtrEY_A-VmfoMvKld4,325
+markdown_it/rules_core/linkify.py,sha256=mjQqpk_lHLh2Nxw4UFaLxa47Fgi-OHnmDamlgXnhmv0,5141
+markdown_it/rules_core/normalize.py,sha256=AJm4femtFJ_QBnM0dzh0UNqTTJk9K6KMtwRPaioZFqM,403
+markdown_it/rules_core/replacements.py,sha256=CH75mie-tdzdLKQtMBuCTcXAl1ijegdZGfbV_Vk7st0,3471
+markdown_it/rules_core/smartquotes.py,sha256=izK9fSyuTzA-zAUGkRkz9KwwCQWo40iRqcCKqOhFbEE,7443
+markdown_it/rules_core/state_core.py,sha256=HqWZCUr5fW7xG6jeQZDdO0hE9hxxyl3_-bawgOy57HY,570
+markdown_it/rules_core/text_join.py,sha256=rLXxNuLh_es5RvH31GsXi7en8bMNO9UJ5nbJMDBPltY,1173
+markdown_it/rules_inline/__init__.py,sha256=qqHZk6-YE8Rc12q6PxvVKBaxv2wmZeeo45H1XMR_Vxs,696
+markdown_it/rules_inline/__pycache__/__init__.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/autolink.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/backticks.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/balance_pairs.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/emphasis.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/entity.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/escape.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/fragments_join.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/html_inline.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/image.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/link.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/linkify.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/newline.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/state_inline.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/strikethrough.cpython-310.pyc,,
+markdown_it/rules_inline/__pycache__/text.cpython-310.pyc,,
+markdown_it/rules_inline/autolink.py,sha256=pPoqJY8i99VtFn7KgUzMackMeq1hytzioVvWs-VQPRo,2065
+markdown_it/rules_inline/backticks.py,sha256=J7bezjjNxiXlKqvHc0fJkHZwH7-2nBsXVjcKydk8E4M,2037
+markdown_it/rules_inline/balance_pairs.py,sha256=5zgBiGidqdiWmt7Io_cuZOYh5EFEfXrYRce8RXg5m7o,4852
+markdown_it/rules_inline/emphasis.py,sha256=7aDLZx0Jlekuvbu3uEUTDhJp00Z0Pj6g4C3-VLhI8Co,3123
+markdown_it/rules_inline/entity.py,sha256=CE8AIGMi5isEa24RNseo0wRmTTaj5YLbgTFdDmBesAU,1651
+markdown_it/rules_inline/escape.py,sha256=KGulwrP5FnqZM7GXY8lf7pyVv0YkR59taZDeHb5cmKg,1659
+markdown_it/rules_inline/fragments_join.py,sha256=_3JbwWYJz74gRHeZk6T8edVJT2IVSsi7FfmJJlieQlA,1493
+markdown_it/rules_inline/html_inline.py,sha256=SBg6HR0HRqCdrkkec0dfOYuQdAqyfeLRFLeQggtgjvg,1130
+markdown_it/rules_inline/image.py,sha256=Wbsg7jgnOtKXIwXGNJOlG7ORThkMkBVolxItC0ph6C0,4141
+markdown_it/rules_inline/link.py,sha256=2oD-fAdB0xyxDRtZLTjzLeWbzJ1k9bbPVQmohb58RuI,4258
+markdown_it/rules_inline/linkify.py,sha256=ifH6sb5wE8PGMWEw9Sr4x0DhMVfNOEBCfFSwKll2O-s,1706
+markdown_it/rules_inline/newline.py,sha256=329r0V3aDjzNtJcvzA3lsFYjzgBrShLAV5uf9hwQL_M,1297
+markdown_it/rules_inline/state_inline.py,sha256=d-menFzbz5FDy1JNgGBF-BASasnVI-9RuOxWz9PnKn4,5003
+markdown_it/rules_inline/strikethrough.py,sha256=pwcPlyhkh5pqFVxRCSrdW5dNCIOtU4eDit7TVDTPIVA,3214
+markdown_it/rules_inline/text.py,sha256=FQqaQRUqbnMLO9ZSWPWQUMEKH6JqWSSSmlZ5Ii9P48o,1119
+markdown_it/token.py,sha256=cWrt9kodfPdizHq_tYrzyIZNtJYNMN1813DPNlunwTg,6381
+markdown_it/tree.py,sha256=56Cdbwu2Aiks7kNYqO_fQZWpPb_n48CUllzjQQfgu1Y,11111
+markdown_it/utils.py,sha256=lVLeX7Af3GaNFfxmMgUbsn5p7cXbwhLq7RSf56UWuRE,5687
+markdown_it_py-4.0.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+markdown_it_py-4.0.0.dist-info/METADATA,sha256=6fyqHi2vP5bYQKCfuqo5T-qt83o22Ip7a2tnJIfGW_s,7288
+markdown_it_py-4.0.0.dist-info/RECORD,,
+markdown_it_py-4.0.0.dist-info/WHEEL,sha256=G2gURzTEtmeR8nrdXUJfNiB3VYVxigPQ-bEQujpNiNs,82
+markdown_it_py-4.0.0.dist-info/entry_points.txt,sha256=T81l7fHQ3pllpQ4wUtQK6a8g_p6wxQbnjKVHCk2WMG4,58
+markdown_it_py-4.0.0.dist-info/licenses/LICENSE,sha256=SiJg1uLND1oVGh6G2_59PtVSseK-q_mUHBulxJy85IQ,1078
+markdown_it_py-4.0.0.dist-info/licenses/LICENSE.markdown-it,sha256=eSxIxahJoV_fnjfovPnm0d0TsytGxkKnSKCkapkZ1HM,1073
diff --git a/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/WHEEL b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/WHEEL
new file mode 100644
index 0000000000000000000000000000000000000000..d8b9936dad9ab2513fa6979f411560d3b6b57e37
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/WHEEL
@@ -0,0 +1,4 @@
+Wheel-Version: 1.0
+Generator: flit 3.12.0
+Root-Is-Purelib: true
+Tag: py3-none-any
diff --git a/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/entry_points.txt b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/entry_points.txt
new file mode 100644
index 0000000000000000000000000000000000000000..7d829cd792a5d754844f433c6a8dd499564fdcbf
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/entry_points.txt
@@ -0,0 +1,3 @@
+[console_scripts]
+markdown-it=markdown_it.cli.parse:main
+
diff --git a/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/licenses/LICENSE b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/licenses/LICENSE
new file mode 100644
index 0000000000000000000000000000000000000000..582ddf59e08277fe6e78cee924d2c84805fe36fe
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/licenses/LICENSE
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2020 ExecutableBookProject
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
diff --git a/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/licenses/LICENSE.markdown-it b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/licenses/LICENSE.markdown-it
new file mode 100644
index 0000000000000000000000000000000000000000..7ffa058cb78f8fb9beb974d9fd429004d2d2e585
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/markdown_it_py-4.0.0.dist-info/licenses/LICENSE.markdown-it
@@ -0,0 +1,22 @@
+Copyright (c) 2014 Vitaly Puzrin, Alex Kocharin.
+
+Permission is hereby granted, free of charge, to any person
+obtaining a copy of this software and associated documentation
+files (the "Software"), to deal in the Software without
+restriction, including without limitation the rights to use,
+copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the
+Software is furnished to do so, subject to the following
+conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
+OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
+HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
+WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
+OTHER DEALINGS IN THE SOFTWARE.
diff --git a/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/INSTALLER b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/INSTALLER
new file mode 100644
index 0000000000000000000000000000000000000000..a1b589e38a32041e49332e5e81c2d363dc418d68
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/LICENSE b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/LICENSE
new file mode 100644
index 0000000000000000000000000000000000000000..2a920c59d8abdd485a774087915986448495fd7c
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/LICENSE
@@ -0,0 +1,46 @@
+Copyright (c) 2015 Vitaly Puzrin, Alex Kocharin.
+Copyright (c) 2021 Taneli Hukkinen
+
+Permission is hereby granted, free of charge, to any person
+obtaining a copy of this software and associated documentation
+files (the "Software"), to deal in the Software without
+restriction, including without limitation the rights to use,
+copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the
+Software is furnished to do so, subject to the following
+conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
+OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
+HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
+WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
+OTHER DEALINGS IN THE SOFTWARE.
+
+--------------------------------------------------------------------------------
+
+.parse() is based on Joyent's node.js `url` code:
+
+Copyright Joyent, Inc. and other Node contributors. All rights reserved.
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to
+deal in the Software without restriction, including without limitation the
+rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
+sell copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
+IN THE SOFTWARE.
diff --git a/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/METADATA b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/METADATA
new file mode 100644
index 0000000000000000000000000000000000000000..b4670e86b6dc207c944c55c5d3b84911fb41157a
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/METADATA
@@ -0,0 +1,32 @@
+Metadata-Version: 2.1
+Name: mdurl
+Version: 0.1.2
+Summary: Markdown URL utilities
+Keywords: markdown,commonmark
+Author-email: Taneli Hukkinen
+Requires-Python: >=3.7
+Description-Content-Type: text/markdown
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Operating System :: MacOS
+Classifier: Operating System :: Microsoft :: Windows
+Classifier: Operating System :: POSIX :: Linux
+Classifier: Programming Language :: Python :: 3 :: Only
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Classifier: Typing :: Typed
+Project-URL: Homepage, https://github.com/executablebooks/mdurl
+
+# mdurl
+
+[](https://github.com/executablebooks/mdurl/actions?query=workflow%3ATests+branch%3Amaster+event%3Apush)
+[](https://codecov.io/gh/executablebooks/mdurl)
+[](https://pypi.org/project/mdurl)
+
+This is a Python port of the JavaScript [mdurl](https://www.npmjs.com/package/mdurl) package.
+See the [upstream README.md file](https://github.com/markdown-it/mdurl/blob/master/README.md) for API documentation.
+
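+As a minimal illustrative sketch (based on the names exported from
+`mdurl/__init__.py` rather than the upstream README), the API mirrors the
+JavaScript package:
+
+```python
+import mdurl
+
+# Parse into an immutable URL named tuple; unlike joyent/node, the query
+# string is kept as-is in `search` rather than parsed further.
+url = mdurl.parse("https://user@example.com:8080/path?q=1#frag")
+print(url.hostname, url.port, url.pathname)  # example.com 8080 /path
+
+# format() reassembles the tuple; encode() percent-encodes unsafe characters
+# using markdown-friendly default character sets.
+print(mdurl.format(url))
+print(mdurl.encode("https://example.com/some path"))  # space becomes %20
+```
+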
diff --git a/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/RECORD b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/RECORD
new file mode 100644
index 0000000000000000000000000000000000000000..594bce3c8502cf7a0f3d2cf18c394d389a2b3e82
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/RECORD
@@ -0,0 +1,18 @@
+mdurl-0.1.2.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+mdurl-0.1.2.dist-info/LICENSE,sha256=fGBd9uKGZ6lgMRjpgnT2SknOPu0NJvzM6VNKNF4O-VU,2338
+mdurl-0.1.2.dist-info/METADATA,sha256=tTsp1I9Jk2cFP9o8gefOJ9JVg4Drv4PmYCOwLrfd0l0,1638
+mdurl-0.1.2.dist-info/RECORD,,
+mdurl-0.1.2.dist-info/WHEEL,sha256=4TfKIB_xu-04bc2iKz6_zFt-gEFEEDU_31HGhqzOCE8,81
+mdurl/__init__.py,sha256=1vpE89NyXniIRZNC_4f6BPm3Ub4bPntjfyyhLRR7opU,547
+mdurl/__pycache__/__init__.cpython-310.pyc,,
+mdurl/__pycache__/_decode.cpython-310.pyc,,
+mdurl/__pycache__/_encode.cpython-310.pyc,,
+mdurl/__pycache__/_format.cpython-310.pyc,,
+mdurl/__pycache__/_parse.cpython-310.pyc,,
+mdurl/__pycache__/_url.cpython-310.pyc,,
+mdurl/_decode.py,sha256=3Q_gDQqU__TvDbu7x-b9LjbVl4QWy5g_qFwljcuvN_Y,3004
+mdurl/_encode.py,sha256=goJLUFt1h4rVZNqqm9t15Nw2W-bFXYQEy3aR01ImWvs,2602
+mdurl/_format.py,sha256=xZct0mdePXA0H3kAqxjGtlB5O86G35DAYMGkA44CmB4,626
+mdurl/_parse.py,sha256=ezZSkM2_4NQ2Zx047sEdcJG7NYQRFHiZK7Y8INHFzwY,11374
+mdurl/_url.py,sha256=5kQnRQN2A_G4svLnRzZcG0bfoD9AbBrYDXousDHZ3z0,284
+mdurl/py.typed,sha256=8PjyZ1aVoQpRVvt71muvuq5qE-jTFZkK-GLHkhdebmc,26
diff --git a/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/WHEEL b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/WHEEL
new file mode 100644
index 0000000000000000000000000000000000000000..668ba4d0151c5c76ed6e758061daa8c1b0bf5d21
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl-0.1.2.dist-info/WHEEL
@@ -0,0 +1,4 @@
+Wheel-Version: 1.0
+Generator: flit 3.7.1
+Root-Is-Purelib: true
+Tag: py3-none-any
diff --git a/venv/lib/python3.10/site-packages/mdurl/__init__.py b/venv/lib/python3.10/site-packages/mdurl/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..cdbb640e004cef0e950a656a53d92d89d82c7472
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl/__init__.py
@@ -0,0 +1,18 @@
+__all__ = (
+ "decode",
+ "DECODE_DEFAULT_CHARS",
+ "DECODE_COMPONENT_CHARS",
+ "encode",
+ "ENCODE_DEFAULT_CHARS",
+ "ENCODE_COMPONENT_CHARS",
+ "format",
+ "parse",
+ "URL",
+)
+__version__ = "0.1.2" # DO NOT EDIT THIS LINE MANUALLY. LET bump2version UTILITY DO IT
+
+from mdurl._decode import DECODE_COMPONENT_CHARS, DECODE_DEFAULT_CHARS, decode
+from mdurl._encode import ENCODE_COMPONENT_CHARS, ENCODE_DEFAULT_CHARS, encode
+from mdurl._format import format
+from mdurl._parse import url_parse as parse
+from mdurl._url import URL
diff --git a/venv/lib/python3.10/site-packages/mdurl/__pycache__/__init__.cpython-310.pyc b/venv/lib/python3.10/site-packages/mdurl/__pycache__/__init__.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..744f8483255ad77a093afdeeaddd136eff5e3298
Binary files /dev/null and b/venv/lib/python3.10/site-packages/mdurl/__pycache__/__init__.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/mdurl/__pycache__/_decode.cpython-310.pyc b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_decode.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..f8f4c1febcb91304435ae475a73baa9f3591c6db
Binary files /dev/null and b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_decode.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/mdurl/__pycache__/_encode.cpython-310.pyc b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_encode.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..5b7c1e5920f06fc0f7cab3f60e4a0ebecd1ae315
Binary files /dev/null and b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_encode.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/mdurl/__pycache__/_format.cpython-310.pyc b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_format.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..91d90eec363261fc98c419622e9f87b531d4d5c8
Binary files /dev/null and b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_format.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/mdurl/__pycache__/_parse.cpython-310.pyc b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_parse.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..dc48b30bc4213eb3dff6b3d37b187d13a0a9d2b4
Binary files /dev/null and b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_parse.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/mdurl/__pycache__/_url.cpython-310.pyc b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_url.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..7b812e3f13dec8f9e19295fe0eb5ae3fae278c4f
Binary files /dev/null and b/venv/lib/python3.10/site-packages/mdurl/__pycache__/_url.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/mdurl/_decode.py b/venv/lib/python3.10/site-packages/mdurl/_decode.py
new file mode 100644
index 0000000000000000000000000000000000000000..9b50a2dde976a6d43491ec6f20d12e60f6f6597f
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl/_decode.py
@@ -0,0 +1,104 @@
+from __future__ import annotations
+
+from collections.abc import Sequence
+import functools
+import re
+
+DECODE_DEFAULT_CHARS = ";/?:@&=+$,#"
+DECODE_COMPONENT_CHARS = ""
+
+decode_cache: dict[str, list[str]] = {}
+
+
+def get_decode_cache(exclude: str) -> Sequence[str]:
+ if exclude in decode_cache:
+ return decode_cache[exclude]
+
+ cache: list[str] = []
+ decode_cache[exclude] = cache
+
+ for i in range(128):
+ ch = chr(i)
+ cache.append(ch)
+
+ for i in range(len(exclude)):
+ ch_code = ord(exclude[i])
+ cache[ch_code] = "%" + ("0" + hex(ch_code)[2:].upper())[-2:]
+
+ return cache
+
+
+# Decode percent-encoded string.
+#
+def decode(string: str, exclude: str = DECODE_DEFAULT_CHARS) -> str:
+ cache = get_decode_cache(exclude)
+ repl_func = functools.partial(repl_func_with_cache, cache=cache)
+ return re.sub(r"(%[a-f0-9]{2})+", repl_func, string, flags=re.IGNORECASE)
+
+
+def repl_func_with_cache(match: re.Match, cache: Sequence[str]) -> str:
+ seq = match.group()
+ result = ""
+
+ i = 0
+ l = len(seq) # noqa: E741
+ while i < l:
+ b1 = int(seq[i + 1 : i + 3], 16)
+
+ if b1 < 0x80:
+ result += cache[b1]
+ i += 3 # emulate JS for loop statement3
+ continue
+
+ if (b1 & 0xE0) == 0xC0 and (i + 3 < l):
+ # 110xxxxx 10xxxxxx
+ b2 = int(seq[i + 4 : i + 6], 16)
+
+ if (b2 & 0xC0) == 0x80:
+ all_bytes = bytes((b1, b2))
+ try:
+ result += all_bytes.decode()
+ except UnicodeDecodeError:
+ result += "\ufffd" * 2
+
+ i += 3
+ i += 3 # emulate JS for loop statement3
+ continue
+
+ if (b1 & 0xF0) == 0xE0 and (i + 6 < l):
+ # 1110xxxx 10xxxxxx 10xxxxxx
+ b2 = int(seq[i + 4 : i + 6], 16)
+ b3 = int(seq[i + 7 : i + 9], 16)
+
+ if (b2 & 0xC0) == 0x80 and (b3 & 0xC0) == 0x80:
+ all_bytes = bytes((b1, b2, b3))
+ try:
+ result += all_bytes.decode()
+ except UnicodeDecodeError:
+ result += "\ufffd" * 3
+
+ i += 6
+ i += 3 # emulate JS for loop statement3
+ continue
+
+ if (b1 & 0xF8) == 0xF0 and (i + 9 < l):
+ # 111110xx 10xxxxxx 10xxxxxx 10xxxxxx
+ b2 = int(seq[i + 4 : i + 6], 16)
+ b3 = int(seq[i + 7 : i + 9], 16)
+ b4 = int(seq[i + 10 : i + 12], 16)
+
+ if (b2 & 0xC0) == 0x80 and (b3 & 0xC0) == 0x80 and (b4 & 0xC0) == 0x80:
+ all_bytes = bytes((b1, b2, b3, b4))
+ try:
+ result += all_bytes.decode()
+ except UnicodeDecodeError:
+ result += "\ufffd" * 4
+
+ i += 9
+ i += 3 # emulate JS for loop statement3
+ continue
+
+ result += "\ufffd"
+ i += 3 # emulate JS for loop statement3
+
+ return result
diff --git a/venv/lib/python3.10/site-packages/mdurl/_encode.py b/venv/lib/python3.10/site-packages/mdurl/_encode.py
new file mode 100644
index 0000000000000000000000000000000000000000..bc2e5b917afe9e9ecaa6f11af7a9ac82704d3914
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl/_encode.py
@@ -0,0 +1,85 @@
+from __future__ import annotations
+
+from collections.abc import Sequence
+from string import ascii_letters, digits, hexdigits
+from urllib.parse import quote as encode_uri_component
+
+ASCII_LETTERS_AND_DIGITS = ascii_letters + digits
+
+ENCODE_DEFAULT_CHARS = ";/?:@&=+$,-_.!~*'()#"
+ENCODE_COMPONENT_CHARS = "-_.!~*'()"
+
+encode_cache: dict[str, list[str]] = {}
+
+
+# Create a lookup array where anything but characters in `chars` string
+# and alphanumeric chars is percent-encoded.
+def get_encode_cache(exclude: str) -> Sequence[str]:
+ if exclude in encode_cache:
+ return encode_cache[exclude]
+
+ cache: list[str] = []
+ encode_cache[exclude] = cache
+
+ for i in range(128):
+ ch = chr(i)
+
+ if ch in ASCII_LETTERS_AND_DIGITS:
+ # always allow unencoded alphanumeric characters
+ cache.append(ch)
+ else:
+ cache.append("%" + ("0" + hex(i)[2:].upper())[-2:])
+
+ for i in range(len(exclude)):
+ cache[ord(exclude[i])] = exclude[i]
+
+ return cache
+
+
+# Encode unsafe characters with percent-encoding, skipping already
+# encoded sequences.
+#
+# - string - string to encode
+# - exclude - list of characters to ignore (in addition to a-zA-Z0-9)
+# - keepEscaped - don't encode '%' in a correct escape sequence (default: true)
+def encode(
+ string: str, exclude: str = ENCODE_DEFAULT_CHARS, *, keep_escaped: bool = True
+) -> str:
+ result = ""
+
+ cache = get_encode_cache(exclude)
+
+ l = len(string) # noqa: E741
+ i = 0
+ while i < l:
+ code = ord(string[i])
+
+ # %
+ if keep_escaped and code == 0x25 and i + 2 < l:
+ if all(c in hexdigits for c in string[i + 1 : i + 3]):
+ result += string[i : i + 3]
+ i += 2
+ i += 1 # JS for loop statement3
+ continue
+
+ if code < 128:
+ result += cache[code]
+ i += 1 # JS for loop statement3
+ continue
+
+ if code >= 0xD800 and code <= 0xDFFF:
+ if code >= 0xD800 and code <= 0xDBFF and i + 1 < l:
+ next_code = ord(string[i + 1])
+ if next_code >= 0xDC00 and next_code <= 0xDFFF:
+ result += encode_uri_component(string[i] + string[i + 1])
+ i += 1
+ i += 1 # JS for loop statement3
+ continue
+ result += "%EF%BF%BD"
+ i += 1 # JS for loop statement3
+ continue
+
+ result += encode_uri_component(string[i])
+ i += 1 # JS for loop statement3
+
+ return result
diff --git a/venv/lib/python3.10/site-packages/mdurl/_format.py b/venv/lib/python3.10/site-packages/mdurl/_format.py
new file mode 100644
index 0000000000000000000000000000000000000000..12524ca626065183ec9974f3d7d08dadd4a7d3e8
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl/_format.py
@@ -0,0 +1,27 @@
+from __future__ import annotations
+
+from typing import TYPE_CHECKING
+
+if TYPE_CHECKING:
+ from mdurl._url import URL
+
+
+def format(url: URL) -> str: # noqa: A001
+ result = ""
+
+ result += url.protocol or ""
+ result += "//" if url.slashes else ""
+ result += url.auth + "@" if url.auth else ""
+
+ if url.hostname and ":" in url.hostname:
+ # ipv6 address
+ result += "[" + url.hostname + "]"
+ else:
+ result += url.hostname or ""
+
+ result += ":" + url.port if url.port else ""
+ result += url.pathname or ""
+ result += url.search or ""
+ result += url.hash or ""
+
+ return result
diff --git a/venv/lib/python3.10/site-packages/mdurl/_parse.py b/venv/lib/python3.10/site-packages/mdurl/_parse.py
new file mode 100644
index 0000000000000000000000000000000000000000..ffeeac768dca3bff60c55c9b1f0bc0fbb4cec7b1
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl/_parse.py
@@ -0,0 +1,304 @@
+# Copyright Joyent, Inc. and other Node contributors.
+#
+# Permission is hereby granted, free of charge, to any person obtaining a
+# copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to permit
+# persons to whom the Software is furnished to do so, subject to the
+# following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+# NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+# USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+
+# Changes from joyent/node:
+#
+# 1. No leading slash in paths,
+# e.g. in `url.parse('http://foo?bar')` pathname is ``, not `/`
+#
+# 2. Backslashes are not replaced with slashes,
+# so `http:\\example.org\` is treated like a relative path
+#
+# 3. Trailing colon is treated like a part of the path,
+# i.e. in `http://example.org:foo` pathname is `:foo`
+#
+# 4. Nothing is URL-encoded in the resulting object,
+# (in joyent/node some chars in auth and paths are encoded)
+#
+# 5. `url.parse()` does not have `parseQueryString` argument
+#
+# 6. Removed extraneous result properties: `host`, `path`, `query`, etc.,
+# which can be constructed using other parts of the url.
+
+from __future__ import annotations
+
+from collections import defaultdict
+import re
+
+from mdurl._url import URL
+
+# Reference: RFC 3986, RFC 1808, RFC 2396
+
+# define these here so at least they only have to be
+# compiled once on the first module load.
+PROTOCOL_PATTERN = re.compile(r"^([a-z0-9.+-]+:)", flags=re.IGNORECASE)
+PORT_PATTERN = re.compile(r":[0-9]*$")
+
+# Special case for a simple path URL
+SIMPLE_PATH_PATTERN = re.compile(r"^(//?(?!/)[^?\s]*)(\?[^\s]*)?$")
+
+# RFC 2396: characters reserved for delimiting URLs.
+# We actually just auto-escape these.
+DELIMS = ("<", ">", '"', "`", " ", "\r", "\n", "\t")
+
+# RFC 2396: characters not allowed for various reasons.
+UNWISE = ("{", "}", "|", "\\", "^", "`") + DELIMS
+
+# Allowed by RFCs, but cause of XSS attacks. Always escape these.
+AUTO_ESCAPE = ("'",) + UNWISE
+# Characters that are never ever allowed in a hostname.
+# Note that any invalid chars are also handled, but these
+# are the ones that are *expected* to be seen, so we fast-path
+# them.
+NON_HOST_CHARS = ("%", "/", "?", ";", "#") + AUTO_ESCAPE
+HOST_ENDING_CHARS = ("/", "?", "#")
+HOSTNAME_MAX_LEN = 255
+HOSTNAME_PART_PATTERN = re.compile(r"^[+a-z0-9A-Z_-]{0,63}$")
+HOSTNAME_PART_START = re.compile(r"^([+a-z0-9A-Z_-]{0,63})(.*)$")
+# protocols that can allow "unsafe" and "unwise" chars.
+
+# protocols that never have a hostname.
+HOSTLESS_PROTOCOL = defaultdict(
+ bool,
+ {
+ "javascript": True,
+ "javascript:": True,
+ },
+)
+# protocols that always contain a // bit.
+SLASHED_PROTOCOL = defaultdict(
+ bool,
+ {
+ "http": True,
+ "https": True,
+ "ftp": True,
+ "gopher": True,
+ "file": True,
+ "http:": True,
+ "https:": True,
+ "ftp:": True,
+ "gopher:": True,
+ "file:": True,
+ },
+)
+
+
+class MutableURL:
+ def __init__(self) -> None:
+ self.protocol: str | None = None
+ self.slashes: bool = False
+ self.auth: str | None = None
+ self.port: str | None = None
+ self.hostname: str | None = None
+ self.hash: str | None = None
+ self.search: str | None = None
+ self.pathname: str | None = None
+
+ def parse(self, url: str, slashes_denote_host: bool) -> "MutableURL":
+ lower_proto = ""
+ slashes = False
+ rest = url
+
+ # trim before proceeding.
+ # This is to support parse stuff like " http://foo.com \n"
+ rest = rest.strip()
+
+ if not slashes_denote_host and len(url.split("#")) == 1:
+ # Try fast path regexp
+ simple_path = SIMPLE_PATH_PATTERN.match(rest)
+ if simple_path:
+ self.pathname = simple_path.group(1)
+ if simple_path.group(2):
+ self.search = simple_path.group(2)
+ return self
+
+ proto = ""
+ proto_match = PROTOCOL_PATTERN.match(rest)
+ if proto_match:
+ proto = proto_match.group()
+ lower_proto = proto.lower()
+ self.protocol = proto
+ rest = rest[len(proto) :]
+
+ # figure out if it's got a host
+ # user@server is *always* interpreted as a hostname, and url
+ # resolution will treat //foo/bar as host=foo,path=bar because that's
+ # how the browser resolves relative URLs.
+ if slashes_denote_host or proto or re.search(r"^//[^@/]+@[^@/]+", rest):
+ slashes = rest.startswith("//")
+ if slashes and not (proto and HOSTLESS_PROTOCOL[proto]):
+ rest = rest[2:]
+ self.slashes = True
+
+ if not HOSTLESS_PROTOCOL[proto] and (
+ slashes or (proto and not SLASHED_PROTOCOL[proto])
+ ):
+
+ # there's a hostname.
+ # the first instance of /, ?, ;, or # ends the host.
+ #
+ # If there is an @ in the hostname, then non-host chars *are* allowed
+ # to the left of the last @ sign, unless some host-ending character
+ # comes *before* the @-sign.
+ # URLs are obnoxious.
+ #
+ # ex:
+ # http://a@b@c/ => user:a@b host:c
+ # http://a@b?@c => user:a host:c path:/?@c
+
+ # v0.12 TODO(isaacs): This is not quite how Chrome does things.
+ # Review our test case against browsers more comprehensively.
+
+ # find the first instance of any hostEndingChars
+ host_end = -1
+ for i in range(len(HOST_ENDING_CHARS)):
+ hec = rest.find(HOST_ENDING_CHARS[i])
+ if hec != -1 and (host_end == -1 or hec < host_end):
+ host_end = hec
+
+ # at this point, either we have an explicit point where the
+ # auth portion cannot go past, or the last @ char is the decider.
+ if host_end == -1:
+ # atSign can be anywhere.
+ at_sign = rest.rfind("@")
+ else:
+ # atSign must be in auth portion.
+ # http://a@b/c@d => host:b auth:a path:/c@d
+ at_sign = rest.rfind("@", 0, host_end + 1)
+
+ # Now we have a portion which is definitely the auth.
+ # Pull that off.
+ if at_sign != -1:
+ auth = rest[:at_sign]
+ rest = rest[at_sign + 1 :]
+ self.auth = auth
+
+ # the host is the remaining to the left of the first non-host char
+ host_end = -1
+ for i in range(len(NON_HOST_CHARS)):
+ hec = rest.find(NON_HOST_CHARS[i])
+ if hec != -1 and (host_end == -1 or hec < host_end):
+ host_end = hec
+ # if we still have not hit it, then the entire thing is a host.
+ if host_end == -1:
+ host_end = len(rest)
+
+ if host_end > 0 and rest[host_end - 1] == ":":
+ host_end -= 1
+ host = rest[:host_end]
+ rest = rest[host_end:]
+
+ # pull out port.
+ self.parse_host(host)
+
+ # we've indicated that there is a hostname,
+ # so even if it's empty, it has to be present.
+ self.hostname = self.hostname or ""
+
+ # if hostname begins with [ and ends with ]
+ # assume that it's an IPv6 address.
+ ipv6_hostname = self.hostname.startswith("[") and self.hostname.endswith(
+ "]"
+ )
+
+ # validate a little.
+ if not ipv6_hostname:
+ hostparts = self.hostname.split(".")
+ l = len(hostparts) # noqa: E741
+ i = 0
+ while i < l:
+ part = hostparts[i]
+ if not part:
+ i += 1 # emulate statement3 in JS for loop
+ continue
+ if not HOSTNAME_PART_PATTERN.search(part):
+ newpart = ""
+ k = len(part)
+ j = 0
+ while j < k:
+ if ord(part[j]) > 127:
+ # we replace non-ASCII char with a temporary placeholder
+ # we need this to make sure size of hostname is not
+ # broken by replacing non-ASCII by nothing
+ newpart += "x"
+ else:
+ newpart += part[j]
+ j += 1 # emulate statement3 in JS for loop
+
+ # we test again with ASCII char only
+ if not HOSTNAME_PART_PATTERN.search(newpart):
+ valid_parts = hostparts[:i]
+ not_host = hostparts[i + 1 :]
+ bit = HOSTNAME_PART_START.search(part)
+ if bit:
+ valid_parts.append(bit.group(1))
+ not_host.insert(0, bit.group(2))
+ if not_host:
+ rest = ".".join(not_host) + rest
+ self.hostname = ".".join(valid_parts)
+ break
+ i += 1 # emulate statement3 in JS for loop
+
+ if len(self.hostname) > HOSTNAME_MAX_LEN:
+ self.hostname = ""
+
+ # strip [ and ] from the hostname
+ # the host field still retains them, though
+ if ipv6_hostname:
+ self.hostname = self.hostname[1:-1]
+
+ # chop off from the tail first.
+ hash = rest.find("#") # noqa: A001
+ if hash != -1:
+ # got a fragment string.
+ self.hash = rest[hash:]
+ rest = rest[:hash]
+ qm = rest.find("?")
+ if qm != -1:
+ self.search = rest[qm:]
+ rest = rest[:qm]
+ if rest:
+ self.pathname = rest
+ if SLASHED_PROTOCOL[lower_proto] and self.hostname and not self.pathname:
+ self.pathname = ""
+
+ return self
+
+ def parse_host(self, host: str) -> None:
+ port_match = PORT_PATTERN.search(host)
+ if port_match:
+ port = port_match.group()
+ if port != ":":
+ self.port = port[1:]
+ host = host[: -len(port)]
+ if host:
+ self.hostname = host
+
+
+def url_parse(url: URL | str, *, slashes_denote_host: bool = False) -> URL:
+ if isinstance(url, URL):
+ return url
+ u = MutableURL()
+ u.parse(url, slashes_denote_host)
+ return URL(
+ u.protocol, u.slashes, u.auth, u.port, u.hostname, u.hash, u.search, u.pathname
+ )
diff --git a/venv/lib/python3.10/site-packages/mdurl/_url.py b/venv/lib/python3.10/site-packages/mdurl/_url.py
new file mode 100644
index 0000000000000000000000000000000000000000..f866e7a179c8854e37c9bba6294f48681e5d99d7
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl/_url.py
@@ -0,0 +1,14 @@
+from __future__ import annotations
+
+from typing import NamedTuple
+
+
+class URL(NamedTuple):
+ protocol: str | None
+ slashes: bool
+ auth: str | None
+ port: str | None
+ hostname: str | None
+ hash: str | None # noqa: A003
+ search: str | None
+ pathname: str | None
diff --git a/venv/lib/python3.10/site-packages/mdurl/py.typed b/venv/lib/python3.10/site-packages/mdurl/py.typed
new file mode 100644
index 0000000000000000000000000000000000000000..7632ecf77545c5e5501cb3fc5719df0761104ca2
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/mdurl/py.typed
@@ -0,0 +1 @@
+# Marker file for PEP 561
diff --git a/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/INSTALLER b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/INSTALLER
new file mode 100644
index 0000000000000000000000000000000000000000..a1b589e38a32041e49332e5e81c2d363dc418d68
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/METADATA b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/METADATA
new file mode 100644
index 0000000000000000000000000000000000000000..3200e601f970271fdde3fcc74f9af4423655a79d
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/METADATA
@@ -0,0 +1,107 @@
+Metadata-Version: 2.4
+Name: packaging
+Version: 26.0
+Summary: Core utilities for Python packages
+Author-email: Donald Stufft
+Requires-Python: >=3.8
+Description-Content-Type: text/x-rst
+License-Expression: Apache-2.0 OR BSD-2-Clause
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3 :: Only
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
+Classifier: Programming Language :: Python :: 3.14
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Classifier: Typing :: Typed
+License-File: LICENSE
+License-File: LICENSE.APACHE
+License-File: LICENSE.BSD
+Project-URL: Documentation, https://packaging.pypa.io/
+Project-URL: Source, https://github.com/pypa/packaging
+
+packaging
+=========
+
+.. start-intro
+
+Reusable core utilities for various Python Packaging
+`interoperability specifications `_.
+
+This library provides utilities that implement the interoperability
+specifications which have clearly one correct behaviour (eg: :pep:`440`)
+or benefit greatly from having a single shared implementation (eg: :pep:`425`).
+
+.. end-intro
+
+The ``packaging`` project includes the following: version handling, specifiers,
+markers, requirements, tags, metadata, lockfiles, utilities.
+
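+As a brief illustrative sketch, the version-handling and specifier utilities
+mentioned above can be used as follows (``Version`` and ``SpecifierSet`` live
+in ``packaging.version`` and ``packaging.specifiers`` respectively)::
+
+    from packaging.specifiers import SpecifierSet
+    from packaging.version import Version
+
+    spec = SpecifierSet(">=1.0,<2.0")   # a PEP 440 version specifier
+    print(Version("1.4.2") in spec)     # True
+    print(Version("2.0.0") in spec)     # False
+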
+Documentation
+-------------
+
+The `documentation`_ provides information and the API for the following:
+
+- Version Handling
+- Specifiers
+- Markers
+- Requirements
+- Tags
+- Metadata
+- Lockfiles
+- Utilities
+
+Installation
+------------
+
+Use ``pip`` to install these utilities::
+
+ pip install packaging
+
+The ``packaging`` library uses calendar-based versioning (``YY.N``).
+
+Discussion
+----------
+
+If you run into bugs, you can file them in our `issue tracker`_.
+
+You can also join ``#pypa`` on Freenode to ask questions or get involved.
+
+
+.. _`documentation`: https://packaging.pypa.io/
+.. _`issue tracker`: https://github.com/pypa/packaging/issues
+
+
+Code of Conduct
+---------------
+
+Everyone interacting in the packaging project's codebases, issue trackers, chat
+rooms, and mailing lists is expected to follow the `PSF Code of Conduct`_.
+
+.. _PSF Code of Conduct: https://github.com/pypa/.github/blob/main/CODE_OF_CONDUCT.md
+
+Contributing
+------------
+
+The ``CONTRIBUTING.rst`` file outlines how to contribute to this project as
+well as how to report a potential security issue. The documentation for this
+project also covers information about `project development`_ and `security`_.
+
+.. _`project development`: https://packaging.pypa.io/en/latest/development/
+.. _`security`: https://packaging.pypa.io/en/latest/security/
+
+Project History
+---------------
+
+Please review the ``CHANGELOG.rst`` file or the `Changelog documentation`_ for
+recent changes and project history.
+
+.. _`Changelog documentation`: https://packaging.pypa.io/en/latest/changelog/
+
diff --git a/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/RECORD b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/RECORD
new file mode 100644
index 0000000000000000000000000000000000000000..48cf6a5c232aa03a1fb903e44cee6e2c26e7800c
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/RECORD
@@ -0,0 +1,42 @@
+packaging-26.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+packaging-26.0.dist-info/METADATA,sha256=M2K7fWom2iliuo2qpHhc0LrKwhq6kIoRlcyPWVgKJlo,3309
+packaging-26.0.dist-info/RECORD,,
+packaging-26.0.dist-info/WHEEL,sha256=G2gURzTEtmeR8nrdXUJfNiB3VYVxigPQ-bEQujpNiNs,82
+packaging-26.0.dist-info/licenses/LICENSE,sha256=ytHvW9NA1z4HS6YU0m996spceUDD2MNIUuZcSQlobEg,197
+packaging-26.0.dist-info/licenses/LICENSE.APACHE,sha256=DVQuDIgE45qn836wDaWnYhSdxoLXgpRRKH4RuTjpRZQ,10174
+packaging-26.0.dist-info/licenses/LICENSE.BSD,sha256=tw5-m3QvHMb5SLNMFqo5_-zpQZY2S8iP8NIYDwAo-sU,1344
+packaging/__init__.py,sha256=y4lVbpeBzCGk-IPDw5BGBZ_b0P3ukEEJZAbGYc6Ey8c,494
+packaging/__pycache__/__init__.cpython-310.pyc,,
+packaging/__pycache__/_elffile.cpython-310.pyc,,
+packaging/__pycache__/_manylinux.cpython-310.pyc,,
+packaging/__pycache__/_musllinux.cpython-310.pyc,,
+packaging/__pycache__/_parser.cpython-310.pyc,,
+packaging/__pycache__/_structures.cpython-310.pyc,,
+packaging/__pycache__/_tokenizer.cpython-310.pyc,,
+packaging/__pycache__/markers.cpython-310.pyc,,
+packaging/__pycache__/metadata.cpython-310.pyc,,
+packaging/__pycache__/pylock.cpython-310.pyc,,
+packaging/__pycache__/requirements.cpython-310.pyc,,
+packaging/__pycache__/specifiers.cpython-310.pyc,,
+packaging/__pycache__/tags.cpython-310.pyc,,
+packaging/__pycache__/utils.cpython-310.pyc,,
+packaging/__pycache__/version.cpython-310.pyc,,
+packaging/_elffile.py,sha256=-sKkptYqzYw2-x3QByJa5mB4rfPWu1pxkZHRx1WAFCY,3211
+packaging/_manylinux.py,sha256=Hf6nB0cOrayEs96-p3oIXAgGnFquv20DO5l-o2_Xnv0,9559
+packaging/_musllinux.py,sha256=Z6swjH3MA7XS3qXnmMN7QPhqP3fnoYI0eQ18e9-HgAE,2707
+packaging/_parser.py,sha256=U_DajsEx2VoC_F46fSVV3hDKNCWoQYkPkasO3dld0ig,10518
+packaging/_structures.py,sha256=Hn49Ta8zV9Wo8GiCL8Nl2ARZY983Un3pruZGVNldPwE,1514
+packaging/_tokenizer.py,sha256=M8EwNIdXeL9NMFuFrQtiOKwjka_xFx8KjRQnfE8O_z8,5421
+packaging/licenses/__init__.py,sha256=TwXLHZCXwSgdFwRLPxW602T6mSieunSFHM6fp8pgW78,5819
+packaging/licenses/__pycache__/__init__.cpython-310.pyc,,
+packaging/licenses/__pycache__/_spdx.cpython-310.pyc,,
+packaging/licenses/_spdx.py,sha256=WW7DXiyg68up_YND_wpRYlr1SHhiV4FfJLQffghhMxQ,51122
+packaging/markers.py,sha256=ZX-cLvW1S3cZcEc0fHI4z7zSx5U2T19yMpDP_mE-CYw,12771
+packaging/metadata.py,sha256=CWVZpN_HfoYMSSDuCP7igOvGgqA9AOmpW8f3qTisfnc,39360
+packaging/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+packaging/pylock.py,sha256=-R1uNfJ4PaLto7Mg62YsGOHgvskuiIEqPwxOywl42Jk,22537
+packaging/requirements.py,sha256=PMCAWD8aNMnVD-6uZMedhBuAVX2573eZ4yPBLXmz04I,2870
+packaging/specifiers.py,sha256=EPNPimY_zFivthv1vdjZYz5IqkKGsnKR2yKh-EVyvZw,40797
+packaging/tags.py,sha256=cXLV1pJD3UtJlDg7Wz3zrfdQhRZqr8jumSAKKAAd2xE,22856
+packaging/utils.py,sha256=N4c6oZzFJy6klTZ3AnkNz7sSkJesuFWPp68LA3B5dAo,5040
+packaging/version.py,sha256=7XWlL2IDYLwDYC0ht6cFEhapLwLWbmyo4rb7sEFj0x8,23272
diff --git a/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/WHEEL b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/WHEEL
new file mode 100644
index 0000000000000000000000000000000000000000..d8b9936dad9ab2513fa6979f411560d3b6b57e37
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/WHEEL
@@ -0,0 +1,4 @@
+Wheel-Version: 1.0
+Generator: flit 3.12.0
+Root-Is-Purelib: true
+Tag: py3-none-any
diff --git a/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/licenses/LICENSE b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/licenses/LICENSE
new file mode 100644
index 0000000000000000000000000000000000000000..6f62d44e4ef733c0e713afcd2371fed7f2b3de67
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/licenses/LICENSE
@@ -0,0 +1,3 @@
+This software is made available under the terms of *either* of the licenses
+found in LICENSE.APACHE or LICENSE.BSD. Contributions to this software is made
+under the terms of *both* these licenses.
diff --git a/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/licenses/LICENSE.APACHE b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/licenses/LICENSE.APACHE
new file mode 100644
index 0000000000000000000000000000000000000000..f433b1a53f5b830a205fd2df78e2b34974656c7b
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/licenses/LICENSE.APACHE
@@ -0,0 +1,177 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
diff --git a/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/licenses/LICENSE.BSD b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/licenses/LICENSE.BSD
new file mode 100644
index 0000000000000000000000000000000000000000..42ce7b75c92fb01a3f6ed17eea363f756b7da582
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging-26.0.dist-info/licenses/LICENSE.BSD
@@ -0,0 +1,23 @@
+Copyright (c) Donald Stufft and individual contributors.
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are met:
+
+ 1. Redistributions of source code must retain the above copyright notice,
+ this list of conditions and the following disclaimer.
+
+ 2. Redistributions in binary form must reproduce the above copyright
+ notice, this list of conditions and the following disclaimer in the
+ documentation and/or other materials provided with the distribution.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
+ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
+WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/venv/lib/python3.10/site-packages/packaging/__init__.py b/venv/lib/python3.10/site-packages/packaging/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..21695a74b5107c96ba4bb2cbca6b7f259dacd330
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/__init__.py
@@ -0,0 +1,15 @@
+# This file is dual licensed under the terms of the Apache License, Version
+# 2.0, and the BSD License. See the LICENSE file in the root of this repository
+# for complete details.
+
+__title__ = "packaging"
+__summary__ = "Core utilities for Python packages"
+__uri__ = "https://github.com/pypa/packaging"
+
+__version__ = "26.0"
+
+__author__ = "Donald Stufft and individual contributors"
+__email__ = "donald@stufft.io"
+
+__license__ = "BSD-2-Clause or Apache-2.0"
+__copyright__ = f"2014 {__author__}"
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/__init__.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/__init__.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..60d7f426e5147e6b69261a7ea3bf905c34d94c96
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/__init__.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/_elffile.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/_elffile.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..dcc5c285d3c3abe2232d81ac54bd098e89e6adb1
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/_elffile.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/_manylinux.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/_manylinux.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..2052d7b4ea21fc32a4f6780df8514b9d6989c3f8
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/_manylinux.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/_musllinux.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/_musllinux.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..6f4d3c6c6e79e99347921f65a9deaa893c16bb9d
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/_musllinux.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/_parser.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/_parser.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..0ec516109d572136003c9aa6d33ef66cd4e3c73f
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/_parser.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/_structures.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/_structures.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..6af51d821b5cc4cfff4950ea512f1b4b2166117c
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/_structures.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/_tokenizer.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/_tokenizer.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..27c5af436d9a3ad0c0becb59f4e18b447e68c2e7
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/_tokenizer.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/markers.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/markers.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..53e43a566b3cd7f502b235b586eef88a2617eab0
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/markers.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/metadata.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/metadata.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..19f23e04cbe86d2cafbc2f652ad977ae1bb11f7e
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/metadata.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/pylock.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/pylock.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..7acd740d7a9b0fc537d90f1f543c69b37a7d192e
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/pylock.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/requirements.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/requirements.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..9c02e2f67c81a3ab5e21c66488ecf1d11c521e0c
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/requirements.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/specifiers.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/specifiers.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..9c911c25bf70766d823991a09fe33fa1a0abf578
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/specifiers.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/tags.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/tags.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..6e1b0a3643ac605aea35fae5fa16b6ffe72b520b
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/tags.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/utils.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/utils.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..da6472f9223db1a319872b49204428363c12e441
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/utils.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/__pycache__/version.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/__pycache__/version.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..88194d3392c9117d81978e2528141c1c6615ebfc
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/__pycache__/version.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/_elffile.py b/venv/lib/python3.10/site-packages/packaging/_elffile.py
new file mode 100644
index 0000000000000000000000000000000000000000..497b0645217512ae2ba8ff61341fd2bbfa3648cd
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/_elffile.py
@@ -0,0 +1,108 @@
+"""
+ELF file parser.
+
+This provides a class ``ELFFile`` that parses an ELF executable in a similar
+interface to ``ZipFile``. Only the read interface is implemented.
+
+ELF header: https://refspecs.linuxfoundation.org/elf/gabi4+/ch4.eheader.html
+"""
+
+from __future__ import annotations
+
+import enum
+import os
+import struct
+from typing import IO
+
+
+class ELFInvalid(ValueError):
+ pass
+
+
+class EIClass(enum.IntEnum):
+ C32 = 1
+ C64 = 2
+
+
+class EIData(enum.IntEnum):
+ Lsb = 1
+ Msb = 2
+
+
+class EMachine(enum.IntEnum):
+ I386 = 3
+ S390 = 22
+ Arm = 40
+ X8664 = 62
+ AArc64 = 183
+
+
+class ELFFile:
+ """
+ Representation of an ELF executable.
+ """
+
+ def __init__(self, f: IO[bytes]) -> None:
+ self._f = f
+
+ try:
+ ident = self._read("16B")
+ except struct.error as e:
+ raise ELFInvalid("unable to parse identification") from e
+ magic = bytes(ident[:4])
+ if magic != b"\x7fELF":
+ raise ELFInvalid(f"invalid magic: {magic!r}")
+
+ self.capacity = ident[4] # Format for program header (bitness).
+ self.encoding = ident[5] # Data structure encoding (endianness).
+
+ try:
+ # e_fmt: Format for program header.
+ # p_fmt: Format for section header.
+ # p_idx: Indexes to find p_type, p_offset, and p_filesz.
+ e_fmt, self._p_fmt, self._p_idx = {
+ (1, 1): ("HHIIIIIHHH", ">IIIIIIII", (0, 1, 4)), # 32-bit MSB.
+ (2, 1): ("HHIQQQIHHH", ">IIQQQQQQ", (0, 2, 5)), # 64-bit MSB.
+ }[(self.capacity, self.encoding)]
+ except KeyError as e:
+ raise ELFInvalid(
+ f"unrecognized capacity ({self.capacity}) or encoding ({self.encoding})"
+ ) from e
+
+ try:
+ (
+ _,
+ self.machine, # Architecture type.
+ _,
+ _,
+ self._e_phoff, # Offset of program header.
+ _,
+ self.flags, # Processor-specific flags.
+ _,
+ self._e_phentsize, # Size of section.
+ self._e_phnum, # Number of sections.
+ ) = self._read(e_fmt)
+ except struct.error as e:
+ raise ELFInvalid("unable to parse machine and section information") from e
+
+ def _read(self, fmt: str) -> tuple[int, ...]:
+ return struct.unpack(fmt, self._f.read(struct.calcsize(fmt)))
+
+ @property
+ def interpreter(self) -> str | None:
+ """
+ The path recorded in the ``PT_INTERP`` section header.
+ """
+ for index in range(self._e_phnum):
+ self._f.seek(self._e_phoff + self._e_phentsize * index)
+ try:
+ data = self._read(self._p_fmt)
+ except struct.error:
+ continue
+ if data[self._p_idx[0]] != 3: # Not PT_INTERP.
+ continue
+ self._f.seek(data[self._p_idx[1]])
+ return os.fsdecode(self._f.read(data[self._p_idx[2]])).strip("\0")
+ return None
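For orientation, a minimal usage sketch of the vendored `ELFFile` helper above (the `/bin/ls` path is only an illustrative assumption; any ELF binary works):

```python
from packaging._elffile import ELFFile, ELFInvalid

# Read the dynamic loader recorded in an executable's PT_INTERP program header.
try:
    with open("/bin/ls", "rb") as f:
        print(ELFFile(f).interpreter)  # e.g. "/lib64/ld-linux-x86-64.so.2"
except (ELFInvalid, OSError):
    print("not a readable ELF file")
```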
diff --git a/venv/lib/python3.10/site-packages/packaging/_manylinux.py b/venv/lib/python3.10/site-packages/packaging/_manylinux.py
new file mode 100644
index 0000000000000000000000000000000000000000..0e79e8a882be74fe76c80ccf49a9cd68fb636fd4
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/_manylinux.py
@@ -0,0 +1,262 @@
+from __future__ import annotations
+
+import collections
+import contextlib
+import functools
+import os
+import re
+import sys
+import warnings
+from typing import Generator, Iterator, NamedTuple, Sequence
+
+from ._elffile import EIClass, EIData, ELFFile, EMachine
+
+EF_ARM_ABIMASK = 0xFF000000
+EF_ARM_ABI_VER5 = 0x05000000
+EF_ARM_ABI_FLOAT_HARD = 0x00000400
+
+_ALLOWED_ARCHS = {
+ "x86_64",
+ "aarch64",
+ "ppc64",
+ "ppc64le",
+ "s390x",
+ "loongarch64",
+ "riscv64",
+}
+
+
+# `os.PathLike` not a generic type until Python 3.9, so sticking with `str`
+# as the type for `path` until then.
+@contextlib.contextmanager
+def _parse_elf(path: str) -> Generator[ELFFile | None, None, None]:
+ try:
+ with open(path, "rb") as f:
+ yield ELFFile(f)
+ except (OSError, TypeError, ValueError):
+ yield None
+
+
+def _is_linux_armhf(executable: str) -> bool:
+ # hard-float ABI can be detected from the ELF header of the running
+ # process
+ # https://static.docs.arm.com/ihi0044/g/aaelf32.pdf
+ with _parse_elf(executable) as f:
+ return (
+ f is not None
+ and f.capacity == EIClass.C32
+ and f.encoding == EIData.Lsb
+ and f.machine == EMachine.Arm
+ and f.flags & EF_ARM_ABIMASK == EF_ARM_ABI_VER5
+ and f.flags & EF_ARM_ABI_FLOAT_HARD == EF_ARM_ABI_FLOAT_HARD
+ )
+
+
+def _is_linux_i686(executable: str) -> bool:
+ with _parse_elf(executable) as f:
+ return (
+ f is not None
+ and f.capacity == EIClass.C32
+ and f.encoding == EIData.Lsb
+ and f.machine == EMachine.I386
+ )
+
+
+def _have_compatible_abi(executable: str, archs: Sequence[str]) -> bool:
+ if "armv7l" in archs:
+ return _is_linux_armhf(executable)
+ if "i686" in archs:
+ return _is_linux_i686(executable)
+ return any(arch in _ALLOWED_ARCHS for arch in archs)
+
+
+# If glibc ever changes its major version, we need to know what the last
+# minor version was, so we can build the complete list of all versions.
+# For now, guess what the highest minor version might be, assume it will
+# be 50 for testing. Once this actually happens, update the dictionary
+# with the actual value.
+_LAST_GLIBC_MINOR: dict[int, int] = collections.defaultdict(lambda: 50)
+
+
+class _GLibCVersion(NamedTuple):
+ major: int
+ minor: int
+
+
+def _glibc_version_string_confstr() -> str | None:
+ """
+ Primary implementation of glibc_version_string using os.confstr.
+ """
+ # os.confstr is quite a bit faster than ctypes.DLL. It's also less likely
+ # to be broken or missing. This strategy is used in the standard library
+ # platform module.
+ # https://github.com/python/cpython/blob/fcf1d003bf4f0100c/Lib/platform.py#L175-L183
+ try:
+ # Should be a string like "glibc 2.17".
+ version_string: str | None = os.confstr("CS_GNU_LIBC_VERSION")
+ assert version_string is not None
+ _, version = version_string.rsplit()
+ except (AssertionError, AttributeError, OSError, ValueError):
+ # os.confstr() or CS_GNU_LIBC_VERSION not available (or a bad value)...
+ return None
+ return version
+
+
+def _glibc_version_string_ctypes() -> str | None:
+ """
+ Fallback implementation of glibc_version_string using ctypes.
+ """
+ try:
+ import ctypes # noqa: PLC0415
+ except ImportError:
+ return None
+
+ # ctypes.CDLL(None) internally calls dlopen(NULL), and as the dlopen
+ # manpage says, "If filename is NULL, then the returned handle is for the
+ # main program". This way we can let the linker do the work to figure out
+ # which libc our process is actually using.
+ #
+ # We must also handle the special case where the executable is not a
+ # dynamically linked executable. This can occur when using musl libc,
+ # for example. In this situation, dlopen() will error, leading to an
+ # OSError. Interestingly, at least in the case of musl, there is no
+ # errno set on the OSError. The single string argument used to construct
+ # OSError comes from libc itself and is therefore not portable to
+ # hard code here. In any case, failure to call dlopen() means we
+ # can proceed, so we bail on our attempt.
+ try:
+ process_namespace = ctypes.CDLL(None)
+ except OSError:
+ return None
+
+ try:
+ gnu_get_libc_version = process_namespace.gnu_get_libc_version
+ except AttributeError:
+ # Symbol doesn't exist -> therefore, we are not linked to
+ # glibc.
+ return None
+
+ # Call gnu_get_libc_version, which returns a string like "2.5"
+ gnu_get_libc_version.restype = ctypes.c_char_p
+ version_str: str = gnu_get_libc_version()
+ # py2 / py3 compatibility:
+ if not isinstance(version_str, str):
+ version_str = version_str.decode("ascii")
+
+ return version_str
+
+
+def _glibc_version_string() -> str | None:
+ """Returns glibc version string, or None if not using glibc."""
+ return _glibc_version_string_confstr() or _glibc_version_string_ctypes()
+
+
+def _parse_glibc_version(version_str: str) -> _GLibCVersion:
+ """Parse glibc version.
+
+ We use a regexp instead of str.split because we want to discard any
+ random junk that might come after the minor version -- this might happen
+ in patched/forked versions of glibc (e.g. Linaro's version of glibc
+ uses version strings like "2.20-2014.11"). See gh-3588.
+ """
+ m = re.match(r"(?P[0-9]+)\.(?P[0-9]+)", version_str)
+ if not m:
+ warnings.warn(
+ f"Expected glibc version with 2 components major.minor, got: {version_str}",
+ RuntimeWarning,
+ stacklevel=2,
+ )
+ return _GLibCVersion(-1, -1)
+ return _GLibCVersion(int(m.group("major")), int(m.group("minor")))
+
+
+@functools.lru_cache
+def _get_glibc_version() -> _GLibCVersion:
+ version_str = _glibc_version_string()
+ if version_str is None:
+ return _GLibCVersion(-1, -1)
+ return _parse_glibc_version(version_str)
+
+
+# From PEP 513, PEP 600
+def _is_compatible(arch: str, version: _GLibCVersion) -> bool:
+ sys_glibc = _get_glibc_version()
+ if sys_glibc < version:
+ return False
+ # Check for presence of _manylinux module.
+ try:
+ import _manylinux # noqa: PLC0415
+ except ImportError:
+ return True
+ if hasattr(_manylinux, "manylinux_compatible"):
+ result = _manylinux.manylinux_compatible(version[0], version[1], arch)
+ if result is not None:
+ return bool(result)
+ return True
+ if version == _GLibCVersion(2, 5) and hasattr(_manylinux, "manylinux1_compatible"):
+ return bool(_manylinux.manylinux1_compatible)
+ if version == _GLibCVersion(2, 12) and hasattr(
+ _manylinux, "manylinux2010_compatible"
+ ):
+ return bool(_manylinux.manylinux2010_compatible)
+ if version == _GLibCVersion(2, 17) and hasattr(
+ _manylinux, "manylinux2014_compatible"
+ ):
+ return bool(_manylinux.manylinux2014_compatible)
+ return True
+
+
+_LEGACY_MANYLINUX_MAP: dict[_GLibCVersion, str] = {
+ # CentOS 7 w/ glibc 2.17 (PEP 599)
+ _GLibCVersion(2, 17): "manylinux2014",
+ # CentOS 6 w/ glibc 2.12 (PEP 571)
+ _GLibCVersion(2, 12): "manylinux2010",
+ # CentOS 5 w/ glibc 2.5 (PEP 513)
+ _GLibCVersion(2, 5): "manylinux1",
+}
+
+
+def platform_tags(archs: Sequence[str]) -> Iterator[str]:
+ """Generate manylinux tags compatible to the current platform.
+
+ :param archs: Sequence of compatible architectures.
+ The first one shall be the closest to the actual architecture and be the part of
+ platform tag after the ``linux_`` prefix, e.g. ``x86_64``.
+ The ``linux_`` prefix is assumed as a prerequisite for the current platform to
+ be manylinux-compatible.
+
+ :returns: An iterator of compatible manylinux tags.
+ """
+ if not _have_compatible_abi(sys.executable, archs):
+ return
+ # Oldest glibc to be supported regardless of architecture is (2, 17).
+ too_old_glibc2 = _GLibCVersion(2, 16)
+ if set(archs) & {"x86_64", "i686"}:
+ # On x86/i686 also oldest glibc to be supported is (2, 5).
+ too_old_glibc2 = _GLibCVersion(2, 4)
+ current_glibc = _GLibCVersion(*_get_glibc_version())
+ glibc_max_list = [current_glibc]
+ # We can assume compatibility across glibc major versions.
+ # https://sourceware.org/bugzilla/show_bug.cgi?id=24636
+ #
+ # Build a list of maximum glibc versions so that we can
+ # output the canonical list of all glibc from current_glibc
+ # down to too_old_glibc2, including all intermediary versions.
+ for glibc_major in range(current_glibc.major - 1, 1, -1):
+ glibc_minor = _LAST_GLIBC_MINOR[glibc_major]
+ glibc_max_list.append(_GLibCVersion(glibc_major, glibc_minor))
+ for arch in archs:
+ for glibc_max in glibc_max_list:
+ if glibc_max.major == too_old_glibc2.major:
+ min_minor = too_old_glibc2.minor
+ else:
+ # For other glibc major versions oldest supported is (x, 0).
+ min_minor = -1
+ for glibc_minor in range(glibc_max.minor, min_minor, -1):
+ glibc_version = _GLibCVersion(glibc_max.major, glibc_minor)
+ if _is_compatible(arch, glibc_version):
+ yield "manylinux_{}_{}_{}".format(*glibc_version, arch)
+
+ # Handle the legacy manylinux1, manylinux2010, manylinux2014 tags.
+ if legacy_tag := _LEGACY_MANYLINUX_MAP.get(glibc_version):
+ yield f"{legacy_tag}_{arch}"
diff --git a/venv/lib/python3.10/site-packages/packaging/_musllinux.py b/venv/lib/python3.10/site-packages/packaging/_musllinux.py
new file mode 100644
index 0000000000000000000000000000000000000000..4e8116a79ca80d60657542a23b4bbcbc3c518eaf
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/_musllinux.py
@@ -0,0 +1,85 @@
+"""PEP 656 support.
+
+This module implements logic to detect if the currently running Python is
+linked against musl, and what musl version is used.
+"""
+
+from __future__ import annotations
+
+import functools
+import re
+import subprocess
+import sys
+from typing import Iterator, NamedTuple, Sequence
+
+from ._elffile import ELFFile
+
+
+class _MuslVersion(NamedTuple):
+ major: int
+ minor: int
+
+
+def _parse_musl_version(output: str) -> _MuslVersion | None:
+ lines = [n for n in (n.strip() for n in output.splitlines()) if n]
+ if len(lines) < 2 or lines[0][:4] != "musl":
+ return None
+ m = re.match(r"Version (\d+)\.(\d+)", lines[1])
+ if not m:
+ return None
+ return _MuslVersion(major=int(m.group(1)), minor=int(m.group(2)))
+
+
+@functools.lru_cache
+def _get_musl_version(executable: str) -> _MuslVersion | None:
+ """Detect currently-running musl runtime version.
+
+ This is done by checking the specified executable's dynamic linking
+ information, and invoking the loader to parse its output for a version
+ string. If the loader is musl, the output would be something like::
+
+ musl libc (x86_64)
+ Version 1.2.2
+ Dynamic Program Loader
+ """
+ try:
+ with open(executable, "rb") as f:
+ ld = ELFFile(f).interpreter
+ except (OSError, TypeError, ValueError):
+ return None
+ if ld is None or "musl" not in ld:
+ return None
+ proc = subprocess.run([ld], check=False, stderr=subprocess.PIPE, text=True)
+ return _parse_musl_version(proc.stderr)
+
+
+def platform_tags(archs: Sequence[str]) -> Iterator[str]:
+ """Generate musllinux tags compatible to the current platform.
+
+ :param archs: Sequence of compatible architectures.
+ The first one shall be the closest to the actual architecture and be the part of
+ platform tag after the ``linux_`` prefix, e.g. ``x86_64``.
+ The ``linux_`` prefix is assumed as a prerequisite for the current platform to
+ be musllinux-compatible.
+
+ :returns: An iterator of compatible musllinux tags.
+ """
+ sys_musl = _get_musl_version(sys.executable)
+ if sys_musl is None: # Python not dynamically linked against musl.
+ return
+ for arch in archs:
+ for minor in range(sys_musl.minor, -1, -1):
+ yield f"musllinux_{sys_musl.major}_{minor}_{arch}"
+
+
+if __name__ == "__main__": # pragma: no cover
+ import sysconfig
+
+ plat = sysconfig.get_platform()
+ assert plat.startswith("linux-"), "not linux"
+
+ print("plat:", plat)
+ print("musl:", _get_musl_version(sys.executable))
+ print("tags:", end=" ")
+ for t in platform_tags(re.sub(r"[.-]", "_", plat.split("-", 1)[-1])):
+ print(t, end="\n ")
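Alongside the `__main__` demo above, a minimal sketch of calling the musl tag generator directly (it yields nothing when the interpreter is linked against glibc):

```python
from packaging._musllinux import platform_tags

# e.g. ["musllinux_1_2_x86_64", "musllinux_1_1_x86_64", "musllinux_1_0_x86_64"] on Alpine;
# an empty list on a glibc-based distribution.
print(list(platform_tags(["x86_64"])))
```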
diff --git a/venv/lib/python3.10/site-packages/packaging/_parser.py b/venv/lib/python3.10/site-packages/packaging/_parser.py
new file mode 100644
index 0000000000000000000000000000000000000000..f6c1f5cd226b926f96a3bb1e9fb0f18d1bd021c9
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/_parser.py
@@ -0,0 +1,365 @@
+"""Handwritten parser of dependency specifiers.
+
+The docstring for each __parse_* function contains EBNF-inspired grammar representing
+the implementation.
+"""
+
+from __future__ import annotations
+
+import ast
+from typing import List, Literal, NamedTuple, Sequence, Tuple, Union
+
+from ._tokenizer import DEFAULT_RULES, Tokenizer
+
+
+class Node:
+ __slots__ = ("value",)
+
+ def __init__(self, value: str) -> None:
+ self.value = value
+
+ def __str__(self) -> str:
+ return self.value
+
+ def __repr__(self) -> str:
+ return f"<{self.__class__.__name__}({self.value!r})>"
+
+ def serialize(self) -> str:
+ raise NotImplementedError
+
+
+class Variable(Node):
+ __slots__ = ()
+
+ def serialize(self) -> str:
+ return str(self)
+
+
+class Value(Node):
+ __slots__ = ()
+
+ def serialize(self) -> str:
+ return f'"{self}"'
+
+
+class Op(Node):
+ __slots__ = ()
+
+ def serialize(self) -> str:
+ return str(self)
+
+
+MarkerLogical = Literal["and", "or"]
+MarkerVar = Union[Variable, Value]
+MarkerItem = Tuple[MarkerVar, Op, MarkerVar]
+MarkerAtom = Union[MarkerItem, Sequence["MarkerAtom"]]
+MarkerList = List[Union["MarkerList", MarkerAtom, MarkerLogical]]
+
+
+class ParsedRequirement(NamedTuple):
+ name: str
+ url: str
+ extras: list[str]
+ specifier: str
+ marker: MarkerList | None
+
+
+# --------------------------------------------------------------------------------------
+# Recursive descent parser for dependency specifier
+# --------------------------------------------------------------------------------------
+def parse_requirement(source: str) -> ParsedRequirement:
+ return _parse_requirement(Tokenizer(source, rules=DEFAULT_RULES))
+
+
+def _parse_requirement(tokenizer: Tokenizer) -> ParsedRequirement:
+ """
+ requirement = WS? IDENTIFIER WS? extras WS? requirement_details
+ """
+ tokenizer.consume("WS")
+
+ name_token = tokenizer.expect(
+ "IDENTIFIER", expected="package name at the start of dependency specifier"
+ )
+ name = name_token.text
+ tokenizer.consume("WS")
+
+ extras = _parse_extras(tokenizer)
+ tokenizer.consume("WS")
+
+ url, specifier, marker = _parse_requirement_details(tokenizer)
+ tokenizer.expect("END", expected="end of dependency specifier")
+
+ return ParsedRequirement(name, url, extras, specifier, marker)
+
+
+def _parse_requirement_details(
+ tokenizer: Tokenizer,
+) -> tuple[str, str, MarkerList | None]:
+ """
+ requirement_details = AT URL (WS requirement_marker?)?
+ | specifier WS? (requirement_marker)?
+ """
+
+ specifier = ""
+ url = ""
+ marker = None
+
+ if tokenizer.check("AT"):
+ tokenizer.read()
+ tokenizer.consume("WS")
+
+ url_start = tokenizer.position
+ url = tokenizer.expect("URL", expected="URL after @").text
+ if tokenizer.check("END", peek=True):
+ return (url, specifier, marker)
+
+ tokenizer.expect("WS", expected="whitespace after URL")
+
+ # The input might end after whitespace.
+ if tokenizer.check("END", peek=True):
+ return (url, specifier, marker)
+
+ marker = _parse_requirement_marker(
+ tokenizer,
+ span_start=url_start,
+ expected="semicolon (after URL and whitespace)",
+ )
+ else:
+ specifier_start = tokenizer.position
+ specifier = _parse_specifier(tokenizer)
+ tokenizer.consume("WS")
+
+ if tokenizer.check("END", peek=True):
+ return (url, specifier, marker)
+
+ marker = _parse_requirement_marker(
+ tokenizer,
+ span_start=specifier_start,
+ expected=(
+ "comma (within version specifier), semicolon (after version specifier)"
+ if specifier
+ else "semicolon (after name with no version specifier)"
+ ),
+ )
+
+ return (url, specifier, marker)
+
+
+def _parse_requirement_marker(
+ tokenizer: Tokenizer, *, span_start: int, expected: str
+) -> MarkerList:
+ """
+ requirement_marker = SEMICOLON marker WS?
+ """
+
+ if not tokenizer.check("SEMICOLON"):
+ tokenizer.raise_syntax_error(
+ f"Expected {expected} or end",
+ span_start=span_start,
+ span_end=None,
+ )
+ tokenizer.read()
+
+ marker = _parse_marker(tokenizer)
+ tokenizer.consume("WS")
+
+ return marker
+
+
+def _parse_extras(tokenizer: Tokenizer) -> list[str]:
+ """
+ extras = (LEFT_BRACKET wsp* extras_list? wsp* RIGHT_BRACKET)?
+ """
+ if not tokenizer.check("LEFT_BRACKET", peek=True):
+ return []
+
+ with tokenizer.enclosing_tokens(
+ "LEFT_BRACKET",
+ "RIGHT_BRACKET",
+ around="extras",
+ ):
+ tokenizer.consume("WS")
+ extras = _parse_extras_list(tokenizer)
+ tokenizer.consume("WS")
+
+ return extras
+
+
+def _parse_extras_list(tokenizer: Tokenizer) -> list[str]:
+ """
+ extras_list = identifier (wsp* ',' wsp* identifier)*
+ """
+ extras: list[str] = []
+
+ if not tokenizer.check("IDENTIFIER"):
+ return extras
+
+ extras.append(tokenizer.read().text)
+
+ while True:
+ tokenizer.consume("WS")
+ if tokenizer.check("IDENTIFIER", peek=True):
+ tokenizer.raise_syntax_error("Expected comma between extra names")
+ elif not tokenizer.check("COMMA"):
+ break
+
+ tokenizer.read()
+ tokenizer.consume("WS")
+
+ extra_token = tokenizer.expect("IDENTIFIER", expected="extra name after comma")
+ extras.append(extra_token.text)
+
+ return extras
+
+
+def _parse_specifier(tokenizer: Tokenizer) -> str:
+ """
+ specifier = LEFT_PARENTHESIS WS? version_many WS? RIGHT_PARENTHESIS
+ | WS? version_many WS?
+ """
+ with tokenizer.enclosing_tokens(
+ "LEFT_PARENTHESIS",
+ "RIGHT_PARENTHESIS",
+ around="version specifier",
+ ):
+ tokenizer.consume("WS")
+ parsed_specifiers = _parse_version_many(tokenizer)
+ tokenizer.consume("WS")
+
+ return parsed_specifiers
+
+
+def _parse_version_many(tokenizer: Tokenizer) -> str:
+ """
+ version_many = (SPECIFIER (WS? COMMA WS? SPECIFIER)*)?
+ """
+ parsed_specifiers = ""
+ while tokenizer.check("SPECIFIER"):
+ span_start = tokenizer.position
+ parsed_specifiers += tokenizer.read().text
+ if tokenizer.check("VERSION_PREFIX_TRAIL", peek=True):
+ tokenizer.raise_syntax_error(
+ ".* suffix can only be used with `==` or `!=` operators",
+ span_start=span_start,
+ span_end=tokenizer.position + 1,
+ )
+ if tokenizer.check("VERSION_LOCAL_LABEL_TRAIL", peek=True):
+ tokenizer.raise_syntax_error(
+ "Local version label can only be used with `==` or `!=` operators",
+ span_start=span_start,
+ span_end=tokenizer.position,
+ )
+ tokenizer.consume("WS")
+ if not tokenizer.check("COMMA"):
+ break
+ parsed_specifiers += tokenizer.read().text
+ tokenizer.consume("WS")
+
+ return parsed_specifiers
+
+
+# --------------------------------------------------------------------------------------
+# Recursive descent parser for marker expression
+# --------------------------------------------------------------------------------------
+def parse_marker(source: str) -> MarkerList:
+ return _parse_full_marker(Tokenizer(source, rules=DEFAULT_RULES))
+
+
+def _parse_full_marker(tokenizer: Tokenizer) -> MarkerList:
+ retval = _parse_marker(tokenizer)
+ tokenizer.expect("END", expected="end of marker expression")
+ return retval
+
+
+def _parse_marker(tokenizer: Tokenizer) -> MarkerList:
+ """
+ marker = marker_atom (BOOLOP marker_atom)+
+ """
+ expression = [_parse_marker_atom(tokenizer)]
+ while tokenizer.check("BOOLOP"):
+ token = tokenizer.read()
+ expr_right = _parse_marker_atom(tokenizer)
+ expression.extend((token.text, expr_right))
+ return expression
+
+
+def _parse_marker_atom(tokenizer: Tokenizer) -> MarkerAtom:
+ """
+ marker_atom = WS? LEFT_PARENTHESIS WS? marker WS? RIGHT_PARENTHESIS WS?
+ | WS? marker_item WS?
+ """
+
+ tokenizer.consume("WS")
+ if tokenizer.check("LEFT_PARENTHESIS", peek=True):
+ with tokenizer.enclosing_tokens(
+ "LEFT_PARENTHESIS",
+ "RIGHT_PARENTHESIS",
+ around="marker expression",
+ ):
+ tokenizer.consume("WS")
+ marker: MarkerAtom = _parse_marker(tokenizer)
+ tokenizer.consume("WS")
+ else:
+ marker = _parse_marker_item(tokenizer)
+ tokenizer.consume("WS")
+ return marker
+
+
+def _parse_marker_item(tokenizer: Tokenizer) -> MarkerItem:
+ """
+ marker_item = WS? marker_var WS? marker_op WS? marker_var WS?
+ """
+ tokenizer.consume("WS")
+ marker_var_left = _parse_marker_var(tokenizer)
+ tokenizer.consume("WS")
+ marker_op = _parse_marker_op(tokenizer)
+ tokenizer.consume("WS")
+ marker_var_right = _parse_marker_var(tokenizer)
+ tokenizer.consume("WS")
+ return (marker_var_left, marker_op, marker_var_right)
+
+
+def _parse_marker_var(tokenizer: Tokenizer) -> MarkerVar: # noqa: RET503
+ """
+ marker_var = VARIABLE | QUOTED_STRING
+ """
+ if tokenizer.check("VARIABLE"):
+ return process_env_var(tokenizer.read().text.replace(".", "_"))
+ elif tokenizer.check("QUOTED_STRING"):
+ return process_python_str(tokenizer.read().text)
+ else:
+ tokenizer.raise_syntax_error(
+ message="Expected a marker variable or quoted string"
+ )
+
+
+def process_env_var(env_var: str) -> Variable:
+ if env_var in ("platform_python_implementation", "python_implementation"):
+ return Variable("platform_python_implementation")
+ else:
+ return Variable(env_var)
+
+
+def process_python_str(python_str: str) -> Value:
+ value = ast.literal_eval(python_str)
+ return Value(str(value))
+
+
+def _parse_marker_op(tokenizer: Tokenizer) -> Op:
+ """
+ marker_op = IN | NOT IN | OP
+ """
+ if tokenizer.check("IN"):
+ tokenizer.read()
+ return Op("in")
+ elif tokenizer.check("NOT"):
+ tokenizer.read()
+ tokenizer.expect("WS", expected="whitespace after 'not'")
+ tokenizer.expect("IN", expected="'in' after 'not'")
+ return Op("not in")
+ elif tokenizer.check("OP"):
+ return Op(tokenizer.read().text)
+ else:
+ return tokenizer.raise_syntax_error(
+ "Expected marker operator, one of <=, <, !=, ==, >=, >, ~=, ===, in, not in"
+ )
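A minimal sketch of the two public entry points of this parser module, using an illustrative requirement string:

```python
from packaging._parser import parse_marker, parse_requirement

req = parse_requirement("requests[security]>=2.8.1 ; python_version >= '3.8'")
print(req.name, req.extras, req.specifier)  # requests ['security'] >=2.8.1
print(parse_marker("os_name == 'posix' and python_version >= '3.8'"))
```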
diff --git a/venv/lib/python3.10/site-packages/packaging/_structures.py b/venv/lib/python3.10/site-packages/packaging/_structures.py
new file mode 100644
index 0000000000000000000000000000000000000000..225e2eee01238571c50595eb104e0b70d5f503c4
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/_structures.py
@@ -0,0 +1,69 @@
+# This file is dual licensed under the terms of the Apache License, Version
+# 2.0, and the BSD License. See the LICENSE file in the root of this repository
+# for complete details.
+
+import typing
+
+
+@typing.final
+class InfinityType:
+ __slots__ = ()
+
+ def __repr__(self) -> str:
+ return "Infinity"
+
+ def __hash__(self) -> int:
+ return hash(repr(self))
+
+ def __lt__(self, other: object) -> bool:
+ return False
+
+ def __le__(self, other: object) -> bool:
+ return False
+
+ def __eq__(self, other: object) -> bool:
+ return isinstance(other, self.__class__)
+
+ def __gt__(self, other: object) -> bool:
+ return True
+
+ def __ge__(self, other: object) -> bool:
+ return True
+
+ def __neg__(self: object) -> "NegativeInfinityType":
+ return NegativeInfinity
+
+
+Infinity = InfinityType()
+
+
+@typing.final
+class NegativeInfinityType:
+ __slots__ = ()
+
+ def __repr__(self) -> str:
+ return "-Infinity"
+
+ def __hash__(self) -> int:
+ return hash(repr(self))
+
+ def __lt__(self, other: object) -> bool:
+ return True
+
+ def __le__(self, other: object) -> bool:
+ return True
+
+ def __eq__(self, other: object) -> bool:
+ return isinstance(other, self.__class__)
+
+ def __gt__(self, other: object) -> bool:
+ return False
+
+ def __ge__(self, other: object) -> bool:
+ return False
+
+ def __neg__(self: object) -> InfinityType:
+ return Infinity
+
+
+NegativeInfinity = NegativeInfinityType()
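These sentinels exist so that comparison keys elsewhere in `packaging` (notably version ordering) can sort below or above any real value; a one-line sketch:

```python
from packaging._structures import Infinity, NegativeInfinity

# NegativeInfinity compares less than everything, Infinity greater than everything.
print(NegativeInfinity < 0 < Infinity)  # True
```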
diff --git a/venv/lib/python3.10/site-packages/packaging/_tokenizer.py b/venv/lib/python3.10/site-packages/packaging/_tokenizer.py
new file mode 100644
index 0000000000000000000000000000000000000000..e6d20dd3f56f880a92db7409a3e1335cb282a8f2
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/_tokenizer.py
@@ -0,0 +1,193 @@
+from __future__ import annotations
+
+import contextlib
+import re
+from dataclasses import dataclass
+from typing import Generator, Mapping, NoReturn
+
+from .specifiers import Specifier
+
+
+@dataclass
+class Token:
+ name: str
+ text: str
+ position: int
+
+
+class ParserSyntaxError(Exception):
+ """The provided source text could not be parsed correctly."""
+
+ def __init__(
+ self,
+ message: str,
+ *,
+ source: str,
+ span: tuple[int, int],
+ ) -> None:
+ self.span = span
+ self.message = message
+ self.source = source
+
+ super().__init__()
+
+ def __str__(self) -> str:
+ marker = " " * self.span[0] + "~" * (self.span[1] - self.span[0]) + "^"
+ return f"{self.message}\n {self.source}\n {marker}"
+
+
+DEFAULT_RULES: dict[str, re.Pattern[str]] = {
+ "LEFT_PARENTHESIS": re.compile(r"\("),
+ "RIGHT_PARENTHESIS": re.compile(r"\)"),
+ "LEFT_BRACKET": re.compile(r"\["),
+ "RIGHT_BRACKET": re.compile(r"\]"),
+ "SEMICOLON": re.compile(r";"),
+ "COMMA": re.compile(r","),
+ "QUOTED_STRING": re.compile(
+ r"""
+ (
+ ('[^']*')
+ |
+ ("[^"]*")
+ )
+ """,
+ re.VERBOSE,
+ ),
+ "OP": re.compile(r"(===|==|~=|!=|<=|>=|<|>)"),
+ "BOOLOP": re.compile(r"\b(or|and)\b"),
+ "IN": re.compile(r"\bin\b"),
+ "NOT": re.compile(r"\bnot\b"),
+ "VARIABLE": re.compile(
+ r"""
+ \b(
+ python_version
+ |python_full_version
+ |os[._]name
+ |sys[._]platform
+ |platform_(release|system)
+ |platform[._](version|machine|python_implementation)
+ |python_implementation
+ |implementation_(name|version)
+ |extras?
+ |dependency_groups
+ )\b
+ """,
+ re.VERBOSE,
+ ),
+ "SPECIFIER": re.compile(
+ Specifier._operator_regex_str + Specifier._version_regex_str,
+ re.VERBOSE | re.IGNORECASE,
+ ),
+ "AT": re.compile(r"\@"),
+ "URL": re.compile(r"[^ \t]+"),
+ "IDENTIFIER": re.compile(r"\b[a-zA-Z0-9][a-zA-Z0-9._-]*\b"),
+ "VERSION_PREFIX_TRAIL": re.compile(r"\.\*"),
+ "VERSION_LOCAL_LABEL_TRAIL": re.compile(r"\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*"),
+ "WS": re.compile(r"[ \t]+"),
+ "END": re.compile(r"$"),
+}
+
+
+class Tokenizer:
+ """Context-sensitive token parsing.
+
+ Provides methods to examine the input stream to check whether the next token
+ matches.
+ """
+
+ def __init__(
+ self,
+ source: str,
+ *,
+ rules: Mapping[str, re.Pattern[str]],
+ ) -> None:
+ self.source = source
+ self.rules = rules
+ self.next_token: Token | None = None
+ self.position = 0
+
+ def consume(self, name: str) -> None:
+ """Move beyond provided token name, if at current position."""
+ if self.check(name):
+ self.read()
+
+ def check(self, name: str, *, peek: bool = False) -> bool:
+ """Check whether the next token has the provided name.
+
+ By default, if the check succeeds, the token *must* be read before
+ another check. If `peek` is set to `True`, the token is not loaded and
+ would need to be checked again.
+ """
+ assert self.next_token is None, (
+ f"Cannot check for {name!r}, already have {self.next_token!r}"
+ )
+ assert name in self.rules, f"Unknown token name: {name!r}"
+
+ expression = self.rules[name]
+
+ match = expression.match(self.source, self.position)
+ if match is None:
+ return False
+ if not peek:
+ self.next_token = Token(name, match[0], self.position)
+ return True
+
+ def expect(self, name: str, *, expected: str) -> Token:
+ """Expect a certain token name next, failing with a syntax error otherwise.
+
+ The token is *not* read.
+ """
+ if not self.check(name):
+ raise self.raise_syntax_error(f"Expected {expected}")
+ return self.read()
+
+ def read(self) -> Token:
+ """Consume the next token and return it."""
+ token = self.next_token
+ assert token is not None
+
+ self.position += len(token.text)
+ self.next_token = None
+
+ return token
+
+ def raise_syntax_error(
+ self,
+ message: str,
+ *,
+ span_start: int | None = None,
+ span_end: int | None = None,
+ ) -> NoReturn:
+ """Raise ParserSyntaxError at the given position."""
+ span = (
+ self.position if span_start is None else span_start,
+ self.position if span_end is None else span_end,
+ )
+ raise ParserSyntaxError(
+ message,
+ source=self.source,
+ span=span,
+ )
+
+ @contextlib.contextmanager
+ def enclosing_tokens(
+ self, open_token: str, close_token: str, *, around: str
+ ) -> Generator[None, None, None]:
+ if self.check(open_token):
+ open_position = self.position
+ self.read()
+ else:
+ open_position = None
+
+ yield
+
+ if open_position is None:
+ return
+
+ if not self.check(close_token):
+ self.raise_syntax_error(
+ f"Expected matching {close_token} for {open_token}, after {around}",
+ span_start=open_position,
+ )
+
+ self.read()
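A small sketch of driving the tokenizer by hand with the default rules (the input string is just an example):

```python
from packaging._tokenizer import DEFAULT_RULES, Tokenizer

tok = Tokenizer("numpy>=1.21", rules=DEFAULT_RULES)
name = tok.expect("IDENTIFIER", expected="package name")     # name.text == "numpy"
spec = tok.expect("SPECIFIER", expected="version specifier")  # spec.text == ">=1.21"
tok.expect("END", expected="end of input")
print(name.text, spec.text)
```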
diff --git a/venv/lib/python3.10/site-packages/packaging/licenses/__init__.py b/venv/lib/python3.10/site-packages/packaging/licenses/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..335b275fa7575b0a7c525a713fbe0252ad2d956f
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/licenses/__init__.py
@@ -0,0 +1,147 @@
+#######################################################################################
+#
+# Adapted from:
+# https://github.com/pypa/hatch/blob/5352e44/backend/src/hatchling/licenses/parse.py
+#
+# MIT License
+#
+# Copyright (c) 2017-present Ofek Lev
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy of this
+# software and associated documentation files (the "Software"), to deal in the Software
+# without restriction, including without limitation the rights to use, copy, modify,
+# merge, publish, distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to the following
+# conditions:
+#
+# The above copyright notice and this permission notice shall be included in all copies
+# or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
+# INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
+# PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
+# HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF
+# CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE
+# OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+#
+#
+# With additional allowance of arbitrary `LicenseRef-` identifiers, not just
+# `LicenseRef-Public-Domain` and `LicenseRef-Proprietary`.
+#
+#######################################################################################
+from __future__ import annotations
+
+import re
+from typing import NewType, cast
+
+from ._spdx import EXCEPTIONS, LICENSES
+
+__all__ = [
+ "InvalidLicenseExpression",
+ "NormalizedLicenseExpression",
+ "canonicalize_license_expression",
+]
+
+license_ref_allowed = re.compile("^[A-Za-z0-9.-]*$")
+
+NormalizedLicenseExpression = NewType("NormalizedLicenseExpression", str)
+
+
+class InvalidLicenseExpression(ValueError):
+ """Raised when a license-expression string is invalid
+
+ >>> canonicalize_license_expression("invalid")
+ Traceback (most recent call last):
+ ...
+ packaging.licenses.InvalidLicenseExpression: Invalid license expression: 'invalid'
+ """
+
+
+def canonicalize_license_expression(
+ raw_license_expression: str,
+) -> NormalizedLicenseExpression:
+ if not raw_license_expression:
+ message = f"Invalid license expression: {raw_license_expression!r}"
+ raise InvalidLicenseExpression(message)
+
+ # Pad any parentheses so tokenization can be achieved by merely splitting on
+ # whitespace.
+ license_expression = raw_license_expression.replace("(", " ( ").replace(")", " ) ")
+ licenseref_prefix = "LicenseRef-"
+ license_refs = {
+ ref.lower(): "LicenseRef-" + ref[len(licenseref_prefix) :]
+ for ref in license_expression.split()
+ if ref.lower().startswith(licenseref_prefix.lower())
+ }
+
+ # Normalize to lower case so we can look up licenses/exceptions
+ # and so boolean operators are Python-compatible.
+ license_expression = license_expression.lower()
+
+ tokens = license_expression.split()
+
+ # Rather than implementing a parenthesis/boolean logic parser, create an
+ # expression that Python can parse. Everything that is not involved with the
+ # grammar itself is replaced with the placeholder `False` and the resultant
+ # expression should become a valid Python expression.
+ python_tokens = []
+ for token in tokens:
+ if token not in {"or", "and", "with", "(", ")"}:
+ python_tokens.append("False")
+ elif token == "with":
+ python_tokens.append("or")
+ elif (
+ token == "("
+ and python_tokens
+ and python_tokens[-1] not in {"or", "and", "("}
+ ) or (token == ")" and python_tokens and python_tokens[-1] == "("):
+ message = f"Invalid license expression: {raw_license_expression!r}"
+ raise InvalidLicenseExpression(message)
+ else:
+ python_tokens.append(token)
+
+ python_expression = " ".join(python_tokens)
+ try:
+ compile(python_expression, "", "eval")
+ except SyntaxError:
+ message = f"Invalid license expression: {raw_license_expression!r}"
+ raise InvalidLicenseExpression(message) from None
+
+ # Take a final pass to check for unknown licenses/exceptions.
+ normalized_tokens = []
+ for token in tokens:
+ if token in {"or", "and", "with", "(", ")"}:
+ normalized_tokens.append(token.upper())
+ continue
+
+ if normalized_tokens and normalized_tokens[-1] == "WITH":
+ if token not in EXCEPTIONS:
+ message = f"Unknown license exception: {token!r}"
+ raise InvalidLicenseExpression(message)
+
+ normalized_tokens.append(EXCEPTIONS[token]["id"])
+ else:
+ if token.endswith("+"):
+ final_token = token[:-1]
+ suffix = "+"
+ else:
+ final_token = token
+ suffix = ""
+
+ if final_token.startswith("licenseref-"):
+ if not license_ref_allowed.match(final_token):
+ message = f"Invalid licenseref: {final_token!r}"
+ raise InvalidLicenseExpression(message)
+ normalized_tokens.append(license_refs[final_token] + suffix)
+ else:
+ if final_token not in LICENSES:
+ message = f"Unknown license: {final_token!r}"
+ raise InvalidLicenseExpression(message)
+ normalized_tokens.append(LICENSES[final_token]["id"] + suffix)
+
+ normalized_expression = " ".join(normalized_tokens)
+
+ return cast(
+ "NormalizedLicenseExpression",
+ normalized_expression.replace("( ", "(").replace(" )", ")"),
+ )
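A brief sketch of the normaliser defined above, using illustrative SPDX inputs:

```python
from packaging.licenses import InvalidLicenseExpression, canonicalize_license_expression

print(canonicalize_license_expression("mit OR apache-2.0"))  # MIT OR Apache-2.0
try:
    canonicalize_license_expression("not-a-real-license")
except InvalidLicenseExpression as exc:
    print(exc)  # Unknown license: 'not-a-real-license'
```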
diff --git a/venv/lib/python3.10/site-packages/packaging/licenses/__pycache__/__init__.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/licenses/__pycache__/__init__.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..af4daed121253ea331f2dfa76b72e4518133a32c
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/licenses/__pycache__/__init__.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/licenses/__pycache__/_spdx.cpython-310.pyc b/venv/lib/python3.10/site-packages/packaging/licenses/__pycache__/_spdx.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..b7b5815bf684c5032fb52a8343bb966f30af2021
Binary files /dev/null and b/venv/lib/python3.10/site-packages/packaging/licenses/__pycache__/_spdx.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/packaging/licenses/_spdx.py b/venv/lib/python3.10/site-packages/packaging/licenses/_spdx.py
new file mode 100644
index 0000000000000000000000000000000000000000..a277af28220b6dbe4599471104d1c7a2bd1e1288
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/licenses/_spdx.py
@@ -0,0 +1,799 @@
+
+from __future__ import annotations
+
+from typing import TypedDict
+
+class SPDXLicense(TypedDict):
+ id: str
+ deprecated: bool
+
+class SPDXException(TypedDict):
+ id: str
+ deprecated: bool
+
+
+VERSION = '3.27.0'
+
+LICENSES: dict[str, SPDXLicense] = {
+ '0bsd': {'id': '0BSD', 'deprecated': False},
+ '3d-slicer-1.0': {'id': '3D-Slicer-1.0', 'deprecated': False},
+ 'aal': {'id': 'AAL', 'deprecated': False},
+ 'abstyles': {'id': 'Abstyles', 'deprecated': False},
+ 'adacore-doc': {'id': 'AdaCore-doc', 'deprecated': False},
+ 'adobe-2006': {'id': 'Adobe-2006', 'deprecated': False},
+ 'adobe-display-postscript': {'id': 'Adobe-Display-PostScript', 'deprecated': False},
+ 'adobe-glyph': {'id': 'Adobe-Glyph', 'deprecated': False},
+ 'adobe-utopia': {'id': 'Adobe-Utopia', 'deprecated': False},
+ 'adsl': {'id': 'ADSL', 'deprecated': False},
+ 'afl-1.1': {'id': 'AFL-1.1', 'deprecated': False},
+ 'afl-1.2': {'id': 'AFL-1.2', 'deprecated': False},
+ 'afl-2.0': {'id': 'AFL-2.0', 'deprecated': False},
+ 'afl-2.1': {'id': 'AFL-2.1', 'deprecated': False},
+ 'afl-3.0': {'id': 'AFL-3.0', 'deprecated': False},
+ 'afmparse': {'id': 'Afmparse', 'deprecated': False},
+ 'agpl-1.0': {'id': 'AGPL-1.0', 'deprecated': True},
+ 'agpl-1.0-only': {'id': 'AGPL-1.0-only', 'deprecated': False},
+ 'agpl-1.0-or-later': {'id': 'AGPL-1.0-or-later', 'deprecated': False},
+ 'agpl-3.0': {'id': 'AGPL-3.0', 'deprecated': True},
+ 'agpl-3.0-only': {'id': 'AGPL-3.0-only', 'deprecated': False},
+ 'agpl-3.0-or-later': {'id': 'AGPL-3.0-or-later', 'deprecated': False},
+ 'aladdin': {'id': 'Aladdin', 'deprecated': False},
+ 'amd-newlib': {'id': 'AMD-newlib', 'deprecated': False},
+ 'amdplpa': {'id': 'AMDPLPA', 'deprecated': False},
+ 'aml': {'id': 'AML', 'deprecated': False},
+ 'aml-glslang': {'id': 'AML-glslang', 'deprecated': False},
+ 'ampas': {'id': 'AMPAS', 'deprecated': False},
+ 'antlr-pd': {'id': 'ANTLR-PD', 'deprecated': False},
+ 'antlr-pd-fallback': {'id': 'ANTLR-PD-fallback', 'deprecated': False},
+ 'any-osi': {'id': 'any-OSI', 'deprecated': False},
+ 'any-osi-perl-modules': {'id': 'any-OSI-perl-modules', 'deprecated': False},
+ 'apache-1.0': {'id': 'Apache-1.0', 'deprecated': False},
+ 'apache-1.1': {'id': 'Apache-1.1', 'deprecated': False},
+ 'apache-2.0': {'id': 'Apache-2.0', 'deprecated': False},
+ 'apafml': {'id': 'APAFML', 'deprecated': False},
+ 'apl-1.0': {'id': 'APL-1.0', 'deprecated': False},
+ 'app-s2p': {'id': 'App-s2p', 'deprecated': False},
+ 'apsl-1.0': {'id': 'APSL-1.0', 'deprecated': False},
+ 'apsl-1.1': {'id': 'APSL-1.1', 'deprecated': False},
+ 'apsl-1.2': {'id': 'APSL-1.2', 'deprecated': False},
+ 'apsl-2.0': {'id': 'APSL-2.0', 'deprecated': False},
+ 'arphic-1999': {'id': 'Arphic-1999', 'deprecated': False},
+ 'artistic-1.0': {'id': 'Artistic-1.0', 'deprecated': False},
+ 'artistic-1.0-cl8': {'id': 'Artistic-1.0-cl8', 'deprecated': False},
+ 'artistic-1.0-perl': {'id': 'Artistic-1.0-Perl', 'deprecated': False},
+ 'artistic-2.0': {'id': 'Artistic-2.0', 'deprecated': False},
+ 'artistic-dist': {'id': 'Artistic-dist', 'deprecated': False},
+ 'aspell-ru': {'id': 'Aspell-RU', 'deprecated': False},
+ 'aswf-digital-assets-1.0': {'id': 'ASWF-Digital-Assets-1.0', 'deprecated': False},
+ 'aswf-digital-assets-1.1': {'id': 'ASWF-Digital-Assets-1.1', 'deprecated': False},
+ 'baekmuk': {'id': 'Baekmuk', 'deprecated': False},
+ 'bahyph': {'id': 'Bahyph', 'deprecated': False},
+ 'barr': {'id': 'Barr', 'deprecated': False},
+ 'bcrypt-solar-designer': {'id': 'bcrypt-Solar-Designer', 'deprecated': False},
+ 'beerware': {'id': 'Beerware', 'deprecated': False},
+ 'bitstream-charter': {'id': 'Bitstream-Charter', 'deprecated': False},
+ 'bitstream-vera': {'id': 'Bitstream-Vera', 'deprecated': False},
+ 'bittorrent-1.0': {'id': 'BitTorrent-1.0', 'deprecated': False},
+ 'bittorrent-1.1': {'id': 'BitTorrent-1.1', 'deprecated': False},
+ 'blessing': {'id': 'blessing', 'deprecated': False},
+ 'blueoak-1.0.0': {'id': 'BlueOak-1.0.0', 'deprecated': False},
+ 'boehm-gc': {'id': 'Boehm-GC', 'deprecated': False},
+ 'boehm-gc-without-fee': {'id': 'Boehm-GC-without-fee', 'deprecated': False},
+ 'borceux': {'id': 'Borceux', 'deprecated': False},
+ 'brian-gladman-2-clause': {'id': 'Brian-Gladman-2-Clause', 'deprecated': False},
+ 'brian-gladman-3-clause': {'id': 'Brian-Gladman-3-Clause', 'deprecated': False},
+ 'bsd-1-clause': {'id': 'BSD-1-Clause', 'deprecated': False},
+ 'bsd-2-clause': {'id': 'BSD-2-Clause', 'deprecated': False},
+ 'bsd-2-clause-darwin': {'id': 'BSD-2-Clause-Darwin', 'deprecated': False},
+ 'bsd-2-clause-first-lines': {'id': 'BSD-2-Clause-first-lines', 'deprecated': False},
+ 'bsd-2-clause-freebsd': {'id': 'BSD-2-Clause-FreeBSD', 'deprecated': True},
+ 'bsd-2-clause-netbsd': {'id': 'BSD-2-Clause-NetBSD', 'deprecated': True},
+ 'bsd-2-clause-patent': {'id': 'BSD-2-Clause-Patent', 'deprecated': False},
+ 'bsd-2-clause-pkgconf-disclaimer': {'id': 'BSD-2-Clause-pkgconf-disclaimer', 'deprecated': False},
+ 'bsd-2-clause-views': {'id': 'BSD-2-Clause-Views', 'deprecated': False},
+ 'bsd-3-clause': {'id': 'BSD-3-Clause', 'deprecated': False},
+ 'bsd-3-clause-acpica': {'id': 'BSD-3-Clause-acpica', 'deprecated': False},
+ 'bsd-3-clause-attribution': {'id': 'BSD-3-Clause-Attribution', 'deprecated': False},
+ 'bsd-3-clause-clear': {'id': 'BSD-3-Clause-Clear', 'deprecated': False},
+ 'bsd-3-clause-flex': {'id': 'BSD-3-Clause-flex', 'deprecated': False},
+ 'bsd-3-clause-hp': {'id': 'BSD-3-Clause-HP', 'deprecated': False},
+ 'bsd-3-clause-lbnl': {'id': 'BSD-3-Clause-LBNL', 'deprecated': False},
+ 'bsd-3-clause-modification': {'id': 'BSD-3-Clause-Modification', 'deprecated': False},
+ 'bsd-3-clause-no-military-license': {'id': 'BSD-3-Clause-No-Military-License', 'deprecated': False},
+ 'bsd-3-clause-no-nuclear-license': {'id': 'BSD-3-Clause-No-Nuclear-License', 'deprecated': False},
+ 'bsd-3-clause-no-nuclear-license-2014': {'id': 'BSD-3-Clause-No-Nuclear-License-2014', 'deprecated': False},
+ 'bsd-3-clause-no-nuclear-warranty': {'id': 'BSD-3-Clause-No-Nuclear-Warranty', 'deprecated': False},
+ 'bsd-3-clause-open-mpi': {'id': 'BSD-3-Clause-Open-MPI', 'deprecated': False},
+ 'bsd-3-clause-sun': {'id': 'BSD-3-Clause-Sun', 'deprecated': False},
+ 'bsd-4-clause': {'id': 'BSD-4-Clause', 'deprecated': False},
+ 'bsd-4-clause-shortened': {'id': 'BSD-4-Clause-Shortened', 'deprecated': False},
+ 'bsd-4-clause-uc': {'id': 'BSD-4-Clause-UC', 'deprecated': False},
+ 'bsd-4.3reno': {'id': 'BSD-4.3RENO', 'deprecated': False},
+ 'bsd-4.3tahoe': {'id': 'BSD-4.3TAHOE', 'deprecated': False},
+ 'bsd-advertising-acknowledgement': {'id': 'BSD-Advertising-Acknowledgement', 'deprecated': False},
+ 'bsd-attribution-hpnd-disclaimer': {'id': 'BSD-Attribution-HPND-disclaimer', 'deprecated': False},
+ 'bsd-inferno-nettverk': {'id': 'BSD-Inferno-Nettverk', 'deprecated': False},
+ 'bsd-protection': {'id': 'BSD-Protection', 'deprecated': False},
+ 'bsd-source-beginning-file': {'id': 'BSD-Source-beginning-file', 'deprecated': False},
+ 'bsd-source-code': {'id': 'BSD-Source-Code', 'deprecated': False},
+ 'bsd-systemics': {'id': 'BSD-Systemics', 'deprecated': False},
+ 'bsd-systemics-w3works': {'id': 'BSD-Systemics-W3Works', 'deprecated': False},
+ 'bsl-1.0': {'id': 'BSL-1.0', 'deprecated': False},
+ 'busl-1.1': {'id': 'BUSL-1.1', 'deprecated': False},
+ 'bzip2-1.0.5': {'id': 'bzip2-1.0.5', 'deprecated': True},
+ 'bzip2-1.0.6': {'id': 'bzip2-1.0.6', 'deprecated': False},
+ 'c-uda-1.0': {'id': 'C-UDA-1.0', 'deprecated': False},
+ 'cal-1.0': {'id': 'CAL-1.0', 'deprecated': False},
+ 'cal-1.0-combined-work-exception': {'id': 'CAL-1.0-Combined-Work-Exception', 'deprecated': False},
+ 'caldera': {'id': 'Caldera', 'deprecated': False},
+ 'caldera-no-preamble': {'id': 'Caldera-no-preamble', 'deprecated': False},
+ 'catharon': {'id': 'Catharon', 'deprecated': False},
+ 'catosl-1.1': {'id': 'CATOSL-1.1', 'deprecated': False},
+ 'cc-by-1.0': {'id': 'CC-BY-1.0', 'deprecated': False},
+ 'cc-by-2.0': {'id': 'CC-BY-2.0', 'deprecated': False},
+ 'cc-by-2.5': {'id': 'CC-BY-2.5', 'deprecated': False},
+ 'cc-by-2.5-au': {'id': 'CC-BY-2.5-AU', 'deprecated': False},
+ 'cc-by-3.0': {'id': 'CC-BY-3.0', 'deprecated': False},
+ 'cc-by-3.0-at': {'id': 'CC-BY-3.0-AT', 'deprecated': False},
+ 'cc-by-3.0-au': {'id': 'CC-BY-3.0-AU', 'deprecated': False},
+ 'cc-by-3.0-de': {'id': 'CC-BY-3.0-DE', 'deprecated': False},
+ 'cc-by-3.0-igo': {'id': 'CC-BY-3.0-IGO', 'deprecated': False},
+ 'cc-by-3.0-nl': {'id': 'CC-BY-3.0-NL', 'deprecated': False},
+ 'cc-by-3.0-us': {'id': 'CC-BY-3.0-US', 'deprecated': False},
+ 'cc-by-4.0': {'id': 'CC-BY-4.0', 'deprecated': False},
+ 'cc-by-nc-1.0': {'id': 'CC-BY-NC-1.0', 'deprecated': False},
+ 'cc-by-nc-2.0': {'id': 'CC-BY-NC-2.0', 'deprecated': False},
+ 'cc-by-nc-2.5': {'id': 'CC-BY-NC-2.5', 'deprecated': False},
+ 'cc-by-nc-3.0': {'id': 'CC-BY-NC-3.0', 'deprecated': False},
+ 'cc-by-nc-3.0-de': {'id': 'CC-BY-NC-3.0-DE', 'deprecated': False},
+ 'cc-by-nc-4.0': {'id': 'CC-BY-NC-4.0', 'deprecated': False},
+ 'cc-by-nc-nd-1.0': {'id': 'CC-BY-NC-ND-1.0', 'deprecated': False},
+ 'cc-by-nc-nd-2.0': {'id': 'CC-BY-NC-ND-2.0', 'deprecated': False},
+ 'cc-by-nc-nd-2.5': {'id': 'CC-BY-NC-ND-2.5', 'deprecated': False},
+ 'cc-by-nc-nd-3.0': {'id': 'CC-BY-NC-ND-3.0', 'deprecated': False},
+ 'cc-by-nc-nd-3.0-de': {'id': 'CC-BY-NC-ND-3.0-DE', 'deprecated': False},
+ 'cc-by-nc-nd-3.0-igo': {'id': 'CC-BY-NC-ND-3.0-IGO', 'deprecated': False},
+ 'cc-by-nc-nd-4.0': {'id': 'CC-BY-NC-ND-4.0', 'deprecated': False},
+ 'cc-by-nc-sa-1.0': {'id': 'CC-BY-NC-SA-1.0', 'deprecated': False},
+ 'cc-by-nc-sa-2.0': {'id': 'CC-BY-NC-SA-2.0', 'deprecated': False},
+ 'cc-by-nc-sa-2.0-de': {'id': 'CC-BY-NC-SA-2.0-DE', 'deprecated': False},
+ 'cc-by-nc-sa-2.0-fr': {'id': 'CC-BY-NC-SA-2.0-FR', 'deprecated': False},
+ 'cc-by-nc-sa-2.0-uk': {'id': 'CC-BY-NC-SA-2.0-UK', 'deprecated': False},
+ 'cc-by-nc-sa-2.5': {'id': 'CC-BY-NC-SA-2.5', 'deprecated': False},
+ 'cc-by-nc-sa-3.0': {'id': 'CC-BY-NC-SA-3.0', 'deprecated': False},
+ 'cc-by-nc-sa-3.0-de': {'id': 'CC-BY-NC-SA-3.0-DE', 'deprecated': False},
+ 'cc-by-nc-sa-3.0-igo': {'id': 'CC-BY-NC-SA-3.0-IGO', 'deprecated': False},
+ 'cc-by-nc-sa-4.0': {'id': 'CC-BY-NC-SA-4.0', 'deprecated': False},
+ 'cc-by-nd-1.0': {'id': 'CC-BY-ND-1.0', 'deprecated': False},
+ 'cc-by-nd-2.0': {'id': 'CC-BY-ND-2.0', 'deprecated': False},
+ 'cc-by-nd-2.5': {'id': 'CC-BY-ND-2.5', 'deprecated': False},
+ 'cc-by-nd-3.0': {'id': 'CC-BY-ND-3.0', 'deprecated': False},
+ 'cc-by-nd-3.0-de': {'id': 'CC-BY-ND-3.0-DE', 'deprecated': False},
+ 'cc-by-nd-4.0': {'id': 'CC-BY-ND-4.0', 'deprecated': False},
+ 'cc-by-sa-1.0': {'id': 'CC-BY-SA-1.0', 'deprecated': False},
+ 'cc-by-sa-2.0': {'id': 'CC-BY-SA-2.0', 'deprecated': False},
+ 'cc-by-sa-2.0-uk': {'id': 'CC-BY-SA-2.0-UK', 'deprecated': False},
+ 'cc-by-sa-2.1-jp': {'id': 'CC-BY-SA-2.1-JP', 'deprecated': False},
+ 'cc-by-sa-2.5': {'id': 'CC-BY-SA-2.5', 'deprecated': False},
+ 'cc-by-sa-3.0': {'id': 'CC-BY-SA-3.0', 'deprecated': False},
+ 'cc-by-sa-3.0-at': {'id': 'CC-BY-SA-3.0-AT', 'deprecated': False},
+ 'cc-by-sa-3.0-de': {'id': 'CC-BY-SA-3.0-DE', 'deprecated': False},
+ 'cc-by-sa-3.0-igo': {'id': 'CC-BY-SA-3.0-IGO', 'deprecated': False},
+ 'cc-by-sa-4.0': {'id': 'CC-BY-SA-4.0', 'deprecated': False},
+ 'cc-pddc': {'id': 'CC-PDDC', 'deprecated': False},
+ 'cc-pdm-1.0': {'id': 'CC-PDM-1.0', 'deprecated': False},
+ 'cc-sa-1.0': {'id': 'CC-SA-1.0', 'deprecated': False},
+ 'cc0-1.0': {'id': 'CC0-1.0', 'deprecated': False},
+ 'cddl-1.0': {'id': 'CDDL-1.0', 'deprecated': False},
+ 'cddl-1.1': {'id': 'CDDL-1.1', 'deprecated': False},
+ 'cdl-1.0': {'id': 'CDL-1.0', 'deprecated': False},
+ 'cdla-permissive-1.0': {'id': 'CDLA-Permissive-1.0', 'deprecated': False},
+ 'cdla-permissive-2.0': {'id': 'CDLA-Permissive-2.0', 'deprecated': False},
+ 'cdla-sharing-1.0': {'id': 'CDLA-Sharing-1.0', 'deprecated': False},
+ 'cecill-1.0': {'id': 'CECILL-1.0', 'deprecated': False},
+ 'cecill-1.1': {'id': 'CECILL-1.1', 'deprecated': False},
+ 'cecill-2.0': {'id': 'CECILL-2.0', 'deprecated': False},
+ 'cecill-2.1': {'id': 'CECILL-2.1', 'deprecated': False},
+ 'cecill-b': {'id': 'CECILL-B', 'deprecated': False},
+ 'cecill-c': {'id': 'CECILL-C', 'deprecated': False},
+ 'cern-ohl-1.1': {'id': 'CERN-OHL-1.1', 'deprecated': False},
+ 'cern-ohl-1.2': {'id': 'CERN-OHL-1.2', 'deprecated': False},
+ 'cern-ohl-p-2.0': {'id': 'CERN-OHL-P-2.0', 'deprecated': False},
+ 'cern-ohl-s-2.0': {'id': 'CERN-OHL-S-2.0', 'deprecated': False},
+ 'cern-ohl-w-2.0': {'id': 'CERN-OHL-W-2.0', 'deprecated': False},
+ 'cfitsio': {'id': 'CFITSIO', 'deprecated': False},
+ 'check-cvs': {'id': 'check-cvs', 'deprecated': False},
+ 'checkmk': {'id': 'checkmk', 'deprecated': False},
+ 'clartistic': {'id': 'ClArtistic', 'deprecated': False},
+ 'clips': {'id': 'Clips', 'deprecated': False},
+ 'cmu-mach': {'id': 'CMU-Mach', 'deprecated': False},
+ 'cmu-mach-nodoc': {'id': 'CMU-Mach-nodoc', 'deprecated': False},
+ 'cnri-jython': {'id': 'CNRI-Jython', 'deprecated': False},
+ 'cnri-python': {'id': 'CNRI-Python', 'deprecated': False},
+ 'cnri-python-gpl-compatible': {'id': 'CNRI-Python-GPL-Compatible', 'deprecated': False},
+ 'coil-1.0': {'id': 'COIL-1.0', 'deprecated': False},
+ 'community-spec-1.0': {'id': 'Community-Spec-1.0', 'deprecated': False},
+ 'condor-1.1': {'id': 'Condor-1.1', 'deprecated': False},
+ 'copyleft-next-0.3.0': {'id': 'copyleft-next-0.3.0', 'deprecated': False},
+ 'copyleft-next-0.3.1': {'id': 'copyleft-next-0.3.1', 'deprecated': False},
+ 'cornell-lossless-jpeg': {'id': 'Cornell-Lossless-JPEG', 'deprecated': False},
+ 'cpal-1.0': {'id': 'CPAL-1.0', 'deprecated': False},
+ 'cpl-1.0': {'id': 'CPL-1.0', 'deprecated': False},
+ 'cpol-1.02': {'id': 'CPOL-1.02', 'deprecated': False},
+ 'cronyx': {'id': 'Cronyx', 'deprecated': False},
+ 'crossword': {'id': 'Crossword', 'deprecated': False},
+ 'cryptoswift': {'id': 'CryptoSwift', 'deprecated': False},
+ 'crystalstacker': {'id': 'CrystalStacker', 'deprecated': False},
+ 'cua-opl-1.0': {'id': 'CUA-OPL-1.0', 'deprecated': False},
+ 'cube': {'id': 'Cube', 'deprecated': False},
+ 'curl': {'id': 'curl', 'deprecated': False},
+ 'cve-tou': {'id': 'cve-tou', 'deprecated': False},
+ 'd-fsl-1.0': {'id': 'D-FSL-1.0', 'deprecated': False},
+ 'dec-3-clause': {'id': 'DEC-3-Clause', 'deprecated': False},
+ 'diffmark': {'id': 'diffmark', 'deprecated': False},
+ 'dl-de-by-2.0': {'id': 'DL-DE-BY-2.0', 'deprecated': False},
+ 'dl-de-zero-2.0': {'id': 'DL-DE-ZERO-2.0', 'deprecated': False},
+ 'doc': {'id': 'DOC', 'deprecated': False},
+ 'docbook-dtd': {'id': 'DocBook-DTD', 'deprecated': False},
+ 'docbook-schema': {'id': 'DocBook-Schema', 'deprecated': False},
+ 'docbook-stylesheet': {'id': 'DocBook-Stylesheet', 'deprecated': False},
+ 'docbook-xml': {'id': 'DocBook-XML', 'deprecated': False},
+ 'dotseqn': {'id': 'Dotseqn', 'deprecated': False},
+ 'drl-1.0': {'id': 'DRL-1.0', 'deprecated': False},
+ 'drl-1.1': {'id': 'DRL-1.1', 'deprecated': False},
+ 'dsdp': {'id': 'DSDP', 'deprecated': False},
+ 'dtoa': {'id': 'dtoa', 'deprecated': False},
+ 'dvipdfm': {'id': 'dvipdfm', 'deprecated': False},
+ 'ecl-1.0': {'id': 'ECL-1.0', 'deprecated': False},
+ 'ecl-2.0': {'id': 'ECL-2.0', 'deprecated': False},
+ 'ecos-2.0': {'id': 'eCos-2.0', 'deprecated': True},
+ 'efl-1.0': {'id': 'EFL-1.0', 'deprecated': False},
+ 'efl-2.0': {'id': 'EFL-2.0', 'deprecated': False},
+ 'egenix': {'id': 'eGenix', 'deprecated': False},
+ 'elastic-2.0': {'id': 'Elastic-2.0', 'deprecated': False},
+ 'entessa': {'id': 'Entessa', 'deprecated': False},
+ 'epics': {'id': 'EPICS', 'deprecated': False},
+ 'epl-1.0': {'id': 'EPL-1.0', 'deprecated': False},
+ 'epl-2.0': {'id': 'EPL-2.0', 'deprecated': False},
+ 'erlpl-1.1': {'id': 'ErlPL-1.1', 'deprecated': False},
+ 'etalab-2.0': {'id': 'etalab-2.0', 'deprecated': False},
+ 'eudatagrid': {'id': 'EUDatagrid', 'deprecated': False},
+ 'eupl-1.0': {'id': 'EUPL-1.0', 'deprecated': False},
+ 'eupl-1.1': {'id': 'EUPL-1.1', 'deprecated': False},
+ 'eupl-1.2': {'id': 'EUPL-1.2', 'deprecated': False},
+ 'eurosym': {'id': 'Eurosym', 'deprecated': False},
+ 'fair': {'id': 'Fair', 'deprecated': False},
+ 'fbm': {'id': 'FBM', 'deprecated': False},
+ 'fdk-aac': {'id': 'FDK-AAC', 'deprecated': False},
+ 'ferguson-twofish': {'id': 'Ferguson-Twofish', 'deprecated': False},
+ 'frameworx-1.0': {'id': 'Frameworx-1.0', 'deprecated': False},
+ 'freebsd-doc': {'id': 'FreeBSD-DOC', 'deprecated': False},
+ 'freeimage': {'id': 'FreeImage', 'deprecated': False},
+ 'fsfap': {'id': 'FSFAP', 'deprecated': False},
+ 'fsfap-no-warranty-disclaimer': {'id': 'FSFAP-no-warranty-disclaimer', 'deprecated': False},
+ 'fsful': {'id': 'FSFUL', 'deprecated': False},
+ 'fsfullr': {'id': 'FSFULLR', 'deprecated': False},
+ 'fsfullrsd': {'id': 'FSFULLRSD', 'deprecated': False},
+ 'fsfullrwd': {'id': 'FSFULLRWD', 'deprecated': False},
+ 'fsl-1.1-alv2': {'id': 'FSL-1.1-ALv2', 'deprecated': False},
+ 'fsl-1.1-mit': {'id': 'FSL-1.1-MIT', 'deprecated': False},
+ 'ftl': {'id': 'FTL', 'deprecated': False},
+ 'furuseth': {'id': 'Furuseth', 'deprecated': False},
+ 'fwlw': {'id': 'fwlw', 'deprecated': False},
+ 'game-programming-gems': {'id': 'Game-Programming-Gems', 'deprecated': False},
+ 'gcr-docs': {'id': 'GCR-docs', 'deprecated': False},
+ 'gd': {'id': 'GD', 'deprecated': False},
+ 'generic-xts': {'id': 'generic-xts', 'deprecated': False},
+ 'gfdl-1.1': {'id': 'GFDL-1.1', 'deprecated': True},
+ 'gfdl-1.1-invariants-only': {'id': 'GFDL-1.1-invariants-only', 'deprecated': False},
+ 'gfdl-1.1-invariants-or-later': {'id': 'GFDL-1.1-invariants-or-later', 'deprecated': False},
+ 'gfdl-1.1-no-invariants-only': {'id': 'GFDL-1.1-no-invariants-only', 'deprecated': False},
+ 'gfdl-1.1-no-invariants-or-later': {'id': 'GFDL-1.1-no-invariants-or-later', 'deprecated': False},
+ 'gfdl-1.1-only': {'id': 'GFDL-1.1-only', 'deprecated': False},
+ 'gfdl-1.1-or-later': {'id': 'GFDL-1.1-or-later', 'deprecated': False},
+ 'gfdl-1.2': {'id': 'GFDL-1.2', 'deprecated': True},
+ 'gfdl-1.2-invariants-only': {'id': 'GFDL-1.2-invariants-only', 'deprecated': False},
+ 'gfdl-1.2-invariants-or-later': {'id': 'GFDL-1.2-invariants-or-later', 'deprecated': False},
+ 'gfdl-1.2-no-invariants-only': {'id': 'GFDL-1.2-no-invariants-only', 'deprecated': False},
+ 'gfdl-1.2-no-invariants-or-later': {'id': 'GFDL-1.2-no-invariants-or-later', 'deprecated': False},
+ 'gfdl-1.2-only': {'id': 'GFDL-1.2-only', 'deprecated': False},
+ 'gfdl-1.2-or-later': {'id': 'GFDL-1.2-or-later', 'deprecated': False},
+ 'gfdl-1.3': {'id': 'GFDL-1.3', 'deprecated': True},
+ 'gfdl-1.3-invariants-only': {'id': 'GFDL-1.3-invariants-only', 'deprecated': False},
+ 'gfdl-1.3-invariants-or-later': {'id': 'GFDL-1.3-invariants-or-later', 'deprecated': False},
+ 'gfdl-1.3-no-invariants-only': {'id': 'GFDL-1.3-no-invariants-only', 'deprecated': False},
+ 'gfdl-1.3-no-invariants-or-later': {'id': 'GFDL-1.3-no-invariants-or-later', 'deprecated': False},
+ 'gfdl-1.3-only': {'id': 'GFDL-1.3-only', 'deprecated': False},
+ 'gfdl-1.3-or-later': {'id': 'GFDL-1.3-or-later', 'deprecated': False},
+ 'giftware': {'id': 'Giftware', 'deprecated': False},
+ 'gl2ps': {'id': 'GL2PS', 'deprecated': False},
+ 'glide': {'id': 'Glide', 'deprecated': False},
+ 'glulxe': {'id': 'Glulxe', 'deprecated': False},
+ 'glwtpl': {'id': 'GLWTPL', 'deprecated': False},
+ 'gnuplot': {'id': 'gnuplot', 'deprecated': False},
+ 'gpl-1.0': {'id': 'GPL-1.0', 'deprecated': True},
+ 'gpl-1.0+': {'id': 'GPL-1.0+', 'deprecated': True},
+ 'gpl-1.0-only': {'id': 'GPL-1.0-only', 'deprecated': False},
+ 'gpl-1.0-or-later': {'id': 'GPL-1.0-or-later', 'deprecated': False},
+ 'gpl-2.0': {'id': 'GPL-2.0', 'deprecated': True},
+ 'gpl-2.0+': {'id': 'GPL-2.0+', 'deprecated': True},
+ 'gpl-2.0-only': {'id': 'GPL-2.0-only', 'deprecated': False},
+ 'gpl-2.0-or-later': {'id': 'GPL-2.0-or-later', 'deprecated': False},
+ 'gpl-2.0-with-autoconf-exception': {'id': 'GPL-2.0-with-autoconf-exception', 'deprecated': True},
+ 'gpl-2.0-with-bison-exception': {'id': 'GPL-2.0-with-bison-exception', 'deprecated': True},
+ 'gpl-2.0-with-classpath-exception': {'id': 'GPL-2.0-with-classpath-exception', 'deprecated': True},
+ 'gpl-2.0-with-font-exception': {'id': 'GPL-2.0-with-font-exception', 'deprecated': True},
+ 'gpl-2.0-with-gcc-exception': {'id': 'GPL-2.0-with-GCC-exception', 'deprecated': True},
+ 'gpl-3.0': {'id': 'GPL-3.0', 'deprecated': True},
+ 'gpl-3.0+': {'id': 'GPL-3.0+', 'deprecated': True},
+ 'gpl-3.0-only': {'id': 'GPL-3.0-only', 'deprecated': False},
+ 'gpl-3.0-or-later': {'id': 'GPL-3.0-or-later', 'deprecated': False},
+ 'gpl-3.0-with-autoconf-exception': {'id': 'GPL-3.0-with-autoconf-exception', 'deprecated': True},
+ 'gpl-3.0-with-gcc-exception': {'id': 'GPL-3.0-with-GCC-exception', 'deprecated': True},
+ 'graphics-gems': {'id': 'Graphics-Gems', 'deprecated': False},
+ 'gsoap-1.3b': {'id': 'gSOAP-1.3b', 'deprecated': False},
+ 'gtkbook': {'id': 'gtkbook', 'deprecated': False},
+ 'gutmann': {'id': 'Gutmann', 'deprecated': False},
+ 'haskellreport': {'id': 'HaskellReport', 'deprecated': False},
+ 'hdf5': {'id': 'HDF5', 'deprecated': False},
+ 'hdparm': {'id': 'hdparm', 'deprecated': False},
+ 'hidapi': {'id': 'HIDAPI', 'deprecated': False},
+ 'hippocratic-2.1': {'id': 'Hippocratic-2.1', 'deprecated': False},
+ 'hp-1986': {'id': 'HP-1986', 'deprecated': False},
+ 'hp-1989': {'id': 'HP-1989', 'deprecated': False},
+ 'hpnd': {'id': 'HPND', 'deprecated': False},
+ 'hpnd-dec': {'id': 'HPND-DEC', 'deprecated': False},
+ 'hpnd-doc': {'id': 'HPND-doc', 'deprecated': False},
+ 'hpnd-doc-sell': {'id': 'HPND-doc-sell', 'deprecated': False},
+ 'hpnd-export-us': {'id': 'HPND-export-US', 'deprecated': False},
+ 'hpnd-export-us-acknowledgement': {'id': 'HPND-export-US-acknowledgement', 'deprecated': False},
+ 'hpnd-export-us-modify': {'id': 'HPND-export-US-modify', 'deprecated': False},
+ 'hpnd-export2-us': {'id': 'HPND-export2-US', 'deprecated': False},
+ 'hpnd-fenneberg-livingston': {'id': 'HPND-Fenneberg-Livingston', 'deprecated': False},
+ 'hpnd-inria-imag': {'id': 'HPND-INRIA-IMAG', 'deprecated': False},
+ 'hpnd-intel': {'id': 'HPND-Intel', 'deprecated': False},
+ 'hpnd-kevlin-henney': {'id': 'HPND-Kevlin-Henney', 'deprecated': False},
+ 'hpnd-markus-kuhn': {'id': 'HPND-Markus-Kuhn', 'deprecated': False},
+ 'hpnd-merchantability-variant': {'id': 'HPND-merchantability-variant', 'deprecated': False},
+ 'hpnd-mit-disclaimer': {'id': 'HPND-MIT-disclaimer', 'deprecated': False},
+ 'hpnd-netrek': {'id': 'HPND-Netrek', 'deprecated': False},
+ 'hpnd-pbmplus': {'id': 'HPND-Pbmplus', 'deprecated': False},
+ 'hpnd-sell-mit-disclaimer-xserver': {'id': 'HPND-sell-MIT-disclaimer-xserver', 'deprecated': False},
+ 'hpnd-sell-regexpr': {'id': 'HPND-sell-regexpr', 'deprecated': False},
+ 'hpnd-sell-variant': {'id': 'HPND-sell-variant', 'deprecated': False},
+ 'hpnd-sell-variant-mit-disclaimer': {'id': 'HPND-sell-variant-MIT-disclaimer', 'deprecated': False},
+ 'hpnd-sell-variant-mit-disclaimer-rev': {'id': 'HPND-sell-variant-MIT-disclaimer-rev', 'deprecated': False},
+ 'hpnd-uc': {'id': 'HPND-UC', 'deprecated': False},
+ 'hpnd-uc-export-us': {'id': 'HPND-UC-export-US', 'deprecated': False},
+ 'htmltidy': {'id': 'HTMLTIDY', 'deprecated': False},
+ 'ibm-pibs': {'id': 'IBM-pibs', 'deprecated': False},
+ 'icu': {'id': 'ICU', 'deprecated': False},
+ 'iec-code-components-eula': {'id': 'IEC-Code-Components-EULA', 'deprecated': False},
+ 'ijg': {'id': 'IJG', 'deprecated': False},
+ 'ijg-short': {'id': 'IJG-short', 'deprecated': False},
+ 'imagemagick': {'id': 'ImageMagick', 'deprecated': False},
+ 'imatix': {'id': 'iMatix', 'deprecated': False},
+ 'imlib2': {'id': 'Imlib2', 'deprecated': False},
+ 'info-zip': {'id': 'Info-ZIP', 'deprecated': False},
+ 'inner-net-2.0': {'id': 'Inner-Net-2.0', 'deprecated': False},
+ 'innosetup': {'id': 'InnoSetup', 'deprecated': False},
+ 'intel': {'id': 'Intel', 'deprecated': False},
+ 'intel-acpi': {'id': 'Intel-ACPI', 'deprecated': False},
+ 'interbase-1.0': {'id': 'Interbase-1.0', 'deprecated': False},
+ 'ipa': {'id': 'IPA', 'deprecated': False},
+ 'ipl-1.0': {'id': 'IPL-1.0', 'deprecated': False},
+ 'isc': {'id': 'ISC', 'deprecated': False},
+ 'isc-veillard': {'id': 'ISC-Veillard', 'deprecated': False},
+ 'jam': {'id': 'Jam', 'deprecated': False},
+ 'jasper-2.0': {'id': 'JasPer-2.0', 'deprecated': False},
+ 'jove': {'id': 'jove', 'deprecated': False},
+ 'jpl-image': {'id': 'JPL-image', 'deprecated': False},
+ 'jpnic': {'id': 'JPNIC', 'deprecated': False},
+ 'json': {'id': 'JSON', 'deprecated': False},
+ 'kastrup': {'id': 'Kastrup', 'deprecated': False},
+ 'kazlib': {'id': 'Kazlib', 'deprecated': False},
+ 'knuth-ctan': {'id': 'Knuth-CTAN', 'deprecated': False},
+ 'lal-1.2': {'id': 'LAL-1.2', 'deprecated': False},
+ 'lal-1.3': {'id': 'LAL-1.3', 'deprecated': False},
+ 'latex2e': {'id': 'Latex2e', 'deprecated': False},
+ 'latex2e-translated-notice': {'id': 'Latex2e-translated-notice', 'deprecated': False},
+ 'leptonica': {'id': 'Leptonica', 'deprecated': False},
+ 'lgpl-2.0': {'id': 'LGPL-2.0', 'deprecated': True},
+ 'lgpl-2.0+': {'id': 'LGPL-2.0+', 'deprecated': True},
+ 'lgpl-2.0-only': {'id': 'LGPL-2.0-only', 'deprecated': False},
+ 'lgpl-2.0-or-later': {'id': 'LGPL-2.0-or-later', 'deprecated': False},
+ 'lgpl-2.1': {'id': 'LGPL-2.1', 'deprecated': True},
+ 'lgpl-2.1+': {'id': 'LGPL-2.1+', 'deprecated': True},
+ 'lgpl-2.1-only': {'id': 'LGPL-2.1-only', 'deprecated': False},
+ 'lgpl-2.1-or-later': {'id': 'LGPL-2.1-or-later', 'deprecated': False},
+ 'lgpl-3.0': {'id': 'LGPL-3.0', 'deprecated': True},
+ 'lgpl-3.0+': {'id': 'LGPL-3.0+', 'deprecated': True},
+ 'lgpl-3.0-only': {'id': 'LGPL-3.0-only', 'deprecated': False},
+ 'lgpl-3.0-or-later': {'id': 'LGPL-3.0-or-later', 'deprecated': False},
+ 'lgpllr': {'id': 'LGPLLR', 'deprecated': False},
+ 'libpng': {'id': 'Libpng', 'deprecated': False},
+ 'libpng-1.6.35': {'id': 'libpng-1.6.35', 'deprecated': False},
+ 'libpng-2.0': {'id': 'libpng-2.0', 'deprecated': False},
+ 'libselinux-1.0': {'id': 'libselinux-1.0', 'deprecated': False},
+ 'libtiff': {'id': 'libtiff', 'deprecated': False},
+ 'libutil-david-nugent': {'id': 'libutil-David-Nugent', 'deprecated': False},
+ 'liliq-p-1.1': {'id': 'LiLiQ-P-1.1', 'deprecated': False},
+ 'liliq-r-1.1': {'id': 'LiLiQ-R-1.1', 'deprecated': False},
+ 'liliq-rplus-1.1': {'id': 'LiLiQ-Rplus-1.1', 'deprecated': False},
+ 'linux-man-pages-1-para': {'id': 'Linux-man-pages-1-para', 'deprecated': False},
+ 'linux-man-pages-copyleft': {'id': 'Linux-man-pages-copyleft', 'deprecated': False},
+ 'linux-man-pages-copyleft-2-para': {'id': 'Linux-man-pages-copyleft-2-para', 'deprecated': False},
+ 'linux-man-pages-copyleft-var': {'id': 'Linux-man-pages-copyleft-var', 'deprecated': False},
+ 'linux-openib': {'id': 'Linux-OpenIB', 'deprecated': False},
+ 'loop': {'id': 'LOOP', 'deprecated': False},
+ 'lpd-document': {'id': 'LPD-document', 'deprecated': False},
+ 'lpl-1.0': {'id': 'LPL-1.0', 'deprecated': False},
+ 'lpl-1.02': {'id': 'LPL-1.02', 'deprecated': False},
+ 'lppl-1.0': {'id': 'LPPL-1.0', 'deprecated': False},
+ 'lppl-1.1': {'id': 'LPPL-1.1', 'deprecated': False},
+ 'lppl-1.2': {'id': 'LPPL-1.2', 'deprecated': False},
+ 'lppl-1.3a': {'id': 'LPPL-1.3a', 'deprecated': False},
+ 'lppl-1.3c': {'id': 'LPPL-1.3c', 'deprecated': False},
+ 'lsof': {'id': 'lsof', 'deprecated': False},
+ 'lucida-bitmap-fonts': {'id': 'Lucida-Bitmap-Fonts', 'deprecated': False},
+ 'lzma-sdk-9.11-to-9.20': {'id': 'LZMA-SDK-9.11-to-9.20', 'deprecated': False},
+ 'lzma-sdk-9.22': {'id': 'LZMA-SDK-9.22', 'deprecated': False},
+ 'mackerras-3-clause': {'id': 'Mackerras-3-Clause', 'deprecated': False},
+ 'mackerras-3-clause-acknowledgment': {'id': 'Mackerras-3-Clause-acknowledgment', 'deprecated': False},
+ 'magaz': {'id': 'magaz', 'deprecated': False},
+ 'mailprio': {'id': 'mailprio', 'deprecated': False},
+ 'makeindex': {'id': 'MakeIndex', 'deprecated': False},
+ 'man2html': {'id': 'man2html', 'deprecated': False},
+ 'martin-birgmeier': {'id': 'Martin-Birgmeier', 'deprecated': False},
+ 'mcphee-slideshow': {'id': 'McPhee-slideshow', 'deprecated': False},
+ 'metamail': {'id': 'metamail', 'deprecated': False},
+ 'minpack': {'id': 'Minpack', 'deprecated': False},
+ 'mips': {'id': 'MIPS', 'deprecated': False},
+ 'miros': {'id': 'MirOS', 'deprecated': False},
+ 'mit': {'id': 'MIT', 'deprecated': False},
+ 'mit-0': {'id': 'MIT-0', 'deprecated': False},
+ 'mit-advertising': {'id': 'MIT-advertising', 'deprecated': False},
+ 'mit-click': {'id': 'MIT-Click', 'deprecated': False},
+ 'mit-cmu': {'id': 'MIT-CMU', 'deprecated': False},
+ 'mit-enna': {'id': 'MIT-enna', 'deprecated': False},
+ 'mit-feh': {'id': 'MIT-feh', 'deprecated': False},
+ 'mit-festival': {'id': 'MIT-Festival', 'deprecated': False},
+ 'mit-khronos-old': {'id': 'MIT-Khronos-old', 'deprecated': False},
+ 'mit-modern-variant': {'id': 'MIT-Modern-Variant', 'deprecated': False},
+ 'mit-open-group': {'id': 'MIT-open-group', 'deprecated': False},
+ 'mit-testregex': {'id': 'MIT-testregex', 'deprecated': False},
+ 'mit-wu': {'id': 'MIT-Wu', 'deprecated': False},
+ 'mitnfa': {'id': 'MITNFA', 'deprecated': False},
+ 'mmixware': {'id': 'MMIXware', 'deprecated': False},
+ 'motosoto': {'id': 'Motosoto', 'deprecated': False},
+ 'mpeg-ssg': {'id': 'MPEG-SSG', 'deprecated': False},
+ 'mpi-permissive': {'id': 'mpi-permissive', 'deprecated': False},
+ 'mpich2': {'id': 'mpich2', 'deprecated': False},
+ 'mpl-1.0': {'id': 'MPL-1.0', 'deprecated': False},
+ 'mpl-1.1': {'id': 'MPL-1.1', 'deprecated': False},
+ 'mpl-2.0': {'id': 'MPL-2.0', 'deprecated': False},
+ 'mpl-2.0-no-copyleft-exception': {'id': 'MPL-2.0-no-copyleft-exception', 'deprecated': False},
+ 'mplus': {'id': 'mplus', 'deprecated': False},
+ 'ms-lpl': {'id': 'MS-LPL', 'deprecated': False},
+ 'ms-pl': {'id': 'MS-PL', 'deprecated': False},
+ 'ms-rl': {'id': 'MS-RL', 'deprecated': False},
+ 'mtll': {'id': 'MTLL', 'deprecated': False},
+ 'mulanpsl-1.0': {'id': 'MulanPSL-1.0', 'deprecated': False},
+ 'mulanpsl-2.0': {'id': 'MulanPSL-2.0', 'deprecated': False},
+ 'multics': {'id': 'Multics', 'deprecated': False},
+ 'mup': {'id': 'Mup', 'deprecated': False},
+ 'naist-2003': {'id': 'NAIST-2003', 'deprecated': False},
+ 'nasa-1.3': {'id': 'NASA-1.3', 'deprecated': False},
+ 'naumen': {'id': 'Naumen', 'deprecated': False},
+ 'nbpl-1.0': {'id': 'NBPL-1.0', 'deprecated': False},
+ 'ncbi-pd': {'id': 'NCBI-PD', 'deprecated': False},
+ 'ncgl-uk-2.0': {'id': 'NCGL-UK-2.0', 'deprecated': False},
+ 'ncl': {'id': 'NCL', 'deprecated': False},
+ 'ncsa': {'id': 'NCSA', 'deprecated': False},
+ 'net-snmp': {'id': 'Net-SNMP', 'deprecated': True},
+ 'netcdf': {'id': 'NetCDF', 'deprecated': False},
+ 'newsletr': {'id': 'Newsletr', 'deprecated': False},
+ 'ngpl': {'id': 'NGPL', 'deprecated': False},
+ 'ngrep': {'id': 'ngrep', 'deprecated': False},
+ 'nicta-1.0': {'id': 'NICTA-1.0', 'deprecated': False},
+ 'nist-pd': {'id': 'NIST-PD', 'deprecated': False},
+ 'nist-pd-fallback': {'id': 'NIST-PD-fallback', 'deprecated': False},
+ 'nist-software': {'id': 'NIST-Software', 'deprecated': False},
+ 'nlod-1.0': {'id': 'NLOD-1.0', 'deprecated': False},
+ 'nlod-2.0': {'id': 'NLOD-2.0', 'deprecated': False},
+ 'nlpl': {'id': 'NLPL', 'deprecated': False},
+ 'nokia': {'id': 'Nokia', 'deprecated': False},
+ 'nosl': {'id': 'NOSL', 'deprecated': False},
+ 'noweb': {'id': 'Noweb', 'deprecated': False},
+ 'npl-1.0': {'id': 'NPL-1.0', 'deprecated': False},
+ 'npl-1.1': {'id': 'NPL-1.1', 'deprecated': False},
+ 'nposl-3.0': {'id': 'NPOSL-3.0', 'deprecated': False},
+ 'nrl': {'id': 'NRL', 'deprecated': False},
+ 'ntia-pd': {'id': 'NTIA-PD', 'deprecated': False},
+ 'ntp': {'id': 'NTP', 'deprecated': False},
+ 'ntp-0': {'id': 'NTP-0', 'deprecated': False},
+ 'nunit': {'id': 'Nunit', 'deprecated': True},
+ 'o-uda-1.0': {'id': 'O-UDA-1.0', 'deprecated': False},
+ 'oar': {'id': 'OAR', 'deprecated': False},
+ 'occt-pl': {'id': 'OCCT-PL', 'deprecated': False},
+ 'oclc-2.0': {'id': 'OCLC-2.0', 'deprecated': False},
+ 'odbl-1.0': {'id': 'ODbL-1.0', 'deprecated': False},
+ 'odc-by-1.0': {'id': 'ODC-By-1.0', 'deprecated': False},
+ 'offis': {'id': 'OFFIS', 'deprecated': False},
+ 'ofl-1.0': {'id': 'OFL-1.0', 'deprecated': False},
+ 'ofl-1.0-no-rfn': {'id': 'OFL-1.0-no-RFN', 'deprecated': False},
+ 'ofl-1.0-rfn': {'id': 'OFL-1.0-RFN', 'deprecated': False},
+ 'ofl-1.1': {'id': 'OFL-1.1', 'deprecated': False},
+ 'ofl-1.1-no-rfn': {'id': 'OFL-1.1-no-RFN', 'deprecated': False},
+ 'ofl-1.1-rfn': {'id': 'OFL-1.1-RFN', 'deprecated': False},
+ 'ogc-1.0': {'id': 'OGC-1.0', 'deprecated': False},
+ 'ogdl-taiwan-1.0': {'id': 'OGDL-Taiwan-1.0', 'deprecated': False},
+ 'ogl-canada-2.0': {'id': 'OGL-Canada-2.0', 'deprecated': False},
+ 'ogl-uk-1.0': {'id': 'OGL-UK-1.0', 'deprecated': False},
+ 'ogl-uk-2.0': {'id': 'OGL-UK-2.0', 'deprecated': False},
+ 'ogl-uk-3.0': {'id': 'OGL-UK-3.0', 'deprecated': False},
+ 'ogtsl': {'id': 'OGTSL', 'deprecated': False},
+ 'oldap-1.1': {'id': 'OLDAP-1.1', 'deprecated': False},
+ 'oldap-1.2': {'id': 'OLDAP-1.2', 'deprecated': False},
+ 'oldap-1.3': {'id': 'OLDAP-1.3', 'deprecated': False},
+ 'oldap-1.4': {'id': 'OLDAP-1.4', 'deprecated': False},
+ 'oldap-2.0': {'id': 'OLDAP-2.0', 'deprecated': False},
+ 'oldap-2.0.1': {'id': 'OLDAP-2.0.1', 'deprecated': False},
+ 'oldap-2.1': {'id': 'OLDAP-2.1', 'deprecated': False},
+ 'oldap-2.2': {'id': 'OLDAP-2.2', 'deprecated': False},
+ 'oldap-2.2.1': {'id': 'OLDAP-2.2.1', 'deprecated': False},
+ 'oldap-2.2.2': {'id': 'OLDAP-2.2.2', 'deprecated': False},
+ 'oldap-2.3': {'id': 'OLDAP-2.3', 'deprecated': False},
+ 'oldap-2.4': {'id': 'OLDAP-2.4', 'deprecated': False},
+ 'oldap-2.5': {'id': 'OLDAP-2.5', 'deprecated': False},
+ 'oldap-2.6': {'id': 'OLDAP-2.6', 'deprecated': False},
+ 'oldap-2.7': {'id': 'OLDAP-2.7', 'deprecated': False},
+ 'oldap-2.8': {'id': 'OLDAP-2.8', 'deprecated': False},
+ 'olfl-1.3': {'id': 'OLFL-1.3', 'deprecated': False},
+ 'oml': {'id': 'OML', 'deprecated': False},
+ 'openpbs-2.3': {'id': 'OpenPBS-2.3', 'deprecated': False},
+ 'openssl': {'id': 'OpenSSL', 'deprecated': False},
+ 'openssl-standalone': {'id': 'OpenSSL-standalone', 'deprecated': False},
+ 'openvision': {'id': 'OpenVision', 'deprecated': False},
+ 'opl-1.0': {'id': 'OPL-1.0', 'deprecated': False},
+ 'opl-uk-3.0': {'id': 'OPL-UK-3.0', 'deprecated': False},
+ 'opubl-1.0': {'id': 'OPUBL-1.0', 'deprecated': False},
+ 'oset-pl-2.1': {'id': 'OSET-PL-2.1', 'deprecated': False},
+ 'osl-1.0': {'id': 'OSL-1.0', 'deprecated': False},
+ 'osl-1.1': {'id': 'OSL-1.1', 'deprecated': False},
+ 'osl-2.0': {'id': 'OSL-2.0', 'deprecated': False},
+ 'osl-2.1': {'id': 'OSL-2.1', 'deprecated': False},
+ 'osl-3.0': {'id': 'OSL-3.0', 'deprecated': False},
+ 'padl': {'id': 'PADL', 'deprecated': False},
+ 'parity-6.0.0': {'id': 'Parity-6.0.0', 'deprecated': False},
+ 'parity-7.0.0': {'id': 'Parity-7.0.0', 'deprecated': False},
+ 'pddl-1.0': {'id': 'PDDL-1.0', 'deprecated': False},
+ 'php-3.0': {'id': 'PHP-3.0', 'deprecated': False},
+ 'php-3.01': {'id': 'PHP-3.01', 'deprecated': False},
+ 'pixar': {'id': 'Pixar', 'deprecated': False},
+ 'pkgconf': {'id': 'pkgconf', 'deprecated': False},
+ 'plexus': {'id': 'Plexus', 'deprecated': False},
+ 'pnmstitch': {'id': 'pnmstitch', 'deprecated': False},
+ 'polyform-noncommercial-1.0.0': {'id': 'PolyForm-Noncommercial-1.0.0', 'deprecated': False},
+ 'polyform-small-business-1.0.0': {'id': 'PolyForm-Small-Business-1.0.0', 'deprecated': False},
+ 'postgresql': {'id': 'PostgreSQL', 'deprecated': False},
+ 'ppl': {'id': 'PPL', 'deprecated': False},
+ 'psf-2.0': {'id': 'PSF-2.0', 'deprecated': False},
+ 'psfrag': {'id': 'psfrag', 'deprecated': False},
+ 'psutils': {'id': 'psutils', 'deprecated': False},
+ 'python-2.0': {'id': 'Python-2.0', 'deprecated': False},
+ 'python-2.0.1': {'id': 'Python-2.0.1', 'deprecated': False},
+ 'python-ldap': {'id': 'python-ldap', 'deprecated': False},
+ 'qhull': {'id': 'Qhull', 'deprecated': False},
+ 'qpl-1.0': {'id': 'QPL-1.0', 'deprecated': False},
+ 'qpl-1.0-inria-2004': {'id': 'QPL-1.0-INRIA-2004', 'deprecated': False},
+ 'radvd': {'id': 'radvd', 'deprecated': False},
+ 'rdisc': {'id': 'Rdisc', 'deprecated': False},
+ 'rhecos-1.1': {'id': 'RHeCos-1.1', 'deprecated': False},
+ 'rpl-1.1': {'id': 'RPL-1.1', 'deprecated': False},
+ 'rpl-1.5': {'id': 'RPL-1.5', 'deprecated': False},
+ 'rpsl-1.0': {'id': 'RPSL-1.0', 'deprecated': False},
+ 'rsa-md': {'id': 'RSA-MD', 'deprecated': False},
+ 'rscpl': {'id': 'RSCPL', 'deprecated': False},
+ 'ruby': {'id': 'Ruby', 'deprecated': False},
+ 'ruby-pty': {'id': 'Ruby-pty', 'deprecated': False},
+ 'sax-pd': {'id': 'SAX-PD', 'deprecated': False},
+ 'sax-pd-2.0': {'id': 'SAX-PD-2.0', 'deprecated': False},
+ 'saxpath': {'id': 'Saxpath', 'deprecated': False},
+ 'scea': {'id': 'SCEA', 'deprecated': False},
+ 'schemereport': {'id': 'SchemeReport', 'deprecated': False},
+ 'sendmail': {'id': 'Sendmail', 'deprecated': False},
+ 'sendmail-8.23': {'id': 'Sendmail-8.23', 'deprecated': False},
+ 'sendmail-open-source-1.1': {'id': 'Sendmail-Open-Source-1.1', 'deprecated': False},
+ 'sgi-b-1.0': {'id': 'SGI-B-1.0', 'deprecated': False},
+ 'sgi-b-1.1': {'id': 'SGI-B-1.1', 'deprecated': False},
+ 'sgi-b-2.0': {'id': 'SGI-B-2.0', 'deprecated': False},
+ 'sgi-opengl': {'id': 'SGI-OpenGL', 'deprecated': False},
+ 'sgp4': {'id': 'SGP4', 'deprecated': False},
+ 'shl-0.5': {'id': 'SHL-0.5', 'deprecated': False},
+ 'shl-0.51': {'id': 'SHL-0.51', 'deprecated': False},
+ 'simpl-2.0': {'id': 'SimPL-2.0', 'deprecated': False},
+ 'sissl': {'id': 'SISSL', 'deprecated': False},
+ 'sissl-1.2': {'id': 'SISSL-1.2', 'deprecated': False},
+ 'sl': {'id': 'SL', 'deprecated': False},
+ 'sleepycat': {'id': 'Sleepycat', 'deprecated': False},
+ 'smail-gpl': {'id': 'SMAIL-GPL', 'deprecated': False},
+ 'smlnj': {'id': 'SMLNJ', 'deprecated': False},
+ 'smppl': {'id': 'SMPPL', 'deprecated': False},
+ 'snia': {'id': 'SNIA', 'deprecated': False},
+ 'snprintf': {'id': 'snprintf', 'deprecated': False},
+ 'sofa': {'id': 'SOFA', 'deprecated': False},
+ 'softsurfer': {'id': 'softSurfer', 'deprecated': False},
+ 'soundex': {'id': 'Soundex', 'deprecated': False},
+ 'spencer-86': {'id': 'Spencer-86', 'deprecated': False},
+ 'spencer-94': {'id': 'Spencer-94', 'deprecated': False},
+ 'spencer-99': {'id': 'Spencer-99', 'deprecated': False},
+ 'spl-1.0': {'id': 'SPL-1.0', 'deprecated': False},
+ 'ssh-keyscan': {'id': 'ssh-keyscan', 'deprecated': False},
+ 'ssh-openssh': {'id': 'SSH-OpenSSH', 'deprecated': False},
+ 'ssh-short': {'id': 'SSH-short', 'deprecated': False},
+ 'ssleay-standalone': {'id': 'SSLeay-standalone', 'deprecated': False},
+ 'sspl-1.0': {'id': 'SSPL-1.0', 'deprecated': False},
+ 'standardml-nj': {'id': 'StandardML-NJ', 'deprecated': True},
+ 'sugarcrm-1.1.3': {'id': 'SugarCRM-1.1.3', 'deprecated': False},
+ 'sul-1.0': {'id': 'SUL-1.0', 'deprecated': False},
+ 'sun-ppp': {'id': 'Sun-PPP', 'deprecated': False},
+ 'sun-ppp-2000': {'id': 'Sun-PPP-2000', 'deprecated': False},
+ 'sunpro': {'id': 'SunPro', 'deprecated': False},
+ 'swl': {'id': 'SWL', 'deprecated': False},
+ 'swrule': {'id': 'swrule', 'deprecated': False},
+ 'symlinks': {'id': 'Symlinks', 'deprecated': False},
+ 'tapr-ohl-1.0': {'id': 'TAPR-OHL-1.0', 'deprecated': False},
+ 'tcl': {'id': 'TCL', 'deprecated': False},
+ 'tcp-wrappers': {'id': 'TCP-wrappers', 'deprecated': False},
+ 'termreadkey': {'id': 'TermReadKey', 'deprecated': False},
+ 'tgppl-1.0': {'id': 'TGPPL-1.0', 'deprecated': False},
+ 'thirdeye': {'id': 'ThirdEye', 'deprecated': False},
+ 'threeparttable': {'id': 'threeparttable', 'deprecated': False},
+ 'tmate': {'id': 'TMate', 'deprecated': False},
+ 'torque-1.1': {'id': 'TORQUE-1.1', 'deprecated': False},
+ 'tosl': {'id': 'TOSL', 'deprecated': False},
+ 'tpdl': {'id': 'TPDL', 'deprecated': False},
+ 'tpl-1.0': {'id': 'TPL-1.0', 'deprecated': False},
+ 'trustedqsl': {'id': 'TrustedQSL', 'deprecated': False},
+ 'ttwl': {'id': 'TTWL', 'deprecated': False},
+ 'ttyp0': {'id': 'TTYP0', 'deprecated': False},
+ 'tu-berlin-1.0': {'id': 'TU-Berlin-1.0', 'deprecated': False},
+ 'tu-berlin-2.0': {'id': 'TU-Berlin-2.0', 'deprecated': False},
+ 'ubuntu-font-1.0': {'id': 'Ubuntu-font-1.0', 'deprecated': False},
+ 'ucar': {'id': 'UCAR', 'deprecated': False},
+ 'ucl-1.0': {'id': 'UCL-1.0', 'deprecated': False},
+ 'ulem': {'id': 'ulem', 'deprecated': False},
+ 'umich-merit': {'id': 'UMich-Merit', 'deprecated': False},
+ 'unicode-3.0': {'id': 'Unicode-3.0', 'deprecated': False},
+ 'unicode-dfs-2015': {'id': 'Unicode-DFS-2015', 'deprecated': False},
+ 'unicode-dfs-2016': {'id': 'Unicode-DFS-2016', 'deprecated': False},
+ 'unicode-tou': {'id': 'Unicode-TOU', 'deprecated': False},
+ 'unixcrypt': {'id': 'UnixCrypt', 'deprecated': False},
+ 'unlicense': {'id': 'Unlicense', 'deprecated': False},
+ 'unlicense-libtelnet': {'id': 'Unlicense-libtelnet', 'deprecated': False},
+ 'unlicense-libwhirlpool': {'id': 'Unlicense-libwhirlpool', 'deprecated': False},
+ 'upl-1.0': {'id': 'UPL-1.0', 'deprecated': False},
+ 'urt-rle': {'id': 'URT-RLE', 'deprecated': False},
+ 'vim': {'id': 'Vim', 'deprecated': False},
+ 'vostrom': {'id': 'VOSTROM', 'deprecated': False},
+ 'vsl-1.0': {'id': 'VSL-1.0', 'deprecated': False},
+ 'w3c': {'id': 'W3C', 'deprecated': False},
+ 'w3c-19980720': {'id': 'W3C-19980720', 'deprecated': False},
+ 'w3c-20150513': {'id': 'W3C-20150513', 'deprecated': False},
+ 'w3m': {'id': 'w3m', 'deprecated': False},
+ 'watcom-1.0': {'id': 'Watcom-1.0', 'deprecated': False},
+ 'widget-workshop': {'id': 'Widget-Workshop', 'deprecated': False},
+ 'wsuipa': {'id': 'Wsuipa', 'deprecated': False},
+ 'wtfpl': {'id': 'WTFPL', 'deprecated': False},
+ 'wwl': {'id': 'wwl', 'deprecated': False},
+ 'wxwindows': {'id': 'wxWindows', 'deprecated': True},
+ 'x11': {'id': 'X11', 'deprecated': False},
+ 'x11-distribute-modifications-variant': {'id': 'X11-distribute-modifications-variant', 'deprecated': False},
+ 'x11-swapped': {'id': 'X11-swapped', 'deprecated': False},
+ 'xdebug-1.03': {'id': 'Xdebug-1.03', 'deprecated': False},
+ 'xerox': {'id': 'Xerox', 'deprecated': False},
+ 'xfig': {'id': 'Xfig', 'deprecated': False},
+ 'xfree86-1.1': {'id': 'XFree86-1.1', 'deprecated': False},
+ 'xinetd': {'id': 'xinetd', 'deprecated': False},
+ 'xkeyboard-config-zinoviev': {'id': 'xkeyboard-config-Zinoviev', 'deprecated': False},
+ 'xlock': {'id': 'xlock', 'deprecated': False},
+ 'xnet': {'id': 'Xnet', 'deprecated': False},
+ 'xpp': {'id': 'xpp', 'deprecated': False},
+ 'xskat': {'id': 'XSkat', 'deprecated': False},
+ 'xzoom': {'id': 'xzoom', 'deprecated': False},
+ 'ypl-1.0': {'id': 'YPL-1.0', 'deprecated': False},
+ 'ypl-1.1': {'id': 'YPL-1.1', 'deprecated': False},
+ 'zed': {'id': 'Zed', 'deprecated': False},
+ 'zeeff': {'id': 'Zeeff', 'deprecated': False},
+ 'zend-2.0': {'id': 'Zend-2.0', 'deprecated': False},
+ 'zimbra-1.3': {'id': 'Zimbra-1.3', 'deprecated': False},
+ 'zimbra-1.4': {'id': 'Zimbra-1.4', 'deprecated': False},
+ 'zlib': {'id': 'Zlib', 'deprecated': False},
+ 'zlib-acknowledgement': {'id': 'zlib-acknowledgement', 'deprecated': False},
+ 'zpl-1.1': {'id': 'ZPL-1.1', 'deprecated': False},
+ 'zpl-2.0': {'id': 'ZPL-2.0', 'deprecated': False},
+ 'zpl-2.1': {'id': 'ZPL-2.1', 'deprecated': False},
+}
+
+EXCEPTIONS: dict[str, SPDXException] = {
+ '389-exception': {'id': '389-exception', 'deprecated': False},
+ 'asterisk-exception': {'id': 'Asterisk-exception', 'deprecated': False},
+ 'asterisk-linking-protocols-exception': {'id': 'Asterisk-linking-protocols-exception', 'deprecated': False},
+ 'autoconf-exception-2.0': {'id': 'Autoconf-exception-2.0', 'deprecated': False},
+ 'autoconf-exception-3.0': {'id': 'Autoconf-exception-3.0', 'deprecated': False},
+ 'autoconf-exception-generic': {'id': 'Autoconf-exception-generic', 'deprecated': False},
+ 'autoconf-exception-generic-3.0': {'id': 'Autoconf-exception-generic-3.0', 'deprecated': False},
+ 'autoconf-exception-macro': {'id': 'Autoconf-exception-macro', 'deprecated': False},
+ 'bison-exception-1.24': {'id': 'Bison-exception-1.24', 'deprecated': False},
+ 'bison-exception-2.2': {'id': 'Bison-exception-2.2', 'deprecated': False},
+ 'bootloader-exception': {'id': 'Bootloader-exception', 'deprecated': False},
+ 'cgal-linking-exception': {'id': 'CGAL-linking-exception', 'deprecated': False},
+ 'classpath-exception-2.0': {'id': 'Classpath-exception-2.0', 'deprecated': False},
+ 'clisp-exception-2.0': {'id': 'CLISP-exception-2.0', 'deprecated': False},
+ 'cryptsetup-openssl-exception': {'id': 'cryptsetup-OpenSSL-exception', 'deprecated': False},
+ 'digia-qt-lgpl-exception-1.1': {'id': 'Digia-Qt-LGPL-exception-1.1', 'deprecated': False},
+ 'digirule-foss-exception': {'id': 'DigiRule-FOSS-exception', 'deprecated': False},
+ 'ecos-exception-2.0': {'id': 'eCos-exception-2.0', 'deprecated': False},
+ 'erlang-otp-linking-exception': {'id': 'erlang-otp-linking-exception', 'deprecated': False},
+ 'fawkes-runtime-exception': {'id': 'Fawkes-Runtime-exception', 'deprecated': False},
+ 'fltk-exception': {'id': 'FLTK-exception', 'deprecated': False},
+ 'fmt-exception': {'id': 'fmt-exception', 'deprecated': False},
+ 'font-exception-2.0': {'id': 'Font-exception-2.0', 'deprecated': False},
+ 'freertos-exception-2.0': {'id': 'freertos-exception-2.0', 'deprecated': False},
+ 'gcc-exception-2.0': {'id': 'GCC-exception-2.0', 'deprecated': False},
+ 'gcc-exception-2.0-note': {'id': 'GCC-exception-2.0-note', 'deprecated': False},
+ 'gcc-exception-3.1': {'id': 'GCC-exception-3.1', 'deprecated': False},
+ 'gmsh-exception': {'id': 'Gmsh-exception', 'deprecated': False},
+ 'gnat-exception': {'id': 'GNAT-exception', 'deprecated': False},
+ 'gnome-examples-exception': {'id': 'GNOME-examples-exception', 'deprecated': False},
+ 'gnu-compiler-exception': {'id': 'GNU-compiler-exception', 'deprecated': False},
+ 'gnu-javamail-exception': {'id': 'gnu-javamail-exception', 'deprecated': False},
+ 'gpl-3.0-389-ds-base-exception': {'id': 'GPL-3.0-389-ds-base-exception', 'deprecated': False},
+ 'gpl-3.0-interface-exception': {'id': 'GPL-3.0-interface-exception', 'deprecated': False},
+ 'gpl-3.0-linking-exception': {'id': 'GPL-3.0-linking-exception', 'deprecated': False},
+ 'gpl-3.0-linking-source-exception': {'id': 'GPL-3.0-linking-source-exception', 'deprecated': False},
+ 'gpl-cc-1.0': {'id': 'GPL-CC-1.0', 'deprecated': False},
+ 'gstreamer-exception-2005': {'id': 'GStreamer-exception-2005', 'deprecated': False},
+ 'gstreamer-exception-2008': {'id': 'GStreamer-exception-2008', 'deprecated': False},
+ 'harbour-exception': {'id': 'harbour-exception', 'deprecated': False},
+ 'i2p-gpl-java-exception': {'id': 'i2p-gpl-java-exception', 'deprecated': False},
+ 'independent-modules-exception': {'id': 'Independent-modules-exception', 'deprecated': False},
+ 'kicad-libraries-exception': {'id': 'KiCad-libraries-exception', 'deprecated': False},
+ 'lgpl-3.0-linking-exception': {'id': 'LGPL-3.0-linking-exception', 'deprecated': False},
+ 'libpri-openh323-exception': {'id': 'libpri-OpenH323-exception', 'deprecated': False},
+ 'libtool-exception': {'id': 'Libtool-exception', 'deprecated': False},
+ 'linux-syscall-note': {'id': 'Linux-syscall-note', 'deprecated': False},
+ 'llgpl': {'id': 'LLGPL', 'deprecated': False},
+ 'llvm-exception': {'id': 'LLVM-exception', 'deprecated': False},
+ 'lzma-exception': {'id': 'LZMA-exception', 'deprecated': False},
+ 'mif-exception': {'id': 'mif-exception', 'deprecated': False},
+ 'mxml-exception': {'id': 'mxml-exception', 'deprecated': False},
+ 'nokia-qt-exception-1.1': {'id': 'Nokia-Qt-exception-1.1', 'deprecated': True},
+ 'ocaml-lgpl-linking-exception': {'id': 'OCaml-LGPL-linking-exception', 'deprecated': False},
+ 'occt-exception-1.0': {'id': 'OCCT-exception-1.0', 'deprecated': False},
+ 'openjdk-assembly-exception-1.0': {'id': 'OpenJDK-assembly-exception-1.0', 'deprecated': False},
+ 'openvpn-openssl-exception': {'id': 'openvpn-openssl-exception', 'deprecated': False},
+ 'pcre2-exception': {'id': 'PCRE2-exception', 'deprecated': False},
+ 'polyparse-exception': {'id': 'polyparse-exception', 'deprecated': False},
+ 'ps-or-pdf-font-exception-20170817': {'id': 'PS-or-PDF-font-exception-20170817', 'deprecated': False},
+ 'qpl-1.0-inria-2004-exception': {'id': 'QPL-1.0-INRIA-2004-exception', 'deprecated': False},
+ 'qt-gpl-exception-1.0': {'id': 'Qt-GPL-exception-1.0', 'deprecated': False},
+ 'qt-lgpl-exception-1.1': {'id': 'Qt-LGPL-exception-1.1', 'deprecated': False},
+ 'qwt-exception-1.0': {'id': 'Qwt-exception-1.0', 'deprecated': False},
+ 'romic-exception': {'id': 'romic-exception', 'deprecated': False},
+ 'rrdtool-floss-exception-2.0': {'id': 'RRDtool-FLOSS-exception-2.0', 'deprecated': False},
+ 'sane-exception': {'id': 'SANE-exception', 'deprecated': False},
+ 'shl-2.0': {'id': 'SHL-2.0', 'deprecated': False},
+ 'shl-2.1': {'id': 'SHL-2.1', 'deprecated': False},
+ 'stunnel-exception': {'id': 'stunnel-exception', 'deprecated': False},
+ 'swi-exception': {'id': 'SWI-exception', 'deprecated': False},
+ 'swift-exception': {'id': 'Swift-exception', 'deprecated': False},
+ 'texinfo-exception': {'id': 'Texinfo-exception', 'deprecated': False},
+ 'u-boot-exception-2.0': {'id': 'u-boot-exception-2.0', 'deprecated': False},
+ 'ubdl-exception': {'id': 'UBDL-exception', 'deprecated': False},
+ 'universal-foss-exception-1.0': {'id': 'Universal-FOSS-exception-1.0', 'deprecated': False},
+ 'vsftpd-openssl-exception': {'id': 'vsftpd-openssl-exception', 'deprecated': False},
+ 'wxwindows-exception-3.1': {'id': 'WxWindows-exception-3.1', 'deprecated': False},
+ 'x11vnc-openssl-exception': {'id': 'x11vnc-openssl-exception', 'deprecated': False},
+}
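
The `LICENSES` and `EXCEPTIONS` tables above map lowercased SPDX identifiers to their canonical casing and a deprecation flag. Below is a minimal sketch of how such a table can be queried; the `canonical_spdx_id` helper is hypothetical and not part of `packaging` (the library's own normalization lives in `packaging.licenses`, outside this hunk), and `_spdx` is a private module imported here only for illustration.

```python
# Hedged sketch: resolve a user-supplied license id against the vendored
# LICENSES table (lowercased key -> canonical id + deprecation flag).
# canonical_spdx_id is a hypothetical helper, not a packaging API.
from packaging.licenses._spdx import LICENSES


def canonical_spdx_id(raw: str) -> str | None:
    """Return the canonically cased SPDX id, or None if unknown."""
    entry = LICENSES.get(raw.lower())
    if entry is None:
        return None
    # entry["deprecated"] is True for ids such as 'GPL-2.0'; callers may
    # want to warn and suggest the '-only' / '-or-later' replacements.
    return entry["id"]


print(canonical_spdx_id("mit"))            # 'MIT'
print(canonical_spdx_id("GPL-2.0"))        # 'GPL-2.0' (deprecated id)
print(canonical_spdx_id("not-a-license"))  # None
```
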
diff --git a/venv/lib/python3.10/site-packages/packaging/markers.py b/venv/lib/python3.10/site-packages/packaging/markers.py
new file mode 100644
index 0000000000000000000000000000000000000000..ca3706fe492f4cf0762f7734d84c2d269f88bbc5
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/markers.py
@@ -0,0 +1,388 @@
+# This file is dual licensed under the terms of the Apache License, Version
+# 2.0, and the BSD License. See the LICENSE file in the root of this repository
+# for complete details.
+
+from __future__ import annotations
+
+import operator
+import os
+import platform
+import sys
+from typing import AbstractSet, Callable, Literal, Mapping, TypedDict, Union, cast
+
+from ._parser import MarkerAtom, MarkerList, Op, Value, Variable
+from ._parser import parse_marker as _parse_marker
+from ._tokenizer import ParserSyntaxError
+from .specifiers import InvalidSpecifier, Specifier
+from .utils import canonicalize_name
+
+__all__ = [
+ "Environment",
+ "EvaluateContext",
+ "InvalidMarker",
+ "Marker",
+ "UndefinedComparison",
+ "UndefinedEnvironmentName",
+ "default_environment",
+]
+
+Operator = Callable[[str, Union[str, AbstractSet[str]]], bool]
+EvaluateContext = Literal["metadata", "lock_file", "requirement"]
+MARKERS_ALLOWING_SET = {"extras", "dependency_groups"}
+MARKERS_REQUIRING_VERSION = {
+ "implementation_version",
+ "platform_release",
+ "python_full_version",
+ "python_version",
+}
+
+
+class InvalidMarker(ValueError):
+ """
+    An invalid marker was found; users should refer to PEP 508.
+ """
+
+
+class UndefinedComparison(ValueError):
+ """
+ An invalid operation was attempted on a value that doesn't support it.
+ """
+
+
+class UndefinedEnvironmentName(ValueError):
+ """
+    A name was used that does not exist inside the
+    environment.
+ """
+
+
+class Environment(TypedDict):
+ implementation_name: str
+ """The implementation's identifier, e.g. ``'cpython'``."""
+
+ implementation_version: str
+ """
+ The implementation's version, e.g. ``'3.13.0a2'`` for CPython 3.13.0a2, or
+ ``'7.3.13'`` for PyPy3.10 v7.3.13.
+ """
+
+ os_name: str
+ """
+ The value of :py:data:`os.name`. The name of the operating system dependent module
+ imported, e.g. ``'posix'``.
+ """
+
+ platform_machine: str
+ """
+ Returns the machine type, e.g. ``'i386'``.
+
+ An empty string if the value cannot be determined.
+ """
+
+ platform_release: str
+ """
+ The system's release, e.g. ``'2.2.0'`` or ``'NT'``.
+
+ An empty string if the value cannot be determined.
+ """
+
+ platform_system: str
+ """
+ The system/OS name, e.g. ``'Linux'``, ``'Windows'`` or ``'Java'``.
+
+ An empty string if the value cannot be determined.
+ """
+
+ platform_version: str
+ """
+ The system's release version, e.g. ``'#3 on degas'``.
+
+ An empty string if the value cannot be determined.
+ """
+
+ python_full_version: str
+ """
+ The Python version as string ``'major.minor.patchlevel'``.
+
+ Note that unlike the Python :py:data:`sys.version`, this value will always include
+ the patchlevel (it defaults to 0).
+ """
+
+ platform_python_implementation: str
+ """
+ A string identifying the Python implementation, e.g. ``'CPython'``.
+ """
+
+ python_version: str
+ """The Python version as string ``'major.minor'``."""
+
+ sys_platform: str
+ """
+ This string contains a platform identifier that can be used to append
+ platform-specific components to :py:data:`sys.path`, for instance.
+
+ For Unix systems, except on Linux and AIX, this is the lowercased OS name as
+ returned by ``uname -s`` with the first part of the version as returned by
+ ``uname -r`` appended, e.g. ``'sunos5'`` or ``'freebsd8'``, at the time when Python
+ was built.
+ """
+
+
+def _normalize_extras(
+ result: MarkerList | MarkerAtom | str,
+) -> MarkerList | MarkerAtom | str:
+ if not isinstance(result, tuple):
+ return result
+
+ lhs, op, rhs = result
+ if isinstance(lhs, Variable) and lhs.value == "extra":
+ normalized_extra = canonicalize_name(rhs.value)
+ rhs = Value(normalized_extra)
+ elif isinstance(rhs, Variable) and rhs.value == "extra":
+ normalized_extra = canonicalize_name(lhs.value)
+ lhs = Value(normalized_extra)
+ return lhs, op, rhs
+
+
+def _normalize_extra_values(results: MarkerList) -> MarkerList:
+ """
+ Normalize extra values.
+ """
+
+ return [_normalize_extras(r) for r in results]
+
+
+def _format_marker(
+ marker: list[str] | MarkerAtom | str, first: bool | None = True
+) -> str:
+ assert isinstance(marker, (list, tuple, str))
+
+    # Sometimes we have a structure like [[...]] which is a single item list
+    # where the single item is itself its own list. In that case we want to
+    # skip the rest of this function so that we don't get extraneous () on
+    # the outside.
+ if (
+ isinstance(marker, list)
+ and len(marker) == 1
+ and isinstance(marker[0], (list, tuple))
+ ):
+ return _format_marker(marker[0])
+
+ if isinstance(marker, list):
+ inner = (_format_marker(m, first=False) for m in marker)
+ if first:
+ return " ".join(inner)
+ else:
+ return "(" + " ".join(inner) + ")"
+ elif isinstance(marker, tuple):
+ return " ".join([m.serialize() for m in marker])
+ else:
+ return marker
+
+
+_operators: dict[str, Operator] = {
+    "in": lambda lhs, rhs: lhs in rhs,
+    "not in": lambda lhs, rhs: lhs not in rhs,
+    "<": operator.lt,
+    "<=": operator.le,
+    "==": operator.eq,
+    "!=": operator.ne,
+    ">=": operator.ge,
+    ">": operator.gt,
+}
+
+
+def _eval_op(lhs: str, op: Op, rhs: str | AbstractSet[str], *, key: str) -> bool:
+ op_str = op.serialize()
+ if key in MARKERS_REQUIRING_VERSION:
+ try:
+ spec = Specifier(f"{op_str}{rhs}")
+ except InvalidSpecifier:
+ pass
+ else:
+ return spec.contains(lhs, prereleases=True)
+
+ oper: Operator | None = _operators.get(op_str)
+ if oper is None:
+ raise UndefinedComparison(f"Undefined {op!r} on {lhs!r} and {rhs!r}.")
+
+ return oper(lhs, rhs)
+
+
+def _normalize(
+ lhs: str, rhs: str | AbstractSet[str], key: str
+) -> tuple[str, str | AbstractSet[str]]:
+ # PEP 685 - Comparison of extra names for optional distribution dependencies
+ # https://peps.python.org/pep-0685/
+ # > When comparing extra names, tools MUST normalize the names being
+ # > compared using the semantics outlined in PEP 503 for names
+ if key == "extra":
+ assert isinstance(rhs, str), "extra value must be a string"
+ # Both sides are normalized at this point already
+ return (lhs, rhs)
+ if key in MARKERS_ALLOWING_SET:
+ if isinstance(rhs, str): # pragma: no cover
+ return (canonicalize_name(lhs), canonicalize_name(rhs))
+ else:
+ return (canonicalize_name(lhs), {canonicalize_name(v) for v in rhs})
+
+ # other environment markers don't have such standards
+ return lhs, rhs
+
+
+def _evaluate_markers(
+ markers: MarkerList, environment: dict[str, str | AbstractSet[str]]
+) -> bool:
+ groups: list[list[bool]] = [[]]
+
+ for marker in markers:
+ if isinstance(marker, list):
+ groups[-1].append(_evaluate_markers(marker, environment))
+ elif isinstance(marker, tuple):
+ lhs, op, rhs = marker
+
+ if isinstance(lhs, Variable):
+ environment_key = lhs.value
+ lhs_value = environment[environment_key]
+ rhs_value = rhs.value
+ else:
+ lhs_value = lhs.value
+ environment_key = rhs.value
+ rhs_value = environment[environment_key]
+
+ assert isinstance(lhs_value, str), "lhs must be a string"
+ lhs_value, rhs_value = _normalize(lhs_value, rhs_value, key=environment_key)
+ groups[-1].append(_eval_op(lhs_value, op, rhs_value, key=environment_key))
+ elif marker == "or":
+ groups.append([])
+ elif marker == "and":
+ pass
+ else: # pragma: nocover
+ raise TypeError(f"Unexpected marker {marker!r}")
+
+ return any(all(item) for item in groups)
+
+
+def format_full_version(info: sys._version_info) -> str:
+ version = f"{info.major}.{info.minor}.{info.micro}"
+ kind = info.releaselevel
+ if kind != "final":
+ version += kind[0] + str(info.serial)
+ return version
+
+
+def default_environment() -> Environment:
+ iver = format_full_version(sys.implementation.version)
+ implementation_name = sys.implementation.name
+ return {
+ "implementation_name": implementation_name,
+ "implementation_version": iver,
+ "os_name": os.name,
+ "platform_machine": platform.machine(),
+ "platform_release": platform.release(),
+ "platform_system": platform.system(),
+ "platform_version": platform.version(),
+ "python_full_version": platform.python_version(),
+ "platform_python_implementation": platform.python_implementation(),
+ "python_version": ".".join(platform.python_version_tuple()[:2]),
+ "sys_platform": sys.platform,
+ }
+
+
+class Marker:
+ def __init__(self, marker: str) -> None:
+ # Note: We create a Marker object without calling this constructor in
+ # packaging.requirements.Requirement. If any additional logic is
+ # added here, make sure to mirror/adapt Requirement.
+
+ # If this fails and throws an error, the repr still expects _markers to
+ # be defined.
+ self._markers: MarkerList = []
+
+ try:
+ self._markers = _normalize_extra_values(_parse_marker(marker))
+ # The attribute `_markers` can be described in terms of a recursive type:
+ # MarkerList = List[Union[Tuple[Node, ...], str, MarkerList]]
+ #
+ # For example, the following expression:
+            # python_version > "3.6" and (python_version == "3.6" or os_name == "unix")
+            #
+            # is parsed into:
+            # [
+            #     (<Variable('python_version')>, <Op('>')>, <Value('3.6')>),
+            #     'and',
+            #     [
+            #         (<Variable('python_version')>, <Op('==')>, <Value('3.6')>),
+            #         'or',
+            #         (<Variable('os_name')>, <Op('==')>, <Value('unix')>)
+            #     ]
+            # ]
+ except ParserSyntaxError as e:
+ raise InvalidMarker(str(e)) from e
+
+ def __str__(self) -> str:
+ return _format_marker(self._markers)
+
+ def __repr__(self) -> str:
+ return f"<{self.__class__.__name__}('{self}')>"
+
+ def __hash__(self) -> int:
+ return hash(str(self))
+
+ def __eq__(self, other: object) -> bool:
+ if not isinstance(other, Marker):
+ return NotImplemented
+
+ return str(self) == str(other)
+
+ def evaluate(
+ self,
+ environment: Mapping[str, str | AbstractSet[str]] | None = None,
+ context: EvaluateContext = "metadata",
+ ) -> bool:
+ """Evaluate a marker.
+
+ Return the boolean from evaluating the given marker against the
+ environment. environment is an optional argument to override all or
+ part of the determined environment. The *context* parameter specifies what
+ context the markers are being evaluated for, which influences what markers
+ are considered valid. Acceptable values are "metadata" (for core metadata;
+ default), "lock_file", and "requirement" (i.e. all other situations).
+
+ The environment is determined from the current Python process.
+ """
+ current_environment = cast(
+ "dict[str, str | AbstractSet[str]]", default_environment()
+ )
+ if context == "lock_file":
+ current_environment.update(
+ extras=frozenset(), dependency_groups=frozenset()
+ )
+ elif context == "metadata":
+ current_environment["extra"] = ""
+
+ if environment is not None:
+ current_environment.update(environment)
+ if "extra" in current_environment:
+ # The API used to allow setting extra to None. We need to handle
+ # this case for backwards compatibility. Also skip running
+ # normalize name if extra is empty.
+ extra = cast("str | None", current_environment["extra"])
+ current_environment["extra"] = canonicalize_name(extra) if extra else ""
+
+ return _evaluate_markers(
+ self._markers, _repair_python_full_version(current_environment)
+ )
+
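+# Illustrative usage (editor's addition):
+#
+#     m = Marker('python_version >= "3.8" and sys_platform == "linux"')
+#     m.evaluate()                           # uses the current interpreter
+#     m.evaluate({"sys_platform": "win32"})  # override part of the environment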
+
+def _repair_python_full_version(
+ env: dict[str, str | AbstractSet[str]],
+) -> dict[str, str | AbstractSet[str]]:
+ """
+ Work around platform.python_version() returning something that is not PEP 440
+ compliant for non-tagged Python builds.
+ """
+ python_full_version = cast("str", env["python_full_version"])
+ if python_full_version.endswith("+"):
+ env["python_full_version"] = f"{python_full_version}local"
+ return env
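+
+# Example (editor's addition): a non-tagged CPython build can report
+# platform.python_version() as e.g. "3.11.0+"; appending "local" yields
+# "3.11.0+local", which is a valid PEP 440 local version.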
diff --git a/venv/lib/python3.10/site-packages/packaging/metadata.py b/venv/lib/python3.10/site-packages/packaging/metadata.py
new file mode 100644
index 0000000000000000000000000000000000000000..253f6b1b7ebd711fdc6bbbab3b56897061bab515
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/metadata.py
@@ -0,0 +1,978 @@
+from __future__ import annotations
+
+import email.feedparser
+import email.header
+import email.message
+import email.parser
+import email.policy
+import keyword
+import pathlib
+import sys
+import typing
+from typing import (
+ Any,
+ Callable,
+ Generic,
+ Literal,
+ TypedDict,
+ cast,
+)
+
+from . import licenses, requirements, specifiers, utils
+from . import version as version_module
+
+if typing.TYPE_CHECKING:
+ from .licenses import NormalizedLicenseExpression
+
+T = typing.TypeVar("T")
+
+
+if sys.version_info >= (3, 11): # pragma: no cover
+ ExceptionGroup = ExceptionGroup # noqa: F821
+else: # pragma: no cover
+
+ class ExceptionGroup(Exception):
+ """A minimal implementation of :external:exc:`ExceptionGroup` from Python 3.11.
+
+ If :external:exc:`ExceptionGroup` is already defined by Python itself,
+ that version is used instead.
+ """
+
+ message: str
+ exceptions: list[Exception]
+
+ def __init__(self, message: str, exceptions: list[Exception]) -> None:
+ self.message = message
+ self.exceptions = exceptions
+
+ def __repr__(self) -> str:
+ return f"{self.__class__.__name__}({self.message!r}, {self.exceptions!r})"
+
+
+class InvalidMetadata(ValueError):
+ """A metadata field contains invalid data."""
+
+ field: str
+ """The name of the field that contains invalid data."""
+
+ def __init__(self, field: str, message: str) -> None:
+ self.field = field
+ super().__init__(message)
+
+
+# The RawMetadata class attempts to make as few assumptions about the underlying
+# serialization formats as possible. The idea is that as long as a serialization
+# format offers some very basic primitives in *some* way, then we can support
+# serializing to and from that format.
+class RawMetadata(TypedDict, total=False):
+ """A dictionary of raw core metadata.
+
+ Each field in core metadata maps to a key of this dictionary (when data is
+ provided). The key is lower-case and underscores are used instead of dashes
+ compared to the equivalent core metadata field. Any core metadata field that
+ can be specified multiple times or can hold multiple values in a single
+ field has a key with a plural name. See :class:`Metadata`, whose attributes
+ match the keys of this dictionary.
+
+ Core metadata fields that can be specified multiple times are stored as a
+ list or dict depending on which is appropriate for the field. Any fields
+ which hold multiple values in a single field are stored as a list.
+
+ """
+
+ # Metadata 1.0 - PEP 241
+ metadata_version: str
+ name: str
+ version: str
+ platforms: list[str]
+ summary: str
+ description: str
+ keywords: list[str]
+ home_page: str
+ author: str
+ author_email: str
+ license: str
+
+ # Metadata 1.1 - PEP 314
+ supported_platforms: list[str]
+ download_url: str
+ classifiers: list[str]
+ requires: list[str]
+ provides: list[str]
+ obsoletes: list[str]
+
+ # Metadata 1.2 - PEP 345
+ maintainer: str
+ maintainer_email: str
+ requires_dist: list[str]
+ provides_dist: list[str]
+ obsoletes_dist: list[str]
+ requires_python: str
+ requires_external: list[str]
+ project_urls: dict[str, str]
+
+ # Metadata 2.0
+ # PEP 426 attempted to completely revamp the metadata format
+ # but got stuck without ever being able to build consensus on
+ # it and ultimately ended up withdrawn.
+ #
+ # However, a number of tools had started emitting METADATA with
+ # `2.0` Metadata-Version, so for historical reasons, this version
+ # was skipped.
+
+ # Metadata 2.1 - PEP 566
+ description_content_type: str
+ provides_extra: list[str]
+
+ # Metadata 2.2 - PEP 643
+ dynamic: list[str]
+
+ # Metadata 2.3 - PEP 685
+ # No new fields were added in PEP 685; just some edge cases were
+ # tightened up to provide better interoperability.
+
+ # Metadata 2.4 - PEP 639
+ license_expression: str
+ license_files: list[str]
+
+ # Metadata 2.5 - PEP 794
+ import_names: list[str]
+ import_namespaces: list[str]
+
+
+# 'keywords' is special as it's a string in the core metadata spec, but we
+# represent it as a list.
+_STRING_FIELDS = {
+ "author",
+ "author_email",
+ "description",
+ "description_content_type",
+ "download_url",
+ "home_page",
+ "license",
+ "license_expression",
+ "maintainer",
+ "maintainer_email",
+ "metadata_version",
+ "name",
+ "requires_python",
+ "summary",
+ "version",
+}
+
+_LIST_FIELDS = {
+ "classifiers",
+ "dynamic",
+ "license_files",
+ "obsoletes",
+ "obsoletes_dist",
+ "platforms",
+ "provides",
+ "provides_dist",
+ "provides_extra",
+ "requires",
+ "requires_dist",
+ "requires_external",
+ "supported_platforms",
+ "import_names",
+ "import_namespaces",
+}
+
+_DICT_FIELDS = {
+ "project_urls",
+}
+
+
+def _parse_keywords(data: str) -> list[str]:
+ """Split a string of comma-separated keywords into a list of keywords."""
+ return [k.strip() for k in data.split(",")]
+
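+# Example (editor's addition): _parse_keywords("markdown, parser , cli")
+# returns ["markdown", "parser", "cli"].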
+
+def _parse_project_urls(data: list[str]) -> dict[str, str]:
+ """Parse a list of label/URL string pairings separated by a comma."""
+ urls = {}
+ for pair in data:
+ # Our logic is slightly tricky here as we want to try and do
+ # *something* reasonable with malformed data.
+ #
+ # The main thing that we have to worry about, is data that does
+ # not have a ',' at all to split the label from the Value. There
+ # isn't a singular right answer here, and we will fail validation
+ # later on (if the caller is validating) so it doesn't *really*
+ # matter, but since the missing value has to be an empty str
+ # and our return value is dict[str, str], if we let the key
+ # be the missing value, then they'd have multiple '' values that
+ # overwrite each other in an accumulating dict.
+ #
+ # The other potential issue is that it's possible to have the
+ # same label multiple times in the metadata, with no solid "right"
+ # answer with what to do in that case. As such, we'll do the only
+ # thing we can, which is treat the field as unparsable and add it
+ # to our list of unparsed fields.
+ #
+ # TODO: The spec doesn't say anything about whether the keys should be
+ #       considered case-sensitive or not... logically they should
+ # be case-preserving and case-insensitive, but doing that
+ # would open up more cases where we might have duplicate
+ # entries.
+ label, _, url = (s.strip() for s in pair.partition(","))
+
+ if label in urls:
+ # The label already exists in our set of urls, so this field
+ # is unparsable, and we can just add the whole thing to our
+ # unparsable data and stop processing it.
+ raise KeyError("duplicate labels in project urls")
+ urls[label] = url
+
+ return urls
+
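+# Example (editor's addition):
+#     _parse_project_urls(["Homepage, https://example.org",
+#                          "Docs, https://example.org/docs"])
+# returns {"Homepage": "https://example.org",
+#          "Docs": "https://example.org/docs"}; a repeated label raises
+# KeyError, which parse_email() below treats as unparsable data.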
+
+def _get_payload(msg: email.message.Message, source: bytes | str) -> str:
+ """Get the body of the message."""
+ # If our source is a str, then our caller has managed encodings for us,
+ # and we don't need to deal with it.
+ if isinstance(source, str):
+ payload = msg.get_payload()
+ assert isinstance(payload, str)
+ return payload
+ # If our source is a bytes, then we're managing the encoding and we need
+ # to deal with it.
+ else:
+ bpayload = msg.get_payload(decode=True)
+ assert isinstance(bpayload, bytes)
+ try:
+ return bpayload.decode("utf8", "strict")
+ except UnicodeDecodeError as exc:
+ raise ValueError("payload in an invalid encoding") from exc
+
+
+# The various parse_FORMAT functions here are intended to be as lenient as
+# possible in their parsing, while still returning a correctly typed
+# RawMetadata.
+#
+# To aid in this, we also generally want to do as little touching of the
+# data as possible, except where there are possibly some historic holdovers
+# that make valid data awkward to work with.
+#
+# While this is a lower-level, intermediate format than our ``Metadata``
+# class, some light touch-ups can make a massive difference in usability.
+
+# Map METADATA fields to RawMetadata.
+_EMAIL_TO_RAW_MAPPING = {
+ "author": "author",
+ "author-email": "author_email",
+ "classifier": "classifiers",
+ "description": "description",
+ "description-content-type": "description_content_type",
+ "download-url": "download_url",
+ "dynamic": "dynamic",
+ "home-page": "home_page",
+ "import-name": "import_names",
+ "import-namespace": "import_namespaces",
+ "keywords": "keywords",
+ "license": "license",
+ "license-expression": "license_expression",
+ "license-file": "license_files",
+ "maintainer": "maintainer",
+ "maintainer-email": "maintainer_email",
+ "metadata-version": "metadata_version",
+ "name": "name",
+ "obsoletes": "obsoletes",
+ "obsoletes-dist": "obsoletes_dist",
+ "platform": "platforms",
+ "project-url": "project_urls",
+ "provides": "provides",
+ "provides-dist": "provides_dist",
+ "provides-extra": "provides_extra",
+ "requires": "requires",
+ "requires-dist": "requires_dist",
+ "requires-external": "requires_external",
+ "requires-python": "requires_python",
+ "summary": "summary",
+ "supported-platform": "supported_platforms",
+ "version": "version",
+}
+_RAW_TO_EMAIL_MAPPING = {raw: email for email, raw in _EMAIL_TO_RAW_MAPPING.items()}
+
+
+# This class is for writing RFC822 messages
+class RFC822Policy(email.policy.EmailPolicy):
+ """
+ This is :class:`email.policy.EmailPolicy`, but with a simple ``header_store_parse``
+ implementation that handles multi-line values, and some nice defaults.
+ """
+
+ utf8 = True
+ mangle_from_ = False
+ max_line_length = 0
+
+ def header_store_parse(self, name: str, value: str) -> tuple[str, str]:
+ size = len(name) + 2
+ value = value.replace("\n", "\n" + " " * size)
+ return (name, value)
+
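+# Example (editor's addition): storing ("Author", "A\nB") through this policy
+# indents the continuation line by len("Author") + 2 spaces, so the header
+# serializes as "Author: A" followed by a line of eight spaces and "B".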
+
+# This class is for writing RFC822 messages
+class RFC822Message(email.message.EmailMessage):
+ """
+ This is :class:`email.message.EmailMessage` with two small changes: it defaults to
+ our `RFC822Policy`, and it correctly writes unicode when being called
+ with `bytes()`.
+ """
+
+ def __init__(self) -> None:
+ super().__init__(policy=RFC822Policy())
+
+ def as_bytes(
+ self, unixfrom: bool = False, policy: email.policy.Policy | None = None
+ ) -> bytes:
+ """
+ Return the bytes representation of the message.
+
+ This handles unicode encoding.
+ """
+ return self.as_string(unixfrom, policy=policy).encode("utf-8")
+
+
+def parse_email(data: bytes | str) -> tuple[RawMetadata, dict[str, list[str]]]:
+ """Parse a distribution's metadata stored as email headers (e.g. from ``METADATA``).
+
+ This function returns a two-item tuple of dicts. The first dict is of
+ recognized fields from the core metadata specification. Fields that can be
+ parsed and translated into Python's built-in types are converted
+ appropriately. All other fields are left as-is. Fields that are allowed to
+ appear multiple times are stored as lists.
+
+ The second dict contains all other fields from the metadata. This includes
+ any unrecognized fields. It also includes any fields which are expected to
+ be parsed into a built-in type but were not formatted appropriately. Finally,
+ any fields that are expected to appear only once but are repeated are
+ included in this dict.
+
+ """
+ raw: dict[str, str | list[str] | dict[str, str]] = {}
+ unparsed: dict[str, list[str]] = {}
+
+ if isinstance(data, str):
+ parsed = email.parser.Parser(policy=email.policy.compat32).parsestr(data)
+ else:
+ parsed = email.parser.BytesParser(policy=email.policy.compat32).parsebytes(data)
+
+ # We have to wrap parsed.keys() in a frozenset, because when a key has
+ # multiple values it appears multiple times in the list of keys;
+ # deduplicating here means each header name is processed once, with
+ # get_all() below fetching every value for it.
+ for name_with_case in frozenset(parsed.keys()):
+ # Header names are case-insensitive per the RFC, so we'll normalize them
+ # to lower case to make comparisons easier.
+ name = name_with_case.lower()
+
+ # We use get_all() here, even for fields that aren't multiple use,
+ # because otherwise someone could have e.g. two Name fields, and we
+ # would just silently ignore it rather than doing something about it.
+ headers = parsed.get_all(name) or []
+
+ # The way the email module works when parsing bytes is that it
+ # unconditionally decodes the bytes as ascii using the surrogateescape
+ # handler. When you pull that data back out (such as with get_all()),
+ # it looks to see if the str has any surrogate escapes, and if it does
+ # it wraps it in a Header object instead of returning the string.
+ #
+ # As such, we'll look for those Header objects, and fix up the encoding.
+ value = []
+ # Flag if we have run into any issues processing the headers, thus
+ # signalling that the data belongs in 'unparsed'.
+ valid_encoding = True
+ for h in headers:
+ # It's unclear if this can return more types than just a Header or
+ # a str, so we'll just assert here to make sure.
+ assert isinstance(h, (email.header.Header, str))
+
+ # If it's a header object, we need to do our little dance to get
+ # the real data out of it. In cases where there is invalid data
+ # we're going to end up with mojibake, but there's no obvious, good
+ # way around that without reimplementing parts of the Header object
+ # ourselves.
+ #
+ # That should be fine since, if mojibake happens, this key is
+ # going into the unparsed dict anyway.
+ if isinstance(h, email.header.Header):
+ # The Header object stores it's data as chunks, and each chunk
+ # can be independently encoded, so we'll need to check each
+ # of them.
+ chunks: list[tuple[bytes, str | None]] = []
+ for binary, _encoding in email.header.decode_header(h):
+ try:
+ binary.decode("utf8", "strict")
+ except UnicodeDecodeError:
+ # Enable mojibake.
+ encoding = "latin1"
+ valid_encoding = False
+ else:
+ encoding = "utf8"
+ chunks.append((binary, encoding))
+
+ # Turn our chunks back into a Header object, then let that
+ # Header object do the right thing to turn them into a
+ # string for us.
+ value.append(str(email.header.make_header(chunks)))
+ # This is already a string, so just add it.
+ else:
+ value.append(h)
+
+ # We've processed all of our values to get them into a list of str,
+ # but we may have mojibake data, in which case this is an unparsed
+ # field.
+ if not valid_encoding:
+ unparsed[name] = value
+ continue
+
+ raw_name = _EMAIL_TO_RAW_MAPPING.get(name)
+ if raw_name is None:
+ # This is a bit of a weird situation: we've encountered a key that
+ # we don't recognize, so we don't know whether it's meant
+ # to be a list or not.
+ #
+ # Since we can't really tell one way or another, we'll just leave it
+ # as a list, even though it may be a single-item list, because that's
+ # what makes the most sense for email headers.
+ unparsed[name] = value
+ continue
+
+ # If this is one of our string fields, then we'll check to see if our
+ # value is a list of a single item. If it is then we'll assume that
+ # it was emitted as a single string, and unwrap the str from inside
+ # the list.
+ #
+ # If it's any other kind of data, then we haven't the faintest clue
+ # what we should parse it as, and we have to just add it to our list
+ # of unparsed stuff.
+ if raw_name in _STRING_FIELDS and len(value) == 1:
+ raw[raw_name] = value[0]
+ # If this is import_names, we need to special case the empty field
+ # case, which converts to an empty list instead of None. We can't let
+ # the empty case slip through, as it will fail validation.
+ elif raw_name == "import_names" and value == [""]:
+ raw[raw_name] = []
+ # If this is one of our list of string fields, then we can just assign
+ # the value, since email *only* has strings, and our get_all() call
+ # above ensures that this is a list.
+ elif raw_name in _LIST_FIELDS:
+ raw[raw_name] = value
+ # Special Case: Keywords
+ # The keywords field is implemented in the metadata spec as a str,
+ # but it conceptually is a list of strings, and is serialized using
+ # ", ".join(keywords), so we'll do some light data massaging to turn
+ # this into what it logically is.
+ elif raw_name == "keywords" and len(value) == 1:
+ raw[raw_name] = _parse_keywords(value[0])
+ # Special Case: Project-URL
+ # The project urls field is implemented in the metadata spec as a list of
+ # specially-formatted strings that represent a key and a value, which
+ # is fundamentally a mapping, however the email format doesn't support
+ # mappings in a sane way, so it was crammed into a list of strings
+ # instead.
+ #
+ # We will do a little light data massaging to turn this into a map as
+ # it logically should be.
+ elif raw_name == "project_urls":
+ try:
+ raw[raw_name] = _parse_project_urls(value)
+ except KeyError:
+ unparsed[name] = value
+ # Nothing that we've done has managed to parse this, so it'll just
+ # throw it in our unparsable data and move on.
+ else:
+ unparsed[name] = value
+
+ # We need to support getting the Description from the message payload in
+ # addition to getting it from the headers. This does mean, though, there
+ # is the possibility of it being set both ways, in which case we put both
+ # in 'unparsed' since we don't know which is right.
+ try:
+ payload = _get_payload(parsed, data)
+ except ValueError:
+ unparsed.setdefault("description", []).append(
+ parsed.get_payload(decode=isinstance(data, bytes)) # type: ignore[call-overload]
+ )
+ else:
+ if payload:
+ # Check to see if we've already got a description, if so then both
+ # it, and this body move to unparsable.
+ if "description" in raw:
+ description_header = cast("str", raw.pop("description"))
+ unparsed.setdefault("description", []).extend(
+ [description_header, payload]
+ )
+ elif "description" in unparsed:
+ unparsed["description"].append(payload)
+ else:
+ raw["description"] = payload
+
+ # We need to cast our `raw` to RawMetadata because a TypedDict only supports
+ # literal key names while we compute ours dynamically; however, the way this
+ # function is implemented, `raw` can only end up containing valid key
+ # names.
+ return cast("RawMetadata", raw), unparsed
+
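+# Illustrative usage (editor's addition; the metadata below is made up):
+#
+#     raw, unparsed = parse_email(
+#         "Metadata-Version: 2.1\nName: sampleproject\nVersion: 1.0\n"
+#         "Classifier: Programming Language :: Python\n"
+#     )
+#     raw["name"]         == "sampleproject"
+#     raw["classifiers"]  == ["Programming Language :: Python"]
+#     unparsed            == {}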
+
+_NOT_FOUND = object()
+
+
+# Keep the two values in sync.
+_VALID_METADATA_VERSIONS = ["1.0", "1.1", "1.2", "2.1", "2.2", "2.3", "2.4", "2.5"]
+_MetadataVersion = Literal["1.0", "1.1", "1.2", "2.1", "2.2", "2.3", "2.4", "2.5"]
+
+_REQUIRED_ATTRS = frozenset(["metadata_version", "name", "version"])
+
+
+class _Validator(Generic[T]):
+ """Validate a metadata field.
+
+ All _process_*() methods correspond to a core metadata field. The method is
+ called with the field's raw value. If the raw value is valid it is returned
+ in its "enriched" form (e.g. ``version.Version`` for the ``Version`` field).
+ If the raw value is invalid, :exc:`InvalidMetadata` is raised (with a cause
+ as appropriate).
+ """
+
+ name: str
+ raw_name: str
+ added: _MetadataVersion
+
+ def __init__(
+ self,
+ *,
+ added: _MetadataVersion = "1.0",
+ ) -> None:
+ self.added = added
+
+ def __set_name__(self, _owner: Metadata, name: str) -> None:
+ self.name = name
+ self.raw_name = _RAW_TO_EMAIL_MAPPING[name]
+
+ def __get__(self, instance: Metadata, _owner: type[Metadata]) -> T:
+ # With Python 3.8, the caching can be replaced with functools.cached_property().
+ # No need to check the cache as attribute lookup will resolve into the
+ # instance's __dict__ before __get__ is called.
+ cache = instance.__dict__
+ value = instance._raw.get(self.name)
+
+ # To make the _process_* methods easier, we'll check if the value is None
+ # and if this field is NOT a required attribute, and if both of those
+ # things are true, we'll skip the converter. This will mean that the
+ # converters never have to deal with the None union.
+ if self.name in _REQUIRED_ATTRS or value is not None:
+ try:
+ converter: Callable[[Any], T] = getattr(self, f"_process_{self.name}")
+ except AttributeError:
+ pass
+ else:
+ value = converter(value)
+
+ cache[self.name] = value
+ try:
+ del instance._raw[self.name] # type: ignore[misc]
+ except KeyError:
+ pass
+
+ return cast("T", value)
+
+ def _invalid_metadata(
+ self, msg: str, cause: Exception | None = None
+ ) -> InvalidMetadata:
+ exc = InvalidMetadata(
+ self.raw_name, msg.format_map({"field": repr(self.raw_name)})
+ )
+ exc.__cause__ = cause
+ return exc
+
+ def _process_metadata_version(self, value: str) -> _MetadataVersion:
+ # Implicitly makes Metadata-Version required.
+ if value not in _VALID_METADATA_VERSIONS:
+ raise self._invalid_metadata(f"{value!r} is not a valid metadata version")
+ return cast("_MetadataVersion", value)
+
+ def _process_name(self, value: str) -> str:
+ if not value:
+ raise self._invalid_metadata("{field} is a required field")
+ # Validate the name as a side-effect.
+ try:
+ utils.canonicalize_name(value, validate=True)
+ except utils.InvalidName as exc:
+ raise self._invalid_metadata(
+ f"{value!r} is invalid for {{field}}", cause=exc
+ ) from exc
+ else:
+ return value
+
+ def _process_version(self, value: str) -> version_module.Version:
+ if not value:
+ raise self._invalid_metadata("{field} is a required field")
+ try:
+ return version_module.parse(value)
+ except version_module.InvalidVersion as exc:
+ raise self._invalid_metadata(
+ f"{value!r} is invalid for {{field}}", cause=exc
+ ) from exc
+
+ def _process_summary(self, value: str) -> str:
+ """Check the field contains no newlines."""
+ if "\n" in value:
+ raise self._invalid_metadata("{field} must be a single line")
+ return value
+
+ def _process_description_content_type(self, value: str) -> str:
+ content_types = {"text/plain", "text/x-rst", "text/markdown"}
+ message = email.message.EmailMessage()
+ message["content-type"] = value
+
+ content_type, parameters = (
+ # Defaults to `text/plain` if parsing failed.
+ message.get_content_type().lower(),
+ message["content-type"].params,
+ )
+ # Check if content-type is valid or defaulted to `text/plain` and thus was
+ # not parseable.
+ if content_type not in content_types or content_type not in value.lower():
+ raise self._invalid_metadata(
+ f"{{field}} must be one of {list(content_types)}, not {value!r}"
+ )
+
+ charset = parameters.get("charset", "UTF-8")
+ if charset != "UTF-8":
+ raise self._invalid_metadata(
+ f"{{field}} can only specify the UTF-8 charset, not {list(charset)}"
+ )
+
+ markdown_variants = {"GFM", "CommonMark"}
+ variant = parameters.get("variant", "GFM") # Use an acceptable default.
+ if content_type == "text/markdown" and variant not in markdown_variants:
+ raise self._invalid_metadata(
+ f"valid Markdown variants for {{field}} are {list(markdown_variants)}, "
+ f"not {variant!r}",
+ )
+ return value
+
+ def _process_dynamic(self, value: list[str]) -> list[str]:
+ for dynamic_field in map(str.lower, value):
+ if dynamic_field in {"name", "version", "metadata-version"}:
+ raise self._invalid_metadata(
+ f"{dynamic_field!r} is not allowed as a dynamic field"
+ )
+ elif dynamic_field not in _EMAIL_TO_RAW_MAPPING:
+ raise self._invalid_metadata(
+ f"{dynamic_field!r} is not a valid dynamic field"
+ )
+ return list(map(str.lower, value))
+
+ def _process_provides_extra(
+ self,
+ value: list[str],
+ ) -> list[utils.NormalizedName]:
+ normalized_names = []
+ try:
+ for name in value:
+ normalized_names.append(utils.canonicalize_name(name, validate=True))
+ except utils.InvalidName as exc:
+ raise self._invalid_metadata(
+ f"{name!r} is invalid for {{field}}", cause=exc
+ ) from exc
+ else:
+ return normalized_names
+
+ def _process_requires_python(self, value: str) -> specifiers.SpecifierSet:
+ try:
+ return specifiers.SpecifierSet(value)
+ except specifiers.InvalidSpecifier as exc:
+ raise self._invalid_metadata(
+ f"{value!r} is invalid for {{field}}", cause=exc
+ ) from exc
+
+ def _process_requires_dist(
+ self,
+ value: list[str],
+ ) -> list[requirements.Requirement]:
+ reqs = []
+ try:
+ for req in value:
+ reqs.append(requirements.Requirement(req))
+ except requirements.InvalidRequirement as exc:
+ raise self._invalid_metadata(
+ f"{req!r} is invalid for {{field}}", cause=exc
+ ) from exc
+ else:
+ return reqs
+
+ def _process_license_expression(self, value: str) -> NormalizedLicenseExpression:
+ try:
+ return licenses.canonicalize_license_expression(value)
+ except ValueError as exc:
+ raise self._invalid_metadata(
+ f"{value!r} is invalid for {{field}}", cause=exc
+ ) from exc
+
+ def _process_license_files(self, value: list[str]) -> list[str]:
+ paths = []
+ for path in value:
+ if ".." in path:
+ raise self._invalid_metadata(
+ f"{path!r} is invalid for {{field}}, "
+ "parent directory indicators are not allowed"
+ )
+ if "*" in path:
+ raise self._invalid_metadata(
+ f"{path!r} is invalid for {{field}}, paths must be resolved"
+ )
+ if (
+ pathlib.PurePosixPath(path).is_absolute()
+ or pathlib.PureWindowsPath(path).is_absolute()
+ ):
+ raise self._invalid_metadata(
+ f"{path!r} is invalid for {{field}}, paths must be relative"
+ )
+ if pathlib.PureWindowsPath(path).as_posix() != path:
+ raise self._invalid_metadata(
+ f"{path!r} is invalid for {{field}}, paths must use '/' delimiter"
+ )
+ paths.append(path)
+ return paths
+
+ def _process_import_names(self, value: list[str]) -> list[str]:
+ for import_name in value:
+ name, semicolon, private = import_name.partition(";")
+ name = name.rstrip()
+ for identifier in name.split("."):
+ if not identifier.isidentifier():
+ raise self._invalid_metadata(
+ f"{name!r} is invalid for {{field}}; "
+ f"{identifier!r} is not a valid identifier"
+ )
+ elif keyword.iskeyword(identifier):
+ raise self._invalid_metadata(
+ f"{name!r} is invalid for {{field}}; "
+ f"{identifier!r} is a keyword"
+ )
+ if semicolon and private.lstrip() != "private":
+ raise self._invalid_metadata(
+ f"{import_name!r} is invalid for {{field}}; "
+ "the only valid option is 'private'"
+ )
+ return value
+
+ _process_import_namespaces = _process_import_names
+
+
+class Metadata:
+ """Representation of distribution metadata.
+
+ Compared to :class:`RawMetadata`, this class provides objects representing
+ metadata fields instead of only using built-in types. Any invalid metadata
+ will cause :exc:`InvalidMetadata` to be raised (with a
+ :py:attr:`~BaseException.__cause__` attribute as appropriate).
+ """
+
+ _raw: RawMetadata
+
+ @classmethod
+ def from_raw(cls, data: RawMetadata, *, validate: bool = True) -> Metadata:
+ """Create an instance from :class:`RawMetadata`.
+
+ If *validate* is true, all metadata will be validated. All exceptions
+ related to validation will be gathered and raised as an :class:`ExceptionGroup`.
+ """
+ ins = cls()
+ ins._raw = data.copy() # Mutations occur due to caching enriched values.
+
+ if validate:
+ exceptions: list[Exception] = []
+ try:
+ metadata_version = ins.metadata_version
+ metadata_age = _VALID_METADATA_VERSIONS.index(metadata_version)
+ except InvalidMetadata as metadata_version_exc:
+ exceptions.append(metadata_version_exc)
+ metadata_version = None
+
+ # Make sure to check both the fields that are present and the required
+ # fields (so the absence of a required field can be reported).
+ fields_to_check = frozenset(ins._raw) | _REQUIRED_ATTRS
+ # Remove fields that have already been checked.
+ fields_to_check -= {"metadata_version"}
+
+ for key in fields_to_check:
+ try:
+ if metadata_version:
+ # Can't use getattr() as that triggers descriptor protocol which
+ # will fail due to no value for the instance argument.
+ try:
+ field_metadata_version = cls.__dict__[key].added
+ except KeyError:
+ exc = InvalidMetadata(key, f"unrecognized field: {key!r}")
+ exceptions.append(exc)
+ continue
+ field_age = _VALID_METADATA_VERSIONS.index(
+ field_metadata_version
+ )
+ if field_age > metadata_age:
+ field = _RAW_TO_EMAIL_MAPPING[key]
+ exc = InvalidMetadata(
+ field,
+ f"{field} introduced in metadata version "
+ f"{field_metadata_version}, not {metadata_version}",
+ )
+ exceptions.append(exc)
+ continue
+ getattr(ins, key)
+ except InvalidMetadata as exc:
+ exceptions.append(exc)
+
+ if exceptions:
+ raise ExceptionGroup("invalid metadata", exceptions)
+
+ return ins
+
+ @classmethod
+ def from_email(cls, data: bytes | str, *, validate: bool = True) -> Metadata:
+ """Parse metadata from email headers.
+
+ If *validate* is true, the metadata will be validated. All exceptions
+ related to validation will be gathered and raised as an :class:`ExceptionGroup`.
+ """
+ raw, unparsed = parse_email(data)
+
+ if validate:
+ exceptions: list[Exception] = []
+ for unparsed_key in unparsed:
+ if unparsed_key in _EMAIL_TO_RAW_MAPPING:
+ message = f"{unparsed_key!r} has invalid data"
+ else:
+ message = f"unrecognized field: {unparsed_key!r}"
+ exceptions.append(InvalidMetadata(unparsed_key, message))
+
+ if exceptions:
+ raise ExceptionGroup("unparsed", exceptions)
+
+ try:
+ return cls.from_raw(raw, validate=validate)
+ except ExceptionGroup as exc_group:
+ raise ExceptionGroup(
+ "invalid or unparsed metadata", exc_group.exceptions
+ ) from None
+
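+ # Illustrative usage (editor's addition; the metadata below is made up):
+ #
+ #     meta = Metadata.from_email(
+ #         "Metadata-Version: 2.1\nName: sampleproject\nVersion: 1.0\n"
+ #     )
+ #     meta.name     == "sampleproject"
+ #     meta.version  == Version("1.0")
+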
+ metadata_version: _Validator[_MetadataVersion] = _Validator()
+ """:external:ref:`core-metadata-metadata-version`
+ (required; validated to be a valid metadata version)"""
+ # `name` is not normalized/typed to NormalizedName so as to provide access to
+ # the original/raw name.
+ name: _Validator[str] = _Validator()
+ """:external:ref:`core-metadata-name`
+ (required; validated using :func:`~packaging.utils.canonicalize_name` and its
+ *validate* parameter)"""
+ version: _Validator[version_module.Version] = _Validator()
+ """:external:ref:`core-metadata-version` (required)"""
+ dynamic: _Validator[list[str] | None] = _Validator(
+ added="2.2",
+ )
+ """:external:ref:`core-metadata-dynamic`
+ (validated against core metadata field names and lowercased)"""
+ platforms: _Validator[list[str] | None] = _Validator()
+ """:external:ref:`core-metadata-platform`"""
+ supported_platforms: _Validator[list[str] | None] = _Validator(added="1.1")
+ """:external:ref:`core-metadata-supported-platform`"""
+ summary: _Validator[str | None] = _Validator()
+ """:external:ref:`core-metadata-summary` (validated to contain no newlines)"""
+ description: _Validator[str | None] = _Validator() # TODO 2.1: can be in body
+ """:external:ref:`core-metadata-description`"""
+ description_content_type: _Validator[str | None] = _Validator(added="2.1")
+ """:external:ref:`core-metadata-description-content-type` (validated)"""
+ keywords: _Validator[list[str] | None] = _Validator()
+ """:external:ref:`core-metadata-keywords`"""
+ home_page: _Validator[str | None] = _Validator()
+ """:external:ref:`core-metadata-home-page`"""
+ download_url: _Validator[str | None] = _Validator(added="1.1")
+ """:external:ref:`core-metadata-download-url`"""
+ author: _Validator[str | None] = _Validator()
+ """:external:ref:`core-metadata-author`"""
+ author_email: _Validator[str | None] = _Validator()
+ """:external:ref:`core-metadata-author-email`"""
+ maintainer: _Validator[str | None] = _Validator(added="1.2")
+ """:external:ref:`core-metadata-maintainer`"""
+ maintainer_email: _Validator[str | None] = _Validator(added="1.2")
+ """:external:ref:`core-metadata-maintainer-email`"""
+ license: _Validator[str | None] = _Validator()
+ """:external:ref:`core-metadata-license`"""
+ license_expression: _Validator[NormalizedLicenseExpression | None] = _Validator(
+ added="2.4"
+ )
+ """:external:ref:`core-metadata-license-expression`"""
+ license_files: _Validator[list[str] | None] = _Validator(added="2.4")
+ """:external:ref:`core-metadata-license-file`"""
+ classifiers: _Validator[list[str] | None] = _Validator(added="1.1")
+ """:external:ref:`core-metadata-classifier`"""
+ requires_dist: _Validator[list[requirements.Requirement] | None] = _Validator(
+ added="1.2"
+ )
+ """:external:ref:`core-metadata-requires-dist`"""
+ requires_python: _Validator[specifiers.SpecifierSet | None] = _Validator(
+ added="1.2"
+ )
+ """:external:ref:`core-metadata-requires-python`"""
+ # Because `Requires-External` allows for non-PEP 440 version specifiers, we
+ # don't do any processing on the values.
+ requires_external: _Validator[list[str] | None] = _Validator(added="1.2")
+ """:external:ref:`core-metadata-requires-external`"""
+ project_urls: _Validator[dict[str, str] | None] = _Validator(added="1.2")
+ """:external:ref:`core-metadata-project-url`"""
+ # PEP 685 lets us raise an error if an extra doesn't pass `Name` validation
+ # regardless of metadata version.
+ provides_extra: _Validator[list[utils.NormalizedName] | None] = _Validator(
+ added="2.1",
+ )
+ """:external:ref:`core-metadata-provides-extra`"""
+ provides_dist: _Validator[list[str] | None] = _Validator(added="1.2")
+ """:external:ref:`core-metadata-provides-dist`"""
+ obsoletes_dist: _Validator[list[str] | None] = _Validator(added="1.2")
+ """:external:ref:`core-metadata-obsoletes-dist`"""
+ import_names: _Validator[list[str] | None] = _Validator(added="2.5")
+ """:external:ref:`core-metadata-import-name`"""
+ import_namespaces: _Validator[list[str] | None] = _Validator(added="2.5")
+ """:external:ref:`core-metadata-import-namespace`"""
+ requires: _Validator[list[str] | None] = _Validator(added="1.1")
+ """``Requires`` (deprecated)"""
+ provides: _Validator[list[str] | None] = _Validator(added="1.1")
+ """``Provides`` (deprecated)"""
+ obsoletes: _Validator[list[str] | None] = _Validator(added="1.1")
+ """``Obsoletes`` (deprecated)"""
+
+ def as_rfc822(self) -> RFC822Message:
+ """
+ Return an RFC822 message with the metadata.
+ """
+ message = RFC822Message()
+ self._write_metadata(message)
+ return message
+
+ def _write_metadata(self, message: RFC822Message) -> None:
+ """
+ Return an RFC822 message with the metadata.
+ """
+ for name, validator in self.__class__.__dict__.items():
+ if isinstance(validator, _Validator) and name != "description":
+ value = getattr(self, name)
+ email_name = _RAW_TO_EMAIL_MAPPING[name]
+ if value is not None:
+ if email_name == "project-url":
+ for label, url in value.items():
+ message[email_name] = f"{label}, {url}"
+ elif email_name == "keywords":
+ message[email_name] = ",".join(value)
+ elif email_name == "import-name" and value == []:
+ message[email_name] = ""
+ elif isinstance(value, list):
+ for item in value:
+ message[email_name] = str(item)
+ else:
+ message[email_name] = str(value)
+
+ # The description is a special case because it is in the body of the message.
+ if self.description is not None:
+ message.set_payload(self.description)
diff --git a/venv/lib/python3.10/site-packages/packaging/py.typed b/venv/lib/python3.10/site-packages/packaging/py.typed
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/venv/lib/python3.10/site-packages/packaging/pylock.py b/venv/lib/python3.10/site-packages/packaging/pylock.py
new file mode 100644
index 0000000000000000000000000000000000000000..a564f15246ad65038029f8fefb48621fa64a3abd
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/pylock.py
@@ -0,0 +1,635 @@
+from __future__ import annotations
+
+import dataclasses
+import logging
+import re
+from collections.abc import Mapping, Sequence
+from dataclasses import dataclass
+from datetime import datetime
+from typing import (
+ TYPE_CHECKING,
+ Any,
+ Callable,
+ Protocol,
+ TypeVar,
+)
+
+from .markers import Marker
+from .specifiers import SpecifierSet
+from .utils import NormalizedName, is_normalized_name
+from .version import Version
+
+if TYPE_CHECKING: # pragma: no cover
+ from pathlib import Path
+
+ from typing_extensions import Self
+
+_logger = logging.getLogger(__name__)
+
+__all__ = [
+ "Package",
+ "PackageArchive",
+ "PackageDirectory",
+ "PackageSdist",
+ "PackageVcs",
+ "PackageWheel",
+ "Pylock",
+ "PylockUnsupportedVersionError",
+ "PylockValidationError",
+ "is_valid_pylock_path",
+]
+
+_T = TypeVar("_T")
+_T2 = TypeVar("_T2")
+
+
+class _FromMappingProtocol(Protocol): # pragma: no cover
+ @classmethod
+ def _from_dict(cls, d: Mapping[str, Any]) -> Self: ...
+
+
+_FromMappingProtocolT = TypeVar("_FromMappingProtocolT", bound=_FromMappingProtocol)
+
+
+_PYLOCK_FILE_NAME_RE = re.compile(r"^pylock\.([^.]+)\.toml$")
+
+
+def is_valid_pylock_path(path: Path) -> bool:
+ """Check if the given path is a valid pylock file path."""
+ return path.name == "pylock.toml" or bool(_PYLOCK_FILE_NAME_RE.match(path.name))
+
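+# Example (editor's addition): "pylock.toml" and "pylock.dev.toml" are valid
+# pylock file names, while "mylock.toml" and "pylock.toml.bak" are not.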
+
+def _toml_key(key: str) -> str:
+ return key.replace("_", "-")
+
+
+def _toml_value(key: str, value: Any) -> Any: # noqa: ANN401
+ if isinstance(value, (Version, Marker, SpecifierSet)):
+ return str(value)
+ if isinstance(value, Sequence) and key == "environments":
+ return [str(v) for v in value]
+ return value
+
+
+def _toml_dict_factory(data: list[tuple[str, Any]]) -> dict[str, Any]:
+ return {
+ _toml_key(key): _toml_value(key, value)
+ for key, value in data
+ if value is not None
+ }
+
+
+def _get(d: Mapping[str, Any], expected_type: type[_T], key: str) -> _T | None:
+ """Get a value from the dictionary and verify it's the expected type."""
+ if (value := d.get(key)) is None:
+ return None
+ if not isinstance(value, expected_type):
+ raise PylockValidationError(
+ f"Unexpected type {type(value).__name__} "
+ f"(expected {expected_type.__name__})",
+ context=key,
+ )
+ return value
+
+
+def _get_required(d: Mapping[str, Any], expected_type: type[_T], key: str) -> _T:
+ """Get a required value from the dictionary and verify it's the expected type."""
+ if (value := _get(d, expected_type, key)) is None:
+ raise _PylockRequiredKeyError(key)
+ return value
+
+
+def _get_sequence(
+ d: Mapping[str, Any], expected_item_type: type[_T], key: str
+) -> Sequence[_T] | None:
+ """Get a list value from the dictionary and verify it's the expected items type."""
+ if (value := _get(d, Sequence, key)) is None: # type: ignore[type-abstract]
+ return None
+ if isinstance(value, (str, bytes)):
+ # special case: str and bytes are Sequences, but we want to reject them here
+ raise PylockValidationError(
+ f"Unexpected type {type(value).__name__} (expected Sequence)",
+ context=key,
+ )
+ for i, item in enumerate(value):
+ if not isinstance(item, expected_item_type):
+ raise PylockValidationError(
+ f"Unexpected type {type(item).__name__} "
+ f"(expected {expected_item_type.__name__})",
+ context=f"{key}[{i}]",
+ )
+ return value
+
+
+def _get_as(
+ d: Mapping[str, Any],
+ expected_type: type[_T],
+ target_type: Callable[[_T], _T2],
+ key: str,
+) -> _T2 | None:
+ """Get a value from the dictionary, verify it's the expected type,
+ and convert to the target type.
+
+ This assumes the target_type constructor accepts the value.
+ """
+ if (value := _get(d, expected_type, key)) is None:
+ return None
+ try:
+ return target_type(value)
+ except Exception as e:
+ raise PylockValidationError(e, context=key) from e
+
+
+def _get_required_as(
+ d: Mapping[str, Any],
+ expected_type: type[_T],
+ target_type: Callable[[_T], _T2],
+ key: str,
+) -> _T2:
+ """Get a required value from the dict, verify it's the expected type,
+ and convert to the target type."""
+ if (value := _get_as(d, expected_type, target_type, key)) is None:
+ raise _PylockRequiredKeyError(key)
+ return value
+
+
+def _get_sequence_as(
+ d: Mapping[str, Any],
+ expected_item_type: type[_T],
+ target_item_type: Callable[[_T], _T2],
+ key: str,
+) -> list[_T2] | None:
+ """Get list value from dictionary and verify expected items type."""
+ if (value := _get_sequence(d, expected_item_type, key)) is None:
+ return None
+ result = []
+ try:
+ for item in value:
+ typed_item = target_item_type(item)
+ result.append(typed_item)
+ except Exception as e:
+ raise PylockValidationError(e, context=f"{key}[{len(result)}]") from e
+ return result
+
+
+def _get_object(
+ d: Mapping[str, Any], target_type: type[_FromMappingProtocolT], key: str
+) -> _FromMappingProtocolT | None:
+ """Get a dictionary value from the dictionary and convert it to a dataclass."""
+ if (value := _get(d, Mapping, key)) is None: # type: ignore[type-abstract]
+ return None
+ try:
+ return target_type._from_dict(value)
+ except Exception as e:
+ raise PylockValidationError(e, context=key) from e
+
+
+def _get_sequence_of_objects(
+ d: Mapping[str, Any], target_item_type: type[_FromMappingProtocolT], key: str
+) -> list[_FromMappingProtocolT] | None:
+ """Get a list value from the dictionary and convert its items to a dataclass."""
+ if (value := _get_sequence(d, Mapping, key)) is None: # type: ignore[type-abstract]
+ return None
+ result: list[_FromMappingProtocolT] = []
+ try:
+ for item in value:
+ typed_item = target_item_type._from_dict(item)
+ result.append(typed_item)
+ except Exception as e:
+ raise PylockValidationError(e, context=f"{key}[{len(result)}]") from e
+ return result
+
+
+def _get_required_sequence_of_objects(
+ d: Mapping[str, Any], target_item_type: type[_FromMappingProtocolT], key: str
+) -> Sequence[_FromMappingProtocolT]:
+ """Get a required list value from the dictionary and convert its items to a
+ dataclass."""
+ if (result := _get_sequence_of_objects(d, target_item_type, key)) is None:
+ raise _PylockRequiredKeyError(key)
+ return result
+
+
+def _validate_normalized_name(name: str) -> NormalizedName:
+ """Validate that a string is a NormalizedName."""
+ if not is_normalized_name(name):
+ raise PylockValidationError(f"Name {name!r} is not normalized")
+ return NormalizedName(name)
+
+
+def _validate_path_url(path: str | None, url: str | None) -> None:
+ if not path and not url:
+ raise PylockValidationError("path or url must be provided")
+
+
+def _validate_hashes(hashes: Mapping[str, Any]) -> Mapping[str, Any]:
+ if not hashes:
+ raise PylockValidationError("At least one hash must be provided")
+ if not all(isinstance(hash_val, str) for hash_val in hashes.values()):
+ raise PylockValidationError("Hash values must be strings")
+ return hashes
+
+
+class PylockValidationError(Exception):
+ """Raised when when input data is not spec-compliant."""
+
+ context: str | None = None
+ message: str
+
+ def __init__(
+ self,
+ cause: str | Exception,
+ *,
+ context: str | None = None,
+ ) -> None:
+ if isinstance(cause, PylockValidationError):
+ if cause.context:
+ self.context = (
+ f"{context}.{cause.context}" if context else cause.context
+ )
+ else:
+ self.context = context
+ self.message = cause.message
+ else:
+ self.context = context
+ self.message = str(cause)
+
+ def __str__(self) -> str:
+ if self.context:
+ return f"{self.message} in {self.context!r}"
+ return self.message
+
+
+class _PylockRequiredKeyError(PylockValidationError):
+ def __init__(self, key: str) -> None:
+ super().__init__("Missing required value", context=key)
+
+
+class PylockUnsupportedVersionError(PylockValidationError):
+ """Raised when encountering an unsupported `lock_version`."""
+
+
+@dataclass(frozen=True, init=False)
+class PackageVcs:
+ type: str
+ url: str | None = None
+ path: str | None = None
+ requested_revision: str | None = None
+ commit_id: str # type: ignore[misc]
+ subdirectory: str | None = None
+
+ def __init__(
+ self,
+ *,
+ type: str,
+ url: str | None = None,
+ path: str | None = None,
+ requested_revision: str | None = None,
+ commit_id: str,
+ subdirectory: str | None = None,
+ ) -> None:
+ # In Python 3.10+ make dataclass kw_only=True and remove __init__
+ object.__setattr__(self, "type", type)
+ object.__setattr__(self, "url", url)
+ object.__setattr__(self, "path", path)
+ object.__setattr__(self, "requested_revision", requested_revision)
+ object.__setattr__(self, "commit_id", commit_id)
+ object.__setattr__(self, "subdirectory", subdirectory)
+
+ @classmethod
+ def _from_dict(cls, d: Mapping[str, Any]) -> Self:
+ package_vcs = cls(
+ type=_get_required(d, str, "type"),
+ url=_get(d, str, "url"),
+ path=_get(d, str, "path"),
+ requested_revision=_get(d, str, "requested-revision"),
+ commit_id=_get_required(d, str, "commit-id"),
+ subdirectory=_get(d, str, "subdirectory"),
+ )
+ _validate_path_url(package_vcs.path, package_vcs.url)
+ return package_vcs
+
+
+@dataclass(frozen=True, init=False)
+class PackageDirectory:
+ path: str
+ editable: bool | None = None
+ subdirectory: str | None = None
+
+ def __init__(
+ self,
+ *,
+ path: str,
+ editable: bool | None = None,
+ subdirectory: str | None = None,
+ ) -> None:
+ # In Python 3.10+ make dataclass kw_only=True and remove __init__
+ object.__setattr__(self, "path", path)
+ object.__setattr__(self, "editable", editable)
+ object.__setattr__(self, "subdirectory", subdirectory)
+
+ @classmethod
+ def _from_dict(cls, d: Mapping[str, Any]) -> Self:
+ return cls(
+ path=_get_required(d, str, "path"),
+ editable=_get(d, bool, "editable"),
+ subdirectory=_get(d, str, "subdirectory"),
+ )
+
+
+@dataclass(frozen=True, init=False)
+class PackageArchive:
+ url: str | None = None
+ path: str | None = None
+ size: int | None = None
+ upload_time: datetime | None = None
+ hashes: Mapping[str, str] # type: ignore[misc]
+ subdirectory: str | None = None
+
+ def __init__(
+ self,
+ *,
+ url: str | None = None,
+ path: str | None = None,
+ size: int | None = None,
+ upload_time: datetime | None = None,
+ hashes: Mapping[str, str],
+ subdirectory: str | None = None,
+ ) -> None:
+ # In Python 3.10+ make dataclass kw_only=True and remove __init__
+ object.__setattr__(self, "url", url)
+ object.__setattr__(self, "path", path)
+ object.__setattr__(self, "size", size)
+ object.__setattr__(self, "upload_time", upload_time)
+ object.__setattr__(self, "hashes", hashes)
+ object.__setattr__(self, "subdirectory", subdirectory)
+
+ @classmethod
+ def _from_dict(cls, d: Mapping[str, Any]) -> Self:
+ package_archive = cls(
+ url=_get(d, str, "url"),
+ path=_get(d, str, "path"),
+ size=_get(d, int, "size"),
+ upload_time=_get(d, datetime, "upload-time"),
+ hashes=_get_required_as(d, Mapping, _validate_hashes, "hashes"), # type: ignore[type-abstract]
+ subdirectory=_get(d, str, "subdirectory"),
+ )
+ _validate_path_url(package_archive.path, package_archive.url)
+ return package_archive
+
+
+@dataclass(frozen=True, init=False)
+class PackageSdist:
+ name: str | None = None
+ upload_time: datetime | None = None
+ url: str | None = None
+ path: str | None = None
+ size: int | None = None
+ hashes: Mapping[str, str] # type: ignore[misc]
+
+ def __init__(
+ self,
+ *,
+ name: str | None = None,
+ upload_time: datetime | None = None,
+ url: str | None = None,
+ path: str | None = None,
+ size: int | None = None,
+ hashes: Mapping[str, str],
+ ) -> None:
+ # In Python 3.10+ make dataclass kw_only=True and remove __init__
+ object.__setattr__(self, "name", name)
+ object.__setattr__(self, "upload_time", upload_time)
+ object.__setattr__(self, "url", url)
+ object.__setattr__(self, "path", path)
+ object.__setattr__(self, "size", size)
+ object.__setattr__(self, "hashes", hashes)
+
+ @classmethod
+ def _from_dict(cls, d: Mapping[str, Any]) -> Self:
+ package_sdist = cls(
+ name=_get(d, str, "name"),
+ upload_time=_get(d, datetime, "upload-time"),
+ url=_get(d, str, "url"),
+ path=_get(d, str, "path"),
+ size=_get(d, int, "size"),
+ hashes=_get_required_as(d, Mapping, _validate_hashes, "hashes"), # type: ignore[type-abstract]
+ )
+ _validate_path_url(package_sdist.path, package_sdist.url)
+ return package_sdist
+
+
+@dataclass(frozen=True, init=False)
+class PackageWheel:
+ name: str | None = None
+ upload_time: datetime | None = None
+ url: str | None = None
+ path: str | None = None
+ size: int | None = None
+ hashes: Mapping[str, str] # type: ignore[misc]
+
+ def __init__(
+ self,
+ *,
+ name: str | None = None,
+ upload_time: datetime | None = None,
+ url: str | None = None,
+ path: str | None = None,
+ size: int | None = None,
+ hashes: Mapping[str, str],
+ ) -> None:
+ # In Python 3.10+ make dataclass kw_only=True and remove __init__
+ object.__setattr__(self, "name", name)
+ object.__setattr__(self, "upload_time", upload_time)
+ object.__setattr__(self, "url", url)
+ object.__setattr__(self, "path", path)
+ object.__setattr__(self, "size", size)
+ object.__setattr__(self, "hashes", hashes)
+
+ @classmethod
+ def _from_dict(cls, d: Mapping[str, Any]) -> Self:
+ package_wheel = cls(
+ name=_get(d, str, "name"),
+ upload_time=_get(d, datetime, "upload-time"),
+ url=_get(d, str, "url"),
+ path=_get(d, str, "path"),
+ size=_get(d, int, "size"),
+ hashes=_get_required_as(d, Mapping, _validate_hashes, "hashes"), # type: ignore[type-abstract]
+ )
+ _validate_path_url(package_wheel.path, package_wheel.url)
+ return package_wheel
+
+
+@dataclass(frozen=True, init=False)
+class Package:
+ name: NormalizedName
+ version: Version | None = None
+ marker: Marker | None = None
+ requires_python: SpecifierSet | None = None
+ dependencies: Sequence[Mapping[str, Any]] | None = None
+ vcs: PackageVcs | None = None
+ directory: PackageDirectory | None = None
+ archive: PackageArchive | None = None
+ index: str | None = None
+ sdist: PackageSdist | None = None
+ wheels: Sequence[PackageWheel] | None = None
+ attestation_identities: Sequence[Mapping[str, Any]] | None = None
+ tool: Mapping[str, Any] | None = None
+
+ def __init__(
+ self,
+ *,
+ name: NormalizedName,
+ version: Version | None = None,
+ marker: Marker | None = None,
+ requires_python: SpecifierSet | None = None,
+ dependencies: Sequence[Mapping[str, Any]] | None = None,
+ vcs: PackageVcs | None = None,
+ directory: PackageDirectory | None = None,
+ archive: PackageArchive | None = None,
+ index: str | None = None,
+ sdist: PackageSdist | None = None,
+ wheels: Sequence[PackageWheel] | None = None,
+ attestation_identities: Sequence[Mapping[str, Any]] | None = None,
+ tool: Mapping[str, Any] | None = None,
+ ) -> None:
+ # In Python 3.10+ make dataclass kw_only=True and remove __init__
+ object.__setattr__(self, "name", name)
+ object.__setattr__(self, "version", version)
+ object.__setattr__(self, "marker", marker)
+ object.__setattr__(self, "requires_python", requires_python)
+ object.__setattr__(self, "dependencies", dependencies)
+ object.__setattr__(self, "vcs", vcs)
+ object.__setattr__(self, "directory", directory)
+ object.__setattr__(self, "archive", archive)
+ object.__setattr__(self, "index", index)
+ object.__setattr__(self, "sdist", sdist)
+ object.__setattr__(self, "wheels", wheels)
+ object.__setattr__(self, "attestation_identities", attestation_identities)
+ object.__setattr__(self, "tool", tool)
+
+ @classmethod
+ def _from_dict(cls, d: Mapping[str, Any]) -> Self:
+ package = cls(
+ name=_get_required_as(d, str, _validate_normalized_name, "name"),
+ version=_get_as(d, str, Version, "version"),
+ requires_python=_get_as(d, str, SpecifierSet, "requires-python"),
+ dependencies=_get_sequence(d, Mapping, "dependencies"), # type: ignore[type-abstract]
+ marker=_get_as(d, str, Marker, "marker"),
+ vcs=_get_object(d, PackageVcs, "vcs"),
+ directory=_get_object(d, PackageDirectory, "directory"),
+ archive=_get_object(d, PackageArchive, "archive"),
+ index=_get(d, str, "index"),
+ sdist=_get_object(d, PackageSdist, "sdist"),
+ wheels=_get_sequence_of_objects(d, PackageWheel, "wheels"),
+ attestation_identities=_get_sequence(d, Mapping, "attestation-identities"), # type: ignore[type-abstract]
+ tool=_get(d, Mapping, "tool"), # type: ignore[type-abstract]
+ )
+ distributions = bool(package.sdist) + len(package.wheels or [])
+ direct_urls = (
+ bool(package.vcs) + bool(package.directory) + bool(package.archive)
+ )
+ if distributions > 0 and direct_urls > 0:
+ raise PylockValidationError(
+ "None of vcs, directory, archive must be set if sdist or wheels are set"
+ )
+ if distributions == 0 and direct_urls != 1:
+ raise PylockValidationError(
+ "Exactly one of vcs, directory, archive must be set "
+ "if sdist and wheels are not set"
+ )
+ try:
+ for i, attestation_identity in enumerate( # noqa: B007
+ package.attestation_identities or []
+ ):
+ _get_required(attestation_identity, str, "kind")
+ except Exception as e:
+ raise PylockValidationError(
+ e, context=f"attestation-identities[{i}]"
+ ) from e
+ return package
+
+ @property
+ def is_direct(self) -> bool:
+ return not (self.sdist or self.wheels)
+
+
+@dataclass(frozen=True, init=False)
+class Pylock:
+ """A class representing a pylock file."""
+
+ lock_version: Version
+ environments: Sequence[Marker] | None = None
+ requires_python: SpecifierSet | None = None
+ extras: Sequence[NormalizedName] | None = None
+ dependency_groups: Sequence[str] | None = None
+ default_groups: Sequence[str] | None = None
+ created_by: str # type: ignore[misc]
+ packages: Sequence[Package] # type: ignore[misc]
+ tool: Mapping[str, Any] | None = None
+
+ def __init__(
+ self,
+ *,
+ lock_version: Version,
+ environments: Sequence[Marker] | None = None,
+ requires_python: SpecifierSet | None = None,
+ extras: Sequence[NormalizedName] | None = None,
+ dependency_groups: Sequence[str] | None = None,
+ default_groups: Sequence[str] | None = None,
+ created_by: str,
+ packages: Sequence[Package],
+ tool: Mapping[str, Any] | None = None,
+ ) -> None:
+ # In Python 3.10+ make dataclass kw_only=True and remove __init__
+ object.__setattr__(self, "lock_version", lock_version)
+ object.__setattr__(self, "environments", environments)
+ object.__setattr__(self, "requires_python", requires_python)
+ object.__setattr__(self, "extras", extras)
+ object.__setattr__(self, "dependency_groups", dependency_groups)
+ object.__setattr__(self, "default_groups", default_groups)
+ object.__setattr__(self, "created_by", created_by)
+ object.__setattr__(self, "packages", packages)
+ object.__setattr__(self, "tool", tool)
+
+ @classmethod
+ def _from_dict(cls, d: Mapping[str, Any]) -> Self:
+ pylock = cls(
+ lock_version=_get_required_as(d, str, Version, "lock-version"),
+ environments=_get_sequence_as(d, str, Marker, "environments"),
+ extras=_get_sequence_as(d, str, _validate_normalized_name, "extras"),
+ dependency_groups=_get_sequence(d, str, "dependency-groups"),
+ default_groups=_get_sequence(d, str, "default-groups"),
+ created_by=_get_required(d, str, "created-by"),
+ requires_python=_get_as(d, str, SpecifierSet, "requires-python"),
+ packages=_get_required_sequence_of_objects(d, Package, "packages"),
+ tool=_get(d, Mapping, "tool"), # type: ignore[type-abstract]
+ )
+ if not Version("1") <= pylock.lock_version < Version("2"):
+ raise PylockUnsupportedVersionError(
+ f"pylock version {pylock.lock_version} is not supported"
+ )
+ if pylock.lock_version > Version("1.0"):
+ _logger.warning(
+ "pylock minor version %s is not supported", pylock.lock_version
+ )
+ return pylock
+
+ @classmethod
+ def from_dict(cls, d: Mapping[str, Any], /) -> Self:
+ """Create and validate a Pylock instance from a TOML dictionary.
+
+ Raises :class:`PylockValidationError` if the input data is not
+ spec-compliant.
+ """
+ return cls._from_dict(d)
+
+ def to_dict(self) -> Mapping[str, Any]:
+ """Convert the Pylock instance to a TOML dictionary."""
+ return dataclasses.asdict(self, dict_factory=_toml_dict_factory)
+
+ def validate(self) -> None:
+ """Validate the Pylock instance against the specification.
+
+ Raises :class:`PylockValidationError` otherwise."""
+ self.from_dict(self.to_dict())
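+
+# Illustrative usage (editor's addition; assumes Python 3.11+ for tomllib and
+# an existing pylock.toml file):
+#
+#     import tomllib
+#     with open("pylock.toml", "rb") as f:
+#         lock = Pylock.from_dict(tomllib.load(f))
+#     print(lock.created_by, [str(p.name) for p in lock.packages])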
diff --git a/venv/lib/python3.10/site-packages/packaging/requirements.py b/venv/lib/python3.10/site-packages/packaging/requirements.py
new file mode 100644
index 0000000000000000000000000000000000000000..3079be69bf880f47e64dbf62993f0e54754b7315
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/requirements.py
@@ -0,0 +1,86 @@
+# This file is dual licensed under the terms of the Apache License, Version
+# 2.0, and the BSD License. See the LICENSE file in the root of this repository
+# for complete details.
+from __future__ import annotations
+
+from typing import Iterator
+
+from ._parser import parse_requirement as _parse_requirement
+from ._tokenizer import ParserSyntaxError
+from .markers import Marker, _normalize_extra_values
+from .specifiers import SpecifierSet
+from .utils import canonicalize_name
+
+
+class InvalidRequirement(ValueError):
+ """
+    An invalid requirement was found; users should refer to PEP 508.
+ """
+
+
+class Requirement:
+ """Parse a requirement.
+
+ Parse a given requirement string into its parts, such as name, specifier,
+ URL, and extras. Raises InvalidRequirement on a badly-formed requirement
+ string.
+ """
+
+ # TODO: Can we test whether something is contained within a requirement?
+ # If so how do we do that? Do we need to test against the _name_ of
+ # the thing as well as the version? What about the markers?
+ # TODO: Can we normalize the name and extra name?
+
+ def __init__(self, requirement_string: str) -> None:
+ try:
+ parsed = _parse_requirement(requirement_string)
+ except ParserSyntaxError as e:
+ raise InvalidRequirement(str(e)) from e
+
+ self.name: str = parsed.name
+ self.url: str | None = parsed.url or None
+ self.extras: set[str] = set(parsed.extras or [])
+ self.specifier: SpecifierSet = SpecifierSet(parsed.specifier)
+ self.marker: Marker | None = None
+ if parsed.marker is not None:
+ self.marker = Marker.__new__(Marker)
+ self.marker._markers = _normalize_extra_values(parsed.marker)
+
+ def _iter_parts(self, name: str) -> Iterator[str]:
+ yield name
+
+ if self.extras:
+ formatted_extras = ",".join(sorted(self.extras))
+ yield f"[{formatted_extras}]"
+
+ if self.specifier:
+ yield str(self.specifier)
+
+ if self.url:
+ yield f" @ {self.url}"
+ if self.marker:
+ yield " "
+
+ if self.marker:
+ yield f"; {self.marker}"
+
+ def __str__(self) -> str:
+ return "".join(self._iter_parts(self.name))
+
+ def __repr__(self) -> str:
+ return f"<{self.__class__.__name__}('{self}')>"
+
+ def __hash__(self) -> int:
+ return hash(tuple(self._iter_parts(canonicalize_name(self.name))))
+
+ def __eq__(self, other: object) -> bool:
+ if not isinstance(other, Requirement):
+ return NotImplemented
+
+ return (
+ canonicalize_name(self.name) == canonicalize_name(other.name)
+ and self.extras == other.extras
+ and self.specifier == other.specifier
+ and self.url == other.url
+ and self.marker == other.marker
+ )
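
A short sketch of how `Requirement` is typically used; the requirement string below is an arbitrary example, and the commented values follow from the parsing logic above:

```python
from packaging.requirements import InvalidRequirement, Requirement

req = Requirement('requests[security]>=2.8.1,==2.8.* ; python_version < "3.11"')
print(req.name)       # requests
print(req.extras)     # {'security'}
print(req.specifier)  # ==2.8.*,>=2.8.1  (a SpecifierSet)
print(req.marker)     # python_version < "3.11"
print(str(req))       # round-trips the requirement via _iter_parts()

try:
    Requirement("not a valid requirement!")
except InvalidRequirement as exc:
    print("rejected:", exc)
```
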
diff --git a/venv/lib/python3.10/site-packages/packaging/specifiers.py b/venv/lib/python3.10/site-packages/packaging/specifiers.py
new file mode 100644
index 0000000000000000000000000000000000000000..5d26b0d1ae2d21b77e24b692d5a7e1fd01296edc
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/specifiers.py
@@ -0,0 +1,1068 @@
+# This file is dual licensed under the terms of the Apache License, Version
+# 2.0, and the BSD License. See the LICENSE file in the root of this repository
+# for complete details.
+"""
+.. testsetup::
+
+ from packaging.specifiers import Specifier, SpecifierSet, InvalidSpecifier
+ from packaging.version import Version
+"""
+
+from __future__ import annotations
+
+import abc
+import itertools
+import re
+from typing import Callable, Final, Iterable, Iterator, TypeVar, Union
+
+from .utils import canonicalize_version
+from .version import InvalidVersion, Version
+
+UnparsedVersion = Union[Version, str]
+UnparsedVersionVar = TypeVar("UnparsedVersionVar", bound=UnparsedVersion)
+CallableOperator = Callable[[Version, str], bool]
+
+
+def _coerce_version(version: UnparsedVersion) -> Version | None:
+ if not isinstance(version, Version):
+ try:
+ version = Version(version)
+ except InvalidVersion:
+ return None
+ return version
+
+
+def _public_version(version: Version) -> Version:
+ return version.__replace__(local=None)
+
+
+def _base_version(version: Version) -> Version:
+ return version.__replace__(pre=None, post=None, dev=None, local=None)
+
+
+class InvalidSpecifier(ValueError):
+ """
+ Raised when attempting to create a :class:`Specifier` with a specifier
+ string that is invalid.
+
+ >>> Specifier("lolwat")
+ Traceback (most recent call last):
+ ...
+ packaging.specifiers.InvalidSpecifier: Invalid specifier: 'lolwat'
+ """
+
+
+class BaseSpecifier(metaclass=abc.ABCMeta):
+ __slots__ = ()
+ __match_args__ = ("_str",)
+
+ @property
+ def _str(self) -> str:
+ """Internal property for match_args"""
+ return str(self)
+
+ @abc.abstractmethod
+ def __str__(self) -> str:
+ """
+ Returns the str representation of this Specifier-like object. This
+ should be representative of the Specifier itself.
+ """
+
+ @abc.abstractmethod
+ def __hash__(self) -> int:
+ """
+ Returns a hash value for this Specifier-like object.
+ """
+
+ @abc.abstractmethod
+ def __eq__(self, other: object) -> bool:
+ """
+ Returns a boolean representing whether or not the two Specifier-like
+ objects are equal.
+
+ :param other: The other object to check against.
+ """
+
+ @property
+ @abc.abstractmethod
+ def prereleases(self) -> bool | None:
+ """Whether or not pre-releases as a whole are allowed.
+
+ This can be set to either ``True`` or ``False`` to explicitly enable or disable
+ prereleases or it can be set to ``None`` (the default) to use default semantics.
+ """
+
+ @prereleases.setter # noqa: B027
+ def prereleases(self, value: bool) -> None:
+ """Setter for :attr:`prereleases`.
+
+ :param value: The value to set.
+ """
+
+ @abc.abstractmethod
+ def contains(self, item: str, prereleases: bool | None = None) -> bool:
+ """
+ Determines if the given item is contained within this specifier.
+ """
+
+ @abc.abstractmethod
+ def filter(
+ self, iterable: Iterable[UnparsedVersionVar], prereleases: bool | None = None
+ ) -> Iterator[UnparsedVersionVar]:
+ """
+ Takes an iterable of items and filters them so that only items which
+ are contained within this specifier are allowed in it.
+ """
+
+
+class Specifier(BaseSpecifier):
+ """This class abstracts handling of version specifiers.
+
+ .. tip::
+
+ It is generally not required to instantiate this manually. You should instead
+        prefer to work with :class:`SpecifierSet`, which can parse
+ comma-separated version specifiers (which is what package metadata contains).
+ """
+
+ __slots__ = ("_prereleases", "_spec", "_spec_version")
+
+ _operator_regex_str = r"""
+    (?P<operator>(~=|==|!=|<=|>=|<|>|===))
+ """
+ _version_regex_str = r"""
+    (?P<version>
+ (?:
+ # The identity operators allow for an escape hatch that will
+ # do an exact string match of the version you wish to install.
+ # This will not be parsed by PEP 440 and we cannot determine
+ # any semantic meaning from it. This operator is discouraged
+ # but included entirely as an escape hatch.
+ (?<====) # Only match for the identity operator
+ \s*
+ [^\s;)]* # The arbitrary version can be just about anything,
+ # we match everything except for whitespace, a
+ # semi-colon for marker support, and a closing paren
+ # since versions can be enclosed in them.
+ )
+ |
+ (?:
+ # The (non)equality operators allow for wild card and local
+ # versions to be specified so we have to define these two
+ # operators separately to enable that.
+ (?<===|!=) # Only match for equals and not equals
+
+ \s*
+ v?
+ (?:[0-9]+!)? # epoch
+ [0-9]+(?:\.[0-9]+)* # release
+
+ # You cannot use a wild card and a pre-release, post-release, a dev or
+ # local version together so group them with a | and make them optional.
+ (?:
+ \.\* # Wild card syntax of .*
+ |
+ (?: # pre release
+ [-_\.]?
+ (alpha|beta|preview|pre|a|b|c|rc)
+ [-_\.]?
+ [0-9]*
+ )?
+ (?: # post release
+ (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
+ )?
+ (?:[-_\.]?dev[-_\.]?[0-9]*)? # dev release
+ (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)? # local
+ )?
+ )
+ |
+ (?:
+ # The compatible operator requires at least two digits in the
+ # release segment.
+ (?<=~=) # Only match for the compatible operator
+
+ \s*
+ v?
+ (?:[0-9]+!)? # epoch
+ [0-9]+(?:\.[0-9]+)+ # release (We have a + instead of a *)
+ (?: # pre release
+ [-_\.]?
+ (alpha|beta|preview|pre|a|b|c|rc)
+ [-_\.]?
+ [0-9]*
+ )?
+ (?: # post release
+ (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
+ )?
+ (?:[-_\.]?dev[-_\.]?[0-9]*)? # dev release
+ )
+ |
+ (?:
+ # All other operators only allow a sub set of what the
+ # (non)equality operators do. Specifically they do not allow
+ # local versions to be specified nor do they allow the prefix
+ # matching wild cards.
+        (?<!==|!=|~=) # We have special cases for these
+                      # operators so we want to make sure they
+                      # don't match here.
+
+        \s*
+        v?
+        (?:[0-9]+!)? # epoch
+        [0-9]+(?:\.[0-9]+)* # release
+        (?: # pre release
+            [-_\.]?
+            (alpha|beta|preview|pre|a|b|c|rc)
+            [-_\.]?
+            [0-9]*
+        )?
+        (?: # post release
+            (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
+        )?
+        (?:[-_\.]?dev[-_\.]?[0-9]*)? # dev release
+    )
+    )
+    """
+
+    _regex = re.compile(
+        r"^\s*" + _operator_regex_str + _version_regex_str + r"\s*$",
+        re.VERBOSE | re.IGNORECASE,
+    )
+
+    _operators = {
+        "~=": "compatible",
+        "==": "equal",
+        "!=": "not_equal",
+        "<=": "less_than_equal",
+        ">=": "greater_than_equal",
+ "<": "less_than",
+ ">": "greater_than",
+ "===": "arbitrary",
+ }
+
+ def __init__(self, spec: str = "", prereleases: bool | None = None) -> None:
+ """Initialize a Specifier instance.
+
+ :param spec:
+ The string representation of a specifier which will be parsed and
+ normalized before use.
+ :param prereleases:
+ This tells the specifier if it should accept prerelease versions if
+ applicable or not. The default of ``None`` will autodetect it from the
+ given specifiers.
+ :raises InvalidSpecifier:
+ If the given specifier is invalid (i.e. bad syntax).
+ """
+ match = self._regex.fullmatch(spec)
+ if not match:
+ raise InvalidSpecifier(f"Invalid specifier: {spec!r}")
+
+ self._spec: tuple[str, str] = (
+ match.group("operator").strip(),
+ match.group("version").strip(),
+ )
+
+ # Store whether or not this Specifier should accept prereleases
+ self._prereleases = prereleases
+
+ # Specifier version cache
+ self._spec_version: tuple[str, Version] | None = None
+
+ def _get_spec_version(self, version: str) -> Version | None:
+ """One element cache, as only one spec Version is needed per Specifier."""
+ if self._spec_version is not None and self._spec_version[0] == version:
+ return self._spec_version[1]
+
+ version_specifier = _coerce_version(version)
+ if version_specifier is None:
+ return None
+
+ self._spec_version = (version, version_specifier)
+ return version_specifier
+
+ def _require_spec_version(self, version: str) -> Version:
+ """Get spec version, asserting it's valid (not for === operator).
+
+ This method should only be called for operators where version
+ strings are guaranteed to be valid PEP 440 versions (not ===).
+ """
+ spec_version = self._get_spec_version(version)
+ assert spec_version is not None
+ return spec_version
+
+ @property
+ def prereleases(self) -> bool | None:
+ # If there is an explicit prereleases set for this, then we'll just
+ # blindly use that.
+ if self._prereleases is not None:
+ return self._prereleases
+
+ # Only the "!=" operator does not imply prereleases when
+ # the version in the specifier is a prerelease.
+ operator, version_str = self._spec
+ if operator != "!=":
+ # The == specifier with trailing .* cannot include prereleases
+ # e.g. "==1.0a1.*" is not valid.
+ if operator == "==" and version_str.endswith(".*"):
+ return False
+
+ # "===" can have arbitrary string versions, so we cannot parse
+ # those, we take prereleases as unknown (None) for those.
+ version = self._get_spec_version(version_str)
+ if version is None:
+ return None
+
+ # For all other operators, use the check if spec Version
+ # object implies pre-releases.
+ if version.is_prerelease:
+ return True
+
+ return False
+
+ @prereleases.setter
+ def prereleases(self, value: bool | None) -> None:
+ self._prereleases = value
+
+ @property
+ def operator(self) -> str:
+ """The operator of this specifier.
+
+ >>> Specifier("==1.2.3").operator
+ '=='
+ """
+ return self._spec[0]
+
+ @property
+ def version(self) -> str:
+ """The version of this specifier.
+
+ >>> Specifier("==1.2.3").version
+ '1.2.3'
+ """
+ return self._spec[1]
+
+ def __repr__(self) -> str:
+ """A representation of the Specifier that shows all internal state.
+
+ >>> Specifier('>=1.0.0')
+        <Specifier('>=1.0.0')>
+        >>> Specifier('>=1.0.0', prereleases=False)
+        <Specifier('>=1.0.0', prereleases=False)>
+        >>> Specifier('>=1.0.0', prereleases=True)
+        <Specifier('>=1.0.0', prereleases=True)>
+ """
+ pre = (
+ f", prereleases={self.prereleases!r}"
+ if self._prereleases is not None
+ else ""
+ )
+
+ return f"<{self.__class__.__name__}({str(self)!r}{pre})>"
+
+ def __str__(self) -> str:
+ """A string representation of the Specifier that can be round-tripped.
+
+ >>> str(Specifier('>=1.0.0'))
+ '>=1.0.0'
+ >>> str(Specifier('>=1.0.0', prereleases=False))
+ '>=1.0.0'
+ """
+ return "{}{}".format(*self._spec)
+
+ @property
+ def _canonical_spec(self) -> tuple[str, str]:
+ operator, version = self._spec
+ if operator == "===" or version.endswith(".*"):
+ return operator, version
+
+ spec_version = self._require_spec_version(version)
+
+ canonical_version = canonicalize_version(
+ spec_version, strip_trailing_zero=(operator != "~=")
+ )
+
+ return operator, canonical_version
+
+ def __hash__(self) -> int:
+ return hash(self._canonical_spec)
+
+ def __eq__(self, other: object) -> bool:
+ """Whether or not the two Specifier-like objects are equal.
+
+ :param other: The other object to check against.
+
+ The value of :attr:`prereleases` is ignored.
+
+ >>> Specifier("==1.2.3") == Specifier("== 1.2.3.0")
+ True
+ >>> (Specifier("==1.2.3", prereleases=False) ==
+ ... Specifier("==1.2.3", prereleases=True))
+ True
+ >>> Specifier("==1.2.3") == "==1.2.3"
+ True
+ >>> Specifier("==1.2.3") == Specifier("==1.2.4")
+ False
+ >>> Specifier("==1.2.3") == Specifier("~=1.2.3")
+ False
+ """
+ if isinstance(other, str):
+ try:
+ other = self.__class__(str(other))
+ except InvalidSpecifier:
+ return NotImplemented
+ elif not isinstance(other, self.__class__):
+ return NotImplemented
+
+ return self._canonical_spec == other._canonical_spec
+
+ def _get_operator(self, op: str) -> CallableOperator:
+ operator_callable: CallableOperator = getattr(
+ self, f"_compare_{self._operators[op]}"
+ )
+ return operator_callable
+
+ def _compare_compatible(self, prospective: Version, spec: str) -> bool:
+ # Compatible releases have an equivalent combination of >= and ==. That
+ # is that ~=2.2 is equivalent to >=2.2,==2.*. This allows us to
+ # implement this in terms of the other specifiers instead of
+ # implementing it ourselves. The only thing we need to do is construct
+ # the other specifiers.
+
+ # We want everything but the last item in the version, but we want to
+ # ignore suffix segments.
+ prefix = _version_join(
+ list(itertools.takewhile(_is_not_suffix, _version_split(spec)))[:-1]
+ )
+
+ # Add the prefix notation to the end of our string
+ prefix += ".*"
+
+ return self._get_operator(">=")(prospective, spec) and self._get_operator("==")(
+ prospective, prefix
+ )
+
+ def _compare_equal(self, prospective: Version, spec: str) -> bool:
+ # We need special logic to handle prefix matching
+ if spec.endswith(".*"):
+ # In the case of prefix matching we want to ignore local segment.
+ normalized_prospective = canonicalize_version(
+ _public_version(prospective), strip_trailing_zero=False
+ )
+ # Get the normalized version string ignoring the trailing .*
+ normalized_spec = canonicalize_version(spec[:-2], strip_trailing_zero=False)
+ # Split the spec out by bangs and dots, and pretend that there is
+ # an implicit dot in between a release segment and a pre-release segment.
+ split_spec = _version_split(normalized_spec)
+
+ # Split the prospective version out by bangs and dots, and pretend
+ # that there is an implicit dot in between a release segment and
+ # a pre-release segment.
+ split_prospective = _version_split(normalized_prospective)
+
+ # 0-pad the prospective version before shortening it to get the correct
+ # shortened version.
+ padded_prospective, _ = _pad_version(split_prospective, split_spec)
+
+ # Shorten the prospective version to be the same length as the spec
+ # so that we can determine if the specifier is a prefix of the
+ # prospective version or not.
+ shortened_prospective = padded_prospective[: len(split_spec)]
+
+ return shortened_prospective == split_spec
+ else:
+ # Convert our spec string into a Version
+ spec_version = self._require_spec_version(spec)
+
+ # If the specifier does not have a local segment, then we want to
+ # act as if the prospective version also does not have a local
+ # segment.
+ if not spec_version.local:
+ prospective = _public_version(prospective)
+
+ return prospective == spec_version
+
+ def _compare_not_equal(self, prospective: Version, spec: str) -> bool:
+ return not self._compare_equal(prospective, spec)
+
+ def _compare_less_than_equal(self, prospective: Version, spec: str) -> bool:
+ # NB: Local version identifiers are NOT permitted in the version
+ # specifier, so local version labels can be universally removed from
+ # the prospective version.
+ return _public_version(prospective) <= self._require_spec_version(spec)
+
+ def _compare_greater_than_equal(self, prospective: Version, spec: str) -> bool:
+ # NB: Local version identifiers are NOT permitted in the version
+ # specifier, so local version labels can be universally removed from
+ # the prospective version.
+ return _public_version(prospective) >= self._require_spec_version(spec)
+
+ def _compare_less_than(self, prospective: Version, spec_str: str) -> bool:
+ # Convert our spec to a Version instance, since we'll want to work with
+ # it as a version.
+ spec = self._require_spec_version(spec_str)
+
+ # Check to see if the prospective version is less than the spec
+ # version. If it's not we can short circuit and just return False now
+ # instead of doing extra unneeded work.
+ if not prospective < spec:
+ return False
+
+ # This special case is here so that, unless the specifier itself
+        # includes a pre-release version, we do not accept pre-release
+ # versions for the version mentioned in the specifier (e.g. <3.1 should
+ # not match 3.1.dev0, but should match 3.0.dev0).
+ if (
+ not spec.is_prerelease
+ and prospective.is_prerelease
+ and _base_version(prospective) == _base_version(spec)
+ ):
+ return False
+
+ # If we've gotten to here, it means that prospective version is both
+ # less than the spec version *and* it's not a pre-release of the same
+ # version in the spec.
+ return True
+
+ def _compare_greater_than(self, prospective: Version, spec_str: str) -> bool:
+ # Convert our spec to a Version instance, since we'll want to work with
+ # it as a version.
+ spec = self._require_spec_version(spec_str)
+
+ # Check to see if the prospective version is greater than the spec
+ # version. If it's not we can short circuit and just return False now
+ # instead of doing extra unneeded work.
+ if not prospective > spec:
+ return False
+
+ # This special case is here so that, unless the specifier itself
+        # includes a post-release version, we do not accept
+ # post-release versions for the version mentioned in the specifier
+ # (e.g. >3.1 should not match 3.0.post0, but should match 3.2.post0).
+ if (
+ not spec.is_postrelease
+ and prospective.is_postrelease
+ and _base_version(prospective) == _base_version(spec)
+ ):
+ return False
+
+ # Ensure that we do not allow a local version of the version mentioned
+ # in the specifier, which is technically greater than, to match.
+ if prospective.local is not None and _base_version(
+ prospective
+ ) == _base_version(spec):
+ return False
+
+ # If we've gotten to here, it means that prospective version is both
+ # greater than the spec version *and* it's not a pre-release of the
+ # same version in the spec.
+ return True
+
+ def _compare_arbitrary(self, prospective: Version | str, spec: str) -> bool:
+ return str(prospective).lower() == str(spec).lower()
+
+ def __contains__(self, item: str | Version) -> bool:
+ """Return whether or not the item is contained in this specifier.
+
+ :param item: The item to check for.
+
+ This is used for the ``in`` operator and behaves the same as
+ :meth:`contains` with no ``prereleases`` argument passed.
+
+ >>> "1.2.3" in Specifier(">=1.2.3")
+ True
+ >>> Version("1.2.3") in Specifier(">=1.2.3")
+ True
+ >>> "1.0.0" in Specifier(">=1.2.3")
+ False
+ >>> "1.3.0a1" in Specifier(">=1.2.3")
+ True
+ >>> "1.3.0a1" in Specifier(">=1.2.3", prereleases=True)
+ True
+ """
+ return self.contains(item)
+
+ def contains(self, item: UnparsedVersion, prereleases: bool | None = None) -> bool:
+ """Return whether or not the item is contained in this specifier.
+
+ :param item:
+ The item to check for, which can be a version string or a
+ :class:`Version` instance.
+ :param prereleases:
+ Whether or not to match prereleases with this Specifier. If set to
+ ``None`` (the default), it will follow the recommendation from
+ :pep:`440` and match prereleases, as there are no other versions.
+
+ >>> Specifier(">=1.2.3").contains("1.2.3")
+ True
+ >>> Specifier(">=1.2.3").contains(Version("1.2.3"))
+ True
+ >>> Specifier(">=1.2.3").contains("1.0.0")
+ False
+ >>> Specifier(">=1.2.3").contains("1.3.0a1")
+ True
+ >>> Specifier(">=1.2.3", prereleases=False).contains("1.3.0a1")
+ False
+ >>> Specifier(">=1.2.3").contains("1.3.0a1")
+ True
+ """
+
+ return bool(list(self.filter([item], prereleases=prereleases)))
+
+ def filter(
+ self, iterable: Iterable[UnparsedVersionVar], prereleases: bool | None = None
+ ) -> Iterator[UnparsedVersionVar]:
+ """Filter items in the given iterable, that match the specifier.
+
+ :param iterable:
+ An iterable that can contain version strings and :class:`Version` instances.
+ The items in the iterable will be filtered according to the specifier.
+ :param prereleases:
+ Whether or not to allow prereleases in the returned iterator. If set to
+ ``None`` (the default), it will follow the recommendation from :pep:`440`
+ and match prereleases if there are no other versions.
+
+ >>> list(Specifier(">=1.2.3").filter(["1.2", "1.3", "1.5a1"]))
+ ['1.3']
+ >>> list(Specifier(">=1.2.3").filter(["1.2", "1.2.3", "1.3", Version("1.4")]))
+        ['1.2.3', '1.3', <Version('1.4')>]
+ >>> list(Specifier(">=1.2.3").filter(["1.2", "1.5a1"]))
+ ['1.5a1']
+ >>> list(Specifier(">=1.2.3").filter(["1.3", "1.5a1"], prereleases=True))
+ ['1.3', '1.5a1']
+ >>> list(Specifier(">=1.2.3", prereleases=True).filter(["1.3", "1.5a1"]))
+ ['1.3', '1.5a1']
+ """
+ prereleases_versions = []
+ found_non_prereleases = False
+
+        # Determine whether to include prereleases by default
+ include_prereleases = (
+ prereleases if prereleases is not None else self.prereleases
+ )
+
+ # Get the matching operator
+ operator_callable = self._get_operator(self.operator)
+
+ # Filter versions
+ for version in iterable:
+ parsed_version = _coerce_version(version)
+ if parsed_version is None:
+ # === operator can match arbitrary (non-version) strings
+ if self.operator == "===" and self._compare_arbitrary(
+ version, self.version
+ ):
+ yield version
+ elif operator_callable(parsed_version, self.version):
+ # If it's not a prerelease or prereleases are allowed, yield it directly
+ if not parsed_version.is_prerelease or include_prereleases:
+ found_non_prereleases = True
+ yield version
+ # Otherwise collect prereleases for potential later use
+ elif prereleases is None and self._prereleases is not False:
+ prereleases_versions.append(version)
+
+ # If no non-prereleases were found and prereleases weren't
+ # explicitly forbidden, yield the collected prereleases
+ if (
+ not found_non_prereleases
+ and prereleases is None
+ and self._prereleases is not False
+ ):
+ yield from prereleases_versions
+
+
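
Because the prerelease fallback in `Specifier.filter` above is subtle, here is a small usage sketch; it mirrors the doctests in the docstrings rather than adding new behaviour:

```python
from packaging.specifiers import Specifier

spec = Specifier(">=1.2.3")

print(spec.contains("1.3.0"))   # True
print(spec.contains("1.0.0"))   # False

# With only a prerelease candidate, the PEP 440 fallback lets it through
# unless prereleases were explicitly disabled.
print(spec.contains("1.3.0a1"))                                     # True
print(Specifier(">=1.2.3", prereleases=False).contains("1.3.0a1"))  # False

# filter() yields prereleases only when no matching final release exists.
print(list(spec.filter(["1.2", "1.3", "1.5a1"])))  # ['1.3']
print(list(spec.filter(["1.2", "1.5a1"])))         # ['1.5a1']
```
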
+_prefix_regex = re.compile(r"([0-9]+)((?:a|b|c|rc)[0-9]+)")
+
+
+def _version_split(version: str) -> list[str]:
+ """Split version into components.
+
+ The split components are intended for version comparison. The logic does
+ not attempt to retain the original version string, so joining the
+ components back with :func:`_version_join` may not produce the original
+ version string.
+ """
+ result: list[str] = []
+
+ epoch, _, rest = version.rpartition("!")
+ result.append(epoch or "0")
+
+ for item in rest.split("."):
+ match = _prefix_regex.fullmatch(item)
+ if match:
+ result.extend(match.groups())
+ else:
+ result.append(item)
+ return result
+
+
+def _version_join(components: list[str]) -> str:
+ """Join split version components into a version string.
+
+ This function assumes the input came from :func:`_version_split`, where the
+ first component must be the epoch (either empty or numeric), and all other
+ components numeric.
+ """
+ epoch, *rest = components
+ return f"{epoch}!{'.'.join(rest)}"
+
+
+def _is_not_suffix(segment: str) -> bool:
+ return not any(
+ segment.startswith(prefix) for prefix in ("dev", "a", "b", "rc", "post")
+ )
+
+
+def _pad_version(left: list[str], right: list[str]) -> tuple[list[str], list[str]]:
+ left_split, right_split = [], []
+
+ # Get the release segment of our versions
+ left_split.append(list(itertools.takewhile(lambda x: x.isdigit(), left)))
+ right_split.append(list(itertools.takewhile(lambda x: x.isdigit(), right)))
+
+ # Get the rest of our versions
+ left_split.append(left[len(left_split[0]) :])
+ right_split.append(right[len(right_split[0]) :])
+
+ # Insert our padding
+ left_split.insert(1, ["0"] * max(0, len(right_split[0]) - len(left_split[0])))
+ right_split.insert(1, ["0"] * max(0, len(left_split[0]) - len(right_split[0])))
+
+ return (
+ list(itertools.chain.from_iterable(left_split)),
+ list(itertools.chain.from_iterable(right_split)),
+ )
+
+
+class SpecifierSet(BaseSpecifier):
+ """This class abstracts handling of a set of version specifiers.
+
+ It can be passed a single specifier (``>=3.0``), a comma-separated list of
+ specifiers (``>=3.0,!=3.1``), or no specifier at all.
+ """
+
+ __slots__ = ("_prereleases", "_specs")
+
+ def __init__(
+ self,
+ specifiers: str | Iterable[Specifier] = "",
+ prereleases: bool | None = None,
+ ) -> None:
+ """Initialize a SpecifierSet instance.
+
+ :param specifiers:
+ The string representation of a specifier or a comma-separated list of
+ specifiers which will be parsed and normalized before use.
+ May also be an iterable of ``Specifier`` instances, which will be used
+ as is.
+ :param prereleases:
+ This tells the SpecifierSet if it should accept prerelease versions if
+ applicable or not. The default of ``None`` will autodetect it from the
+ given specifiers.
+
+ :raises InvalidSpecifier:
+            If the given ``specifiers`` are not parseable, then this exception will be
+ raised.
+ """
+
+ if isinstance(specifiers, str):
+ # Split on `,` to break each individual specifier into its own item, and
+ # strip each item to remove leading/trailing whitespace.
+ split_specifiers = [s.strip() for s in specifiers.split(",") if s.strip()]
+
+ # Make each individual specifier a Specifier and save in a frozen set
+ # for later.
+ self._specs = frozenset(map(Specifier, split_specifiers))
+ else:
+ # Save the supplied specifiers in a frozen set.
+ self._specs = frozenset(specifiers)
+
+ # Store our prereleases value so we can use it later to determine if
+ # we accept prereleases or not.
+ self._prereleases = prereleases
+
+ @property
+ def prereleases(self) -> bool | None:
+ # If we have been given an explicit prerelease modifier, then we'll
+ # pass that through here.
+ if self._prereleases is not None:
+ return self._prereleases
+
+ # If we don't have any specifiers, and we don't have a forced value,
+ # then we'll just return None since we don't know if this should have
+ # pre-releases or not.
+ if not self._specs:
+ return None
+
+ # Otherwise we'll see if any of the given specifiers accept
+ # prereleases, if any of them do we'll return True, otherwise False.
+ if any(s.prereleases for s in self._specs):
+ return True
+
+ return None
+
+ @prereleases.setter
+ def prereleases(self, value: bool | None) -> None:
+ self._prereleases = value
+
+ def __repr__(self) -> str:
+ """A representation of the specifier set that shows all internal state.
+
+ Note that the ordering of the individual specifiers within the set may not
+ match the input string.
+
+ >>> SpecifierSet('>=1.0.0,!=2.0.0')
+        <SpecifierSet('!=2.0.0,>=1.0.0')>
+        >>> SpecifierSet('>=1.0.0,!=2.0.0', prereleases=False)
+        <SpecifierSet('!=2.0.0,>=1.0.0', prereleases=False)>
+        >>> SpecifierSet('>=1.0.0,!=2.0.0', prereleases=True)
+        <SpecifierSet('!=2.0.0,>=1.0.0', prereleases=True)>
+ """
+ pre = (
+ f", prereleases={self.prereleases!r}"
+ if self._prereleases is not None
+ else ""
+ )
+
+        return f"<SpecifierSet({str(self)!r}{pre})>"
+
+ def __str__(self) -> str:
+ """A string representation of the specifier set that can be round-tripped.
+
+ Note that the ordering of the individual specifiers within the set may not
+ match the input string.
+
+ >>> str(SpecifierSet(">=1.0.0,!=1.0.1"))
+ '!=1.0.1,>=1.0.0'
+ >>> str(SpecifierSet(">=1.0.0,!=1.0.1", prereleases=False))
+ '!=1.0.1,>=1.0.0'
+ """
+ return ",".join(sorted(str(s) for s in self._specs))
+
+ def __hash__(self) -> int:
+ return hash(self._specs)
+
+ def __and__(self, other: SpecifierSet | str) -> SpecifierSet:
+ """Return a SpecifierSet which is a combination of the two sets.
+
+ :param other: The other object to combine with.
+
+ >>> SpecifierSet(">=1.0.0,!=1.0.1") & '<=2.0.0,!=2.0.1'
+        <SpecifierSet('!=1.0.1,!=2.0.1,<=2.0.0,>=1.0.0')>
+        >>> SpecifierSet(">=1.0.0,!=1.0.1") & SpecifierSet('<=2.0.0,!=2.0.1')
+        <SpecifierSet('!=1.0.1,!=2.0.1,<=2.0.0,>=1.0.0')>
+ """
+ if isinstance(other, str):
+ other = SpecifierSet(other)
+ elif not isinstance(other, SpecifierSet):
+ return NotImplemented
+
+ specifier = SpecifierSet()
+ specifier._specs = frozenset(self._specs | other._specs)
+
+ if self._prereleases is None and other._prereleases is not None:
+ specifier._prereleases = other._prereleases
+ elif (
+ self._prereleases is not None and other._prereleases is None
+ ) or self._prereleases == other._prereleases:
+ specifier._prereleases = self._prereleases
+ else:
+ raise ValueError(
+ "Cannot combine SpecifierSets with True and False prerelease overrides."
+ )
+
+ return specifier
+
+ def __eq__(self, other: object) -> bool:
+ """Whether or not the two SpecifierSet-like objects are equal.
+
+ :param other: The other object to check against.
+
+ The value of :attr:`prereleases` is ignored.
+
+ >>> SpecifierSet(">=1.0.0,!=1.0.1") == SpecifierSet(">=1.0.0,!=1.0.1")
+ True
+ >>> (SpecifierSet(">=1.0.0,!=1.0.1", prereleases=False) ==
+ ... SpecifierSet(">=1.0.0,!=1.0.1", prereleases=True))
+ True
+ >>> SpecifierSet(">=1.0.0,!=1.0.1") == ">=1.0.0,!=1.0.1"
+ True
+ >>> SpecifierSet(">=1.0.0,!=1.0.1") == SpecifierSet(">=1.0.0")
+ False
+ >>> SpecifierSet(">=1.0.0,!=1.0.1") == SpecifierSet(">=1.0.0,!=1.0.2")
+ False
+ """
+ if isinstance(other, (str, Specifier)):
+ other = SpecifierSet(str(other))
+ elif not isinstance(other, SpecifierSet):
+ return NotImplemented
+
+ return self._specs == other._specs
+
+ def __len__(self) -> int:
+ """Returns the number of specifiers in this specifier set."""
+ return len(self._specs)
+
+ def __iter__(self) -> Iterator[Specifier]:
+ """
+ Returns an iterator over all the underlying :class:`Specifier` instances
+ in this specifier set.
+
+ >>> sorted(SpecifierSet(">=1.0.0,!=1.0.1"), key=str)
+        [<Specifier('!=1.0.1')>, <Specifier('>=1.0.0')>]
+ """
+ return iter(self._specs)
+
+ def __contains__(self, item: UnparsedVersion) -> bool:
+ """Return whether or not the item is contained in this specifier.
+
+ :param item: The item to check for.
+
+ This is used for the ``in`` operator and behaves the same as
+ :meth:`contains` with no ``prereleases`` argument passed.
+
+ >>> "1.2.3" in SpecifierSet(">=1.0.0,!=1.0.1")
+ True
+ >>> Version("1.2.3") in SpecifierSet(">=1.0.0,!=1.0.1")
+ True
+ >>> "1.0.1" in SpecifierSet(">=1.0.0,!=1.0.1")
+ False
+ >>> "1.3.0a1" in SpecifierSet(">=1.0.0,!=1.0.1")
+ True
+ >>> "1.3.0a1" in SpecifierSet(">=1.0.0,!=1.0.1", prereleases=True)
+ True
+ """
+ return self.contains(item)
+
+ def contains(
+ self,
+ item: UnparsedVersion,
+ prereleases: bool | None = None,
+ installed: bool | None = None,
+ ) -> bool:
+ """Return whether or not the item is contained in this SpecifierSet.
+
+ :param item:
+ The item to check for, which can be a version string or a
+ :class:`Version` instance.
+ :param prereleases:
+ Whether or not to match prereleases with this SpecifierSet. If set to
+ ``None`` (the default), it will follow the recommendation from :pep:`440`
+ and match prereleases, as there are no other versions.
+ :param installed:
+ Whether or not the item is installed. If set to ``True``, it will
+ accept prerelease versions even if the specifier does not allow them.
+
+ >>> SpecifierSet(">=1.0.0,!=1.0.1").contains("1.2.3")
+ True
+ >>> SpecifierSet(">=1.0.0,!=1.0.1").contains(Version("1.2.3"))
+ True
+ >>> SpecifierSet(">=1.0.0,!=1.0.1").contains("1.0.1")
+ False
+ >>> SpecifierSet(">=1.0.0,!=1.0.1").contains("1.3.0a1")
+ True
+ >>> SpecifierSet(">=1.0.0,!=1.0.1", prereleases=False).contains("1.3.0a1")
+ False
+ >>> SpecifierSet(">=1.0.0,!=1.0.1").contains("1.3.0a1", prereleases=True)
+ True
+ """
+ version = _coerce_version(item)
+
+ if version is not None and installed and version.is_prerelease:
+ prereleases = True
+
+ check_item = item if version is None else version
+ return bool(list(self.filter([check_item], prereleases=prereleases)))
+
+ def filter(
+ self, iterable: Iterable[UnparsedVersionVar], prereleases: bool | None = None
+ ) -> Iterator[UnparsedVersionVar]:
+ """Filter items in the given iterable, that match the specifiers in this set.
+
+ :param iterable:
+ An iterable that can contain version strings and :class:`Version` instances.
+ The items in the iterable will be filtered according to the specifier.
+ :param prereleases:
+ Whether or not to allow prereleases in the returned iterator. If set to
+ ``None`` (the default), it will follow the recommendation from :pep:`440`
+ and match prereleases if there are no other versions.
+
+ >>> list(SpecifierSet(">=1.2.3").filter(["1.2", "1.3", "1.5a1"]))
+ ['1.3']
+ >>> list(SpecifierSet(">=1.2.3").filter(["1.2", "1.3", Version("1.4")]))
+        ['1.3', <Version('1.4')>]
+ >>> list(SpecifierSet(">=1.2.3").filter(["1.2", "1.5a1"]))
+ ['1.5a1']
+ >>> list(SpecifierSet(">=1.2.3").filter(["1.3", "1.5a1"], prereleases=True))
+ ['1.3', '1.5a1']
+ >>> list(SpecifierSet(">=1.2.3", prereleases=True).filter(["1.3", "1.5a1"]))
+ ['1.3', '1.5a1']
+
+ An "empty" SpecifierSet will filter items based on the presence of prerelease
+ versions in the set.
+
+ >>> list(SpecifierSet("").filter(["1.3", "1.5a1"]))
+ ['1.3']
+ >>> list(SpecifierSet("").filter(["1.5a1"]))
+ ['1.5a1']
+ >>> list(SpecifierSet("", prereleases=True).filter(["1.3", "1.5a1"]))
+ ['1.3', '1.5a1']
+ >>> list(SpecifierSet("").filter(["1.3", "1.5a1"], prereleases=True))
+ ['1.3', '1.5a1']
+ """
+ # Determine if we're forcing a prerelease or not, if we're not forcing
+ # one for this particular filter call, then we'll use whatever the
+ # SpecifierSet thinks for whether or not we should support prereleases.
+ if prereleases is None and self.prereleases is not None:
+ prereleases = self.prereleases
+
+ # If we have any specifiers, then we want to wrap our iterable in the
+ # filter method for each one, this will act as a logical AND amongst
+ # each specifier.
+ if self._specs:
+ # When prereleases is None, we need to let all versions through
+ # the individual filters, then decide about prereleases at the end
+ # based on whether any non-prereleases matched ALL specs.
+ for spec in self._specs:
+ iterable = spec.filter(
+ iterable, prereleases=True if prereleases is None else prereleases
+ )
+
+ if prereleases is not None:
+ # If we have a forced prereleases value,
+ # we can immediately return the iterator.
+ return iter(iterable)
+ else:
+ # Handle empty SpecifierSet cases where prereleases is not None.
+ if prereleases is True:
+ return iter(iterable)
+
+ if prereleases is False:
+ return (
+ item
+ for item in iterable
+ if (version := _coerce_version(item)) is None
+ or not version.is_prerelease
+ )
+
+ # Finally if prereleases is None, apply PEP 440 logic:
+ # exclude prereleases unless there are no final releases that matched.
+ filtered_items: list[UnparsedVersionVar] = []
+ found_prereleases: list[UnparsedVersionVar] = []
+ found_final_release = False
+
+ for item in iterable:
+ parsed_version = _coerce_version(item)
+ # Arbitrary strings are always included as it is not
+ # possible to determine if they are prereleases,
+ # and they have already passed all specifiers.
+ if parsed_version is None:
+ filtered_items.append(item)
+ found_prereleases.append(item)
+ elif parsed_version.is_prerelease:
+ found_prereleases.append(item)
+ else:
+ filtered_items.append(item)
+ found_final_release = True
+
+ return iter(filtered_items if found_final_release else found_prereleases)
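
Putting the pieces together, a `SpecifierSet` acts as the logical AND of its specifiers and applies the same PEP 440 prerelease fallback. A brief sketch based on the doctests above:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

specs = SpecifierSet(">=1.0.0,!=1.0.1")

print("1.0.1" in specs)           # False (excluded by !=1.0.1)
print(Version("1.2.3") in specs)  # True

# Combining two sets with & unions their specifiers.
combined = specs & "<2.0"
print(str(combined))              # !=1.0.1,<2.0,>=1.0.0

# filter() accepts a mix of version strings and Version objects.
print(list(specs.filter(["0.9", "1.0.1", "1.2", Version("1.3")])))
# ['1.2', <Version('1.3')>]
```
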
diff --git a/venv/lib/python3.10/site-packages/packaging/tags.py b/venv/lib/python3.10/site-packages/packaging/tags.py
new file mode 100644
index 0000000000000000000000000000000000000000..5ef27c897a4df35a2a6923b608a5e04a0a38b9ee
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/tags.py
@@ -0,0 +1,651 @@
+# This file is dual licensed under the terms of the Apache License, Version
+# 2.0, and the BSD License. See the LICENSE file in the root of this repository
+# for complete details.
+
+from __future__ import annotations
+
+import logging
+import platform
+import re
+import struct
+import subprocess
+import sys
+import sysconfig
+from importlib.machinery import EXTENSION_SUFFIXES
+from typing import (
+ Any,
+ Iterable,
+ Iterator,
+ Sequence,
+ Tuple,
+ cast,
+)
+
+from . import _manylinux, _musllinux
+
+logger = logging.getLogger(__name__)
+
+PythonVersion = Sequence[int]
+AppleVersion = Tuple[int, int]
+
+INTERPRETER_SHORT_NAMES: dict[str, str] = {
+ "python": "py", # Generic.
+ "cpython": "cp",
+ "pypy": "pp",
+ "ironpython": "ip",
+ "jython": "jy",
+}
+
+
+_32_BIT_INTERPRETER = struct.calcsize("P") == 4
+
+
+class Tag:
+ """
+ A representation of the tag triple for a wheel.
+
+ Instances are considered immutable and thus are hashable. Equality checking
+ is also supported.
+ """
+
+ __slots__ = ["_abi", "_hash", "_interpreter", "_platform"]
+
+ def __init__(self, interpreter: str, abi: str, platform: str) -> None:
+ self._interpreter = interpreter.lower()
+ self._abi = abi.lower()
+ self._platform = platform.lower()
+ # The __hash__ of every single element in a Set[Tag] will be evaluated each time
+        # that a set calls its `.isdisjoint()` method, which may be called hundreds of
+ # times when scanning a page of links for packages with tags matching that
+ # Set[Tag]. Pre-computing the value here produces significant speedups for
+ # downstream consumers.
+ self._hash = hash((self._interpreter, self._abi, self._platform))
+
+ @property
+ def interpreter(self) -> str:
+ return self._interpreter
+
+ @property
+ def abi(self) -> str:
+ return self._abi
+
+ @property
+ def platform(self) -> str:
+ return self._platform
+
+ def __eq__(self, other: object) -> bool:
+ if not isinstance(other, Tag):
+ return NotImplemented
+
+ return (
+ (self._hash == other._hash) # Short-circuit ASAP for perf reasons.
+ and (self._platform == other._platform)
+ and (self._abi == other._abi)
+ and (self._interpreter == other._interpreter)
+ )
+
+ def __hash__(self) -> int:
+ return self._hash
+
+ def __str__(self) -> str:
+ return f"{self._interpreter}-{self._abi}-{self._platform}"
+
+ def __repr__(self) -> str:
+ return f"<{self} @ {id(self)}>"
+
+ def __setstate__(self, state: tuple[None, dict[str, Any]]) -> None:
+ # The cached _hash is wrong when unpickling.
+ _, slots = state
+ for k, v in slots.items():
+ setattr(self, k, v)
+ self._hash = hash((self._interpreter, self._abi, self._platform))
+
+
+def parse_tag(tag: str) -> frozenset[Tag]:
+ """
+ Parses the provided tag (e.g. `py3-none-any`) into a frozenset of Tag instances.
+
+ Returning a set is required due to the possibility that the tag is a
+ compressed tag set.
+ """
+ tags = set()
+ interpreters, abis, platforms = tag.split("-")
+ for interpreter in interpreters.split("."):
+ for abi in abis.split("."):
+ for platform_ in platforms.split("."):
+ tags.add(Tag(interpreter, abi, platform_))
+ return frozenset(tags)
+
+
+def _get_config_var(name: str, warn: bool = False) -> int | str | None:
+ value: int | str | None = sysconfig.get_config_var(name)
+ if value is None and warn:
+ logger.debug(
+ "Config variable '%s' is unset, Python ABI tag may be incorrect", name
+ )
+ return value
+
+
+def _normalize_string(string: str) -> str:
+ return string.replace(".", "_").replace("-", "_").replace(" ", "_")
+
+
+def _is_threaded_cpython(abis: list[str]) -> bool:
+ """
+ Determine if the ABI corresponds to a threaded (`--disable-gil`) build.
+
+ The threaded builds are indicated by a "t" in the abiflags.
+ """
+ if len(abis) == 0:
+ return False
+ # expect e.g., cp313
+ m = re.match(r"cp\d+(.*)", abis[0])
+ if not m:
+ return False
+ abiflags = m.group(1)
+ return "t" in abiflags
+
+
+def _abi3_applies(python_version: PythonVersion, threading: bool) -> bool:
+ """
+ Determine if the Python version supports abi3.
+
+ PEP 384 was first implemented in Python 3.2. The threaded (`--disable-gil`)
+ builds do not support abi3.
+ """
+ return len(python_version) > 1 and tuple(python_version) >= (3, 2) and not threading
+
+
+def _cpython_abis(py_version: PythonVersion, warn: bool = False) -> list[str]:
+ py_version = tuple(py_version) # To allow for version comparison.
+ abis = []
+ version = _version_nodot(py_version[:2])
+ threading = debug = pymalloc = ucs4 = ""
+ with_debug = _get_config_var("Py_DEBUG", warn)
+ has_refcount = hasattr(sys, "gettotalrefcount")
+ # Windows doesn't set Py_DEBUG, so checking for support of debug-compiled
+ # extension modules is the best option.
+ # https://github.com/pypa/pip/issues/3383#issuecomment-173267692
+ has_ext = "_d.pyd" in EXTENSION_SUFFIXES
+ if with_debug or (with_debug is None and (has_refcount or has_ext)):
+ debug = "d"
+ if py_version >= (3, 13) and _get_config_var("Py_GIL_DISABLED", warn):
+ threading = "t"
+ if py_version < (3, 8):
+ with_pymalloc = _get_config_var("WITH_PYMALLOC", warn)
+ if with_pymalloc or with_pymalloc is None:
+ pymalloc = "m"
+ if py_version < (3, 3):
+ unicode_size = _get_config_var("Py_UNICODE_SIZE", warn)
+ if unicode_size == 4 or (
+ unicode_size is None and sys.maxunicode == 0x10FFFF
+ ):
+ ucs4 = "u"
+ elif debug:
+ # Debug builds can also load "normal" extension modules.
+ # We can also assume no UCS-4 or pymalloc requirement.
+ abis.append(f"cp{version}{threading}")
+ abis.insert(0, f"cp{version}{threading}{debug}{pymalloc}{ucs4}")
+ return abis
+
+
+def cpython_tags(
+ python_version: PythonVersion | None = None,
+ abis: Iterable[str] | None = None,
+ platforms: Iterable[str] | None = None,
+ *,
+ warn: bool = False,
+) -> Iterator[Tag]:
+ """
+ Yields the tags for a CPython interpreter.
+
+ The tags consist of:
+    - cp<python_version>-<abi>-<platform>
+    - cp<python_version>-abi3-<platform>
+    - cp<python_version>-none-<platform>
+    - cp<less than python_version>-abi3-<platform> # Older Python versions down to 3.2.
+
+ If python_version only specifies a major version then user-provided ABIs and
+    the 'none' ABI tag will be used.
+
+ If 'abi3' or 'none' are specified in 'abis' then they will be yielded at
+ their normal position and not at the beginning.
+ """
+ if not python_version:
+ python_version = sys.version_info[:2]
+
+ interpreter = f"cp{_version_nodot(python_version[:2])}"
+
+ if abis is None:
+ abis = _cpython_abis(python_version, warn) if len(python_version) > 1 else []
+ abis = list(abis)
+ # 'abi3' and 'none' are explicitly handled later.
+ for explicit_abi in ("abi3", "none"):
+ try:
+ abis.remove(explicit_abi)
+ except ValueError: # noqa: PERF203
+ pass
+
+ platforms = list(platforms or platform_tags())
+ for abi in abis:
+ for platform_ in platforms:
+ yield Tag(interpreter, abi, platform_)
+
+ threading = _is_threaded_cpython(abis)
+ use_abi3 = _abi3_applies(python_version, threading)
+ if use_abi3:
+ yield from (Tag(interpreter, "abi3", platform_) for platform_ in platforms)
+ yield from (Tag(interpreter, "none", platform_) for platform_ in platforms)
+
+ if use_abi3:
+ for minor_version in range(python_version[1] - 1, 1, -1):
+ for platform_ in platforms:
+ version = _version_nodot((python_version[0], minor_version))
+ interpreter = f"cp{version}"
+ yield Tag(interpreter, "abi3", platform_)
+
+
+def _generic_abi() -> list[str]:
+ """
+ Return the ABI tag based on EXT_SUFFIX.
+ """
+ # The following are examples of `EXT_SUFFIX`.
+ # We want to keep the parts which are related to the ABI and remove the
+ # parts which are related to the platform:
+ # - linux: '.cpython-310-x86_64-linux-gnu.so' => cp310
+ # - mac: '.cpython-310-darwin.so' => cp310
+ # - win: '.cp310-win_amd64.pyd' => cp310
+ # - win: '.pyd' => cp37 (uses _cpython_abis())
+ # - pypy: '.pypy38-pp73-x86_64-linux-gnu.so' => pypy38_pp73
+ # - graalpy: '.graalpy-38-native-x86_64-darwin.dylib'
+ # => graalpy_38_native
+
+ ext_suffix = _get_config_var("EXT_SUFFIX", warn=True)
+ if not isinstance(ext_suffix, str) or ext_suffix[0] != ".":
+ raise SystemError("invalid sysconfig.get_config_var('EXT_SUFFIX')")
+ parts = ext_suffix.split(".")
+ if len(parts) < 3:
+ # CPython3.7 and earlier uses ".pyd" on Windows.
+ return _cpython_abis(sys.version_info[:2])
+ soabi = parts[1]
+ if soabi.startswith("cpython"):
+ # non-windows
+ abi = "cp" + soabi.split("-")[1]
+ elif soabi.startswith("cp"):
+ # windows
+ abi = soabi.split("-")[0]
+ elif soabi.startswith("pypy"):
+ abi = "-".join(soabi.split("-")[:2])
+ elif soabi.startswith("graalpy"):
+ abi = "-".join(soabi.split("-")[:3])
+ elif soabi:
+ # pyston, ironpython, others?
+ abi = soabi
+ else:
+ return []
+ return [_normalize_string(abi)]
+
+
+def generic_tags(
+ interpreter: str | None = None,
+ abis: Iterable[str] | None = None,
+ platforms: Iterable[str] | None = None,
+ *,
+ warn: bool = False,
+) -> Iterator[Tag]:
+ """
+ Yields the tags for a generic interpreter.
+
+ The tags consist of:
+    - <interpreter>-<abi>-<platform>
+
+ The "none" ABI will be added if it was not explicitly provided.
+ """
+ if not interpreter:
+ interp_name = interpreter_name()
+ interp_version = interpreter_version(warn=warn)
+ interpreter = f"{interp_name}{interp_version}"
+ abis = _generic_abi() if abis is None else list(abis)
+ platforms = list(platforms or platform_tags())
+ if "none" not in abis:
+ abis.append("none")
+ for abi in abis:
+ for platform_ in platforms:
+ yield Tag(interpreter, abi, platform_)
+
+
+def _py_interpreter_range(py_version: PythonVersion) -> Iterator[str]:
+ """
+ Yields Python versions in descending order.
+
+ After the latest version, the major-only version will be yielded, and then
+ all previous versions of that major version.
+ """
+ if len(py_version) > 1:
+ yield f"py{_version_nodot(py_version[:2])}"
+ yield f"py{py_version[0]}"
+ if len(py_version) > 1:
+ for minor in range(py_version[1] - 1, -1, -1):
+ yield f"py{_version_nodot((py_version[0], minor))}"
+
+
+def compatible_tags(
+ python_version: PythonVersion | None = None,
+ interpreter: str | None = None,
+ platforms: Iterable[str] | None = None,
+) -> Iterator[Tag]:
+ """
+ Yields the sequence of tags that are compatible with a specific version of Python.
+
+ The tags consist of:
+    - py*-none-<platform>
+    - <interpreter>-none-any # ... if `interpreter` is provided.
+ - py*-none-any
+ """
+ if not python_version:
+ python_version = sys.version_info[:2]
+ platforms = list(platforms or platform_tags())
+ for version in _py_interpreter_range(python_version):
+ for platform_ in platforms:
+ yield Tag(version, "none", platform_)
+ if interpreter:
+ yield Tag(interpreter, "none", "any")
+ for version in _py_interpreter_range(python_version):
+ yield Tag(version, "none", "any")
+
+
+def _mac_arch(arch: str, is_32bit: bool = _32_BIT_INTERPRETER) -> str:
+ if not is_32bit:
+ return arch
+
+ if arch.startswith("ppc"):
+ return "ppc"
+
+ return "i386"
+
+
+def _mac_binary_formats(version: AppleVersion, cpu_arch: str) -> list[str]:
+ formats = [cpu_arch]
+ if cpu_arch == "x86_64":
+ if version < (10, 4):
+ return []
+ formats.extend(["intel", "fat64", "fat32"])
+
+ elif cpu_arch == "i386":
+ if version < (10, 4):
+ return []
+ formats.extend(["intel", "fat32", "fat"])
+
+ elif cpu_arch == "ppc64":
+ # TODO: Need to care about 32-bit PPC for ppc64 through 10.2?
+ if version > (10, 5) or version < (10, 4):
+ return []
+ formats.append("fat64")
+
+ elif cpu_arch == "ppc":
+ if version > (10, 6):
+ return []
+ formats.extend(["fat32", "fat"])
+
+ if cpu_arch in {"arm64", "x86_64"}:
+ formats.append("universal2")
+
+ if cpu_arch in {"x86_64", "i386", "ppc64", "ppc", "intel"}:
+ formats.append("universal")
+
+ return formats
+
+
+def mac_platforms(
+ version: AppleVersion | None = None, arch: str | None = None
+) -> Iterator[str]:
+ """
+ Yields the platform tags for a macOS system.
+
+ The `version` parameter is a two-item tuple specifying the macOS version to
+ generate platform tags for. The `arch` parameter is the CPU architecture to
+ generate platform tags for. Both parameters default to the appropriate value
+ for the current system.
+ """
+ version_str, _, cpu_arch = platform.mac_ver()
+ if version is None:
+ version = cast("AppleVersion", tuple(map(int, version_str.split(".")[:2])))
+ if version == (10, 16):
+ # When built against an older macOS SDK, Python will report macOS 10.16
+ # instead of the real version.
+ version_str = subprocess.run(
+ [
+ sys.executable,
+ "-sS",
+ "-c",
+ "import platform; print(platform.mac_ver()[0])",
+ ],
+ check=True,
+ env={"SYSTEM_VERSION_COMPAT": "0"},
+ stdout=subprocess.PIPE,
+ text=True,
+ ).stdout
+ version = cast("AppleVersion", tuple(map(int, version_str.split(".")[:2])))
+
+ if arch is None:
+ arch = _mac_arch(cpu_arch)
+
+ if (10, 0) <= version < (11, 0):
+ # Prior to Mac OS 11, each yearly release of Mac OS bumped the
+ # "minor" version number. The major version was always 10.
+ major_version = 10
+ for minor_version in range(version[1], -1, -1):
+ compat_version = major_version, minor_version
+ binary_formats = _mac_binary_formats(compat_version, arch)
+ for binary_format in binary_formats:
+ yield f"macosx_{major_version}_{minor_version}_{binary_format}"
+
+ if version >= (11, 0):
+ # Starting with Mac OS 11, each yearly release bumps the major version
+ # number. The minor versions are now the midyear updates.
+ minor_version = 0
+ for major_version in range(version[0], 10, -1):
+ compat_version = major_version, minor_version
+ binary_formats = _mac_binary_formats(compat_version, arch)
+ for binary_format in binary_formats:
+ yield f"macosx_{major_version}_{minor_version}_{binary_format}"
+
+ if version >= (11, 0):
+ # Mac OS 11 on x86_64 is compatible with binaries from previous releases.
+ # Arm64 support was introduced in 11.0, so no Arm binaries from previous
+ # releases exist.
+ #
+ # However, the "universal2" binary format can have a
+ # macOS version earlier than 11.0 when the x86_64 part of the binary supports
+ # that version of macOS.
+ major_version = 10
+ if arch == "x86_64":
+ for minor_version in range(16, 3, -1):
+ compat_version = major_version, minor_version
+ binary_formats = _mac_binary_formats(compat_version, arch)
+ for binary_format in binary_formats:
+ yield f"macosx_{major_version}_{minor_version}_{binary_format}"
+ else:
+ for minor_version in range(16, 3, -1):
+ compat_version = major_version, minor_version
+ binary_format = "universal2"
+ yield f"macosx_{major_version}_{minor_version}_{binary_format}"
+
+
+def ios_platforms(
+ version: AppleVersion | None = None, multiarch: str | None = None
+) -> Iterator[str]:
+ """
+ Yields the platform tags for an iOS system.
+
+ :param version: A two-item tuple specifying the iOS version to generate
+ platform tags for. Defaults to the current iOS version.
+ :param multiarch: The CPU architecture+ABI to generate platform tags for -
+ (the value used by `sys.implementation._multiarch` e.g.,
+        `arm64_iphoneos` or `x86_64_iphonesimulator`). Defaults to the current
+ multiarch value.
+ """
+ if version is None:
+ # if iOS is the current platform, ios_ver *must* be defined. However,
+ # it won't exist for CPython versions before 3.13, which causes a mypy
+ # error.
+ _, release, _, _ = platform.ios_ver() # type: ignore[attr-defined, unused-ignore]
+ version = cast("AppleVersion", tuple(map(int, release.split(".")[:2])))
+
+ if multiarch is None:
+ multiarch = sys.implementation._multiarch
+ multiarch = multiarch.replace("-", "_")
+
+ ios_platform_template = "ios_{major}_{minor}_{multiarch}"
+
+ # Consider any iOS major.minor version from the version requested, down to
+ # 12.0. 12.0 is the first iOS version that is known to have enough features
+ # to support CPython. Consider every possible minor release up to X.9. There
+    # to support CPython. Consider every possible minor release up to X.9. The
+    # highest the minor has ever gone is 8 (14.8 and 15.8) but having some extra
+    # candidates that won't ever match doesn't really hurt, and it saves us from
+    # having to keep an explicit list of known iOS versions in the code. Return
+    # the results in descending order of version number.
+ # If the requested major version is less than 12, there won't be any matches.
+ if version[0] < 12:
+ return
+
+ # Consider the actual X.Y version that was requested.
+ yield ios_platform_template.format(
+ major=version[0], minor=version[1], multiarch=multiarch
+ )
+
+ # Consider every minor version from X.0 to the minor version prior to the
+ # version requested by the platform.
+ for minor in range(version[1] - 1, -1, -1):
+ yield ios_platform_template.format(
+ major=version[0], minor=minor, multiarch=multiarch
+ )
+
+ for major in range(version[0] - 1, 11, -1):
+ for minor in range(9, -1, -1):
+ yield ios_platform_template.format(
+ major=major, minor=minor, multiarch=multiarch
+ )
+
+
+def android_platforms(
+ api_level: int | None = None, abi: str | None = None
+) -> Iterator[str]:
+ """
+ Yields the :attr:`~Tag.platform` tags for Android. If this function is invoked on
+ non-Android platforms, the ``api_level`` and ``abi`` arguments are required.
+
+ :param int api_level: The maximum `API level
+        <https://developer.android.com/tools/releases/platforms>`__ to return. Defaults
+        to the current system's version, as returned by ``platform.android_ver``.
+    :param str abi: The `Android ABI <https://developer.android.com/ndk/guides/abis>`__,
+        e.g. ``arm64_v8a``. Defaults to the current system's ABI, as returned by
+ ``sysconfig.get_platform``. Hyphens and periods will be replaced with
+ underscores.
+ """
+ if platform.system() != "Android" and (api_level is None or abi is None):
+ raise TypeError(
+ "on non-Android platforms, the api_level and abi arguments are required"
+ )
+
+ if api_level is None:
+ # Python 3.13 was the first version to return platform.system() == "Android",
+ # and also the first version to define platform.android_ver().
+ api_level = platform.android_ver().api_level # type: ignore[attr-defined]
+
+ if abi is None:
+ abi = sysconfig.get_platform().split("-")[-1]
+ abi = _normalize_string(abi)
+
+ # 16 is the minimum API level known to have enough features to support CPython
+ # without major patching. Yield every API level from the maximum down to the
+ # minimum, inclusive.
+ min_api_level = 16
+ for ver in range(api_level, min_api_level - 1, -1):
+ yield f"android_{ver}_{abi}"
+
+
+def _linux_platforms(is_32bit: bool = _32_BIT_INTERPRETER) -> Iterator[str]:
+ linux = _normalize_string(sysconfig.get_platform())
+ if not linux.startswith("linux_"):
+ # we should never be here, just yield the sysconfig one and return
+ yield linux
+ return
+ if is_32bit:
+ if linux == "linux_x86_64":
+ linux = "linux_i686"
+ elif linux == "linux_aarch64":
+ linux = "linux_armv8l"
+ _, arch = linux.split("_", 1)
+ archs = {"armv8l": ["armv8l", "armv7l"]}.get(arch, [arch])
+ yield from _manylinux.platform_tags(archs)
+ yield from _musllinux.platform_tags(archs)
+ for arch in archs:
+ yield f"linux_{arch}"
+
+
+def _generic_platforms() -> Iterator[str]:
+ yield _normalize_string(sysconfig.get_platform())
+
+
+def platform_tags() -> Iterator[str]:
+ """
+ Provides the platform tags for this installation.
+ """
+ if platform.system() == "Darwin":
+ return mac_platforms()
+ elif platform.system() == "iOS":
+ return ios_platforms()
+ elif platform.system() == "Android":
+ return android_platforms()
+ elif platform.system() == "Linux":
+ return _linux_platforms()
+ else:
+ return _generic_platforms()
+
+
+def interpreter_name() -> str:
+ """
+ Returns the name of the running interpreter.
+
+ Some implementations have a reserved, two-letter abbreviation which will
+ be returned when appropriate.
+ """
+ name = sys.implementation.name
+ return INTERPRETER_SHORT_NAMES.get(name) or name
+
+
+def interpreter_version(*, warn: bool = False) -> str:
+ """
+ Returns the version of the running interpreter.
+ """
+ version = _get_config_var("py_version_nodot", warn=warn)
+ return str(version) if version else _version_nodot(sys.version_info[:2])
+
+
+def _version_nodot(version: PythonVersion) -> str:
+ return "".join(map(str, version))
+
+
+def sys_tags(*, warn: bool = False) -> Iterator[Tag]:
+ """
+ Returns the sequence of tag triples for the running interpreter.
+
+ The order of the sequence corresponds to priority order for the
+ interpreter, from most to least important.
+ """
+
+ interp_name = interpreter_name()
+ if interp_name == "cp":
+ yield from cpython_tags(warn=warn)
+ else:
+ yield from generic_tags()
+
+ if interp_name == "pp":
+ interp = "pp3"
+ elif interp_name == "cp":
+ interp = "cp" + interpreter_version(warn=warn)
+ else:
+ interp = None
+ yield from compatible_tags(interpreter=interp)
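+
+
+# Illustrative sketch, not part of the upstream module: a consumer typically walks
+# ``sys_tags()`` in priority order and keeps the best-ranked tag a wheel provides,
+# much like pip's ``Wheel.support_index_min``. ``wheel_tags`` below is a hypothetical
+# ``frozenset[Tag]`` such as ``packaging.utils.parse_wheel_filename`` returns.
+#
+#     supported = list(sys_tags())
+#     rank = min(
+#         (supported.index(tag) for tag in wheel_tags if tag in supported),
+#         default=None,
+#     )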
diff --git a/venv/lib/python3.10/site-packages/packaging/utils.py b/venv/lib/python3.10/site-packages/packaging/utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..c41c8137f2679a0fac21bb845596e231ae88dbd8
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/utils.py
@@ -0,0 +1,158 @@
+# This file is dual licensed under the terms of the Apache License, Version
+# 2.0, and the BSD License. See the LICENSE file in the root of this repository
+# for complete details.
+
+from __future__ import annotations
+
+import re
+from typing import NewType, Tuple, Union, cast
+
+from .tags import Tag, parse_tag
+from .version import InvalidVersion, Version, _TrimmedRelease
+
+BuildTag = Union[Tuple[()], Tuple[int, str]]
+NormalizedName = NewType("NormalizedName", str)
+
+
+class InvalidName(ValueError):
+ """
+ An invalid distribution name; users should refer to the packaging user guide.
+ """
+
+
+class InvalidWheelFilename(ValueError):
+ """
+ An invalid wheel filename was found; users should refer to PEP 427.
+ """
+
+
+class InvalidSdistFilename(ValueError):
+ """
+ An invalid sdist filename was found; users should refer to the packaging user guide.
+ """
+
+
+# Core metadata spec for `Name`
+_validate_regex = re.compile(r"[A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9]", re.IGNORECASE)
+_normalized_regex = re.compile(r"[a-z0-9]|[a-z0-9]([a-z0-9-](?!--))*[a-z0-9]")
+# PEP 427: The build number must start with a digit.
+_build_tag_regex = re.compile(r"(\d+)(.*)")
+
+
+def canonicalize_name(name: str, *, validate: bool = False) -> NormalizedName:
+ if validate and not _validate_regex.fullmatch(name):
+ raise InvalidName(f"name is invalid: {name!r}")
+ # Ensure all ``.`` and ``_`` are ``-``
+ # Emulates ``re.sub(r"[-_.]+", "-", name).lower()`` from PEP 503
+ # Much faster than re, and even faster than str.translate
+ value = name.lower().replace("_", "-").replace(".", "-")
+ # Condense repeats (faster than regex)
+ while "--" in value:
+ value = value.replace("--", "-")
+ return cast("NormalizedName", value)
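+
+
+# Illustrative sketch, not part of the upstream module: normalization lowercases the
+# name and collapses any run of ``-``, ``_`` and ``.`` into a single hyphen, e.g.
+#
+#     canonicalize_name("Sphinx_RTD.Theme")  == "sphinx-rtd-theme"
+#     canonicalize_name("zope.interface")    == "zope-interface"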
+
+
+def is_normalized_name(name: str) -> bool:
+ return _normalized_regex.fullmatch(name) is not None
+
+
+def canonicalize_version(
+ version: Version | str, *, strip_trailing_zero: bool = True
+) -> str:
+ """
+ Return a canonical form of a version as a string.
+
+ >>> canonicalize_version('1.0.1')
+ '1.0.1'
+
+ Per PEP 625, versions may have multiple canonical forms, differing
+ only by trailing zeros.
+
+ >>> canonicalize_version('1.0.0')
+ '1'
+ >>> canonicalize_version('1.0.0', strip_trailing_zero=False)
+ '1.0.0'
+
+ Invalid versions are returned unaltered.
+
+ >>> canonicalize_version('foo bar baz')
+ 'foo bar baz'
+ """
+ if isinstance(version, str):
+ try:
+ version = Version(version)
+ except InvalidVersion:
+ return str(version)
+ return str(_TrimmedRelease(version) if strip_trailing_zero else version)
+
+
+def parse_wheel_filename(
+ filename: str,
+) -> tuple[NormalizedName, Version, BuildTag, frozenset[Tag]]:
+ if not filename.endswith(".whl"):
+ raise InvalidWheelFilename(
+ f"Invalid wheel filename (extension must be '.whl'): {filename!r}"
+ )
+
+ filename = filename[:-4]
+ dashes = filename.count("-")
+ if dashes not in (4, 5):
+ raise InvalidWheelFilename(
+ f"Invalid wheel filename (wrong number of parts): {filename!r}"
+ )
+
+ parts = filename.split("-", dashes - 2)
+ name_part = parts[0]
+ # See PEP 427 for the rules on escaping the project name.
+ if "__" in name_part or re.match(r"^[\w\d._]*$", name_part, re.UNICODE) is None:
+ raise InvalidWheelFilename(f"Invalid project name: {filename!r}")
+ name = canonicalize_name(name_part)
+
+ try:
+ version = Version(parts[1])
+ except InvalidVersion as e:
+ raise InvalidWheelFilename(
+ f"Invalid wheel filename (invalid version): {filename!r}"
+ ) from e
+
+ if dashes == 5:
+ build_part = parts[2]
+ build_match = _build_tag_regex.match(build_part)
+ if build_match is None:
+ raise InvalidWheelFilename(
+ f"Invalid build number: {build_part} in {filename!r}"
+ )
+ build = cast("BuildTag", (int(build_match.group(1)), build_match.group(2)))
+ else:
+ build = ()
+ tags = parse_tag(parts[-1])
+ return (name, version, build, tags)
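+
+
+# Illustrative sketch, not part of the upstream module: for a conforming filename the
+# returned parts come back already normalized, e.g.
+#
+#     name, version, build, tags = parse_wheel_filename("pip-22.0.2-py3-none-any.whl")
+#     # name == "pip", version == Version("22.0.2"), build == (),
+#     # tags == frozenset({Tag("py3", "none", "any")})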
+
+
+def parse_sdist_filename(filename: str) -> tuple[NormalizedName, Version]:
+ if filename.endswith(".tar.gz"):
+ file_stem = filename[: -len(".tar.gz")]
+ elif filename.endswith(".zip"):
+ file_stem = filename[: -len(".zip")]
+ else:
+ raise InvalidSdistFilename(
+ f"Invalid sdist filename (extension must be '.tar.gz' or '.zip'):"
+ f" {filename!r}"
+ )
+
+ # We are requiring a PEP 440 version, which cannot contain dashes,
+ # so we split on the last dash.
+ name_part, sep, version_part = file_stem.rpartition("-")
+ if not sep:
+ raise InvalidSdistFilename(f"Invalid sdist filename: {filename!r}")
+
+ name = canonicalize_name(name_part)
+
+ try:
+ version = Version(version_part)
+ except InvalidVersion as e:
+ raise InvalidSdistFilename(
+ f"Invalid sdist filename (invalid version): {filename!r}"
+ ) from e
+
+ return (name, version)
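+
+
+# Illustrative sketch, not part of the upstream module:
+#
+#     parse_sdist_filename("markdown-it-py-2.1.0.tar.gz")
+#     == ("markdown-it-py", Version("2.1.0"))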
diff --git a/venv/lib/python3.10/site-packages/packaging/version.py b/venv/lib/python3.10/site-packages/packaging/version.py
new file mode 100644
index 0000000000000000000000000000000000000000..1206c462d4fcaa670a816e201bb88b27dfc9cf88
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/packaging/version.py
@@ -0,0 +1,792 @@
+# This file is dual licensed under the terms of the Apache License, Version
+# 2.0, and the BSD License. See the LICENSE file in the root of this repository
+# for complete details.
+"""
+.. testsetup::
+
+ from packaging.version import parse, Version
+"""
+
+from __future__ import annotations
+
+import re
+import sys
+import typing
+from typing import (
+ Any,
+ Callable,
+ Literal,
+ NamedTuple,
+ SupportsInt,
+ Tuple,
+ TypedDict,
+ Union,
+)
+
+from ._structures import Infinity, InfinityType, NegativeInfinity, NegativeInfinityType
+
+if typing.TYPE_CHECKING:
+ from typing_extensions import Self, Unpack
+
+if sys.version_info >= (3, 13): # pragma: no cover
+ from warnings import deprecated as _deprecated
+elif typing.TYPE_CHECKING:
+ from typing_extensions import deprecated as _deprecated
+else: # pragma: no cover
+ import functools
+ import warnings
+
+ def _deprecated(message: str) -> object:
+ def decorator(func: object) -> object:
+ @functools.wraps(func)
+ def wrapper(*args: object, **kwargs: object) -> object:
+ warnings.warn(
+ message,
+ category=DeprecationWarning,
+ stacklevel=2,
+ )
+ return func(*args, **kwargs)
+
+ return wrapper
+
+ return decorator
+
+
+_LETTER_NORMALIZATION = {
+ "alpha": "a",
+ "beta": "b",
+ "c": "rc",
+ "pre": "rc",
+ "preview": "rc",
+ "rev": "post",
+ "r": "post",
+}
+
+__all__ = ["VERSION_PATTERN", "InvalidVersion", "Version", "parse"]
+
+LocalType = Tuple[Union[int, str], ...]
+
+CmpPrePostDevType = Union[InfinityType, NegativeInfinityType, Tuple[str, int]]
+CmpLocalType = Union[
+ NegativeInfinityType,
+ Tuple[Union[Tuple[int, str], Tuple[NegativeInfinityType, Union[int, str]]], ...],
+]
+CmpKey = Tuple[
+ int,
+ Tuple[int, ...],
+ CmpPrePostDevType,
+ CmpPrePostDevType,
+ CmpPrePostDevType,
+ CmpLocalType,
+]
+VersionComparisonMethod = Callable[[CmpKey, CmpKey], bool]
+
+
+class _VersionReplace(TypedDict, total=False):
+ epoch: int | None
+ release: tuple[int, ...] | None
+ pre: tuple[Literal["a", "b", "rc"], int] | None
+ post: int | None
+ dev: int | None
+ local: str | None
+
+
+def parse(version: str) -> Version:
+ """Parse the given version string.
+
+ >>> parse('1.0.dev1')
+ <Version('1.0.dev1')>
+
+ :param version: The version string to parse.
+ :raises InvalidVersion: When the version string is not a valid version.
+ """
+ return Version(version)
+
+
+class InvalidVersion(ValueError):
+ """Raised when a version string is not a valid version.
+
+ >>> Version("invalid")
+ Traceback (most recent call last):
+ ...
+ packaging.version.InvalidVersion: Invalid version: 'invalid'
+ """
+
+
+class _BaseVersion:
+ __slots__ = ()
+
+ # This can also be a normal member (see the packaging_legacy package);
+ # we are just requiring it to be readable. Actually defining a property
+ # has runtime effect on subclasses, so it's typing only.
+ if typing.TYPE_CHECKING:
+
+ @property
+ def _key(self) -> tuple[Any, ...]: ...
+
+ def __hash__(self) -> int:
+ return hash(self._key)
+
+ # Please keep the duplicated `isinstance` check
+ # in the six comparisons hereunder
+ # unless you find a way to avoid adding overhead function calls.
+ def __lt__(self, other: _BaseVersion) -> bool:
+ if not isinstance(other, _BaseVersion):
+ return NotImplemented
+
+ return self._key < other._key
+
+ def __le__(self, other: _BaseVersion) -> bool:
+ if not isinstance(other, _BaseVersion):
+ return NotImplemented
+
+ return self._key <= other._key
+
+ def __eq__(self, other: object) -> bool:
+ if not isinstance(other, _BaseVersion):
+ return NotImplemented
+
+ return self._key == other._key
+
+ def __ge__(self, other: _BaseVersion) -> bool:
+ if not isinstance(other, _BaseVersion):
+ return NotImplemented
+
+ return self._key >= other._key
+
+ def __gt__(self, other: _BaseVersion) -> bool:
+ if not isinstance(other, _BaseVersion):
+ return NotImplemented
+
+ return self._key > other._key
+
+ def __ne__(self, other: object) -> bool:
+ if not isinstance(other, _BaseVersion):
+ return NotImplemented
+
+ return self._key != other._key
+
+
+# Deliberately not anchored to the start and end of the string, to make it
+# easier for 3rd party code to reuse
+
+# Note that ++ doesn't behave identically on CPython and PyPy, so not using it here
+_VERSION_PATTERN = r"""
+ v?+ # optional leading v
+ (?:
+ (?:(?P<epoch>[0-9]+)!)?+ # epoch
+ (?P<release>[0-9]+(?:\.[0-9]+)*+) # release segment
+ (?P<pre> # pre-release
+ [._-]?+
+ (?P<pre_l>alpha|a|beta|b|preview|pre|c|rc)
+ [._-]?+
+ (?P<pre_n>[0-9]+)?
+ )?+
+ (?P<post> # post release
+ (?:-(?P<post_n1>[0-9]+))
+ |
+ (?:
+ [._-]?
+ (?P<post_l>post|rev|r)
+ [._-]?
+ (?P<post_n2>[0-9]+)?
+ )
+ )?+
+ (?P<dev> # dev release
+ [._-]?+
+ (?P<dev_l>dev)
+ [._-]?+
+ (?P<dev_n>[0-9]+)?
+ )?+
+ )
+ (?:\+
+ (?P<local> # local version
+ [a-z0-9]+
+ (?:[._-][a-z0-9]+)*+
+ )
+ )?+
+"""
+
+_VERSION_PATTERN_OLD = _VERSION_PATTERN.replace("*+", "*").replace("?+", "?")
+
+# Possessive qualifiers were added in Python 3.11.
+# CPython 3.11.0-3.11.4 had a bug: https://github.com/python/cpython/pull/107795
+# Older PyPy also had a bug.
+VERSION_PATTERN = (
+ _VERSION_PATTERN_OLD
+ if (sys.implementation.name == "cpython" and sys.version_info < (3, 11, 5))
+ or (sys.implementation.name == "pypy" and sys.version_info < (3, 11, 13))
+ or sys.version_info < (3, 11)
+ else _VERSION_PATTERN
+)
+"""
+A string containing the regular expression used to match a valid version.
+
+The pattern is not anchored at either end, and is intended for embedding in larger
+expressions (for example, matching a version number as part of a file name). The
+regular expression should be compiled with the ``re.VERBOSE`` and ``re.IGNORECASE``
+flags set.
+
+:meta hide-value:
+"""
+
+
+# Validation pattern for local version in replace()
+_LOCAL_PATTERN = re.compile(r"[a-z0-9]+(?:[._-][a-z0-9]+)*", re.IGNORECASE)
+
+
+def _validate_epoch(value: object, /) -> int:
+ epoch = value or 0
+ if isinstance(epoch, int) and epoch >= 0:
+ return epoch
+ msg = f"epoch must be non-negative integer, got {epoch}"
+ raise InvalidVersion(msg)
+
+
+def _validate_release(value: object, /) -> tuple[int, ...]:
+ release = (0,) if value is None else value
+ if (
+ isinstance(release, tuple)
+ and len(release) > 0
+ and all(isinstance(i, int) and i >= 0 for i in release)
+ ):
+ return release
+ msg = f"release must be a non-empty tuple of non-negative integers, got {release}"
+ raise InvalidVersion(msg)
+
+
+def _validate_pre(value: object, /) -> tuple[Literal["a", "b", "rc"], int] | None:
+ if value is None:
+ return value
+ if (
+ isinstance(value, tuple)
+ and len(value) == 2
+ and value[0] in ("a", "b", "rc")
+ and isinstance(value[1], int)
+ and value[1] >= 0
+ ):
+ return value
+ msg = f"pre must be a tuple of ('a'|'b'|'rc', non-negative int), got {value}"
+ raise InvalidVersion(msg)
+
+
+def _validate_post(value: object, /) -> tuple[Literal["post"], int] | None:
+ if value is None:
+ return value
+ if isinstance(value, int) and value >= 0:
+ return ("post", value)
+ msg = f"post must be non-negative integer, got {value}"
+ raise InvalidVersion(msg)
+
+
+def _validate_dev(value: object, /) -> tuple[Literal["dev"], int] | None:
+ if value is None:
+ return value
+ if isinstance(value, int) and value >= 0:
+ return ("dev", value)
+ msg = f"dev must be non-negative integer, got {value}"
+ raise InvalidVersion(msg)
+
+
+def _validate_local(value: object, /) -> LocalType | None:
+ if value is None:
+ return value
+ if isinstance(value, str) and _LOCAL_PATTERN.fullmatch(value):
+ return _parse_local_version(value)
+ msg = f"local must be a valid version string, got {value!r}"
+ raise InvalidVersion(msg)
+
+
+# Backward compatibility for internals before 26.0. Do not use.
+class _Version(NamedTuple):
+ epoch: int
+ release: tuple[int, ...]
+ dev: tuple[str, int] | None
+ pre: tuple[str, int] | None
+ post: tuple[str, int] | None
+ local: LocalType | None
+
+
+class Version(_BaseVersion):
+ """This class abstracts handling of a project's versions.
+
+ A :class:`Version` instance is comparison aware and can be compared and
+ sorted using the standard Python interfaces.
+
+ >>> v1 = Version("1.0a5")
+ >>> v2 = Version("1.0")
+ >>> v1
+ <Version('1.0a5')>
+ >>> v2
+ <Version('1.0')>
+ >>> v1 < v2
+ True
+ >>> v1 == v2
+ False
+ >>> v1 > v2
+ False
+ >>> v1 >= v2
+ False
+ >>> v1 <= v2
+ True
+ """
+
+ __slots__ = ("_dev", "_epoch", "_key_cache", "_local", "_post", "_pre", "_release")
+ __match_args__ = ("_str",)
+
+ _regex = re.compile(r"\s*" + VERSION_PATTERN + r"\s*", re.VERBOSE | re.IGNORECASE)
+
+ _epoch: int
+ _release: tuple[int, ...]
+ _dev: tuple[str, int] | None
+ _pre: tuple[str, int] | None
+ _post: tuple[str, int] | None
+ _local: LocalType | None
+
+ _key_cache: CmpKey | None
+
+ def __init__(self, version: str) -> None:
+ """Initialize a Version object.
+
+ :param version:
+ The string representation of a version which will be parsed and normalized
+ before use.
+ :raises InvalidVersion:
+ If the ``version`` does not conform to PEP 440 in any way then this
+ exception will be raised.
+ """
+ # Validate the version and parse it into pieces
+ match = self._regex.fullmatch(version)
+ if not match:
+ raise InvalidVersion(f"Invalid version: {version!r}")
+ self._epoch = int(match.group("epoch")) if match.group("epoch") else 0
+ self._release = tuple(map(int, match.group("release").split(".")))
+ self._pre = _parse_letter_version(match.group("pre_l"), match.group("pre_n"))
+ self._post = _parse_letter_version(
+ match.group("post_l"), match.group("post_n1") or match.group("post_n2")
+ )
+ self._dev = _parse_letter_version(match.group("dev_l"), match.group("dev_n"))
+ self._local = _parse_local_version(match.group("local"))
+
+ # Key which will be used for sorting
+ self._key_cache = None
+
+ def __replace__(self, **kwargs: Unpack[_VersionReplace]) -> Self:
+ epoch = _validate_epoch(kwargs["epoch"]) if "epoch" in kwargs else self._epoch
+ release = (
+ _validate_release(kwargs["release"])
+ if "release" in kwargs
+ else self._release
+ )
+ pre = _validate_pre(kwargs["pre"]) if "pre" in kwargs else self._pre
+ post = _validate_post(kwargs["post"]) if "post" in kwargs else self._post
+ dev = _validate_dev(kwargs["dev"]) if "dev" in kwargs else self._dev
+ local = _validate_local(kwargs["local"]) if "local" in kwargs else self._local
+
+ if (
+ epoch == self._epoch
+ and release == self._release
+ and pre == self._pre
+ and post == self._post
+ and dev == self._dev
+ and local == self._local
+ ):
+ return self
+
+ new_version = self.__class__.__new__(self.__class__)
+ new_version._key_cache = None
+ new_version._epoch = epoch
+ new_version._release = release
+ new_version._pre = pre
+ new_version._post = post
+ new_version._dev = dev
+ new_version._local = local
+
+ return new_version
+
+ @property
+ def _key(self) -> CmpKey:
+ if self._key_cache is None:
+ self._key_cache = _cmpkey(
+ self._epoch,
+ self._release,
+ self._pre,
+ self._post,
+ self._dev,
+ self._local,
+ )
+ return self._key_cache
+
+ @property
+ @_deprecated("Version._version is private and will be removed soon")
+ def _version(self) -> _Version:
+ return _Version(
+ self._epoch, self._release, self._dev, self._pre, self._post, self._local
+ )
+
+ @_version.setter
+ @_deprecated("Version._version is private and will be removed soon")
+ def _version(self, value: _Version) -> None:
+ self._epoch = value.epoch
+ self._release = value.release
+ self._dev = value.dev
+ self._pre = value.pre
+ self._post = value.post
+ self._local = value.local
+ self._key_cache = None
+
+ def __repr__(self) -> str:
+ """A representation of the Version that shows all internal state.
+
+ >>> Version('1.0.0')
+ <Version('1.0.0')>
+ """
+ return f"<Version('{self}')>"
+
+ def __str__(self) -> str:
+ """A string representation of the version that can be round-tripped.
+
+ >>> str(Version("1.0a5"))
+ '1.0a5'
+ """
+ # This is a hot function, so not calling self.base_version
+ version = ".".join(map(str, self.release))
+
+ # Epoch
+ if self.epoch:
+ version = f"{self.epoch}!{version}"
+
+ # Pre-release
+ if self.pre is not None:
+ version += "".join(map(str, self.pre))
+
+ # Post-release
+ if self.post is not None:
+ version += f".post{self.post}"
+
+ # Development release
+ if self.dev is not None:
+ version += f".dev{self.dev}"
+
+ # Local version segment
+ if self.local is not None:
+ version += f"+{self.local}"
+
+ return version
+
+ @property
+ def _str(self) -> str:
+ """Internal property for match_args"""
+ return str(self)
+
+ @property
+ def epoch(self) -> int:
+ """The epoch of the version.
+
+ >>> Version("2.0.0").epoch
+ 0
+ >>> Version("1!2.0.0").epoch
+ 1
+ """
+ return self._epoch
+
+ @property
+ def release(self) -> tuple[int, ...]:
+ """The components of the "release" segment of the version.
+
+ >>> Version("1.2.3").release
+ (1, 2, 3)
+ >>> Version("2.0.0").release
+ (2, 0, 0)
+ >>> Version("1!2.0.0.post0").release
+ (2, 0, 0)
+
+ Includes trailing zeroes but not the epoch or any pre-release / development /
+ post-release suffixes.
+ """
+ return self._release
+
+ @property
+ def pre(self) -> tuple[str, int] | None:
+ """The pre-release segment of the version.
+
+ >>> print(Version("1.2.3").pre)
+ None
+ >>> Version("1.2.3a1").pre
+ ('a', 1)
+ >>> Version("1.2.3b1").pre
+ ('b', 1)
+ >>> Version("1.2.3rc1").pre
+ ('rc', 1)
+ """
+ return self._pre
+
+ @property
+ def post(self) -> int | None:
+ """The post-release number of the version.
+
+ >>> print(Version("1.2.3").post)
+ None
+ >>> Version("1.2.3.post1").post
+ 1
+ """
+ return self._post[1] if self._post else None
+
+ @property
+ def dev(self) -> int | None:
+ """The development number of the version.
+
+ >>> print(Version("1.2.3").dev)
+ None
+ >>> Version("1.2.3.dev1").dev
+ 1
+ """
+ return self._dev[1] if self._dev else None
+
+ @property
+ def local(self) -> str | None:
+ """The local version segment of the version.
+
+ >>> print(Version("1.2.3").local)
+ None
+ >>> Version("1.2.3+abc").local
+ 'abc'
+ """
+ if self._local:
+ return ".".join(str(x) for x in self._local)
+ else:
+ return None
+
+ @property
+ def public(self) -> str:
+ """The public portion of the version.
+
+ >>> Version("1.2.3").public
+ '1.2.3'
+ >>> Version("1.2.3+abc").public
+ '1.2.3'
+ >>> Version("1!1.2.3dev1+abc").public
+ '1!1.2.3.dev1'
+ """
+ return str(self).split("+", 1)[0]
+
+ @property
+ def base_version(self) -> str:
+ """The "base version" of the version.
+
+ >>> Version("1.2.3").base_version
+ '1.2.3'
+ >>> Version("1.2.3+abc").base_version
+ '1.2.3'
+ >>> Version("1!1.2.3dev1+abc").base_version
+ '1!1.2.3'
+
+ The "base version" is the public version of the project without any pre or post
+ release markers.
+ """
+ release_segment = ".".join(map(str, self.release))
+ return f"{self.epoch}!{release_segment}" if self.epoch else release_segment
+
+ @property
+ def is_prerelease(self) -> bool:
+ """Whether this version is a pre-release.
+
+ >>> Version("1.2.3").is_prerelease
+ False
+ >>> Version("1.2.3a1").is_prerelease
+ True
+ >>> Version("1.2.3b1").is_prerelease
+ True
+ >>> Version("1.2.3rc1").is_prerelease
+ True
+ >>> Version("1.2.3dev1").is_prerelease
+ True
+ """
+ return self.dev is not None or self.pre is not None
+
+ @property
+ def is_postrelease(self) -> bool:
+ """Whether this version is a post-release.
+
+ >>> Version("1.2.3").is_postrelease
+ False
+ >>> Version("1.2.3.post1").is_postrelease
+ True
+ """
+ return self.post is not None
+
+ @property
+ def is_devrelease(self) -> bool:
+ """Whether this version is a development release.
+
+ >>> Version("1.2.3").is_devrelease
+ False
+ >>> Version("1.2.3.dev1").is_devrelease
+ True
+ """
+ return self.dev is not None
+
+ @property
+ def major(self) -> int:
+ """The first item of :attr:`release` or ``0`` if unavailable.
+
+ >>> Version("1.2.3").major
+ 1
+ """
+ return self.release[0] if len(self.release) >= 1 else 0
+
+ @property
+ def minor(self) -> int:
+ """The second item of :attr:`release` or ``0`` if unavailable.
+
+ >>> Version("1.2.3").minor
+ 2
+ >>> Version("1").minor
+ 0
+ """
+ return self.release[1] if len(self.release) >= 2 else 0
+
+ @property
+ def micro(self) -> int:
+ """The third item of :attr:`release` or ``0`` if unavailable.
+
+ >>> Version("1.2.3").micro
+ 3
+ >>> Version("1").micro
+ 0
+ """
+ return self.release[2] if len(self.release) >= 3 else 0
+
+
+class _TrimmedRelease(Version):
+ __slots__ = ()
+
+ def __init__(self, version: str | Version) -> None:
+ if isinstance(version, Version):
+ self._epoch = version._epoch
+ self._release = version._release
+ self._dev = version._dev
+ self._pre = version._pre
+ self._post = version._post
+ self._local = version._local
+ self._key_cache = version._key_cache
+ return
+ super().__init__(version) # pragma: no cover
+
+ @property
+ def release(self) -> tuple[int, ...]:
+ """
+ Release segment without any trailing zeros.
+
+ >>> _TrimmedRelease('1.0.0').release
+ (1,)
+ >>> _TrimmedRelease('0.0').release
+ (0,)
+ """
+ # This leaves one 0.
+ rel = super().release
+ len_release = len(rel)
+ i = len_release
+ while i > 1 and rel[i - 1] == 0:
+ i -= 1
+ return rel if i == len_release else rel[:i]
+
+
+def _parse_letter_version(
+ letter: str | None, number: str | bytes | SupportsInt | None
+) -> tuple[str, int] | None:
+ if letter:
+ # We normalize any letters to their lower case form
+ letter = letter.lower()
+
+ # We consider some words to be alternate spellings of other words and
+ # in those cases we want to normalize the spellings to our preferred
+ # spelling.
+ letter = _LETTER_NORMALIZATION.get(letter, letter)
+
+ # We consider there to be an implicit 0 in a pre-release if there is
+ # not a numeral associated with it.
+ return letter, int(number or 0)
+
+ if number:
+ # We assume if we are given a number, but we are not given a letter
+ # then this is using the implicit post release syntax (e.g. 1.0-1)
+ return "post", int(number)
+
+ return None
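+
+
+# Illustrative sketch, not part of the upstream module: alternate spellings collapse to
+# the canonical letters and a missing number is treated as 0, e.g.
+#
+#     _parse_letter_version("alpha", None) == ("a", 0)
+#     _parse_letter_version("rev", "3")    == ("post", 3)
+#     _parse_letter_version(None, "1")     == ("post", 1)   # implicit "1.0-1" style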
+
+
+_local_version_separators = re.compile(r"[\._-]")
+
+
+def _parse_local_version(local: str | None) -> LocalType | None:
+ """
+ Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
+ """
+ if local is not None:
+ return tuple(
+ part.lower() if not part.isdigit() else int(part)
+ for part in _local_version_separators.split(local)
+ )
+ return None
+
+
+def _cmpkey(
+ epoch: int,
+ release: tuple[int, ...],
+ pre: tuple[str, int] | None,
+ post: tuple[str, int] | None,
+ dev: tuple[str, int] | None,
+ local: LocalType | None,
+) -> CmpKey:
+ # When we compare a release version, we want to compare it with all of the
+ # trailing zeros removed. We will use this for our sorting key.
+ len_release = len(release)
+ i = len_release
+ while i and release[i - 1] == 0:
+ i -= 1
+ _release = release if i == len_release else release[:i]
+
+ # We need to "trick" the sorting algorithm to put 1.0.dev0 before 1.0a0.
+ # We'll do this by abusing the pre segment, but we _only_ want to do this
+ # if there is not a pre or a post segment. If we have one of those then
+ # the normal sorting rules will handle this case correctly.
+ if pre is None and post is None and dev is not None:
+ _pre: CmpPrePostDevType = NegativeInfinity
+ # Versions without a pre-release (except as noted above) should sort after
+ # those with one.
+ elif pre is None:
+ _pre = Infinity
+ else:
+ _pre = pre
+
+ # Versions without a post segment should sort before those with one.
+ if post is None:
+ _post: CmpPrePostDevType = NegativeInfinity
+
+ else:
+ _post = post
+
+ # Versions without a development segment should sort after those with one.
+ if dev is None:
+ _dev: CmpPrePostDevType = Infinity
+
+ else:
+ _dev = dev
+
+ if local is None:
+ # Versions without a local segment should sort before those with one.
+ _local: CmpLocalType = NegativeInfinity
+ else:
+ # Versions with a local segment need that segment parsed to implement
+ # the sorting rules in PEP440.
+ # - Alpha numeric segments sort before numeric segments
+ # - Alpha numeric segments sort lexicographically
+ # - Numeric segments sort numerically
+ # - Shorter versions sort before longer versions when the prefixes
+ # match exactly
+ _local = tuple(
+ (i, "") if isinstance(i, int) else (NegativeInfinity, i) for i in local
+ )
+
+ return epoch, _release, _pre, _post, _dev, _local
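+
+
+# Illustrative sketch, not part of the upstream module: the composite key built above
+# yields the PEP 440 ordering, e.g.
+#
+#     sorted(Version(v) for v in ["1.0", "1.0.post1", "1.0a1", "1.0.dev0"])
+#     == [Version("1.0.dev0"), Version("1.0a1"), Version("1.0"), Version("1.0.post1")]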
diff --git a/venv/lib/python3.10/site-packages/pip/__init__.py b/venv/lib/python3.10/site-packages/pip/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..8a50472b239048703282f4e5d5659d89a54e83b2
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/__init__.py
@@ -0,0 +1,13 @@
+from typing import List, Optional
+
+__version__ = "22.0.2"
+
+
+def main(args: Optional[List[str]] = None) -> int:
+ """This is an internal API only meant for use by pip's own console scripts.
+
+ For additional details, see https://github.com/pypa/pip/issues/7498.
+ """
+ from pip._internal.utils.entrypoints import _wrapper
+
+ return _wrapper(args)
diff --git a/venv/lib/python3.10/site-packages/pip/__main__.py b/venv/lib/python3.10/site-packages/pip/__main__.py
new file mode 100644
index 0000000000000000000000000000000000000000..fe34a7b7772cef55f5b5cb3455a2850489620ca7
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/__main__.py
@@ -0,0 +1,31 @@
+import os
+import sys
+import warnings
+
+# Remove '' or the current working directory from the first entry of
+# sys.path, if present, to avoid using the current directory in pip
+# commands (check, freeze, install, list and show) when invoked as
+# python -m pip
+if sys.path[0] in ("", os.getcwd()):
+ sys.path.pop(0)
+
+# If we are running from a wheel, add the wheel to sys.path
+# This allows the usage python pip-*.whl/pip install pip-*.whl
+if __package__ == "":
+ # __file__ is pip-*.whl/pip/__main__.py
+ # first dirname call strips off '/__main__.py', second strips off '/pip'
+ # Resulting path is the name of the wheel itself
+ # Add that to sys.path so we can import pip
+ path = os.path.dirname(os.path.dirname(__file__))
+ sys.path.insert(0, path)
+
+if __name__ == "__main__":
+ # Work around the error reported in #9540, pending a proper fix.
+ # Note: It is essential the warning filter is set *before* importing
+ # pip, as the deprecation happens at import time, not runtime.
+ warnings.filterwarnings(
+ "ignore", category=DeprecationWarning, module=".*packaging\\.version"
+ )
+ from pip._internal.cli.main import main as _main
+
+ sys.exit(_main())
diff --git a/venv/lib/python3.10/site-packages/pip/__pycache__/__init__.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/__pycache__/__init__.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..c0f25e947f8baf685665a03a4c6fe25872a865e8
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/__pycache__/__init__.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/__pycache__/__main__.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/__pycache__/__main__.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..059bbbdb699cd9a857363666337924e7031143dc
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/__pycache__/__main__.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__init__.py b/venv/lib/python3.10/site-packages/pip/_internal/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..6afb5c627ce3db6e61cbf46276f7ddd42552eb28
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/__init__.py
@@ -0,0 +1,19 @@
+from typing import List, Optional
+
+import pip._internal.utils.inject_securetransport # noqa
+from pip._internal.utils import _log
+
+# init_logging() must be called before any call to logging.getLogger()
+# which happens at import of most modules.
+_log.init_logging()
+
+
+def main(args: (Optional[List[str]]) = None) -> int:
+ """This is preserved for old console scripts that may still be referencing
+ it.
+
+ For additional details, see https://github.com/pypa/pip/issues/7498.
+ """
+ from pip._internal.utils.entrypoints import _wrapper
+
+ return _wrapper(args)
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/__init__.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/__init__.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..86bbf73b74f28661161ff6d4b8ad7b11ae1bf805
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/__init__.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/build_env.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/build_env.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..35c8ba185c4ae06b45224c5b506bd630d2ab26ac
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/build_env.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/cache.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/cache.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..81315d70c86aa3662377134377478fc0ed3099f7
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/cache.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/configuration.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/configuration.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..86114ebd01020305098df7722841b7d3d7533062
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/configuration.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/exceptions.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/exceptions.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..77816e374a66d18dab052ecd6b84712643417dca
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/exceptions.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/main.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/main.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..1b7dfb7a6688ffcac8fc55ba7e8c7328c56ee708
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/main.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/pyproject.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/pyproject.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..fc7c278b825f93ec03bfee73524dbc86187fe243
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/pyproject.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/self_outdated_check.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/self_outdated_check.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..6552aef7e0718afc306f0635a0289289879966c7
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/self_outdated_check.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/wheel_builder.cpython-310.pyc b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/wheel_builder.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..45aea932f810d8ec9fd6f809c76b4f3dd3217f6b
Binary files /dev/null and b/venv/lib/python3.10/site-packages/pip/_internal/__pycache__/wheel_builder.cpython-310.pyc differ
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/build_env.py b/venv/lib/python3.10/site-packages/pip/_internal/build_env.py
new file mode 100644
index 0000000000000000000000000000000000000000..daeb7fbc8d7c32b0d0e7c2798dc1388c4e97f74d
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/build_env.py
@@ -0,0 +1,296 @@
+"""Build Environment used for isolation during sdist building
+"""
+
+import contextlib
+import logging
+import os
+import pathlib
+import sys
+import textwrap
+import zipfile
+from collections import OrderedDict
+from sysconfig import get_paths
+from types import TracebackType
+from typing import TYPE_CHECKING, Iterable, Iterator, List, Optional, Set, Tuple, Type
+
+from pip._vendor.certifi import where
+from pip._vendor.packaging.requirements import Requirement
+from pip._vendor.packaging.version import Version
+
+from pip import __file__ as pip_location
+from pip._internal.cli.spinners import open_spinner
+from pip._internal.locations import get_platlib, get_prefixed_libs, get_purelib
+from pip._internal.metadata import get_environment
+from pip._internal.utils.subprocess import call_subprocess
+from pip._internal.utils.temp_dir import TempDirectory, tempdir_kinds
+
+if TYPE_CHECKING:
+ from pip._internal.index.package_finder import PackageFinder
+
+logger = logging.getLogger(__name__)
+
+
+class _Prefix:
+ def __init__(self, path: str) -> None:
+ self.path = path
+ self.setup = False
+ self.bin_dir = get_paths(
+ "nt" if os.name == "nt" else "posix_prefix",
+ vars={"base": path, "platbase": path},
+ )["scripts"]
+ self.lib_dirs = get_prefixed_libs(path)
+
+
+@contextlib.contextmanager
+def _create_standalone_pip() -> Iterator[str]:
+ """Create a "standalone pip" zip file.
+
+ The zip file's content is identical to the currently-running pip.
+ It will be used to install requirements into the build environment.
+ """
+ source = pathlib.Path(pip_location).resolve().parent
+
+ # Return the current instance if `source` is not a directory. We can't build
+ # a zip from this, and it likely means the instance is already standalone.
+ if not source.is_dir():
+ yield str(source)
+ return
+
+ with TempDirectory(kind="standalone-pip") as tmp_dir:
+ pip_zip = os.path.join(tmp_dir.path, "__env_pip__.zip")
+ kwargs = {}
+ if sys.version_info >= (3, 8):
+ kwargs["strict_timestamps"] = False
+ with zipfile.ZipFile(pip_zip, "w", **kwargs) as zf:
+ for child in source.rglob("*"):
+ zf.write(child, child.relative_to(source.parent).as_posix())
+ yield os.path.join(pip_zip, "pip")
+
+
+class BuildEnvironment:
+ """Creates and manages an isolated environment to install build deps"""
+
+ def __init__(self) -> None:
+ temp_dir = TempDirectory(kind=tempdir_kinds.BUILD_ENV, globally_managed=True)
+
+ self._prefixes = OrderedDict(
+ (name, _Prefix(os.path.join(temp_dir.path, name)))
+ for name in ("normal", "overlay")
+ )
+
+ self._bin_dirs: List[str] = []
+ self._lib_dirs: List[str] = []
+ for prefix in reversed(list(self._prefixes.values())):
+ self._bin_dirs.append(prefix.bin_dir)
+ self._lib_dirs.extend(prefix.lib_dirs)
+
+ # Customize site to:
+ # - ensure .pth files are honored
+ # - prevent access to system site packages
+ system_sites = {
+ os.path.normcase(site) for site in (get_purelib(), get_platlib())
+ }
+ self._site_dir = os.path.join(temp_dir.path, "site")
+ if not os.path.exists(self._site_dir):
+ os.mkdir(self._site_dir)
+ with open(
+ os.path.join(self._site_dir, "sitecustomize.py"), "w", encoding="utf-8"
+ ) as fp:
+ fp.write(
+ textwrap.dedent(
+ """
+ import os, site, sys
+
+ # First, drop system-sites related paths.
+ original_sys_path = sys.path[:]
+ known_paths = set()
+ for path in {system_sites!r}:
+ site.addsitedir(path, known_paths=known_paths)
+ system_paths = set(
+ os.path.normcase(path)
+ for path in sys.path[len(original_sys_path):]
+ )
+ original_sys_path = [
+ path for path in original_sys_path
+ if os.path.normcase(path) not in system_paths
+ ]
+ sys.path = original_sys_path
+
+ # Second, add lib directories.
+ # ensuring .pth files are processed.
+ for path in {lib_dirs!r}:
+ assert path not in sys.path
+ site.addsitedir(path)
+ """
+ ).format(system_sites=system_sites, lib_dirs=self._lib_dirs)
+ )
+
+ def __enter__(self) -> None:
+ self._save_env = {
+ name: os.environ.get(name, None)
+ for name in ("PATH", "PYTHONNOUSERSITE", "PYTHONPATH")
+ }
+
+ path = self._bin_dirs[:]
+ old_path = self._save_env["PATH"]
+ if old_path:
+ path.extend(old_path.split(os.pathsep))
+
+ pythonpath = [self._site_dir]
+
+ os.environ.update(
+ {
+ "PATH": os.pathsep.join(path),
+ "PYTHONNOUSERSITE": "1",
+ "PYTHONPATH": os.pathsep.join(pythonpath),
+ }
+ )
+
+ def __exit__(
+ self,
+ exc_type: Optional[Type[BaseException]],
+ exc_val: Optional[BaseException],
+ exc_tb: Optional[TracebackType],
+ ) -> None:
+ for varname, old_value in self._save_env.items():
+ if old_value is None:
+ os.environ.pop(varname, None)
+ else:
+ os.environ[varname] = old_value
+
+ def check_requirements(
+ self, reqs: Iterable[str]
+ ) -> Tuple[Set[Tuple[str, str]], Set[str]]:
+ """Return 2 sets:
+ - conflicting requirements: set of (installed, wanted) reqs tuples
+ - missing requirements: set of reqs
+ """
+ missing = set()
+ conflicting = set()
+ if reqs:
+ env = get_environment(self._lib_dirs)
+ for req_str in reqs:
+ req = Requirement(req_str)
+ dist = env.get_distribution(req.name)
+ if not dist:
+ missing.add(req_str)
+ continue
+ if isinstance(dist.version, Version):
+ installed_req_str = f"{req.name}=={dist.version}"
+ else:
+ installed_req_str = f"{req.name}==={dist.version}"
+ if dist.version not in req.specifier:
+ conflicting.add((installed_req_str, req_str))
+ # FIXME: Consider direct URL?
+ return conflicting, missing
+
+ def install_requirements(
+ self,
+ finder: "PackageFinder",
+ requirements: Iterable[str],
+ prefix_as_string: str,
+ *,
+ kind: str,
+ ) -> None:
+ prefix = self._prefixes[prefix_as_string]
+ assert not prefix.setup
+ prefix.setup = True
+ if not requirements:
+ return
+ with contextlib.ExitStack() as ctx:
+ pip_runnable = ctx.enter_context(_create_standalone_pip())
+ self._install_requirements(
+ pip_runnable,
+ finder,
+ requirements,
+ prefix,
+ kind=kind,
+ )
+
+ @staticmethod
+ def _install_requirements(
+ pip_runnable: str,
+ finder: "PackageFinder",
+ requirements: Iterable[str],
+ prefix: _Prefix,
+ *,
+ kind: str,
+ ) -> None:
+ args: List[str] = [
+ sys.executable,
+ pip_runnable,
+ "install",
+ "--ignore-installed",
+ "--no-user",
+ "--prefix",
+ prefix.path,
+ "--no-warn-script-location",
+ ]
+ if logger.getEffectiveLevel() <= logging.DEBUG:
+ args.append("-v")
+ for format_control in ("no_binary", "only_binary"):
+ formats = getattr(finder.format_control, format_control)
+ args.extend(
+ (
+ "--" + format_control.replace("_", "-"),
+ ",".join(sorted(formats or {":none:"})),
+ )
+ )
+
+ index_urls = finder.index_urls
+ if index_urls:
+ args.extend(["-i", index_urls[0]])
+ for extra_index in index_urls[1:]:
+ args.extend(["--extra-index-url", extra_index])
+ else:
+ args.append("--no-index")
+ for link in finder.find_links:
+ args.extend(["--find-links", link])
+
+ for host in finder.trusted_hosts:
+ args.extend(["--trusted-host", host])
+ if finder.allow_all_prereleases:
+ args.append("--pre")
+ if finder.prefer_binary:
+ args.append("--prefer-binary")
+ args.append("--")
+ args.extend(requirements)
+ extra_environ = {"_PIP_STANDALONE_CERT": where()}
+ with open_spinner(f"Installing {kind}") as spinner:
+ call_subprocess(
+ args,
+ command_desc=f"pip subprocess to install {kind}",
+ spinner=spinner,
+ extra_environ=extra_environ,
+ )
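+
+
+# Illustrative sketch, not part of the upstream module: entering the environment
+# temporarily rewrites PATH, PYTHONPATH and PYTHONNOUSERSITE and restores them on exit,
+# e.g.
+#
+#     env = BuildEnvironment()
+#     with env:
+#         ...  # subprocesses launched here resolve imports from the two prefixes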
+
+
+class NoOpBuildEnvironment(BuildEnvironment):
+ """A no-op drop-in replacement for BuildEnvironment"""
+
+ def __init__(self) -> None:
+ pass
+
+ def __enter__(self) -> None:
+ pass
+
+ def __exit__(
+ self,
+ exc_type: Optional[Type[BaseException]],
+ exc_val: Optional[BaseException],
+ exc_tb: Optional[TracebackType],
+ ) -> None:
+ pass
+
+ def cleanup(self) -> None:
+ pass
+
+ def install_requirements(
+ self,
+ finder: "PackageFinder",
+ requirements: Iterable[str],
+ prefix_as_string: str,
+ *,
+ kind: str,
+ ) -> None:
+ raise NotImplementedError()
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cache.py b/venv/lib/python3.10/site-packages/pip/_internal/cache.py
new file mode 100644
index 0000000000000000000000000000000000000000..1d6df2201183a5786afbbcc96486e565ef90e5e0
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cache.py
@@ -0,0 +1,264 @@
+"""Cache Management
+"""
+
+import hashlib
+import json
+import logging
+import os
+from typing import Any, Dict, List, Optional, Set
+
+from pip._vendor.packaging.tags import Tag, interpreter_name, interpreter_version
+from pip._vendor.packaging.utils import canonicalize_name
+
+from pip._internal.exceptions import InvalidWheelFilename
+from pip._internal.models.format_control import FormatControl
+from pip._internal.models.link import Link
+from pip._internal.models.wheel import Wheel
+from pip._internal.utils.temp_dir import TempDirectory, tempdir_kinds
+from pip._internal.utils.urls import path_to_url
+
+logger = logging.getLogger(__name__)
+
+
+def _hash_dict(d: Dict[str, str]) -> str:
+ """Return a stable sha224 of a dictionary."""
+ s = json.dumps(d, sort_keys=True, separators=(",", ":"), ensure_ascii=True)
+ return hashlib.sha224(s.encode("ascii")).hexdigest()
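+
+
+# Illustrative sketch, not part of the upstream module: because the JSON dump is sorted
+# and compact, key order does not change the digest, e.g.
+#
+#     _hash_dict({"a": "1", "b": "2"}) == _hash_dict({"b": "2", "a": "1"})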
+
+
+class Cache:
+ """An abstract class - provides cache directories for data from links
+
+
+ :param cache_dir: The root of the cache.
+ :param format_control: An object of FormatControl class to limit
+ binaries being read from the cache.
+ :param allowed_formats: which formats of files the cache should store.
+ ('binary' and 'source' are the only allowed values)
+ """
+
+ def __init__(
+ self, cache_dir: str, format_control: FormatControl, allowed_formats: Set[str]
+ ) -> None:
+ super().__init__()
+ assert not cache_dir or os.path.isabs(cache_dir)
+ self.cache_dir = cache_dir or None
+ self.format_control = format_control
+ self.allowed_formats = allowed_formats
+
+ _valid_formats = {"source", "binary"}
+ assert self.allowed_formats.union(_valid_formats) == _valid_formats
+
+ def _get_cache_path_parts(self, link: Link) -> List[str]:
+ """Get parts of part that must be os.path.joined with cache_dir"""
+
+ # We want to generate a URL to use as our cache key; we don't want to
+ # just re-use the link's URL because it might have other items in the
+ # fragment that we don't care about.
+ key_parts = {"url": link.url_without_fragment}
+ if link.hash_name is not None and link.hash is not None:
+ key_parts[link.hash_name] = link.hash
+ if link.subdirectory_fragment:
+ key_parts["subdirectory"] = link.subdirectory_fragment
+
+ # Include interpreter name, major and minor version in cache key
+ # to cope with ill-behaved sdists that build a different wheel
+ # depending on the python version their setup.py is being run on,
+ # and don't encode the difference in compatibility tags.
+ # https://github.com/pypa/pip/issues/7296
+ key_parts["interpreter_name"] = interpreter_name()
+ key_parts["interpreter_version"] = interpreter_version()
+
+ # Encode our key URL with sha224; it has similar security properties to
+ # sha256 but a shorter digest, and the difference does not matter for our
+ # use case here.
+ hashed = _hash_dict(key_parts)
+
+ # Nest the directories to avoid creating a huge number of top-level
+ # directories, since some filesystems limit the number of entries per
+ # directory.
+ parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]]
+
+ return parts
+
+ def _get_candidates(self, link: Link, canonical_package_name: str) -> List[Any]:
+ can_not_cache = not self.cache_dir or not canonical_package_name or not link
+ if can_not_cache:
+ return []
+
+ formats = self.format_control.get_allowed_formats(canonical_package_name)
+ if not self.allowed_formats.intersection(formats):
+ return []
+
+ candidates = []
+ path = self.get_path_for_link(link)
+ if os.path.isdir(path):
+ for candidate in os.listdir(path):
+ candidates.append((candidate, path))
+ return candidates
+
+ def get_path_for_link(self, link: Link) -> str:
+ """Return a directory to store cached items in for link."""
+ raise NotImplementedError()
+
+ def get(
+ self,
+ link: Link,
+ package_name: Optional[str],
+ supported_tags: List[Tag],
+ ) -> Link:
+ """Returns a link to a cached item if it exists, otherwise returns the
+ passed link.
+ """
+ raise NotImplementedError()
+
+
+class SimpleWheelCache(Cache):
+ """A cache of wheels for future installs."""
+
+ def __init__(self, cache_dir: str, format_control: FormatControl) -> None:
+ super().__init__(cache_dir, format_control, {"binary"})
+
+ def get_path_for_link(self, link: Link) -> str:
+ """Return a directory to store cached wheels for link
+
+ Because there are M wheels for any one sdist, we provide a directory
+ to cache them in, and then consult that directory when looking up
+ cache hits.
+
+ We only insert things into the cache if they have plausible version
+ numbers, so that we don't contaminate the cache with things that were
+ not unique. E.g. ./package might have dozens of installs done for it
+ and build a version of 0.0...and if we built and cached a wheel, we'd
+ end up using the same wheel even if the source has been edited.
+
+ :param link: The link of the sdist for which this will cache wheels.
+ """
+ parts = self._get_cache_path_parts(link)
+ assert self.cache_dir
+ # Store wheels within the root cache_dir
+ return os.path.join(self.cache_dir, "wheels", *parts)
+
+ def get(
+ self,
+ link: Link,
+ package_name: Optional[str],
+ supported_tags: List[Tag],
+ ) -> Link:
+ candidates = []
+
+ if not package_name:
+ return link
+
+ canonical_package_name = canonicalize_name(package_name)
+ for wheel_name, wheel_dir in self._get_candidates(link, canonical_package_name):
+ try:
+ wheel = Wheel(wheel_name)
+ except InvalidWheelFilename:
+ continue
+ if canonicalize_name(wheel.name) != canonical_package_name:
+ logger.debug(
+ "Ignoring cached wheel %s for %s as it "
+ "does not match the expected distribution name %s.",
+ wheel_name,
+ link,
+ package_name,
+ )
+ continue
+ if not wheel.supported(supported_tags):
+ # Built for a different python/arch/etc
+ continue
+ candidates.append(
+ (
+ wheel.support_index_min(supported_tags),
+ wheel_name,
+ wheel_dir,
+ )
+ )
+
+ if not candidates:
+ return link
+
+ _, wheel_name, wheel_dir = min(candidates)
+ return Link(path_to_url(os.path.join(wheel_dir, wheel_name)))
+
+
+class EphemWheelCache(SimpleWheelCache):
+ """A SimpleWheelCache that creates it's own temporary cache directory"""
+
+ def __init__(self, format_control: FormatControl) -> None:
+ self._temp_dir = TempDirectory(
+ kind=tempdir_kinds.EPHEM_WHEEL_CACHE,
+ globally_managed=True,
+ )
+
+ super().__init__(self._temp_dir.path, format_control)
+
+
+class CacheEntry:
+ def __init__(
+ self,
+ link: Link,
+ persistent: bool,
+ ):
+ self.link = link
+ self.persistent = persistent
+
+
+class WheelCache(Cache):
+ """Wraps EphemWheelCache and SimpleWheelCache into a single Cache
+
+ This Cache allows for graceful degradation, falling back to the ephem wheel
+ cache when a link is not found in the simple wheel cache.
+ """
+
+ def __init__(self, cache_dir: str, format_control: FormatControl) -> None:
+ super().__init__(cache_dir, format_control, {"binary"})
+ self._wheel_cache = SimpleWheelCache(cache_dir, format_control)
+ self._ephem_cache = EphemWheelCache(format_control)
+
+ def get_path_for_link(self, link: Link) -> str:
+ return self._wheel_cache.get_path_for_link(link)
+
+ def get_ephem_path_for_link(self, link: Link) -> str:
+ return self._ephem_cache.get_path_for_link(link)
+
+ def get(
+ self,
+ link: Link,
+ package_name: Optional[str],
+ supported_tags: List[Tag],
+ ) -> Link:
+ cache_entry = self.get_cache_entry(link, package_name, supported_tags)
+ if cache_entry is None:
+ return link
+ return cache_entry.link
+
+ def get_cache_entry(
+ self,
+ link: Link,
+ package_name: Optional[str],
+ supported_tags: List[Tag],
+ ) -> Optional[CacheEntry]:
+ """Returns a CacheEntry with a link to a cached item if it exists or
+ None. The cache entry indicates if the item was found in the persistent
+ or ephemeral cache.
+ """
+ retval = self._wheel_cache.get(
+ link=link,
+ package_name=package_name,
+ supported_tags=supported_tags,
+ )
+ if retval is not link:
+ return CacheEntry(retval, persistent=True)
+
+ retval = self._ephem_cache.get(
+ link=link,
+ package_name=package_name,
+ supported_tags=supported_tags,
+ )
+ if retval is not link:
+ return CacheEntry(retval, persistent=False)
+
+ return None
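+
+
+# Illustrative sketch, not part of the upstream module: ``get_cache_entry`` prefers the
+# persistent cache and only then consults the ephemeral one. The names below are
+# hypothetical.
+#
+#     entry = wheel_cache.get_cache_entry(link, "requests", supported_tags)
+#     if entry is not None and not entry.persistent:
+#         ...  # the hit came from the per-invocation ephem cache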
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cli/__init__.py b/venv/lib/python3.10/site-packages/pip/_internal/cli/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e589bb917e23823e25f9fff7e0849c4d6d4a62bc
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cli/__init__.py
@@ -0,0 +1,4 @@
+"""Subpackage containing all of pip's command line interface related code
+"""
+
+# This file intentionally does not import submodules
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cli/autocompletion.py b/venv/lib/python3.10/site-packages/pip/_internal/cli/autocompletion.py
new file mode 100644
index 0000000000000000000000000000000000000000..226fe84dc0d0c4eb78f9b3c603df20cef0fdfda4
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cli/autocompletion.py
@@ -0,0 +1,171 @@
+"""Logic that powers autocompletion installed by ``pip completion``.
+"""
+
+import optparse
+import os
+import sys
+from itertools import chain
+from typing import Any, Iterable, List, Optional
+
+from pip._internal.cli.main_parser import create_main_parser
+from pip._internal.commands import commands_dict, create_command
+from pip._internal.metadata import get_default_environment
+
+
+def autocomplete() -> None:
+ """Entry Point for completion of main and subcommand options."""
+ # Don't complete if user hasn't sourced bash_completion file.
+ if "PIP_AUTO_COMPLETE" not in os.environ:
+ return
+ cwords = os.environ["COMP_WORDS"].split()[1:]
+ cword = int(os.environ["COMP_CWORD"])
+ try:
+ current = cwords[cword - 1]
+ except IndexError:
+ current = ""
+
+ parser = create_main_parser()
+ subcommands = list(commands_dict)
+ options = []
+
+ # subcommand
+ subcommand_name: Optional[str] = None
+ for word in cwords:
+ if word in subcommands:
+ subcommand_name = word
+ break
+ # subcommand options
+ if subcommand_name is not None:
+ # special case: 'help' subcommand has no options
+ if subcommand_name == "help":
+ sys.exit(1)
+ # special case: list locally installed dists for show and uninstall
+ should_list_installed = not current.startswith("-") and subcommand_name in [
+ "show",
+ "uninstall",
+ ]
+ if should_list_installed:
+ env = get_default_environment()
+ lc = current.lower()
+ installed = [
+ dist.canonical_name
+ for dist in env.iter_installed_distributions(local_only=True)
+ if dist.canonical_name.startswith(lc)
+ and dist.canonical_name not in cwords[1:]
+ ]
+ # if there are no dists installed, fall back to option completion
+ if installed:
+ for dist in installed:
+ print(dist)
+ sys.exit(1)
+
+ should_list_installables = (
+ not current.startswith("-") and subcommand_name == "install"
+ )
+ if should_list_installables:
+ for path in auto_complete_paths(current, "path"):
+ print(path)
+ sys.exit(1)
+
+ subcommand = create_command(subcommand_name)
+
+ for opt in subcommand.parser.option_list_all:
+ if opt.help != optparse.SUPPRESS_HELP:
+ for opt_str in opt._long_opts + opt._short_opts:
+ options.append((opt_str, opt.nargs))
+
+ # filter out previously specified options from available options
+ prev_opts = [x.split("=")[0] for x in cwords[1 : cword - 1]]
+ options = [(x, v) for (x, v) in options if x not in prev_opts]
+ # filter options by current input
+ options = [(k, v) for k, v in options if k.startswith(current)]
+ # get completion type given cwords and available subcommand options
+ completion_type = get_path_completion_type(
+ cwords,
+ cword,
+ subcommand.parser.option_list_all,
+ )
+ # get completion files and directories if ``completion_type`` is
+ # ``<file>``, ``<dir>`` or ``<path>``
+ if completion_type:
+ paths = auto_complete_paths(current, completion_type)
+ options = [(path, 0) for path in paths]
+ for option in options:
+ opt_label = option[0]
+ # append '=' to options which require args
+ if option[1] and option[0][:2] == "--":
+ opt_label += "="
+ print(opt_label)
+ else:
+ # show main parser options only when necessary
+
+ opts = [i.option_list for i in parser.option_groups]
+ opts.append(parser.option_list)
+ flattened_opts = chain.from_iterable(opts)
+ if current.startswith("-"):
+ for opt in flattened_opts:
+ if opt.help != optparse.SUPPRESS_HELP:
+ subcommands += opt._long_opts + opt._short_opts
+ else:
+ # get completion type given cwords and all available options
+ completion_type = get_path_completion_type(cwords, cword, flattened_opts)
+ if completion_type:
+ subcommands = list(auto_complete_paths(current, completion_type))
+
+ print(" ".join([x for x in subcommands if x.startswith(current)]))
+ sys.exit(1)
+
+
+def get_path_completion_type(
+ cwords: List[str], cword: int, opts: Iterable[Any]
+) -> Optional[str]:
+ """Get the type of path completion (``file``, ``dir``, ``path`` or None)
+
+ :param cwords: same as the environment variable ``COMP_WORDS``
+ :param cword: same as the environment variable ``COMP_CWORD``
+ :param opts: The available options to check
+ :return: path completion type (``file``, ``dir``, ``path`` or None)
+ """
+ if cword < 2 or not cwords[cword - 2].startswith("-"):
+ return None
+ for opt in opts:
+ if opt.help == optparse.SUPPRESS_HELP:
+ continue
+ for o in str(opt).split("/"):
+ if cwords[cword - 2].split("=")[0] == o:
+ if not opt.metavar or any(
+ x in ("path", "file", "dir") for x in opt.metavar.split("/")
+ ):
+ return opt.metavar
+ return None
+
+
+def auto_complete_paths(current: str, completion_type: str) -> Iterable[str]:
+ """If ``completion_type`` is ``file`` or ``path``, list all regular files
+ and directories starting with ``current``; otherwise only list directories
+ starting with ``current``.
+
+ :param current: The word to be completed
+ :param completion_type: path completion type (``file``, ``path`` or ``dir``)
+ :return: A generator of regular files and/or directories
+ """
+ directory, filename = os.path.split(current)
+ current_path = os.path.abspath(directory)
+ # Don't complete paths if they can't be accessed
+ if not os.access(current_path, os.R_OK):
+ return
+ filename = os.path.normcase(filename)
+ # list all files that start with ``filename``
+ file_list = (
+ x for x in os.listdir(current_path) if os.path.normcase(x).startswith(filename)
+ )
+ for f in file_list:
+ opt = os.path.join(current_path, f)
+ comp_file = os.path.normcase(os.path.join(directory, f))
+ # complete regular files when there is not ``<dir>`` after option
+ # complete directories when there is ``<file>``, ``<path>`` or
+ # ``<file>/<path>`` after option
+ if completion_type != "dir" and os.path.isfile(opt):
+ yield comp_file
+ elif os.path.isdir(opt):
+ yield os.path.join(comp_file, "")
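+
+
+# Illustrative sketch, not part of the upstream module: assuming a directory ``./src``
+# that contains ``main.py`` and a ``models`` subdirectory, path completion on a POSIX
+# filesystem would behave like
+#
+#     sorted(auto_complete_paths("./src/ma", "path"))
+#     == ["./src/main.py", "./src/models/"]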
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cli/base_command.py b/venv/lib/python3.10/site-packages/pip/_internal/cli/base_command.py
new file mode 100644
index 0000000000000000000000000000000000000000..f5dc0fecf788a7f71a690ff2a7ed86f755f29113
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cli/base_command.py
@@ -0,0 +1,220 @@
+"""Base Command class, and related routines"""
+
+import functools
+import logging
+import logging.config
+import optparse
+import os
+import sys
+import traceback
+from optparse import Values
+from typing import Any, Callable, List, Optional, Tuple
+
+from pip._internal.cli import cmdoptions
+from pip._internal.cli.command_context import CommandContextMixIn
+from pip._internal.cli.parser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
+from pip._internal.cli.status_codes import (
+ ERROR,
+ PREVIOUS_BUILD_DIR_ERROR,
+ UNKNOWN_ERROR,
+ VIRTUALENV_NOT_FOUND,
+)
+from pip._internal.exceptions import (
+ BadCommand,
+ CommandError,
+ DiagnosticPipError,
+ InstallationError,
+ NetworkConnectionError,
+ PreviousBuildDirError,
+ UninstallationError,
+)
+from pip._internal.utils.filesystem import check_path_owner
+from pip._internal.utils.logging import BrokenStdoutLoggingError, setup_logging
+from pip._internal.utils.misc import get_prog, normalize_path
+from pip._internal.utils.temp_dir import TempDirectoryTypeRegistry as TempDirRegistry
+from pip._internal.utils.temp_dir import global_tempdir_manager, tempdir_registry
+from pip._internal.utils.virtualenv import running_under_virtualenv
+
+__all__ = ["Command"]
+
+logger = logging.getLogger(__name__)
+
+
+class Command(CommandContextMixIn):
+ usage: str = ""
+ ignore_require_venv: bool = False
+
+ def __init__(self, name: str, summary: str, isolated: bool = False) -> None:
+ super().__init__()
+
+ self.name = name
+ self.summary = summary
+ self.parser = ConfigOptionParser(
+ usage=self.usage,
+ prog=f"{get_prog()} {name}",
+ formatter=UpdatingDefaultsHelpFormatter(),
+ add_help_option=False,
+ name=name,
+ description=self.__doc__,
+ isolated=isolated,
+ )
+
+ self.tempdir_registry: Optional[TempDirRegistry] = None
+
+ # Commands should add options to this option group
+ optgroup_name = f"{self.name.capitalize()} Options"
+ self.cmd_opts = optparse.OptionGroup(self.parser, optgroup_name)
+
+ # Add the general options
+ gen_opts = cmdoptions.make_option_group(
+ cmdoptions.general_group,
+ self.parser,
+ )
+ self.parser.add_option_group(gen_opts)
+
+ self.add_options()
+
+ def add_options(self) -> None:
+ pass
+
+ def handle_pip_version_check(self, options: Values) -> None:
+ """
+ This is a no-op so that commands by default do not do the pip version
+ check.
+ """
+ # Make sure we do the pip version check if the index_group options
+ # are present.
+ assert not hasattr(options, "no_index")
+
+ def run(self, options: Values, args: List[str]) -> int:
+ raise NotImplementedError
+
+ def parse_args(self, args: List[str]) -> Tuple[Values, List[str]]:
+ # factored out for testability
+ return self.parser.parse_args(args)
+
+ def main(self, args: List[str]) -> int:
+ try:
+ with self.main_context():
+ return self._main(args)
+ finally:
+ logging.shutdown()
+
+ def _main(self, args: List[str]) -> int:
+ # We must initialize this before the tempdir manager, otherwise the
+ # configuration would not be accessible by the time we clean up the
+ # tempdir manager.
+ self.tempdir_registry = self.enter_context(tempdir_registry())
+ # Intentionally set as early as possible so globally-managed temporary
+ # directories are available to the rest of the code.
+ self.enter_context(global_tempdir_manager())
+
+ options, args = self.parse_args(args)
+
+ # Set verbosity so that it can be used elsewhere.
+ self.verbosity = options.verbose - options.quiet
+
+ level_number = setup_logging(
+ verbosity=self.verbosity,
+ no_color=options.no_color,
+ user_log_file=options.log,
+ )
+
+ # TODO: Try to get these passing down from the command?
+ # without resorting to os.environ to hold these.
+ # This also affects isolated builds and it should.
+
+ if options.no_input:
+ os.environ["PIP_NO_INPUT"] = "1"
+
+ if options.exists_action:
+ os.environ["PIP_EXISTS_ACTION"] = " ".join(options.exists_action)
+
+ if options.require_venv and not self.ignore_require_venv:
+ # If a venv is required check if it can really be found
+ if not running_under_virtualenv():
+ logger.critical("Could not find an activated virtualenv (required).")
+ sys.exit(VIRTUALENV_NOT_FOUND)
+
+ if options.cache_dir:
+ options.cache_dir = normalize_path(options.cache_dir)
+ if not check_path_owner(options.cache_dir):
+ logger.warning(
+ "The directory '%s' or its parent directory is not owned "
+ "or is not writable by the current user. The cache "
+ "has been disabled. Check the permissions and owner of "
+ "that directory. If executing pip with sudo, you should "
+ "use sudo's -H flag.",
+ options.cache_dir,
+ )
+ options.cache_dir = None
+
+ if "2020-resolver" in options.features_enabled:
+ logger.warning(
+ "--use-feature=2020-resolver no longer has any effect, "
+ "since it is now the default dependency resolver in pip. "
+ "This will become an error in pip 21.0."
+ )
+
+ def intercepts_unhandled_exc(
+ run_func: Callable[..., int]
+ ) -> Callable[..., int]:
+ @functools.wraps(run_func)
+ def exc_logging_wrapper(*args: Any) -> int:
+ try:
+ status = run_func(*args)
+ assert isinstance(status, int)
+ return status
+ except DiagnosticPipError as exc:
+ logger.error("[present-diagnostic] %s", exc)
+ logger.debug("Exception information:", exc_info=True)
+
+ return ERROR
+ except PreviousBuildDirError as exc:
+ logger.critical(str(exc))
+ logger.debug("Exception information:", exc_info=True)
+
+ return PREVIOUS_BUILD_DIR_ERROR
+ except (
+ InstallationError,
+ UninstallationError,
+ BadCommand,
+ NetworkConnectionError,
+ ) as exc:
+ logger.critical(str(exc))
+ logger.debug("Exception information:", exc_info=True)
+
+ return ERROR
+ except CommandError as exc:
+ logger.critical("%s", exc)
+ logger.debug("Exception information:", exc_info=True)
+
+ return ERROR
+ except BrokenStdoutLoggingError:
+ # Bypass our logger and write any remaining messages to
+ # stderr because stdout no longer works.
+ print("ERROR: Pipe to stdout was broken", file=sys.stderr)
+ if level_number <= logging.DEBUG:
+ traceback.print_exc(file=sys.stderr)
+
+ return ERROR
+ except KeyboardInterrupt:
+ logger.critical("Operation cancelled by user")
+ logger.debug("Exception information:", exc_info=True)
+
+ return ERROR
+ except BaseException:
+ logger.critical("Exception:", exc_info=True)
+
+ return UNKNOWN_ERROR
+
+ return exc_logging_wrapper
+
+ try:
+ if not options.debug_mode:
+ run = intercepts_unhandled_exc(self.run)
+ else:
+ run = self.run
+ return run(options, args)
+ finally:
+ self.handle_pip_version_check(options)
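
As a rough illustration of the `Command` API above (a sketch, not one of pip's real commands), a subclass only declares its options in `add_options()` and implements `run()`; `main()` then handles argument parsing, logging setup and exception translation. `SUCCESS` is assumed to be the usual `0` exit code from `pip._internal.cli.status_codes`.

```python
from optparse import Values
from typing import List

from pip._internal.cli.base_command import Command
from pip._internal.cli.status_codes import SUCCESS  # assumed: 0


class HelloCommand(Command):
    """Print a greeting (hypothetical example command)."""

    usage = "%prog [options]"
    ignore_require_venv = True  # do not insist on an active virtualenv

    def add_options(self) -> None:
        # Command-specific options go into self.cmd_opts ...
        self.cmd_opts.add_option(
            "--name", dest="name", default="world", help="Who to greet."
        )
        # ... and the group is inserted ahead of the general options.
        self.parser.insert_option_group(0, self.cmd_opts)

    def run(self, options: Values, args: List[str]) -> int:
        print(f"Hello, {options.name}!")
        return SUCCESS


# main() parses args, sets up logging/tempdirs, then dispatches to run().
exit_code = HelloCommand("hello", "Print a greeting").main(["--name", "pip"])
```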
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cli/cmdoptions.py b/venv/lib/python3.10/site-packages/pip/_internal/cli/cmdoptions.py
new file mode 100644
index 0000000000000000000000000000000000000000..b7e54f7c63c3355b5e2da338034f249b2e1c9e38
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cli/cmdoptions.py
@@ -0,0 +1,1018 @@
+"""
+shared options and groups
+
+The principle here is to define options once, but *not* instantiate them
+globally. One reason being that options with action='append' can carry state
+between parses. pip parses general options twice internally, and shouldn't
+pass on state. To be consistent, all options will follow this design.
+"""
+
+# The following comment should be removed at some point in the future.
+# mypy: strict-optional=False
+
+import logging
+import os
+import textwrap
+from functools import partial
+from optparse import SUPPRESS_HELP, Option, OptionGroup, OptionParser, Values
+from textwrap import dedent
+from typing import Any, Callable, Dict, Optional, Tuple
+
+from pip._vendor.packaging.utils import canonicalize_name
+
+from pip._internal.cli.parser import ConfigOptionParser
+from pip._internal.cli.progress_bars import BAR_TYPES
+from pip._internal.exceptions import CommandError
+from pip._internal.locations import USER_CACHE_DIR, get_src_prefix
+from pip._internal.models.format_control import FormatControl
+from pip._internal.models.index import PyPI
+from pip._internal.models.target_python import TargetPython
+from pip._internal.utils.hashes import STRONG_HASHES
+from pip._internal.utils.misc import strtobool
+
+logger = logging.getLogger(__name__)
+
+
+def raise_option_error(parser: OptionParser, option: Option, msg: str) -> None:
+ """
+ Raise an option parsing error using parser.error().
+
+ Args:
+ parser: an OptionParser instance.
+ option: an Option instance.
+ msg: the error text.
+ """
+ msg = f"{option} error: {msg}"
+ msg = textwrap.fill(" ".join(msg.split()))
+ parser.error(msg)
+
+
+def make_option_group(group: Dict[str, Any], parser: ConfigOptionParser) -> OptionGroup:
+ """
+ Return an OptionGroup object
+ group -- assumed to be dict with 'name' and 'options' keys
+ parser -- an optparse Parser
+ """
+ option_group = OptionGroup(parser, group["name"])
+ for option in group["options"]:
+ option_group.add_option(option())
+ return option_group
+
+
+def check_install_build_global(
+ options: Values, check_options: Optional[Values] = None
+) -> None:
+ """Disable wheels if per-setup.py call options are set.
+
+ :param options: The OptionParser options to update.
+ :param check_options: The options to check, if not supplied defaults to
+ options.
+ """
+ if check_options is None:
+ check_options = options
+
+ def getname(n: str) -> Optional[Any]:
+ return getattr(check_options, n, None)
+
+ names = ["build_options", "global_options", "install_options"]
+ if any(map(getname, names)):
+ control = options.format_control
+ control.disallow_binaries()
+ logger.warning(
+ "Disabling all use of wheels due to the use of --build-option "
+ "/ --global-option / --install-option.",
+ )
+
+
+def check_dist_restriction(options: Values, check_target: bool = False) -> None:
+ """Function for determining if custom platform options are allowed.
+
+ :param options: The OptionParser options.
+ :param check_target: Whether or not to check if --target is being used.
+ """
+ dist_restriction_set = any(
+ [
+ options.python_version,
+ options.platforms,
+ options.abis,
+ options.implementation,
+ ]
+ )
+
+ binary_only = FormatControl(set(), {":all:"})
+ sdist_dependencies_allowed = (
+ options.format_control != binary_only and not options.ignore_dependencies
+ )
+
+ # Installations or downloads using dist restrictions must not combine
+ # source distributions and dist-specific wheels, as they are not
+ # guaranteed to be locally compatible.
+ if dist_restriction_set and sdist_dependencies_allowed:
+ raise CommandError(
+ "When restricting platform and interpreter constraints using "
+ "--python-version, --platform, --abi, or --implementation, "
+ "either --no-deps must be set, or --only-binary=:all: must be "
+ "set and --no-binary must not be set (or must be set to "
+ ":none:)."
+ )
+
+ if check_target:
+ if dist_restriction_set and not options.target_dir:
+ raise CommandError(
+ "Can not use any platform or abi specific options unless "
+ "installing via '--target'"
+ )
+
+
+def _path_option_check(option: Option, opt: str, value: str) -> str:
+ return os.path.expanduser(value)
+
+
+def _package_name_option_check(option: Option, opt: str, value: str) -> str:
+ return canonicalize_name(value)
+
+
+class PipOption(Option):
+ TYPES = Option.TYPES + ("path", "package_name")
+ TYPE_CHECKER = Option.TYPE_CHECKER.copy()
+ TYPE_CHECKER["package_name"] = _package_name_option_check
+ TYPE_CHECKER["path"] = _path_option_check
+
+
+###########
+# options #
+###########
+
+help_: Callable[..., Option] = partial(
+ Option,
+ "-h",
+ "--help",
+ dest="help",
+ action="help",
+ help="Show help.",
+)
+
+debug_mode: Callable[..., Option] = partial(
+ Option,
+ "--debug",
+ dest="debug_mode",
+ action="store_true",
+ default=False,
+ help=(
+ "Let unhandled exceptions propagate outside the main subroutine, "
+ "instead of logging them to stderr."
+ ),
+)
+
+isolated_mode: Callable[..., Option] = partial(
+ Option,
+ "--isolated",
+ dest="isolated_mode",
+ action="store_true",
+ default=False,
+ help=(
+ "Run pip in an isolated mode, ignoring environment variables and user "
+ "configuration."
+ ),
+)
+
+require_virtualenv: Callable[..., Option] = partial(
+ Option,
+ "--require-virtualenv",
+ "--require-venv",
+ dest="require_venv",
+ action="store_true",
+ default=False,
+ help=(
+ "Allow pip to only run in a virtual environment; "
+ "exit with an error otherwise."
+ ),
+)
+
+verbose: Callable[..., Option] = partial(
+ Option,
+ "-v",
+ "--verbose",
+ dest="verbose",
+ action="count",
+ default=0,
+ help="Give more output. Option is additive, and can be used up to 3 times.",
+)
+
+no_color: Callable[..., Option] = partial(
+ Option,
+ "--no-color",
+ dest="no_color",
+ action="store_true",
+ default=False,
+ help="Suppress colored output.",
+)
+
+version: Callable[..., Option] = partial(
+ Option,
+ "-V",
+ "--version",
+ dest="version",
+ action="store_true",
+ help="Show version and exit.",
+)
+
+quiet: Callable[..., Option] = partial(
+ Option,
+ "-q",
+ "--quiet",
+ dest="quiet",
+ action="count",
+ default=0,
+ help=(
+ "Give less output. Option is additive, and can be used up to 3"
+ " times (corresponding to WARNING, ERROR, and CRITICAL logging"
+ " levels)."
+ ),
+)
+
+progress_bar: Callable[..., Option] = partial(
+ Option,
+ "--progress-bar",
+ dest="progress_bar",
+ type="choice",
+ choices=list(BAR_TYPES.keys()),
+ default="on",
+ help=(
+ "Specify type of progress to be displayed ["
+ + "|".join(BAR_TYPES.keys())
+ + "] (default: %default)"
+ ),
+)
+
+log: Callable[..., Option] = partial(
+ PipOption,
+ "--log",
+ "--log-file",
+ "--local-log",
+ dest="log",
+ metavar="path",
+ type="path",
+ help="Path to a verbose appending log.",
+)
+
+no_input: Callable[..., Option] = partial(
+ Option,
+ # Don't ask for input
+ "--no-input",
+ dest="no_input",
+ action="store_true",
+ default=False,
+ help="Disable prompting for input.",
+)
+
+proxy: Callable[..., Option] = partial(
+ Option,
+ "--proxy",
+ dest="proxy",
+ type="str",
+ default="",
+ help="Specify a proxy in the form [user:passwd@]proxy.server:port.",
+)
+
+retries: Callable[..., Option] = partial(
+ Option,
+ "--retries",
+ dest="retries",
+ type="int",
+ default=5,
+ help="Maximum number of retries each connection should attempt "
+ "(default %default times).",
+)
+
+timeout: Callable[..., Option] = partial(
+ Option,
+ "--timeout",
+ "--default-timeout",
+ metavar="sec",
+ dest="timeout",
+ type="float",
+ default=15,
+ help="Set the socket timeout (default %default seconds).",
+)
+
+
+def exists_action() -> Option:
+ return Option(
+ # Option when path already exist
+ "--exists-action",
+ dest="exists_action",
+ type="choice",
+ choices=["s", "i", "w", "b", "a"],
+ default=[],
+ action="append",
+ metavar="action",
+ help="Default action when a path already exists: "
+ "(s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort.",
+ )
+
+
+cert: Callable[..., Option] = partial(
+ PipOption,
+ "--cert",
+ dest="cert",
+ type="path",
+ metavar="path",
+ help=(
+ "Path to PEM-encoded CA certificate bundle. "
+ "If provided, overrides the default. "
+ "See 'SSL Certificate Verification' in pip documentation "
+ "for more information."
+ ),
+)
+
+client_cert: Callable[..., Option] = partial(
+ PipOption,
+ "--client-cert",
+ dest="client_cert",
+ type="path",
+ default=None,
+ metavar="path",
+ help="Path to SSL client certificate, a single file containing the "
+ "private key and the certificate in PEM format.",
+)
+
+index_url: Callable[..., Option] = partial(
+ Option,
+ "-i",
+ "--index-url",
+ "--pypi-url",
+ dest="index_url",
+ metavar="URL",
+ default=PyPI.simple_url,
+ help="Base URL of the Python Package Index (default %default). "
+ "This should point to a repository compliant with PEP 503 "
+ "(the simple repository API) or a local directory laid out "
+ "in the same format.",
+)
+
+
+def extra_index_url() -> Option:
+ return Option(
+ "--extra-index-url",
+ dest="extra_index_urls",
+ metavar="URL",
+ action="append",
+ default=[],
+ help="Extra URLs of package indexes to use in addition to "
+ "--index-url. Should follow the same rules as "
+ "--index-url.",
+ )
+
+
+no_index: Callable[..., Option] = partial(
+ Option,
+ "--no-index",
+ dest="no_index",
+ action="store_true",
+ default=False,
+ help="Ignore package index (only looking at --find-links URLs instead).",
+)
+
+
+def find_links() -> Option:
+ return Option(
+ "-f",
+ "--find-links",
+ dest="find_links",
+ action="append",
+ default=[],
+ metavar="url",
+ help="If a URL or path to an html file, then parse for links to "
+ "archives such as sdist (.tar.gz) or wheel (.whl) files. "
+ "If a local path or file:// URL that's a directory, "
+ "then look for archives in the directory listing. "
+ "Links to VCS project URLs are not supported.",
+ )
+
+
+def trusted_host() -> Option:
+ return Option(
+ "--trusted-host",
+ dest="trusted_hosts",
+ action="append",
+ metavar="HOSTNAME",
+ default=[],
+ help="Mark this host or host:port pair as trusted, even though it "
+ "does not have valid or any HTTPS.",
+ )
+
+
+def constraints() -> Option:
+ return Option(
+ "-c",
+ "--constraint",
+ dest="constraints",
+ action="append",
+ default=[],
+ metavar="file",
+ help="Constrain versions using the given constraints file. "
+ "This option can be used multiple times.",
+ )
+
+
+def requirements() -> Option:
+ return Option(
+ "-r",
+ "--requirement",
+ dest="requirements",
+ action="append",
+ default=[],
+ metavar="file",
+ help="Install from the given requirements file. "
+ "This option can be used multiple times.",
+ )
+
+
+def editable() -> Option:
+ return Option(
+ "-e",
+ "--editable",
+ dest="editables",
+ action="append",
+ default=[],
+ metavar="path/url",
+ help=(
+ "Install a project in editable mode (i.e. setuptools "
+ '"develop mode") from a local project path or a VCS url.'
+ ),
+ )
+
+
+def _handle_src(option: Option, opt_str: str, value: str, parser: OptionParser) -> None:
+ value = os.path.abspath(value)
+ setattr(parser.values, option.dest, value)
+
+
+src: Callable[..., Option] = partial(
+ PipOption,
+ "--src",
+ "--source",
+ "--source-dir",
+ "--source-directory",
+ dest="src_dir",
+ type="path",
+ metavar="dir",
+ default=get_src_prefix(),
+ action="callback",
+ callback=_handle_src,
+ help="Directory to check out editable projects into. "
+    'The default in a virtualenv is "<venv path>/src". '
+    'The default for global installs is "<current dir>/src".',
+)
+
+
+def _get_format_control(values: Values, option: Option) -> Any:
+ """Get a format_control object."""
+ return getattr(values, option.dest)
+
+
+def _handle_no_binary(
+ option: Option, opt_str: str, value: str, parser: OptionParser
+) -> None:
+ existing = _get_format_control(parser.values, option)
+ FormatControl.handle_mutual_excludes(
+ value,
+ existing.no_binary,
+ existing.only_binary,
+ )
+
+
+def _handle_only_binary(
+ option: Option, opt_str: str, value: str, parser: OptionParser
+) -> None:
+ existing = _get_format_control(parser.values, option)
+ FormatControl.handle_mutual_excludes(
+ value,
+ existing.only_binary,
+ existing.no_binary,
+ )
+
+
+def no_binary() -> Option:
+ format_control = FormatControl(set(), set())
+ return Option(
+ "--no-binary",
+ dest="format_control",
+ action="callback",
+ callback=_handle_no_binary,
+ type="str",
+ default=format_control,
+ help="Do not use binary packages. Can be supplied multiple times, and "
+ 'each time adds to the existing value. Accepts either ":all:" to '
+ 'disable all binary packages, ":none:" to empty the set (notice '
+ "the colons), or one or more package names with commas between "
+ "them (no colons). Note that some packages are tricky to compile "
+ "and may fail to install when this option is used on them.",
+ )
+
+
+def only_binary() -> Option:
+ format_control = FormatControl(set(), set())
+ return Option(
+ "--only-binary",
+ dest="format_control",
+ action="callback",
+ callback=_handle_only_binary,
+ type="str",
+ default=format_control,
+ help="Do not use source packages. Can be supplied multiple times, and "
+ 'each time adds to the existing value. Accepts either ":all:" to '
+ 'disable all source packages, ":none:" to empty the set, or one '
+ "or more package names with commas between them. Packages "
+ "without binary distributions will fail to install when this "
+ "option is used on them.",
+ )
+
+
+platforms: Callable[..., Option] = partial(
+ Option,
+ "--platform",
+ dest="platforms",
+ metavar="platform",
+ action="append",
+ default=None,
+ help=(
+        "Only use wheels compatible with <platform>. Defaults to the "
+ "platform of the running system. Use this option multiple times to "
+ "specify multiple platforms supported by the target interpreter."
+ ),
+)
+
+
+# This was made a separate function for unit-testing purposes.
+def _convert_python_version(value: str) -> Tuple[Tuple[int, ...], Optional[str]]:
+ """
+ Convert a version string like "3", "37", or "3.7.3" into a tuple of ints.
+
+ :return: A 2-tuple (version_info, error_msg), where `error_msg` is
+ non-None if and only if there was a parsing error.
+ """
+ if not value:
+ # The empty string is the same as not providing a value.
+ return (None, None)
+
+ parts = value.split(".")
+ if len(parts) > 3:
+ return ((), "at most three version parts are allowed")
+
+ if len(parts) == 1:
+ # Then we are in the case of "3" or "37".
+ value = parts[0]
+ if len(value) > 1:
+ parts = [value[0], value[1:]]
+
+ try:
+ version_info = tuple(int(part) for part in parts)
+ except ValueError:
+ return ((), "each version part must be an integer")
+
+ return (version_info, None)
+
+
+def _handle_python_version(
+ option: Option, opt_str: str, value: str, parser: OptionParser
+) -> None:
+ """
+ Handle a provided --python-version value.
+ """
+ version_info, error_msg = _convert_python_version(value)
+ if error_msg is not None:
+ msg = "invalid --python-version value: {!r}: {}".format(
+ value,
+ error_msg,
+ )
+ raise_option_error(parser, option=option, msg=msg)
+
+ parser.values.python_version = version_info
+
+
+python_version: Callable[..., Option] = partial(
+ Option,
+ "--python-version",
+ dest="python_version",
+ metavar="python_version",
+ action="callback",
+ callback=_handle_python_version,
+ type="str",
+ default=None,
+ help=dedent(
+ """\
+ The Python interpreter version to use for wheel and "Requires-Python"
+ compatibility checks. Defaults to a version derived from the running
+ interpreter. The version can be specified using up to three dot-separated
+ integers (e.g. "3" for 3.0.0, "3.7" for 3.7.0, or "3.7.3"). A major-minor
+ version can also be given as a string without dots (e.g. "37" for 3.7.0).
+ """
+ ),
+)
+
+
+implementation: Callable[..., Option] = partial(
+ Option,
+ "--implementation",
+ dest="implementation",
+ metavar="implementation",
+ default=None,
+ help=(
+ "Only use wheels compatible with Python "
+        "implementation <implementation>, e.g. 'pp', 'jy', 'cp', "
+ " or 'ip'. If not specified, then the current "
+ "interpreter implementation is used. Use 'py' to force "
+ "implementation-agnostic wheels."
+ ),
+)
+
+
+abis: Callable[..., Option] = partial(
+ Option,
+ "--abi",
+ dest="abis",
+ metavar="abi",
+ action="append",
+ default=None,
+ help=(
+        "Only use wheels compatible with Python abi <abi>, e.g. 'pypy_41'. "
+ "If not specified, then the current interpreter abi tag is used. "
+ "Use this option multiple times to specify multiple abis supported "
+ "by the target interpreter. Generally you will need to specify "
+ "--implementation, --platform, and --python-version when using this "
+ "option."
+ ),
+)
+
+
+def add_target_python_options(cmd_opts: OptionGroup) -> None:
+ cmd_opts.add_option(platforms())
+ cmd_opts.add_option(python_version())
+ cmd_opts.add_option(implementation())
+ cmd_opts.add_option(abis())
+
+
+def make_target_python(options: Values) -> TargetPython:
+ target_python = TargetPython(
+ platforms=options.platforms,
+ py_version_info=options.python_version,
+ abis=options.abis,
+ implementation=options.implementation,
+ )
+
+ return target_python
+
+
+def prefer_binary() -> Option:
+ return Option(
+ "--prefer-binary",
+ dest="prefer_binary",
+ action="store_true",
+ default=False,
+ help="Prefer older binary packages over newer source packages.",
+ )
+
+
+cache_dir: Callable[..., Option] = partial(
+ PipOption,
+ "--cache-dir",
+ dest="cache_dir",
+ default=USER_CACHE_DIR,
+ metavar="dir",
+ type="path",
+    help="Store the cache data in <dir>.",
+)
+
+
+def _handle_no_cache_dir(
+ option: Option, opt: str, value: str, parser: OptionParser
+) -> None:
+ """
+ Process a value provided for the --no-cache-dir option.
+
+ This is an optparse.Option callback for the --no-cache-dir option.
+ """
+ # The value argument will be None if --no-cache-dir is passed via the
+ # command-line, since the option doesn't accept arguments. However,
+ # the value can be non-None if the option is triggered e.g. by an
+ # environment variable, like PIP_NO_CACHE_DIR=true.
+ if value is not None:
+ # Then parse the string value to get argument error-checking.
+ try:
+ strtobool(value)
+ except ValueError as exc:
+ raise_option_error(parser, option=option, msg=str(exc))
+
+ # Originally, setting PIP_NO_CACHE_DIR to a value that strtobool()
+ # converted to 0 (like "false" or "no") caused cache_dir to be disabled
+ # rather than enabled (logic would say the latter). Thus, we disable
+ # the cache directory not just on values that parse to True, but (for
+ # backwards compatibility reasons) also on values that parse to False.
+ # In other words, always set it to False if the option is provided in
+ # some (valid) form.
+ parser.values.cache_dir = False
+
+
+no_cache: Callable[..., Option] = partial(
+ Option,
+ "--no-cache-dir",
+ dest="cache_dir",
+ action="callback",
+ callback=_handle_no_cache_dir,
+ help="Disable the cache.",
+)
+
+no_deps: Callable[..., Option] = partial(
+ Option,
+ "--no-deps",
+ "--no-dependencies",
+ dest="ignore_dependencies",
+ action="store_true",
+ default=False,
+ help="Don't install package dependencies.",
+)
+
+ignore_requires_python: Callable[..., Option] = partial(
+ Option,
+ "--ignore-requires-python",
+ dest="ignore_requires_python",
+ action="store_true",
+ help="Ignore the Requires-Python information.",
+)
+
+no_build_isolation: Callable[..., Option] = partial(
+ Option,
+ "--no-build-isolation",
+ dest="build_isolation",
+ action="store_false",
+ default=True,
+ help="Disable isolation when building a modern source distribution. "
+ "Build dependencies specified by PEP 518 must be already installed "
+ "if this option is used.",
+)
+
+
+def _handle_no_use_pep517(
+ option: Option, opt: str, value: str, parser: OptionParser
+) -> None:
+ """
+ Process a value provided for the --no-use-pep517 option.
+
+ This is an optparse.Option callback for the no_use_pep517 option.
+ """
+ # Since --no-use-pep517 doesn't accept arguments, the value argument
+ # will be None if --no-use-pep517 is passed via the command-line.
+ # However, the value can be non-None if the option is triggered e.g.
+ # by an environment variable, for example "PIP_NO_USE_PEP517=true".
+ if value is not None:
+ msg = """A value was passed for --no-use-pep517,
+ probably using either the PIP_NO_USE_PEP517 environment variable
+ or the "no-use-pep517" config file option. Use an appropriate value
+ of the PIP_USE_PEP517 environment variable or the "use-pep517"
+ config file option instead.
+ """
+ raise_option_error(parser, option=option, msg=msg)
+
+ # Otherwise, --no-use-pep517 was passed via the command-line.
+ parser.values.use_pep517 = False
+
+
+use_pep517: Any = partial(
+ Option,
+ "--use-pep517",
+ dest="use_pep517",
+ action="store_true",
+ default=None,
+ help="Use PEP 517 for building source distributions "
+ "(use --no-use-pep517 to force legacy behaviour).",
+)
+
+no_use_pep517: Any = partial(
+ Option,
+ "--no-use-pep517",
+ dest="use_pep517",
+ action="callback",
+ callback=_handle_no_use_pep517,
+ default=None,
+ help=SUPPRESS_HELP,
+)
+
+install_options: Callable[..., Option] = partial(
+ Option,
+ "--install-option",
+ dest="install_options",
+ action="append",
+ metavar="options",
+ help="Extra arguments to be supplied to the setup.py install "
+ 'command (use like --install-option="--install-scripts=/usr/local/'
+ 'bin"). Use multiple --install-option options to pass multiple '
+ "options to setup.py install. If you are using an option with a "
+ "directory path, be sure to use absolute path.",
+)
+
+build_options: Callable[..., Option] = partial(
+ Option,
+ "--build-option",
+ dest="build_options",
+ metavar="options",
+ action="append",
+ help="Extra arguments to be supplied to 'setup.py bdist_wheel'.",
+)
+
+global_options: Callable[..., Option] = partial(
+ Option,
+ "--global-option",
+ dest="global_options",
+ action="append",
+ metavar="options",
+ help="Extra global options to be supplied to the setup.py "
+ "call before the install or bdist_wheel command.",
+)
+
+no_clean: Callable[..., Option] = partial(
+ Option,
+ "--no-clean",
+ action="store_true",
+ default=False,
+ help="Don't clean up build directories.",
+)
+
+pre: Callable[..., Option] = partial(
+ Option,
+ "--pre",
+ action="store_true",
+ default=False,
+ help="Include pre-release and development versions. By default, "
+ "pip only finds stable versions.",
+)
+
+disable_pip_version_check: Callable[..., Option] = partial(
+ Option,
+ "--disable-pip-version-check",
+ dest="disable_pip_version_check",
+ action="store_true",
+ default=True,
+ help="Don't periodically check PyPI to determine whether a new version "
+ "of pip is available for download. Implied with --no-index.",
+)
+
+
+def _handle_merge_hash(
+ option: Option, opt_str: str, value: str, parser: OptionParser
+) -> None:
+ """Given a value spelled "algo:digest", append the digest to a list
+ pointed to in a dict by the algo name."""
+ if not parser.values.hashes:
+ parser.values.hashes = {}
+ try:
+ algo, digest = value.split(":", 1)
+ except ValueError:
+ parser.error(
+ "Arguments to {} must be a hash name " # noqa
+ "followed by a value, like --hash=sha256:"
+ "abcde...".format(opt_str)
+ )
+ if algo not in STRONG_HASHES:
+ parser.error(
+ "Allowed hash algorithms for {} are {}.".format( # noqa
+ opt_str, ", ".join(STRONG_HASHES)
+ )
+ )
+ parser.values.hashes.setdefault(algo, []).append(digest)
+
+
+hash: Callable[..., Option] = partial(
+ Option,
+ "--hash",
+ # Hash values eventually end up in InstallRequirement.hashes due to
+ # __dict__ copying in process_line().
+ dest="hashes",
+ action="callback",
+ callback=_handle_merge_hash,
+ type="string",
+ help="Verify that the package's archive matches this "
+ "hash before installing. Example: --hash=sha256:abcdef...",
+)
+
+
+require_hashes: Callable[..., Option] = partial(
+ Option,
+ "--require-hashes",
+ dest="require_hashes",
+ action="store_true",
+ default=False,
+ help="Require a hash to check each requirement against, for "
+ "repeatable installs. This option is implied when any package in a "
+ "requirements file has a --hash option.",
+)
+
+
+list_path: Callable[..., Option] = partial(
+ PipOption,
+ "--path",
+ dest="path",
+ type="path",
+ action="append",
+ help="Restrict to the specified installation path for listing "
+ "packages (can be used multiple times).",
+)
+
+
+def check_list_path_option(options: Values) -> None:
+ if options.path and (options.user or options.local):
+ raise CommandError("Cannot combine '--path' with '--user' or '--local'")
+
+
+list_exclude: Callable[..., Option] = partial(
+ PipOption,
+ "--exclude",
+ dest="excludes",
+ action="append",
+ metavar="package",
+ type="package_name",
+ help="Exclude specified package from the output",
+)
+
+
+no_python_version_warning: Callable[..., Option] = partial(
+ Option,
+ "--no-python-version-warning",
+ dest="no_python_version_warning",
+ action="store_true",
+ default=False,
+ help="Silence deprecation warnings for upcoming unsupported Pythons.",
+)
+
+
+use_new_feature: Callable[..., Option] = partial(
+ Option,
+ "--use-feature",
+ dest="features_enabled",
+ metavar="feature",
+ action="append",
+ default=[],
+ choices=["2020-resolver", "fast-deps", "in-tree-build"],
+ help="Enable new functionality, that may be backward incompatible.",
+)
+
+use_deprecated_feature: Callable[..., Option] = partial(
+ Option,
+ "--use-deprecated",
+ dest="deprecated_features_enabled",
+ metavar="feature",
+ action="append",
+ default=[],
+ choices=[
+ "legacy-resolver",
+ "out-of-tree-build",
+ "backtrack-on-build-failures",
+ "html5lib",
+ ],
+ help=("Enable deprecated functionality, that will be removed in the future."),
+)
+
+
+##########
+# groups #
+##########
+
+general_group: Dict[str, Any] = {
+ "name": "General Options",
+ "options": [
+ help_,
+ debug_mode,
+ isolated_mode,
+ require_virtualenv,
+ verbose,
+ version,
+ quiet,
+ log,
+ no_input,
+ proxy,
+ retries,
+ timeout,
+ exists_action,
+ trusted_host,
+ cert,
+ client_cert,
+ cache_dir,
+ no_cache,
+ disable_pip_version_check,
+ no_color,
+ no_python_version_warning,
+ use_new_feature,
+ use_deprecated_feature,
+ ],
+}
+
+index_group: Dict[str, Any] = {
+ "name": "Package Index Options",
+ "options": [
+ index_url,
+ extra_index_url,
+ no_index,
+ find_links,
+ ],
+}
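
The module docstring above explains the design: every option is a factory (a `functools.partial` or a plain function) that is only called when a parser is assembled, so `action='append'` options cannot leak state between parses. A small usage sketch, with the parser name chosen arbitrarily:

```python
from pip._internal.cli import cmdoptions
from pip._internal.cli.parser import (
    ConfigOptionParser,
    UpdatingDefaultsHelpFormatter,
)

parser = ConfigOptionParser(
    name="example",  # hypothetical name, used for per-command config lookup
    usage="%prog [options]",
    formatter=UpdatingDefaultsHelpFormatter(),
    add_help_option=False,
)

# make_option_group() calls each factory to obtain fresh Option instances.
general = cmdoptions.make_option_group(cmdoptions.general_group, parser)
parser.add_option_group(general)

options, args = parser.parse_args(["-q", "--retries", "3"])
print(options.quiet, options.retries)  # 1 3
```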
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cli/command_context.py b/venv/lib/python3.10/site-packages/pip/_internal/cli/command_context.py
new file mode 100644
index 0000000000000000000000000000000000000000..ed68322376db4864d2fca2d3bca0b0a300658167
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cli/command_context.py
@@ -0,0 +1,27 @@
+from contextlib import ExitStack, contextmanager
+from typing import ContextManager, Iterator, TypeVar
+
+_T = TypeVar("_T", covariant=True)
+
+
+class CommandContextMixIn:
+ def __init__(self) -> None:
+ super().__init__()
+ self._in_main_context = False
+ self._main_context = ExitStack()
+
+ @contextmanager
+ def main_context(self) -> Iterator[None]:
+ assert not self._in_main_context
+
+ self._in_main_context = True
+ try:
+ with self._main_context:
+ yield
+ finally:
+ self._in_main_context = False
+
+ def enter_context(self, context_provider: ContextManager[_T]) -> _T:
+ assert self._in_main_context
+
+ return self._main_context.enter_context(context_provider)
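
A hypothetical illustration of the mixin above: resources registered with `enter_context()` stay open for the duration of `main_context()` and are torn down together, in reverse order, when it exits.

```python
from contextlib import contextmanager
from typing import Iterator

from pip._internal.cli.command_context import CommandContextMixIn


@contextmanager
def resource(name: str) -> Iterator[str]:
    print(f"open {name}")
    try:
        yield name
    finally:
        print(f"close {name}")


class Demo(CommandContextMixIn):
    def work(self) -> None:
        # Only legal inside main_context(); both resources are closed
        # automatically when the with-block below ends.
        self.enter_context(resource("tempdir registry"))
        self.enter_context(resource("global tempdir manager"))


demo = Demo()
with demo.main_context():
    demo.work()
# Output: open tempdir registry / open global tempdir manager /
#         close global tempdir manager / close tempdir registry
```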
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cli/main.py b/venv/lib/python3.10/site-packages/pip/_internal/cli/main.py
new file mode 100644
index 0000000000000000000000000000000000000000..0e31221543adcd5cbec489985bbf473dcf7503f6
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cli/main.py
@@ -0,0 +1,70 @@
+"""Primary application entrypoint.
+"""
+import locale
+import logging
+import os
+import sys
+from typing import List, Optional
+
+from pip._internal.cli.autocompletion import autocomplete
+from pip._internal.cli.main_parser import parse_command
+from pip._internal.commands import create_command
+from pip._internal.exceptions import PipError
+from pip._internal.utils import deprecation
+
+logger = logging.getLogger(__name__)
+
+
+# Do not import and use main() directly! Using it directly is actively
+# discouraged by pip's maintainers. The name, location and behavior of
+# this function is subject to change, so calling it directly is not
+# portable across different pip versions.
+
+# In addition, running pip in-process is unsupported and unsafe. This is
+# elaborated in detail at
+# https://pip.pypa.io/en/stable/user_guide/#using-pip-from-your-program.
+# That document also provides suggestions that should work for nearly
+# all users that are considering importing and using main() directly.
+
+# However, we know that certain users will still want to invoke pip
+# in-process. If you understand and accept the implications of using pip
+# in an unsupported manner, the best approach is to use runpy to avoid
+# depending on the exact location of this entry point.
+
+# The following example shows how to use runpy to invoke pip in that
+# case:
+#
+# sys.argv = ["pip", your, args, here]
+# runpy.run_module("pip", run_name="__main__")
+#
+# Note that this will exit the process after running, unlike a direct
+# call to main. As it is not safe to do any processing after calling
+# main, this should not be an issue in practice.
+
+
+def main(args: Optional[List[str]] = None) -> int:
+ if args is None:
+ args = sys.argv[1:]
+
+ # Configure our deprecation warnings to be sent through loggers
+ deprecation.install_warning_logger()
+
+ autocomplete()
+
+ try:
+ cmd_name, cmd_args = parse_command(args)
+ except PipError as exc:
+ sys.stderr.write(f"ERROR: {exc}")
+ sys.stderr.write(os.linesep)
+ sys.exit(1)
+
+ # Needed for locale.getpreferredencoding(False) to work
+ # in pip._internal.utils.encoding.auto_decode
+ try:
+ locale.setlocale(locale.LC_ALL, "")
+ except locale.Error as e:
+ # setlocale can apparently crash if locale are uninitialized
+ logger.debug("Ignoring error %s when setting locale", e)
+ command = create_command(cmd_name, isolated=("--isolated" in cmd_args))
+
+ return command.main(cmd_args)
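
The long comment above boils down to: prefer a subprocess, and if pip must run in-process, go through `runpy` rather than importing `main()`. A hedged sketch of both (the package name is just a placeholder):

```python
import runpy
import subprocess
import sys

# Preferred: run pip in a child process.
subprocess.run(
    [sys.executable, "-m", "pip", "install", "requests"],  # placeholder package
    check=True,
)

# Unsupported but sometimes wanted: in-process via runpy. Note that this
# exits the interpreter when pip finishes, as the comment above warns.
sys.argv = ["pip", "--version"]
runpy.run_module("pip", run_name="__main__")
```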
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cli/main_parser.py b/venv/lib/python3.10/site-packages/pip/_internal/cli/main_parser.py
new file mode 100644
index 0000000000000000000000000000000000000000..3666ab04ca6460be9bc6944c0f045be7ff44c365
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cli/main_parser.py
@@ -0,0 +1,87 @@
+"""A single place for constructing and exposing the main parser
+"""
+
+import os
+import sys
+from typing import List, Tuple
+
+from pip._internal.cli import cmdoptions
+from pip._internal.cli.parser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
+from pip._internal.commands import commands_dict, get_similar_commands
+from pip._internal.exceptions import CommandError
+from pip._internal.utils.misc import get_pip_version, get_prog
+
+__all__ = ["create_main_parser", "parse_command"]
+
+
+def create_main_parser() -> ConfigOptionParser:
+ """Creates and returns the main parser for pip's CLI"""
+
+ parser = ConfigOptionParser(
+ usage="\n%prog [options]",
+ add_help_option=False,
+ formatter=UpdatingDefaultsHelpFormatter(),
+ name="global",
+ prog=get_prog(),
+ )
+ parser.disable_interspersed_args()
+
+ parser.version = get_pip_version()
+
+ # add the general options
+ gen_opts = cmdoptions.make_option_group(cmdoptions.general_group, parser)
+ parser.add_option_group(gen_opts)
+
+ # so the help formatter knows
+ parser.main = True # type: ignore
+
+ # create command listing for description
+ description = [""] + [
+ f"{name:27} {command_info.summary}"
+ for name, command_info in commands_dict.items()
+ ]
+ parser.description = "\n".join(description)
+
+ return parser
+
+
+def parse_command(args: List[str]) -> Tuple[str, List[str]]:
+ parser = create_main_parser()
+
+ # Note: parser calls disable_interspersed_args(), so the result of this
+ # call is to split the initial args into the general options before the
+ # subcommand and everything else.
+ # For example:
+ # args: ['--timeout=5', 'install', '--user', 'INITools']
+    #  general_options: ['--timeout=5']
+ # args_else: ['install', '--user', 'INITools']
+ general_options, args_else = parser.parse_args(args)
+
+ # --version
+ if general_options.version:
+ sys.stdout.write(parser.version)
+ sys.stdout.write(os.linesep)
+ sys.exit()
+
+ # pip || pip help -> print_help()
+ if not args_else or (args_else[0] == "help" and len(args_else) == 1):
+ parser.print_help()
+ sys.exit()
+
+ # the subcommand name
+ cmd_name = args_else[0]
+
+ if cmd_name not in commands_dict:
+ guess = get_similar_commands(cmd_name)
+
+ msg = [f'unknown command "{cmd_name}"']
+ if guess:
+ msg.append(f'maybe you meant "{guess}"')
+
+ raise CommandError(" - ".join(msg))
+
+ # all the args without the subcommand
+ cmd_args = args[:]
+ cmd_args.remove(cmd_name)
+
+ return cmd_name, cmd_args
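
A quick, hypothetical demonstration of the splitting behaviour described in the comment inside `parse_command()`: general options placed before the subcommand are consumed by the main parser, while everything after the subcommand is handed to it untouched.

```python
from pip._internal.cli.main_parser import parse_command

cmd_name, cmd_args = parse_command(["--timeout=5", "install", "--user", "INITools"])
print(cmd_name)   # "install"
print(cmd_args)   # ["--timeout=5", "--user", "INITools"]
```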
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cli/parser.py b/venv/lib/python3.10/site-packages/pip/_internal/cli/parser.py
new file mode 100644
index 0000000000000000000000000000000000000000..a1c99a8cb301f222feb1845be4e80d9b1f9d2622
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cli/parser.py
@@ -0,0 +1,292 @@
+"""Base option parser setup"""
+
+import logging
+import optparse
+import shutil
+import sys
+import textwrap
+from contextlib import suppress
+from typing import Any, Dict, Iterator, List, Tuple
+
+from pip._internal.cli.status_codes import UNKNOWN_ERROR
+from pip._internal.configuration import Configuration, ConfigurationError
+from pip._internal.utils.misc import redact_auth_from_url, strtobool
+
+logger = logging.getLogger(__name__)
+
+
+class PrettyHelpFormatter(optparse.IndentedHelpFormatter):
+ """A prettier/less verbose help formatter for optparse."""
+
+ def __init__(self, *args: Any, **kwargs: Any) -> None:
+ # help position must be aligned with __init__.parseopts.description
+ kwargs["max_help_position"] = 30
+ kwargs["indent_increment"] = 1
+ kwargs["width"] = shutil.get_terminal_size()[0] - 2
+ super().__init__(*args, **kwargs)
+
+ def format_option_strings(self, option: optparse.Option) -> str:
+ return self._format_option_strings(option)
+
+ def _format_option_strings(
+ self, option: optparse.Option, mvarfmt: str = " <{}>", optsep: str = ", "
+ ) -> str:
+ """
+ Return a comma-separated list of option strings and metavars.
+
+ :param option: tuple of (short opt, long opt), e.g: ('-f', '--format')
+ :param mvarfmt: metavar format string
+ :param optsep: separator
+ """
+ opts = []
+
+ if option._short_opts:
+ opts.append(option._short_opts[0])
+ if option._long_opts:
+ opts.append(option._long_opts[0])
+ if len(opts) > 1:
+ opts.insert(1, optsep)
+
+ if option.takes_value():
+ assert option.dest is not None
+ metavar = option.metavar or option.dest.lower()
+ opts.append(mvarfmt.format(metavar.lower()))
+
+ return "".join(opts)
+
+ def format_heading(self, heading: str) -> str:
+ if heading == "Options":
+ return ""
+ return heading + ":\n"
+
+ def format_usage(self, usage: str) -> str:
+ """
+ Ensure there is only one newline between usage and the first heading
+ if there is no description.
+ """
+ msg = "\nUsage: {}\n".format(self.indent_lines(textwrap.dedent(usage), " "))
+ return msg
+
+ def format_description(self, description: str) -> str:
+ # leave full control over description to us
+ if description:
+ if hasattr(self.parser, "main"):
+ label = "Commands"
+ else:
+ label = "Description"
+ # some doc strings have initial newlines, some don't
+ description = description.lstrip("\n")
+ # some doc strings have final newlines and spaces, some don't
+ description = description.rstrip()
+ # dedent, then reindent
+ description = self.indent_lines(textwrap.dedent(description), " ")
+ description = f"{label}:\n{description}\n"
+ return description
+ else:
+ return ""
+
+ def format_epilog(self, epilog: str) -> str:
+ # leave full control over epilog to us
+ if epilog:
+ return epilog
+ else:
+ return ""
+
+ def indent_lines(self, text: str, indent: str) -> str:
+ new_lines = [indent + line for line in text.split("\n")]
+ return "\n".join(new_lines)
+
+
+class UpdatingDefaultsHelpFormatter(PrettyHelpFormatter):
+ """Custom help formatter for use in ConfigOptionParser.
+
+    This updates the defaults before expanding them, allowing
+ them to show up correctly in the help listing.
+
+ Also redact auth from url type options
+ """
+
+ def expand_default(self, option: optparse.Option) -> str:
+ default_values = None
+ if self.parser is not None:
+ assert isinstance(self.parser, ConfigOptionParser)
+ self.parser._update_defaults(self.parser.defaults)
+ assert option.dest is not None
+ default_values = self.parser.defaults.get(option.dest)
+ help_text = super().expand_default(option)
+
+ if default_values and option.metavar == "URL":
+ if isinstance(default_values, str):
+ default_values = [default_values]
+
+            # If it's not a list, we should abort and just return the help text
+ if not isinstance(default_values, list):
+ default_values = []
+
+ for val in default_values:
+ help_text = help_text.replace(val, redact_auth_from_url(val))
+
+ return help_text
+
+
+class CustomOptionParser(optparse.OptionParser):
+ def insert_option_group(
+ self, idx: int, *args: Any, **kwargs: Any
+ ) -> optparse.OptionGroup:
+ """Insert an OptionGroup at a given position."""
+ group = self.add_option_group(*args, **kwargs)
+
+ self.option_groups.pop()
+ self.option_groups.insert(idx, group)
+
+ return group
+
+ @property
+ def option_list_all(self) -> List[optparse.Option]:
+ """Get a list of all options, including those in option groups."""
+ res = self.option_list[:]
+ for i in self.option_groups:
+ res.extend(i.option_list)
+
+ return res
+
+
+class ConfigOptionParser(CustomOptionParser):
+ """Custom option parser which updates its defaults by checking the
+ configuration files and environmental variables"""
+
+ def __init__(
+ self,
+ *args: Any,
+ name: str,
+ isolated: bool = False,
+ **kwargs: Any,
+ ) -> None:
+ self.name = name
+ self.config = Configuration(isolated)
+
+ assert self.name
+ super().__init__(*args, **kwargs)
+
+ def check_default(self, option: optparse.Option, key: str, val: Any) -> Any:
+ try:
+ return option.check_value(key, val)
+ except optparse.OptionValueError as exc:
+ print(f"An error occurred during configuration: {exc}")
+ sys.exit(3)
+
+ def _get_ordered_configuration_items(self) -> Iterator[Tuple[str, Any]]:
+ # Configuration gives keys in an unordered manner. Order them.
+ override_order = ["global", self.name, ":env:"]
+
+ # Pool the options into different groups
+ section_items: Dict[str, List[Tuple[str, Any]]] = {
+ name: [] for name in override_order
+ }
+ for section_key, val in self.config.items():
+ # ignore empty values
+ if not val:
+ logger.debug(
+                    "Ignoring configuration key '%s' as its value is empty.",
+ section_key,
+ )
+ continue
+
+ section, key = section_key.split(".", 1)
+ if section in override_order:
+ section_items[section].append((key, val))
+
+ # Yield each group in their override order
+ for section in override_order:
+ for key, val in section_items[section]:
+ yield key, val
+
+ def _update_defaults(self, defaults: Dict[str, Any]) -> Dict[str, Any]:
+ """Updates the given defaults with values from the config files and
+ the environ. Does a little special handling for certain types of
+ options (lists)."""
+
+ # Accumulate complex default state.
+ self.values = optparse.Values(self.defaults)
+ late_eval = set()
+ # Then set the options with those values
+ for key, val in self._get_ordered_configuration_items():
+ # '--' because configuration supports only long names
+ option = self.get_option("--" + key)
+
+ # Ignore options not present in this parser. E.g. non-globals put
+ # in [global] by users that want them to apply to all applicable
+ # commands.
+ if option is None:
+ continue
+
+ assert option.dest is not None
+
+ if option.action in ("store_true", "store_false"):
+ try:
+ val = strtobool(val)
+ except ValueError:
+ self.error(
+ "{} is not a valid value for {} option, " # noqa
+ "please specify a boolean value like yes/no, "
+ "true/false or 1/0 instead.".format(val, key)
+ )
+ elif option.action == "count":
+ with suppress(ValueError):
+ val = strtobool(val)
+ with suppress(ValueError):
+ val = int(val)
+ if not isinstance(val, int) or val < 0:
+ self.error(
+ "{} is not a valid value for {} option, " # noqa
+ "please instead specify either a non-negative integer "
+ "or a boolean value like yes/no or false/true "
+ "which is equivalent to 1/0.".format(val, key)
+ )
+ elif option.action == "append":
+ val = val.split()
+ val = [self.check_default(option, key, v) for v in val]
+ elif option.action == "callback":
+ assert option.callback is not None
+ late_eval.add(option.dest)
+ opt_str = option.get_opt_string()
+ val = option.convert_value(opt_str, val)
+ # From take_action
+ args = option.callback_args or ()
+ kwargs = option.callback_kwargs or {}
+ option.callback(option, opt_str, val, self, *args, **kwargs)
+ else:
+ val = self.check_default(option, key, val)
+
+ defaults[option.dest] = val
+
+ for key in late_eval:
+ defaults[key] = getattr(self.values, key)
+ self.values = None
+ return defaults
+
+ def get_default_values(self) -> optparse.Values:
+ """Overriding to make updating the defaults after instantiation of
+ the option parser possible, _update_defaults() does the dirty work."""
+ if not self.process_default_values:
+ # Old, pre-Optik 1.5 behaviour.
+ return optparse.Values(self.defaults)
+
+ # Load the configuration, or error out in case of an error
+ try:
+ self.config.load()
+ except ConfigurationError as err:
+ self.exit(UNKNOWN_ERROR, str(err))
+
+ defaults = self._update_defaults(self.defaults.copy()) # ours
+ for option in self._get_all_options():
+ assert option.dest is not None
+ default = defaults.get(option.dest)
+ if isinstance(default, str):
+ opt_str = option.get_opt_string()
+ defaults[option.dest] = option.check_value(opt_str, default)
+ return optparse.Values(defaults)
+
+ def error(self, msg: str) -> None:
+ self.print_usage(sys.stderr)
+ self.exit(UNKNOWN_ERROR, f"{msg}\n")
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/cli/progress_bars.py b/venv/lib/python3.10/site-packages/pip/_internal/cli/progress_bars.py
new file mode 100644
index 0000000000000000000000000000000000000000..ffa1964fc7b7774829c5314c38984b6a3a2a4051
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/cli/progress_bars.py
@@ -0,0 +1,321 @@
+import functools
+import itertools
+import sys
+from signal import SIGINT, default_int_handler, signal
+from typing import Any, Callable, Iterator, Optional, Tuple
+
+from pip._vendor.progress.bar import Bar, FillingCirclesBar, IncrementalBar
+from pip._vendor.progress.spinner import Spinner
+from pip._vendor.rich.progress import (
+ BarColumn,
+ DownloadColumn,
+ FileSizeColumn,
+ Progress,
+ ProgressColumn,
+ SpinnerColumn,
+ TextColumn,
+ TimeElapsedColumn,
+ TimeRemainingColumn,
+ TransferSpeedColumn,
+)
+
+from pip._internal.utils.compat import WINDOWS
+from pip._internal.utils.logging import get_indentation
+from pip._internal.utils.misc import format_size
+
+try:
+ from pip._vendor import colorama
+# Lots of different errors can come from this, including SystemError and
+# ImportError.
+except Exception:
+ colorama = None
+
+DownloadProgressRenderer = Callable[[Iterator[bytes]], Iterator[bytes]]
+
+
+def _select_progress_class(preferred: Bar, fallback: Bar) -> Bar:
+ encoding = getattr(preferred.file, "encoding", None)
+
+ # If we don't know what encoding this file is in, then we'll just assume
+ # that it doesn't support unicode and use the ASCII bar.
+ if not encoding:
+ return fallback
+
+ # Collect all of the possible characters we want to use with the preferred
+ # bar.
+ characters = [
+ getattr(preferred, "empty_fill", ""),
+ getattr(preferred, "fill", ""),
+ ]
+ characters += list(getattr(preferred, "phases", []))
+
+ # Try to decode the characters we're using for the bar using the encoding
+ # of the given file, if this works then we'll assume that we can use the
+ # fancier bar and if not we'll fall back to the plaintext bar.
+ try:
+ "".join(characters).encode(encoding)
+ except UnicodeEncodeError:
+ return fallback
+ else:
+ return preferred
+
+
+_BaseBar: Any = _select_progress_class(IncrementalBar, Bar)
+
+
+class InterruptibleMixin:
+ """
+ Helper to ensure that self.finish() gets called on keyboard interrupt.
+
+ This allows downloads to be interrupted without leaving temporary state
+ (like hidden cursors) behind.
+
+ This class is similar to the progress library's existing SigIntMixin
+ helper, but as of version 1.2, that helper has the following problems:
+
+ 1. It calls sys.exit().
+ 2. It discards the existing SIGINT handler completely.
+ 3. It leaves its own handler in place even after an uninterrupted finish,
+ which will have unexpected delayed effects if the user triggers an
+ unrelated keyboard interrupt some time after a progress-displaying
+ download has already completed, for example.
+ """
+
+ def __init__(self, *args: Any, **kwargs: Any) -> None:
+ """
+ Save the original SIGINT handler for later.
+ """
+ # https://github.com/python/mypy/issues/5887
+ super().__init__(*args, **kwargs) # type: ignore
+
+ self.original_handler = signal(SIGINT, self.handle_sigint)
+
+ # If signal() returns None, the previous handler was not installed from
+ # Python, and we cannot restore it. This probably should not happen,
+ # but if it does, we must restore something sensible instead, at least.
+ # The least bad option should be Python's default SIGINT handler, which
+ # just raises KeyboardInterrupt.
+ if self.original_handler is None:
+ self.original_handler = default_int_handler
+
+ def finish(self) -> None:
+ """
+ Restore the original SIGINT handler after finishing.
+
+ This should happen regardless of whether the progress display finishes
+ normally, or gets interrupted.
+ """
+ super().finish() # type: ignore
+ signal(SIGINT, self.original_handler)
+
+ def handle_sigint(self, signum, frame): # type: ignore
+ """
+ Call self.finish() before delegating to the original SIGINT handler.
+
+ This handler should only be in place while the progress display is
+ active.
+ """
+ self.finish()
+ self.original_handler(signum, frame)
+
+
+class SilentBar(Bar):
+ def update(self) -> None:
+ pass
+
+
+class BlueEmojiBar(IncrementalBar):
+
+ suffix = "%(percent)d%%"
+ bar_prefix = " "
+ bar_suffix = " "
+ phases = ("\U0001F539", "\U0001F537", "\U0001F535")
+
+
+class DownloadProgressMixin:
+ def __init__(self, *args: Any, **kwargs: Any) -> None:
+ # https://github.com/python/mypy/issues/5887
+ super().__init__(*args, **kwargs) # type: ignore
+ self.message: str = (" " * (get_indentation() + 2)) + self.message
+
+ @property
+ def downloaded(self) -> str:
+ return format_size(self.index) # type: ignore
+
+ @property
+ def download_speed(self) -> str:
+ # Avoid zero division errors...
+ if self.avg == 0.0: # type: ignore
+ return "..."
+ return format_size(1 / self.avg) + "/s" # type: ignore
+
+ @property
+ def pretty_eta(self) -> str:
+ if self.eta: # type: ignore
+ return f"eta {self.eta_td}" # type: ignore
+ return ""
+
+ def iter(self, it): # type: ignore
+ for x in it:
+ yield x
+ # B305 is incorrectly raised here
+ # https://github.com/PyCQA/flake8-bugbear/issues/59
+ self.next(len(x)) # noqa: B305
+ self.finish()
+
+
+class WindowsMixin:
+ def __init__(self, *args: Any, **kwargs: Any) -> None:
+ # The Windows terminal does not support the hide/show cursor ANSI codes
+ # even with colorama. So we'll ensure that hide_cursor is False on
+ # Windows.
+ # This call needs to go before the super() call, so that hide_cursor
+ # is set in time. The base progress bar class writes the "hide cursor"
+ # code to the terminal in its init, so if we don't set this soon
+ # enough, we get a "hide" with no corresponding "show"...
+ if WINDOWS and self.hide_cursor: # type: ignore
+ self.hide_cursor = False
+
+ # https://github.com/python/mypy/issues/5887
+ super().__init__(*args, **kwargs) # type: ignore
+
+ # Check if we are running on Windows and we have the colorama module,
+ # if we do then wrap our file with it.
+ if WINDOWS and colorama:
+ self.file = colorama.AnsiToWin32(self.file) # type: ignore
+ # The progress code expects to be able to call self.file.isatty()
+ # but the colorama.AnsiToWin32() object doesn't have that, so we'll
+ # add it.
+ self.file.isatty = lambda: self.file.wrapped.isatty()
+ # The progress code expects to be able to call self.file.flush()
+ # but the colorama.AnsiToWin32() object doesn't have that, so we'll
+ # add it.
+ self.file.flush = lambda: self.file.wrapped.flush()
+
+
+class BaseDownloadProgressBar(WindowsMixin, InterruptibleMixin, DownloadProgressMixin):
+
+ file = sys.stdout
+ message = "%(percent)d%%"
+ suffix = "%(downloaded)s %(download_speed)s %(pretty_eta)s"
+
+
+class DefaultDownloadProgressBar(BaseDownloadProgressBar, _BaseBar):
+ pass
+
+
+class DownloadSilentBar(BaseDownloadProgressBar, SilentBar):
+ pass
+
+
+class DownloadBar(BaseDownloadProgressBar, Bar):
+ pass
+
+
+class DownloadFillingCirclesBar(BaseDownloadProgressBar, FillingCirclesBar):
+ pass
+
+
+class DownloadBlueEmojiProgressBar(BaseDownloadProgressBar, BlueEmojiBar):
+ pass
+
+
+class DownloadProgressSpinner(
+ WindowsMixin, InterruptibleMixin, DownloadProgressMixin, Spinner
+):
+
+ file = sys.stdout
+ suffix = "%(downloaded)s %(download_speed)s"
+
+ def next_phase(self) -> str:
+ if not hasattr(self, "_phaser"):
+ self._phaser = itertools.cycle(self.phases)
+ return next(self._phaser)
+
+ def update(self) -> None:
+ message = self.message % self
+ phase = self.next_phase()
+ suffix = self.suffix % self
+ line = "".join(
+ [
+ message,
+ " " if message else "",
+ phase,
+ " " if suffix else "",
+ suffix,
+ ]
+ )
+
+ self.writeln(line)
+
+
+BAR_TYPES = {
+ "off": (DownloadSilentBar, DownloadSilentBar),
+ "on": (DefaultDownloadProgressBar, DownloadProgressSpinner),
+ "ascii": (DownloadBar, DownloadProgressSpinner),
+ "pretty": (DownloadFillingCirclesBar, DownloadProgressSpinner),
+ "emoji": (DownloadBlueEmojiProgressBar, DownloadProgressSpinner),
+}
+
+
+def _legacy_progress_bar(
+ progress_bar: str, max: Optional[int]
+) -> DownloadProgressRenderer:
+ if max is None or max == 0:
+ return BAR_TYPES[progress_bar][1]().iter # type: ignore
+ else:
+ return BAR_TYPES[progress_bar][0](max=max).iter
+
+
+#
+# Modern replacement for our legacy progress bars.
+#
+def _rich_progress_bar(
+ iterable: Iterator[bytes],
+ *,
+ bar_type: str,
+ size: int,
+) -> Iterator[bytes]:
+ assert bar_type == "on", "This should only be used in the default mode."
+
+ if not size:
+ total = float("inf")
+ columns: Tuple[ProgressColumn, ...] = (
+ TextColumn("[progress.description]{task.description}"),
+ SpinnerColumn("line", speed=1.5),
+ FileSizeColumn(),
+ TransferSpeedColumn(),
+ TimeElapsedColumn(),
+ )
+ else:
+ total = size
+ columns = (
+ TextColumn("[progress.description]{task.description}"),
+ BarColumn(),
+ DownloadColumn(),
+ TransferSpeedColumn(),
+ TextColumn("eta"),
+ TimeRemainingColumn(),
+ )
+
+ progress = Progress(*columns, refresh_per_second=30)
+ task_id = progress.add_task(" " * (get_indentation() + 2), total=total)
+ with progress:
+ for chunk in iterable:
+ yield chunk
+ progress.update(task_id, advance=len(chunk))
+
+
+def get_download_progress_renderer(
+ *, bar_type: str, size: Optional[int] = None
+) -> DownloadProgressRenderer:
+ """Get an object that can be used to render the download progress.
+
+    Returns a callable that takes an iterable to "wrap".
+ """
+ if bar_type == "on":
+ return functools.partial(_rich_progress_bar, bar_type=bar_type, size=size)
+ elif bar_type == "off":
+ return iter # no-op, when passed an iterator
+ else:
+ return _legacy_progress_bar(bar_type, size)
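
The factory above returns a callable that wraps a byte iterator and draws progress as a side effect of iteration. Below is a minimal sketch of exercising it; it assumes this module is importable as `pip._internal.cli.progress_bars` (its upstream location) and is illustration only, since pip's internals are not a supported API.

```python
# Hedged sketch: drive the download-progress renderer with fake chunks.
# Assumes the module path pip._internal.cli.progress_bars; pip internals
# offer no stability guarantees, so treat this as illustrative only.
from pip._internal.cli.progress_bars import get_download_progress_renderer


def fake_chunks(n: int, size: int = 1024):
    """Yield n chunks of zero bytes, standing in for a download stream."""
    for _ in range(n):
        yield b"\0" * size


# bar_type="on" selects the rich-based bar; passing size enables the
# download/ETA columns instead of the indeterminate spinner layout.
renderer = get_download_progress_renderer(bar_type="on", size=10 * 1024)
for chunk in renderer(fake_chunks(10)):
    pass  # consuming the wrapped iterator advances the progress display
```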
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/configuration.py b/venv/lib/python3.10/site-packages/pip/_internal/configuration.py
new file mode 100644
index 0000000000000000000000000000000000000000..a8092d1ae069c4095901e7f5cb8e6fa49ef63033
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/configuration.py
@@ -0,0 +1,366 @@
+"""Configuration management setup
+
+Some terminology:
+- name
+ As written in config files.
+- value
+ Value associated with a name
+- key
+  Name combined with its section (section.name)
+- variant
+ A single word describing where the configuration key-value pair came from
+"""
+
+import configparser
+import locale
+import os
+import sys
+from typing import Any, Dict, Iterable, List, NewType, Optional, Tuple
+
+from pip._internal.exceptions import (
+ ConfigurationError,
+ ConfigurationFileCouldNotBeLoaded,
+)
+from pip._internal.utils import appdirs
+from pip._internal.utils.compat import WINDOWS
+from pip._internal.utils.logging import getLogger
+from pip._internal.utils.misc import ensure_dir, enum
+
+RawConfigParser = configparser.RawConfigParser # Shorthand
+Kind = NewType("Kind", str)
+
+CONFIG_BASENAME = "pip.ini" if WINDOWS else "pip.conf"
+ENV_NAMES_IGNORED = "version", "help"
+
+# The kinds of configurations there are.
+kinds = enum(
+ USER="user", # User Specific
+ GLOBAL="global", # System Wide
+ SITE="site", # [Virtual] Environment Specific
+ ENV="env", # from PIP_CONFIG_FILE
+ ENV_VAR="env-var", # from Environment Variables
+)
+OVERRIDE_ORDER = kinds.GLOBAL, kinds.USER, kinds.SITE, kinds.ENV, kinds.ENV_VAR
+VALID_LOAD_ONLY = kinds.USER, kinds.GLOBAL, kinds.SITE
+
+logger = getLogger(__name__)
+
+
+# NOTE: Maybe use the optionx attribute to normalize keynames.
+def _normalize_name(name: str) -> str:
+ """Make a name consistent regardless of source (environment or file)"""
+ name = name.lower().replace("_", "-")
+ if name.startswith("--"):
+ name = name[2:] # only prefer long opts
+ return name
+
+
+def _disassemble_key(name: str) -> List[str]:
+ if "." not in name:
+ error_message = (
+ "Key does not contain dot separated section and key. "
+ "Perhaps you wanted to use 'global.{}' instead?"
+ ).format(name)
+ raise ConfigurationError(error_message)
+ return name.split(".", 1)
+
+
+def get_configuration_files() -> Dict[Kind, List[str]]:
+ global_config_files = [
+ os.path.join(path, CONFIG_BASENAME) for path in appdirs.site_config_dirs("pip")
+ ]
+
+ site_config_file = os.path.join(sys.prefix, CONFIG_BASENAME)
+ legacy_config_file = os.path.join(
+ os.path.expanduser("~"),
+ "pip" if WINDOWS else ".pip",
+ CONFIG_BASENAME,
+ )
+ new_config_file = os.path.join(appdirs.user_config_dir("pip"), CONFIG_BASENAME)
+ return {
+ kinds.GLOBAL: global_config_files,
+ kinds.SITE: [site_config_file],
+ kinds.USER: [legacy_config_file, new_config_file],
+ }
+
+
+class Configuration:
+ """Handles management of configuration.
+
+ Provides an interface to accessing and managing configuration files.
+
+    This class provides an API that takes "section.key-name" style keys and
+    stores the value associated with it as "key-name" under the section
+    "section".
+
+    This allows for a clean interface wherein both the section and the
+    key-name are preserved in an easy-to-manage form in the configuration
+    files, and the stored data stays simple.
+ """
+
+ def __init__(self, isolated: bool, load_only: Optional[Kind] = None) -> None:
+ super().__init__()
+
+ if load_only is not None and load_only not in VALID_LOAD_ONLY:
+ raise ConfigurationError(
+ "Got invalid value for load_only - should be one of {}".format(
+ ", ".join(map(repr, VALID_LOAD_ONLY))
+ )
+ )
+ self.isolated = isolated
+ self.load_only = load_only
+
+ # Because we keep track of where we got the data from
+ self._parsers: Dict[Kind, List[Tuple[str, RawConfigParser]]] = {
+ variant: [] for variant in OVERRIDE_ORDER
+ }
+ self._config: Dict[Kind, Dict[str, Any]] = {
+ variant: {} for variant in OVERRIDE_ORDER
+ }
+ self._modified_parsers: List[Tuple[str, RawConfigParser]] = []
+
+ def load(self) -> None:
+ """Loads configuration from configuration files and environment"""
+ self._load_config_files()
+ if not self.isolated:
+ self._load_environment_vars()
+
+ def get_file_to_edit(self) -> Optional[str]:
+ """Returns the file with highest priority in configuration"""
+        assert self.load_only is not None, "A specific file (load_only) must be selected for editing"
+
+ try:
+ return self._get_parser_to_modify()[0]
+ except IndexError:
+ return None
+
+ def items(self) -> Iterable[Tuple[str, Any]]:
+ """Returns key-value pairs like dict.items() representing the loaded
+ configuration
+ """
+ return self._dictionary.items()
+
+ def get_value(self, key: str) -> Any:
+ """Get a value from the configuration."""
+ try:
+ return self._dictionary[key]
+ except KeyError:
+ raise ConfigurationError(f"No such key - {key}")
+
+ def set_value(self, key: str, value: Any) -> None:
+ """Modify a value in the configuration."""
+ self._ensure_have_load_only()
+
+ assert self.load_only
+ fname, parser = self._get_parser_to_modify()
+
+ if parser is not None:
+ section, name = _disassemble_key(key)
+
+ # Modify the parser and the configuration
+ if not parser.has_section(section):
+ parser.add_section(section)
+ parser.set(section, name, value)
+
+ self._config[self.load_only][key] = value
+ self._mark_as_modified(fname, parser)
+
+ def unset_value(self, key: str) -> None:
+ """Unset a value in the configuration."""
+ self._ensure_have_load_only()
+
+ assert self.load_only
+ if key not in self._config[self.load_only]:
+ raise ConfigurationError(f"No such key - {key}")
+
+ fname, parser = self._get_parser_to_modify()
+
+ if parser is not None:
+ section, name = _disassemble_key(key)
+ if not (
+ parser.has_section(section) and parser.remove_option(section, name)
+ ):
+ # The option was not removed.
+ raise ConfigurationError(
+ "Fatal Internal error [id=1]. Please report as a bug."
+ )
+
+ # The section may be empty after the option was removed.
+ if not parser.items(section):
+ parser.remove_section(section)
+ self._mark_as_modified(fname, parser)
+
+ del self._config[self.load_only][key]
+
+ def save(self) -> None:
+ """Save the current in-memory state."""
+ self._ensure_have_load_only()
+
+ for fname, parser in self._modified_parsers:
+ logger.info("Writing to %s", fname)
+
+ # Ensure directory exists.
+ ensure_dir(os.path.dirname(fname))
+
+ with open(fname, "w") as f:
+ parser.write(f)
+
+ #
+ # Private routines
+ #
+
+ def _ensure_have_load_only(self) -> None:
+ if self.load_only is None:
+ raise ConfigurationError("Needed a specific file to be modifying.")
+ logger.debug("Will be working with %s variant only", self.load_only)
+
+ @property
+ def _dictionary(self) -> Dict[str, Any]:
+ """A dictionary representing the loaded configuration."""
+ # NOTE: Dictionaries are not populated if not loaded. So, conditionals
+ # are not needed here.
+ retval = {}
+
+ for variant in OVERRIDE_ORDER:
+ retval.update(self._config[variant])
+
+ return retval
+
+ def _load_config_files(self) -> None:
+ """Loads configuration from configuration files"""
+ config_files = dict(self.iter_config_files())
+ if config_files[kinds.ENV][0:1] == [os.devnull]:
+ logger.debug(
+ "Skipping loading configuration files due to "
+ "environment's PIP_CONFIG_FILE being os.devnull"
+ )
+ return
+
+ for variant, files in config_files.items():
+ for fname in files:
+                # If there's a specific variant set in `load_only`, load only
+ # that variant, not the others.
+ if self.load_only is not None and variant != self.load_only:
+ logger.debug("Skipping file '%s' (variant: %s)", fname, variant)
+ continue
+
+ parser = self._load_file(variant, fname)
+
+ # Keeping track of the parsers used
+ self._parsers[variant].append((fname, parser))
+
+ def _load_file(self, variant: Kind, fname: str) -> RawConfigParser:
+ logger.verbose("For variant '%s', will try loading '%s'", variant, fname)
+ parser = self._construct_parser(fname)
+
+ for section in parser.sections():
+ items = parser.items(section)
+ self._config[variant].update(self._normalized_keys(section, items))
+
+ return parser
+
+ def _construct_parser(self, fname: str) -> RawConfigParser:
+ parser = configparser.RawConfigParser()
+ # If there is no such file, don't bother reading it but create the
+ # parser anyway, to hold the data.
+ # Doing this is useful when modifying and saving files, where we don't
+ # need to construct a parser.
+ if os.path.exists(fname):
+ locale_encoding = locale.getpreferredencoding(False)
+ try:
+ parser.read(fname, encoding=locale_encoding)
+ except UnicodeDecodeError:
+ # See https://github.com/pypa/pip/issues/4963
+ raise ConfigurationFileCouldNotBeLoaded(
+ reason=f"contains invalid {locale_encoding} characters",
+ fname=fname,
+ )
+ except configparser.Error as error:
+ # See https://github.com/pypa/pip/issues/4893
+ raise ConfigurationFileCouldNotBeLoaded(error=error)
+ return parser
+
+ def _load_environment_vars(self) -> None:
+ """Loads configuration from environment variables"""
+ self._config[kinds.ENV_VAR].update(
+ self._normalized_keys(":env:", self.get_environ_vars())
+ )
+
+ def _normalized_keys(
+ self, section: str, items: Iterable[Tuple[str, Any]]
+ ) -> Dict[str, Any]:
+ """Normalizes items to construct a dictionary with normalized keys.
+
+ This routine is where the names become keys and are made the same
+ regardless of source - configuration files or environment.
+ """
+ normalized = {}
+ for name, val in items:
+ key = section + "." + _normalize_name(name)
+ normalized[key] = val
+ return normalized
+
+ def get_environ_vars(self) -> Iterable[Tuple[str, str]]:
+ """Returns a generator with all environmental vars with prefix PIP_"""
+ for key, val in os.environ.items():
+ if key.startswith("PIP_"):
+ name = key[4:].lower()
+ if name not in ENV_NAMES_IGNORED:
+ yield name, val
+
+ # XXX: This is patched in the tests.
+ def iter_config_files(self) -> Iterable[Tuple[Kind, List[str]]]:
+ """Yields variant and configuration files associated with it.
+
+ This should be treated like items of a dictionary.
+ """
+ # SMELL: Move the conditions out of this function
+
+ # environment variables have the lowest priority
+ config_file = os.environ.get("PIP_CONFIG_FILE", None)
+ if config_file is not None:
+ yield kinds.ENV, [config_file]
+ else:
+ yield kinds.ENV, []
+
+ config_files = get_configuration_files()
+
+ # at the base we have any global configuration
+ yield kinds.GLOBAL, config_files[kinds.GLOBAL]
+
+ # per-user configuration next
+ should_load_user_config = not self.isolated and not (
+ config_file and os.path.exists(config_file)
+ )
+ if should_load_user_config:
+ # The legacy config file is overridden by the new config file
+ yield kinds.USER, config_files[kinds.USER]
+
+        # finally, the virtualenv configuration, which trumps the others
+ yield kinds.SITE, config_files[kinds.SITE]
+
+ def get_values_in_config(self, variant: Kind) -> Dict[str, Any]:
+ """Get values present in a config file"""
+ return self._config[variant]
+
+ def _get_parser_to_modify(self) -> Tuple[str, RawConfigParser]:
+ # Determine which parser to modify
+ assert self.load_only
+ parsers = self._parsers[self.load_only]
+ if not parsers:
+ # This should not happen if everything works correctly.
+ raise ConfigurationError(
+ "Fatal Internal error [id=2]. Please report as a bug."
+ )
+
+ # Use the highest priority parser.
+ return parsers[-1]
+
+ # XXX: This is patched in the tests.
+ def _mark_as_modified(self, fname: str, parser: RawConfigParser) -> None:
+ file_parser_tuple = (fname, parser)
+ if file_parser_tuple not in self._modified_parsers:
+ self._modified_parsers.append(file_parser_tuple)
+
+ def __repr__(self) -> str:
+ return f"{self.__class__.__name__}({self._dictionary!r})"
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/exceptions.py b/venv/lib/python3.10/site-packages/pip/_internal/exceptions.py
new file mode 100644
index 0000000000000000000000000000000000000000..97b9612a187a5e97579551e82244bcc30eacb3bf
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/exceptions.py
@@ -0,0 +1,658 @@
+"""Exceptions used throughout package.
+
+This module MUST NOT try to import from anything within `pip._internal` to
+operate. This is expected to be importable from any/all files within the
+subpackage and, thus, should not depend on them.
+"""
+
+import configparser
+import re
+from itertools import chain, groupby, repeat
+from typing import TYPE_CHECKING, Dict, List, Optional, Union
+
+from pip._vendor.requests.models import Request, Response
+from pip._vendor.rich.console import Console, ConsoleOptions, RenderResult
+from pip._vendor.rich.markup import escape
+from pip._vendor.rich.text import Text
+
+if TYPE_CHECKING:
+ from hashlib import _Hash
+ from typing import Literal
+
+ from pip._internal.metadata import BaseDistribution
+ from pip._internal.req.req_install import InstallRequirement
+
+
+#
+# Scaffolding
+#
+def _is_kebab_case(s: str) -> bool:
+ return re.match(r"^[a-z]+(-[a-z]+)*$", s) is not None
+
+
+def _prefix_with_indent(
+ s: Union[Text, str],
+ console: Console,
+ *,
+ prefix: str,
+ indent: str,
+) -> Text:
+ if isinstance(s, Text):
+ text = s
+ else:
+ text = console.render_str(s)
+
+ return console.render_str(prefix, overflow="ignore") + console.render_str(
+ f"\n{indent}", overflow="ignore"
+ ).join(text.split(allow_blank=True))
+
+
+class PipError(Exception):
+ """The base pip error."""
+
+
+class DiagnosticPipError(PipError):
+ """An error, that presents diagnostic information to the user.
+
+ This contains a bunch of logic, to enable pretty presentation of our error
+ messages. Each error gets a unique reference. Each error can also include
+ additional context, a hint and/or a note -- which are presented with the
+ main error message in a consistent style.
+
+ This is adapted from the error output styling in `sphinx-theme-builder`.
+ """
+
+ reference: str
+
+ def __init__(
+ self,
+ *,
+ kind: 'Literal["error", "warning"]' = "error",
+ reference: Optional[str] = None,
+ message: Union[str, Text],
+ context: Optional[Union[str, Text]],
+ hint_stmt: Optional[Union[str, Text]],
+ note_stmt: Optional[Union[str, Text]] = None,
+ link: Optional[str] = None,
+ ) -> None:
+ # Ensure a proper reference is provided.
+ if reference is None:
+ assert hasattr(self, "reference"), "error reference not provided!"
+ reference = self.reference
+ assert _is_kebab_case(reference), "error reference must be kebab-case!"
+
+ self.kind = kind
+ self.reference = reference
+
+ self.message = message
+ self.context = context
+
+ self.note_stmt = note_stmt
+ self.hint_stmt = hint_stmt
+
+ self.link = link
+
+ super().__init__(f"<{self.__class__.__name__}: {self.reference}>")
+
+ def __repr__(self) -> str:
+ return (
+ f"<{self.__class__.__name__}("
+ f"reference={self.reference!r}, "
+ f"message={self.message!r}, "
+ f"context={self.context!r}, "
+ f"note_stmt={self.note_stmt!r}, "
+ f"hint_stmt={self.hint_stmt!r}"
+ ")>"
+ )
+
+ def __rich_console__(
+ self,
+ console: Console,
+ options: ConsoleOptions,
+ ) -> RenderResult:
+ colour = "red" if self.kind == "error" else "yellow"
+
+ yield f"[{colour} bold]{self.kind}[/]: [bold]{self.reference}[/]"
+ yield ""
+
+ if not options.ascii_only:
+ # Present the main message, with relevant context indented.
+ if self.context is not None:
+ yield _prefix_with_indent(
+ self.message,
+ console,
+ prefix=f"[{colour}]×[/] ",
+ indent=f"[{colour}]│[/] ",
+ )
+ yield _prefix_with_indent(
+ self.context,
+ console,
+ prefix=f"[{colour}]╰─>[/] ",
+ indent=f"[{colour}] [/] ",
+ )
+ else:
+ yield _prefix_with_indent(
+ self.message,
+ console,
+ prefix="[red]×[/] ",
+ indent=" ",
+ )
+ else:
+ yield self.message
+ if self.context is not None:
+ yield ""
+ yield self.context
+
+ if self.note_stmt is not None or self.hint_stmt is not None:
+ yield ""
+
+ if self.note_stmt is not None:
+ yield _prefix_with_indent(
+ self.note_stmt,
+ console,
+ prefix="[magenta bold]note[/]: ",
+ indent=" ",
+ )
+ if self.hint_stmt is not None:
+ yield _prefix_with_indent(
+ self.hint_stmt,
+ console,
+ prefix="[cyan bold]hint[/]: ",
+ indent=" ",
+ )
+
+ if self.link is not None:
+ yield ""
+ yield f"Link: {self.link}"
+
+
+#
+# Actual Errors
+#
+class ConfigurationError(PipError):
+ """General exception in configuration"""
+
+
+class InstallationError(PipError):
+ """General exception during installation"""
+
+
+class UninstallationError(PipError):
+ """General exception during uninstallation"""
+
+
+class MissingPyProjectBuildRequires(DiagnosticPipError):
+ """Raised when pyproject.toml has `build-system`, but no `build-system.requires`."""
+
+ reference = "missing-pyproject-build-system-requires"
+
+ def __init__(self, *, package: str) -> None:
+ super().__init__(
+ message=f"Can not process {escape(package)}",
+ context=Text(
+ "This package has an invalid pyproject.toml file.\n"
+ "The [build-system] table is missing the mandatory `requires` key."
+ ),
+ note_stmt="This is an issue with the package mentioned above, not pip.",
+ hint_stmt=Text("See PEP 518 for the detailed specification."),
+ )
+
+
+class InvalidPyProjectBuildRequires(DiagnosticPipError):
+ """Raised when pyproject.toml an invalid `build-system.requires`."""
+
+ reference = "invalid-pyproject-build-system-requires"
+
+ def __init__(self, *, package: str, reason: str) -> None:
+ super().__init__(
+ message=f"Can not process {escape(package)}",
+ context=Text(
+ "This package has an invalid `build-system.requires` key in "
+ f"pyproject.toml.\n{reason}"
+ ),
+ note_stmt="This is an issue with the package mentioned above, not pip.",
+ hint_stmt=Text("See PEP 518 for the detailed specification."),
+ )
+
+
+class NoneMetadataError(PipError):
+ """Raised when accessing a Distribution's "METADATA" or "PKG-INFO".
+
+ This signifies an inconsistency, when the Distribution claims to have
+ the metadata file (if not, raise ``FileNotFoundError`` instead), but is
+ not actually able to produce its content. This may be due to permission
+ errors.
+ """
+
+ def __init__(
+ self,
+ dist: "BaseDistribution",
+ metadata_name: str,
+ ) -> None:
+ """
+ :param dist: A Distribution object.
+ :param metadata_name: The name of the metadata being accessed
+ (can be "METADATA" or "PKG-INFO").
+ """
+ self.dist = dist
+ self.metadata_name = metadata_name
+
+ def __str__(self) -> str:
+ # Use `dist` in the error message because its stringification
+ # includes more information, like the version and location.
+ return "None {} metadata found for distribution: {}".format(
+ self.metadata_name,
+ self.dist,
+ )
+
+
+class UserInstallationInvalid(InstallationError):
+ """A --user install is requested on an environment without user site."""
+
+ def __str__(self) -> str:
+ return "User base directory is not specified"
+
+
+class InvalidSchemeCombination(InstallationError):
+ def __str__(self) -> str:
+ before = ", ".join(str(a) for a in self.args[:-1])
+ return f"Cannot set {before} and {self.args[-1]} together"
+
+
+class DistributionNotFound(InstallationError):
+ """Raised when a distribution cannot be found to satisfy a requirement"""
+
+
+class RequirementsFileParseError(InstallationError):
+ """Raised when a general error occurs parsing a requirements file line."""
+
+
+class BestVersionAlreadyInstalled(PipError):
+ """Raised when the most up-to-date version of a package is already
+ installed."""
+
+
+class BadCommand(PipError):
+ """Raised when virtualenv or a command is not found"""
+
+
+class CommandError(PipError):
+ """Raised when there is an error in command-line arguments"""
+
+
+class PreviousBuildDirError(PipError):
+ """Raised when there's a previous conflicting build directory"""
+
+
+class NetworkConnectionError(PipError):
+ """HTTP connection error"""
+
+ def __init__(
+ self, error_msg: str, response: Response = None, request: Request = None
+ ) -> None:
+ """
+ Initialize NetworkConnectionError with `request` and `response`
+ objects.
+ """
+ self.response = response
+ self.request = request
+ self.error_msg = error_msg
+ if (
+ self.response is not None
+ and not self.request
+ and hasattr(response, "request")
+ ):
+ self.request = self.response.request
+ super().__init__(error_msg, response, request)
+
+ def __str__(self) -> str:
+ return str(self.error_msg)
+
+
+class InvalidWheelFilename(InstallationError):
+ """Invalid wheel filename."""
+
+
+class UnsupportedWheel(InstallationError):
+ """Unsupported wheel."""
+
+
+class InvalidWheel(InstallationError):
+ """Invalid (e.g. corrupt) wheel."""
+
+ def __init__(self, location: str, name: str):
+ self.location = location
+ self.name = name
+
+ def __str__(self) -> str:
+ return f"Wheel '{self.name}' located at {self.location} is invalid."
+
+
+class MetadataInconsistent(InstallationError):
+ """Built metadata contains inconsistent information.
+
+ This is raised when the metadata contains values (e.g. name and version)
+ that do not match the information previously obtained from sdist filename
+ or user-supplied ``#egg=`` value.
+ """
+
+ def __init__(
+ self, ireq: "InstallRequirement", field: str, f_val: str, m_val: str
+ ) -> None:
+ self.ireq = ireq
+ self.field = field
+ self.f_val = f_val
+ self.m_val = m_val
+
+ def __str__(self) -> str:
+ template = (
+ "Requested {} has inconsistent {}: "
+ "filename has {!r}, but metadata has {!r}"
+ )
+ return template.format(self.ireq, self.field, self.f_val, self.m_val)
+
+
+class LegacyInstallFailure(DiagnosticPipError):
+ """Error occurred while executing `setup.py install`"""
+
+ reference = "legacy-install-failure"
+
+ def __init__(self, package_details: str) -> None:
+ super().__init__(
+ message="Encountered error while trying to install package.",
+ context=package_details,
+ hint_stmt="See above for output from the failure.",
+ note_stmt="This is an issue with the package mentioned above, not pip.",
+ )
+
+
+class InstallationSubprocessError(DiagnosticPipError, InstallationError):
+ """A subprocess call failed."""
+
+ reference = "subprocess-exited-with-error"
+
+ def __init__(
+ self,
+ *,
+ command_description: str,
+ exit_code: int,
+ output_lines: Optional[List[str]],
+ ) -> None:
+ if output_lines is None:
+ output_prompt = Text("See above for output.")
+ else:
+ output_prompt = (
+ Text.from_markup(f"[red][{len(output_lines)} lines of output][/]\n")
+ + Text("".join(output_lines))
+ + Text.from_markup(R"[red]\[end of output][/]")
+ )
+
+ super().__init__(
+ message=(
+ f"[green]{escape(command_description)}[/] did not run successfully.\n"
+ f"exit code: {exit_code}"
+ ),
+ context=output_prompt,
+ hint_stmt=None,
+ note_stmt=(
+ "This error originates from a subprocess, and is likely not a "
+ "problem with pip."
+ ),
+ )
+
+ self.command_description = command_description
+ self.exit_code = exit_code
+
+ def __str__(self) -> str:
+ return f"{self.command_description} exited with {self.exit_code}"
+
+
+class MetadataGenerationFailed(InstallationSubprocessError, InstallationError):
+ reference = "metadata-generation-failed"
+
+ def __init__(
+ self,
+ *,
+ package_details: str,
+ ) -> None:
+ super(InstallationSubprocessError, self).__init__(
+ message="Encountered error while generating package metadata.",
+ context=escape(package_details),
+ hint_stmt="See above for details.",
+ note_stmt="This is an issue with the package mentioned above, not pip.",
+ )
+
+ def __str__(self) -> str:
+ return "metadata generation failed"
+
+
+class HashErrors(InstallationError):
+ """Multiple HashError instances rolled into one for reporting"""
+
+ def __init__(self) -> None:
+ self.errors: List["HashError"] = []
+
+ def append(self, error: "HashError") -> None:
+ self.errors.append(error)
+
+ def __str__(self) -> str:
+ lines = []
+ self.errors.sort(key=lambda e: e.order)
+ for cls, errors_of_cls in groupby(self.errors, lambda e: e.__class__):
+ lines.append(cls.head)
+ lines.extend(e.body() for e in errors_of_cls)
+ if lines:
+ return "\n".join(lines)
+ return ""
+
+ def __bool__(self) -> bool:
+ return bool(self.errors)
+
+
+class HashError(InstallationError):
+ """
+ A failure to verify a package against known-good hashes
+
+ :cvar order: An int sorting hash exception classes by difficulty of
+ recovery (lower being harder), so the user doesn't bother fretting
+ about unpinned packages when he has deeper issues, like VCS
+ dependencies, to deal with. Also keeps error reports in a
+ deterministic order.
+ :cvar head: A section heading for display above potentially many
+ exceptions of this kind
+ :ivar req: The InstallRequirement that triggered this error. This is
+ pasted on after the exception is instantiated, because it's not
+ typically available earlier.
+
+ """
+
+ req: Optional["InstallRequirement"] = None
+ head = ""
+ order: int = -1
+
+ def body(self) -> str:
+ """Return a summary of me for display under the heading.
+
+ This default implementation simply prints a description of the
+ triggering requirement.
+
+ :param req: The InstallRequirement that provoked this error, with
+ its link already populated by the resolver's _populate_link().
+
+ """
+ return f" {self._requirement_name()}"
+
+ def __str__(self) -> str:
+ return f"{self.head}\n{self.body()}"
+
+ def _requirement_name(self) -> str:
+ """Return a description of the requirement that triggered me.
+
+ This default implementation returns long description of the req, with
+ line numbers
+
+ """
+ return str(self.req) if self.req else "unknown package"
+
+
+class VcsHashUnsupported(HashError):
+ """A hash was provided for a version-control-system-based requirement, but
+ we don't have a method for hashing those."""
+
+ order = 0
+ head = (
+ "Can't verify hashes for these requirements because we don't "
+ "have a way to hash version control repositories:"
+ )
+
+
+class DirectoryUrlHashUnsupported(HashError):
+ """A hash was provided for a version-control-system-based requirement, but
+ we don't have a method for hashing those."""
+
+ order = 1
+ head = (
+ "Can't verify hashes for these file:// requirements because they "
+ "point to directories:"
+ )
+
+
+class HashMissing(HashError):
+ """A hash was needed for a requirement but is absent."""
+
+ order = 2
+ head = (
+ "Hashes are required in --require-hashes mode, but they are "
+ "missing from some requirements. Here is a list of those "
+ "requirements along with the hashes their downloaded archives "
+ "actually had. Add lines like these to your requirements files to "
+ "prevent tampering. (If you did not enable --require-hashes "
+ "manually, note that it turns on automatically when any package "
+ "has a hash.)"
+ )
+
+ def __init__(self, gotten_hash: str) -> None:
+ """
+ :param gotten_hash: The hash of the (possibly malicious) archive we
+ just downloaded
+ """
+ self.gotten_hash = gotten_hash
+
+ def body(self) -> str:
+ # Dodge circular import.
+ from pip._internal.utils.hashes import FAVORITE_HASH
+
+ package = None
+ if self.req:
+ # In the case of URL-based requirements, display the original URL
+ # seen in the requirements file rather than the package name,
+ # so the output can be directly copied into the requirements file.
+ package = (
+ self.req.original_link
+ if self.req.original_link
+ # In case someone feeds something downright stupid
+ # to InstallRequirement's constructor.
+ else getattr(self.req, "req", None)
+ )
+ return " {} --hash={}:{}".format(
+ package or "unknown package", FAVORITE_HASH, self.gotten_hash
+ )
+
+
+class HashUnpinned(HashError):
+ """A requirement had a hash specified but was not pinned to a specific
+ version."""
+
+ order = 3
+ head = (
+ "In --require-hashes mode, all requirements must have their "
+ "versions pinned with ==. These do not:"
+ )
+
+
+class HashMismatch(HashError):
+ """
+ Distribution file hash values don't match.
+
+ :ivar package_name: The name of the package that triggered the hash
+        mismatch. Feel free to write to this after the exception is raised to
+ improve its error message.
+
+ """
+
+ order = 4
+ head = (
+ "THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS "
+ "FILE. If you have updated the package versions, please update "
+ "the hashes. Otherwise, examine the package contents carefully; "
+ "someone may have tampered with them."
+ )
+
+ def __init__(self, allowed: Dict[str, List[str]], gots: Dict[str, "_Hash"]) -> None:
+ """
+ :param allowed: A dict of algorithm names pointing to lists of allowed
+ hex digests
+ :param gots: A dict of algorithm names pointing to hashes we
+ actually got from the files under suspicion
+ """
+ self.allowed = allowed
+ self.gots = gots
+
+ def body(self) -> str:
+ return " {}:\n{}".format(self._requirement_name(), self._hash_comparison())
+
+ def _hash_comparison(self) -> str:
+ """
+ Return a comparison of actual and expected hash values.
+
+ Example::
+
+ Expected sha256 abcdeabcdeabcdeabcdeabcdeabcdeabcdeabcdeabcde
+ or 123451234512345123451234512345123451234512345
+ Got bcdefbcdefbcdefbcdefbcdefbcdefbcdefbcdefbcdef
+
+ """
+
+ def hash_then_or(hash_name: str) -> "chain[str]":
+ # For now, all the decent hashes have 6-char names, so we can get
+ # away with hard-coding space literals.
+ return chain([hash_name], repeat(" or"))
+
+ lines: List[str] = []
+ for hash_name, expecteds in self.allowed.items():
+ prefix = hash_then_or(hash_name)
+ lines.extend(
+ (" Expected {} {}".format(next(prefix), e)) for e in expecteds
+ )
+ lines.append(
+ " Got {}\n".format(self.gots[hash_name].hexdigest())
+ )
+ return "\n".join(lines)
+
+
+class UnsupportedPythonVersion(InstallationError):
+ """Unsupported python version according to Requires-Python package
+ metadata."""
+
+
+class ConfigurationFileCouldNotBeLoaded(ConfigurationError):
+ """When there are errors while loading a configuration file"""
+
+ def __init__(
+ self,
+ reason: str = "could not be loaded",
+ fname: Optional[str] = None,
+ error: Optional[configparser.Error] = None,
+ ) -> None:
+ super().__init__(error)
+ self.reason = reason
+ self.fname = fname
+ self.error = error
+
+ def __str__(self) -> str:
+ if self.fname is not None:
+ message_part = f" in {self.fname}."
+ else:
+ assert self.error is not None
+ message_part = f".\n{self.error}\n"
+ return f"Configuration file {self.reason}{message_part}"
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/main.py b/venv/lib/python3.10/site-packages/pip/_internal/main.py
new file mode 100644
index 0000000000000000000000000000000000000000..33c6d24cd85b55a9fb1b1e6ab784f471e2b135f0
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/main.py
@@ -0,0 +1,12 @@
+from typing import List, Optional
+
+
+def main(args: Optional[List[str]] = None) -> int:
+ """This is preserved for old console scripts that may still be referencing
+ it.
+
+ For additional details, see https://github.com/pypa/pip/issues/7498.
+ """
+ from pip._internal.utils.entrypoints import _wrapper
+
+ return _wrapper(args)
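
The shim above only matters for console scripts generated by very old pip installs that still import `pip._internal.main`. A hypothetical such entry-point script might look like the sketch below; new code should invoke `python -m pip` instead.

```python
# Hypothetical legacy console-script body, for illustration only.
import sys

from pip._internal.main import main

if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))
```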
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/pyproject.py b/venv/lib/python3.10/site-packages/pip/_internal/pyproject.py
new file mode 100644
index 0000000000000000000000000000000000000000..e183eaf86588b35a32117b4fa889842d3ea32216
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/pyproject.py
@@ -0,0 +1,168 @@
+import os
+from collections import namedtuple
+from typing import Any, List, Optional
+
+from pip._vendor import tomli
+from pip._vendor.packaging.requirements import InvalidRequirement, Requirement
+
+from pip._internal.exceptions import (
+ InstallationError,
+ InvalidPyProjectBuildRequires,
+ MissingPyProjectBuildRequires,
+)
+
+
+def _is_list_of_str(obj: Any) -> bool:
+ return isinstance(obj, list) and all(isinstance(item, str) for item in obj)
+
+
+def make_pyproject_path(unpacked_source_directory: str) -> str:
+ return os.path.join(unpacked_source_directory, "pyproject.toml")
+
+
+BuildSystemDetails = namedtuple(
+ "BuildSystemDetails", ["requires", "backend", "check", "backend_path"]
+)
+
+
+def load_pyproject_toml(
+ use_pep517: Optional[bool], pyproject_toml: str, setup_py: str, req_name: str
+) -> Optional[BuildSystemDetails]:
+ """Load the pyproject.toml file.
+
+ Parameters:
+ use_pep517 - Has the user requested PEP 517 processing? None
+ means the user hasn't explicitly specified.
+ pyproject_toml - Location of the project's pyproject.toml file
+ setup_py - Location of the project's setup.py file
+ req_name - The name of the requirement we're processing (for
+ error reporting)
+
+ Returns:
+ None if we should use the legacy code path, otherwise a tuple
+ (
+ requirements from pyproject.toml,
+ name of PEP 517 backend,
+ requirements we should check are installed after setting
+            up the build environment,
+ directory paths to import the backend from (backend-path),
+ relative to the project root.
+ )
+ """
+ has_pyproject = os.path.isfile(pyproject_toml)
+ has_setup = os.path.isfile(setup_py)
+
+ if not has_pyproject and not has_setup:
+ raise InstallationError(
+ f"{req_name} does not appear to be a Python project: "
+ f"neither 'setup.py' nor 'pyproject.toml' found."
+ )
+
+ if has_pyproject:
+ with open(pyproject_toml, encoding="utf-8") as f:
+ pp_toml = tomli.loads(f.read())
+ build_system = pp_toml.get("build-system")
+ else:
+ build_system = None
+
+ # The following cases must use PEP 517
+ # We check for use_pep517 being non-None and falsey because that means
+ # the user explicitly requested --no-use-pep517. The value 0 as
+ # opposed to False can occur when the value is provided via an
+ # environment variable or config file option (due to the quirk of
+ # strtobool() returning an integer in pip's configuration code).
+ if has_pyproject and not has_setup:
+ if use_pep517 is not None and not use_pep517:
+ raise InstallationError(
+ "Disabling PEP 517 processing is invalid: "
+ "project does not have a setup.py"
+ )
+ use_pep517 = True
+ elif build_system and "build-backend" in build_system:
+ if use_pep517 is not None and not use_pep517:
+ raise InstallationError(
+ "Disabling PEP 517 processing is invalid: "
+ "project specifies a build backend of {} "
+ "in pyproject.toml".format(build_system["build-backend"])
+ )
+ use_pep517 = True
+
+ # If we haven't worked out whether to use PEP 517 yet,
+ # and the user hasn't explicitly stated a preference,
+ # we do so if the project has a pyproject.toml file.
+ elif use_pep517 is None:
+ use_pep517 = has_pyproject
+
+ # At this point, we know whether we're going to use PEP 517.
+ assert use_pep517 is not None
+
+ # If we're using the legacy code path, there is nothing further
+ # for us to do here.
+ if not use_pep517:
+ return None
+
+ if build_system is None:
+ # Either the user has a pyproject.toml with no build-system
+ # section, or the user has no pyproject.toml, but has opted in
+ # explicitly via --use-pep517.
+ # In the absence of any explicit backend specification, we
+ # assume the setuptools backend that most closely emulates the
+ # traditional direct setup.py execution, and require wheel and
+ # a version of setuptools that supports that backend.
+
+ build_system = {
+ "requires": ["setuptools>=40.8.0", "wheel"],
+ "build-backend": "setuptools.build_meta:__legacy__",
+ }
+
+ # If we're using PEP 517, we have build system information (either
+ # from pyproject.toml, or defaulted by the code above).
+ # Note that at this point, we do not know if the user has actually
+ # specified a backend, though.
+ assert build_system is not None
+
+ # Ensure that the build-system section in pyproject.toml conforms
+ # to PEP 518.
+
+ # Specifying the build-system table but not the requires key is invalid
+ if "requires" not in build_system:
+ raise MissingPyProjectBuildRequires(package=req_name)
+
+ # Error out if requires is not a list of strings
+ requires = build_system["requires"]
+ if not _is_list_of_str(requires):
+ raise InvalidPyProjectBuildRequires(
+ package=req_name,
+ reason="It is not a list of strings.",
+ )
+
+ # Each requirement must be valid as per PEP 508
+ for requirement in requires:
+ try:
+ Requirement(requirement)
+ except InvalidRequirement as error:
+ raise InvalidPyProjectBuildRequires(
+ package=req_name,
+ reason=f"It contains an invalid requirement: {requirement!r}",
+ ) from error
+
+ backend = build_system.get("build-backend")
+ backend_path = build_system.get("backend-path", [])
+ check: List[str] = []
+ if backend is None:
+ # If the user didn't specify a backend, we assume they want to use
+ # the setuptools backend. But we can't be sure they have included
+ # a version of setuptools which supplies the backend, or wheel
+ # (which is needed by the backend) in their requirements. So we
+ # make a note to check that those requirements are present once
+ # we have set up the environment.
+ # This is quite a lot of work to check for a very specific case. But
+ # the problem is, that case is potentially quite common - projects that
+ # adopted PEP 518 early for the ability to specify requirements to
+ # execute setup.py, but never considered needing to mention the build
+ # tools themselves. The original PEP 518 code had a similar check (but
+ # implemented in a different way).
+ backend = "setuptools.build_meta:__legacy__"
+ check = ["setuptools>=40.8.0", "wheel"]
+
+ return BuildSystemDetails(requires, backend, check, backend_path)
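
A hedged sketch of driving `load_pyproject_toml` follows. The project directory and requirement name are placeholders; if neither `pyproject.toml` nor `setup.py` exists at that path the function raises `InstallationError`, as implemented above.

```python
# Illustrative only: inspect how pip would treat a hypothetical source tree.
import os

from pip._internal.pyproject import load_pyproject_toml, make_pyproject_path

source_dir = "/tmp/example-project"  # placeholder unpacked source directory
details = load_pyproject_toml(
    use_pep517=None,  # the user expressed no explicit preference
    pyproject_toml=make_pyproject_path(source_dir),
    setup_py=os.path.join(source_dir, "setup.py"),
    req_name="example-project",
)

if details is None:
    print("legacy setup.py code path")
else:
    requires, backend, check, backend_path = details
    print("PEP 517 backend:", backend)
    print("build requirements:", requires, "post-setup checks:", check)
```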
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/self_outdated_check.py b/venv/lib/python3.10/site-packages/pip/_internal/self_outdated_check.py
new file mode 100644
index 0000000000000000000000000000000000000000..7300e0ea4c0d06ced25a6abdeab0769354167920
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/self_outdated_check.py
@@ -0,0 +1,189 @@
+import datetime
+import hashlib
+import json
+import logging
+import optparse
+import os.path
+import sys
+from typing import Any, Dict
+
+from pip._vendor.packaging.version import parse as parse_version
+
+from pip._internal.index.collector import LinkCollector
+from pip._internal.index.package_finder import PackageFinder
+from pip._internal.metadata import get_default_environment
+from pip._internal.models.selection_prefs import SelectionPreferences
+from pip._internal.network.session import PipSession
+from pip._internal.utils.filesystem import adjacent_tmp_file, check_path_owner, replace
+from pip._internal.utils.misc import ensure_dir
+
+SELFCHECK_DATE_FMT = "%Y-%m-%dT%H:%M:%SZ"
+
+
+logger = logging.getLogger(__name__)
+
+
+def _get_statefile_name(key: str) -> str:
+ key_bytes = key.encode()
+ name = hashlib.sha224(key_bytes).hexdigest()
+ return name
+
+
+class SelfCheckState:
+ def __init__(self, cache_dir: str) -> None:
+ self.state: Dict[str, Any] = {}
+ self.statefile_path = None
+
+ # Try to load the existing state
+ if cache_dir:
+ self.statefile_path = os.path.join(
+ cache_dir, "selfcheck", _get_statefile_name(self.key)
+ )
+ try:
+ with open(self.statefile_path, encoding="utf-8") as statefile:
+ self.state = json.load(statefile)
+ except (OSError, ValueError, KeyError):
+ # Explicitly suppressing exceptions, since we don't want to
+ # error out if the cache file is invalid.
+ pass
+
+ @property
+ def key(self) -> str:
+ return sys.prefix
+
+ def save(self, pypi_version: str, current_time: datetime.datetime) -> None:
+ # If we do not have a path to cache in, don't bother saving.
+ if not self.statefile_path:
+ return
+
+ # Check to make sure that we own the directory
+ if not check_path_owner(os.path.dirname(self.statefile_path)):
+ return
+
+ # Now that we've ensured the directory is owned by this user, we'll go
+ # ahead and make sure that all our directories are created.
+ ensure_dir(os.path.dirname(self.statefile_path))
+
+ state = {
+ # Include the key so it's easy to tell which pip wrote the
+ # file.
+ "key": self.key,
+ "last_check": current_time.strftime(SELFCHECK_DATE_FMT),
+ "pypi_version": pypi_version,
+ }
+
+ text = json.dumps(state, sort_keys=True, separators=(",", ":"))
+
+ with adjacent_tmp_file(self.statefile_path) as f:
+ f.write(text.encode())
+
+ try:
+ # Since we have a prefix-specific state file, we can just
+ # overwrite whatever is there, no need to check.
+ replace(f.name, self.statefile_path)
+ except OSError:
+ # Best effort.
+ pass
+
+
+def was_installed_by_pip(pkg: str) -> bool:
+ """Checks whether pkg was installed by pip
+
+    This is used to avoid displaying the upgrade message when pip was in
+    fact installed by the system package manager, such as dnf on Fedora.
+ """
+ dist = get_default_environment().get_distribution(pkg)
+ return dist is not None and "pip" == dist.installer
+
+
+def pip_self_version_check(session: PipSession, options: optparse.Values) -> None:
+ """Check for an update for pip.
+
+ Limit the frequency of checks to once per week. State is stored either in
+ the active virtualenv or in the user's USER_CACHE_DIR keyed off the prefix
+ of the pip script path.
+ """
+ installed_dist = get_default_environment().get_distribution("pip")
+ if not installed_dist:
+ return
+
+ pip_version = installed_dist.version
+ pypi_version = None
+
+ try:
+ state = SelfCheckState(cache_dir=options.cache_dir)
+
+ current_time = datetime.datetime.utcnow()
+ # Determine if we need to refresh the state
+ if "last_check" in state.state and "pypi_version" in state.state:
+ last_check = datetime.datetime.strptime(
+ state.state["last_check"], SELFCHECK_DATE_FMT
+ )
+ if (current_time - last_check).total_seconds() < 7 * 24 * 60 * 60:
+ pypi_version = state.state["pypi_version"]
+
+ # Refresh the version if we need to or just see if we need to warn
+ if pypi_version is None:
+            # Let's use PackageFinder to see what the latest pip version is
+ link_collector = LinkCollector.create(
+ session,
+ options=options,
+ suppress_no_index=True,
+ )
+
+ # Pass allow_yanked=False so we don't suggest upgrading to a
+ # yanked version.
+ selection_prefs = SelectionPreferences(
+ allow_yanked=False,
+ allow_all_prereleases=False, # Explicitly set to False
+ )
+
+ finder = PackageFinder.create(
+ link_collector=link_collector,
+ selection_prefs=selection_prefs,
+ use_deprecated_html5lib=(
+ "html5lib" in options.deprecated_features_enabled
+ ),
+ )
+ best_candidate = finder.find_best_candidate("pip").best_candidate
+ if best_candidate is None:
+ return
+ pypi_version = str(best_candidate.version)
+
+ # save that we've performed a check
+ state.save(pypi_version, current_time)
+
+ remote_version = parse_version(pypi_version)
+
+ local_version_is_older = (
+ pip_version < remote_version
+ and pip_version.base_version != remote_version.base_version
+ and was_installed_by_pip("pip")
+ )
+
+ # Determine if our pypi_version is older
+ if not local_version_is_older:
+ return
+
+ # We cannot tell how the current pip is available in the current
+ # command context, so be pragmatic here and suggest the command
+ # that's always available. This does not accommodate spaces in
+ # `sys.executable` on purpose as it is not possible to do it
+ # correctly without knowing the user's shell. Thus,
+ # it won't be done until possible through the standard library.
+ # Do not be tempted to use the undocumented subprocess.list2cmdline.
+ # It is considered an internal implementation detail for a reason.
+ pip_cmd = f"{sys.executable} -m pip"
+ logger.warning(
+ "You are using pip version %s; however, version %s is "
+ "available.\nYou should consider upgrading via the "
+ "'%s install --upgrade pip' command.",
+ pip_version,
+ pypi_version,
+ pip_cmd,
+ )
+ except Exception:
+ logger.debug(
+ "There was an error checking the latest version of pip",
+ exc_info=True,
+ )
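
The state file used by the self-check is keyed on the interpreter prefix, so each virtualenv keeps its own "last checked" record. The sketch below simply reproduces that naming scheme for illustration.

```python
# Sketch: how SelfCheckState derives its per-environment state file name.
import hashlib
import sys

key = sys.prefix  # SelfCheckState.key
statefile_name = hashlib.sha224(key.encode()).hexdigest()
print(f"selfcheck state for {key!r} -> selfcheck/{statefile_name}")
```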
diff --git a/venv/lib/python3.10/site-packages/pip/_internal/wheel_builder.py b/venv/lib/python3.10/site-packages/pip/_internal/wheel_builder.py
new file mode 100644
index 0000000000000000000000000000000000000000..d0663443b2207ad8efc0fd56a27d085e821b2eb7
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/_internal/wheel_builder.py
@@ -0,0 +1,377 @@
+"""Orchestrator for building wheels from InstallRequirements.
+"""
+
+import logging
+import os.path
+import re
+import shutil
+from typing import Any, Callable, Iterable, List, Optional, Tuple
+
+from pip._vendor.packaging.utils import canonicalize_name, canonicalize_version
+from pip._vendor.packaging.version import InvalidVersion, Version
+
+from pip._internal.cache import WheelCache
+from pip._internal.exceptions import InvalidWheelFilename, UnsupportedWheel
+from pip._internal.metadata import FilesystemWheel, get_wheel_distribution
+from pip._internal.models.link import Link
+from pip._internal.models.wheel import Wheel
+from pip._internal.operations.build.wheel import build_wheel_pep517
+from pip._internal.operations.build.wheel_editable import build_wheel_editable
+from pip._internal.operations.build.wheel_legacy import build_wheel_legacy
+from pip._internal.req.req_install import InstallRequirement
+from pip._internal.utils.logging import indent_log
+from pip._internal.utils.misc import ensure_dir, hash_file, is_wheel_installed
+from pip._internal.utils.setuptools_build import make_setuptools_clean_args
+from pip._internal.utils.subprocess import call_subprocess
+from pip._internal.utils.temp_dir import TempDirectory
+from pip._internal.utils.urls import path_to_url
+from pip._internal.vcs import vcs
+
+logger = logging.getLogger(__name__)
+
+_egg_info_re = re.compile(r"([a-z0-9_.]+)-([a-z0-9_.!+-]+)", re.IGNORECASE)
+
+BinaryAllowedPredicate = Callable[[InstallRequirement], bool]
+BuildResult = Tuple[List[InstallRequirement], List[InstallRequirement]]
+
+
+def _contains_egg_info(s: str) -> bool:
+ """Determine whether the string looks like an egg_info.
+
+ :param s: The string to parse. E.g. foo-2.1
+ """
+ return bool(_egg_info_re.search(s))
+
+
+def _should_build(
+ req: InstallRequirement,
+ need_wheel: bool,
+ check_binary_allowed: BinaryAllowedPredicate,
+) -> bool:
+ """Return whether an InstallRequirement should be built into a wheel."""
+ if req.constraint:
+ # never build requirements that are merely constraints
+ return False
+ if req.is_wheel:
+ if need_wheel:
+ logger.info(
+ "Skipping %s, due to already being wheel.",
+ req.name,
+ )
+ return False
+
+ if need_wheel:
+ # i.e. pip wheel, not pip install
+ return True
+
+ # From this point, this concerns the pip install command only
+ # (need_wheel=False).
+
+ if not req.source_dir:
+ return False
+
+ if req.editable:
+ # we only build PEP 660 editable requirements
+ return req.supports_pyproject_editable()
+
+ if req.use_pep517:
+ return True
+
+ if not check_binary_allowed(req):
+ logger.info(
+ "Skipping wheel build for %s, due to binaries being disabled for it.",
+ req.name,
+ )
+ return False
+
+ if not is_wheel_installed():
+ # we don't build legacy requirements if wheel is not installed
+ logger.info(
+ "Using legacy 'setup.py install' for %s, "
+ "since package 'wheel' is not installed.",
+ req.name,
+ )
+ return False
+
+ return True
+
+
+def should_build_for_wheel_command(
+ req: InstallRequirement,
+) -> bool:
+ return _should_build(req, need_wheel=True, check_binary_allowed=_always_true)
+
+
+def should_build_for_install_command(
+ req: InstallRequirement,
+ check_binary_allowed: BinaryAllowedPredicate,
+) -> bool:
+ return _should_build(
+ req, need_wheel=False, check_binary_allowed=check_binary_allowed
+ )
+
+
+def _should_cache(
+ req: InstallRequirement,
+) -> Optional[bool]:
+ """
+ Return whether a built InstallRequirement can be stored in the persistent
+ wheel cache, assuming the wheel cache is available, and _should_build()
+ has determined a wheel needs to be built.
+ """
+ if req.editable or not req.source_dir:
+ # never cache editable requirements
+ return False
+
+ if req.link and req.link.is_vcs:
+ # VCS checkout. Do not cache
+ # unless it points to an immutable commit hash.
+ assert not req.editable
+ assert req.source_dir
+ vcs_backend = vcs.get_backend_for_scheme(req.link.scheme)
+ assert vcs_backend
+ if vcs_backend.is_immutable_rev_checkout(req.link.url, req.source_dir):
+ return True
+ return False
+
+ assert req.link
+ base, ext = req.link.splitext()
+ if _contains_egg_info(base):
+ return True
+
+ # Otherwise, do not cache.
+ return False
+
+
+def _get_cache_dir(
+ req: InstallRequirement,
+ wheel_cache: WheelCache,
+) -> str:
+ """Return the persistent or temporary cache directory where the built
+    wheel needs to be stored.
+ """
+ cache_available = bool(wheel_cache.cache_dir)
+ assert req.link
+ if cache_available and _should_cache(req):
+ cache_dir = wheel_cache.get_path_for_link(req.link)
+ else:
+ cache_dir = wheel_cache.get_ephem_path_for_link(req.link)
+ return cache_dir
+
+
+def _always_true(_: Any) -> bool:
+ return True
+
+
+def _verify_one(req: InstallRequirement, wheel_path: str) -> None:
+ canonical_name = canonicalize_name(req.name or "")
+ w = Wheel(os.path.basename(wheel_path))
+ if canonicalize_name(w.name) != canonical_name:
+ raise InvalidWheelFilename(
+ "Wheel has unexpected file name: expected {!r}, "
+ "got {!r}".format(canonical_name, w.name),
+ )
+ dist = get_wheel_distribution(FilesystemWheel(wheel_path), canonical_name)
+ dist_verstr = str(dist.version)
+ if canonicalize_version(dist_verstr) != canonicalize_version(w.version):
+ raise InvalidWheelFilename(
+ "Wheel has unexpected file name: expected {!r}, "
+ "got {!r}".format(dist_verstr, w.version),
+ )
+ metadata_version_value = dist.metadata_version
+ if metadata_version_value is None:
+ raise UnsupportedWheel("Missing Metadata-Version")
+ try:
+ metadata_version = Version(metadata_version_value)
+ except InvalidVersion:
+ msg = f"Invalid Metadata-Version: {metadata_version_value}"
+ raise UnsupportedWheel(msg)
+ if metadata_version >= Version("1.2") and not isinstance(dist.version, Version):
+ raise UnsupportedWheel(
+ "Metadata 1.2 mandates PEP 440 version, "
+ "but {!r} is not".format(dist_verstr)
+ )
+
+
+def _build_one(
+ req: InstallRequirement,
+ output_dir: str,
+ verify: bool,
+ build_options: List[str],
+ global_options: List[str],
+ editable: bool,
+) -> Optional[str]:
+ """Build one wheel.
+
+ :return: The filename of the built wheel, or None if the build failed.
+ """
+ artifact = "editable" if editable else "wheel"
+ try:
+ ensure_dir(output_dir)
+ except OSError as e:
+ logger.warning(
+ "Building %s for %s failed: %s",
+ artifact,
+ req.name,
+ e,
+ )
+ return None
+
+ # Install build deps into temporary directory (PEP 518)
+ with req.build_env:
+ wheel_path = _build_one_inside_env(
+ req, output_dir, build_options, global_options, editable
+ )
+ if wheel_path and verify:
+ try:
+ _verify_one(req, wheel_path)
+ except (InvalidWheelFilename, UnsupportedWheel) as e:
+ logger.warning("Built %s for %s is invalid: %s", artifact, req.name, e)
+ return None
+ return wheel_path
+
+
+def _build_one_inside_env(
+ req: InstallRequirement,
+ output_dir: str,
+ build_options: List[str],
+ global_options: List[str],
+ editable: bool,
+) -> Optional[str]:
+ with TempDirectory(kind="wheel") as temp_dir:
+ assert req.name
+ if req.use_pep517:
+ assert req.metadata_directory
+ assert req.pep517_backend
+ if global_options:
+ logger.warning(
+ "Ignoring --global-option when building %s using PEP 517", req.name
+ )
+ if build_options:
+ logger.warning(
+ "Ignoring --build-option when building %s using PEP 517", req.name
+ )
+ if editable:
+ wheel_path = build_wheel_editable(
+ name=req.name,
+ backend=req.pep517_backend,
+ metadata_directory=req.metadata_directory,
+ tempd=temp_dir.path,
+ )
+ else:
+ wheel_path = build_wheel_pep517(
+ name=req.name,
+ backend=req.pep517_backend,
+ metadata_directory=req.metadata_directory,
+ tempd=temp_dir.path,
+ )
+ else:
+ wheel_path = build_wheel_legacy(
+ name=req.name,
+ setup_py_path=req.setup_py_path,
+ source_dir=req.unpacked_source_directory,
+ global_options=global_options,
+ build_options=build_options,
+ tempd=temp_dir.path,
+ )
+
+ if wheel_path is not None:
+ wheel_name = os.path.basename(wheel_path)
+ dest_path = os.path.join(output_dir, wheel_name)
+ try:
+ wheel_hash, length = hash_file(wheel_path)
+ shutil.move(wheel_path, dest_path)
+ logger.info(
+ "Created wheel for %s: filename=%s size=%d sha256=%s",
+ req.name,
+ wheel_name,
+ length,
+ wheel_hash.hexdigest(),
+ )
+ logger.info("Stored in directory: %s", output_dir)
+ return dest_path
+ except Exception as e:
+ logger.warning(
+ "Building wheel for %s failed: %s",
+ req.name,
+ e,
+ )
+ # Ignore return, we can't do anything else useful.
+ if not req.use_pep517:
+ _clean_one_legacy(req, global_options)
+ return None
+
+
+def _clean_one_legacy(req: InstallRequirement, global_options: List[str]) -> bool:
+ clean_args = make_setuptools_clean_args(
+ req.setup_py_path,
+ global_options=global_options,
+ )
+
+ logger.info("Running setup.py clean for %s", req.name)
+ try:
+ call_subprocess(
+ clean_args, command_desc="python setup.py clean", cwd=req.source_dir
+ )
+ return True
+ except Exception:
+ logger.error("Failed cleaning build dir for %s", req.name)
+ return False
+
+
+def build(
+ requirements: Iterable[InstallRequirement],
+ wheel_cache: WheelCache,
+ verify: bool,
+ build_options: List[str],
+ global_options: List[str],
+) -> BuildResult:
+ """Build wheels.
+
+ :return: The list of InstallRequirement that succeeded to build and
+ the list of InstallRequirement that failed to build.
+ """
+ if not requirements:
+ return [], []
+
+ # Build the wheels.
+ logger.info(
+ "Building wheels for collected packages: %s",
+ ", ".join(req.name for req in requirements), # type: ignore
+ )
+
+ with indent_log():
+ build_successes, build_failures = [], []
+ for req in requirements:
+ assert req.name
+ cache_dir = _get_cache_dir(req, wheel_cache)
+ wheel_file = _build_one(
+ req,
+ cache_dir,
+ verify,
+ build_options,
+ global_options,
+ req.editable and req.permit_editable_wheels,
+ )
+ if wheel_file:
+ # Update the link for this.
+ req.link = Link(path_to_url(wheel_file))
+ req.local_file_path = req.link.file_path
+ assert req.link.is_wheel
+ build_successes.append(req)
+ else:
+ build_failures.append(req)
+
+ # notify success/failure
+ if build_successes:
+ logger.info(
+ "Successfully built %s",
+ " ".join([req.name for req in build_successes]), # type: ignore
+ )
+ if build_failures:
+ logger.info(
+ "Failed to build %s",
+ " ".join([req.name for req in build_failures]), # type: ignore
+ )
+ # Return a list of requirements that failed to build
+ return build_successes, build_failures
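
The persistent-cache decision in `_should_cache` above hinges on `_contains_egg_info`: a link whose base name already looks like `<name>-<version>` is treated as pinned enough to cache. The snippet below reproduces that heuristic on a few sample names for illustration.

```python
# Sketch of the egg_info filename heuristic used by _should_cache.
import re

_egg_info_re = re.compile(r"([a-z0-9_.]+)-([a-z0-9_.!+-]+)", re.IGNORECASE)

for base in ("requests-2.28.1", "main", "my_pkg-0.1.dev0"):
    print(base, "->", bool(_egg_info_re.search(base)))
# "requests-2.28.1" and "my_pkg-0.1.dev0" match; a bare name like "main"
# does not, so such a link would fall through to the ephemeral cache.
```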
diff --git a/venv/lib/python3.10/site-packages/pip/py.typed b/venv/lib/python3.10/site-packages/pip/py.typed
new file mode 100644
index 0000000000000000000000000000000000000000..493b53e4e7a3984ddd49780313bf3bd9901dc1e0
--- /dev/null
+++ b/venv/lib/python3.10/site-packages/pip/py.typed
@@ -0,0 +1,4 @@
+pip is a command line program. While it is implemented in Python, and so is
+available for import, you must not use pip's internal APIs in this way. Typing
+information is provided as a convenience only and is not a guarantee. Expect
+unannounced changes to the API and types in releases.