diff --git a/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/INSTALLER b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/INSTALLER new file mode 100644 index 0000000000000000000000000000000000000000..a1b589e38a32041e49332e5e81c2d363dc418d68 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/INSTALLER @@ -0,0 +1 @@ +pip diff --git a/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/LICENSE b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..84292c189777bc86e1452e8af3a0b3256153d710 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2023 Tsinghua University, Machine Learning Group (THUML) + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. 
diff --git a/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/METADATA b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/METADATA new file mode 100644 index 0000000000000000000000000000000000000000..adc26ad859351dadf2f38f7728677860417c485d --- /dev/null +++ b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/METADATA @@ -0,0 +1,139 @@ +Metadata-Version: 2.1 +Name: depyf +Version: 0.18.0 +Summary: Decompile python functions, from bytecode to source code! +Home-page: https://github.com/thuml/depyf +Author: Kaichao You +Author-email: youkaichao@gmail.com +License: MIT +Requires-Python: >=3.7 +Description-Content-Type: text/markdown +License-File: LICENSE +Requires-Dist: astor +Requires-Dist: dill +Provides-Extra: dev +Requires-Dist: pytest ; extra == 'dev' +Requires-Dist: flake8 ; extra == 'dev' +Requires-Dist: autopep8 ; extra == 'dev' + +![Logo](imgs/logo-and-text.svg) + +[![Documentation Status](https://readthedocs.org/projects/depyf/badge/?version=latest)](https://depyf.readthedocs.io/en/latest/) ![Supported Python Versions](https://img.shields.io/badge/python-%203.7%20%7C%203.8%20%7C%203.9%20%7C%203.10%20%7C%203.11-blue) ![Python Decompilation Tests](https://github.com/thuml/depyf/actions/workflows/test_decompile.yml/badge.svg) ![PyTorch Integration Tests](https://github.com/thuml/depyf/actions/workflows/test_pytorch.yml/badge.svg) [![Test Coverage](https://codecov.io/github/thuml/depyf/graph/badge.svg?token=DUQ1CQ0I5U)](https://codecov.io/github/thuml/depyf) ![MIT License](https://img.shields.io/github/license/thuml/depyf) + +> `depyf` is proud to be a [PyTorch ecosystem project](https://pytorch.org/ecosystem/). Check out the announcement blog [https://pytorch.org/blog/introducing-depyf/](https://pytorch.org/blog/introducing-depyf/) for more details. + +Have you ever felt overwhelmed by the complexities of `torch.compile`? 
Diving into its workings can feel like black magic, full of bytecode and Python internals that few users understand, hindering them from understanding and adapting to `torch.compile`. + +If you face this problem too, you might be interested in `depyf`. As the logo suggests, `depyf` is a software tool that leverages advanced Python features (the Python snake symbol) to open up the internal details (the internal gears symbol) of PyTorch's compiler `torch.compile` (the PyTorch logo), so that users can understand it, adapt to it, and tune their code (the debugger symbol) to get the maximum performance benefit out of it. + +:warning: This project is developed in close collaboration with the PyTorch team. It therefore requires very recent PyTorch features to support a better understanding of `torch.compile`. **Please use this project along with PyTorch>=2.2.0 (PyTorch nightly is recommended)**. Visit the [PyTorch website](https://pytorch.org/) for how to install different versions of PyTorch. + +:warning: During development, we frequently sought suggestions from the community. You may find early usage examples on discussion forums or social media platforms. **Please follow the latest documentation for how to use this tool.** + +# Why `depyf`? + +If you want to understand the bytecode generated by `torch.compile`, then `depyf` might be the only choice for you. As the table below shows, the existing decompilers we tested struggle to decompile even simple Python bytecode across versions, and have poor support for PyTorch.
+ +| Decompiler | Python 3.8 | Python 3.9 | Python 3.10 | Python 3.11 | PyTorch | +|-------------|--------------|------------|-------------|-------------|---------| +| [decompyle3](https://github.com/rocky/python-decompile3) | 90.6% (77/85) | × | × | × | × | +| [uncompyle6](https://github.com/rocky/python-uncompyle6) | 91.8% (78/85)| × | × | × | × | +| [pycdc](https://github.com/zrax/pycdc) | 74.1% (63/85) | 74.1% (63/85)| 74.1% (63/85) | 67.1% (57/85) | 19.3% (27/140)| +| [depyf](https://github.com/thuml/depyf) | 100% (85/85) | 100% (85/85)| 100% (85/85)| 100% (85/85)| 100% (140/140)| + +# Installation + +Stable release: `pip install depyf` + +Nightly version (recommended): `pip install git+https://github.com/thuml/depyf.git` + +# Usage + +The main usage is quite simple: just wrap your code within a context manager: + +```python +import torch +from torch import _dynamo as torchdynamo +from typing import List + +@torch.compile +def toy_example(a, b): + x = a / (torch.abs(a) + 1) + if b.sum() < 0: + b = b * -1 + return x * b + +def main(): + for _ in range(100): + toy_example(torch.randn(10), torch.randn(10)) + +if __name__ == "__main__": + # main() + # surround the code you want to run inside `with depyf.prepare_debug` + import depyf + with depyf.prepare_debug("./dump_src_dir"): + main() +``` + +Then you can see all the details of `torch.compile` inside the directory `./dump_src_dir`. The details are organized into the following: + +- `full_code_for_xxx.py` for each function using `torch.compile`. +- `__transformed_code_for_xxx.py` for Python code associated with each graph. +- `__transformed_code_for_xxx.py.xxx_bytecode` for Python bytecode, a dumped code object that can be loaded via `dill.load(open("/path/to/file", "rb"))` (the file must be opened in read-binary mode, `"rb"`). Note that the `load` function might import some modules like `transformers`. Make sure you have these modules installed.
+- `__compiled_fn_xxx.py` for each computation graph and its optimization: + - `Captured Graph`: a plain forward computation graph + - `Joint Graph`: joint forward-backward graph from `AOTAutograd` + - `Forward Graph`: forward graph from `AOTAutograd` + - `Backward Graph`: backward graph from `AOTAutograd` + - `kernel xxx`: compiled CPU/GPU kernel wrapper from Inductor. + +We collected [all the compilation artifacts](https://github.com/thuml/learn_torch.compile) while testing over 100 deep learning models. You can take a look to learn how the PyTorch compiler works. + +If you want to use a debugger to step through the above code, just add another context manager (and launch the script through a debugger): + +```python +import torch +from torch import _dynamo as torchdynamo +from typing import List + +@torch.compile +def toy_example(a, b): + x = a / (torch.abs(a) + 1) + if b.sum() < 0: + b = b * -1 + return x * b + +def main(): + for _ in range(100): + toy_example(torch.randn(10), torch.randn(10)) + +if __name__ == "__main__": + import depyf + with depyf.prepare_debug("./dump_src_dir"): + main() + # surround the code you want to debug inside `with depyf.debug()` + with depyf.debug(): + main() +``` + +Calling `depyf.debug()` will pause the program so that you can set breakpoints, after which you can use a debugger to hit breakpoints in the files under the `./dump_src_dir` directory you specified above. + +# Contact + +If you have any questions about `depyf`, feel free to open an issue to reach out! Any discussion, issue report, or PR is welcome. Or contact youkaichao@gmail.com if you have any other questions. + +# Citing `depyf` + +If you find `depyf` useful, please cite it in your publications.
+ +```latex +@article{you2024depyf, + title={depyf: Open the Opaque Box of PyTorch Compiler for Machine Learning Researchers}, + author={Kaichao You and Runsheng Bai and Meng Cao and Jianmin Wang and Ion Stoica and Mingsheng Long}, + year={2024}, + eprint={2403.13839}, + journal={arXiv}, + primaryClass={cs.LG}, + url={https://github.com/thuml/depyf} +} +``` diff --git a/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/RECORD b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/RECORD new file mode 100644 index 0000000000000000000000000000000000000000..11207bf37c76588bfac38e68ba970b981c0957e7 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/RECORD @@ -0,0 +1,37 @@ +depyf-0.18.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 +depyf-0.18.0.dist-info/LICENSE,sha256=lzua0q7KCQHlSWzC19FuMW0BAF6WZZ3CV7R9A_3scmY,1108 +depyf-0.18.0.dist-info/METADATA,sha256=BeQoG-ccxNdRfp6WoHhRCC9h2C71nsGLMKlPoGM1qw0,7102 +depyf-0.18.0.dist-info/RECORD,, +depyf-0.18.0.dist-info/WHEEL,sha256=yQN5g4mg4AybRjkgi-9yy4iQEFibGQmlz78Pik5Or-A,92 +depyf-0.18.0.dist-info/top_level.txt,sha256=Q8OIJwiCISz0EGVEbMqj3ybR3yHWYbLNxR0ocsccRoo,6 +depyf/VERSION.txt,sha256=QsG7ukni09uOW2RMD0kLOlCvEaKkRnPYnJ03IRLMrX8,6 +depyf/__init__.py,sha256=LbOU08fulsuNrWJrbh-RP3aotFRDBOhPbPRJcriNiNk,841 +depyf/__pycache__/__init__.cpython-311.pyc,, +depyf/__pycache__/code_transform.cpython-311.pyc,, +depyf/__pycache__/decompiler.cpython-311.pyc,, +depyf/__pycache__/optimization.cpython-311.pyc,, +depyf/__pycache__/utils.cpython-311.pyc,, +depyf/code_transform.py,sha256=yLFkRxyXCfE_B4FOik4a89IAJJg5cFrl9TB7m15hDXs,17855 +depyf/decompiler.py,sha256=SP9nbIhpMqygnHTCYnoe538ZPSygP9dFV3jtl7fYPUw,53998 +depyf/explain/__init__.py,sha256=pXasvun6VLMr5YqXkaEenMZqrLLtL5jkEi-H86LI15k,620 +depyf/explain/__pycache__/__init__.cpython-311.pyc,, +depyf/explain/__pycache__/enable_debugging.cpython-311.pyc,, +depyf/explain/__pycache__/enhance_logging.cpython-311.pyc,, 
+depyf/explain/__pycache__/global_variables.cpython-311.pyc,, +depyf/explain/__pycache__/patched___call__.cpython-311.pyc,, +depyf/explain/__pycache__/patched__exec_with_source.cpython-311.pyc,, +depyf/explain/__pycache__/patched_boxed_run.cpython-311.pyc,, +depyf/explain/__pycache__/patched_lazy_format_graph_code.cpython-311.pyc,, +depyf/explain/__pycache__/patched_load_by_key_path.cpython-311.pyc,, +depyf/explain/__pycache__/utils.cpython-311.pyc,, +depyf/explain/enable_debugging.py,sha256=cOTGwTJlYFAwVW_kgFvBhFITCO7lj9eOhh5zXnvF-0M,12602 +depyf/explain/enhance_logging.py,sha256=x-9Y3rxlIW30-cYIUfqv_Ha3h_0AwFnZ7_7tisDso2k,2745 +depyf/explain/global_variables.py,sha256=cdE2qMzfK4LvP-VJXP36ZQMG1m896mN7hJsQmbPWRkQ,451 +depyf/explain/patched___call__.py,sha256=kEg09Pt08gDeICr5D_SryS2TJ2k5lDqbv_m5VViD43Q,352 +depyf/explain/patched__exec_with_source.py,sha256=yEAB3k9KOrQ75275fY0ghdjs3qMXESYKV0VH6C947wk,750 +depyf/explain/patched_boxed_run.py,sha256=Uq70zcEcY-F3cupORR10_OdRU9TOzfOVqRW4Dpw8w30,83 +depyf/explain/patched_lazy_format_graph_code.py,sha256=3YOke3DCeISdS_-X-ucalkDeXAsWRsu4PwE5oJ4at9g,3200 +depyf/explain/patched_load_by_key_path.py,sha256=1NmSDdET0qqm0NsHg1a4q-H5T0JljEXPqVSrYInmIuY,774 +depyf/explain/utils.py,sha256=i2MarlLUnqQAOCc0SXHUrHI5eW0LliF1vD8Wk0LfLlY,13682 +depyf/optimization.py,sha256=0sPpOC76tzmUlHrNpI2gUZ4_qEcsUVWdkNxEu3DeeCg,2969 +depyf/utils.py,sha256=-0AjE-vrFV-Gkgg62S83L0TO45nlbNlyuaxrLDksfrQ,3239 diff --git a/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/WHEEL b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/WHEEL new file mode 100644 index 0000000000000000000000000000000000000000..7e688737d490be3643d705bc16b5a77f7bd567b7 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/WHEEL @@ -0,0 +1,5 @@ +Wheel-Version: 1.0 +Generator: bdist_wheel (0.41.2) +Root-Is-Purelib: true +Tag: py3-none-any + diff --git a/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/top_level.txt 
b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/top_level.txt new file mode 100644 index 0000000000000000000000000000000000000000..87bc266773c7c7990c464ff41c4c41eb25d9be6f --- /dev/null +++ b/.venv/lib/python3.11/site-packages/depyf-0.18.0.dist-info/top_level.txt @@ -0,0 +1 @@ +depyf diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/__init__.py b/.venv/lib/python3.11/site-packages/opencensus/common/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..fa4f9cf4befe34d396ca566b4e4c4947b6f29b64 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/__init__.py @@ -0,0 +1 @@ +__path__ = __import__('pkgutil').extend_path(__path__, __name__) diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/http_handler/__init__.py b/.venv/lib/python3.11/site-packages/opencensus/common/http_handler/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..c2fdce64ee1397a7a7bd97032c1a0e9483bcc0b1 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/http_handler/__init__.py @@ -0,0 +1,43 @@ +# Copyright 2018, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +try: + # For Python 3.0 and later + from urllib.request import urlopen, Request + from urllib.error import HTTPError, URLError +except ImportError: + # Fall back to Python 2's urllib2 + from urllib2 import urlopen, Request + from urllib2 import HTTPError, URLError + + +import socket + +_REQUEST_TIMEOUT = 2 # in secs + + +def get_request(request_url, request_headers=dict()): + """Execute http get request on given request_url with optional headers + """ + request = Request(request_url) + for key, val in request_headers.items(): + request.add_header(key, val) + + try: + response = urlopen(request, timeout=_REQUEST_TIMEOUT) + response_content = response.read() + except (HTTPError, URLError, socket.timeout): + response_content = None + + return response_content diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/http_handler/__pycache__/__init__.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/http_handler/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..1056864872e6ee2c85649dc65897a708bdcd5186 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/http_handler/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/__init__.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..44233d62d9674b5a838ade69111acb926836654f Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/aws_identity_doc_utils.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/aws_identity_doc_utils.cpython-311.pyc new file 
mode 100644 index 0000000000000000000000000000000000000000..f2fbf085e2011071c5a084f376f0eac8465659af Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/aws_identity_doc_utils.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/gcp_metadata_config.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/gcp_metadata_config.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..06b1cb52936e508b7cc23d5b5dd18eef75f2a371 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/gcp_metadata_config.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/k8s_utils.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/k8s_utils.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..11c9798997b681576681a381b8442f300f64ea9f Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/k8s_utils.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/monitored_resource.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/monitored_resource.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..7805396ee599d77fe6d6778907e1a3f07c36a013 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/__pycache__/monitored_resource.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/aws_identity_doc_utils.py 
b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/aws_identity_doc_utils.py new file mode 100644 index 0000000000000000000000000000000000000000..6531e52bba34f92606ba6698f54592e7bd49f9be --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/aws_identity_doc_utils.py @@ -0,0 +1,95 @@ +# Copyright 2018, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import json + +from opencensus.common.http_handler import get_request + +REGION_KEY = 'region' +ACCOUNT_ID_KEY = 'aws_account' +INSTANCE_ID_KEY = 'instance_id' + +# AWS provides Instance Metadata via below url +_AWS_INSTANCE_IDENTITY_DOCUMENT_URI = \ + "http://169.254.169.254/latest/dynamic/instance-identity/document" + +_AWS_ATTRIBUTES = { + # Region is the AWS region for the VM. The format of this field is + # "aws:{region}", where supported values for {region} are listed at + # http://docs.aws.amazon.com/general/latest/gr/rande.html. + 'region': REGION_KEY, + + # accountId is the AWS account number for the VM. + 'accountId': ACCOUNT_ID_KEY, + + # instanceId is the instance id of the instance. + 'instanceId': INSTANCE_ID_KEY +} + +# inited is used to make sure AWS initialize executes only once. +inited = False + +# Detects if the application is running on EC2 by making a connection to AWS +# instance identity document URI.If connection is successful, application +# should be on an EC2 instance. 
+is_running_on_aws = False + +aws_metadata_map = {} + + +class AwsIdentityDocumentUtils(object): + """Util methods for getting and parsing AWS instance identity document.""" + + inited = False + is_running = False + + @classmethod + def _initialize_aws_identity_document(cls): + """This method, tries to establish an HTTP connection to AWS instance + identity document url. If the application is running on an EC2 + instance, we should be able to get back a valid JSON document. Make a + http get request call and store data in local map. + This method should only be called once. + """ + + if cls.inited: + return + + content = get_request(_AWS_INSTANCE_IDENTITY_DOCUMENT_URI) + if content is not None: + content = json.loads(content) + for env_var, attribute_key in _AWS_ATTRIBUTES.items(): + attribute_value = content.get(env_var) + if attribute_value is not None: + aws_metadata_map[attribute_key] = attribute_value + + cls.is_running = True + + cls.inited = True + + @classmethod + def is_running_on_aws(cls): + cls._initialize_aws_identity_document() + return cls.is_running + + def get_aws_metadata(self): + """AWS Instance Identity Document is a JSON file. + See docs.aws.amazon.com/AWSEC2/latest/UserGuide/ + instance-identity-documents.html. + :return: + """ + if self.is_running_on_aws(): + return aws_metadata_map + + return dict() diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/monitored_resource.py b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/monitored_resource.py new file mode 100644 index 0000000000000000000000000000000000000000..9bb097cdcd396ad8c6f56dfbf3e688891dbb88c4 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/monitored_resource/monitored_resource.py @@ -0,0 +1,68 @@ +# Copyright 2018, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from opencensus.common import resource +from opencensus.common.monitored_resource import ( + aws_identity_doc_utils, + gcp_metadata_config, + k8s_utils, +) + +# Supported environments (resource types) +_GCE_INSTANCE = "gce_instance" +_K8S_CONTAINER = "k8s_container" +_AWS_EC2_INSTANCE = "aws_ec2_instance" + + +def is_gce_environment(): + """Whether the environment is a virtual machine on GCE.""" + return gcp_metadata_config.GcpMetadataConfig.is_running_on_gcp() + + +def is_aws_environment(): + """Whether the environment is a virtual machine instance on EC2.""" + return aws_identity_doc_utils.AwsIdentityDocumentUtils.is_running_on_aws() + + +def get_instance(): + """Get a resource based on the application environment. + + Returns a `Resource` configured for the current environment, or None if the + environment is unknown or unsupported. + + :rtype: :class:`opencensus.common.resource.Resource` or None + :return: A `Resource` configured for the current environment. 
+ """ + resources = [] + env_resource = resource.get_from_env() + if env_resource is not None: + resources.append(env_resource) + + if k8s_utils.is_k8s_environment(): + resources.append(resource.Resource( + _K8S_CONTAINER, k8s_utils.get_k8s_metadata())) + + if is_gce_environment(): + resources.append(resource.Resource( + _GCE_INSTANCE, + gcp_metadata_config.GcpMetadataConfig().get_gce_metadata())) + elif is_aws_environment(): + resources.append(resource.Resource( + _AWS_EC2_INSTANCE, + (aws_identity_doc_utils.AwsIdentityDocumentUtils() + .get_aws_metadata()))) + + if not resources: + return None + return resource.merge_resources(resources) diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/resource/__init__.py b/.venv/lib/python3.11/site-packages/opencensus/common/resource/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..112c95e2168d90d94ed3bdfa6c826008695681da --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/resource/__init__.py @@ -0,0 +1,218 @@ +# Copyright 2019, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ + +import logging +import os +import re +from copy import copy + +logger = logging.getLogger(__name__) + + +OC_RESOURCE_TYPE = 'OC_RESOURCE_TYPE' +OC_RESOURCE_LABELS = 'OC_RESOURCE_LABELS' + +# Matches anything outside ASCII 32-126 inclusive +_NON_PRINTABLE_ASCII = re.compile( + r'[^ !"#$%&\'()*+,\-./:;<=>?@\[\\\]^_`{|}~0-9a-zA-Z]') + +# Label key/value tokens, may be quoted +_WORD_RES = r'(\'[^\']*\'|"[^"]*"|[^\s,=]+)' + +_KV_RE = re.compile(r""" + \s* # ignore leading spaces + (?P{word_re}) # capture the key word + \s*=\s* + (?P{word_re}) # capture the value word + \s* # ignore trailing spaces + """.format(word_re=_WORD_RES), re.VERBOSE) + +_LABELS_RE = re.compile(r""" + ^\s*{word_re}\s*=\s*{word_re}\s* # _KV_RE without the named groups + (,\s*{word_re}\s*=\s*{word_re}\s*)* # more KV pairs, comma delimited + $ + """.format(word_re=_WORD_RES), re.VERBOSE) + +_UNQUOTE_RE = re.compile(r'^([\'"]?)([^\1]*)(\1)$') + + +def merge_resources(resource_list): + """Merge multiple resources to get a new resource. + + Resources earlier in the list take precedence: if multiple resources share + a label key, use the value from the first resource in the list with that + key. The combined resource's type will be the first non-null type in the + list. + + :type resource_list: list(:class:`Resource`) + :param resource_list: The list of resources to combine. + + :rtype: :class:`Resource` + :return: The new combined resource. + """ + if not resource_list: + raise ValueError + rtype = None + for rr in resource_list: + if rr.type: + rtype = rr.type + break + labels = {} + for rr in reversed(resource_list): + labels.update(rr.labels) + return Resource(rtype, labels) + + +def check_ascii_256(string): + """Check that `string` is printable ASCII and at most 256 chars. + + Raise a `ValueError` if this check fails. Note that `string` itself doesn't + have to be ASCII-encoded. + + :type string: str + :param string: The string to check. 
+ """ + if string is None: + return + if len(string) > 256: + raise ValueError("Value is longer than 256 characters") + bad_char = _NON_PRINTABLE_ASCII.search(string) + if bad_char: + raise ValueError(u'Character "{}" at position {} is not printable ' + 'ASCII' + .format( + string[bad_char.start():bad_char.end()], + bad_char.start())) + + +class Resource(object): + """A description of the entity for which signals are reported. + + `type_` and `labels`' keys and values should contain only printable ASCII + and should be at most 256 characters. + + See: + https://github.com/census-instrumentation/opencensus-specs/blob/master/resource/Resource.md + + :type type_: str + :param type_: The resource type identifier. + + :type labels: dict + :param labels: Key-value pairs that describe the entity. + """ # noqa + + def __init__(self, type_=None, labels=None): + if type_ is not None and not type_: + raise ValueError("Resource type must not be empty") + check_ascii_256(type_) + if labels is None: + labels = {} + for key, value in labels.items(): + if not key: + raise ValueError("Resource key must not be null or empty") + if value is None: + raise ValueError("Resource value must not be null") + check_ascii_256(key) + check_ascii_256(value) + + self.type = type_ + self.labels = copy(labels) + + def get_type(self): + """Get this resource's type. + + :rtype: str + :return: The resource's type. + """ + return self.type + + def get_labels(self): + """Get this resource's labels. + + :rtype: dict + :return: The resource's label dict. + """ + return copy(self.labels) + + def merge(self, other): + """Get a copy of this resource combined with another resource. + + The combined resource will have the union of both resources' labels, + keeping this resource's label values if they conflict. + + :type other: :class:`Resource` + :param other: The other resource to merge. + + :rtype: :class:`Resource` + :return: The new combined resource. 
+ """ + return merge_resources([self, other]) + + +def unquote(string): + """Strip quotes surrounding `string` if they exist. + + >>> unquote('abc') + 'abc' + >>> unquote('"abc"') + 'abc' + >>> unquote("'abc'") + 'abc' + >>> unquote('"a\\'b\\'c"') + "a'b'c" + """ + return _UNQUOTE_RE.sub(r'\2', string) + + +def parse_labels(labels_str): + """Parse label keys and values following the Resource spec. + + >>> parse_labels("k=v") + {'k': 'v'} + >>> parse_labels("k1=v1, k2=v2") + {'k1': 'v1', 'k2': 'v2'} + >>> parse_labels("k1='v1,=z1'") + {'k1': 'v1,=z1'} + """ + if not _LABELS_RE.match(labels_str): + return None + labels = {} + for kv in _KV_RE.finditer(labels_str): + gd = kv.groupdict() + key = unquote(gd['key']) + if key in labels: + logger.warning('Duplicate label key "%s"', key) + labels[key] = unquote(gd['val']) + return labels + + +def get_from_env(): + """Get a Resource from environment variables. + + :rtype: :class:`Resource` + :return: A resource with type and labels from the environment. 
+ """ + type_env = os.getenv(OC_RESOURCE_TYPE) + if type_env is None: + return None + type_env = type_env.strip() + + labels_env = os.getenv(OC_RESOURCE_LABELS) + if labels_env is None: + return Resource(type_env) + + labels = parse_labels(labels_env) + + return Resource(type_env, labels) diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/resource/__pycache__/__init__.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/resource/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..267804a3abd8bdc330b4e40112384b5297e415a0 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/resource/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/__init__.py b/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..890f6c4837e29837cb61de09e4b519be5d2df133 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/__init__.py @@ -0,0 +1,177 @@ +# Copyright 2019, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+
+
+try:
+    import contextvars
+except ImportError:
+    contextvars = None
+
+import threading
+
+__all__ = ['RuntimeContext']
+
+
+class _RuntimeContext(object):
+    @classmethod
+    def clear(cls):
+        """Clear all slots to their default value."""
+
+        raise NotImplementedError  # pragma: NO COVER
+
+    @classmethod
+    def register_slot(cls, name, default=None):
+        """Register a context slot with an optional default value.
+
+        :type name: str
+        :param name: The name of the context slot.
+
+        :type default: object
+        :param default: The default value of the slot; can be a value or a
+            lambda.
+
+        :returns: The registered slot.
+        """
+
+        raise NotImplementedError  # pragma: NO COVER
+
+    def apply(self, snapshot):
+        """Set the current context from a given snapshot dictionary."""
+
+        for name in snapshot:
+            setattr(self, name, snapshot[name])
+
+    def snapshot(self):
+        """Return a dictionary of current slots by reference."""
+
+        return dict((n, self._slots[n].get()) for n in self._slots.keys())
+
+    def __repr__(self):
+        return ('{}({})'.format(type(self).__name__, self.snapshot()))
+
+    def __getattr__(self, name):
+        if name not in self._slots:
+            raise AttributeError('{} is not a registered context slot'
+                                 .format(name))
+        slot = self._slots[name]
+        return slot.get()
+
+    def __setattr__(self, name, value):
+        if name not in self._slots:
+            raise AttributeError('{} is not a registered context slot'
+                                 .format(name))
+        slot = self._slots[name]
+        slot.set(value)
+
+    def with_current_context(self, func):
+        """Capture the current context and apply it to the provided func."""
+
+        caller_context = self.snapshot()
+
+        def call_with_current_context(*args, **kwargs):
+            try:
+                backup_context = self.snapshot()
+                self.apply(caller_context)
+                return func(*args, **kwargs)
+            finally:
+                self.apply(backup_context)
+
+        return call_with_current_context
+
+
+class _ThreadLocalRuntimeContext(_RuntimeContext):
+    _lock = threading.Lock()
+    _slots = {}
+
+    class Slot(object):
+        _thread_local = threading.local()
+
+        def
__init__(self, name, default): + self.name = name + self.default = default if callable(default) else (lambda: default) + + def clear(self): + setattr(self._thread_local, self.name, self.default()) + + def get(self): + try: + return getattr(self._thread_local, self.name) + except AttributeError: + value = self.default() + self.set(value) + return value + + def set(self, value): + setattr(self._thread_local, self.name, value) + + @classmethod + def clear(cls): + with cls._lock: + for name in cls._slots: + slot = cls._slots[name] + slot.clear() + + @classmethod + def register_slot(cls, name, default=None): + with cls._lock: + if name in cls._slots: + raise ValueError('slot {} already registered'.format(name)) + slot = cls.Slot(name, default) + cls._slots[name] = slot + return slot + + +class _AsyncRuntimeContext(_RuntimeContext): + _lock = threading.Lock() + _slots = {} + + class Slot(object): + def __init__(self, name, default): + self.name = name + self.contextvar = contextvars.ContextVar(name) + self.default = default if callable(default) else (lambda: default) + + def clear(self): + self.contextvar.set(self.default()) + + def get(self): + try: + return self.contextvar.get() + except LookupError: + value = self.default() + self.set(value) + return value + + def set(self, value): + self.contextvar.set(value) + + @classmethod + def clear(cls): + with cls._lock: + for name in cls._slots: + slot = cls._slots[name] + slot.clear() + + @classmethod + def register_slot(cls, name, default=None): + with cls._lock: + if name in cls._slots: + raise ValueError('slot {} already registered'.format(name)) + slot = cls.Slot(name, default) + cls._slots[name] = slot + return slot + + +RuntimeContext = _ThreadLocalRuntimeContext() +if contextvars: + RuntimeContext = _AsyncRuntimeContext() diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/__pycache__/__init__.cpython-311.pyc 
b/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..9a7eb3be32110e2705891925017f5c3593f8db67 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/__pycache__/version.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/__pycache__/version.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..c18dc20943777b21079b18b4db55f02276120716 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/__pycache__/version.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/version.py b/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/version.py new file mode 100644 index 0000000000000000000000000000000000000000..981079b9cf57258637ce1a66d9b21dc18d53bd2a --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/runtime_context/version.py @@ -0,0 +1,15 @@ +# Copyright 2019, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
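To make the slot mechanism of the runtime_context module above concrete, here is a trimmed-down, hypothetical analogue of `_AsyncRuntimeContext.Slot` built directly on `contextvars`; the thread-local variant in `_ThreadLocalRuntimeContext` behaves the same way, backed by `threading.local()` instead:

```python
import contextvars

class SlotSketch:
    """Hypothetical, simplified version of _AsyncRuntimeContext.Slot."""

    def __init__(self, name, default=None):
        self.contextvar = contextvars.ContextVar(name)
        # As in the module above, the default may be a plain value or a
        # zero-argument callable producing one.
        self.default = default if callable(default) else (lambda: default)

    def get(self):
        try:
            return self.contextvar.get()
        except LookupError:
            # First access in this context: materialize the default.
            value = self.default()
            self.contextvar.set(value)
            return value

    def set(self, value):
        self.contextvar.set(value)

span_slot = SlotSketch('current_span', default=lambda: 'no-span')
print(span_slot.get())   # -> no-span (falls back to the default)
span_slot.set('span-123')
print(span_slot.get())   # -> span-123
```

In the real module, `RuntimeContext.register_slot('current_span')` creates such a slot and exposes it as an attribute on the shared `RuntimeContext` object.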
+ +__version__ = '0.1.3' diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/schedule/__init__.py b/.venv/lib/python3.11/site-packages/opencensus/common/schedule/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..ae6d8a9f1555a0d09cf7b94af6a9ea418baa9dec --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/schedule/__init__.py @@ -0,0 +1,149 @@ +# Copyright 2019, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from six.moves import queue + +import logging +import threading +import time + +logger = logging.getLogger(__name__) + + +class PeriodicTask(threading.Thread): + """Thread that periodically calls a given function. + + :type interval: int or float + :param interval: Seconds between calls to the function. + + :type function: function + :param function: The function to call. + + :type args: list + :param args: The args passed in while calling `function`. + + :type kwargs: dict + :param kwargs: The kwargs passed in while calling `function`. + + :type name: str + :param name: The source of the worker. Used for naming. 
+ """ + + def __init__(self, interval, function, args=None, kwargs=None, name=None): + super(PeriodicTask, self).__init__(name=name) + self.interval = interval + self.function = function + self.args = args or [] + self.kwargs = kwargs or {} + self.finished = threading.Event() + + def run(self): + wait_time = self.interval + while not self.finished.wait(wait_time): + start_time = time.time() + self.function(*self.args, **self.kwargs) + elapsed_time = time.time() - start_time + wait_time = max(self.interval - elapsed_time, 0) + + def cancel(self): + self.finished.set() + + +class QueueEvent(object): + def __init__(self, name): + self.name = name + self.event = threading.Event() + + def __repr__(self): + return ('{}({})'.format(type(self).__name__, self.name)) + + def set(self): + return self.event.set() + + def wait(self, timeout=None): + return self.event.wait(timeout) + + +class QueueExitEvent(QueueEvent): + pass + + +class Queue(object): + def __init__(self, capacity): + self.EXIT_EVENT = QueueExitEvent('EXIT') + self._queue = queue.Queue(maxsize=capacity) + + def _gets(self, count, timeout): + start_time = time.time() + elapsed_time = 0 + cnt = 0 + while cnt < count: + try: + item = self._queue.get(block=False) + yield item + if isinstance(item, QueueEvent): + return + except queue.Empty: + break + cnt += 1 + while cnt < count: + wait_time = max(timeout - elapsed_time, 0) + try: + item = self._queue.get(block=True, timeout=wait_time) + yield item + if isinstance(item, QueueEvent): + return + except queue.Empty: + break + cnt += 1 + elapsed_time = time.time() - start_time + + def gets(self, count, timeout): + return tuple(self._gets(count, timeout)) + + def is_empty(self): + return not self._queue.qsize() + + def flush(self, timeout=None): + if self._queue.qsize() == 0: + return 0 + start_time = time.time() + wait_time = timeout + event = QueueEvent('SYNC(timeout={})'.format(wait_time)) + try: + self._queue.put(event, block=True, timeout=wait_time) + except 
queue.Full: + return + elapsed_time = time.time() - start_time + wait_time = timeout and max(timeout - elapsed_time, 0) + if event.wait(wait_time): + return time.time() - start_time # time taken to flush + + def put(self, item, block=True, timeout=None): + try: + self._queue.put(item, block, timeout) + except queue.Full: + logger.warning('Queue is full. Dropping telemetry.') + + def puts(self, items, block=True, timeout=None): + if block and timeout is not None: + start_time = time.time() + elapsed_time = 0 + for item in items: + wait_time = max(timeout - elapsed_time, 0) + self.put(item, block=True, timeout=wait_time) + elapsed_time = time.time() - start_time + else: + for item in items: + self.put(item, block, timeout) diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/schedule/__pycache__/__init__.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/schedule/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..6036966f38b835b683aa6f89718310da888c548f Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/schedule/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/transports/__init__.py b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..e1eca01eaa16bb6f490081015cb2951c179fba21 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__init__.py @@ -0,0 +1,13 @@ +# Copyright 2018, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/__init__.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..b104c8ebc6b77adb150a0907de6a9d50b6ba64bb Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/async_.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/async_.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..2f853775c848b4e89adf646672b736f0e35372f0 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/async_.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/base.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/base.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..212f07a096c357f205678657bf19483436538600 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/base.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/sync.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/sync.cpython-311.pyc 
new file mode 100644 index 0000000000000000000000000000000000000000..b94e293ab0d10de0f9bed150e95354ecac2ad06b Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/transports/__pycache__/sync.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/transports/async_.py b/.venv/lib/python3.11/site-packages/opencensus/common/transports/async_.py new file mode 100644 index 0000000000000000000000000000000000000000..e360d770b4605f1f2547a6f98eb8576a38fb462a --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/transports/async_.py @@ -0,0 +1,231 @@ +# Copyright 2018, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from six.moves import queue, range + +import atexit +import logging +import threading + +from opencensus.common.transports import base +from opencensus.trace import execution_context + +_DEFAULT_GRACE_PERIOD = 5.0 # Seconds +_DEFAULT_MAX_BATCH_SIZE = 600 +_DEFAULT_WAIT_PERIOD = 60.0 # Seconds +_WORKER_THREAD_NAME = 'opencensus.common.Worker' +_WORKER_TERMINATOR = object() + +logger = logging.getLogger(__name__) + + +class _Worker(object): + """A background thread that exports batches of data. + + :type exporter: :class:`~opencensus.trace.base_exporter.Exporter` or + :class:`~opencensus.stats.base_exporter.StatsExporter` + :param exporter: Instance of Exporter object. 
+
+    :type grace_period: float
+    :param grace_period: The amount of time to wait for pending data to
+        be submitted when the process is shutting down.
+
+    :type max_batch_size: int
+    :param max_batch_size: The maximum number of items to send at a time
+        in the background thread.
+
+    :type wait_period: int
+    :param wait_period: The amount of time to wait before sending the next
+        batch of data.
+    """
+    def __init__(self, exporter,
+                 grace_period=_DEFAULT_GRACE_PERIOD,
+                 max_batch_size=_DEFAULT_MAX_BATCH_SIZE,
+                 wait_period=_DEFAULT_WAIT_PERIOD):
+        self.exporter = exporter
+        self._grace_period = grace_period
+        self._max_batch_size = max_batch_size
+        self._wait_period = wait_period
+        self._queue = queue.Queue(0)
+        self._lock = threading.Lock()
+        self._event = threading.Event()
+        self._thread = None
+
+    @property
+    def is_alive(self):
+        """Returns True if the background thread is running."""
+        return self._thread is not None and self._thread.is_alive()
+
+    def _get_items(self):
+        """Get multiple items from a Queue.
+
+        Gets at least one (blocking) and at most ``max_batch_size`` items
+        (non-blocking) from a given Queue. Does not mark the items as done.
+
+        :rtype: Sequence
+        :returns: A sequence of items retrieved from the queue.
+        """
+        items = [self._queue.get()]
+
+        while len(items) < self._max_batch_size:
+            try:
+                items.append(self._queue.get_nowait())
+            except queue.Empty:
+                break
+
+        return items
+
+    def _thread_main(self):
+        """The entry point for the worker thread.
+
+        Pulls pending data off the queue and writes them in
+        batches to the specified tracing backend using the exporter.
+        """
+        # Indicate that this thread is an exporter thread.
+        # Used to suppress tracking of requests in this thread
+        execution_context.set_is_exporter(True)
+        quit_ = False
+
+        while True:
+            items = self._get_items()
+            data = []
+
+            for item in items:
+                if item is _WORKER_TERMINATOR:
+                    quit_ = True
+                    # Continue processing items, don't break, try to process
+                    # all items we got back before quitting.
+                else:
+                    data.extend(item)
+
+            if data:
+                try:
+                    self.exporter.emit(data)
+                except Exception:
+                    logger.exception(
+                        '%s failed to emit data. '
+                        'Dropping %s objects from queue.',
+                        self.exporter.__class__.__name__,
+                        len(data))
+
+            for _ in range(len(items)):
+                self._queue.task_done()
+
+            # self._event is set at exit, at which point we start draining the
+            # queue immediately. If self._event is unset, block for
+            # self._wait_period between each batch of exports.
+            self._event.wait(self._wait_period)
+
+            if quit_:
+                break
+
+    def start(self):
+        """Starts the background thread.
+
+        Additionally, this registers a handler for process exit to attempt
+        to send any pending data before shutdown.
+        """
+        with self._lock:
+            if self.is_alive:
+                return
+
+            self._thread = threading.Thread(
+                target=self._thread_main, name=_WORKER_THREAD_NAME)
+            self._thread.daemon = True
+            self._thread.start()
+            atexit.register(self._export_pending_data)
+
+    def stop(self):
+        """Signals the background thread to stop.
+
+        This does not terminate the background thread. It simply queues the
+        stop signal. If the main process exits before the background thread
+        processes the stop signal, it will be terminated without finishing
+        work. The ``grace_period`` parameter will give the background
+        thread some time to finish processing before this function returns.
+
+        :rtype: bool
+        :returns: True if the thread terminated. False if the thread is still
+            running.
+ """ + if not self.is_alive: + return True + + with self._lock: + self._queue.put_nowait(_WORKER_TERMINATOR) + self._thread.join(timeout=self._grace_period) + + success = not self.is_alive + self._thread = None + + return success + + def _export_pending_data(self): + """Callback that attempts to send pending data before termination.""" + if not self.is_alive: + return + # Stop blocking between export batches + self._event.set() + self.stop() + + def enqueue(self, data): + """Queues data to be written by the background thread.""" + self._queue.put_nowait(data) + + def flush(self): + """Submit any pending data.""" + self._queue.join() + + +class AsyncTransport(base.Transport): + """Asynchronous transport that uses a background thread. + + :type exporter: :class:`~opencensus.trace.base_exporter.Exporter` or + :class:`~opencensus.stats.base_exporter.StatsExporter` + :param exporter: Instance of Exporter object. + + :type grace_period: float + :param grace_period: The amount of time to wait for pending data to + be submitted when the process is shutting down. + + :type max_batch_size: int + :param max_batch_size: The maximum number of items to send at a time + in the background thread. + + :type wait_period: int + :param wait_period: The amount of time to wait before sending the next + batch of data. 
+ """ + + def __init__(self, exporter, + grace_period=_DEFAULT_GRACE_PERIOD, + max_batch_size=_DEFAULT_MAX_BATCH_SIZE, + wait_period=_DEFAULT_WAIT_PERIOD): + self.exporter = exporter + self.worker = _Worker( + exporter, + grace_period, + max_batch_size, + wait_period, + ) + self.worker.start() + + def export(self, data): + """Put the trace/stats to be exported into queue.""" + self.worker.enqueue(data) + + def flush(self): + """Submit any pending traces/stats.""" + self.worker.flush() diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/transports/base.py b/.venv/lib/python3.11/site-packages/opencensus/common/transports/base.py new file mode 100644 index 0000000000000000000000000000000000000000..11ed47b54c6d55f0f97c673a3e7622edc1962413 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/transports/base.py @@ -0,0 +1,31 @@ +# Copyright 2018, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""Module containing base class for transport.""" + + +class Transport(object): + """Base class for transport. + + Subclasses of :class:`Transport` must override :meth:`export`. + """ + def export(self, datas): + """Export the data.""" + raise NotImplementedError + + def flush(self): + """Submit any pending data. + + For blocking/sync transports, this is a no-op. 
+ """ diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/transports/sync.py b/.venv/lib/python3.11/site-packages/opencensus/common/transports/sync.py new file mode 100644 index 0000000000000000000000000000000000000000..cc78935150bcae6c8085d42d1bc6d4562b3ac4b1 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/transports/sync.py @@ -0,0 +1,28 @@ +# Copyright 2018, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from opencensus.common.transports import base +from opencensus.trace import execution_context + + +class SyncTransport(base.Transport): + def __init__(self, exporter): + self.exporter = exporter + + def export(self, datas): + # Used to suppress tracking of requests in export + execution_context.set_is_exporter(True) + self.exporter.emit(datas) + # Reset the context + execution_context.set_is_exporter(False) diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/utils/__init__.py b/.venv/lib/python3.11/site-packages/opencensus/common/utils/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..4114327c1b74639a29831acb03e8004eb1b69888 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/opencensus/common/utils/__init__.py @@ -0,0 +1,131 @@ +# Copyright 2018, OpenCensus Authors +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+try:
+    from weakref import WeakMethod
+except ImportError:
+    from opencensus.common.backports import WeakMethod
+
+import calendar
+import datetime
+import weakref
+
+UTF8 = 'utf-8'
+
+# Max length is 128 bytes for a truncatable string.
+MAX_LENGTH = 128
+
+ISO_DATETIME_REGEX = '%Y-%m-%dT%H:%M:%S.%fZ'
+
+
+def get_truncatable_str(str_to_convert):
+    """Truncate a string if it exceeds the limit, and record the truncated
+    byte count.
+    """
+    truncated, truncated_byte_count = check_str_length(
+        str_to_convert, MAX_LENGTH)
+
+    result = {
+        'value': truncated,
+        'truncated_byte_count': truncated_byte_count,
+    }
+    return result
+
+
+def check_str_length(str_to_check, limit=MAX_LENGTH):
+    """Check the length of a string, truncating it if it exceeds the limit.
+
+    :type str_to_check: str
+    :param str_to_check: String to check.
+
+    :type limit: int
+    :param limit: The upper limit of the length.
+
+    :rtype: tuple
+    :returns: The string itself if its encoded length is within the limit,
+        or the truncated string otherwise, plus the truncated byte count.
+ """ + str_bytes = str_to_check.encode(UTF8) + str_len = len(str_bytes) + truncated_byte_count = 0 + + if str_len > limit: + truncated_byte_count = str_len - limit + str_bytes = str_bytes[:limit] + + result = str(str_bytes.decode(UTF8, errors='ignore')) + + return (result, truncated_byte_count) + + +def to_iso_str(ts=None): + """Get an ISO 8601 string for a UTC datetime.""" + if ts is None: + ts = datetime.datetime.utcnow() + return ts.strftime("%Y-%m-%dT%H:%M:%S.%fZ") + + +def timestamp_to_microseconds(timestamp): + """Convert a timestamp string into a microseconds value + :param timestamp + :return time in microseconds + """ + timestamp_str = datetime.datetime.strptime(timestamp, ISO_DATETIME_REGEX) + epoch_time_secs = calendar.timegm(timestamp_str.timetuple()) + epoch_time_mus = epoch_time_secs * 1e6 + timestamp_str.microsecond + return epoch_time_mus + + +def iuniq(ible): + """Get an iterator over unique items of `ible`.""" + items = set() + for item in ible: + if item not in items: + items.add(item) + yield item + + +def uniq(ible): + """Get a list of unique items of `ible`.""" + return list(iuniq(ible)) + + +def window(ible, length): + """Split `ible` into multiple lists of length `length`. + + >>> list(window(range(5), 2)) + [[0, 1], [2, 3], [4]] + """ + if length <= 0: # pragma: NO COVER + raise ValueError + ible = iter(ible) + while True: + elts = [xx for ii, xx in zip(range(length), ible)] + if elts: + yield elts + else: + break + + +def get_weakref(func): + """Get a weak reference to bound or unbound `func`. + + If `func` is unbound (i.e. has no __self__ attr) get a weakref.ref, + otherwise get a wrapper that simulates weakref.ref. 
+ """ + if func is None: + raise ValueError + if not hasattr(func, '__self__'): + return weakref.ref(func) + return WeakMethod(func) diff --git a/.venv/lib/python3.11/site-packages/opencensus/common/utils/__pycache__/__init__.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/common/utils/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..24ce63b8a2bbf8be98531cc98e84def2c3869618 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/common/utils/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/__init__.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..f54161bc908d3856cd51df60aeb14d41789daeb1 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/label_key.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/label_key.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..bf93b354ff86bfaee58c383bb9d20a65163e38c1 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/label_key.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/label_value.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/label_value.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..13c7d4321dfd2552f99d34913bcc03465dbe910f Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/label_value.cpython-311.pyc differ diff --git 
a/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/transport.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/transport.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..41c4b64c27f6bf074c23c54dc26493b54a28cbce Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/metrics/__pycache__/transport.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/metrics/export/__pycache__/metric_descriptor.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/metrics/export/__pycache__/metric_descriptor.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..370316a27d70bf483253bcedd628960f1e80c2d9 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/metrics/export/__pycache__/metric_descriptor.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/metrics/export/__pycache__/metric_producer.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/metrics/export/__pycache__/metric_producer.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..4f8ebfa3db2ecf8a52f8fc6298d4cc2cab5cc8e6 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/metrics/export/__pycache__/metric_producer.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/metrics/export/__pycache__/value.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/metrics/export/__pycache__/value.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..16a7f98c8fda212b29e932a64e9aec936866550f Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/metrics/export/__pycache__/value.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/__init__.cpython-311.pyc 
b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..cedd08fa9770cdbfb76ee84b6b18c79701766fb7 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/attributes_helper.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/attributes_helper.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..704d57113d4a19d95d7948bef7067fd66f7e242c Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/attributes_helper.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/blank_span.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/blank_span.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..fd3f70e763bfc4af120ca2092d6d9c2ad5967dfc Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/blank_span.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/exceptions_status.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/exceptions_status.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..cc4e2602be8cb340c32c691db1b19abf91258d07 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/exceptions_status.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/integrations.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/integrations.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..bd55adc2b698a83318031f16f8a60a3487257745 Binary files 
/dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/integrations.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/link.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/link.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..8e43b1b5951cfcb5c72d35bdb1f37bf4b76aed52 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/link.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/logging_exporter.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/logging_exporter.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..5b4c3a92b68dffe130731e55a0358fd1a0852fc7 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/logging_exporter.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/span.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/span.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..669a0f29230b6d23b589392472585012e9cc1255 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/span.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/stack_trace.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/stack_trace.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..56a349e3900c039078176d9c754273fdb64d06ff Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/stack_trace.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/status.cpython-311.pyc 
b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/status.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..cc4d1cd5cc1da7836e41d916c19961d174eac4f8 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/status.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/time_event.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/time_event.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..b360283c51b9fc4cf691806e80e2601ed5795941 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/time_event.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/trace_options.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/trace_options.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..8ca384035645aac46f765766b2d9aa9ec86a0824 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/trace_options.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/tracer.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/tracer.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..a485b97a126359198950f863d8b410d5811dd98d Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/tracer.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/tracestate.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/tracestate.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..5310a014efeb14df8de86bccfb8960ea87b807b2 Binary files /dev/null and 
b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/tracestate.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/utils.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/utils.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..b91748d2756ee99a874a44b742ba20200f5504c3 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/__pycache__/utils.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/opencensus/trace/samplers/__pycache__/__init__.cpython-311.pyc b/.venv/lib/python3.11/site-packages/opencensus/trace/samplers/__pycache__/__init__.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..393ac57aebe1e2b8d975a920fe45763c9a535ef4 Binary files /dev/null and b/.venv/lib/python3.11/site-packages/opencensus/trace/samplers/__pycache__/__init__.cpython-311.pyc differ diff --git a/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/INSTALLER b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/INSTALLER new file mode 100644 index 0000000000000000000000000000000000000000..a1b589e38a32041e49332e5e81c2d363dc418d68 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/INSTALLER @@ -0,0 +1 @@ +pip diff --git a/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/LICENSE b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..85f4dd63d2da8e31d7e84d5180f016fdfe315c2c --- /dev/null +++ b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2015 holger krekel (rather uses bitbucket/hpk42) + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including 
without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/METADATA b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/METADATA new file mode 100644 index 0000000000000000000000000000000000000000..2d697b0d7219c58fa370de4c2eeca04e0afed575 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/METADATA @@ -0,0 +1,155 @@ +Metadata-Version: 2.1 +Name: pluggy +Version: 1.5.0 +Summary: plugin and hook calling mechanisms for python +Home-page: https://github.com/pytest-dev/pluggy +Author: Holger Krekel +Author-email: holger@merlinux.eu +License: MIT +Platform: unix +Platform: linux +Platform: osx +Platform: win32 +Classifier: Development Status :: 6 - Mature +Classifier: Intended Audience :: Developers +Classifier: License :: OSI Approved :: MIT License +Classifier: Operating System :: POSIX +Classifier: Operating System :: Microsoft :: Windows +Classifier: Operating System :: MacOS :: MacOS X +Classifier: Topic :: Software Development :: Testing +Classifier: Topic :: Software Development :: Libraries +Classifier: Topic :: Utilities +Classifier: Programming Language :: Python :: Implementation :: CPython 
+Classifier: Programming Language :: Python :: Implementation :: PyPy +Classifier: Programming Language :: Python :: 3 +Classifier: Programming Language :: Python :: 3 :: Only +Classifier: Programming Language :: Python :: 3.8 +Classifier: Programming Language :: Python :: 3.9 +Classifier: Programming Language :: Python :: 3.10 +Classifier: Programming Language :: Python :: 3.11 +Requires-Python: >=3.8 +Description-Content-Type: text/x-rst +License-File: LICENSE +Provides-Extra: dev +Requires-Dist: pre-commit ; extra == 'dev' +Requires-Dist: tox ; extra == 'dev' +Provides-Extra: testing +Requires-Dist: pytest ; extra == 'testing' +Requires-Dist: pytest-benchmark ; extra == 'testing' + +==================================================== +pluggy - A minimalist production ready plugin system +==================================================== + +|pypi| |conda-forge| |versions| |github-actions| |gitter| |black| |codecov| + +This is the core framework used by the `pytest`_, `tox`_, and `devpi`_ projects. + +Please `read the docs`_ to learn more! + +A definitive example +==================== +.. 
code-block:: python + + import pluggy + + hookspec = pluggy.HookspecMarker("myproject") + hookimpl = pluggy.HookimplMarker("myproject") + + + class MySpec: + """A hook specification namespace.""" + + @hookspec + def myhook(self, arg1, arg2): + """My special little hook that you can customize.""" + + + class Plugin_1: + """A hook implementation namespace.""" + + @hookimpl + def myhook(self, arg1, arg2): + print("inside Plugin_1.myhook()") + return arg1 + arg2 + + + class Plugin_2: + """A 2nd hook implementation namespace.""" + + @hookimpl + def myhook(self, arg1, arg2): + print("inside Plugin_2.myhook()") + return arg1 - arg2 + + + # create a manager and add the spec + pm = pluggy.PluginManager("myproject") + pm.add_hookspecs(MySpec) + + # register plugins + pm.register(Plugin_1()) + pm.register(Plugin_2()) + + # call our ``myhook`` hook + results = pm.hook.myhook(arg1=1, arg2=2) + print(results) + + +Running this directly gets us:: + + $ python docs/examples/toy-example.py + inside Plugin_2.myhook() + inside Plugin_1.myhook() + [-1, 3] + + +.. badges + +.. |pypi| image:: https://img.shields.io/pypi/v/pluggy.svg + :target: https://pypi.org/pypi/pluggy + +.. |versions| image:: https://img.shields.io/pypi/pyversions/pluggy.svg + :target: https://pypi.org/pypi/pluggy + +.. |github-actions| image:: https://github.com/pytest-dev/pluggy/workflows/main/badge.svg + :target: https://github.com/pytest-dev/pluggy/actions + +.. |conda-forge| image:: https://img.shields.io/conda/vn/conda-forge/pluggy.svg + :target: https://anaconda.org/conda-forge/pytest + +.. |gitter| image:: https://badges.gitter.im/pytest-dev/pluggy.svg + :alt: Join the chat at https://gitter.im/pytest-dev/pluggy + :target: https://gitter.im/pytest-dev/pluggy?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge + +.. |black| image:: https://img.shields.io/badge/code%20style-black-000000.svg + :target: https://github.com/ambv/black + +.. 
|codecov| image:: https://codecov.io/gh/pytest-dev/pluggy/branch/master/graph/badge.svg + :target: https://codecov.io/gh/pytest-dev/pluggy + :alt: Code coverage Status + +.. links +.. _pytest: + http://pytest.org +.. _tox: + https://tox.readthedocs.org +.. _devpi: + http://doc.devpi.net +.. _read the docs: + https://pluggy.readthedocs.io/en/latest/ + + +Support pluggy +-------------- + +`Open Collective`_ is an online funding platform for open and transparent communities. +It provides tools to raise money and share your finances in full transparency. + +It is the platform of choice for individuals and companies that want to make one-time or +monthly donations directly to the project. + +``pluggy`` is part of the ``pytest-dev`` project, see more details in the `pytest collective`_. + +.. _Open Collective: https://opencollective.com +.. _pytest collective: https://opencollective.com/pytest diff --git a/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/RECORD b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/RECORD new file mode 100644 index 0000000000000000000000000000000000000000..3c6c69008ec6086dad0db4b3c3ee5b1b184cef83 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/RECORD @@ -0,0 +1,23 @@ +pluggy-1.5.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 +pluggy-1.5.0.dist-info/LICENSE,sha256=1rZebCE6XQtXeRHTTW5ZSbn1nXbCOMUHGi8_wWz7JgY,1110 +pluggy-1.5.0.dist-info/METADATA,sha256=6JeHn3o9P9iqwK20MgVHdoqxick1SS3SORb65Iyb-Fw,4812 +pluggy-1.5.0.dist-info/RECORD,, +pluggy-1.5.0.dist-info/WHEEL,sha256=GJ7t_kWBFywbagK5eo9IoUwLW6oyOeTKmQ-9iHFVNxQ,92 +pluggy-1.5.0.dist-info/top_level.txt,sha256=xKSCRhai-v9MckvMuWqNz16c1tbsmOggoMSwTgcpYHE,7 +pluggy/__init__.py,sha256=U8qtIRmmr0SRdbxAF8VJJs01jMUYgKAc9oAjYYCLgz4,980 +pluggy/__pycache__/__init__.cpython-311.pyc,, +pluggy/__pycache__/_callers.cpython-311.pyc,, +pluggy/__pycache__/_hooks.cpython-311.pyc,, +pluggy/__pycache__/_manager.cpython-311.pyc,, 
+pluggy/__pycache__/_result.cpython-311.pyc,, +pluggy/__pycache__/_tracing.cpython-311.pyc,, +pluggy/__pycache__/_version.cpython-311.pyc,, +pluggy/__pycache__/_warnings.cpython-311.pyc,, +pluggy/_callers.py,sha256=8k8i3GVBT_gtccCPFpN8Ww0towWSnSazrl0vbP9UXSY,7316 +pluggy/_hooks.py,sha256=m-3qVLDdn4S9y3pffLOpMQeDI4PDw8hrATK1SC8rQkU,25108 +pluggy/_manager.py,sha256=ylIDFwrUP_mMAGpdRPj9zwxukG7nWJAfY1yylXyXAMo,20265 +pluggy/_result.py,sha256=eEak-7Ie88bRkylsgbLwB6iMogogIMZheq8W3bImmcs,2849 +pluggy/_tracing.py,sha256=kSBr25F_rNklV2QhLD6h1jx6Z1kcKDRbuYvF5jv35pU,2089 +pluggy/_version.py,sha256=OYzqgMEgfFG0au4hzbEdgYI-c7Hxo3wdBtrpEjK1RoY,411 +pluggy/_warnings.py,sha256=td0AvZBpfamriCC3OqsLwxMh-SzAMjfjmc58T5vP3lw,828 +pluggy/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0 diff --git a/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/WHEEL b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/WHEEL new file mode 100644 index 0000000000000000000000000000000000000000..bab98d675883cc7567a79df485cd7b4f015e376f --- /dev/null +++ b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/WHEEL @@ -0,0 +1,5 @@ +Wheel-Version: 1.0 +Generator: bdist_wheel (0.43.0) +Root-Is-Purelib: true +Tag: py3-none-any + diff --git a/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/top_level.txt b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/top_level.txt new file mode 100644 index 0000000000000000000000000000000000000000..11bdb5c1f5fcdd91af5d587c352039cb8476af49 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/pluggy-1.5.0.dist-info/top_level.txt @@ -0,0 +1 @@ +pluggy diff --git a/.venv/lib/python3.11/site-packages/proto/_file_info.py b/.venv/lib/python3.11/site-packages/proto/_file_info.py new file mode 100644 index 0000000000000000000000000000000000000000..537eeaf4556aa44a7c532c56e08850250d57148f --- /dev/null +++ b/.venv/lib/python3.11/site-packages/proto/_file_info.py @@ -0,0 +1,196 @@ +# Copyright 2018 Google LLC +# +# Licensed under 
the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import collections +import inspect +import logging + +from google.protobuf import descriptor_pb2 +from google.protobuf import descriptor_pool +from google.protobuf import message +from google.protobuf import reflection + +from proto.marshal.rules.message import MessageRule + +log = logging.getLogger("_FileInfo") + + +class _FileInfo( + collections.namedtuple( + "_FileInfo", + ["descriptor", "messages", "enums", "name", "nested", "nested_enum"], + ) +): + registry = {} # Mapping[str, '_FileInfo'] + + @classmethod + def maybe_add_descriptor(cls, filename, package): + descriptor = cls.registry.get(filename) + if not descriptor: + descriptor = cls.registry[filename] = cls( + descriptor=descriptor_pb2.FileDescriptorProto( + name=filename, + package=package, + syntax="proto3", + ), + enums=collections.OrderedDict(), + messages=collections.OrderedDict(), + name=filename, + nested={}, + nested_enum={}, + ) + + return descriptor + + @staticmethod + def proto_file_name(name): + return "{0}.proto".format(name.replace(".", "/")) + + def _get_manifest(self, new_class): + module = inspect.getmodule(new_class) + if hasattr(module, "__protobuf__"): + return frozenset(module.__protobuf__.manifest) + + return frozenset() + + def _get_remaining_manifest(self, new_class): + return self._get_manifest(new_class) - {new_class.__name__} + + def _calculate_salt(self, new_class, fallback): + manifest = self._get_manifest(new_class) + if manifest and 
new_class.__name__ not in manifest: + log.warning( + "proto-plus module {module} has a declared manifest but {class_name} is not in it".format( + module=inspect.getmodule(new_class).__name__, + class_name=new_class.__name__, + ) + ) + + return "" if new_class.__name__ in manifest else (fallback or "").lower() + + def generate_file_pb(self, new_class, fallback_salt=""): + """Generate the descriptors for all protos in the file. + + This method takes the file descriptor attached to the parent + message and generates the immutable descriptors for all of the + messages in the file descriptor. (This must be done in one fell + swoop for immutability and to resolve proto cross-referencing.) + + This is run automatically when the last proto in the file is + generated, as determined by the module's __all__ tuple. + """ + pool = descriptor_pool.Default() + + # Salt the filename in the descriptor. + # This allows re-use of the filename by other proto messages if + # needed (e.g. if __all__ is not used). + salt = self._calculate_salt(new_class, fallback_salt) + self.descriptor.name = "{name}.proto".format( + name="_".join([self.descriptor.name[:-6], salt]).rstrip("_"), + ) + + # Add the file descriptor. + pool.Add(self.descriptor) + + # Adding the file descriptor to the pool created a descriptor for + # each message; go back through our wrapper messages and associate + # them with the internal protobuf version. + for full_name, proto_plus_message in self.messages.items(): + # Get the descriptor from the pool, and create the protobuf + # message based on it. + descriptor = pool.FindMessageTypeByName(full_name) + pb_message = reflection.GeneratedProtocolMessageType( + descriptor.name, + (message.Message,), + {"DESCRIPTOR": descriptor, "__module__": None}, + ) + + # Register the message with the marshal so it is wrapped + # appropriately. 
+ # + # We do this here (rather than at class creation) because it + # is not until this point that we have an actual protobuf + # message subclass, which is what we need to use. + proto_plus_message._meta._pb = pb_message + proto_plus_message._meta.marshal.register( + pb_message, MessageRule(pb_message, proto_plus_message) + ) + + # Iterate over any fields on the message and, if their type + # is a message still referenced as a string, resolve the reference. + for field in proto_plus_message._meta.fields.values(): + if field.message and isinstance(field.message, str): + field.message = self.messages[field.message] + elif field.enum and isinstance(field.enum, str): + field.enum = self.enums[field.enum] + + # Same thing for enums + for full_name, proto_plus_enum in self.enums.items(): + descriptor = pool.FindEnumTypeByName(full_name) + proto_plus_enum._meta.pb = descriptor + + # We no longer need to track this file's info; remove it from + # the module's registry and from this object. + self.registry.pop(self.name) + + def ready(self, new_class): + """Return True if a file descriptor may be added, False otherwise. + + This determines whether all the messages that we plan to create have been + created, as best as we are able. + + Since messages depend on one another, we create descriptor protos + (which reference each other using strings) and wait until we have + built everything that is going to be in the module, and then + use the descriptor protos to instantiate the actual descriptors in + one fell swoop. + + Args: + new_class (~.MessageMeta): The new class currently undergoing + creation. + """ + # If there are any nested descriptors that have not been assigned to + # the descriptors that should contain them, then we are not ready. + if len(self.nested) or len(self.nested_enum): + return False + + # If there are any unresolved fields (fields with a composite message + # declared as a string), ensure that the corresponding message is + # declared.
+ for field in self.unresolved_fields: + if (field.message and field.message not in self.messages) or ( + field.enum and field.enum not in self.enums + ): + return False + + # If the module in which this class is defined provides a + # __protobuf__ property, it may have a manifest. + # + # Do not generate the file descriptor until every member of the + # manifest has been populated. + module = inspect.getmodule(new_class) + manifest = self._get_remaining_manifest(new_class) + + # We are ready if all members have been populated. + return all(hasattr(module, i) for i in manifest) + + @property + def unresolved_fields(self): + """Return fields with referencing message types as strings.""" + for proto_plus_message in self.messages.values(): + for field in proto_plus_message._meta.fields.values(): + if (field.message and isinstance(field.message, str)) or ( + field.enum and isinstance(field.enum, str) + ): + yield field diff --git a/.venv/lib/python3.11/site-packages/proto/datetime_helpers.py b/.venv/lib/python3.11/site-packages/proto/datetime_helpers.py new file mode 100644 index 0000000000000000000000000000000000000000..ffac4f47d95a123dcc6e70d09fe5018fde79cb6a --- /dev/null +++ b/.venv/lib/python3.11/site-packages/proto/datetime_helpers.py @@ -0,0 +1,225 @@ +# Copyright 2017 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
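The `ready()` / `unresolved_fields` pair above implements a deferred-resolution pattern: classes may reference each other by *string* name, and finalization waits until every referenced name exists in the registry. A minimal standalone sketch of that pattern, assuming hypothetical names (`Registry`, `add`, `ready` are illustrative, not the proto-plus API):

```python
# Hypothetical sketch of the deferred-reference pattern used by
# _FileInfo.ready()/unresolved_fields: classes may reference each
# other by string name, and we only finalize once every name in
# the registry resolves to a registered class.

class Registry:
    def __init__(self):
        self.classes = {}   # name -> class
        self.pending = {}   # name -> list of referenced names

    def add(self, name, refs):
        self.classes[name] = type(name, (), {})
        self.pending[name] = list(refs)

    def ready(self):
        # Ready only when every string reference points at a
        # registered class -- mirrors the check in _FileInfo.ready().
        return all(
            ref in self.classes
            for refs in self.pending.values()
            for ref in refs
        )

reg = Registry()
reg.add("Author", refs=["Book"])   # forward reference by name
print(reg.ready())                 # False: "Book" not registered yet
reg.add("Book", refs=[])
print(reg.ready())                 # True: all references resolvable
```

This is why `generate_file_pb` can be run "in one fell swoop" once `ready()` returns True: by that point no field still holds a dangling string reference.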
+ +"""Helpers for :mod:`datetime`.""" + +import calendar +import datetime +import re + +from google.protobuf import timestamp_pb2 + + +_UTC_EPOCH = datetime.datetime.fromtimestamp(0, datetime.timezone.utc) + +_RFC3339_MICROS = "%Y-%m-%dT%H:%M:%S.%fZ" +_RFC3339_NO_FRACTION = "%Y-%m-%dT%H:%M:%S" +# datetime.strptime cannot handle nanosecond precision: parse w/ regex +_RFC3339_NANOS = re.compile( + r""" + (?P + \d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2} # YYYY-MM-DDTHH:MM:SS + ) + ( # Optional decimal part + \. # decimal point + (?P\d{1,9}) # nanoseconds, maybe truncated + )? + Z # Zulu +""", + re.VERBOSE, +) + + +def _from_microseconds(value): + """Convert timestamp in microseconds since the unix epoch to datetime. + + Args: + value (float): The timestamp to convert, in microseconds. + + Returns: + datetime.datetime: The datetime object equivalent to the timestamp in + UTC. + """ + return _UTC_EPOCH + datetime.timedelta(microseconds=value) + + +def _to_rfc3339(value, ignore_zone=True): + """Convert a datetime to an RFC3339 timestamp string. + + Args: + value (datetime.datetime): + The datetime object to be converted to a string. + ignore_zone (bool): If True, then the timezone (if any) of the + datetime object is ignored and the datetime is treated as UTC. + + Returns: + str: The RFC3339 formatted string representing the datetime. + """ + if not ignore_zone and value.tzinfo is not None: + # Convert to UTC and remove the time zone info. + value = value.replace(tzinfo=None) - value.utcoffset() + + return value.strftime(_RFC3339_MICROS) + + +class DatetimeWithNanoseconds(datetime.datetime): + """Track nanosecond in addition to normal datetime attrs. + + Nanosecond can be passed only as a keyword argument. 
+ """ + + __slots__ = ("_nanosecond",) + + # pylint: disable=arguments-differ + def __new__(cls, *args, **kw): + nanos = kw.pop("nanosecond", 0) + if nanos > 0: + if "microsecond" in kw: + raise TypeError("Specify only one of 'microsecond' or 'nanosecond'") + kw["microsecond"] = nanos // 1000 + inst = datetime.datetime.__new__(cls, *args, **kw) + inst._nanosecond = nanos or 0 + return inst + + # pylint: disable=arguments-differ + def replace(self, *args, **kw): + """Return a date with the same value, except for those parameters given + new values by whichever keyword arguments are specified. For example, + if d == date(2002, 12, 31), then + d.replace(day=26) == date(2002, 12, 26). + NOTE: nanosecond and microsecond are mutually exclusive arguments. + """ + + ms_provided = "microsecond" in kw + ns_provided = "nanosecond" in kw + provided_ns = kw.pop("nanosecond", 0) + + prev_nanos = self.nanosecond + + if ms_provided and ns_provided: + raise TypeError("Specify only one of 'microsecond' or 'nanosecond'") + + if ns_provided: + # if nanos were provided, manipulate microsecond kw arg to super + kw["microsecond"] = provided_ns // 1000 + inst = super().replace(*args, **kw) + + if ms_provided: + # ms were provided, nanos are invalid, build from ms + inst._nanosecond = inst.microsecond * 1000 + elif ns_provided: + # ns were provided, replace nanoseconds to match after calling super + inst._nanosecond = provided_ns + else: + # if neither ms or ns were provided, passthru previous nanos. + inst._nanosecond = prev_nanos + + return inst + + @property + def nanosecond(self): + """Read-only: nanosecond precision.""" + return self._nanosecond or self.microsecond * 1000 + + def rfc3339(self): + """Return an RFC3339-compliant timestamp. + + Returns: + (str): Timestamp string according to RFC3339 spec. 
+ """ + if self._nanosecond == 0: + return _to_rfc3339(self) + nanos = str(self._nanosecond).rjust(9, "0").rstrip("0") + return "{}.{}Z".format(self.strftime(_RFC3339_NO_FRACTION), nanos) + + @classmethod + def from_rfc3339(cls, stamp): + """Parse RFC3339-compliant timestamp, preserving nanoseconds. + + Args: + stamp (str): RFC3339 stamp, with up to nanosecond precision + + Returns: + :class:`DatetimeWithNanoseconds`: + an instance matching the timestamp string + + Raises: + ValueError: if `stamp` does not match the expected format + """ + with_nanos = _RFC3339_NANOS.match(stamp) + if with_nanos is None: + raise ValueError( + "Timestamp: {}, does not match pattern: {}".format( + stamp, _RFC3339_NANOS.pattern + ) + ) + bare = datetime.datetime.strptime( + with_nanos.group("no_fraction"), _RFC3339_NO_FRACTION + ) + fraction = with_nanos.group("nanos") + if fraction is None: + nanos = 0 + else: + scale = 9 - len(fraction) + nanos = int(fraction) * (10**scale) + return cls( + bare.year, + bare.month, + bare.day, + bare.hour, + bare.minute, + bare.second, + nanosecond=nanos, + tzinfo=datetime.timezone.utc, + ) + + def timestamp_pb(self): + """Return a timestamp message. + + Returns: + (:class:`~google.protobuf.timestamp_pb2.Timestamp`): Timestamp message + """ + inst = ( + self + if self.tzinfo is not None + else self.replace(tzinfo=datetime.timezone.utc) + ) + delta = inst - _UTC_EPOCH + seconds = int(delta.total_seconds()) + nanos = self._nanosecond or self.microsecond * 1000 + return timestamp_pb2.Timestamp(seconds=seconds, nanos=nanos) + + @classmethod + def from_timestamp_pb(cls, stamp): + """Parse RFC3339-compliant timestamp, preserving nanoseconds. 
+ + Args: + stamp (:class:`~google.protobuf.timestamp_pb2.Timestamp`): timestamp message + + Returns: + :class:`DatetimeWithNanoseconds`: + an instance matching the timestamp message + """ + microseconds = int(stamp.seconds * 1e6) + bare = _from_microseconds(microseconds) + return cls( + bare.year, + bare.month, + bare.day, + bare.hour, + bare.minute, + bare.second, + nanosecond=stamp.nanos, + tzinfo=datetime.timezone.utc, + ) diff --git a/.venv/lib/python3.11/site-packages/proto/fields.py b/.venv/lib/python3.11/site-packages/proto/fields.py new file mode 100644 index 0000000000000000000000000000000000000000..6f5b6452161935c20f3cc8ef2f0e1aa121c51263 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/proto/fields.py @@ -0,0 +1,165 @@ +# Copyright 2018 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from enum import EnumMeta + +from google.protobuf import descriptor_pb2 +from google.protobuf.internal.enum_type_wrapper import EnumTypeWrapper + +from proto.primitives import ProtoType + + +class Field: + """A representation of a type of field in protocol buffers.""" + + # Fields are NOT repeated nor maps. + # The RepeatedField class overrides these values. + repeated = False + + def __init__( + self, + proto_type, + *, + number: int, + message=None, + enum=None, + oneof: str = None, + json_name: str = None, + optional: bool = False + ): + # This class is not intended to stand entirely alone; + # data is augmented by the metaclass for Message.
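As the comment in `datetime_helpers.py` notes, `datetime.strptime` cannot handle nanosecond precision, so `DatetimeWithNanoseconds.from_rfc3339` splits the work: `strptime` parses the whole seconds and a regex captures the (possibly truncated) fraction, which is then scaled up to 9 digits. A minimal sketch of that split; the pattern and `parse_nanos` helper below are simplified stand-ins for the module's `_RFC3339_NANOS` regex, not its public API:

```python
# Sketch of nanosecond RFC3339 parsing: strptime handles the whole
# seconds, a regex captures the fractional digits, and the fraction
# is right-padded (by scaling) to nanoseconds.
import datetime
import re

_STAMP = re.compile(
    r"(?P<no_fraction>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})"
    r"(?:\.(?P<nanos>\d{1,9}))?Z"
)

def parse_nanos(stamp: str):
    match = _STAMP.match(stamp)
    if match is None:
        raise ValueError(f"bad timestamp: {stamp!r}")
    bare = datetime.datetime.strptime(
        match.group("no_fraction"), "%Y-%m-%dT%H:%M:%S"
    )
    fraction = match.group("nanos") or ""
    # "5" means 0.5 s = 500_000_000 ns: scale truncated digits up to 9.
    nanos = int(fraction) * 10 ** (9 - len(fraction)) if fraction else 0
    return bare.replace(tzinfo=datetime.timezone.utc), nanos

dt, nanos = parse_nanos("2023-01-02T03:04:05.123456789Z")
print(nanos)  # 123456789
dt, nanos = parse_nanos("2023-01-02T03:04:05.5Z")
print(nanos)  # 500000000
```

The scaling step is the key detail: a truncated fraction like `.5` must be interpreted as the most significant digits of the nanosecond count, not as `5` nanoseconds.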
+ self.mcls_data = None + self.parent = None + + # If the proto type sent is an object or a string, it is really + # a message or enum. + if not isinstance(proto_type, int): + # Note: We only support the "shortcut syntax" for enums + # when receiving the actual class. + if isinstance(proto_type, (EnumMeta, EnumTypeWrapper)): + enum = proto_type + proto_type = ProtoType.ENUM + else: + message = proto_type + proto_type = ProtoType.MESSAGE + + # Save the direct arguments. + self.number = number + self.proto_type = proto_type + self.message = message + self.enum = enum + self.json_name = json_name + self.optional = optional + self.oneof = oneof + + # Once the descriptor is accessed the first time, cache it. + # This is important because in rare cases the message or enum + # types are written later. + self._descriptor = None + + @property + def descriptor(self): + """Return the descriptor for the field.""" + if not self._descriptor: + # Resolve the message type, if any, to a string. + type_name = None + if isinstance(self.message, str): + if not self.message.startswith(self.package): + self.message = "{package}.{name}".format( + package=self.package, + name=self.message, + ) + type_name = self.message + elif self.message: + type_name = ( + self.message.DESCRIPTOR.full_name + if hasattr(self.message, "DESCRIPTOR") + else self.message._meta.full_name + ) + elif isinstance(self.enum, str): + if not self.enum.startswith(self.package): + self.enum = "{package}.{name}".format( + package=self.package, + name=self.enum, + ) + type_name = self.enum + elif self.enum: + type_name = ( + self.enum.DESCRIPTOR.full_name + if hasattr(self.enum, "DESCRIPTOR") + else self.enum._meta.full_name + ) + + # Set the descriptor. 
+ self._descriptor = descriptor_pb2.FieldDescriptorProto( + name=self.name, + number=self.number, + label=3 if self.repeated else 1, + type=self.proto_type, + type_name=type_name, + json_name=self.json_name, + proto3_optional=self.optional, + ) + + # Return the descriptor. + return self._descriptor + + @property + def name(self) -> str: + """Return the name of the field.""" + return self.mcls_data["name"] + + @property + def package(self) -> str: + """Return the package of the field.""" + return self.mcls_data["package"] + + @property + def pb_type(self): + """Return the composite type of the field, or the primitive type if a primitive.""" + # For enums, return the Python enum. + if self.enum: + return self.enum + + # For primitive fields, we still want to know + # what the type is. + if not self.message: + return self.proto_type + + # Return the internal protobuf message. + if hasattr(self.message, "_meta"): + return self.message.pb() + return self.message + + +class RepeatedField(Field): + """A representation of a repeated field in protocol buffers.""" + + repeated = True + + +class MapField(Field): + """A representation of a map field in protocol buffers.""" + + def __init__(self, key_type, value_type, *, number: int, message=None, enum=None): + super().__init__(value_type, number=number, message=message, enum=enum) + self.map_key_type = key_type + + +__all__ = ( + "Field", + "MapField", + "RepeatedField", +) diff --git a/.venv/lib/python3.11/site-packages/proto/message.py b/.venv/lib/python3.11/site-packages/proto/message.py new file mode 100644 index 0000000000000000000000000000000000000000..989c1cd32a866806edfa1a3d579e3987a4717112 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/proto/message.py @@ -0,0 +1,969 @@ +# Copyright 2018 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import collections +import collections.abc +import copy +import re +from typing import List, Optional, Type +import warnings + +import google.protobuf +from google.protobuf import descriptor_pb2 +from google.protobuf import message +from google.protobuf.json_format import MessageToDict, MessageToJson, Parse + +from proto import _file_info +from proto import _package_info +from proto.fields import Field +from proto.fields import MapField +from proto.fields import RepeatedField +from proto.marshal import Marshal +from proto.primitives import ProtoType +from proto.utils import has_upb + + +PROTOBUF_VERSION = google.protobuf.__version__ + +_upb = has_upb() # Important to cache result here. + + +class MessageMeta(type): + """A metaclass for building and registering Message subclasses.""" + + def __new__(mcls, name, bases, attrs): + # Do not do any special behavior for Message itself. + if not bases: + return super().__new__(mcls, name, bases, attrs) + + # Get the essential information about the proto package, and where + # this component belongs within the file. + package, marshal = _package_info.compile(name, attrs) + + # Determine the local path of this proto component within the file. + local_path = tuple(attrs.get("__qualname__", name).split(".")) + + # Sanity check: We get the wrong full name if a class is declared + # inside a function local scope; correct this. + if "" in local_path: + ix = local_path.index("") + local_path = local_path[: ix - 1] + local_path[ix + 1 :] + + # Determine the full name in protocol buffers. 
+ full_name = ".".join((package,) + local_path).lstrip(".") + + # Special case: Maps. Map fields are special; they are essentially + # shorthand for a nested message and a repeated field of that message. + # Decompose each map into its constituent form. + # https://developers.google.com/protocol-buffers/docs/proto3#maps + map_fields = {} + for key, field in attrs.items(): + if not isinstance(field, MapField): + continue + + # Determine the name of the entry message. + msg_name = "{pascal_key}Entry".format( + pascal_key=re.sub( + r"_\w", + lambda m: m.group()[1:].upper(), + key, + ).replace(key[0], key[0].upper(), 1), + ) + + # Create the "entry" message (with the key and value fields). + # + # Note: We instantiate an ordered dictionary here and then + # attach key and value in order to ensure that the fields are + # iterated in the correct order when the class is created. + # This is only an issue in Python 3.5, where the order is + # random (and the wrong order causes the pool to refuse to add + # the descriptor because reasons). + entry_attrs = collections.OrderedDict( + { + "__module__": attrs.get("__module__", None), + "__qualname__": "{prefix}.{name}".format( + prefix=attrs.get("__qualname__", name), + name=msg_name, + ), + "_pb_options": {"map_entry": True}, + } + ) + entry_attrs["key"] = Field(field.map_key_type, number=1) + entry_attrs["value"] = Field( + field.proto_type, + number=2, + enum=field.enum, + message=field.message, + ) + map_fields[msg_name] = MessageMeta(msg_name, (Message,), entry_attrs) + + # Create the repeated field for the entry message. + map_fields[key] = RepeatedField( + ProtoType.MESSAGE, + number=field.number, + message=map_fields[msg_name], + ) + + # Add the new entries to the attrs + attrs.update(map_fields) + + # Okay, now we deal with all the rest of the fields. + # Iterate over all the attributes and separate the fields into + # their own sequence. 
+ fields = [] + new_attrs = {} + oneofs = collections.OrderedDict() + proto_imports = set() + index = 0 + for key, field in attrs.items(): + # Sanity check: If this is not a field, do nothing. + if not isinstance(field, Field): + # The field objects themselves should not be direct attributes. + new_attrs[key] = field + continue + + # Add data that the field requires that we do not take in the + # constructor because we can derive it from the metaclass. + # (The goal is to make the declaration syntax as nice as possible.) + field.mcls_data = { + "name": key, + "parent_name": full_name, + "index": index, + "package": package, + } + + # Add the field to the list of fields. + fields.append(field) + # If this field is part of a "oneof", ensure the oneof itself + # is represented. + if field.oneof: + # Keep a running tally of the index of each oneof, and assign + # that index to the field's descriptor. + oneofs.setdefault(field.oneof, len(oneofs)) + field.descriptor.oneof_index = oneofs[field.oneof] + + # If this field references a message, it may be from another + # proto file; ensure we know about the import (to faithfully + # construct our file descriptor proto). + if field.message and not isinstance(field.message, str): + field_msg = field.message + if hasattr(field_msg, "pb") and callable(field_msg.pb): + field_msg = field_msg.pb() + # Sanity check: The field's message may not yet be defined if + # it was a Message defined in the same file, and the file + # descriptor proto has not yet been generated. + # + # We do nothing in this situation; everything will be handled + # correctly when the file descriptor is created later. + if field_msg: + proto_imports.add(field_msg.DESCRIPTOR.file.name) + + # Same thing, but for enums. 
+ elif field.enum and not isinstance(field.enum, str): + field_enum = ( + field.enum._meta.pb + if hasattr(field.enum, "_meta") + else field.enum.DESCRIPTOR + ) + + if field_enum: + proto_imports.add(field_enum.file.name) + + # Increment the field index counter. + index += 1 + + # As per descriptor.proto, all synthetic oneofs must be ordered after + # 'real' oneofs. + opt_attrs = {} + for field in fields: + if field.optional: + field.oneof = "_{}".format(field.name) + field.descriptor.oneof_index = oneofs[field.oneof] = len(oneofs) + opt_attrs[field.name] = field.name + + # Generating a metaclass dynamically provides class attributes that + # instances can't see. This provides idiomatically named constants + # that enable the following pattern to check for field presence: + # + # class MyMessage(proto.Message): + # field = proto.Field(proto.INT32, number=1, optional=True) + # + # m = MyMessage() + # MyMessage.field in m + if opt_attrs: + mcls = type("AttrsMeta", (mcls,), opt_attrs) + + # Determine the filename. + # We determine an appropriate proto filename based on the + # Python module. + filename = _file_info._FileInfo.proto_file_name( + new_attrs.get("__module__", name.lower()) + ) + + # Get or create the information about the file, including the + # descriptor to which the new message descriptor shall be added. + file_info = _file_info._FileInfo.maybe_add_descriptor(filename, package) + + # Ensure any imports that would be necessary are assigned to the file + # descriptor proto being created. + for proto_import in proto_imports: + if proto_import not in file_info.descriptor.dependency: + file_info.descriptor.dependency.append(proto_import) + + # Retrieve any message options. + opts = descriptor_pb2.MessageOptions(**new_attrs.pop("_pb_options", {})) + + # Create the underlying proto descriptor. 
+ desc = descriptor_pb2.DescriptorProto( + name=name, + field=[i.descriptor for i in fields], + oneof_decl=[ + descriptor_pb2.OneofDescriptorProto(name=i) for i in oneofs.keys() + ], + options=opts, + ) + + # If any descriptors were nested under this one, they need to be + # attached as nested types here. + child_paths = [p for p in file_info.nested.keys() if local_path == p[:-1]] + for child_path in child_paths: + desc.nested_type.add().MergeFrom(file_info.nested.pop(child_path)) + + # Same thing, but for enums + child_paths = [p for p in file_info.nested_enum.keys() if local_path == p[:-1]] + for child_path in child_paths: + desc.enum_type.add().MergeFrom(file_info.nested_enum.pop(child_path)) + + # Add the descriptor to the file if it is a top-level descriptor, + # or to a "holding area" for nested messages otherwise. + if len(local_path) == 1: + file_info.descriptor.message_type.add().MergeFrom(desc) + else: + file_info.nested[local_path] = desc + + # Create the MessageInfo instance to be attached to this message. + new_attrs["_meta"] = _MessageInfo( + fields=fields, + full_name=full_name, + marshal=marshal, + options=opts, + package=package, + ) + + # Run the superclass constructor. + cls = super().__new__(mcls, name, bases, new_attrs) + + # The info class and fields need a reference to the class just created. + cls._meta.parent = cls + for field in cls._meta.fields.values(): + field.parent = cls + + # Add this message to the _FileInfo instance; this allows us to + # associate the descriptor with the message once the descriptor + # is generated. + file_info.messages[full_name] = cls + + # Generate the descriptor for the file if it is ready. + if file_info.ready(new_class=cls): + file_info.generate_file_pb(new_class=cls, fallback_salt=full_name) + + # Done; return the class. 
+ return cls + + @classmethod + def __prepare__(mcls, name, bases, **kwargs): + return collections.OrderedDict() + + @property + def meta(cls): + return cls._meta + + def __dir__(self): + try: + names = set(dir(type)) + names.update( + ( + "meta", + "pb", + "wrap", + "serialize", + "deserialize", + "to_json", + "from_json", + "to_dict", + "copy_from", + ) + ) + desc = self.pb().DESCRIPTOR + names.update(t.name for t in desc.nested_types) + names.update(e.name for e in desc.enum_types) + + return names + except AttributeError: + return dir(type) + + def pb(cls, obj=None, *, coerce: bool = False): + """Return the underlying protobuf Message class or instance. + + Args: + obj: If provided, and an instance of ``cls``, return the + underlying protobuf instance. + coerce (bool): If provided, will attempt to coerce ``obj`` to + ``cls`` if it is not already an instance. + """ + if obj is None: + return cls.meta.pb + if not isinstance(obj, cls): + if coerce: + obj = cls(obj) + else: + raise TypeError( + "%r is not an instance of %s" + % ( + obj, + cls.__name__, + ) + ) + return obj._pb + + def wrap(cls, pb): + """Return a Message object that shallowly wraps the descriptor. + + Args: + pb: A protocol buffer object, such as would be returned by + :meth:`pb`. + """ + # Optimized fast path. + instance = cls.__new__(cls) + super(cls, instance).__setattr__("_pb", pb) + return instance + + def serialize(cls, instance) -> bytes: + """Return the serialized proto. + + Args: + instance: An instance of this message type, or something + compatible (accepted by the type's constructor). + + Returns: + bytes: The serialized representation of the protocol buffer. + """ + return cls.pb(instance, coerce=True).SerializeToString() + + def deserialize(cls, payload: bytes) -> "Message": + """Given a serialized proto, deserialize it into a Message instance. + + Args: + payload (bytes): The serialized proto. 
+ + Returns: + ~.Message: An instance of the message class against which this + method was called. + """ + return cls.wrap(cls.pb().FromString(payload)) + + def _warn_if_including_default_value_fields_is_used_protobuf_5( + cls, including_default_value_fields: Optional[bool] + ) -> None: + """ + Warn Protobuf 5.x+ users that `including_default_value_fields` is deprecated if it is set. + + Args: + including_default_value_fields (Optional(bool)): The value of `including_default_value_fields` set by the user. + """ + if ( + PROTOBUF_VERSION[0] not in ("3", "4") + and including_default_value_fields is not None + ): + warnings.warn( + """The argument `including_default_value_fields` has been removed from + Protobuf 5.x. Please use `always_print_fields_with_no_presence` instead. + """, + DeprecationWarning, + ) + + def _raise_if_print_fields_values_are_set_and_differ( + cls, + always_print_fields_with_no_presence: Optional[bool], + including_default_value_fields: Optional[bool], + ) -> None: + """ + Raise Exception if both `always_print_fields_with_no_presence` and `including_default_value_fields` are set + and the values differ. + + Args: + always_print_fields_with_no_presence (Optional(bool)): The value of `always_print_fields_with_no_presence` set by the user. + including_default_value_fields (Optional(bool)): The value of `including_default_value_fields` set by the user. + Returns: + None + Raises: + ValueError: if both `always_print_fields_with_no_presence` and `including_default_value_fields` are set and + the values differ. 
+ """ + if ( + always_print_fields_with_no_presence is not None + and including_default_value_fields is not None + and always_print_fields_with_no_presence != including_default_value_fields + ): + raise ValueError( + "Arguments `always_print_fields_with_no_presence` and `including_default_value_fields` must match" + ) + + def _normalize_print_fields_without_presence( + cls, + always_print_fields_with_no_presence: Optional[bool], + including_default_value_fields: Optional[bool], + ) -> bool: + """ + Return True if fields with no presence should be included in the results. + By default, fields with no presence will be included in the results + when both `always_print_fields_with_no_presence` and + `including_default_value_fields` are not set. + + Args: + always_print_fields_with_no_presence (Optional(bool)): The value of `always_print_fields_with_no_presence` set by the user. + including_default_value_fields (Optional(bool)): The value of `including_default_value_fields` set by the user. + Returns: + bool: True if fields with no presence should be included in the results. + Raises: + ValueError: if both `always_print_fields_with_no_presence` and `including_default_value_fields` are set and + the values differ. 
+ """ + + cls._warn_if_including_default_value_fields_is_used_protobuf_5( + including_default_value_fields + ) + cls._raise_if_print_fields_values_are_set_and_differ( + always_print_fields_with_no_presence, including_default_value_fields + ) + # Default to True if neither `always_print_fields_with_no_presence` nor `including_default_value_fields` is set + return ( + ( + always_print_fields_with_no_presence is None + and including_default_value_fields is None + ) + or always_print_fields_with_no_presence + or including_default_value_fields + ) + + def to_json( + cls, + instance, + *, + use_integers_for_enums=True, + including_default_value_fields=None, + preserving_proto_field_name=False, + sort_keys=False, + indent=2, + float_precision=None, + always_print_fields_with_no_presence=None, + ) -> str: + """Given a message instance, serialize it to JSON. + + Args: + instance: An instance of this message type, or something + compatible (accepted by the type's constructor). + use_integers_for_enums (Optional(bool)): An option that determines whether enum + values should be represented by strings (False) or integers (True). + Default is True. + including_default_value_fields (Optional(bool)): Deprecated. Use argument + `always_print_fields_with_no_presence` instead. An option that + determines whether the default field values should be included in the results. + This value must match `always_print_fields_with_no_presence`, + if both arguments are explicitly set. + preserving_proto_field_name (Optional(bool)): An option that + determines whether field name representations preserve + proto case (snake_case) or use lowerCamelCase. Default is False. + sort_keys (Optional(bool)): If True, then the output will be sorted by field names. + Default is False. + indent (Optional(int)): The JSON object will be pretty-printed with this indent level. + An indent level of 0 or negative will only insert newlines. + Pass None for the most compact representation without newlines. 
+ float_precision (Optional(int)): If set, use this to specify float field valid digits. + Default is None. + always_print_fields_with_no_presence (Optional(bool)): If True, fields without + presence (implicit presence scalars, repeated fields, and map fields) will + always be serialized. Any field that supports presence is not affected by + this option (including singular message fields and oneof fields). + This value must match `including_default_value_fields`, + if both arguments are explicitly set. + Returns: + str: The json string representation of the protocol buffer. + """ + + print_fields = cls._normalize_print_fields_without_presence( + always_print_fields_with_no_presence, including_default_value_fields + ) + + if PROTOBUF_VERSION[0] in ("3", "4"): + return MessageToJson( + cls.pb(instance), + use_integers_for_enums=use_integers_for_enums, + including_default_value_fields=print_fields, + preserving_proto_field_name=preserving_proto_field_name, + sort_keys=sort_keys, + indent=indent, + float_precision=float_precision, + ) + else: + # The `including_default_value_fields` argument was removed from protobuf 5.x + # and replaced with `always_print_fields_with_no_presence`, which is very similar but + # handles optional fields consistently by not affecting them. + # The old flag accidentally had inconsistent behavior between proto2 + # optional and proto3 optional fields. + return MessageToJson( + cls.pb(instance), + use_integers_for_enums=use_integers_for_enums, + always_print_fields_with_no_presence=print_fields, + preserving_proto_field_name=preserving_proto_field_name, + sort_keys=sort_keys, + indent=indent, + float_precision=float_precision, + ) + + def from_json(cls, payload, *, ignore_unknown_fields=False) -> "Message": + """Given a json string representing an instance, + parse it into a message. + + Args: + payload: A json string representing a message. + ignore_unknown_fields (Optional(bool)): If True, do not raise errors + for unknown fields. 
+ + Returns: + ~.Message: An instance of the message class against which this + method was called. + """ + instance = cls() + Parse(payload, instance._pb, ignore_unknown_fields=ignore_unknown_fields) + return instance + + def to_dict( + cls, + instance, + *, + use_integers_for_enums=True, + preserving_proto_field_name=True, + including_default_value_fields=None, + float_precision=None, + always_print_fields_with_no_presence=None, + ) -> dict: + """Given a message instance, return its representation as a python dict. + + Args: + instance: An instance of this message type, or something + compatible (accepted by the type's constructor). + use_integers_for_enums (Optional(bool)): An option that determines whether enum + values should be represented by strings (False) or integers (True). + Default is True. + preserving_proto_field_name (Optional(bool)): An option that + determines whether field name representations preserve + proto case (snake_case) or use lowerCamelCase. Default is True. + including_default_value_fields (Optional(bool)): Deprecated. Use argument + `always_print_fields_with_no_presence` instead. An option that + determines whether the default field values should be included in the results. + This value must match `always_print_fields_with_no_presence`, + if both arguments are explicitly set. + float_precision (Optional(int)): If set, use this to specify float field valid digits. + Default is None. + always_print_fields_with_no_presence (Optional(bool)): If True, fields without + presence (implicit presence scalars, repeated fields, and map fields) will + always be serialized. Any field that supports presence is not affected by + this option (including singular message fields and oneof fields). This value + must match `including_default_value_fields`, if both arguments are explicitly set. + + Returns: + dict: A representation of the protocol buffer using pythonic data structures. 
+ Messages and map fields are represented as dicts, + repeated fields are represented as lists. + """ + + print_fields = cls._normalize_print_fields_without_presence( + always_print_fields_with_no_presence, including_default_value_fields + ) + + if PROTOBUF_VERSION[0] in ("3", "4"): + return MessageToDict( + cls.pb(instance), + including_default_value_fields=print_fields, + preserving_proto_field_name=preserving_proto_field_name, + use_integers_for_enums=use_integers_for_enums, + float_precision=float_precision, + ) + else: + # The `including_default_value_fields` argument was removed from protobuf 5.x + # and replaced with `always_print_fields_with_no_presence`, which is very similar but + # handles optional fields consistently by not affecting them. + # The old flag accidentally had inconsistent behavior between proto2 + # optional and proto3 optional fields. + return MessageToDict( + cls.pb(instance), + always_print_fields_with_no_presence=print_fields, + preserving_proto_field_name=preserving_proto_field_name, + use_integers_for_enums=use_integers_for_enums, + float_precision=float_precision, + ) + + def copy_from(cls, instance, other): + """Equivalent for protobuf.Message.CopyFrom + + Args: + instance: An instance of this message type + other: (Union[dict, ~.Message]): + A dictionary or message to reinitialize the values for this message. + """ + if isinstance(other, cls): + # Just want the underlying proto. + other = Message.pb(other) + elif isinstance(other, cls.pb()): + # Don't need to do anything. + pass + elif isinstance(other, collections.abc.Mapping): + # Coerce into a proto + other = cls._meta.pb(**other) + else: + raise TypeError( + "invalid argument type to copy to {}: {}".format( + cls.__name__, other.__class__.__name__ + ) + ) + + # Note: we can't just run self.__init__ because this may be a message field + # for a higher order proto; the memory layout for protos is NOT LIKE the + # python memory model. 
We cannot rely on just setting things by reference. + # Non-trivial complexity is (partially) hidden by the protobuf runtime. + cls.pb(instance).CopyFrom(other) + + +class Message(metaclass=MessageMeta): + """The abstract base class for a message. + + Args: + mapping (Union[dict, ~.Message]): A dictionary or message to be + used to determine the values for this message. + ignore_unknown_fields (Optional(bool)): If True, do not raise errors for + unknown fields. Only applied if `mapping` is a mapping type or there + are keyword parameters. + kwargs (dict): Keys and values corresponding to the fields of the + message. + """ + + def __init__( + self, + mapping=None, + *, + ignore_unknown_fields=False, + **kwargs, + ): + # We accept several things for `mapping`: + # * An instance of this class. + # * An instance of the underlying protobuf descriptor class. + # * A dict + # * Nothing (keyword arguments only). + if mapping is None: + if not kwargs: + # Special fast path for empty construction. + super().__setattr__("_pb", self._meta.pb()) + return + + mapping = kwargs + elif isinstance(mapping, self._meta.pb): + # Make a copy of the mapping. + # This is a constructor for a new object, so users will assume + # that it will not have side effects on the arguments being + # passed in. + # + # The `wrap` method on the metaclass is the public API for taking + # ownership of the passed in protobuf object. + mapping = copy.deepcopy(mapping) + if kwargs: + mapping.MergeFrom(self._meta.pb(**kwargs)) + + super().__setattr__("_pb", mapping) + return + elif isinstance(mapping, type(self)): + # Just use the above logic on mapping's underlying pb. + self.__init__(mapping=mapping._pb, **kwargs) + return + elif isinstance(mapping, collections.abc.Mapping): + # Can't have side effects on mapping. + mapping = copy.copy(mapping) + # kwargs entries take priority for duplicate keys. + mapping.update(kwargs) + else: + # Sanity check: Did we get something not a map? Error if so. 
+ raise TypeError( + "Invalid constructor input for %s: %r" + % ( + self.__class__.__name__, + mapping, + ) + ) + + params = {} + # Update the mapping to address any values that need to be + # coerced. + marshal = self._meta.marshal + for key, value in mapping.items(): + (key, pb_type) = self._get_pb_type_from_key(key) + if pb_type is None: + if ignore_unknown_fields: + continue + + raise ValueError( + "Unknown field for {}: {}".format(self.__class__.__name__, key) + ) + + pb_value = marshal.to_proto(pb_type, value) + + if pb_value is not None: + params[key] = pb_value + + # Create the internal protocol buffer. + super().__setattr__("_pb", self._meta.pb(**params)) + + def _get_pb_type_from_key(self, key): + """Given a key, return the corresponding pb_type. + + Args: + key(str): The name of the field. + + Returns: + A tuple containing a key and pb_type. The pb_type will be + the composite type of the field, or the primitive type if a primitive. + If no corresponding field exists, return None. + """ + + pb_type = None + + try: + pb_type = self._meta.fields[key].pb_type + except KeyError: + # Underscores may be appended to field names + # that collide with python or proto-plus keywords. + # In case a key only exists with a `_` suffix, coerce the key + # to include the `_` suffix. It's not possible to + # natively define the same field with a trailing underscore in protobuf. + # See related issue + # https://github.com/googleapis/python-api-core/issues/227 + if f"{key}_" in self._meta.fields: + key = f"{key}_" + pb_type = self._meta.fields[key].pb_type + + return (key, pb_type) + + def __dir__(self): + desc = type(self).pb().DESCRIPTOR + names = {f_name for f_name in self._meta.fields.keys()} + names.update(m.name for m in desc.nested_types) + names.update(e.name for e in desc.enum_types) + names.update(dir(object())) + # Can't think of a better way of determining + # the special methods than manually listing them. 
+ names.update( + ( + "__bool__", + "__contains__", + "__dict__", + "__getattr__", + "__getstate__", + "__module__", + "__setstate__", + "__weakref__", + ) + ) + + return names + + def __bool__(self): + """Return True if any field is truthy, False otherwise.""" + return any(k in self and getattr(self, k) for k in self._meta.fields.keys()) + + def __contains__(self, key): + """Return True if this field was set to something non-zero on the wire. + + In most cases, this method will return True when ``__getattr__`` + would return a truthy value and False when it would return a falsy + value, so explicitly calling this is not useful. + + The exception case is empty messages explicitly set on the wire, + which are falsy from ``__getattr__``. This method allows to + distinguish between an explicitly provided empty message and the + absence of that message, which is useful in some edge cases. + + The most common edge case is the use of ``google.protobuf.BoolValue`` + to get a boolean that distinguishes between ``False`` and ``None`` + (or the same for a string, int, etc.). This library transparently + handles that case for you, but this method remains available to + accommodate cases not automatically covered. + + Args: + key (str): The name of the field. + + Returns: + bool: Whether the field's value corresponds to a non-empty + wire serialization. + """ + pb_value = getattr(self._pb, key) + try: + # Protocol buffers "HasField" is unfriendly; it only works + # against composite, non-repeated fields, and raises ValueError + # against any repeated field or primitive. + # + # There is no good way to test whether it is valid to provide + # a field to this method, so sadly we are stuck with a + # somewhat inefficient try/except. + return self._pb.HasField(key) + except ValueError: + return bool(pb_value) + + def __delattr__(self, key): + """Delete the value on the given field. + + This is generally equivalent to setting a falsy value. 
+ """ + self._pb.ClearField(key) + + def __eq__(self, other): + """Return True if the messages are equal, False otherwise.""" + # If these are the same type, use internal protobuf's equality check. + if isinstance(other, type(self)): + return self._pb == other._pb + + # If the other type is the target protobuf object, honor that also. + if isinstance(other, self._meta.pb): + return self._pb == other + + # Ask the other object. + return NotImplemented + + def __getattr__(self, key): + """Retrieve the given field's value. + + In protocol buffers, the presence of a field on a message is + sufficient for it to always be "present". + + For primitives, a value of the correct type will always be returned + (the "falsy" values in protocol buffers consistently match those + in Python). For repeated fields, the falsy value is always an empty + sequence. + + For messages, protocol buffers does distinguish between an empty + message and absence, but this distinction is subtle and rarely + relevant. Therefore, this method always returns an empty message + (following the official implementation). To check for message + presence, use ``key in self`` (in other words, ``__contains__``). + + .. note:: + + Some well-known protocol buffer types + (e.g. ``google.protobuf.Timestamp``) will be converted to + their Python equivalents. See the ``marshal`` module for + more details. + """ + (key, pb_type) = self._get_pb_type_from_key(key) + if pb_type is None: + raise AttributeError( + "Unknown field for {}: {}".format(self.__class__.__name__, key) + ) + pb_value = getattr(self._pb, key) + marshal = self._meta.marshal + return marshal.to_python(pb_type, pb_value, absent=key not in self) + + def __ne__(self, other): + """Return True if the messages are unequal, False otherwise.""" + return not self == other + + def __repr__(self): + return repr(self._pb) + + def __setattr__(self, key, value): + """Set the value on the given field. 
+ + For well-known protocol buffer types which are marshalled, either + the protocol buffer object or the Python equivalent is accepted. + """ + if key[0] == "_": + return super().__setattr__(key, value) + marshal = self._meta.marshal + (key, pb_type) = self._get_pb_type_from_key(key) + if pb_type is None: + raise AttributeError( + "Unknown field for {}: {}".format(self.__class__.__name__, key) + ) + + pb_value = marshal.to_proto(pb_type, value) + + # Clear the existing field. + # This is the only way to successfully write nested falsy values, + # because otherwise MergeFrom will no-op on them. + self._pb.ClearField(key) + + # Merge in the value being set. + if pb_value is not None: + self._pb.MergeFrom(self._meta.pb(**{key: pb_value})) + + def __getstate__(self): + """Serialize for pickling.""" + return self._pb.SerializeToString() + + def __setstate__(self, value): + """Deserialization for pickling.""" + new_pb = self._meta.pb().FromString(value) + super().__setattr__("_pb", new_pb) + + +class _MessageInfo: + """Metadata about a message. + + Args: + fields (Tuple[~.fields.Field]): The fields declared on the message. + package (str): The proto package. + full_name (str): The full name of the message. + file_info (~._FileInfo): The file descriptor and messages for the + file containing this message. + marshal (~.Marshal): The marshal instance to which this message was + automatically registered. + options (~.descriptor_pb2.MessageOptions): Any options that were + set on the message. 
+ """ + + def __init__( + self, + *, + fields: List[Field], + package: str, + full_name: str, + marshal: Marshal, + options: descriptor_pb2.MessageOptions, + ) -> None: + self.package = package + self.full_name = full_name + self.options = options + self.fields = collections.OrderedDict((i.name, i) for i in fields) + self.fields_by_number = collections.OrderedDict((i.number, i) for i in fields) + self.marshal = marshal + self._pb = None + + @property + def pb(self) -> Type[message.Message]: + """Return the protobuf message type for this descriptor. + + If a field on the message references another message which has not + loaded, then this method returns None. + """ + return self._pb + + +__all__ = ("Message",) diff --git a/.venv/lib/python3.11/site-packages/proto/modules.py b/.venv/lib/python3.11/site-packages/proto/modules.py new file mode 100644 index 0000000000000000000000000000000000000000..45864a937c589242e12a694263e9e1e2f167c911 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/proto/modules.py @@ -0,0 +1,50 @@ +# Copyright 2019 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from typing import Set +import collections + + +_ProtoModule = collections.namedtuple( + "ProtoModule", + ["package", "marshal", "manifest"], +) + + +def define_module( + *, package: str, marshal: str = None, manifest: Set[str] = frozenset() +) -> _ProtoModule: + """Define a protocol buffers module. 
+ + The settings defined here are used for all protobuf messages + declared in the module of the given name. + + Args: + package (str): The proto package name. + marshal (str): The name of the marshal to use. It is recommended + to use one marshal per Python library (e.g. package on PyPI). + manifest (Set[str]): A set of messages and enums to be created. Setting + this adds a slight efficiency in piecing together proto + descriptors under the hood. + """ + if not marshal: + marshal = package + return _ProtoModule( + package=package, + marshal=marshal, + manifest=frozenset(manifest), + ) + + +__all__ = ("define_module",) diff --git a/.venv/lib/python3.11/site-packages/proto/version.py b/.venv/lib/python3.11/site-packages/proto/version.py new file mode 100644 index 0000000000000000000000000000000000000000..ab2f7680f17272c7799b27455fa289f9b7d1cc48 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/proto/version.py @@ -0,0 +1,15 @@ +# Copyright 2023 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# +__version__ = "1.25.0" diff --git a/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/INSTALLER b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/INSTALLER new file mode 100644 index 0000000000000000000000000000000000000000..a1b589e38a32041e49332e5e81c2d363dc418d68 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/INSTALLER @@ -0,0 +1 @@ +pip diff --git a/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/METADATA b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/METADATA new file mode 100644 index 0000000000000000000000000000000000000000..c7890cc80428fa948d8313338992427b09f4ef91 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/METADATA @@ -0,0 +1,151 @@ +Metadata-Version: 2.4 +Name: watchfiles +Version: 1.0.4 +Classifier: Development Status :: 5 - Production/Stable +Classifier: Environment :: Console +Classifier: Programming Language :: Python +Classifier: Programming Language :: Python :: 3 +Classifier: Programming Language :: Python :: 3 :: Only +Classifier: Programming Language :: Python :: 3.9 +Classifier: Programming Language :: Python :: 3.10 +Classifier: Programming Language :: Python :: 3.11 +Classifier: Programming Language :: Python :: 3.12 +Classifier: Programming Language :: Python :: 3.13 +Classifier: Intended Audience :: Developers +Classifier: Intended Audience :: Information Technology +Classifier: Intended Audience :: System Administrators +Classifier: License :: OSI Approved :: MIT License +Classifier: Operating System :: POSIX :: Linux +Classifier: Operating System :: Microsoft :: Windows +Classifier: Operating System :: MacOS +Classifier: Environment :: MacOS X +Classifier: Topic :: Software Development :: Libraries :: Python Modules +Classifier: Topic :: System :: Filesystems +Classifier: Framework :: AnyIO +Requires-Dist: anyio >=3.0.0 +License-File: LICENSE +Summary: Simple, modern and high performance file watching and 
code reload in python.
+Home-Page: https://github.com/samuelcolvin/watchfiles
+Author-email: Samuel Colvin
+License: MIT
+Requires-Python: >=3.9
+Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
+Project-URL: Homepage, https://github.com/samuelcolvin/watchfiles
+Project-URL: Documentation, https://watchfiles.helpmanual.io
+Project-URL: Funding, https://github.com/sponsors/samuelcolvin
+Project-URL: Source, https://github.com/samuelcolvin/watchfiles
+Project-URL: Changelog, https://github.com/samuelcolvin/watchfiles/releases
+
+# watchfiles
+
+[![CI](https://github.com/samuelcolvin/watchfiles/actions/workflows/ci.yml/badge.svg)](https://github.com/samuelcolvin/watchfiles/actions/workflows/ci.yml?query=branch%3Amain)
+[![Coverage](https://codecov.io/gh/samuelcolvin/watchfiles/branch/main/graph/badge.svg)](https://codecov.io/gh/samuelcolvin/watchfiles)
+[![pypi](https://img.shields.io/pypi/v/watchfiles.svg)](https://pypi.python.org/pypi/watchfiles)
+[![CondaForge](https://img.shields.io/conda/v/conda-forge/watchfiles.svg)](https://anaconda.org/conda-forge/watchfiles)
+[![license](https://img.shields.io/github/license/samuelcolvin/watchfiles.svg)](https://github.com/samuelcolvin/watchfiles/blob/main/LICENSE)
+
+Simple, modern and high performance file watching and code reload in python.
+
+---
+
+**Documentation**: [watchfiles.helpmanual.io](https://watchfiles.helpmanual.io)
+
+**Source Code**: [github.com/samuelcolvin/watchfiles](https://github.com/samuelcolvin/watchfiles)
+
+---
+
+Underlying file system notifications are handled by the [Notify](https://github.com/notify-rs/notify) rust library.
+
+This package was previously named "watchgod",
+see [the migration guide](https://watchfiles.helpmanual.io/migrating/) for more information.
+
+## Installation
+
+**watchfiles** requires Python 3.9 - 3.13.
+ +```bash +pip install watchfiles +``` + +Binaries are available for: + +* **Linux**: `x86_64`, `aarch64`, `i686`, `armv7l`, `musl-x86_64` & `musl-aarch64` +* **MacOS**: `x86_64` & `arm64` +* **Windows**: `amd64` & `win32` + +Otherwise, you can install from source which requires Rust stable to be installed. + +## Usage + +Here are some examples of what **watchfiles** can do: + +### `watch` Usage + +```py +from watchfiles import watch + +for changes in watch('./path/to/dir'): + print(changes) +``` +See [`watch` docs](https://watchfiles.helpmanual.io/api/watch/#watchfiles.watch) for more details. + +### `awatch` Usage + +```py +import asyncio +from watchfiles import awatch + +async def main(): + async for changes in awatch('/path/to/dir'): + print(changes) + +asyncio.run(main()) +``` +See [`awatch` docs](https://watchfiles.helpmanual.io/api/watch/#watchfiles.awatch) for more details. + +### `run_process` Usage + +```py +from watchfiles import run_process + +def foobar(a, b, c): + ... + +if __name__ == '__main__': + run_process('./path/to/dir', target=foobar, args=(1, 2, 3)) +``` +See [`run_process` docs](https://watchfiles.helpmanual.io/api/run_process/#watchfiles.run_process) for more details. + +### `arun_process` Usage + +```py +import asyncio +from watchfiles import arun_process + +def foobar(a, b, c): + ... + +async def main(): + await arun_process('./path/to/dir', target=foobar, args=(1, 2, 3)) + +if __name__ == '__main__': + asyncio.run(main()) +``` +See [`arun_process` docs](https://watchfiles.helpmanual.io/api/run_process/#watchfiles.arun_process) for more details. + +## CLI + +**watchfiles** also comes with a CLI for running and reloading code. To run `some command` when files in `src` change: + +``` +watchfiles "some command" src +``` + +For more information, see [the CLI docs](https://watchfiles.helpmanual.io/cli/). 
+ +Or run + +```bash +watchfiles --help +``` + diff --git a/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/RECORD b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/RECORD new file mode 100644 index 0000000000000000000000000000000000000000..0d180c4709be1366f3e4d84ec21ffa91ae62baa0 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/RECORD @@ -0,0 +1,24 @@ +../../../bin/watchfiles,sha256=T_4bwF1wE9wC5AbHCCV8GQWcdcY2Si1Yp8piQh5a01I,229 +watchfiles-1.0.4.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 +watchfiles-1.0.4.dist-info/METADATA,sha256=xih5WSXQ5a802bYCaoAoc5NIb0r5z_1z6OifORk_Y38,4863 +watchfiles-1.0.4.dist-info/RECORD,, +watchfiles-1.0.4.dist-info/WHEEL,sha256=eKn-h6LbuPin9BQdctwIkEq1OLRtDcdOVrhrYyXn53g,129 +watchfiles-1.0.4.dist-info/entry_points.txt,sha256=s1Dpa2d_KKBy-jKREWW60Z3GoRZ3JpCEo_9iYDt6hOQ,48 +watchfiles-1.0.4.dist-info/licenses/LICENSE,sha256=Nrb5inpC3jnhTxxutZgxzblMwRsF7q0xyB-4-FHRdQs,1110 +watchfiles/__init__.py,sha256=IRlM9KOSedMzF1fvLr7yEHPVS-UFERNThlB-tmWI8yU,364 +watchfiles/__main__.py,sha256=JgErYkiskih8Y6oRwowALtR-rwQhAAdqOYWjQraRIPI,59 +watchfiles/__pycache__/__init__.cpython-311.pyc,, +watchfiles/__pycache__/__main__.cpython-311.pyc,, +watchfiles/__pycache__/cli.cpython-311.pyc,, +watchfiles/__pycache__/filters.cpython-311.pyc,, +watchfiles/__pycache__/main.cpython-311.pyc,, +watchfiles/__pycache__/run.cpython-311.pyc,, +watchfiles/__pycache__/version.cpython-311.pyc,, +watchfiles/_rust_notify.cpython-311-x86_64-linux-gnu.so,sha256=_6zVrqjIAbFt5Y-yICpd-UcfyPraMCyOceYTBZyetC0,1091064 +watchfiles/_rust_notify.pyi,sha256=q5FQkXgBJEFPt9RCf7my4wP5RM1FwSVpqf221csyebg,4753 +watchfiles/cli.py,sha256=DHMI0LfT7hOrWai_Y4RP_vvTvVdtcDaioixXLiv2pG4,7707 +watchfiles/filters.py,sha256=U0zXGOeg9dMHkT51-56BKpRrWIu95lPq0HDR_ZB4oDE,5139 +watchfiles/main.py,sha256=-pbJBFBA34VEXMt8VGcaPTQHAjsGhPf7Psu1gP_HnKk,15235 
+watchfiles/py.typed,sha256=MS4Na3to9VTGPy_8wBQM_6mNKaX4qIpi5-w7_LZB-8I,69 +watchfiles/run.py,sha256=TLXb2y_xYx-t3xyszVQWHoGyG7RCb107Q0NoIcSWmjQ,15348 +watchfiles/version.py,sha256=NRWUnkZ32DamsNKV20EetagIGTLDMMUnqDWVGFFA2WQ,85 diff --git a/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/WHEEL b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/WHEEL new file mode 100644 index 0000000000000000000000000000000000000000..92e13e7e517d7051eefe3e816f4f4d108f5cfb88 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/WHEEL @@ -0,0 +1,4 @@ +Wheel-Version: 1.0 +Generator: maturin (1.8.1) +Root-Is-Purelib: false +Tag: cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64 diff --git a/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/entry_points.txt b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/entry_points.txt new file mode 100644 index 0000000000000000000000000000000000000000..51642969b76b6d4a8c0e9437a0ddae58772e835b --- /dev/null +++ b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/entry_points.txt @@ -0,0 +1,2 @@ +[console_scripts] +watchfiles=watchfiles.cli:cli diff --git a/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/licenses/LICENSE b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/licenses/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..08c9a8d3da5201dab55465b39ca855c978d823b1 --- /dev/null +++ b/.venv/lib/python3.11/site-packages/watchfiles-1.0.4.dist-info/licenses/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2017, 2018, 2019, 2020, 2021, 2022 Samuel Colvin + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, 
and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE.