prasb committed
Commit c2b1c6d · verified · 1 Parent(s): e9bd6b7

Add files using upload-large-folder tool

This view is limited to 50 files because it contains too many changes.

Files changed (50)
  1. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/conda_package_streaming/__pycache__/extract.cpython-38.pyc +0 -0
  2. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/conda_package_streaming/__pycache__/s3.cpython-38.pyc +0 -0
  3. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/conda_package_streaming/__pycache__/transmute.cpython-38.pyc +0 -0
  4. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/__init__.cpython-38.pyc +0 -0
  5. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/_einmix.cpython-38.pyc +0 -0
  6. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/chainer.cpython-38.pyc +0 -0
  7. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/gluon.cpython-38.pyc +0 -0
  8. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/tensorflow.cpython-38.pyc +0 -0
  9. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/torch.cpython-38.pyc +0 -0
  10. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/keras.py +9 -0
  11. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/__init__.py +29 -0
  12. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/_default.py +616 -0
  13. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/_default_async.py +284 -0
  14. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/_jwt_async.py +164 -0
  15. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/_oauth2client.py +169 -0
  16. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/aws.py +778 -0
  17. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/credentials.py +362 -0
  18. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/downscoped.py +501 -0
  19. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/environment_vars.py +80 -0
  20. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/exceptions.py +63 -0
  21. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/external_account.py +470 -0
  22. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/iam.py +100 -0
  23. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/identity_pool.py +287 -0
  24. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/impersonated_credentials.py +436 -0
  25. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/pluggable.py +322 -0
  26. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/__init__.py +33 -0
  27. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/any_pb2.py +26 -0
  28. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/api_pb2.py +32 -0
  29. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/descriptor_database.py +177 -0
  30. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/descriptor_pb2.py +0 -0
  31. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/duration_pb2.py +26 -0
  32. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/json_format.py +912 -0
  33. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/message.py +424 -0
  34. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/proto_builder.py +134 -0
  35. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/reflection.py +95 -0
  36. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/source_context_pb2.py +26 -0
  37. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/text_encoding.py +110 -0
  38. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/timestamp_pb2.py +26 -0
  39. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/type_pb2.py +42 -0
  40. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/wrappers_pb2.py +42 -0
  41. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_utils/__init__.py +0 -0
  42. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/__pycache__/__init__.cpython-38.pyc +0 -0
  43. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/__init__.py +0 -0
  44. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/__pycache__/__init__.cpython-38.pyc +0 -0
  45. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__init__.py +0 -0
  46. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/get_model_metadata_pb2.cpython-38.pyc +0 -0
  47. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/inference_pb2.cpython-38.pyc +0 -0
  48. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/model_management_pb2.cpython-38.pyc +0 -0
  49. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/model_pb2.cpython-38.pyc +0 -0
  50. my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/model_service_pb2.cpython-38.pyc +0 -0
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/conda_package_streaming/__pycache__/extract.cpython-38.pyc ADDED
Binary file (2.38 kB)

my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/conda_package_streaming/__pycache__/s3.cpython-38.pyc ADDED
Binary file (3.09 kB)

my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/conda_package_streaming/__pycache__/transmute.cpython-38.pyc ADDED
Binary file (4.55 kB)

my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/__init__.cpython-38.pyc ADDED
Binary file (3.28 kB)

my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/_einmix.cpython-38.pyc ADDED
Binary file (7.49 kB)

my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/chainer.cpython-38.pyc ADDED
Binary file (2.12 kB)

my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/gluon.cpython-38.pyc ADDED
Binary file (2.23 kB)

my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/tensorflow.cpython-38.pyc ADDED
Binary file (3.81 kB)

my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/__pycache__/torch.cpython-38.pyc ADDED
Binary file (2.48 kB)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/einops/layers/keras.py ADDED
@@ -0,0 +1,9 @@
+ __author__ = 'Alex Rogozhnikov'
+
+ from ..layers.tensorflow import Rearrange, Reduce, EinMix
+
+ keras_custom_objects = {
+     Rearrange.__name__: Rearrange,
+     Reduce.__name__: Reduce,
+     EinMix.__name__: EinMix,
+ }
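For context on how this mapping gets used: `keras_custom_objects` is intended for Keras model deserialization (e.g. the `custom_objects` argument of `tf.keras.models.load_model`), so that saved models containing einops layers can be rebuilt by class name. A minimal sketch of the same name-to-class registry pattern, using a hypothetical stand-in class (`PatternLayer` is not an einops or Keras API) so it runs without TensorFlow:

```python
# Register a class under its own __name__ so a loader that stored only the
# class name can look the class back up -- the same shape as
# keras_custom_objects = {Rearrange.__name__: Rearrange, ...}.
class PatternLayer:
    """Hypothetical stand-in for einops' Rearrange/Reduce/EinMix layers."""
    def __init__(self, pattern):
        self.pattern = pattern

custom_objects = {PatternLayer.__name__: PatternLayer}

# A deserializer holding the serialized name "PatternLayer" rebuilds the layer:
cls = custom_objects["PatternLayer"]
layer = cls("b h w c -> b c h w")
print(layer.pattern)  # b h w c -> b c h w
```

Keying the dict on `__name__` rather than a hand-written string keeps the registry in sync if a class is renamed.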
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/__init__.py ADDED
@@ -0,0 +1,29 @@
+ # Copyright 2016 Google LLC
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ #     http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+
+ """Google Auth Library for Python."""
+
+ import logging
+
+ from google.auth import version as google_auth_version
+ from google.auth._default import default, load_credentials_from_file
+
+
+ __version__ = google_auth_version.__version__
+
+
+ __all__ = ["default", "load_credentials_from_file"]
+
+ # Set default logging handler to avoid "No handler found" warnings.
+ logging.getLogger(__name__).addHandler(logging.NullHandler())
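The package `__init__` above re-exports `default` and `load_credentials_from_file` from `_default.py`; typical use is `credentials, project_id = google.auth.default()`. As a rough, dependency-free sketch of the search order `default()` implements (explicit env var, then the gcloud well-known file, then runtime-provided credentials), with illustrative names that are not the library's real internals:

```python
def resolve_adc_source(environ, gcloud_file_exists=False, on_gce=False):
    """Illustrative Application Default Credentials search order.

    Returns a label for where credentials would come from, or None
    (where the real library raises DefaultCredentialsError).
    """
    explicit = environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if explicit:
        return "explicit:" + explicit   # step 1: explicit credentials file
    if gcloud_file_exists:
        return "gcloud-adc"             # step 2: Cloud SDK well-known file
    if on_gce:
        return "metadata-server"        # steps 3-4: GAE / GCE metadata
    return None

print(resolve_adc_source({"GOOGLE_APPLICATION_CREDENTIALS": "/tmp/key.json"}))
# → explicit:/tmp/key.json
```

The fallback chain matches the ordering documented in the `default()` docstring of `_default.py` below.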
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/_default.py ADDED
@@ -0,0 +1,616 @@
+ # Copyright 2015 Google Inc.
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ #     http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+
+ """Application default credentials.
+
+ Implements application default credentials and project ID detection.
+ """
+
+ import io
+ import json
+ import logging
+ import os
+ import warnings
+
+ import six
+
+ from google.auth import environment_vars
+ from google.auth import exceptions
+ import google.auth.transport._http_client
+
+ _LOGGER = logging.getLogger(__name__)
+
+ # Valid types accepted for file-based credentials.
+ _AUTHORIZED_USER_TYPE = "authorized_user"
+ _SERVICE_ACCOUNT_TYPE = "service_account"
+ _EXTERNAL_ACCOUNT_TYPE = "external_account"
+ _IMPERSONATED_SERVICE_ACCOUNT_TYPE = "impersonated_service_account"
+ _GDCH_SERVICE_ACCOUNT_TYPE = "gdch_service_account"
+ _VALID_TYPES = (
+     _AUTHORIZED_USER_TYPE,
+     _SERVICE_ACCOUNT_TYPE,
+     _EXTERNAL_ACCOUNT_TYPE,
+     _IMPERSONATED_SERVICE_ACCOUNT_TYPE,
+     _GDCH_SERVICE_ACCOUNT_TYPE,
+ )
+
+ # Help message when no credentials can be found.
+ _HELP_MESSAGE = """\
+ Could not automatically determine credentials. Please set {env} or \
+ explicitly create credentials and re-run the application. For more \
+ information, please see \
+ https://cloud.google.com/docs/authentication/getting-started
+ """.format(
+     env=environment_vars.CREDENTIALS
+ ).strip()
+
+ # Warning when using Cloud SDK user credentials
+ _CLOUD_SDK_CREDENTIALS_WARNING = """\
+ Your application has authenticated using end user credentials from Google \
+ Cloud SDK without a quota project. You might receive a "quota exceeded" \
+ or "API not enabled" error. We recommend you rerun \
+ `gcloud auth application-default login` and make sure a quota project is \
+ added. Or you can use service accounts instead. For more information \
+ about service accounts, see https://cloud.google.com/docs/authentication/"""
+
+ # The subject token type used for AWS external_account credentials.
+ _AWS_SUBJECT_TOKEN_TYPE = "urn:ietf:params:aws:token-type:aws4_request"
+
+
+ def _warn_about_problematic_credentials(credentials):
+     """Determines if the credentials are problematic.
+
+     Credentials from the Cloud SDK that are associated with Cloud SDK's project
+     are problematic because they may not have APIs enabled and have limited
+     quota. If this is the case, warn about it.
+     """
+     from google.auth import _cloud_sdk
+
+     if credentials.client_id == _cloud_sdk.CLOUD_SDK_CLIENT_ID:
+         warnings.warn(_CLOUD_SDK_CREDENTIALS_WARNING)
+
+
+ def load_credentials_from_file(
+     filename, scopes=None, default_scopes=None, quota_project_id=None, request=None
+ ):
+     """Loads Google credentials from a file.
+
+     The credentials file must be a service account key, stored authorized
+     user credentials, external account credentials, or impersonated service
+     account credentials.
+
+     Args:
+         filename (str): The full path to the credentials file.
+         scopes (Optional[Sequence[str]]): The list of scopes for the credentials. If
+             specified, the credentials will automatically be scoped if
+             necessary
+         default_scopes (Optional[Sequence[str]]): Default scopes passed by a
+             Google client library. Use 'scopes' for user-defined scopes.
+         quota_project_id (Optional[str]): The project ID used for
+             quota and billing.
+         request (Optional[google.auth.transport.Request]): An object used to make
+             HTTP requests. This is used to determine the associated project ID
+             for a workload identity pool resource (external account credentials).
+             If not specified, then it will use a
+             google.auth.transport.requests.Request client to make requests.
+
+     Returns:
+         Tuple[google.auth.credentials.Credentials, Optional[str]]: Loaded
+             credentials and the project ID. Authorized user credentials do not
+             have the project ID information. External account credentials project
+             IDs may not always be determined.
+
+     Raises:
+         google.auth.exceptions.DefaultCredentialsError: if the file is in the
+             wrong format or is missing.
+     """
+     if not os.path.exists(filename):
+         raise exceptions.DefaultCredentialsError(
+             "File {} was not found.".format(filename)
+         )
+
+     with io.open(filename, "r") as file_obj:
+         try:
+             info = json.load(file_obj)
+         except ValueError as caught_exc:
+             new_exc = exceptions.DefaultCredentialsError(
+                 "File {} is not a valid json file.".format(filename), caught_exc
+             )
+             six.raise_from(new_exc, caught_exc)
+     return _load_credentials_from_info(
+         filename, info, scopes, default_scopes, quota_project_id, request
+     )
+
+
+ def _load_credentials_from_info(
+     filename, info, scopes, default_scopes, quota_project_id, request
+ ):
+     from google.auth.credentials import CredentialsWithQuotaProject
+
+     credential_type = info.get("type")
+
+     if credential_type == _AUTHORIZED_USER_TYPE:
+         credentials, project_id = _get_authorized_user_credentials(
+             filename, info, scopes
+         )
+
+     elif credential_type == _SERVICE_ACCOUNT_TYPE:
+         credentials, project_id = _get_service_account_credentials(
+             filename, info, scopes, default_scopes
+         )
+
+     elif credential_type == _EXTERNAL_ACCOUNT_TYPE:
+         credentials, project_id = _get_external_account_credentials(
+             info,
+             filename,
+             scopes=scopes,
+             default_scopes=default_scopes,
+             request=request,
+         )
+     elif credential_type == _IMPERSONATED_SERVICE_ACCOUNT_TYPE:
+         credentials, project_id = _get_impersonated_service_account_credentials(
+             filename, info, scopes
+         )
+     elif credential_type == _GDCH_SERVICE_ACCOUNT_TYPE:
+         credentials, project_id = _get_gdch_service_account_credentials(filename, info)
+     else:
+         raise exceptions.DefaultCredentialsError(
+             "The file {file} does not have a valid type. "
+             "Type is {type}, expected one of {valid_types}.".format(
+                 file=filename, type=credential_type, valid_types=_VALID_TYPES
+             )
+         )
+     if isinstance(credentials, CredentialsWithQuotaProject):
+         credentials = _apply_quota_project_id(credentials, quota_project_id)
+     return credentials, project_id
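`_load_credentials_from_info` above is a straight dispatch on the JSON file's top-level `"type"` field. A stdlib-only sketch of that dispatch shape (the handler stubs and return values here are illustrative, not the library's):

```python
import json

# Map each credential "type" string to a handler, mirroring the if/elif
# chain in _load_credentials_from_info. Handlers are illustrative stubs.
_HANDLERS = {
    "authorized_user": lambda info: ("user-credentials", None),
    "service_account": lambda info: ("sa-credentials", info.get("project_id")),
}

def load_info(raw):
    info = json.loads(raw)
    cred_type = info.get("type")
    try:
        handler = _HANDLERS[cred_type]
    except KeyError:
        # The real code raises DefaultCredentialsError with the same message shape.
        raise ValueError(
            "Type is {}, expected one of {}.".format(cred_type, sorted(_HANDLERS))
        )
    return handler(info)

print(load_info('{"type": "service_account", "project_id": "demo-proj"}'))
# → ('sa-credentials', 'demo-proj')
```

A dict of handlers and an if/elif chain are equivalent here; the library uses the chain because several branches take different argument lists.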
+
+
+ def _get_gcloud_sdk_credentials(quota_project_id=None):
+     """Gets the credentials and project ID from the Cloud SDK."""
+     from google.auth import _cloud_sdk
+
+     _LOGGER.debug("Checking Cloud SDK credentials as part of auth process...")
+
+     # Check if application default credentials exist.
+     credentials_filename = _cloud_sdk.get_application_default_credentials_path()
+
+     if not os.path.isfile(credentials_filename):
+         _LOGGER.debug("Cloud SDK credentials not found on disk; not using them")
+         return None, None
+
+     credentials, project_id = load_credentials_from_file(
+         credentials_filename, quota_project_id=quota_project_id
+     )
+
+     if not project_id:
+         project_id = _cloud_sdk.get_project_id()
+
+     return credentials, project_id
+
+
+ def _get_explicit_environ_credentials(quota_project_id=None):
+     """Gets credentials from the GOOGLE_APPLICATION_CREDENTIALS environment
+     variable."""
+     from google.auth import _cloud_sdk
+
+     cloud_sdk_adc_path = _cloud_sdk.get_application_default_credentials_path()
+     explicit_file = os.environ.get(environment_vars.CREDENTIALS)
+
+     _LOGGER.debug(
+         "Checking %s for explicit credentials as part of auth process...", explicit_file
+     )
+
+     if explicit_file is not None and explicit_file == cloud_sdk_adc_path:
+         # Cloud sdk flow calls gcloud to fetch project id, so if the explicit
+         # file path is cloud sdk credentials path, then we should fall back
+         # to cloud sdk flow, otherwise project id cannot be obtained.
+         _LOGGER.debug(
+             "Explicit credentials path %s is the same as Cloud SDK credentials path, fall back to Cloud SDK credentials flow...",
+             explicit_file,
+         )
+         return _get_gcloud_sdk_credentials(quota_project_id=quota_project_id)
+
+     if explicit_file is not None:
+         credentials, project_id = load_credentials_from_file(
+             os.environ[environment_vars.CREDENTIALS], quota_project_id=quota_project_id
+         )
+
+         return credentials, project_id
+
+     else:
+         return None, None
+
+
+ def _get_gae_credentials():
+     """Gets Google App Engine App Identity credentials and project ID."""
+     # If not GAE gen1, prefer the metadata service even if the GAE APIs are
+     # available as per https://google.aip.dev/auth/4115.
+     if os.environ.get(environment_vars.LEGACY_APPENGINE_RUNTIME) != "python27":
+         return None, None
+
+     # While this library is normally bundled with app_engine, there are
+     # some cases where it's not available, so we tolerate ImportError.
+     try:
+         _LOGGER.debug("Checking for App Engine runtime as part of auth process...")
+         import google.auth.app_engine as app_engine
+     except ImportError:
+         _LOGGER.warning("Import of App Engine auth library failed.")
+         return None, None
+
+     try:
+         credentials = app_engine.Credentials()
+         project_id = app_engine.get_project_id()
+         return credentials, project_id
+     except EnvironmentError:
+         _LOGGER.debug(
+             "No App Engine library was found so cannot authentication via App Engine Identity Credentials."
+         )
+         return None, None
+
+
+ def _get_gce_credentials(request=None):
+     """Gets credentials and project ID from the GCE Metadata Service."""
+     # Ping requires a transport, but we want application default credentials
+     # to require no arguments. So, we'll use the _http_client transport which
+     # uses http.client. This is only acceptable because the metadata server
+     # doesn't do SSL and never requires proxies.
+
+     # While this library is normally bundled with compute_engine, there are
+     # some cases where it's not available, so we tolerate ImportError.
+     try:
+         from google.auth import compute_engine
+         from google.auth.compute_engine import _metadata
+     except ImportError:
+         _LOGGER.warning("Import of Compute Engine auth library failed.")
+         return None, None
+
+     if request is None:
+         request = google.auth.transport._http_client.Request()
+
+     if _metadata.ping(request=request):
+         # Get the project ID.
+         try:
+             project_id = _metadata.get_project_id(request=request)
+         except exceptions.TransportError:
+             project_id = None
+
+         return compute_engine.Credentials(), project_id
+     else:
+         _LOGGER.warning(
+             "Authentication failed using Compute Engine authentication due to unavailable metadata server."
+         )
+         return None, None
+
+
+ def _get_external_account_credentials(
+     info, filename, scopes=None, default_scopes=None, request=None
+ ):
+     """Loads external account Credentials from the parsed external account info.
+
+     The credentials information must correspond to a supported external account
+     credentials.
+
+     Args:
+         info (Mapping[str, str]): The external account info in Google format.
+         filename (str): The full path to the credentials file.
+         scopes (Optional[Sequence[str]]): The list of scopes for the credentials. If
+             specified, the credentials will automatically be scoped if
+             necessary.
+         default_scopes (Optional[Sequence[str]]): Default scopes passed by a
+             Google client library. Use 'scopes' for user-defined scopes.
+         request (Optional[google.auth.transport.Request]): An object used to make
+             HTTP requests. This is used to determine the associated project ID
+             for a workload identity pool resource (external account credentials).
+             If not specified, then it will use a
+             google.auth.transport.requests.Request client to make requests.
+
+     Returns:
+         Tuple[google.auth.credentials.Credentials, Optional[str]]: Loaded
+             credentials and the project ID. External account credentials project
+             IDs may not always be determined.
+
+     Raises:
+         google.auth.exceptions.DefaultCredentialsError: if the info dictionary
+             is in the wrong format or is missing required information.
+     """
+     # There are currently 3 types of external_account credentials.
+     if info.get("subject_token_type") == _AWS_SUBJECT_TOKEN_TYPE:
+         # Check if configuration corresponds to an AWS credentials.
+         from google.auth import aws
+
+         credentials = aws.Credentials.from_info(
+             info, scopes=scopes, default_scopes=default_scopes
+         )
+     elif (
+         info.get("credential_source") is not None
+         and info.get("credential_source").get("executable") is not None
+     ):
+         from google.auth import pluggable
+
+         credentials = pluggable.Credentials.from_info(
+             info, scopes=scopes, default_scopes=default_scopes
+         )
+     else:
+         try:
+             # Check if configuration corresponds to an Identity Pool credentials.
+             from google.auth import identity_pool
+
+             credentials = identity_pool.Credentials.from_info(
+                 info, scopes=scopes, default_scopes=default_scopes
+             )
+         except ValueError:
+             # If the configuration is invalid or does not correspond to any
+             # supported external_account credentials, raise an error.
+             raise exceptions.DefaultCredentialsError(
+                 "Failed to load external account credentials from {}".format(filename)
+             )
+     if request is None:
+         import google.auth.transport.requests
+
+         request = google.auth.transport.requests.Request()
+
+     return credentials, credentials.get_project_id(request=request)
+
+
+ def _get_authorized_user_credentials(filename, info, scopes=None):
+     from google.oauth2 import credentials
+
+     try:
+         credentials = credentials.Credentials.from_authorized_user_info(
+             info, scopes=scopes
+         )
+     except ValueError as caught_exc:
+         msg = "Failed to load authorized user credentials from {}".format(filename)
+         new_exc = exceptions.DefaultCredentialsError(msg, caught_exc)
+         six.raise_from(new_exc, caught_exc)
+     return credentials, None
+
+
+ def _get_service_account_credentials(filename, info, scopes=None, default_scopes=None):
+     from google.oauth2 import service_account
+
+     try:
+         credentials = service_account.Credentials.from_service_account_info(
+             info, scopes=scopes, default_scopes=default_scopes
+         )
+     except ValueError as caught_exc:
+         msg = "Failed to load service account credentials from {}".format(filename)
+         new_exc = exceptions.DefaultCredentialsError(msg, caught_exc)
+         six.raise_from(new_exc, caught_exc)
+     return credentials, info.get("project_id")
+
+
+ def _get_impersonated_service_account_credentials(filename, info, scopes):
+     from google.auth import impersonated_credentials
+
+     try:
+         source_credentials_info = info.get("source_credentials")
+         source_credentials_type = source_credentials_info.get("type")
+         if source_credentials_type == _AUTHORIZED_USER_TYPE:
+             source_credentials, _ = _get_authorized_user_credentials(
+                 filename, source_credentials_info
+             )
+         elif source_credentials_type == _SERVICE_ACCOUNT_TYPE:
+             source_credentials, _ = _get_service_account_credentials(
+                 filename, source_credentials_info
+             )
+         else:
+             raise ValueError(
+                 "source credential of type {} is not supported.".format(
+                     source_credentials_type
+                 )
+             )
+         impersonation_url = info.get("service_account_impersonation_url")
+         start_index = impersonation_url.rfind("/")
+         end_index = impersonation_url.find(":generateAccessToken")
+         if start_index == -1 or end_index == -1 or start_index > end_index:
+             raise ValueError(
+                 "Cannot extract target principal from {}".format(impersonation_url)
+             )
+         target_principal = impersonation_url[start_index + 1 : end_index]
+         delegates = info.get("delegates")
+         quota_project_id = info.get("quota_project_id")
+         credentials = impersonated_credentials.Credentials(
+             source_credentials,
+             target_principal,
+             scopes,
+             delegates,
+             quota_project_id=quota_project_id,
+         )
+     except ValueError as caught_exc:
+         msg = "Failed to load impersonated service account credentials from {}".format(
+             filename
+         )
+         new_exc = exceptions.DefaultCredentialsError(msg, caught_exc)
+         six.raise_from(new_exc, caught_exc)
+     return credentials, None
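The target principal in the impersonation flow above is sliced out of `service_account_impersonation_url` between the last `/` and the `:generateAccessToken` suffix. That slice logic is self-contained enough to check directly (the URL below is an illustrative example, not from the source):

```python
def extract_target_principal(impersonation_url):
    """Replicates the rfind/find slice used in
    _get_impersonated_service_account_credentials."""
    start_index = impersonation_url.rfind("/")
    end_index = impersonation_url.find(":generateAccessToken")
    if start_index == -1 or end_index == -1 or start_index > end_index:
        raise ValueError(
            "Cannot extract target principal from {}".format(impersonation_url)
        )
    # Everything between the last "/" and ":generateAccessToken" is the
    # service account email being impersonated.
    return impersonation_url[start_index + 1 : end_index]

url = (
    "https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/"
    "sa@demo.iam.gserviceaccount.com:generateAccessToken"
)
print(extract_target_principal(url))  # sa@demo.iam.gserviceaccount.com
```

`find(":generateAccessToken")` matches the full suffix, so the `:` in `https://` cannot be picked up by mistake; the `start_index > end_index` guard rejects URLs where the markers appear out of order.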
+
+
+ def _get_gdch_service_account_credentials(filename, info):
+     from google.oauth2 import gdch_credentials
+
+     try:
+         credentials = gdch_credentials.ServiceAccountCredentials.from_service_account_info(
+             info
+         )
+     except ValueError as caught_exc:
+         msg = "Failed to load GDCH service account credentials from {}".format(filename)
+         new_exc = exceptions.DefaultCredentialsError(msg, caught_exc)
+         six.raise_from(new_exc, caught_exc)
+     return credentials, info.get("project")
+
+
+ def _apply_quota_project_id(credentials, quota_project_id):
+     if quota_project_id:
+         credentials = credentials.with_quota_project(quota_project_id)
+
+     from google.oauth2 import credentials as authorized_user_credentials
+
+     if isinstance(credentials, authorized_user_credentials.Credentials) and (
+         not credentials.quota_project_id
+     ):
+         _warn_about_problematic_credentials(credentials)
+     return credentials
+
+
+ def default(scopes=None, request=None, quota_project_id=None, default_scopes=None):
+     """Gets the default credentials for the current environment.
+
+     `Application Default Credentials`_ provides an easy way to obtain
+     credentials to call Google APIs for server-to-server or local applications.
+     This function acquires credentials from the environment in the following
+     order:
+
+     1. If the environment variable ``GOOGLE_APPLICATION_CREDENTIALS`` is set
+        to the path of a valid service account JSON private key file, then it is
+        loaded and returned. The project ID returned is the project ID defined
+        in the service account file if available (some older files do not
+        contain project ID information).
+
+        If the environment variable is set to the path of a valid external
+        account JSON configuration file (workload identity federation), then the
+        configuration file is used to determine and retrieve the external
+        credentials from the current environment (AWS, Azure, etc).
+        These will then be exchanged for Google access tokens via the Google STS
+        endpoint.
+        The project ID returned in this case is the one corresponding to the
+        underlying workload identity pool resource if determinable.
+
+        If the environment variable is set to the path of a valid GDCH service
+        account JSON file (`Google Distributed Cloud Hosted`_), then a GDCH
+        credential will be returned. The project ID returned is the project
+        specified in the JSON file.
+     2. If the `Google Cloud SDK`_ is installed and has application default
+        credentials set they are loaded and returned.
+
+        To enable application default credentials with the Cloud SDK run::
+
+            gcloud auth application-default login
+
+        If the Cloud SDK has an active project, the project ID is returned. The
+        active project can be set using::
+
+            gcloud config set project
+
+     3. If the application is running in the `App Engine standard environment`_
+        (first generation) then the credentials and project ID from the
+        `App Identity Service`_ are used.
+     4. If the application is running in `Compute Engine`_ or `Cloud Run`_ or
+        the `App Engine flexible environment`_ or the `App Engine standard
+        environment`_ (second generation) then the credentials and project ID
+        are obtained from the `Metadata Service`_.
+     5. If no credentials are found,
+        :class:`~google.auth.exceptions.DefaultCredentialsError` will be raised.
+
+     .. _Application Default Credentials: https://developers.google.com\
+             /identity/protocols/application-default-credentials
+     .. _Google Cloud SDK: https://cloud.google.com/sdk
+     .. _App Engine standard environment: https://cloud.google.com/appengine
+     .. _App Identity Service: https://cloud.google.com/appengine/docs/python\
+             /appidentity/
+     .. _Compute Engine: https://cloud.google.com/compute
+     .. _App Engine flexible environment: https://cloud.google.com\
+             /appengine/flexible
+     .. _Metadata Service: https://cloud.google.com/compute/docs\
+             /storing-retrieving-metadata
+     .. _Cloud Run: https://cloud.google.com/run
+     .. _Google Distributed Cloud Hosted: https://cloud.google.com/blog/topics\
+             /hybrid-cloud/announcing-google-distributed-cloud-edge-and-hosted
+
+     Example::
+
+         import google.auth
+
+         credentials, project_id = google.auth.default()
+
+     Args:
+         scopes (Sequence[str]): The list of scopes for the credentials. If
+             specified, the credentials will automatically be scoped if
+             necessary.
+         request (Optional[google.auth.transport.Request]): An object used to make
+             HTTP requests. This is used to either detect whether the application
+             is running on Compute Engine or to determine the associated project
+             ID for a workload identity pool resource (external account
+             credentials). If not specified, then it will either use the standard
+             library http client to make requests for Compute Engine credentials
+             or a google.auth.transport.requests.Request client for external
+             account credentials.
+         quota_project_id (Optional[str]): The project ID used for
+             quota and billing.
+         default_scopes (Optional[Sequence[str]]): Default scopes passed by a
+             Google client library. Use 'scopes' for user-defined scopes.
+     Returns:
+         Tuple[~google.auth.credentials.Credentials, Optional[str]]:
+             the current environment's credentials and project ID. Project ID
+             may be None, which indicates that the Project ID could not be
557
+ ascertained from the environment.
558
+
559
+ Raises:
560
+ ~google.auth.exceptions.DefaultCredentialsError:
561
+ If no credentials were found, or if the credentials found were
562
+ invalid.
563
+ """
564
+ from google.auth.credentials import with_scopes_if_required
565
+ from google.auth.credentials import CredentialsWithQuotaProject
566
+
567
+ explicit_project_id = os.environ.get(
568
+ environment_vars.PROJECT, os.environ.get(environment_vars.LEGACY_PROJECT)
569
+ )
570
+
571
+ checkers = (
572
+ # Avoid passing scopes here to prevent passing scopes to user credentials.
573
+ # with_scopes_if_required() below will ensure scopes/default scopes are
574
+ # safely set on the returned credentials since requires_scopes will
575
+ # guard against setting scopes on user credentials.
576
+ lambda: _get_explicit_environ_credentials(quota_project_id=quota_project_id),
577
+ lambda: _get_gcloud_sdk_credentials(quota_project_id=quota_project_id),
578
+ _get_gae_credentials,
579
+ lambda: _get_gce_credentials(request),
580
+ )
581
+
582
+ for checker in checkers:
583
+ credentials, project_id = checker()
584
+ if credentials is not None:
585
+ credentials = with_scopes_if_required(
586
+ credentials, scopes, default_scopes=default_scopes
587
+ )
588
+
589
+ # For external account credentials, scopes are required to determine
590
+ # the project ID. Try to get the project ID again if not yet
591
+ # determined.
592
+ if not project_id and callable(
593
+ getattr(credentials, "get_project_id", None)
594
+ ):
595
+ if request is None:
596
+ import google.auth.transport.requests
597
+
598
+ request = google.auth.transport.requests.Request()
599
+ project_id = credentials.get_project_id(request=request)
600
+
601
+ if quota_project_id and isinstance(
602
+ credentials, CredentialsWithQuotaProject
603
+ ):
604
+ credentials = credentials.with_quota_project(quota_project_id)
605
+
606
+ effective_project_id = explicit_project_id or project_id
607
+ if not effective_project_id:
608
+ _LOGGER.warning(
609
+ "No project ID could be determined. Consider running "
610
+ "`gcloud config set project` or setting the %s "
611
+ "environment variable",
612
+ environment_vars.PROJECT,
613
+ )
614
+ return credentials, effective_project_id
615
+
616
+ raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
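The checker-chain pattern that `default()` uses above (try each credential source in order, return the first hit, raise if none succeed) can be sketched in isolation. The checker return values below are illustrative stand-ins, not the real google-auth helpers:

```python
# Minimal sketch of the checker-chain pattern used by default() above:
# each checker returns (credentials, project_id) or (None, None), and the
# first non-None credentials wins. Checker payloads here are made up.

def resolve_default(checkers):
    """Return the first (credentials, project_id) pair whose credentials
    are not None; raise if every checker comes up empty."""
    for checker in checkers:
        credentials, project_id = checker()
        if credentials is not None:
            return credentials, project_id
    raise RuntimeError("Could not automatically determine credentials.")

checkers = (
    lambda: (None, None),                  # e.g. no explicit env credentials
    lambda: ("gcloud-adc", "my-project"),  # e.g. Cloud SDK ADC file found
    lambda: ("metadata", None),            # never reached: earlier hit wins
)
print(resolve_default(checkers))  # → ('gcloud-adc', 'my-project')
```

Note how short-circuiting on the first hit is what gives the documented precedence order (env var, then Cloud SDK, then App Engine, then metadata server).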
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/_default_async.py ADDED
@@ -0,0 +1,284 @@
1
+ # Copyright 2020 Google Inc.
2
+ #
3
+ # Licensed under the Apache License, Version 2.0 (the "License");
4
+ # you may not use this file except in compliance with the License.
5
+ # You may obtain a copy of the License at
6
+ #
7
+ # http://www.apache.org/licenses/LICENSE-2.0
8
+ #
9
+ # Unless required by applicable law or agreed to in writing, software
10
+ # distributed under the License is distributed on an "AS IS" BASIS,
11
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12
+ # See the License for the specific language governing permissions and
13
+ # limitations under the License.
14
+
15
+ """Application default credentials.
16
+
17
+ Implements application default credentials and project ID detection.
18
+ """
19
+
20
+ import io
21
+ import json
22
+ import os
23
+
24
+ import six
25
+
26
+ from google.auth import _default
27
+ from google.auth import environment_vars
28
+ from google.auth import exceptions
29
+
30
+
31
+ def load_credentials_from_file(filename, scopes=None, quota_project_id=None):
32
+ """Loads Google credentials from a file.
33
+
34
+ The credentials file must be a service account key or stored authorized
35
+ user credentials.
36
+
37
+ Args:
38
+ filename (str): The full path to the credentials file.
39
+ scopes (Optional[Sequence[str]]): The list of scopes for the credentials. If
40
+ specified, the credentials will automatically be scoped if
41
+ necessary.
42
+ quota_project_id (Optional[str]): The project ID used for
43
+ quota and billing.
44
+
45
+ Returns:
46
+ Tuple[google.auth.credentials.Credentials, Optional[str]]: Loaded
47
+ credentials and the project ID. Authorized user credentials do not
48
+ have the project ID information.
49
+
50
+ Raises:
51
+ google.auth.exceptions.DefaultCredentialsError: if the file is in the
52
+ wrong format or is missing.
53
+ """
54
+ if not os.path.exists(filename):
55
+ raise exceptions.DefaultCredentialsError(
56
+ "File {} was not found.".format(filename)
57
+ )
58
+
59
+ with io.open(filename, "r") as file_obj:
60
+ try:
61
+ info = json.load(file_obj)
62
+ except ValueError as caught_exc:
63
+ new_exc = exceptions.DefaultCredentialsError(
64
+ "File {} is not a valid json file.".format(filename), caught_exc
65
+ )
66
+ six.raise_from(new_exc, caught_exc)
67
+
68
+ # The type key should indicate that the file is either a service account
69
+ # credentials file or an authorized user credentials file.
70
+ credential_type = info.get("type")
71
+
72
+ if credential_type == _default._AUTHORIZED_USER_TYPE:
73
+ from google.oauth2 import _credentials_async as credentials
74
+
75
+ try:
76
+ credentials = credentials.Credentials.from_authorized_user_info(
77
+ info, scopes=scopes
78
+ )
79
+ except ValueError as caught_exc:
80
+ msg = "Failed to load authorized user credentials from {}".format(filename)
81
+ new_exc = exceptions.DefaultCredentialsError(msg, caught_exc)
82
+ six.raise_from(new_exc, caught_exc)
83
+ if quota_project_id:
84
+ credentials = credentials.with_quota_project(quota_project_id)
85
+ if not credentials.quota_project_id:
86
+ _default._warn_about_problematic_credentials(credentials)
87
+ return credentials, None
88
+
89
+ elif credential_type == _default._SERVICE_ACCOUNT_TYPE:
90
+ from google.oauth2 import _service_account_async as service_account
91
+
92
+ try:
93
+ credentials = service_account.Credentials.from_service_account_info(
94
+ info, scopes=scopes
95
+ ).with_quota_project(quota_project_id)
96
+ except ValueError as caught_exc:
97
+ msg = "Failed to load service account credentials from {}".format(filename)
98
+ new_exc = exceptions.DefaultCredentialsError(msg, caught_exc)
99
+ six.raise_from(new_exc, caught_exc)
100
+ return credentials, info.get("project_id")
101
+
102
+ else:
103
+ raise exceptions.DefaultCredentialsError(
104
+ "The file {file} does not have a valid type. "
105
+ "Type is {type}, expected one of {valid_types}.".format(
106
+ file=filename, type=credential_type, valid_types=_default._VALID_TYPES
107
+ )
108
+ )
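The dispatch above hinges on the JSON file's `"type"` key. A self-contained sketch of that classification step (mirroring `_default._AUTHORIZED_USER_TYPE` / `_default._SERVICE_ACCOUNT_TYPE`, which are the strings `"authorized_user"` and `"service_account"`):

```python
import json

# Illustrative sketch (not the real google-auth loader) of how
# load_credentials_from_file dispatches on the JSON "type" key and
# rejects anything it does not recognize.

_VALID_TYPES = ("authorized_user", "service_account")

def classify_credentials(raw_json):
    """Return (credential type, project_id) declared in a credentials JSON
    blob, raising ValueError for unknown types, as the loader above does."""
    info = json.loads(raw_json)
    credential_type = info.get("type")
    if credential_type not in _VALID_TYPES:
        raise ValueError(
            "Type is {}, expected one of {}.".format(credential_type, _VALID_TYPES)
        )
    return credential_type, info.get("project_id")

print(classify_credentials('{"type": "service_account", "project_id": "demo"}'))
# → ('service_account', 'demo')
```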
109
+
110
+
111
+ def _get_gcloud_sdk_credentials(quota_project_id=None):
112
+ """Gets the credentials and project ID from the Cloud SDK."""
113
+ from google.auth import _cloud_sdk
114
+
115
+ # Check if application default credentials exist.
116
+ credentials_filename = _cloud_sdk.get_application_default_credentials_path()
117
+
118
+ if not os.path.isfile(credentials_filename):
119
+ return None, None
120
+
121
+ credentials, project_id = load_credentials_from_file(
122
+ credentials_filename, quota_project_id=quota_project_id
123
+ )
124
+
125
+ if not project_id:
126
+ project_id = _cloud_sdk.get_project_id()
127
+
128
+ return credentials, project_id
129
+
130
+
131
+ def _get_explicit_environ_credentials(quota_project_id=None):
132
+ """Gets credentials from the GOOGLE_APPLICATION_CREDENTIALS environment
133
+ variable."""
134
+ from google.auth import _cloud_sdk
135
+
136
+ cloud_sdk_adc_path = _cloud_sdk.get_application_default_credentials_path()
137
+ explicit_file = os.environ.get(environment_vars.CREDENTIALS)
138
+
139
+ if explicit_file is not None and explicit_file == cloud_sdk_adc_path:
140
+ # Cloud sdk flow calls gcloud to fetch project id, so if the explicit
141
+ # file path is cloud sdk credentials path, then we should fall back
142
+ # to cloud sdk flow, otherwise project id cannot be obtained.
143
+ return _get_gcloud_sdk_credentials(quota_project_id=quota_project_id)
144
+
145
+ if explicit_file is not None:
146
+ credentials, project_id = load_credentials_from_file(
147
+ os.environ[environment_vars.CREDENTIALS], quota_project_id=quota_project_id
148
+ )
149
+
150
+ return credentials, project_id
151
+
152
+ else:
153
+ return None, None
154
+
155
+
156
+ def _get_gae_credentials():
157
+ """Gets Google App Engine App Identity credentials and project ID."""
158
+ # While this library is normally bundled with app_engine, there are
159
+ # some cases where it's not available, so we tolerate ImportError.
160
+
161
+ return _default._get_gae_credentials()
162
+
163
+
164
+ def _get_gce_credentials(request=None):
165
+ """Gets credentials and project ID from the GCE Metadata Service."""
166
+ # Ping requires a transport, but we want application default credentials
167
+ # to require no arguments. So, we'll use the _http_client transport which
168
+ # uses http.client. This is only acceptable because the metadata server
169
+ # doesn't do SSL and never requires proxies.
170
+
171
+ # While this library is normally bundled with compute_engine, there are
172
+ # some cases where it's not available, so we tolerate ImportError.
173
+
174
+ return _default._get_gce_credentials(request)
175
+
176
+
177
+ def default_async(scopes=None, request=None, quota_project_id=None):
178
+ """Gets the default credentials for the current environment.
179
+
180
+ `Application Default Credentials`_ provides an easy way to obtain
181
+ credentials to call Google APIs for server-to-server or local applications.
182
+ This function acquires credentials from the environment in the following
183
+ order:
184
+
185
+ 1. If the environment variable ``GOOGLE_APPLICATION_CREDENTIALS`` is set
186
+ to the path of a valid service account JSON private key file, then it is
187
+ loaded and returned. The project ID returned is the project ID defined
188
+ in the service account file if available (some older files do not
189
+ contain project ID information).
190
+ 2. If the `Google Cloud SDK`_ is installed and has application default
191
+ credentials set they are loaded and returned.
192
+
193
+ To enable application default credentials with the Cloud SDK run::
194
+
195
+ gcloud auth application-default login
196
+
197
+ If the Cloud SDK has an active project, the project ID is returned. The
198
+ active project can be set using::
199
+
200
+ gcloud config set project
201
+
202
+ 3. If the application is running in the `App Engine standard environment`_
203
+ (first generation) then the credentials and project ID from the
204
+ `App Identity Service`_ are used.
205
+ 4. If the application is running in `Compute Engine`_ or `Cloud Run`_ or
206
+ the `App Engine flexible environment`_ or the `App Engine standard
207
+ environment`_ (second generation) then the credentials and project ID
208
+ are obtained from the `Metadata Service`_.
209
+ 5. If no credentials are found,
210
+ :class:`~google.auth.exceptions.DefaultCredentialsError` will be raised.
211
+
212
+ .. _Application Default Credentials: https://developers.google.com\
213
+ /identity/protocols/application-default-credentials
214
+ .. _Google Cloud SDK: https://cloud.google.com/sdk
215
+ .. _App Engine standard environment: https://cloud.google.com/appengine
216
+ .. _App Identity Service: https://cloud.google.com/appengine/docs/python\
217
+ /appidentity/
218
+ .. _Compute Engine: https://cloud.google.com/compute
219
+ .. _App Engine flexible environment: https://cloud.google.com\
220
+ /appengine/flexible
221
+ .. _Metadata Service: https://cloud.google.com/compute/docs\
222
+ /storing-retrieving-metadata
223
+ .. _Cloud Run: https://cloud.google.com/run
224
+
225
+ Example::
226
+
227
+ import google.auth
228
+
229
+ credentials, project_id = google.auth.default()
230
+
231
+ Args:
232
+ scopes (Sequence[str]): The list of scopes for the credentials. If
233
+ specified, the credentials will automatically be scoped if
234
+ necessary.
235
+ request (google.auth.transport.Request): An object used to make
236
+ HTTP requests. This is used to detect whether the application
237
+ is running on Compute Engine. If not specified, then it will
238
+ use the standard library http client to make requests.
239
+ quota_project_id (Optional[str]): The project ID used for
240
+ quota and billing.
241
+ Returns:
242
+ Tuple[~google.auth.credentials.Credentials, Optional[str]]:
243
+ the current environment's credentials and project ID. Project ID
244
+ may be None, which indicates that the Project ID could not be
245
+ ascertained from the environment.
246
+
247
+ Raises:
248
+ ~google.auth.exceptions.DefaultCredentialsError:
249
+ If no credentials were found, or if the credentials found were
250
+ invalid.
251
+ """
252
+ from google.auth._credentials_async import with_scopes_if_required
253
+ from google.auth.credentials import CredentialsWithQuotaProject
254
+
255
+ explicit_project_id = os.environ.get(
256
+ environment_vars.PROJECT, os.environ.get(environment_vars.LEGACY_PROJECT)
257
+ )
258
+
259
+ checkers = (
260
+ lambda: _get_explicit_environ_credentials(quota_project_id=quota_project_id),
261
+ lambda: _get_gcloud_sdk_credentials(quota_project_id=quota_project_id),
262
+ _get_gae_credentials,
263
+ lambda: _get_gce_credentials(request),
264
+ )
265
+
266
+ for checker in checkers:
267
+ credentials, project_id = checker()
268
+ if credentials is not None:
269
+ credentials = with_scopes_if_required(credentials, scopes)
270
+ if quota_project_id and isinstance(
271
+ credentials, CredentialsWithQuotaProject
272
+ ):
273
+ credentials = credentials.with_quota_project(quota_project_id)
274
+ effective_project_id = explicit_project_id or project_id
275
+ if not effective_project_id:
276
+ _default._LOGGER.warning(
277
+ "No project ID could be determined. Consider running "
278
+ "`gcloud config set project` or setting the %s "
279
+ "environment variable",
280
+ environment_vars.PROJECT,
281
+ )
282
+ return credentials, effective_project_id
283
+
284
+ raise exceptions.DefaultCredentialsError(_default._HELP_MESSAGE)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/_jwt_async.py ADDED
@@ -0,0 +1,164 @@
1
+ # Copyright 2020 Google LLC
2
+ #
3
+ # Licensed under the Apache License, Version 2.0 (the "License");
4
+ # you may not use this file except in compliance with the License.
5
+ # You may obtain a copy of the License at
6
+ #
7
+ # http://www.apache.org/licenses/LICENSE-2.0
8
+ #
9
+ # Unless required by applicable law or agreed to in writing, software
10
+ # distributed under the License is distributed on an "AS IS" BASIS,
11
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12
+ # See the License for the specific language governing permissions and
13
+ # limitations under the License.
14
+
15
+ """JSON Web Tokens
16
+
17
+ Provides support for creating (encoding) and verifying (decoding) JWTs,
18
+ especially JWTs generated and consumed by Google infrastructure.
19
+
20
+ See `rfc7519`_ for more details on JWTs.
21
+
22
+ To encode a JWT use :func:`encode`::
23
+
24
+ from google.auth import crypt
25
+ from google.auth import jwt_async
26
+
27
+ signer = crypt.Signer(private_key)
28
+ payload = {'some': 'payload'}
29
+ encoded = jwt_async.encode(signer, payload)
30
+
31
+ To decode a JWT and verify claims use :func:`decode`::
32
+
33
+ claims = jwt_async.decode(encoded, certs=public_certs)
34
+
35
+ You can also skip verification::
36
+
37
+ claims = jwt_async.decode(encoded, verify=False)
38
+
39
+ .. _rfc7519: https://tools.ietf.org/html/rfc7519
40
+
41
+
42
+ NOTE: This async support is experimental and marked internal. This surface may
43
+ change in minor releases.
44
+ """
45
+
46
+ from google.auth import _credentials_async
47
+ from google.auth import jwt
48
+
49
+
50
+ def encode(signer, payload, header=None, key_id=None):
51
+ """Make a signed JWT.
52
+
53
+ Args:
54
+ signer (google.auth.crypt.Signer): The signer used to sign the JWT.
55
+ payload (Mapping[str, str]): The JWT payload.
56
+ header (Mapping[str, str]): Additional JWT header payload.
57
+ key_id (str): The key id to add to the JWT header. If the
58
+ signer has a key id it will be used as the default. If this is
59
+ specified it will override the signer's key id.
60
+
61
+ Returns:
62
+ bytes: The encoded JWT.
63
+ """
64
+ return jwt.encode(signer, payload, header, key_id)
65
+
66
+
67
+ def decode(token, certs=None, verify=True, audience=None):
68
+ """Decode and verify a JWT.
69
+
70
+ Args:
71
+ token (str): The encoded JWT.
72
+ certs (Union[str, bytes, Mapping[str, Union[str, bytes]]]): The
73
+ certificate used to validate the JWT signature. If bytes or string,
74
+ it must be the public key certificate in PEM format. If a mapping,
75
+ it must be a mapping of key IDs to public key certificates in PEM
76
+ format. The mapping must contain the same key ID that's specified
77
+ in the token's header.
78
+ verify (bool): Whether to perform signature and claim validation.
79
+ Verification is done by default.
80
+ audience (str): The audience claim, 'aud', that this JWT should
81
+ contain. If None then the JWT's 'aud' parameter is not verified.
82
+
83
+ Returns:
84
+ Mapping[str, str]: The deserialized JSON payload in the JWT.
85
+
86
+ Raises:
87
+ ValueError: if any verification checks failed.
88
+ """
89
+
90
+ return jwt.decode(token, certs, verify, audience)
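The `encode`/`decode` pair above works on the standard three-segment JWT wire format: `base64url(header) '.' base64url(payload) '.' base64url(signature)`. A stdlib-only sketch of that structure (no real signing here, the signature segment is a placeholder, which is why such a token could only be read with `verify=False`):

```python
import base64
import json

# Sketch of the three-segment JWT wire format. The signature is fake;
# real tokens are signed with a google.auth.crypt Signer (e.g. RS256).

def b64url(data: bytes) -> bytes:
    return base64.urlsafe_b64encode(data).rstrip(b"=")

header = {"typ": "JWT", "alg": "RS256"}
payload = {"some": "payload"}
token = b".".join([
    b64url(json.dumps(header).encode()),
    b64url(json.dumps(payload).encode()),
    b64url(b"fake-signature"),
])

# Decoding without verification is just splitting and base64url-decoding:
encoded_payload = token.split(b".")[1]
encoded_payload += b"=" * (-len(encoded_payload) % 4)  # restore padding
claims = json.loads(base64.urlsafe_b64decode(encoded_payload))
print(claims)  # → {'some': 'payload'}
```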
91
+
92
+
93
+ class Credentials(
94
+ jwt.Credentials, _credentials_async.Signing, _credentials_async.Credentials
95
+ ):
96
+ """Credentials that use a JWT as the bearer token.
97
+
98
+ These credentials require an "audience" claim. This claim identifies the
99
+ intended recipient of the bearer token.
100
+
101
+ The constructor arguments determine the claims for the JWT that is
102
+ sent with requests. Usually, you'll construct these credentials with
103
+ one of the helper constructors as shown in the next section.
104
+
105
+ To create JWT credentials using a Google service account private key
106
+ JSON file::
107
+
108
+ audience = 'https://pubsub.googleapis.com/google.pubsub.v1.Publisher'
109
+ credentials = jwt_async.Credentials.from_service_account_file(
110
+ 'service-account.json',
111
+ audience=audience)
112
+
113
+ If you already have the service account file loaded and parsed::
114
+
115
+ service_account_info = json.load(open('service_account.json'))
116
+ credentials = jwt_async.Credentials.from_service_account_info(
117
+ service_account_info,
118
+ audience=audience)
119
+
120
+ Both helper methods pass on arguments to the constructor, so you can
121
+ specify the JWT claims::
122
+
123
+ credentials = jwt_async.Credentials.from_service_account_file(
124
+ 'service-account.json',
125
+ audience=audience,
126
+ additional_claims={'meta': 'data'})
127
+
128
+ You can also construct the credentials directly if you have a
129
+ :class:`~google.auth.crypt.Signer` instance::
130
+
131
+ credentials = jwt_async.Credentials(
132
+ signer,
133
+ issuer='your-issuer',
134
+ subject='your-subject',
135
+ audience=audience)
136
+
137
+ The claims are considered immutable. If you want to modify the claims,
138
+ you can easily create another instance using :meth:`with_claims`::
139
+
140
+ new_audience = (
141
+ 'https://pubsub.googleapis.com/google.pubsub.v1.Subscriber')
142
+ new_credentials = credentials.with_claims(audience=new_audience)
143
+ """
144
+
145
+
146
+ class OnDemandCredentials(
147
+ jwt.OnDemandCredentials, _credentials_async.Signing, _credentials_async.Credentials
148
+ ):
149
+ """On-demand JWT credentials.
150
+
151
+ Like :class:`Credentials`, this class uses a JWT as the bearer token for
152
+ authentication. However, this class does not require the audience at
153
+ construction time. Instead, it will generate a new token on-demand for
154
+ each request using the request URI as the audience. It caches tokens
155
+ so that multiple requests to the same URI do not incur the overhead
156
+ of generating a new token every time.
157
+
158
+ This behavior is especially useful for `gRPC`_ clients. A gRPC service may
159
+ have multiple audiences and gRPC clients may not know all of the audiences
160
+ required for accessing a particular service. With these credentials,
161
+ no knowledge of the audiences is required ahead of time.
162
+
163
+ .. _grpc: http://www.grpc.io/
164
+ """
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/_oauth2client.py ADDED
@@ -0,0 +1,169 @@
1
+ # Copyright 2016 Google LLC
2
+ #
3
+ # Licensed under the Apache License, Version 2.0 (the "License");
4
+ # you may not use this file except in compliance with the License.
5
+ # You may obtain a copy of the License at
6
+ #
7
+ # http://www.apache.org/licenses/LICENSE-2.0
8
+ #
9
+ # Unless required by applicable law or agreed to in writing, software
10
+ # distributed under the License is distributed on an "AS IS" BASIS,
11
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12
+ # See the License for the specific language governing permissions and
13
+ # limitations under the License.
14
+
15
+ """Helpers for transitioning from oauth2client to google-auth.
16
+
17
+ .. warning::
18
+ This module is private as it is intended to assist first-party downstream
19
+ clients with the transition from oauth2client to google-auth.
20
+ """
21
+
22
+ from __future__ import absolute_import
23
+
24
+ import six
25
+
26
+ from google.auth import _helpers
27
+ import google.auth.app_engine
28
+ import google.auth.compute_engine
29
+ import google.oauth2.credentials
30
+ import google.oauth2.service_account
31
+
32
+ try:
33
+ import oauth2client.client # type: ignore
34
+ import oauth2client.contrib.gce # type: ignore
35
+ import oauth2client.service_account # type: ignore
36
+ except ImportError as caught_exc:
37
+ six.raise_from(ImportError("oauth2client is not installed."), caught_exc)
38
+
39
+ try:
40
+ import oauth2client.contrib.appengine # type: ignore
41
+
42
+ _HAS_APPENGINE = True
43
+ except ImportError:
44
+ _HAS_APPENGINE = False
45
+
46
+
47
+ _CONVERT_ERROR_TMPL = "Unable to convert {} to a google-auth credentials class."
48
+
49
+
50
+ def _convert_oauth2_credentials(credentials):
51
+ """Converts to :class:`google.oauth2.credentials.Credentials`.
52
+
53
+ Args:
54
+ credentials (Union[oauth2client.client.OAuth2Credentials,
55
+ oauth2client.client.GoogleCredentials]): The credentials to
56
+ convert.
57
+
58
+ Returns:
59
+ google.oauth2.credentials.Credentials: The converted credentials.
60
+ """
61
+ new_credentials = google.oauth2.credentials.Credentials(
62
+ token=credentials.access_token,
63
+ refresh_token=credentials.refresh_token,
64
+ token_uri=credentials.token_uri,
65
+ client_id=credentials.client_id,
66
+ client_secret=credentials.client_secret,
67
+ scopes=credentials.scopes,
68
+ )
69
+
70
+ new_credentials._expires = credentials.token_expiry
71
+
72
+ return new_credentials
73
+
74
+
75
+ def _convert_service_account_credentials(credentials):
76
+ """Converts to :class:`google.oauth2.service_account.Credentials`.
77
+
78
+ Args:
79
+ credentials (Union[
80
+ oauth2client.service_account.ServiceAccountCredentials,
81
+ oauth2client.service_account._JWTAccessCredentials]): The
82
+ credentials to convert.
83
+
84
+ Returns:
85
+ google.oauth2.service_account.Credentials: The converted credentials.
86
+ """
87
+ info = credentials.serialization_data.copy()
88
+ info["token_uri"] = credentials.token_uri
89
+ return google.oauth2.service_account.Credentials.from_service_account_info(info)
90
+
91
+
92
+ def _convert_gce_app_assertion_credentials(credentials):
93
+ """Converts to :class:`google.auth.compute_engine.Credentials`.
94
+
95
+ Args:
96
+ credentials (oauth2client.contrib.gce.AppAssertionCredentials): The
97
+ credentials to convert.
98
+
99
+ Returns:
100
+ google.oauth2.service_account.Credentials: The converted credentials.
101
+ """
102
+ return google.auth.compute_engine.Credentials(
103
+ service_account_email=credentials.service_account_email
104
+ )
105
+
106
+
107
+ def _convert_appengine_app_assertion_credentials(credentials):
108
+ """Converts to :class:`google.auth.app_engine.Credentials`.
109
+
110
+ Args:
111
+ credentials (oauth2client.contrib.app_engine.AppAssertionCredentials):
112
+ The credentials to convert.
113
+
114
+ Returns:
115
+ google.auth.app_engine.Credentials: The converted credentials.
116
+ """
117
+ # pylint: disable=invalid-name
118
+ return google.auth.app_engine.Credentials(
119
+ scopes=_helpers.string_to_scopes(credentials.scope),
120
+ service_account_id=credentials.service_account_id,
121
+ )
122
+
123
+
124
+ _CLASS_CONVERSION_MAP = {
125
+ oauth2client.client.OAuth2Credentials: _convert_oauth2_credentials,
126
+ oauth2client.client.GoogleCredentials: _convert_oauth2_credentials,
127
+ oauth2client.service_account.ServiceAccountCredentials: _convert_service_account_credentials,
128
+ oauth2client.service_account._JWTAccessCredentials: _convert_service_account_credentials,
129
+ oauth2client.contrib.gce.AppAssertionCredentials: _convert_gce_app_assertion_credentials,
130
+ }
131
+
132
+ if _HAS_APPENGINE:
133
+ _CLASS_CONVERSION_MAP[
134
+ oauth2client.contrib.appengine.AppAssertionCredentials
135
+ ] = _convert_appengine_app_assertion_credentials
136
+
137
+
138
+ def convert(credentials):
139
+ """Convert oauth2client credentials to google-auth credentials.
140
+
141
+ This function converts:
142
+
143
+ - :class:`oauth2client.client.OAuth2Credentials` to
144
+ :class:`google.oauth2.credentials.Credentials`.
145
+ - :class:`oauth2client.client.GoogleCredentials` to
146
+ :class:`google.oauth2.credentials.Credentials`.
147
+ - :class:`oauth2client.service_account.ServiceAccountCredentials` to
148
+ :class:`google.oauth2.service_account.Credentials`.
149
+ - :class:`oauth2client.service_account._JWTAccessCredentials` to
150
+ :class:`google.oauth2.service_account.Credentials`.
151
+ - :class:`oauth2client.contrib.gce.AppAssertionCredentials` to
152
+ :class:`google.auth.compute_engine.Credentials`.
153
+ - :class:`oauth2client.contrib.appengine.AppAssertionCredentials` to
154
+ :class:`google.auth.app_engine.Credentials`.
155
+
156
+ Returns:
157
+ google.auth.credentials.Credentials: The converted credentials.
158
+
159
+ Raises:
160
+ ValueError: If the credentials could not be converted.
161
+ """
162
+
163
+ credentials_class = type(credentials)
164
+
165
+ try:
166
+ return _CLASS_CONVERSION_MAP[credentials_class](credentials)
167
+ except KeyError as caught_exc:
168
+ new_exc = ValueError(_CONVERT_ERROR_TMPL.format(credentials_class))
169
+ six.raise_from(new_exc, caught_exc)
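The exact-class lookup that `convert()` performs via `_CLASS_CONVERSION_MAP` can be sketched with stand-in classes (these are not the real oauth2client or google-auth types, and the sketch uses `raise ... from` where the Python 2-compatible original uses `six.raise_from`):

```python
# Minimal sketch of the _CLASS_CONVERSION_MAP dispatch: look up the exact
# credentials class, apply the matching converter, and raise ValueError
# for anything unmapped. All class names here are hypothetical.

class LegacyUserCredentials:   # stands in for oauth2client OAuth2Credentials
    def __init__(self, token):
        self.token = token

class ModernCredentials:       # stands in for google.oauth2 Credentials
    def __init__(self, token):
        self.token = token

_CONVERSION_MAP = {
    LegacyUserCredentials: lambda c: ModernCredentials(c.token),
}

def convert(credentials):
    try:
        return _CONVERSION_MAP[type(credentials)](credentials)
    except KeyError as caught_exc:
        raise ValueError(
            "Unable to convert {} to a google-auth credentials class.".format(
                type(credentials)
            )
        ) from caught_exc

converted = convert(LegacyUserCredentials("abc123"))
print(type(converted).__name__, converted.token)  # → ModernCredentials abc123
```

Dispatching on `type(credentials)` rather than `isinstance` matters: subclasses like `GoogleCredentials` get their own map entry instead of inheriting their parent's converter implicitly.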
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/aws.py ADDED
@@ -0,0 +1,778 @@
+ # Copyright 2020 Google LLC
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ #     http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+
+ """AWS Credentials and AWS Signature V4 Request Signer.
+
+ This module provides credentials to access Google Cloud resources from Amazon
+ Web Services (AWS) workloads. These credentials are recommended over the
+ use of service account credentials in AWS as they do not involve the management
+ of long-lived service account private keys.
+
+ AWS Credentials are initialized using external_account arguments which are
+ typically loaded from the external credentials JSON file.
+ Unlike other Credentials that can be initialized with a list of explicit
+ arguments, secrets or credentials, external account clients use the
+ environment and hints/guidelines provided by the external_account JSON
+ file to retrieve credentials and exchange them for Google access tokens.
+
+ This module also provides a basic implementation of the
+ `AWS Signature Version 4`_ request signing algorithm.
+
+ AWS Credentials use serialized signed requests to the
+ `AWS STS GetCallerIdentity`_ API that can be exchanged for Google access tokens
+ via the GCP STS endpoint.
+
+ .. _AWS Signature Version 4: https://docs.aws.amazon.com/general/latest/gr/signature-version-4.html
+ .. _AWS STS GetCallerIdentity: https://docs.aws.amazon.com/STS/latest/APIReference/API_GetCallerIdentity.html
+ """
+
+ import hashlib
+ import hmac
+ import io
+ import json
+ import os
+ import posixpath
+ import re
+
+ from six.moves import http_client
+ from six.moves import urllib
+ from six.moves.urllib.parse import urljoin
+
+ from google.auth import _helpers
+ from google.auth import environment_vars
+ from google.auth import exceptions
+ from google.auth import external_account
+
+ # AWS Signature Version 4 signing algorithm identifier.
+ _AWS_ALGORITHM = "AWS4-HMAC-SHA256"
+ # The termination string for the AWS credential scope value as defined in
+ # https://docs.aws.amazon.com/general/latest/gr/sigv4-create-string-to-sign.html
+ _AWS_REQUEST_TYPE = "aws4_request"
+ # The AWS authorization header name for the security session token if available.
+ _AWS_SECURITY_TOKEN_HEADER = "x-amz-security-token"
+ # The AWS authorization header name for the auto-generated date.
+ _AWS_DATE_HEADER = "x-amz-date"
+
+
+ class RequestSigner(object):
+     """Implements an AWS request signer based on the AWS Signature Version 4 signing
+     process.
+     https://docs.aws.amazon.com/general/latest/gr/signature-version-4.html
+     """
+
+     def __init__(self, region_name):
+         """Instantiates an AWS request signer used to compute authenticated signed
+         requests to AWS APIs based on the AWS Signature Version 4 signing process.
+
+         Args:
+             region_name (str): The AWS region to use.
+         """
+
+         self._region_name = region_name
+
+     def get_request_options(
+         self,
+         aws_security_credentials,
+         url,
+         method,
+         request_payload="",
+         additional_headers={},
+     ):
+         """Generates the signed request for the provided HTTP request for calling
+         an AWS API. This follows the steps described at:
+         https://docs.aws.amazon.com/general/latest/gr/sigv4_signing.html
+
+         Args:
+             aws_security_credentials (Mapping[str, str]): A dictionary containing
+                 the AWS security credentials.
+             url (str): The AWS service URL containing the canonical URI and
+                 query string.
+             method (str): The HTTP method used to call this API.
+             request_payload (Optional[str]): The optional request payload if
+                 available.
+             additional_headers (Optional[Mapping[str, str]]): The optional
+                 additional headers needed for the requested AWS API.
+
+         Returns:
+             Mapping[str, str]: The AWS signed request dictionary object.
+         """
+         # Get AWS credentials.
+         access_key = aws_security_credentials.get("access_key_id")
+         secret_key = aws_security_credentials.get("secret_access_key")
+         security_token = aws_security_credentials.get("security_token")
+
+         additional_headers = additional_headers or {}
+
+         uri = urllib.parse.urlparse(url)
+         # Normalize the URL path. This is needed for the canonical_uri.
+         # os.path.normpath can't be used since it normalizes "/" paths
+         # to "\\" in Windows OS.
+         normalized_uri = urllib.parse.urlparse(
+             urljoin(url, posixpath.normpath(uri.path))
+         )
+         # Validate provided URL.
+         if not uri.hostname or uri.scheme != "https":
+             raise ValueError("Invalid AWS service URL")
+
+         header_map = _generate_authentication_header_map(
+             host=uri.hostname,
+             canonical_uri=normalized_uri.path or "/",
+             canonical_querystring=_get_canonical_querystring(uri.query),
+             method=method,
+             region=self._region_name,
+             access_key=access_key,
+             secret_key=secret_key,
+             security_token=security_token,
+             request_payload=request_payload,
+             additional_headers=additional_headers,
+         )
+         headers = {
+             "Authorization": header_map.get("authorization_header"),
+             "host": uri.hostname,
+         }
+         # Add x-amz-date if available.
+         if "amz_date" in header_map:
+             headers[_AWS_DATE_HEADER] = header_map.get("amz_date")
+         # Append additional optional headers, eg. X-Amz-Target, Content-Type, etc.
+         for key in additional_headers:
+             headers[key] = additional_headers[key]
+
+         # Add session token if available.
+         if security_token is not None:
+             headers[_AWS_SECURITY_TOKEN_HEADER] = security_token
+
+         signed_request = {"url": url, "method": method, "headers": headers}
+         if request_payload:
+             signed_request["data"] = request_payload
+         return signed_request
+
+
+ def _get_canonical_querystring(query):
+     """Generates the canonical query string given a raw query string.
+     Logic is based on
+     https://docs.aws.amazon.com/general/latest/gr/sigv4-create-canonical-request.html
+
+     Args:
+         query (str): The raw query string.
+
+     Returns:
+         str: The canonical query string.
+     """
+     # Parse raw query string.
+     querystring = urllib.parse.parse_qs(query)
+     querystring_encoded_map = {}
+     for key in querystring:
+         quote_key = urllib.parse.quote(key, safe="-_.~")
+         # URI encode key.
+         querystring_encoded_map[quote_key] = []
+         for item in querystring[key]:
+             # For each key, URI encode all values for that key.
+             querystring_encoded_map[quote_key].append(
+                 urllib.parse.quote(item, safe="-_.~")
+             )
+         # Sort values for each key.
+         querystring_encoded_map[quote_key].sort()
+     # Sort keys.
+     sorted_keys = list(querystring_encoded_map.keys())
+     sorted_keys.sort()
+     # Reconstruct the query string. Preserve keys with multiple values.
+     querystring_encoded_pairs = []
+     for key in sorted_keys:
+         for item in querystring_encoded_map[key]:
+             querystring_encoded_pairs.append("{}={}".format(key, item))
+     return "&".join(querystring_encoded_pairs)
+
+
+ def _sign(key, msg):
+     """Creates the HMAC-SHA256 hash of the provided message using the provided
+     key.
+
+     Args:
+         key (str): The HMAC-SHA256 key to use.
+         msg (str): The message to hash.
+
+     Returns:
+         str: The computed hash bytes.
+     """
+     return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()
+
+
+ def _get_signing_key(key, date_stamp, region_name, service_name):
+     """Calculates the signing key used to calculate the signature for
+     AWS Signature Version 4 based on:
+     https://docs.aws.amazon.com/general/latest/gr/sigv4-calculate-signature.html
+
+     Args:
+         key (str): The AWS secret access key.
+         date_stamp (str): The '%Y%m%d' date format.
+         region_name (str): The AWS region.
+         service_name (str): The AWS service name, eg. sts.
+
+     Returns:
+         str: The signing key bytes.
+     """
+     k_date = _sign(("AWS4" + key).encode("utf-8"), date_stamp)
+     k_region = _sign(k_date, region_name)
+     k_service = _sign(k_region, service_name)
+     k_signing = _sign(k_service, "aws4_request")
+     return k_signing
+
+
+ def _generate_authentication_header_map(
+     host,
+     canonical_uri,
+     canonical_querystring,
+     method,
+     region,
+     access_key,
+     secret_key,
+     security_token,
+     request_payload="",
+     additional_headers={},
+ ):
+     """Generates the authentication header map needed for generating the AWS
+     Signature Version 4 signed request.
+
+     Args:
+         host (str): The AWS service URL hostname.
+         canonical_uri (str): The AWS service URL path name.
+         canonical_querystring (str): The AWS service URL query string.
+         method (str): The HTTP method used to call this API.
+         region (str): The AWS region.
+         access_key (str): The AWS access key ID.
+         secret_key (str): The AWS secret access key.
+         security_token (Optional[str]): The AWS security session token. This is
+             available for temporary sessions.
+         request_payload (Optional[str]): The optional request payload if
+             available.
+         additional_headers (Optional[Mapping[str, str]]): The optional
+             additional headers needed for the requested AWS API.
+
+     Returns:
+         Mapping[str, str]: The AWS authentication header dictionary object.
+             This contains the x-amz-date and authorization header information.
+     """
+     # iam.amazonaws.com host => iam service.
+     # sts.us-east-2.amazonaws.com host => sts service.
+     service_name = host.split(".")[0]
+
+     current_time = _helpers.utcnow()
+     amz_date = current_time.strftime("%Y%m%dT%H%M%SZ")
+     date_stamp = current_time.strftime("%Y%m%d")
+
+     # Change all additional headers to be lower case.
+     full_headers = {}
+     for key in additional_headers:
+         full_headers[key.lower()] = additional_headers[key]
+     # Add AWS session token if available.
+     if security_token is not None:
+         full_headers[_AWS_SECURITY_TOKEN_HEADER] = security_token
+
+     # Required headers
+     full_headers["host"] = host
+     # Do not use generated x-amz-date if the date header is provided.
+     # Previously the date was not fixed with x-amz- and could be provided
+     # manually.
+     # https://github.com/boto/botocore/blob/879f8440a4e9ace5d3cf145ce8b3d5e5ffb892ef/tests/unit/auth/aws4_testsuite/get-header-value-trim.req
+     if "date" not in full_headers:
+         full_headers[_AWS_DATE_HEADER] = amz_date
+
+     # Header keys need to be sorted alphabetically.
+     canonical_headers = ""
+     header_keys = list(full_headers.keys())
+     header_keys.sort()
+     for key in header_keys:
+         canonical_headers = "{}{}:{}\n".format(
+             canonical_headers, key, full_headers[key]
+         )
+     signed_headers = ";".join(header_keys)
+
+     payload_hash = hashlib.sha256((request_payload or "").encode("utf-8")).hexdigest()
+
+     # https://docs.aws.amazon.com/general/latest/gr/sigv4-create-canonical-request.html
+     canonical_request = "{}\n{}\n{}\n{}\n{}\n{}".format(
+         method,
+         canonical_uri,
+         canonical_querystring,
+         canonical_headers,
+         signed_headers,
+         payload_hash,
+     )
+
+     credential_scope = "{}/{}/{}/{}".format(
+         date_stamp, region, service_name, _AWS_REQUEST_TYPE
+     )
+
+     # https://docs.aws.amazon.com/general/latest/gr/sigv4-create-string-to-sign.html
+     string_to_sign = "{}\n{}\n{}\n{}".format(
+         _AWS_ALGORITHM,
+         amz_date,
+         credential_scope,
+         hashlib.sha256(canonical_request.encode("utf-8")).hexdigest(),
+     )
+
+     # https://docs.aws.amazon.com/general/latest/gr/sigv4-calculate-signature.html
+     signing_key = _get_signing_key(secret_key, date_stamp, region, service_name)
+     signature = hmac.new(
+         signing_key, string_to_sign.encode("utf-8"), hashlib.sha256
+     ).hexdigest()
+
+     # https://docs.aws.amazon.com/general/latest/gr/sigv4-add-signature-to-request.html
+     authorization_header = "{} Credential={}/{}, SignedHeaders={}, Signature={}".format(
+         _AWS_ALGORITHM, access_key, credential_scope, signed_headers, signature
+     )
+
+     authentication_header = {"authorization_header": authorization_header}
+     # Do not use generated x-amz-date if the date header is provided.
+     if "date" not in full_headers:
+         authentication_header["amz_date"] = amz_date
+     return authentication_header
+
+
+ class Credentials(external_account.Credentials):
+     """AWS external account credentials.
+     This is used to exchange serialized AWS signature v4 signed requests to
+     AWS STS GetCallerIdentity service for Google access tokens.
+     """
+
+     def __init__(
+         self,
+         audience,
+         subject_token_type,
+         token_url,
+         credential_source=None,
+         service_account_impersonation_url=None,
+         client_id=None,
+         client_secret=None,
+         quota_project_id=None,
+         scopes=None,
+         default_scopes=None,
+     ):
+         """Instantiates an AWS workload external account credentials object.
+
+         Args:
+             audience (str): The STS audience field.
+             subject_token_type (str): The subject token type.
+             token_url (str): The STS endpoint URL.
+             credential_source (Mapping): The credential source dictionary used
+                 to provide instructions on how to retrieve external credential
+                 to be exchanged for Google access tokens.
+             service_account_impersonation_url (Optional[str]): The optional
+                 service account impersonation getAccessToken URL.
+             client_id (Optional[str]): The optional client ID.
+             client_secret (Optional[str]): The optional client secret.
+             quota_project_id (Optional[str]): The optional quota project ID.
+             scopes (Optional[Sequence[str]]): Optional scopes to request during
+                 the authorization grant.
+             default_scopes (Optional[Sequence[str]]): Default scopes passed by a
+                 Google client library. Use 'scopes' for user-defined scopes.
+
+         Raises:
+             google.auth.exceptions.RefreshError: If an error is encountered during
+                 access token retrieval logic.
+             ValueError: For invalid parameters.
+
+         .. note:: Typically one of the helper constructors
+             :meth:`from_file` or
+             :meth:`from_info` are used instead of calling the constructor directly.
+         """
+         super(Credentials, self).__init__(
+             audience=audience,
+             subject_token_type=subject_token_type,
+             token_url=token_url,
+             credential_source=credential_source,
+             service_account_impersonation_url=service_account_impersonation_url,
+             client_id=client_id,
+             client_secret=client_secret,
+             quota_project_id=quota_project_id,
+             scopes=scopes,
+             default_scopes=default_scopes,
+         )
+         credential_source = credential_source or {}
+         self._environment_id = credential_source.get("environment_id") or ""
+         self._region_url = credential_source.get("region_url")
+         self._security_credentials_url = credential_source.get("url")
+         self._cred_verification_url = credential_source.get(
+             "regional_cred_verification_url"
+         )
+         self._imdsv2_session_token_url = credential_source.get(
+             "imdsv2_session_token_url"
+         )
+         self._region = None
+         self._request_signer = None
+         self._target_resource = audience
+
+         # Get the environment ID. Currently, only one version supported (v1).
+         matches = re.match(r"^(aws)([\d]+)$", self._environment_id)
+         if matches:
+             env_id, env_version = matches.groups()
+         else:
+             env_id, env_version = (None, None)
+
+         if env_id != "aws" or self._cred_verification_url is None:
+             raise ValueError("No valid AWS 'credential_source' provided")
+         elif int(env_version or "") != 1:
+             raise ValueError(
+                 "aws version '{}' is not supported in the current build.".format(
+                     env_version
+                 )
+             )
+
+     def retrieve_subject_token(self, request):
+         """Retrieves the subject token using the credential_source object.
+         The subject token is a serialized `AWS GetCallerIdentity signed request`_.
+
+         The logic is summarized as:
+
+         Retrieve the AWS region from the AWS_REGION or AWS_DEFAULT_REGION
+         environment variable or from the AWS metadata server availability-zone
+         if not found in the environment variable.
+
+         Check AWS credentials in environment variables. If not found, retrieve
+         from the AWS metadata server security-credentials endpoint.
+
+         When retrieving AWS credentials from the metadata server
+         security-credentials endpoint, the AWS role needs to be determined by
+         calling the security-credentials endpoint without any argument. Then the
+         credentials can be retrieved via: security-credentials/role_name
+
+         Generate the signed request to AWS STS GetCallerIdentity action.
+
+         Inject x-goog-cloud-target-resource into header and serialize the
+         signed request. This will be the subject-token to pass to GCP STS.
+
+         .. _AWS GetCallerIdentity signed request:
+             https://cloud.google.com/iam/docs/access-resources-aws#exchange-token
+
+         Args:
+             request (google.auth.transport.Request): A callable used to make
+                 HTTP requests.
+         Returns:
+             str: The retrieved subject token.
+         """
+         # Fetch the session token required to make meta data endpoint calls to aws
+         if request is not None and self._imdsv2_session_token_url is not None:
+             headers = {"X-aws-ec2-metadata-token-ttl-seconds": "300"}
+
+             imdsv2_session_token_response = request(
+                 url=self._imdsv2_session_token_url, method="PUT", headers=headers
+             )
+
+             if imdsv2_session_token_response.status != 200:
+                 raise exceptions.RefreshError(
+                     "Unable to retrieve AWS Session Token",
+                     imdsv2_session_token_response.data,
+                 )
+
+             imdsv2_session_token = imdsv2_session_token_response.data
+         else:
+             imdsv2_session_token = None
+
+         # Initialize the request signer if not yet initialized after determining
+         # the current AWS region.
+         if self._request_signer is None:
+             self._region = self._get_region(
+                 request, self._region_url, imdsv2_session_token
+             )
+             self._request_signer = RequestSigner(self._region)
+
+         # Retrieve the AWS security credentials needed to generate the signed
+         # request.
+         aws_security_credentials = self._get_security_credentials(
+             request, imdsv2_session_token
+         )
+         # Generate the signed request to AWS STS GetCallerIdentity API.
+         # Use the required regional endpoint. Otherwise, the request will fail.
+         request_options = self._request_signer.get_request_options(
+             aws_security_credentials,
+             self._cred_verification_url.replace("{region}", self._region),
+             "POST",
+         )
+         # The GCP STS endpoint expects the headers to be formatted as:
+         # [
+         #     {key: 'x-amz-date', value: '...'},
+         #     {key: 'Authorization', value: '...'},
+         #     ...
+         # ]
+         # And then serialized as:
+         # quote(json.dumps({
+         #     url: '...',
+         #     method: 'POST',
+         #     headers: [{key: 'x-amz-date', value: '...'}, ...]
+         # }))
+         request_headers = request_options.get("headers")
+         # The full, canonical resource name of the workload identity pool
+         # provider, with or without the HTTPS prefix.
+         # Including this header as part of the signature is recommended to
+         # ensure data integrity.
+         request_headers["x-goog-cloud-target-resource"] = self._target_resource
+
+         # Serialize AWS signed request.
+         # Keeping inner keys in sorted order makes testing easier for Python
+         # versions <=3.5 as the stringified JSON string would have a predictable
+         # key order.
+         aws_signed_req = {}
+         aws_signed_req["url"] = request_options.get("url")
+         aws_signed_req["method"] = request_options.get("method")
+         aws_signed_req["headers"] = []
+         # Reformat header to GCP STS expected format.
+         for key in sorted(request_headers.keys()):
+             aws_signed_req["headers"].append(
+                 {"key": key, "value": request_headers[key]}
+             )
+
+         return urllib.parse.quote(
+             json.dumps(aws_signed_req, separators=(",", ":"), sort_keys=True)
+         )
+
+     def _get_region(self, request, url, imdsv2_session_token):
+         """Retrieves the current AWS region from either the AWS_REGION or
+         AWS_DEFAULT_REGION environment variable or from the AWS metadata server.
+
+         Args:
+             request (google.auth.transport.Request): A callable used to make
+                 HTTP requests.
+             url (str): The AWS metadata server region URL.
+             imdsv2_session_token (str): The AWS IMDSv2 session token to be added as a
+                 header in the requests to AWS metadata endpoint.
+
+         Returns:
+             str: The current AWS region.
+
+         Raises:
+             google.auth.exceptions.RefreshError: If an error occurs while
+                 retrieving the AWS region.
+         """
+         # The AWS metadata server is not available in some AWS environments
+         # such as AWS lambda. Instead, it is available via environment
+         # variable.
+         env_aws_region = os.environ.get(environment_vars.AWS_REGION)
+         if env_aws_region is not None:
+             return env_aws_region
+
+         env_aws_region = os.environ.get(environment_vars.AWS_DEFAULT_REGION)
+         if env_aws_region is not None:
+             return env_aws_region
+
+         if not self._region_url:
+             raise exceptions.RefreshError("Unable to determine AWS region")
+
+         headers = None
+         if imdsv2_session_token is not None:
+             headers = {"X-aws-ec2-metadata-token": imdsv2_session_token}
+
+         response = request(url=self._region_url, method="GET", headers=headers)
+
+         # Support both string and bytes type response.data.
+         response_body = (
+             response.data.decode("utf-8")
+             if hasattr(response.data, "decode")
+             else response.data
+         )
+
+         if response.status != 200:
+             raise exceptions.RefreshError(
+                 "Unable to retrieve AWS region", response_body
+             )
+
+         # This endpoint will return the region in format: us-east-2b.
+         # Only the us-east-2 part should be used.
+         return response_body[:-1]
+
+     def _get_security_credentials(self, request, imdsv2_session_token):
+         """Retrieves the AWS security credentials required for signing AWS
+         requests from either the AWS security credentials environment variables
+         or from the AWS metadata server.
+
+         Args:
+             request (google.auth.transport.Request): A callable used to make
+                 HTTP requests.
+             imdsv2_session_token (str): The AWS IMDSv2 session token to be added as a
+                 header in the requests to AWS metadata endpoint.
+
+         Returns:
+             Mapping[str, str]: The AWS security credentials dictionary object.
+
+         Raises:
+             google.auth.exceptions.RefreshError: If an error occurs while
+                 retrieving the AWS security credentials.
+         """
+
+         # Check environment variables for permanent credentials first.
+         # https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html
+         env_aws_access_key_id = os.environ.get(environment_vars.AWS_ACCESS_KEY_ID)
+         env_aws_secret_access_key = os.environ.get(
+             environment_vars.AWS_SECRET_ACCESS_KEY
+         )
+         # This is normally not available for permanent credentials.
+         env_aws_session_token = os.environ.get(environment_vars.AWS_SESSION_TOKEN)
+         if env_aws_access_key_id and env_aws_secret_access_key:
+             return {
+                 "access_key_id": env_aws_access_key_id,
+                 "secret_access_key": env_aws_secret_access_key,
+                 "security_token": env_aws_session_token,
+             }
+
+         # Get role name.
+         role_name = self._get_metadata_role_name(request, imdsv2_session_token)
+
+         # Get security credentials.
+         credentials = self._get_metadata_security_credentials(
+             request, role_name, imdsv2_session_token
+         )
+
+         return {
+             "access_key_id": credentials.get("AccessKeyId"),
+             "secret_access_key": credentials.get("SecretAccessKey"),
+             "security_token": credentials.get("Token"),
+         }
+
+     def _get_metadata_security_credentials(
+         self, request, role_name, imdsv2_session_token
+     ):
+         """Retrieves the AWS security credentials required for signing AWS
+         requests from the AWS metadata server.
+
+         Args:
+             request (google.auth.transport.Request): A callable used to make
+                 HTTP requests.
+             role_name (str): The AWS role name required by the AWS metadata
+                 server security_credentials endpoint in order to return the
+                 credentials.
+             imdsv2_session_token (str): The AWS IMDSv2 session token to be added as a
+                 header in the requests to AWS metadata endpoint.
+
+         Returns:
+             Mapping[str, str]: The AWS metadata server security credentials
+                 response.
+
+         Raises:
+             google.auth.exceptions.RefreshError: If an error occurs while
+                 retrieving the AWS security credentials.
+         """
+         headers = {"Content-Type": "application/json"}
+         if imdsv2_session_token is not None:
+             headers["X-aws-ec2-metadata-token"] = imdsv2_session_token
+
+         response = request(
+             url="{}/{}".format(self._security_credentials_url, role_name),
+             method="GET",
+             headers=headers,
+         )
+
+         # support both string and bytes type response.data
+         response_body = (
+             response.data.decode("utf-8")
+             if hasattr(response.data, "decode")
+             else response.data
+         )
+
+         if response.status != http_client.OK:
+             raise exceptions.RefreshError(
+                 "Unable to retrieve AWS security credentials", response_body
+             )
+
+         credentials_response = json.loads(response_body)
+
+         return credentials_response
+
+     def _get_metadata_role_name(self, request, imdsv2_session_token):
+         """Retrieves the AWS role currently attached to the current AWS
+         workload by querying the AWS metadata server. This is needed for the
+         AWS metadata server security credentials endpoint in order to retrieve
+         the AWS security credentials needed to sign requests to AWS APIs.
+
+         Args:
+             request (google.auth.transport.Request): A callable used to make
+                 HTTP requests.
+             imdsv2_session_token (str): The AWS IMDSv2 session token to be added as a
+                 header in the requests to AWS metadata endpoint.
+
+         Returns:
+             str: The AWS role name.
+
+         Raises:
+             google.auth.exceptions.RefreshError: If an error occurs while
+                 retrieving the AWS role name.
+         """
+         if self._security_credentials_url is None:
+             raise exceptions.RefreshError(
+                 "Unable to determine the AWS metadata server security credentials endpoint"
+             )
+
+         headers = None
+         if imdsv2_session_token is not None:
+             headers = {"X-aws-ec2-metadata-token": imdsv2_session_token}
+
+         response = request(
+             url=self._security_credentials_url, method="GET", headers=headers
+         )
+
+         # support both string and bytes type response.data
+         response_body = (
+             response.data.decode("utf-8")
+             if hasattr(response.data, "decode")
+             else response.data
+         )
+
+         if response.status != http_client.OK:
+             raise exceptions.RefreshError(
+                 "Unable to retrieve AWS role name", response_body
+             )
+
+         return response_body
+
+     @classmethod
+     def from_info(cls, info, **kwargs):
+         """Creates an AWS Credentials instance from parsed external account info.
+
+         Args:
+             info (Mapping[str, str]): The AWS external account info in Google
+                 format.
+             kwargs: Additional arguments to pass to the constructor.
+
+         Returns:
+             google.auth.aws.Credentials: The constructed credentials.
+
+         Raises:
+             ValueError: For invalid parameters.
+         """
+         return cls(
+             audience=info.get("audience"),
+             subject_token_type=info.get("subject_token_type"),
+             token_url=info.get("token_url"),
+             service_account_impersonation_url=info.get(
+                 "service_account_impersonation_url"
+             ),
+             client_id=info.get("client_id"),
+             client_secret=info.get("client_secret"),
+             credential_source=info.get("credential_source"),
+             quota_project_id=info.get("quota_project_id"),
+             **kwargs
+         )
+
+     @classmethod
+     def from_file(cls, filename, **kwargs):
+         """Creates an AWS Credentials instance from an external account json file.
+
+         Args:
+             filename (str): The path to the AWS external account json file.
+             kwargs: Additional arguments to pass to the constructor.
+
+         Returns:
+             google.auth.aws.Credentials: The constructed credentials.
+         """
+         with io.open(filename, "r", encoding="utf-8") as json_file:
+             data = json.load(json_file)
+         return cls.from_info(data, **kwargs)
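As a reviewer's aside (not part of the committed file): the two SigV4 primitives in this module, the chained-HMAC signing-key derivation and the canonical query string, can be exercised standalone. The sketch below mirrors the `_get_signing_key` and `_get_canonical_querystring` logic above using only the Python standard library; the function names here are illustrative, not the module's private API.

```python
import hashlib
import hmac
from urllib.parse import parse_qs, quote


def derive_signing_key(secret_key, date_stamp, region, service):
    """Chain of HMAC-SHA256 ops per the SigV4 'derive signing key' steps."""
    def sign(key, msg):
        # key is bytes, msg is str; returns the raw 32-byte digest.
        return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()

    k_date = sign(("AWS4" + secret_key).encode("utf-8"), date_stamp)
    k_region = sign(k_date, region)
    k_service = sign(k_region, service)
    return sign(k_service, "aws4_request")


def canonical_querystring(query):
    """Percent-encode keys/values, sort values per key, then sort keys."""
    encoded = {}
    for key, values in parse_qs(query).items():
        quoted_key = quote(key, safe="-_.~")
        encoded[quoted_key] = sorted(quote(v, safe="-_.~") for v in values)
    return "&".join(
        "{}={}".format(k, v) for k in sorted(encoded) for v in encoded[k]
    )


# Inputs borrowed from the worked example in the AWS SigV4 documentation.
key = derive_signing_key(
    "wJalrXUtnFEMI/K7MDENG+bPxRfiCYEXAMPLEKEY", "20150830", "us-east-1", "iam"
)
print(len(key))                              # 32-byte HMAC-SHA256 output
print(canonical_querystring("b=2&a=3&a=1"))  # keys and per-key values sorted
```

Note the sort order here matches the module's behavior: values are sorted within each key, and keys are sorted after encoding, which is what makes the signature reproducible on the GCP STS side.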
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/credentials.py ADDED
@@ -0,0 +1,362 @@
# Copyright 2016 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


"""Interfaces for credentials."""

import abc

import six

from google.auth import _helpers


@six.add_metaclass(abc.ABCMeta)
class Credentials(object):
    """Base class for all credentials.

    All credentials have a :attr:`token` that is used for authentication and
    may also optionally set an :attr:`expiry` to indicate when the token will
    no longer be valid.

    Most credentials will be :attr:`invalid` until :meth:`refresh` is called.
    Credentials can do this automatically before the first HTTP request in
    :meth:`before_request`.

    Although the token and expiration will change as the credentials are
    :meth:`refreshed <refresh>` and used, credentials should be considered
    immutable. Various credentials will accept configuration such as private
    keys, scopes, and other options. These options are not changeable after
    construction. Some classes will provide mechanisms to copy the credentials
    with modifications such as :meth:`ScopedCredentials.with_scopes`.
    """

    def __init__(self):
        self.token = None
        """str: The bearer token that can be used in HTTP headers to make
        authenticated requests."""
        self.expiry = None
        """Optional[datetime]: When the token expires and is no longer valid.
        If this is None, the token is assumed to never expire."""
        self._quota_project_id = None
        """Optional[str]: Project to use for quota and billing purposes."""

    @property
    def expired(self):
        """Checks if the credentials are expired.

        Note that credentials can be invalid but not expired because
        credentials with :attr:`expiry` set to None are considered to never
        expire.
        """
        if not self.expiry:
            return False

        # Remove some threshold from expiry to err on the side of reporting
        # expiration early so that we avoid the 401-refresh-retry loop.
        skewed_expiry = self.expiry - _helpers.REFRESH_THRESHOLD
        return _helpers.utcnow() >= skewed_expiry

    @property
    def valid(self):
        """Checks the validity of the credentials.

        This is True if the credentials have a :attr:`token` and the token
        is not :attr:`expired`.
        """
        return self.token is not None and not self.expired

    @property
    def quota_project_id(self):
        """Project to use for quota and billing purposes."""
        return self._quota_project_id

    @abc.abstractmethod
    def refresh(self, request):
        """Refreshes the access token.

        Args:
            request (google.auth.transport.Request): The object used to make
                HTTP requests.

        Raises:
            google.auth.exceptions.RefreshError: If the credentials could
                not be refreshed.
        """
        # pylint: disable=missing-raises-doc
        # (pylint doesn't recognize that this is abstract)
        raise NotImplementedError("Refresh must be implemented")

    def apply(self, headers, token=None):
        """Apply the token to the authentication header.

        Args:
            headers (Mapping): The HTTP request headers.
            token (Optional[str]): If specified, overrides the current access
                token.
        """
        headers["authorization"] = "Bearer {}".format(
            _helpers.from_bytes(token or self.token)
        )
        if self.quota_project_id:
            headers["x-goog-user-project"] = self.quota_project_id

    def before_request(self, request, method, url, headers):
        """Performs credential-specific before request logic.

        Refreshes the credentials if necessary, then calls :meth:`apply` to
        apply the token to the authentication header.

        Args:
            request (google.auth.transport.Request): The object used to make
                HTTP requests.
            method (str): The request's HTTP method or the RPC method being
                invoked.
            url (str): The request's URI or the RPC service's URI.
            headers (Mapping): The request's headers.
        """
        # pylint: disable=unused-argument
        # (Subclasses may use these arguments to ascertain information about
        # the http request.)
        if not self.valid:
            self.refresh(request)
        self.apply(headers)

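Editor's note: the `expired` property above subtracts a refresh threshold from `expiry` so that tokens are reported expired slightly early. A minimal standalone sketch of that check follows; the helper name `is_token_expired` and the 3m45s threshold value (standing in for `_helpers.REFRESH_THRESHOLD`) are assumptions, not the library API.

```python
import datetime

# Assumption: this constant stands in for google.auth._helpers.REFRESH_THRESHOLD.
REFRESH_THRESHOLD = datetime.timedelta(minutes=3, seconds=45)

def is_token_expired(expiry, now):
    """Mirror of the skew check in Credentials.expired."""
    if expiry is None:
        # No expiry set: the token is assumed to never expire.
        return False
    # Err on the side of reporting expiration early to avoid a
    # 401-refresh-retry loop.
    return now >= expiry - REFRESH_THRESHOLD

now = datetime.datetime(2024, 1, 1, 12, 0, 0)
fresh = is_token_expired(now + datetime.timedelta(hours=1), now)          # False
almost_gone = is_token_expired(now + datetime.timedelta(minutes=2), now)  # True
```

A token expiring two minutes from now already falls inside the skew window, so it is treated as expired and refreshed before use.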
class CredentialsWithQuotaProject(Credentials):
    """Abstract base for credentials supporting ``with_quota_project`` factory."""

    def with_quota_project(self, quota_project_id):
        """Returns a copy of these credentials with a modified quota project.

        Args:
            quota_project_id (str): The project to use for quota and
                billing purposes.

        Returns:
            google.oauth2.credentials.Credentials: A new credentials instance.
        """
        raise NotImplementedError("This credential does not support quota project.")


class AnonymousCredentials(Credentials):
    """Credentials that do not provide any authentication information.

    These are useful in the case of services that support anonymous access or
    local service emulators that do not use credentials.
    """

    @property
    def expired(self):
        """Returns `False`, anonymous credentials never expire."""
        return False

    @property
    def valid(self):
        """Returns `True`, anonymous credentials are always valid."""
        return True

    def refresh(self, request):
        """Raises :class:`ValueError`, anonymous credentials cannot be
        refreshed."""
        raise ValueError("Anonymous credentials cannot be refreshed.")

    def apply(self, headers, token=None):
        """Anonymous credentials do nothing to the request.

        The optional ``token`` argument is not supported.

        Raises:
            ValueError: If a token was specified.
        """
        if token is not None:
            raise ValueError("Anonymous credentials don't support tokens.")

    def before_request(self, request, method, url, headers):
        """Anonymous credentials do nothing to the request."""

@six.add_metaclass(abc.ABCMeta)
class ReadOnlyScoped(object):
    """Interface for credentials whose scopes can be queried.

    OAuth 2.0-based credentials allow limiting access using scopes as described
    in `RFC6749 Section 3.3`_.
    If a credential class implements this interface then the credentials use
    scopes in their implementation.

    Some credentials require scopes in order to obtain a token. You can check
    if scoping is necessary with :attr:`requires_scopes`::

        if credentials.requires_scopes:
            # Scoping is required.
            credentials = credentials.with_scopes(scopes=['one', 'two'])

    Credentials that require scopes must either be constructed with scopes::

        credentials = SomeScopedCredentials(scopes=['one', 'two'])

    Or must copy an existing instance using :meth:`with_scopes`::

        scoped_credentials = credentials.with_scopes(scopes=['one', 'two'])

    Some credentials have scopes but do not allow or require scopes to be set;
    these credentials can be used as-is.

    .. _RFC6749 Section 3.3: https://tools.ietf.org/html/rfc6749#section-3.3
    """

    def __init__(self):
        super(ReadOnlyScoped, self).__init__()
        self._scopes = None
        self._default_scopes = None

    @property
    def scopes(self):
        """Sequence[str]: the credentials' current set of scopes."""
        return self._scopes

    @property
    def default_scopes(self):
        """Sequence[str]: the credentials' current set of default scopes."""
        return self._default_scopes

    @abc.abstractproperty
    def requires_scopes(self):
        """True if these credentials require scopes to obtain an access token."""
        return False

    def has_scopes(self, scopes):
        """Checks if the credentials have the given scopes.

        .. warning: This method is not guaranteed to be accurate if the
            credentials are :attr:`~Credentials.invalid`.

        Args:
            scopes (Sequence[str]): The list of scopes to check.

        Returns:
            bool: True if the credentials have the given scopes.
        """
        credential_scopes = (
            self._scopes if self._scopes is not None else self._default_scopes
        )
        return set(scopes).issubset(set(credential_scopes or []))

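Editor's note: `has_scopes` reduces to a set-subset test against whichever of the explicit scopes or the default scopes is active. A standalone sketch of that precedence rule (the free function and parameter names are illustrative, not the library API):

```python
def has_scopes(requested, scopes=None, default_scopes=None):
    """Subset check mirroring ReadOnlyScoped.has_scopes: explicit scopes
    take precedence over default scopes; None falls back to an empty set."""
    credential_scopes = scopes if scopes is not None else default_scopes
    return set(requested).issubset(set(credential_scopes or []))

# Requested scopes must all be present in the active scope set.
assert has_scopes(["a"], scopes=["a", "b"])
assert not has_scopes(["c"], scopes=["a", "b"])
# Explicit scopes win even when the default scopes would have matched.
assert not has_scopes(["c"], scopes=["a"], default_scopes=["c"])
```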
class Scoped(ReadOnlyScoped):
    """Interface for credentials whose scopes can be replaced while copying.

    OAuth 2.0-based credentials allow limiting access using scopes as described
    in `RFC6749 Section 3.3`_.
    If a credential class implements this interface then the credentials use
    scopes in their implementation.

    Some credentials require scopes in order to obtain a token. You can check
    if scoping is necessary with :attr:`requires_scopes`::

        if credentials.requires_scopes:
            # Scoping is required.
            credentials = credentials.with_scopes(scopes=['one', 'two'])

    Credentials that require scopes must either be constructed with scopes::

        credentials = SomeScopedCredentials(scopes=['one', 'two'])

    Or must copy an existing instance using :meth:`with_scopes`::

        scoped_credentials = credentials.with_scopes(scopes=['one', 'two'])

    Some credentials have scopes but do not allow or require scopes to be set;
    these credentials can be used as-is.

    .. _RFC6749 Section 3.3: https://tools.ietf.org/html/rfc6749#section-3.3
    """

    @abc.abstractmethod
    def with_scopes(self, scopes, default_scopes=None):
        """Create a copy of these credentials with the specified scopes.

        Args:
            scopes (Sequence[str]): The list of scopes to attach to the
                current credentials.

        Raises:
            NotImplementedError: If the credentials' scopes can not be changed.
                This can be avoided by checking :attr:`requires_scopes` before
                calling this method.
        """
        raise NotImplementedError("This class does not require scoping.")

def with_scopes_if_required(credentials, scopes, default_scopes=None):
    """Creates a copy of the credentials with scopes if scoping is required.

    This helper function is useful when you do not know (or care to know) the
    specific type of credentials you are using (such as when you use
    :func:`google.auth.default`). This function will call
    :meth:`Scoped.with_scopes` if the credentials are scoped credentials and if
    the credentials require scoping. Otherwise, it will return the credentials
    as-is.

    Args:
        credentials (google.auth.credentials.Credentials): The credentials to
            scope if necessary.
        scopes (Sequence[str]): The list of scopes to use.
        default_scopes (Sequence[str]): Default scopes passed by a
            Google client library. Use 'scopes' for user-defined scopes.

    Returns:
        google.auth.credentials.Credentials: Either a new set of scoped
        credentials, or the passed in credentials instance if no scoping
        was required.
    """
    if isinstance(credentials, Scoped) and credentials.requires_scopes:
        return credentials.with_scopes(scopes, default_scopes=default_scopes)
    else:
        return credentials

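Editor's note: `with_scopes_if_required` is a type-dispatch guard — it only calls `with_scopes` when the object both implements the `Scoped` interface and still reports `requires_scopes`; everything else passes through unchanged. The same pattern, sketched with stand-in classes (all names here are hypothetical):

```python
class PlainCreds:
    """Stand-in for credentials that do not support scoping."""

class ScopedCreds:
    """Stand-in for credentials implementing the Scoped interface."""

    def __init__(self, scopes=None):
        self.scopes = scopes

    @property
    def requires_scopes(self):
        # Scoping is needed only until scopes have been attached.
        return self.scopes is None

    def with_scopes(self, scopes):
        return ScopedCreds(scopes=list(scopes))

def with_scopes_if_required(creds, scopes):
    # Scope only credentials that support scoping *and* still need it.
    if isinstance(creds, ScopedCreds) and creds.requires_scopes:
        return creds.with_scopes(scopes)
    return creds

plain = PlainCreds()
assert with_scopes_if_required(plain, ["one"]) is plain   # passed through untouched
scoped = with_scopes_if_required(ScopedCreds(), ["one", "two"])
assert scoped.scopes == ["one", "two"]
```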
@six.add_metaclass(abc.ABCMeta)
class Signing(object):
    """Interface for credentials that can cryptographically sign messages."""

    @abc.abstractmethod
    def sign_bytes(self, message):
        """Signs the given message.

        Args:
            message (bytes): The message to sign.

        Returns:
            bytes: The message's cryptographic signature.
        """
        # pylint: disable=missing-raises-doc,redundant-returns-doc
        # (pylint doesn't recognize that this is abstract)
        raise NotImplementedError("Sign bytes must be implemented.")

    @abc.abstractproperty
    def signer_email(self):
        """Optional[str]: An email address that identifies the signer."""
        # pylint: disable=missing-raises-doc
        # (pylint doesn't recognize that this is abstract)
        raise NotImplementedError("Signer email must be implemented.")

    @abc.abstractproperty
    def signer(self):
        """google.auth.crypt.Signer: The signer used to sign bytes."""
        # pylint: disable=missing-raises-doc
        # (pylint doesn't recognize that this is abstract)
        raise NotImplementedError("Signer must be implemented.")
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/downscoped.py ADDED
@@ -0,0 +1,501 @@
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Downscoping with Credential Access Boundaries

This module provides the ability to downscope credentials using
`Downscoping with Credential Access Boundaries`_. This is useful to restrict the
Identity and Access Management (IAM) permissions that a short-lived credential
can use.

To downscope permissions of a source credential, a Credential Access Boundary
that specifies which resources the new credential can access, as well as
an upper bound on the permissions that are available on each resource, has to
be defined. A downscoped credential can then be instantiated using the source
credential and the Credential Access Boundary.

The common pattern of usage is to have a token broker with elevated access
generate these downscoped credentials from higher access source credentials and
pass the downscoped short-lived access tokens to a token consumer via some
secure authenticated channel for limited access to Google Cloud Storage
resources.

For example, a token broker can be set up on a server in a private network.
Various workloads (token consumers) in the same network will send authenticated
requests to that broker for downscoped tokens to access or modify specific
Google Cloud Storage buckets.

The broker will instantiate downscoped credentials instances that can be used to
generate short-lived downscoped access tokens that can be passed to the token
consumer. These downscoped access tokens can be injected by the consumer into
google.oauth2.Credentials and used to initialize a storage client instance to
access Google Cloud Storage resources with restricted access.

Note: Only Cloud Storage supports Credential Access Boundaries. Other Google
Cloud services do not support this feature.

.. _Downscoping with Credential Access Boundaries: https://cloud.google.com/iam/docs/downscoping-short-lived-credentials
"""

import datetime

import six

from google.auth import _helpers
from google.auth import credentials
from google.oauth2 import sts

# The maximum number of access boundary rules a Credential Access Boundary can
# contain.
_MAX_ACCESS_BOUNDARY_RULES_COUNT = 10
# The token exchange grant_type used for exchanging credentials.
_STS_GRANT_TYPE = "urn:ietf:params:oauth:grant-type:token-exchange"
# The token exchange requested_token_type. This is always an access_token.
_STS_REQUESTED_TOKEN_TYPE = "urn:ietf:params:oauth:token-type:access_token"
# The STS token URL used to exchange a short-lived access token for a downscoped one.
_STS_TOKEN_URL = "https://sts.googleapis.com/v1/token"
# The subject token type to use when exchanging a short-lived access token for a
# downscoped token.
_STS_SUBJECT_TOKEN_TYPE = "urn:ietf:params:oauth:token-type:access_token"

class CredentialAccessBoundary(object):
    """Defines a Credential Access Boundary which contains a list of access boundary
    rules. Each rule contains information on the resource that the rule applies to,
    the upper bound of the permissions that are available on that resource and an
    optional condition to further restrict permissions.
    """

    def __init__(self, rules=[]):
        """Instantiates a Credential Access Boundary. A Credential Access Boundary
        can contain up to 10 access boundary rules.

        Args:
            rules (Sequence[google.auth.downscoped.AccessBoundaryRule]): The list of
                access boundary rules limiting the access that a downscoped credential
                will have.
        Raises:
            TypeError: If any of the rules are not a valid type.
            ValueError: If the provided rules exceed the maximum allowed.
        """
        self.rules = rules

    @property
    def rules(self):
        """Returns the list of access boundary rules defined on the Credential
        Access Boundary.

        Returns:
            Tuple[google.auth.downscoped.AccessBoundaryRule, ...]: The list of access
            boundary rules defined on the Credential Access Boundary. These are
            returned as an immutable tuple to prevent modification.
        """
        return tuple(self._rules)

    @rules.setter
    def rules(self, value):
        """Updates the current rules on the Credential Access Boundary. This will
        overwrite the existing set of rules.

        Args:
            value (Sequence[google.auth.downscoped.AccessBoundaryRule]): The list of
                access boundary rules limiting the access that a downscoped credential
                will have.
        Raises:
            TypeError: If any of the rules are not a valid type.
            ValueError: If the provided rules exceed the maximum allowed.
        """
        if len(value) > _MAX_ACCESS_BOUNDARY_RULES_COUNT:
            raise ValueError(
                "Credential access boundary rules can have a maximum of {} rules.".format(
                    _MAX_ACCESS_BOUNDARY_RULES_COUNT
                )
            )
        for access_boundary_rule in value:
            if not isinstance(access_boundary_rule, AccessBoundaryRule):
                raise TypeError(
                    "List of rules provided do not contain a valid 'google.auth.downscoped.AccessBoundaryRule'."
                )
        # Make a copy of the original list.
        self._rules = list(value)

    def add_rule(self, rule):
        """Adds a single access boundary rule to the existing rules.

        Args:
            rule (google.auth.downscoped.AccessBoundaryRule): The access boundary rule,
                limiting the access that a downscoped credential will have, to be added
                to the existing rules.
        Raises:
            TypeError: If the rule is not a valid type.
            ValueError: If adding the rule would exceed the maximum allowed.
        """
        if len(self.rules) == _MAX_ACCESS_BOUNDARY_RULES_COUNT:
            raise ValueError(
                "Credential access boundary rules can have a maximum of {} rules.".format(
                    _MAX_ACCESS_BOUNDARY_RULES_COUNT
                )
            )
        if not isinstance(rule, AccessBoundaryRule):
            raise TypeError(
                "The provided rule does not contain a valid 'google.auth.downscoped.AccessBoundaryRule'."
            )
        self._rules.append(rule)

    def to_json(self):
        """Generates the dictionary representation of the Credential Access Boundary.
        This uses the format expected by the Security Token Service API as documented
        in `Defining a Credential Access Boundary`_.

        .. _Defining a Credential Access Boundary:
            https://cloud.google.com/iam/docs/downscoping-short-lived-credentials#define-boundary

        Returns:
            Mapping: The Credential Access Boundary represented in a dictionary object.
        """
        rules = []
        for access_boundary_rule in self.rules:
            rules.append(access_boundary_rule.to_json())

        return {"accessBoundary": {"accessBoundaryRules": rules}}

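Editor's note: chained together, `CredentialAccessBoundary.to_json` and the rule/condition `to_json` methods emit the `accessBoundary` payload documented for the STS endpoint. A standalone sketch of that payload shape for one bucket-scoped rule — the function name, bucket, and role are examples, and this does not use the library itself:

```python
def make_access_boundary(bucket, role, expression=None):
    """Build an accessBoundary payload in the shape produced by to_json()."""
    rule = {
        "availableResource": "//storage.googleapis.com/projects/_/buckets/" + bucket,
        # Permissions are upper-bounded by IAM roles, prefixed with "inRole:".
        "availablePermissions": ["inRole:" + role],
    }
    if expression is not None:
        # Optional condition restricting which objects the permissions cover.
        rule["availabilityCondition"] = {"expression": expression}
    return {"accessBoundary": {"accessBoundaryRules": [rule]}}

payload = make_access_boundary(
    "example-bucket",
    "roles/storage.objectViewer",
    expression="resource.name.startsWith('projects/_/buckets/example-bucket/objects/customer-a')",
)
```

The resulting dictionary is what `Credentials.refresh` passes to the STS client as `additional_options`.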
class AccessBoundaryRule(object):
    """Defines an access boundary rule which contains information on the resource that
    the rule applies to, the upper bound of the permissions that are available on that
    resource and an optional condition to further restrict permissions.
    """

    def __init__(
        self, available_resource, available_permissions, availability_condition=None
    ):
        """Instantiates a single access boundary rule.

        Args:
            available_resource (str): The full resource name of the Cloud Storage bucket
                that the rule applies to. Use the format
                "//storage.googleapis.com/projects/_/buckets/bucket-name".
            available_permissions (Sequence[str]): A list defining the upper bound that
                the downscoped token will have on the available permissions for the
                resource. Each value is the identifier for an IAM predefined role or
                custom role, with the prefix "inRole:". For example:
                "inRole:roles/storage.objectViewer".
                Only the permissions in these roles will be available.
            availability_condition (Optional[google.auth.downscoped.AvailabilityCondition]):
                Optional condition that restricts the availability of permissions to
                specific Cloud Storage objects.

        Raises:
            TypeError: If any of the parameters are not of the expected types.
            ValueError: If any of the parameters are not of the expected values.
        """
        self.available_resource = available_resource
        self.available_permissions = available_permissions
        self.availability_condition = availability_condition

    @property
    def available_resource(self):
        """Returns the current available resource.

        Returns:
            str: The current available resource.
        """
        return self._available_resource

    @available_resource.setter
    def available_resource(self, value):
        """Updates the current available resource.

        Args:
            value (str): The updated value of the available resource.

        Raises:
            TypeError: If the value is not a string.
        """
        if not isinstance(value, six.string_types):
            raise TypeError("The provided available_resource is not a string.")
        self._available_resource = value

    @property
    def available_permissions(self):
        """Returns the current available permissions.

        Returns:
            Tuple[str, ...]: The current available permissions. These are returned
            as an immutable tuple to prevent modification.
        """
        return tuple(self._available_permissions)

    @available_permissions.setter
    def available_permissions(self, value):
        """Updates the current available permissions.

        Args:
            value (Sequence[str]): The updated value of the available permissions.

        Raises:
            TypeError: If the value is not a list of strings.
            ValueError: If the value is not valid.
        """
        for available_permission in value:
            if not isinstance(available_permission, six.string_types):
                raise TypeError(
                    "Provided available_permissions are not a list of strings."
                )
            if available_permission.find("inRole:") != 0:
                raise ValueError(
                    "available_permissions must be prefixed with 'inRole:'."
                )
        # Make a copy of the original list.
        self._available_permissions = list(value)

    @property
    def availability_condition(self):
        """Returns the current availability condition.

        Returns:
            Optional[google.auth.downscoped.AvailabilityCondition]: The current
            availability condition.
        """
        return self._availability_condition

    @availability_condition.setter
    def availability_condition(self, value):
        """Updates the current availability condition.

        Args:
            value (Optional[google.auth.downscoped.AvailabilityCondition]): The updated
                value of the availability condition.

        Raises:
            TypeError: If the value is not of type google.auth.downscoped.AvailabilityCondition
                or None.
        """
        if not isinstance(value, AvailabilityCondition) and value is not None:
            raise TypeError(
                "The provided availability_condition is not a 'google.auth.downscoped.AvailabilityCondition' or None."
            )
        self._availability_condition = value

    def to_json(self):
        """Generates the dictionary representation of the access boundary rule.
        This uses the format expected by the Security Token Service API as documented
        in `Defining a Credential Access Boundary`_.

        .. _Defining a Credential Access Boundary:
            https://cloud.google.com/iam/docs/downscoping-short-lived-credentials#define-boundary

        Returns:
            Mapping: The access boundary rule represented in a dictionary object.
        """
        json = {
            "availablePermissions": list(self.available_permissions),
            "availableResource": self.available_resource,
        }
        if self.availability_condition:
            json["availabilityCondition"] = self.availability_condition.to_json()
        return json

class AvailabilityCondition(object):
    """An optional condition that can be used as part of a Credential Access Boundary
    to further restrict permissions."""

    def __init__(self, expression, title=None, description=None):
        """Instantiates an availability condition using the provided expression and
        optional title or description.

        Args:
            expression (str): A condition expression that specifies the Cloud Storage
                objects where permissions are available. For example, this expression
                makes permissions available for objects whose name starts with "customer-a":
                "resource.name.startsWith('projects/_/buckets/example-bucket/objects/customer-a')"
            title (Optional[str]): An optional short string that identifies the purpose of
                the condition.
            description (Optional[str]): Optional details about the purpose of the condition.

        Raises:
            TypeError: If any of the parameters are not of the expected types.
            ValueError: If any of the parameters are not of the expected values.
        """
        self.expression = expression
        self.title = title
        self.description = description

    @property
    def expression(self):
        """Returns the current condition expression.

        Returns:
            str: The current condition expression.
        """
        return self._expression

    @expression.setter
    def expression(self, value):
        """Updates the current condition expression.

        Args:
            value (str): The updated value of the condition expression.

        Raises:
            TypeError: If the value is not of type string.
        """
        if not isinstance(value, six.string_types):
            raise TypeError("The provided expression is not a string.")
        self._expression = value

    @property
    def title(self):
        """Returns the current title.

        Returns:
            Optional[str]: The current title.
        """
        return self._title

    @title.setter
    def title(self, value):
        """Updates the current title.

        Args:
            value (Optional[str]): The updated value of the title.

        Raises:
            TypeError: If the value is not of type string or None.
        """
        if not isinstance(value, six.string_types) and value is not None:
            raise TypeError("The provided title is not a string or None.")
        self._title = value

    @property
    def description(self):
        """Returns the current description.

        Returns:
            Optional[str]: The current description.
        """
        return self._description

    @description.setter
    def description(self, value):
        """Updates the current description.

        Args:
            value (Optional[str]): The updated value of the description.

        Raises:
            TypeError: If the value is not of type string or None.
        """
        if not isinstance(value, six.string_types) and value is not None:
            raise TypeError("The provided description is not a string or None.")
        self._description = value

    def to_json(self):
        """Generates the dictionary representation of the availability condition.
        This uses the format expected by the Security Token Service API as documented
        in `Defining a Credential Access Boundary`_.

        .. _Defining a Credential Access Boundary:
            https://cloud.google.com/iam/docs/downscoping-short-lived-credentials#define-boundary

        Returns:
            Mapping[str, str]: The availability condition represented in a dictionary
            object.
        """
        json = {"expression": self.expression}
        if self.title:
            json["title"] = self.title
        if self.description:
            json["description"] = self.description
        return json

class Credentials(credentials.CredentialsWithQuotaProject):
    """Defines a set of Google credentials that are downscoped from an existing set
    of Google OAuth2 credentials. This is useful to restrict the Identity and Access
    Management (IAM) permissions that a short-lived credential can use.
    The common pattern of usage is to have a token broker with elevated access
    generate these downscoped credentials from higher access source credentials and
    pass the downscoped short-lived access tokens to a token consumer via some
    secure authenticated channel for limited access to Google Cloud Storage
    resources.
    """

    def __init__(
        self, source_credentials, credential_access_boundary, quota_project_id=None
    ):
        """Instantiates a downscoped credentials object using the provided source
        credentials and credential access boundary rules.
        To downscope permissions of a source credential, a Credential Access Boundary
        that specifies which resources the new credential can access, as well as an
        upper bound on the permissions that are available on each resource, has to be
        defined. A downscoped credential can then be instantiated using the source
        credential and the Credential Access Boundary.

        Args:
            source_credentials (google.auth.credentials.Credentials): The source
                credentials to be downscoped based on the provided Credential Access
                Boundary rules.
            credential_access_boundary (google.auth.downscoped.CredentialAccessBoundary):
                The Credential Access Boundary which contains a list of access boundary
                rules. Each rule contains information on the resource that the rule
                applies to, the upper bound of the permissions that are available on
                that resource and an optional condition to further restrict permissions.
            quota_project_id (Optional[str]): The optional quota project ID.
        Raises:
            google.auth.exceptions.RefreshError: If the source credentials
                return an error on token refresh.
            google.auth.exceptions.OAuthError: If the STS token exchange
                endpoint returned an error during downscoped token generation.
        """

        super(Credentials, self).__init__()
        self._source_credentials = source_credentials
        self._credential_access_boundary = credential_access_boundary
        self._quota_project_id = quota_project_id
        self._sts_client = sts.Client(_STS_TOKEN_URL)

    @_helpers.copy_docstring(credentials.Credentials)
    def refresh(self, request):
        # Generate an access token from the source credentials.
        self._source_credentials.refresh(request)
        now = _helpers.utcnow()
        # Exchange the access token for a downscoped access token.
        response_data = self._sts_client.exchange_token(
            request=request,
            grant_type=_STS_GRANT_TYPE,
            subject_token=self._source_credentials.token,
            subject_token_type=_STS_SUBJECT_TOKEN_TYPE,
            requested_token_type=_STS_REQUESTED_TOKEN_TYPE,
            additional_options=self._credential_access_boundary.to_json(),
        )
        self.token = response_data.get("access_token")
        # For the downscoping CAB flow, the STS endpoint may not return the
        # expiration field for some flows. The generated downscoped token
        # should always have the same expiration time as the source
        # credentials. When no expires_in field is returned in the response,
        # we can just get the expiration time from the source credentials.
        if response_data.get("expires_in"):
            lifetime = datetime.timedelta(seconds=response_data.get("expires_in"))
            self.expiry = now + lifetime
        else:
            self.expiry = self._source_credentials.expiry

    @_helpers.copy_docstring(credentials.CredentialsWithQuotaProject)
    def with_quota_project(self, quota_project_id):
        return self.__class__(
            self._source_credentials,
            self._credential_access_boundary,
            quota_project_id=quota_project_id,
        )
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/environment_vars.py ADDED
@@ -0,0 +1,80 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Copyright 2016 Google LLC
2
+ #
3
+ # Licensed under the Apache License, Version 2.0 (the "License");
4
+ # you may not use this file except in compliance with the License.
5
+ # You may obtain a copy of the License at
6
+ #
7
+ # http://www.apache.org/licenses/LICENSE-2.0
8
+ #
9
+ # Unless required by applicable law or agreed to in writing, software
10
+ # distributed under the License is distributed on an "AS IS" BASIS,
11
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12
+ # See the License for the specific language governing permissions and
13
+ # limitations under the License.
14
+
15
+ """Environment variables used by :mod:`google.auth`."""
16
+
17
+
18
+ PROJECT = "GOOGLE_CLOUD_PROJECT"
19
+ """Environment variable defining default project.
20
+
21
+ This used by :func:`google.auth.default` to explicitly set a project ID. This
22
+ environment variable is also used by the Google Cloud Python Library.
23
+ """
24
+
25
+ LEGACY_PROJECT = "GCLOUD_PROJECT"
26
+ """Previously used environment variable defining the default project.
27
+
28
+ This environment variable is used instead of the current one in some
29
+ situations (such as Google App Engine).
30
+ """
31
+
32
+ CREDENTIALS = "GOOGLE_APPLICATION_CREDENTIALS"
33
+ """Environment variable defining the location of Google application default
34
+ credentials."""
35
+
36
+ # The environment variable name which can replace ~/.config if set.
37
+ CLOUD_SDK_CONFIG_DIR = "CLOUDSDK_CONFIG"
38
+ """Environment variable defines the location of Google Cloud SDK's config
39
+ files."""
40
+
41
+ # These two variables allow for customization of the addresses used when
42
+ # contacting the GCE metadata service.
43
+ GCE_METADATA_HOST = "GCE_METADATA_HOST"
44
+ """Environment variable providing an alternate hostname or host:port to be
45
+ used for GCE metadata requests.
46
+
47
+ This environment variable was originally named GCE_METADATA_ROOT. The system will
48
+ check this environemnt variable first; should there be no value present,
49
+ the system will fall back to the old variable.
50
+ """
51
+
52
+ GCE_METADATA_ROOT = "GCE_METADATA_ROOT"
53
+ """Old environment variable for GCE_METADATA_HOST."""
54
+
55
+ GCE_METADATA_IP = "GCE_METADATA_IP"
56
+ """Environment variable providing an alternate ip:port to be used for ip-only
57
+ GCE metadata requests."""
58
+
59
+ GOOGLE_API_USE_CLIENT_CERTIFICATE = "GOOGLE_API_USE_CLIENT_CERTIFICATE"
60
+ """Environment variable controlling whether to use client certificate or not.
61
+
62
+ The default value is false. Users have to explicitly set this value to true
63
+ in order to use client certificate to establish a mutual TLS channel."""
64
+
65
+ LEGACY_APPENGINE_RUNTIME = "APPENGINE_RUNTIME"
66
+ """Gen1 environment variable defining the App Engine Runtime.
67
+
68
+ Used to distinguish between GAE gen1 and GAE gen2+.
69
+ """
70
+
71
+ # AWS environment variables used with AWS workload identity pools to retrieve
72
+ # AWS security credentials and the AWS region needed to create a serialized
73
+ # signed requests to the AWS STS GetCalledIdentity API that can be exchanged
74
+ # for a Google access tokens via the GCP STS endpoint.
75
+ # When not available the AWS metadata server is used to retrieve these values.
76
+ AWS_ACCESS_KEY_ID = "AWS_ACCESS_KEY_ID"
77
+ AWS_SECRET_ACCESS_KEY = "AWS_SECRET_ACCESS_KEY"
78
+ AWS_SESSION_TOKEN = "AWS_SESSION_TOKEN"
79
+ AWS_REGION = "AWS_REGION"
80
+ AWS_DEFAULT_REGION = "AWS_DEFAULT_REGION"
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/exceptions.py ADDED
@@ -0,0 +1,63 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Copyright 2016 Google LLC
2
+ #
3
+ # Licensed under the Apache License, Version 2.0 (the "License");
4
+ # you may not use this file except in compliance with the License.
5
+ # You may obtain a copy of the License at
6
+ #
7
+ # http://www.apache.org/licenses/LICENSE-2.0
8
+ #
9
+ # Unless required by applicable law or agreed to in writing, software
10
+ # distributed under the License is distributed on an "AS IS" BASIS,
11
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12
+ # See the License for the specific language governing permissions and
13
+ # limitations under the License.
14
+
15
+ """Exceptions used in the google.auth package."""
16
+
17
+
18
+ class GoogleAuthError(Exception):
19
+ """Base class for all google.auth errors."""
20
+
21
+
22
+ class TransportError(GoogleAuthError):
23
+ """Used to indicate an error occurred during an HTTP request."""
24
+
25
+
26
+ class RefreshError(GoogleAuthError):
27
+ """Used to indicate that an refreshing the credentials' access token
28
+ failed."""
29
+
30
+
31
+ class UserAccessTokenError(GoogleAuthError):
32
+ """Used to indicate ``gcloud auth print-access-token`` command failed."""
33
+
34
+
35
+ class DefaultCredentialsError(GoogleAuthError):
36
+ """Used to indicate that acquiring default credentials failed."""
37
+
38
+
39
+ class MutualTLSChannelError(GoogleAuthError):
40
+ """Used to indicate that mutual TLS channel creation is failed, or mutual
41
+ TLS channel credentials is missing or invalid."""
42
+
43
+
44
+ class ClientCertError(GoogleAuthError):
45
+ """Used to indicate that client certificate is missing or invalid."""
46
+
47
+
48
+ class OAuthError(GoogleAuthError):
49
+ """Used to indicate an error occurred during an OAuth related HTTP
50
+ request."""
51
+
52
+
53
+ class ReauthFailError(RefreshError):
54
+ """An exception for when reauth failed."""
55
+
56
+ def __init__(self, message=None):
57
+ super(ReauthFailError, self).__init__(
58
+ "Reauthentication failed. {0}".format(message)
59
+ )
60
+
61
+
62
+ class ReauthSamlChallengeFailError(ReauthFailError):
63
+ """An exception for SAML reauth challenge failures."""
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/external_account.py ADDED
@@ -0,0 +1,470 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Copyright 2020 Google LLC
2
+ #
3
+ # Licensed under the Apache License, Version 2.0 (the "License");
4
+ # you may not use this file except in compliance with the License.
5
+ # You may obtain a copy of the License at
6
+ #
7
+ # http://www.apache.org/licenses/LICENSE-2.0
8
+ #
9
+ # Unless required by applicable law or agreed to in writing, software
10
+ # distributed under the License is distributed on an "AS IS" BASIS,
11
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12
+ # See the License for the specific language governing permissions and
13
+ # limitations under the License.
14
+
15
+ """External Account Credentials.
16
+
17
+ This module provides credentials that exchange workload identity pool external
18
+ credentials for Google access tokens. This facilitates accessing Google Cloud
19
+ Platform resources from on-prem and non-Google Cloud platforms (e.g. AWS,
20
+ Microsoft Azure, OIDC identity providers), using native credentials retrieved
21
+ from the current environment without the need to copy, save and manage
22
+ long-lived service account credentials.
23
+
24
+ Specifically, this is intended to use access tokens acquired using the GCP STS
25
+ token exchange endpoint following the `OAuth 2.0 Token Exchange`_ spec.
26
+
27
+ .. _OAuth 2.0 Token Exchange: https://tools.ietf.org/html/rfc8693
28
+ """
29
+
30
+ import abc
31
+ import copy
32
+ import datetime
33
+ import json
34
+ import re
35
+
36
+ import six
37
+ from urllib3.util import parse_url
38
+
39
+ from google.auth import _helpers
40
+ from google.auth import credentials
41
+ from google.auth import exceptions
42
+ from google.auth import impersonated_credentials
43
+ from google.oauth2 import sts
44
+ from google.oauth2 import utils
45
+
46
+ # External account JSON type identifier.
47
+ _EXTERNAL_ACCOUNT_JSON_TYPE = "external_account"
48
+ # The token exchange grant_type used for exchanging credentials.
49
+ _STS_GRANT_TYPE = "urn:ietf:params:oauth:grant-type:token-exchange"
50
+ # The token exchange requested_token_type. This is always an access_token.
51
+ _STS_REQUESTED_TOKEN_TYPE = "urn:ietf:params:oauth:token-type:access_token"
52
+ # Cloud resource manager URL used to retrieve project information.
53
+ _CLOUD_RESOURCE_MANAGER = "https://cloudresourcemanager.googleapis.com/v1/projects/"
54
+
55
+
56
+ @six.add_metaclass(abc.ABCMeta)
57
+ class Credentials(credentials.Scoped, credentials.CredentialsWithQuotaProject):
58
+ """Base class for all external account credentials.
59
+
60
+ This is used to instantiate Credentials for exchanging external account
61
+ credentials for Google access token and authorizing requests to Google APIs.
62
+ The base class implements the common logic for exchanging external account
63
+ credentials for Google access tokens.
64
+ """
65
+
66
+ def __init__(
67
+ self,
68
+ audience,
69
+ subject_token_type,
70
+ token_url,
71
+ credential_source,
72
+ service_account_impersonation_url=None,
73
+ client_id=None,
74
+ client_secret=None,
75
+ quota_project_id=None,
76
+ scopes=None,
77
+ default_scopes=None,
78
+ workforce_pool_user_project=None,
79
+ ):
80
+ """Instantiates an external account credentials object.
81
+
82
+ Args:
83
+ audience (str): The STS audience field.
84
+ subject_token_type (str): The subject token type.
85
+ token_url (str): The STS endpoint URL.
86
+ credential_source (Mapping): The credential source dictionary.
87
+ service_account_impersonation_url (Optional[str]): The optional service account
88
+ impersonation generateAccessToken URL.
89
+ client_id (Optional[str]): The optional client ID.
90
+ client_secret (Optional[str]): The optional client secret.
91
+ quota_project_id (Optional[str]): The optional quota project ID.
92
+ scopes (Optional[Sequence[str]]): Optional scopes to request during the
93
+ authorization grant.
94
+ default_scopes (Optional[Sequence[str]]): Default scopes passed by a
95
+ Google client library. Use 'scopes' for user-defined scopes.
96
+ workforce_pool_user_project (Optona[str]): The optional workforce pool user
97
+ project number when the credential corresponds to a workforce pool and not
98
+ a workload identity pool. The underlying principal must still have
99
+ serviceusage.services.use IAM permission to use the project for
100
+ billing/quota.
101
+ Raises:
102
+ google.auth.exceptions.RefreshError: If the generateAccessToken
103
+ endpoint returned an error.
104
+ """
105
+ super(Credentials, self).__init__()
106
+ self._audience = audience
107
+ self._subject_token_type = subject_token_type
108
+ self._token_url = token_url
109
+ self._credential_source = credential_source
110
+ self._service_account_impersonation_url = service_account_impersonation_url
111
+ self._client_id = client_id
112
+ self._client_secret = client_secret
113
+ self._quota_project_id = quota_project_id
114
+ self._scopes = scopes
115
+ self._default_scopes = default_scopes
116
+ self._workforce_pool_user_project = workforce_pool_user_project
117
+
118
+ Credentials.validate_token_url(token_url)
119
+ if service_account_impersonation_url:
120
+ Credentials.validate_service_account_impersonation_url(
121
+ service_account_impersonation_url
122
+ )
123
+
124
+ if self._client_id:
125
+ self._client_auth = utils.ClientAuthentication(
126
+ utils.ClientAuthType.basic, self._client_id, self._client_secret
127
+ )
128
+ else:
129
+ self._client_auth = None
130
+ self._sts_client = sts.Client(self._token_url, self._client_auth)
131
+
132
+ if self._service_account_impersonation_url:
133
+ self._impersonated_credentials = self._initialize_impersonated_credentials()
134
+ else:
135
+ self._impersonated_credentials = None
136
+ self._project_id = None
137
+
138
+ if not self.is_workforce_pool and self._workforce_pool_user_project:
139
+ # Workload identity pools do not support workforce pool user projects.
140
+ raise ValueError(
141
+ "workforce_pool_user_project should not be set for non-workforce pool "
142
+ "credentials"
143
+ )
144
+
145
+ @property
146
+ def info(self):
147
+ """Generates the dictionary representation of the current credentials.
148
+
149
+ Returns:
150
+ Mapping: The dictionary representation of the credentials. This is the
151
+ reverse of "from_info" defined on the subclasses of this class. It is
152
+ useful for serializing the current credentials so it can deserialized
153
+ later.
154
+ """
155
+ config_info = {
156
+ "type": _EXTERNAL_ACCOUNT_JSON_TYPE,
157
+ "audience": self._audience,
158
+ "subject_token_type": self._subject_token_type,
159
+ "token_url": self._token_url,
160
+ "service_account_impersonation_url": self._service_account_impersonation_url,
161
+ "credential_source": copy.deepcopy(self._credential_source),
162
+ "quota_project_id": self._quota_project_id,
163
+ "client_id": self._client_id,
164
+ "client_secret": self._client_secret,
165
+ "workforce_pool_user_project": self._workforce_pool_user_project,
166
+ }
167
+ return {key: value for key, value in config_info.items() if value is not None}
168
+
169
+ @property
170
+ def service_account_email(self):
171
+ """Returns the service account email if service account impersonation is used.
172
+
173
+ Returns:
174
+ Optional[str]: The service account email if impersonation is used. Otherwise
175
+ None is returned.
176
+ """
177
+ if self._service_account_impersonation_url:
178
+ # Parse email from URL. The formal looks as follows:
179
+ # https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/name@project-id.iam.gserviceaccount.com:generateAccessToken
180
+ url = self._service_account_impersonation_url
181
+ start_index = url.rfind("/")
182
+ end_index = url.find(":generateAccessToken")
183
+ if start_index != -1 and end_index != -1 and start_index < end_index:
184
+ start_index = start_index + 1
185
+ return url[start_index:end_index]
186
+ return None
187
+
188
+ @property
189
+ def is_user(self):
190
+ """Returns whether the credentials represent a user (True) or workload (False).
191
+ Workloads behave similarly to service accounts. Currently workloads will use
192
+ service account impersonation but will eventually not require impersonation.
193
+ As a result, this property is more reliable than the service account email
194
+ property in determining if the credentials represent a user or workload.
195
+
196
+ Returns:
197
+ bool: True if the credentials represent a user. False if they represent a
198
+ workload.
199
+ """
200
+ # If service account impersonation is used, the credentials will always represent a
201
+ # service account.
202
+ if self._service_account_impersonation_url:
203
+ return False
204
+ return self.is_workforce_pool
205
+
206
+ @property
207
+ def is_workforce_pool(self):
208
+ """Returns whether the credentials represent a workforce pool (True) or
209
+ workload (False) based on the credentials' audience.
210
+
211
+ This will also return True for impersonated workforce pool credentials.
212
+
213
+ Returns:
214
+ bool: True if the credentials represent a workforce pool. False if they
215
+ represent a workload.
216
+ """
217
+ # Workforce pools representing users have the following audience format:
218
+ # //iam.googleapis.com/locations/$location/workforcePools/$poolId/providers/$providerId
219
+ p = re.compile(r"//iam\.googleapis\.com/locations/[^/]+/workforcePools/")
220
+ return p.match(self._audience or "") is not None
221
+
222
+ @property
223
+ def requires_scopes(self):
224
+ """Checks if the credentials requires scopes.
225
+
226
+ Returns:
227
+ bool: True if there are no scopes set otherwise False.
228
+ """
229
+ return not self._scopes and not self._default_scopes
230
+
231
+ @property
232
+ def project_number(self):
233
+ """Optional[str]: The project number corresponding to the workload identity pool."""
234
+
235
+ # STS audience pattern:
236
+ # //iam.googleapis.com/projects/$PROJECT_NUMBER/locations/...
237
+ components = self._audience.split("/")
238
+ try:
239
+ project_index = components.index("projects")
240
+ if project_index + 1 < len(components):
241
+ return components[project_index + 1] or None
242
+ except ValueError:
243
+ return None
244
+
245
+ @_helpers.copy_docstring(credentials.Scoped)
246
+ def with_scopes(self, scopes, default_scopes=None):
247
+ d = dict(
248
+ audience=self._audience,
249
+ subject_token_type=self._subject_token_type,
250
+ token_url=self._token_url,
251
+ credential_source=self._credential_source,
252
+ service_account_impersonation_url=self._service_account_impersonation_url,
253
+ client_id=self._client_id,
254
+ client_secret=self._client_secret,
255
+ quota_project_id=self._quota_project_id,
256
+ scopes=scopes,
257
+ default_scopes=default_scopes,
258
+ workforce_pool_user_project=self._workforce_pool_user_project,
259
+ )
260
+ if not self.is_workforce_pool:
261
+ d.pop("workforce_pool_user_project")
262
+ return self.__class__(**d)
263
+
264
+ @abc.abstractmethod
265
+ def retrieve_subject_token(self, request):
266
+ """Retrieves the subject token using the credential_source object.
267
+
268
+ Args:
269
+ request (google.auth.transport.Request): A callable used to make
270
+ HTTP requests.
271
+ Returns:
272
+ str: The retrieved subject token.
273
+ """
274
+ # pylint: disable=missing-raises-doc
275
+ # (pylint doesn't recognize that this is abstract)
276
+ raise NotImplementedError("retrieve_subject_token must be implemented")
277
+
278
+ def get_project_id(self, request):
279
+ """Retrieves the project ID corresponding to the workload identity or workforce pool.
280
+ For workforce pool credentials, it returns the project ID corresponding to
281
+ the workforce_pool_user_project.
282
+
283
+ When not determinable, None is returned.
284
+
285
+ This is introduced to support the current pattern of using the Auth library:
286
+
287
+ credentials, project_id = google.auth.default()
288
+
289
+ The resource may not have permission (resourcemanager.projects.get) to
290
+ call this API or the required scopes may not be selected:
291
+ https://cloud.google.com/resource-manager/reference/rest/v1/projects/get#authorization-scopes
292
+
293
+ Args:
294
+ request (google.auth.transport.Request): A callable used to make
295
+ HTTP requests.
296
+ Returns:
297
+ Optional[str]: The project ID corresponding to the workload identity pool
298
+ or workforce pool if determinable.
299
+ """
300
+ if self._project_id:
301
+ # If already retrieved, return the cached project ID value.
302
+ return self._project_id
303
+ scopes = self._scopes if self._scopes is not None else self._default_scopes
304
+ # Scopes are required in order to retrieve a valid access token.
305
+ project_number = self.project_number or self._workforce_pool_user_project
306
+ if project_number and scopes:
307
+ headers = {}
308
+ url = _CLOUD_RESOURCE_MANAGER + project_number
309
+ self.before_request(request, "GET", url, headers)
310
+ response = request(url=url, method="GET", headers=headers)
311
+
312
+ response_body = (
313
+ response.data.decode("utf-8")
314
+ if hasattr(response.data, "decode")
315
+ else response.data
316
+ )
317
+ response_data = json.loads(response_body)
318
+
319
+ if response.status == 200:
320
+ # Cache result as this field is immutable.
321
+ self._project_id = response_data.get("projectId")
322
+ return self._project_id
323
+
324
+ return None
325
+
326
+ @_helpers.copy_docstring(credentials.Credentials)
327
+ def refresh(self, request):
328
+ scopes = self._scopes if self._scopes is not None else self._default_scopes
329
+ if self._impersonated_credentials:
330
+ self._impersonated_credentials.refresh(request)
331
+ self.token = self._impersonated_credentials.token
332
+ self.expiry = self._impersonated_credentials.expiry
333
+ else:
334
+ now = _helpers.utcnow()
335
+ additional_options = None
336
+ # Do not pass workforce_pool_user_project when client authentication
337
+ # is used. The client ID is sufficient for determining the user project.
338
+ if self._workforce_pool_user_project and not self._client_id:
339
+ additional_options = {"userProject": self._workforce_pool_user_project}
340
+ response_data = self._sts_client.exchange_token(
341
+ request=request,
342
+ grant_type=_STS_GRANT_TYPE,
343
+ subject_token=self.retrieve_subject_token(request),
344
+ subject_token_type=self._subject_token_type,
345
+ audience=self._audience,
346
+ scopes=scopes,
347
+ requested_token_type=_STS_REQUESTED_TOKEN_TYPE,
348
+ additional_options=additional_options,
349
+ )
350
+ self.token = response_data.get("access_token")
351
+ lifetime = datetime.timedelta(seconds=response_data.get("expires_in"))
352
+ self.expiry = now + lifetime
353
+
354
+ @_helpers.copy_docstring(credentials.CredentialsWithQuotaProject)
355
+ def with_quota_project(self, quota_project_id):
356
+ # Return copy of instance with the provided quota project ID.
357
+ d = dict(
358
+ audience=self._audience,
359
+ subject_token_type=self._subject_token_type,
360
+ token_url=self._token_url,
361
+ credential_source=self._credential_source,
362
+ service_account_impersonation_url=self._service_account_impersonation_url,
363
+ client_id=self._client_id,
364
+ client_secret=self._client_secret,
365
+ quota_project_id=quota_project_id,
366
+ scopes=self._scopes,
367
+ default_scopes=self._default_scopes,
368
+ workforce_pool_user_project=self._workforce_pool_user_project,
369
+ )
370
+ if not self.is_workforce_pool:
371
+ d.pop("workforce_pool_user_project")
372
+ return self.__class__(**d)
373
+
374
+ def _initialize_impersonated_credentials(self):
375
+ """Generates an impersonated credentials.
376
+
377
+ For more details, see `projects.serviceAccounts.generateAccessToken`_.
378
+
379
+ .. _projects.serviceAccounts.generateAccessToken: https://cloud.google.com/iam/docs/reference/credentials/rest/v1/projects.serviceAccounts/generateAccessToken
380
+
381
+ Returns:
382
+ impersonated_credentials.Credential: The impersonated credentials
383
+ object.
384
+
385
+ Raises:
386
+ google.auth.exceptions.RefreshError: If the generateAccessToken
387
+ endpoint returned an error.
388
+ """
389
+ # Return copy of instance with no service account impersonation.
390
+ d = dict(
391
+ audience=self._audience,
392
+ subject_token_type=self._subject_token_type,
393
+ token_url=self._token_url,
394
+ credential_source=self._credential_source,
395
+ service_account_impersonation_url=None,
396
+ client_id=self._client_id,
397
+ client_secret=self._client_secret,
398
+ quota_project_id=self._quota_project_id,
399
+ scopes=self._scopes,
400
+ default_scopes=self._default_scopes,
401
+ workforce_pool_user_project=self._workforce_pool_user_project,
402
+ )
403
+ if not self.is_workforce_pool:
404
+ d.pop("workforce_pool_user_project")
405
+ source_credentials = self.__class__(**d)
406
+
407
+ # Determine target_principal.
408
+ target_principal = self.service_account_email
409
+ if not target_principal:
410
+ raise exceptions.RefreshError(
411
+ "Unable to determine target principal from service account impersonation URL."
412
+ )
413
+
414
+ scopes = self._scopes if self._scopes is not None else self._default_scopes
415
+ # Initialize and return impersonated credentials.
416
+ return impersonated_credentials.Credentials(
417
+ source_credentials=source_credentials,
418
+ target_principal=target_principal,
419
+ target_scopes=scopes,
420
+ quota_project_id=self._quota_project_id,
421
+ iam_endpoint_override=self._service_account_impersonation_url,
422
+ )
423
+
424
+ @staticmethod
425
+ def validate_token_url(token_url):
426
+ _TOKEN_URL_PATTERNS = [
427
+ "^[^\\.\\s\\/\\\\]+\\.sts\\.googleapis\\.com$",
428
+ "^sts\\.googleapis\\.com$",
429
+ "^sts\\.[^\\.\\s\\/\\\\]+\\.googleapis\\.com$",
430
+ "^[^\\.\\s\\/\\\\]+\\-sts\\.googleapis\\.com$",
431
+ ]
432
+
433
+ if not Credentials.is_valid_url(_TOKEN_URL_PATTERNS, token_url):
434
+ raise ValueError("The provided token URL is invalid.")
435
+
436
+ @staticmethod
437
+ def validate_service_account_impersonation_url(url):
438
+ _SERVICE_ACCOUNT_IMPERSONATION_URL_PATTERNS = [
439
+ "^[^\\.\\s\\/\\\\]+\\.iamcredentials\\.googleapis\\.com$",
440
+ "^iamcredentials\\.googleapis\\.com$",
441
+ "^iamcredentials\\.[^\\.\\s\\/\\\\]+\\.googleapis\\.com$",
442
+ "^[^\\.\\s\\/\\\\]+\\-iamcredentials\\.googleapis\\.com$",
443
+ ]
444
+
445
+ if not Credentials.is_valid_url(
446
+ _SERVICE_ACCOUNT_IMPERSONATION_URL_PATTERNS, url
447
+ ):
448
+ raise ValueError(
449
+ "The provided service account impersonation URL is invalid."
450
+ )
451
+
452
+ @staticmethod
453
+ def is_valid_url(patterns, url):
454
+ """
455
+ Returns True if the provided URL's scheme is HTTPS and the host comforms to at least one of the provided patterns.
456
+ """
457
+ # Check specifically for whitespcaces:
458
+ # Some python3.6 will parse the space character into %20 and pass the regex check which shouldn't be passed
459
+ if not url or len(str(url).split()) > 1:
460
+ return False
461
+
462
+ try:
463
+ uri = parse_url(url)
464
+ except Exception:
465
+ return False
466
+
467
+ if not uri.scheme or uri.scheme != "https" or not uri.hostname:
468
+ return False
469
+
470
+ return any(re.compile(p).match(uri.hostname.lower()) for p in patterns)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/iam.py ADDED
@@ -0,0 +1,100 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Copyright 2017 Google LLC
2
+ #
3
+ # Licensed under the Apache License, Version 2.0 (the "License");
4
+ # you may not use this file except in compliance with the License.
5
+ # You may obtain a copy of the License at
6
+ #
7
+ # http://www.apache.org/licenses/LICENSE-2.0
8
+ #
9
+ # Unless required by applicable law or agreed to in writing, software
10
+ # distributed under the License is distributed on an "AS IS" BASIS,
11
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12
+ # See the License for the specific language governing permissions and
13
+ # limitations under the License.
14
+
15
+ """Tools for using the Google `Cloud Identity and Access Management (IAM)
16
+ API`_'s auth-related functionality.
17
+
18
+ .. _Cloud Identity and Access Management (IAM) API:
19
+ https://cloud.google.com/iam/docs/
20
+ """
21
+
22
+ import base64
23
+ import json
24
+
25
+ from six.moves import http_client
26
+
27
+ from google.auth import _helpers
28
+ from google.auth import crypt
29
+ from google.auth import exceptions
30
+
31
+ _IAM_API_ROOT_URI = "https://iamcredentials.googleapis.com/v1"
32
+ _SIGN_BLOB_URI = _IAM_API_ROOT_URI + "/projects/-/serviceAccounts/{}:signBlob?alt=json"
33
+
34
+
35
+ class Signer(crypt.Signer):
36
+ """Signs messages using the IAM `signBlob API`_.
37
+
38
+ This is useful when you need to sign bytes but do not have access to the
39
+ credential's private key file.
40
+
41
+ .. _signBlob API:
42
+ https://cloud.google.com/iam/reference/rest/v1/projects.serviceAccounts
43
+ /signBlob
44
+ """
45
+
46
+ def __init__(self, request, credentials, service_account_email):
47
+ """
48
+ Args:
49
+ request (google.auth.transport.Request): The object used to make
50
+ HTTP requests.
51
+ credentials (google.auth.credentials.Credentials): The credentials
52
+ that will be used to authenticate the request to the IAM API.
53
+ The credentials must have of one the following scopes:
54
+
55
+ - https://www.googleapis.com/auth/iam
56
+ - https://www.googleapis.com/auth/cloud-platform
57
+ service_account_email (str): The service account email identifying
58
+ which service account to use to sign bytes. Often, this can
59
+ be the same as the service account email in the given
60
+ credentials.
61
+ """
62
+ self._request = request
63
+ self._credentials = credentials
64
+ self._service_account_email = service_account_email
65
+
66
+ def _make_signing_request(self, message):
67
+ """Makes a request to the API signBlob API."""
68
+ message = _helpers.to_bytes(message)
69
+
70
+ method = "POST"
71
+ url = _SIGN_BLOB_URI.format(self._service_account_email)
72
+ headers = {"Content-Type": "application/json"}
73
+ body = json.dumps(
74
+ {"payload": base64.b64encode(message).decode("utf-8")}
75
+ ).encode("utf-8")
76
+
77
+ self._credentials.before_request(self._request, method, url, headers)
78
+ response = self._request(url=url, method=method, body=body, headers=headers)
79
+
80
+ if response.status != http_client.OK:
81
+ raise exceptions.TransportError(
82
+ "Error calling the IAM signBlob API: {}".format(response.data)
83
+ )
84
+
85
+ return json.loads(response.data.decode("utf-8"))
86
+
87
+ @property
88
+ def key_id(self):
89
+ """Optional[str]: The key ID used to identify this private key.
90
+
91
+ .. warning::
92
+ This is always ``None``. The key ID used by IAM can not
93
+ be reliably determined ahead of time.
94
+ """
95
+ return None
96
+
97
+ @_helpers.copy_docstring(crypt.Signer)
98
+ def sign(self, message):
99
+ response = self._make_signing_request(message)
100
+ return base64.b64decode(response["signedBlob"])
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/identity_pool.py ADDED
@@ -0,0 +1,287 @@
+ # Copyright 2020 Google LLC
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ # http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+
+ """Identity Pool Credentials.
+
+ This module provides credentials to access Google Cloud resources from on-prem
+ or non-Google Cloud platforms which support external credentials (e.g. OIDC ID
+ tokens) retrieved from local file locations or local servers. This includes
+ Microsoft Azure and OIDC identity providers (e.g. K8s workloads registered with
+ Hub with Hub workload identity enabled).
+
+ These credentials are recommended over the use of service account credentials
+ in on-prem/non-Google Cloud platforms as they do not involve the management of
+ long-lived service account private keys.
+
+ Identity Pool Credentials are initialized using external_account
+ arguments which are typically loaded from an external credentials file or
+ an external credentials URL. Unlike other Credentials that can be initialized
+ with a list of explicit arguments, secrets or credentials, external account
+ clients use the environment and hints/guidelines provided by the
+ external_account JSON file to retrieve credentials and exchange them for Google
+ access tokens.
+ """
+
+ try:
+ from collections.abc import Mapping
+ # Python 2.7 compatibility
+ except ImportError: # pragma: NO COVER
+ from collections import Mapping
+ import io
+ import json
+ import os
+
+ from google.auth import _helpers
+ from google.auth import exceptions
+ from google.auth import external_account
+
+
+ class Credentials(external_account.Credentials):
+ """External account credentials sourced from files and URLs."""
+
+ def __init__(
+ self,
+ audience,
+ subject_token_type,
+ token_url,
+ credential_source,
+ service_account_impersonation_url=None,
+ client_id=None,
+ client_secret=None,
+ quota_project_id=None,
+ scopes=None,
+ default_scopes=None,
+ workforce_pool_user_project=None,
+ ):
+ """Instantiates an external account credentials object from a file/URL.
+
+ Args:
+ audience (str): The STS audience field.
+ subject_token_type (str): The subject token type.
+ token_url (str): The STS endpoint URL.
+ credential_source (Mapping): The credential source dictionary used to
+ provide instructions on how to retrieve the external credential to be
+ exchanged for Google access tokens.
+
+ Example credential_source for url-sourced credential::
+
+ {
+ "url": "http://www.example.com",
+ "format": {
+ "type": "json",
+ "subject_token_field_name": "access_token",
+ },
+ "headers": {"foo": "bar"},
+ }
+
+ Example credential_source for file-sourced credential::
+
+ {
+ "file": "/path/to/token/file.txt"
+ }
+
+ service_account_impersonation_url (Optional[str]): The optional service account
+ impersonation getAccessToken URL.
+ client_id (Optional[str]): The optional client ID.
+ client_secret (Optional[str]): The optional client secret.
+ quota_project_id (Optional[str]): The optional quota project ID.
+ scopes (Optional[Sequence[str]]): Optional scopes to request during the
+ authorization grant.
+ default_scopes (Optional[Sequence[str]]): Default scopes passed by a
+ Google client library. Use 'scopes' for user-defined scopes.
+ workforce_pool_user_project (Optional[str]): The optional workforce pool user
+ project number when the credential corresponds to a workforce pool and not
+ a workload identity pool. The underlying principal must still have
+ serviceusage.services.use IAM permission to use the project for
+ billing/quota.
+
+ Raises:
+ google.auth.exceptions.RefreshError: If an error is encountered during
+ access token retrieval logic.
+ ValueError: For invalid parameters.
+
+ .. note:: Typically one of the helper constructors
+ :meth:`from_file` or
+ :meth:`from_info` is used instead of calling the constructor directly.
+ """
+
+ super(Credentials, self).__init__(
+ audience=audience,
+ subject_token_type=subject_token_type,
+ token_url=token_url,
+ credential_source=credential_source,
+ service_account_impersonation_url=service_account_impersonation_url,
+ client_id=client_id,
+ client_secret=client_secret,
+ quota_project_id=quota_project_id,
+ scopes=scopes,
+ default_scopes=default_scopes,
+ workforce_pool_user_project=workforce_pool_user_project,
+ )
+ if not isinstance(credential_source, Mapping):
+ self._credential_source_file = None
+ self._credential_source_url = None
+ else:
+ self._credential_source_file = credential_source.get("file")
+ self._credential_source_url = credential_source.get("url")
+ self._credential_source_headers = credential_source.get("headers")
+ credential_source_format = credential_source.get("format", {})
+ # Get credential_source format type. When not provided, this
+ # defaults to text.
+ self._credential_source_format_type = (
+ credential_source_format.get("type") or "text"
+ )
+ # environment_id is only supported in AWS or dedicated future external
+ # account credentials.
+ if "environment_id" in credential_source:
+ raise ValueError(
+ "Invalid Identity Pool credential_source field 'environment_id'"
+ )
+ if self._credential_source_format_type not in ["text", "json"]:
+ raise ValueError(
+ "Invalid credential_source format '{}'".format(
+ self._credential_source_format_type
+ )
+ )
+ # For JSON types, get the required subject_token field name.
+ if self._credential_source_format_type == "json":
+ self._credential_source_field_name = credential_source_format.get(
+ "subject_token_field_name"
+ )
+ if self._credential_source_field_name is None:
+ raise ValueError(
+ "Missing subject_token_field_name for JSON credential_source format"
+ )
+ else:
+ self._credential_source_field_name = None
+
+ if self._credential_source_file and self._credential_source_url:
+ raise ValueError(
+ "Ambiguous credential_source. 'file' is mutually exclusive with 'url'."
+ )
+ if not self._credential_source_file and not self._credential_source_url:
+ raise ValueError(
+ "Missing credential_source. A 'file' or 'url' must be provided."
+ )
+
+ @_helpers.copy_docstring(external_account.Credentials)
+ def retrieve_subject_token(self, request):
+ return self._parse_token_data(
+ self._get_token_data(request),
+ self._credential_source_format_type,
+ self._credential_source_field_name,
+ )
+
+ def _get_token_data(self, request):
+ if self._credential_source_file:
+ return self._get_file_data(self._credential_source_file)
+ else:
+ return self._get_url_data(
+ request, self._credential_source_url, self._credential_source_headers
+ )
+
+ def _get_file_data(self, filename):
+ if not os.path.exists(filename):
+ raise exceptions.RefreshError("File '{}' was not found.".format(filename))
+
+ with io.open(filename, "r", encoding="utf-8") as file_obj:
+ return file_obj.read(), filename
+
+ def _get_url_data(self, request, url, headers):
+ response = request(url=url, method="GET", headers=headers)
+
+ # support both string and bytes type response.data
+ response_body = (
+ response.data.decode("utf-8")
+ if hasattr(response.data, "decode")
+ else response.data
+ )
+
+ if response.status != 200:
+ raise exceptions.RefreshError(
+ "Unable to retrieve Identity Pool subject token", response_body
+ )
+
+ return response_body, url
+
+ def _parse_token_data(
+ self, token_content, format_type="text", subject_token_field_name=None
+ ):
+ content, filename = token_content
+ if format_type == "text":
+ token = content
+ else:
+ try:
+ # Parse file content as JSON.
+ response_data = json.loads(content)
+ # Get the subject_token.
+ token = response_data[subject_token_field_name]
+ except (KeyError, ValueError):
+ raise exceptions.RefreshError(
+ "Unable to parse subject_token from JSON file '{}' using key '{}'".format(
+ filename, subject_token_field_name
+ )
+ )
+ if not token:
+ raise exceptions.RefreshError(
+ "Missing subject_token in the credential_source file"
+ )
+ return token
+
+ @classmethod
+ def from_info(cls, info, **kwargs):
+ """Creates an Identity Pool Credentials instance from parsed external account info.
+
+ Args:
+ info (Mapping[str, str]): The Identity Pool external account info in Google
+ format.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ google.auth.identity_pool.Credentials: The constructed
+ credentials.
+
+ Raises:
+ ValueError: For invalid parameters.
+ """
+ return cls(
+ audience=info.get("audience"),
+ subject_token_type=info.get("subject_token_type"),
+ token_url=info.get("token_url"),
+ service_account_impersonation_url=info.get(
+ "service_account_impersonation_url"
+ ),
+ client_id=info.get("client_id"),
+ client_secret=info.get("client_secret"),
+ credential_source=info.get("credential_source"),
+ quota_project_id=info.get("quota_project_id"),
+ workforce_pool_user_project=info.get("workforce_pool_user_project"),
+ **kwargs
+ )
+
+ @classmethod
+ def from_file(cls, filename, **kwargs):
+ """Creates an IdentityPool Credentials instance from an external account json file.
+
+ Args:
+ filename (str): The path to the IdentityPool external account json file.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ google.auth.identity_pool.Credentials: The constructed
+ credentials.
+ """
+ with io.open(filename, "r", encoding="utf-8") as json_file:
+ data = json.load(json_file)
+ return cls.from_info(data, **kwargs)
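The `retrieve_subject_token` flow in this file reduces to: read the token data (from file or URL), then for `"text"` sources return the content verbatim and for `"json"` sources parse and pull out `subject_token_field_name`. A standalone stdlib sketch of that parsing branch, mirroring `_parse_token_data` (the function name here is illustrative, not part of google-auth):

```python
import json


def parse_subject_token(content, format_type="text", field_name=None):
    """Mirror of the text/json branching in identity_pool's _parse_token_data."""
    if format_type == "text":
        # Text sources: the file/response body *is* the subject token.
        token = content
    else:
        try:
            # JSON sources: the token sits under the configured field name.
            token = json.loads(content)[field_name]
        except (KeyError, ValueError):
            raise ValueError(
                "Unable to parse subject_token using key '{}'".format(field_name)
            )
    if not token:
        raise ValueError("Missing subject_token in the credential_source")
    return token


assert parse_subject_token("raw-oidc-token") == "raw-oidc-token"
assert parse_subject_token('{"access_token": "tok"}', "json", "access_token") == "tok"
```

The library version differs only in raising `google.auth.exceptions.RefreshError` and carrying the source filename/URL through for error messages.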
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/impersonated_credentials.py ADDED
@@ -0,0 +1,436 @@
+ # Copyright 2018 Google Inc.
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ # http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+
+ """Google Cloud Impersonated credentials.
+
+ This module provides authentication for applications where local credentials
+ impersonate a remote service account using `IAM Credentials API`_.
+
+ This class can be used to impersonate a service account as long as the original
+ Credential object has the "Service Account Token Creator" role on the target
+ service account.
+
+ .. _IAM Credentials API:
+ https://cloud.google.com/iam/credentials/reference/rest/
+ """
+
+ import base64
+ import copy
+ from datetime import datetime
+ import json
+
+ import six
+ from six.moves import http_client
+
+ from google.auth import _helpers
+ from google.auth import credentials
+ from google.auth import exceptions
+ from google.auth import jwt
+
+ _DEFAULT_TOKEN_LIFETIME_SECS = 3600 # 1 hour in seconds
+
+ _IAM_SCOPE = ["https://www.googleapis.com/auth/iam"]
+
+ _IAM_ENDPOINT = (
+ "https://iamcredentials.googleapis.com/v1/projects/-"
+ + "/serviceAccounts/{}:generateAccessToken"
+ )
+
+ _IAM_SIGN_ENDPOINT = (
+ "https://iamcredentials.googleapis.com/v1/projects/-"
+ + "/serviceAccounts/{}:signBlob"
+ )
+
+ _IAM_IDTOKEN_ENDPOINT = (
+ "https://iamcredentials.googleapis.com/v1/"
+ + "projects/-/serviceAccounts/{}:generateIdToken"
+ )
+
+ _REFRESH_ERROR = "Unable to acquire impersonated credentials"
+
+ _DEFAULT_TOKEN_URI = "https://oauth2.googleapis.com/token"
+
+
+ def _make_iam_token_request(
+ request, principal, headers, body, iam_endpoint_override=None
+ ):
+ """Makes a request to the Google Cloud IAM service for an access token.
+ Args:
+ request (Request): The Request object to use.
+ principal (str): The principal to request an access token for.
+ headers (Mapping[str, str]): Map of headers to transmit.
+ body (Mapping[str, str]): JSON Payload body for the iamcredentials
+ API call.
+ iam_endpoint_override (Optional[str]): The full IAM endpoint override
+ with the target_principal embedded. This is useful when supporting
+ impersonation with regional endpoints.
+
+ Raises:
+ google.auth.exceptions.TransportError: Raised if there is an underlying
+ HTTP connection error
+ google.auth.exceptions.RefreshError: Raised if the impersonated
+ credentials are not available. Common reasons are
+ `iamcredentials.googleapis.com` is not enabled or the
+ `Service Account Token Creator` is not assigned
+ """
+ iam_endpoint = iam_endpoint_override or _IAM_ENDPOINT.format(principal)
+
+ body = json.dumps(body).encode("utf-8")
+
+ response = request(url=iam_endpoint, method="POST", headers=headers, body=body)
+
+ # support both string and bytes type response.data
+ response_body = (
+ response.data.decode("utf-8")
+ if hasattr(response.data, "decode")
+ else response.data
+ )
+
+ if response.status != http_client.OK:
+ raise exceptions.RefreshError(_REFRESH_ERROR, response_body)
+
+ try:
+ token_response = json.loads(response_body)
+ token = token_response["accessToken"]
+ expiry = datetime.strptime(token_response["expireTime"], "%Y-%m-%dT%H:%M:%SZ")
+
+ return token, expiry
+
+ except (KeyError, ValueError) as caught_exc:
+ new_exc = exceptions.RefreshError(
+ "{}: No access token or invalid expiration in response.".format(
+ _REFRESH_ERROR
+ ),
+ response_body,
+ )
+ six.raise_from(new_exc, caught_exc)
+
+
+ class Credentials(
+ credentials.Scoped, credentials.CredentialsWithQuotaProject, credentials.Signing
+ ):
+ """This module defines impersonated credentials which are essentially
+ impersonated identities.
+
+ Impersonated Credentials allow credentials issued to a user or
+ service account to impersonate another. The target service account must
+ grant the originating credential principal the
+ `Service Account Token Creator`_ IAM role:
+
+ For more information about Token Creator IAM role and
+ IAMCredentials API, see
+ `Creating Short-Lived Service Account Credentials`_.
+
+ .. _Service Account Token Creator:
+ https://cloud.google.com/iam/docs/service-accounts#the_service_account_token_creator_role
+
+ .. _Creating Short-Lived Service Account Credentials:
+ https://cloud.google.com/iam/docs/creating-short-lived-service-account-credentials
+
+ Usage:
+
+ First grant source_credentials the `Service Account Token Creator`
+ role on the target account to impersonate. In this example, the
+ service account represented by svc_account.json has the
+ token creator role on
+ `impersonated-account@_project_.iam.gserviceaccount.com`.
+
+ Enable the IAMCredentials API on the source project:
+ `gcloud services enable iamcredentials.googleapis.com`.
+
+ Initialize a source credential which does not have access to
+ list buckets::
+
+ from google.oauth2 import service_account
+
+ target_scopes = [
+ 'https://www.googleapis.com/auth/devstorage.read_only']
+
+ source_credentials = (
+ service_account.Credentials.from_service_account_file(
+ '/path/to/svc_account.json',
+ scopes=target_scopes))
+
+ Now use the source credentials to acquire credentials to impersonate
+ another service account::
+
+ from google.auth import impersonated_credentials
+
+ target_credentials = impersonated_credentials.Credentials(
+ source_credentials=source_credentials,
+ target_principal='impersonated-account@_project_.iam.gserviceaccount.com',
+ target_scopes = target_scopes,
+ lifetime=500)
+
+ Resource access is granted::
+
+ client = storage.Client(credentials=target_credentials)
+ buckets = client.list_buckets(project='your_project')
+ for bucket in buckets:
+ print(bucket.name)
+ """
+
+ def __init__(
+ self,
+ source_credentials,
+ target_principal,
+ target_scopes,
+ delegates=None,
+ lifetime=_DEFAULT_TOKEN_LIFETIME_SECS,
+ quota_project_id=None,
+ iam_endpoint_override=None,
+ ):
+ """
+ Args:
+ source_credentials (google.auth.Credentials): The source credential
+ used to acquire the impersonated credentials.
+ target_principal (str): The service account to impersonate.
+ target_scopes (Sequence[str]): Scopes to request during the
+ authorization grant.
+ delegates (Sequence[str]): The chained list of delegates required
+ to grant the final access_token. If set, the sequence of
+ identities must have "Service Account Token Creator" capability
+ granted to the preceding identity. For example, if set to
+ [serviceAccountB, serviceAccountC], the source_credential
+ must have the Token Creator role on serviceAccountB.
+ serviceAccountB must have the Token Creator on
+ serviceAccountC.
+ Finally, C must have Token Creator on target_principal.
+ If left unset, source_credential must have that role on
+ target_principal.
+ lifetime (int): Number of seconds the delegated credential should
+ be valid for (up to 3600).
+ quota_project_id (Optional[str]): The project ID used for quota and billing.
+ This project may be different from the project used to
+ create the credentials.
+ iam_endpoint_override (Optional[str]): The full IAM endpoint override
+ with the target_principal embedded. This is useful when supporting
+ impersonation with regional endpoints.
+ """
+
+ super(Credentials, self).__init__()
+
+ self._source_credentials = copy.copy(source_credentials)
+ # Service account source credentials must have the _IAM_SCOPE
+ # added to refresh correctly. User credentials cannot have
+ # their original scopes modified.
+ if isinstance(self._source_credentials, credentials.Scoped):
+ self._source_credentials = self._source_credentials.with_scopes(_IAM_SCOPE)
+ self._target_principal = target_principal
+ self._target_scopes = target_scopes
+ self._delegates = delegates
+ self._lifetime = lifetime
+ self.token = None
+ self.expiry = _helpers.utcnow()
+ self._quota_project_id = quota_project_id
+ self._iam_endpoint_override = iam_endpoint_override
+
+ @_helpers.copy_docstring(credentials.Credentials)
+ def refresh(self, request):
+ self._update_token(request)
+
+ def _update_token(self, request):
+ """Updates credentials with a new access_token representing
+ the impersonated account.
+
+ Args:
+ request (google.auth.transport.requests.Request): Request object
+ to use for refreshing credentials.
+ """
+
+ # Refresh our source credentials if it is not valid.
+ if not self._source_credentials.valid:
+ self._source_credentials.refresh(request)
+
+ body = {
+ "delegates": self._delegates,
+ "scope": self._target_scopes,
+ "lifetime": str(self._lifetime) + "s",
+ }
+
+ headers = {"Content-Type": "application/json"}
+
+ # Apply the source credentials authentication info.
+ self._source_credentials.apply(headers)
+
+ self.token, self.expiry = _make_iam_token_request(
+ request=request,
+ principal=self._target_principal,
+ headers=headers,
+ body=body,
+ iam_endpoint_override=self._iam_endpoint_override,
+ )
+
+ def sign_bytes(self, message):
+ from google.auth.transport.requests import AuthorizedSession
+
+ iam_sign_endpoint = _IAM_SIGN_ENDPOINT.format(self._target_principal)
+
+ body = {
+ "payload": base64.b64encode(message).decode("utf-8"),
+ "delegates": self._delegates,
+ }
+
+ headers = {"Content-Type": "application/json"}
+
+ authed_session = AuthorizedSession(self._source_credentials)
+
+ response = authed_session.post(
+ url=iam_sign_endpoint, headers=headers, json=body
+ )
+
+ if response.status_code != http_client.OK:
+ raise exceptions.TransportError(
+ "Error calling sign_bytes: {}".format(response.json())
+ )
+
+ return base64.b64decode(response.json()["signedBlob"])
+
+ @property
+ def signer_email(self):
+ return self._target_principal
+
+ @property
+ def service_account_email(self):
+ return self._target_principal
+
+ @property
+ def signer(self):
+ return self
+
+ @property
+ def requires_scopes(self):
+ return not self._target_scopes
+
+ @_helpers.copy_docstring(credentials.CredentialsWithQuotaProject)
+ def with_quota_project(self, quota_project_id):
+ return self.__class__(
+ self._source_credentials,
+ target_principal=self._target_principal,
+ target_scopes=self._target_scopes,
+ delegates=self._delegates,
+ lifetime=self._lifetime,
+ quota_project_id=quota_project_id,
+ iam_endpoint_override=self._iam_endpoint_override,
+ )
+
+ @_helpers.copy_docstring(credentials.Scoped)
+ def with_scopes(self, scopes, default_scopes=None):
+ return self.__class__(
+ self._source_credentials,
+ target_principal=self._target_principal,
+ target_scopes=scopes or default_scopes,
+ delegates=self._delegates,
+ lifetime=self._lifetime,
+ quota_project_id=self._quota_project_id,
+ iam_endpoint_override=self._iam_endpoint_override,
+ )
+
+
+ class IDTokenCredentials(credentials.CredentialsWithQuotaProject):
+ """Open ID Connect ID Token-based service account credentials."""
+
+ def __init__(
+ self,
+ target_credentials,
+ target_audience=None,
+ include_email=False,
+ quota_project_id=None,
+ ):
+ """
+ Args:
+ target_credentials (google.auth.Credentials): The target
+ credential used to acquire the id tokens for.
+ target_audience (string): Audience to issue the token for.
+ include_email (bool): Include email in IdToken
+ quota_project_id (Optional[str]): The project ID used for
+ quota and billing.
+ """
+ super(IDTokenCredentials, self).__init__()
+
+ if not isinstance(target_credentials, Credentials):
+ raise exceptions.GoogleAuthError(
+ "Provided Credential must be impersonated_credentials"
+ )
+ self._target_credentials = target_credentials
+ self._target_audience = target_audience
+ self._include_email = include_email
+ self._quota_project_id = quota_project_id
+
+ def from_credentials(self, target_credentials, target_audience=None):
+ return self.__class__(
+ target_credentials=self._target_credentials,
+ target_audience=target_audience,
+ include_email=self._include_email,
+ quota_project_id=self._quota_project_id,
+ )
+
+ def with_target_audience(self, target_audience):
+ return self.__class__(
+ target_credentials=self._target_credentials,
+ target_audience=target_audience,
+ include_email=self._include_email,
+ quota_project_id=self._quota_project_id,
+ )
+
+ def with_include_email(self, include_email):
+ return self.__class__(
+ target_credentials=self._target_credentials,
+ target_audience=self._target_audience,
+ include_email=include_email,
+ quota_project_id=self._quota_project_id,
+ )
+
+ @_helpers.copy_docstring(credentials.CredentialsWithQuotaProject)
+ def with_quota_project(self, quota_project_id):
+ return self.__class__(
+ target_credentials=self._target_credentials,
+ target_audience=self._target_audience,
+ include_email=self._include_email,
+ quota_project_id=quota_project_id,
+ )
+
+ @_helpers.copy_docstring(credentials.Credentials)
+ def refresh(self, request):
+ from google.auth.transport.requests import AuthorizedSession
+
+ iam_sign_endpoint = _IAM_IDTOKEN_ENDPOINT.format(
+ self._target_credentials.signer_email
+ )
+
+ body = {
+ "audience": self._target_audience,
+ "delegates": self._target_credentials._delegates,
+ "includeEmail": self._include_email,
+ }
+
+ headers = {"Content-Type": "application/json"}
+
+ authed_session = AuthorizedSession(
+ self._target_credentials._source_credentials, auth_request=request
+ )
+
+ response = authed_session.post(
+ url=iam_sign_endpoint,
+ headers=headers,
+ data=json.dumps(body).encode("utf-8"),
+ )
+
+ id_token = response.json()["token"]
+ self.token = id_token
+ self.expiry = datetime.fromtimestamp(jwt.decode(id_token, verify=False)["exp"])
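`_make_iam_token_request` above parses the generateAccessToken response by reading `accessToken` and strptime-parsing `expireTime` with the `"%Y-%m-%dT%H:%M:%SZ"` format. A minimal sketch of just that parsing step, detached from the HTTP layer (the function name is illustrative, not part of google-auth):

```python
import json
from datetime import datetime


def parse_token_response(response_body):
    """Extract (token, expiry) the way _make_iam_token_request does,
    from a generateAccessToken JSON response body."""
    data = json.loads(response_body)
    token = data["accessToken"]
    # IAM Credentials returns a Zulu timestamp with whole seconds.
    expiry = datetime.strptime(data["expireTime"], "%Y-%m-%dT%H:%M:%SZ")
    return token, expiry


token, expiry = parse_token_response(
    '{"accessToken": "ya29.example", "expireTime": "2024-01-01T00:00:00Z"}'
)
assert token == "ya29.example"
assert expiry == datetime(2024, 1, 1)
```

The library wraps `KeyError`/`ValueError` from this step in a `RefreshError`, so a malformed response surfaces as a refresh failure rather than a raw parsing exception.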
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/auth/pluggable.py ADDED
@@ -0,0 +1,322 @@
+ # Copyright 2022 Google LLC
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ # http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+
+ """Pluggable Credentials.
+ Pluggable Credentials are initialized using external_account arguments which
+ are typically loaded from third-party executables. Unlike other
+ credentials that can be initialized with a list of explicit arguments, secrets
+ or credentials, external account clients use the environment and hints/guidelines
+ provided by the external_account JSON file to retrieve credentials and exchange
+ them for Google access tokens.
+
+ Example credential_source for pluggable credential:
+ {
+ "executable": {
+ "command": "/path/to/get/credentials.sh --arg1=value1 --arg2=value2",
+ "timeout_millis": 5000,
+ "output_file": "/path/to/generated/cached/credentials"
+ }
+ }
+ """
+
+ try:
+ from collections.abc import Mapping
+ # Python 2.7 compatibility
+ except ImportError: # pragma: NO COVER
+ from collections import Mapping
+ import io
+ import json
+ import os
+ import subprocess
+ import time
+
+ from google.auth import _helpers
+ from google.auth import exceptions
+ from google.auth import external_account
+
+ # The max supported executable spec version.
+ EXECUTABLE_SUPPORTED_MAX_VERSION = 1
+
+
+ class Credentials(external_account.Credentials):
+ """External account credentials sourced from executables."""
+
+ def __init__(
+ self,
+ audience,
+ subject_token_type,
+ token_url,
+ credential_source,
+ service_account_impersonation_url=None,
+ client_id=None,
+ client_secret=None,
+ quota_project_id=None,
+ scopes=None,
+ default_scopes=None,
+ workforce_pool_user_project=None,
+ ):
+ """Instantiates an external account credentials object from an executable.
+
+ Args:
+ audience (str): The STS audience field.
+ subject_token_type (str): The subject token type.
+ token_url (str): The STS endpoint URL.
+ credential_source (Mapping): The credential source dictionary used to
+ provide instructions on how to retrieve the external credential to be
+ exchanged for Google access tokens.
+
+ Example credential_source for pluggable credential:
+
+ {
+ "executable": {
+ "command": "/path/to/get/credentials.sh --arg1=value1 --arg2=value2",
+ "timeout_millis": 5000,
+ "output_file": "/path/to/generated/cached/credentials"
+ }
+ }
+
+ service_account_impersonation_url (Optional[str]): The optional service account
+ impersonation getAccessToken URL.
+ client_id (Optional[str]): The optional client ID.
+ client_secret (Optional[str]): The optional client secret.
+ quota_project_id (Optional[str]): The optional quota project ID.
+ scopes (Optional[Sequence[str]]): Optional scopes to request during the
+ authorization grant.
+ default_scopes (Optional[Sequence[str]]): Default scopes passed by a
+ Google client library. Use 'scopes' for user-defined scopes.
+ workforce_pool_user_project (Optional[str]): The optional workforce pool user
+ project number when the credential corresponds to a workforce pool and not
+ a workload identity pool. The underlying principal must still have
+ serviceusage.services.use IAM permission to use the project for
+ billing/quota.
+
+ Raises:
+ google.auth.exceptions.RefreshError: If an error is encountered during
+ access token retrieval logic.
+ ValueError: For invalid parameters.
+
+ .. note:: Typically one of the helper constructors
+ :meth:`from_file` or
+ :meth:`from_info` is used instead of calling the constructor directly.
+ """
+
+ super(Credentials, self).__init__(
+ audience=audience,
+ subject_token_type=subject_token_type,
+ token_url=token_url,
+ credential_source=credential_source,
+ service_account_impersonation_url=service_account_impersonation_url,
+ client_id=client_id,
+ client_secret=client_secret,
+ quota_project_id=quota_project_id,
+ scopes=scopes,
+ default_scopes=default_scopes,
+ workforce_pool_user_project=workforce_pool_user_project,
+ )
+ if not isinstance(credential_source, Mapping):
+ self._credential_source_executable = None
+ raise ValueError(
+ "Missing credential_source. The credential_source is not a dict."
+ )
+ self._credential_source_executable = credential_source.get("executable")
+ if not self._credential_source_executable:
+ raise ValueError(
+ "Missing credential_source. An 'executable' must be provided."
+ )
+ self._credential_source_executable_command = self._credential_source_executable.get(
+ "command"
+ )
+ self._credential_source_executable_timeout_millis = self._credential_source_executable.get(
+ "timeout_millis"
+ )
+ self._credential_source_executable_output_file = self._credential_source_executable.get(
+ "output_file"
+ )
+
+ if not self._credential_source_executable_command:
+ raise ValueError(
+ "Missing command field. Executable command must be provided."
+ )
+ if not self._credential_source_executable_timeout_millis:
+ self._credential_source_executable_timeout_millis = 30 * 1000
153
+ elif (
154
+ self._credential_source_executable_timeout_millis < 5 * 1000
155
+ or self._credential_source_executable_timeout_millis > 120 * 1000
156
+ ):
157
+ raise ValueError("Timeout must be between 5 and 120 seconds.")
158
+
159
+ @_helpers.copy_docstring(external_account.Credentials)
160
+ def retrieve_subject_token(self, request):
161
+ env_allow_executables = os.environ.get(
162
+ "GOOGLE_EXTERNAL_ACCOUNT_ALLOW_EXECUTABLES"
163
+ )
164
+ if env_allow_executables != "1":
165
+ raise ValueError(
166
+ "Executables need to be explicitly allowed (set GOOGLE_EXTERNAL_ACCOUNT_ALLOW_EXECUTABLES to '1') to run."
167
+ )
168
+
169
+ # Check output file.
170
+ if self._credential_source_executable_output_file is not None:
171
+ try:
172
+ with open(
173
+ self._credential_source_executable_output_file
174
+ ) as output_file:
175
+ response = json.load(output_file)
176
+ except Exception:
177
+ pass
178
+ else:
179
+ try:
180
+ # If the cached response is expired, _parse_subject_token will raise an error which will be ignored and we will call the executable again.
181
+ subject_token = self._parse_subject_token(response)
182
+ except ValueError:
183
+ raise
184
+ except exceptions.RefreshError:
185
+ pass
186
+ else:
187
+ return subject_token
188
+
189
+ if not _helpers.is_python_3():
190
+ raise exceptions.RefreshError(
191
+ "Pluggable auth is only supported for python 3.6+"
192
+ )
193
+
194
+ # Inject env vars.
195
+ env = os.environ.copy()
196
+ env["GOOGLE_EXTERNAL_ACCOUNT_AUDIENCE"] = self._audience
197
+ env["GOOGLE_EXTERNAL_ACCOUNT_TOKEN_TYPE"] = self._subject_token_type
198
+ env[
199
+ "GOOGLE_EXTERNAL_ACCOUNT_INTERACTIVE"
200
+ ] = "0" # Always set to 0 until interactive mode is implemented.
201
+ if self._service_account_impersonation_url is not None:
202
+ env[
203
+ "GOOGLE_EXTERNAL_ACCOUNT_IMPERSONATED_EMAIL"
204
+ ] = self.service_account_email
205
+ if self._credential_source_executable_output_file is not None:
206
+ env[
207
+ "GOOGLE_EXTERNAL_ACCOUNT_OUTPUT_FILE"
208
+ ] = self._credential_source_executable_output_file
209
+
210
+ try:
211
+ result = subprocess.run(
212
+ self._credential_source_executable_command.split(),
213
+ timeout=self._credential_source_executable_timeout_millis / 1000,
214
+ stdout=subprocess.PIPE,
215
+ stderr=subprocess.STDOUT,
216
+ env=env,
217
+ )
218
+ if result.returncode != 0:
219
+ raise exceptions.RefreshError(
220
+ "Executable exited with non-zero return code {}. Error: {}".format(
221
+ result.returncode, result.stdout
222
+ )
223
+ )
224
+ except Exception:
225
+ raise
226
+ else:
227
+ try:
228
+ data = result.stdout.decode("utf-8")
229
+ response = json.loads(data)
230
+ subject_token = self._parse_subject_token(response)
231
+ except Exception:
232
+ raise
233
+
234
+ return subject_token
235
+
236
+ @classmethod
237
+ def from_info(cls, info, **kwargs):
238
+ """Creates a Pluggable Credentials instance from parsed external account info.
239
+
240
+ Args:
241
+ info (Mapping[str, str]): The Pluggable external account info in Google
242
+ format.
243
+ kwargs: Additional arguments to pass to the constructor.
244
+
245
+ Returns:
246
+ google.auth.pluggable.Credentials: The constructed
247
+ credentials.
248
+
249
+ Raises:
250
+ ValueError: For invalid parameters.
251
+ """
252
+ return cls(
253
+ audience=info.get("audience"),
254
+ subject_token_type=info.get("subject_token_type"),
255
+ token_url=info.get("token_url"),
256
+ service_account_impersonation_url=info.get(
257
+ "service_account_impersonation_url"
258
+ ),
259
+ client_id=info.get("client_id"),
260
+ client_secret=info.get("client_secret"),
261
+ credential_source=info.get("credential_source"),
262
+ quota_project_id=info.get("quota_project_id"),
263
+ workforce_pool_user_project=info.get("workforce_pool_user_project"),
264
+ **kwargs
265
+ )
266
+
267
+ @classmethod
268
+ def from_file(cls, filename, **kwargs):
269
+ """Creates an Pluggable Credentials instance from an external account json file.
270
+
271
+ Args:
272
+ filename (str): The path to the Pluggable external account json file.
273
+ kwargs: Additional arguments to pass to the constructor.
274
+
275
+ Returns:
276
+ google.auth.pluggable.Credentials: The constructed
277
+ credentials.
278
+ """
279
+ with io.open(filename, "r", encoding="utf-8") as json_file:
280
+ data = json.load(json_file)
281
+ return cls.from_info(data, **kwargs)
282
+
283
+ def _parse_subject_token(self, response):
284
+ if "version" not in response:
285
+ raise ValueError("The executable response is missing the version field.")
286
+ if response["version"] > EXECUTABLE_SUPPORTED_MAX_VERSION:
287
+ raise exceptions.RefreshError(
288
+ "Executable returned unsupported version {}.".format(
289
+ response["version"]
290
+ )
291
+ )
292
+ if "success" not in response:
293
+ raise ValueError("The executable response is missing the success field.")
294
+ if not response["success"]:
295
+ if "code" not in response or "message" not in response:
296
+ raise ValueError(
297
+ "Error code and message fields are required in the response."
298
+ )
299
+ raise exceptions.RefreshError(
300
+ "Executable returned unsuccessful response: code: {}, message: {}.".format(
301
+ response["code"], response["message"]
302
+ )
303
+ )
304
+ if "expiration_time" not in response:
305
+ raise ValueError(
306
+ "The executable response is missing the expiration_time field."
307
+ )
308
+ if response["expiration_time"] < time.time():
309
+ raise exceptions.RefreshError(
310
+ "The token returned by the executable is expired."
311
+ )
312
+ if "token_type" not in response:
313
+ raise ValueError("The executable response is missing the token_type field.")
314
+ if (
315
+ response["token_type"] == "urn:ietf:params:oauth:token-type:jwt"
316
+ or response["token_type"] == "urn:ietf:params:oauth:token-type:id_token"
317
+ ): # OIDC
318
+ return response["id_token"]
319
+ elif response["token_type"] == "urn:ietf:params:oauth:token-type:saml2": # SAML
320
+ return response["saml_response"]
321
+ else:
322
+ raise exceptions.RefreshError("Executable returned unsupported token type.")
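
The validation that `Credentials._parse_subject_token` performs in the diff above can be sketched as a standalone function. This is an illustrative, slightly simplified stand-in (the missing-field `ValueError` checks are collapsed, and `RefreshError` below is a plain exception, not `google.auth.exceptions.RefreshError`); the field names and token-type URNs are taken from the source shown.

```python
import time

EXECUTABLE_SUPPORTED_MAX_VERSION = 1


class RefreshError(Exception):
    """Illustrative stand-in for google.auth.exceptions.RefreshError."""


def parse_subject_token(response):
    """Validate an executable response dict and return its subject token."""
    # The executable must report a spec version we support.
    if "version" not in response:
        raise ValueError("The executable response is missing the version field.")
    if response["version"] > EXECUTABLE_SUPPORTED_MAX_VERSION:
        raise RefreshError("Unsupported version {}.".format(response["version"]))
    # Unsuccessful runs surface the executable's own error code and message.
    if not response.get("success"):
        raise RefreshError(
            "code: {}, message: {}.".format(response.get("code"), response.get("message"))
        )
    # Cached responses are rejected once expired, forcing a fresh run.
    if response["expiration_time"] < time.time():
        raise RefreshError("The token returned by the executable is expired.")
    # OIDC tokens arrive in id_token; SAML assertions in saml_response.
    if response["token_type"] in (
        "urn:ietf:params:oauth:token-type:jwt",
        "urn:ietf:params:oauth:token-type:id_token",
    ):
        return response["id_token"]
    if response["token_type"] == "urn:ietf:params:oauth:token-type:saml2":
        return response["saml_response"]
    raise RefreshError("Executable returned unsupported token type.")


ok = {
    "version": 1,
    "success": True,
    "expiration_time": time.time() + 3600,
    "token_type": "urn:ietf:params:oauth:token-type:jwt",
    "id_token": "header.payload.signature",
}
print(parse_subject_token(ok))  # header.payload.signature
```

The real method additionally distinguishes malformed responses (`ValueError`) from unsuccessful or expired ones (`RefreshError`); that distinction is what lets `retrieve_subject_token` ignore a stale cached output file but still fail loudly on a corrupt one.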
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/__init__.py ADDED
@@ -0,0 +1,33 @@
+ # Protocol Buffers - Google's data interchange format
+ # Copyright 2008 Google Inc. All rights reserved.
+ # https://developers.google.com/protocol-buffers/
+ #
+ # Redistribution and use in source and binary forms, with or without
+ # modification, are permitted provided that the following conditions are
+ # met:
+ #
+ #     * Redistributions of source code must retain the above copyright
+ # notice, this list of conditions and the following disclaimer.
+ #     * Redistributions in binary form must reproduce the above
+ # copyright notice, this list of conditions and the following disclaimer
+ # in the documentation and/or other materials provided with the
+ # distribution.
+ #     * Neither the name of Google Inc. nor the names of its
+ # contributors may be used to endorse or promote products derived from
+ # this software without specific prior written permission.
+ #
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+ # Copyright 2007 Google Inc. All Rights Reserved.
+
+ __version__ = '3.20.1'
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/any_pb2.py ADDED
@@ -0,0 +1,26 @@
+ # -*- coding: utf-8 -*-
+ # Generated by the protocol buffer compiler. DO NOT EDIT!
+ # source: google/protobuf/any.proto
+ """Generated protocol buffer code."""
+ from google.protobuf.internal import builder as _builder
+ from google.protobuf import descriptor as _descriptor
+ from google.protobuf import descriptor_pool as _descriptor_pool
+ from google.protobuf import symbol_database as _symbol_database
+ # @@protoc_insertion_point(imports)
+
+ _sym_db = _symbol_database.Default()
+
+
+
+
+ DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x19google/protobuf/any.proto\x12\x0fgoogle.protobuf\"&\n\x03\x41ny\x12\x10\n\x08type_url\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\x0c\x42v\n\x13\x63om.google.protobufB\x08\x41nyProtoP\x01Z,google.golang.org/protobuf/types/known/anypb\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+ _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+ _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.any_pb2', globals())
+ if _descriptor._USE_C_DESCRIPTORS == False:
+
+   DESCRIPTOR._options = None
+   DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\010AnyProtoP\001Z,google.golang.org/protobuf/types/known/anypb\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+   _ANY._serialized_start=46
+   _ANY._serialized_end=84
+ # @@protoc_insertion_point(module_scope)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/api_pb2.py ADDED
@@ -0,0 +1,32 @@
+ # -*- coding: utf-8 -*-
+ # Generated by the protocol buffer compiler. DO NOT EDIT!
+ # source: google/protobuf/api.proto
+ """Generated protocol buffer code."""
+ from google.protobuf.internal import builder as _builder
+ from google.protobuf import descriptor as _descriptor
+ from google.protobuf import descriptor_pool as _descriptor_pool
+ from google.protobuf import symbol_database as _symbol_database
+ # @@protoc_insertion_point(imports)
+
+ _sym_db = _symbol_database.Default()
+
+
+ from google.protobuf import source_context_pb2 as google_dot_protobuf_dot_source__context__pb2
+ from google.protobuf import type_pb2 as google_dot_protobuf_dot_type__pb2
+
+
+ DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x19google/protobuf/api.proto\x12\x0fgoogle.protobuf\x1a$google/protobuf/source_context.proto\x1a\x1agoogle/protobuf/type.proto\"\x81\x02\n\x03\x41pi\x12\x0c\n\x04name\x18\x01 \x01(\t\x12(\n\x07methods\x18\x02 \x03(\x0b\x32\x17.google.protobuf.Method\x12(\n\x07options\x18\x03 \x03(\x0b\x32\x17.google.protobuf.Option\x12\x0f\n\x07version\x18\x04 \x01(\t\x12\x36\n\x0esource_context\x18\x05 \x01(\x0b\x32\x1e.google.protobuf.SourceContext\x12&\n\x06mixins\x18\x06 \x03(\x0b\x32\x16.google.protobuf.Mixin\x12\'\n\x06syntax\x18\x07 \x01(\x0e\x32\x17.google.protobuf.Syntax\"\xd5\x01\n\x06Method\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x18\n\x10request_type_url\x18\x02 \x01(\t\x12\x19\n\x11request_streaming\x18\x03 \x01(\x08\x12\x19\n\x11response_type_url\x18\x04 \x01(\t\x12\x1a\n\x12response_streaming\x18\x05 \x01(\x08\x12(\n\x07options\x18\x06 \x03(\x0b\x32\x17.google.protobuf.Option\x12\'\n\x06syntax\x18\x07 \x01(\x0e\x32\x17.google.protobuf.Syntax\"#\n\x05Mixin\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x0c\n\x04root\x18\x02 \x01(\tBv\n\x13\x63om.google.protobufB\x08\x41piProtoP\x01Z,google.golang.org/protobuf/types/known/apipb\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+ _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+ _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.api_pb2', globals())
+ if _descriptor._USE_C_DESCRIPTORS == False:
+
+   DESCRIPTOR._options = None
+   DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\010ApiProtoP\001Z,google.golang.org/protobuf/types/known/apipb\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+   _API._serialized_start=113
+   _API._serialized_end=370
+   _METHOD._serialized_start=373
+   _METHOD._serialized_end=586
+   _MIXIN._serialized_start=588
+   _MIXIN._serialized_end=623
+ # @@protoc_insertion_point(module_scope)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/descriptor_database.py ADDED
@@ -0,0 +1,177 @@
+ # Protocol Buffers - Google's data interchange format
+ # Copyright 2008 Google Inc. All rights reserved.
+ # https://developers.google.com/protocol-buffers/
+ #
+ # Redistribution and use in source and binary forms, with or without
+ # modification, are permitted provided that the following conditions are
+ # met:
+ #
+ #     * Redistributions of source code must retain the above copyright
+ # notice, this list of conditions and the following disclaimer.
+ #     * Redistributions in binary form must reproduce the above
+ # copyright notice, this list of conditions and the following disclaimer
+ # in the documentation and/or other materials provided with the
+ # distribution.
+ #     * Neither the name of Google Inc. nor the names of its
+ # contributors may be used to endorse or promote products derived from
+ # this software without specific prior written permission.
+ #
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+ """Provides a container for DescriptorProtos."""
+
+ __author__ = 'matthewtoia@google.com (Matt Toia)'
+
+ import warnings
+
+
+ class Error(Exception):
+   pass
+
+
+ class DescriptorDatabaseConflictingDefinitionError(Error):
+   """Raised when a proto is added with the same name & different descriptor."""
+
+
+ class DescriptorDatabase(object):
+   """A container accepting FileDescriptorProtos and maps DescriptorProtos."""
+
+   def __init__(self):
+     self._file_desc_protos_by_file = {}
+     self._file_desc_protos_by_symbol = {}
+
+   def Add(self, file_desc_proto):
+     """Adds the FileDescriptorProto and its types to this database.
+
+     Args:
+       file_desc_proto: The FileDescriptorProto to add.
+     Raises:
+       DescriptorDatabaseConflictingDefinitionError: if an attempt is made to
+         add a proto with the same name but different definition than an
+         existing proto in the database.
+     """
+     proto_name = file_desc_proto.name
+     if proto_name not in self._file_desc_protos_by_file:
+       self._file_desc_protos_by_file[proto_name] = file_desc_proto
+     elif self._file_desc_protos_by_file[proto_name] != file_desc_proto:
+       raise DescriptorDatabaseConflictingDefinitionError(
+           '%s already added, but with different descriptor.' % proto_name)
+     else:
+       return
+
+     # Add all the top-level descriptors to the index.
+     package = file_desc_proto.package
+     for message in file_desc_proto.message_type:
+       for name in _ExtractSymbols(message, package):
+         self._AddSymbol(name, file_desc_proto)
+     for enum in file_desc_proto.enum_type:
+       self._AddSymbol(('.'.join((package, enum.name))), file_desc_proto)
+       for enum_value in enum.value:
+         self._file_desc_protos_by_symbol[
+             '.'.join((package, enum_value.name))] = file_desc_proto
+     for extension in file_desc_proto.extension:
+       self._AddSymbol(('.'.join((package, extension.name))), file_desc_proto)
+     for service in file_desc_proto.service:
+       self._AddSymbol(('.'.join((package, service.name))), file_desc_proto)
+
+   def FindFileByName(self, name):
+     """Finds the file descriptor proto by file name.
+
+     Typically the file name is a relative path ending in a .proto file. The
+     proto with the given name will have to have been added to this database
+     using the Add method or else an error will be raised.
+
+     Args:
+       name: The file name to find.
+
+     Returns:
+       The file descriptor proto matching the name.
+
+     Raises:
+       KeyError if no file by the given name was added.
+     """
+
+     return self._file_desc_protos_by_file[name]
+
+   def FindFileContainingSymbol(self, symbol):
+     """Finds the file descriptor proto containing the specified symbol.
+
+     The symbol should be a fully qualified name including the file descriptor's
+     package and any containing messages. Some examples:
+
+     'some.package.name.Message'
+     'some.package.name.Message.NestedEnum'
+     'some.package.name.Message.some_field'
+
+     The file descriptor proto containing the specified symbol must be added to
+     this database using the Add method or else an error will be raised.
+
+     Args:
+       symbol: The fully qualified symbol name.
+
+     Returns:
+       The file descriptor proto containing the symbol.
+
+     Raises:
+       KeyError if no file contains the specified symbol.
+     """
+     try:
+       return self._file_desc_protos_by_symbol[symbol]
+     except KeyError:
+       # Fields, enum values, and nested extensions are not in
+       # _file_desc_protos_by_symbol. Try to find the top level
+       # descriptor. Non-existent nested symbol under a valid top level
+       # descriptor can also be found. The behavior is the same with
+       # protobuf C++.
+       top_level, _, _ = symbol.rpartition('.')
+       try:
+         return self._file_desc_protos_by_symbol[top_level]
+       except KeyError:
+         # Raise the original symbol as a KeyError for better diagnostics.
+         raise KeyError(symbol)
+
+   def FindFileContainingExtension(self, extendee_name, extension_number):
+     # TODO(jieluo): implement this API.
+     return None
+
+   def FindAllExtensionNumbers(self, extendee_name):
+     # TODO(jieluo): implement this API.
+     return []
+
+   def _AddSymbol(self, name, file_desc_proto):
+     if name in self._file_desc_protos_by_symbol:
+       warn_msg = ('Conflict register for file "' + file_desc_proto.name +
+                   '": ' + name +
+                   ' is already defined in file "' +
+                   self._file_desc_protos_by_symbol[name].name + '"')
+       warnings.warn(warn_msg, RuntimeWarning)
+     self._file_desc_protos_by_symbol[name] = file_desc_proto
+
+
+ def _ExtractSymbols(desc_proto, package):
+   """Pulls out all the symbols from a descriptor proto.
+
+   Args:
+     desc_proto: The proto to extract symbols from.
+     package: The package containing the descriptor type.
+
+   Yields:
+     The fully qualified name found in the descriptor.
+   """
+   message_name = package + '.' + desc_proto.name if package else desc_proto.name
+   yield message_name
+   for nested_type in desc_proto.nested_type:
+     for symbol in _ExtractSymbols(nested_type, message_name):
+       yield symbol
+   for enum_type in desc_proto.enum_type:
+     yield '.'.join((message_name, enum_type.name))
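
The recursive traversal in `_ExtractSymbols` above can be exercised without protobuf installed by substituting minimal stand-ins for the descriptor protos. `Msg` and `Enum` below are hypothetical namedtuples carrying only the fields the function walks (`name`, `nested_type`, `enum_type`); the traversal logic itself mirrors the source shown.

```python
from collections import namedtuple

# Minimal stand-ins for descriptor_pb2.DescriptorProto / EnumDescriptorProto,
# carrying only the fields the traversal reads.
Msg = namedtuple("Msg", ["name", "nested_type", "enum_type"])
Enum = namedtuple("Enum", ["name"])


def extract_symbols(desc, package):
    """Yield fully qualified names, mirroring _ExtractSymbols above."""
    message_name = package + "." + desc.name if package else desc.name
    yield message_name
    for nested in desc.nested_type:
        yield from extract_symbols(nested, message_name)
    for enum in desc.enum_type:
        yield ".".join((message_name, enum.name))


inner = Msg("Nested", [], [Enum("Kind")])
outer = Msg("Message", [inner], [])
print(list(extract_symbols(outer, "some.package")))
# ['some.package.Message', 'some.package.Message.Nested',
#  'some.package.Message.Nested.Kind']
```

Note that each message name becomes the package prefix for its own nested types, which is exactly why `DescriptorDatabase.Add` only needs to seed the recursion with the file-level package.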
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/descriptor_pb2.py ADDED
The diff for this file is too large to render. See raw diff
 
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/duration_pb2.py ADDED
@@ -0,0 +1,26 @@
+ # -*- coding: utf-8 -*-
+ # Generated by the protocol buffer compiler. DO NOT EDIT!
+ # source: google/protobuf/duration.proto
+ """Generated protocol buffer code."""
+ from google.protobuf.internal import builder as _builder
+ from google.protobuf import descriptor as _descriptor
+ from google.protobuf import descriptor_pool as _descriptor_pool
+ from google.protobuf import symbol_database as _symbol_database
+ # @@protoc_insertion_point(imports)
+
+ _sym_db = _symbol_database.Default()
+
+
+
+
+ DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x1egoogle/protobuf/duration.proto\x12\x0fgoogle.protobuf\"*\n\x08\x44uration\x12\x0f\n\x07seconds\x18\x01 \x01(\x03\x12\r\n\x05nanos\x18\x02 \x01(\x05\x42\x83\x01\n\x13\x63om.google.protobufB\rDurationProtoP\x01Z1google.golang.org/protobuf/types/known/durationpb\xf8\x01\x01\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+ _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+ _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.duration_pb2', globals())
+ if _descriptor._USE_C_DESCRIPTORS == False:
+
+   DESCRIPTOR._options = None
+   DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\rDurationProtoP\001Z1google.golang.org/protobuf/types/known/durationpb\370\001\001\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+   _DURATION._serialized_start=51
+   _DURATION._serialized_end=93
+ # @@protoc_insertion_point(module_scope)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/json_format.py ADDED
@@ -0,0 +1,912 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Protocol Buffers - Google's data interchange format
2
+ # Copyright 2008 Google Inc. All rights reserved.
3
+ # https://developers.google.com/protocol-buffers/
4
+ #
5
+ # Redistribution and use in source and binary forms, with or without
6
+ # modification, are permitted provided that the following conditions are
7
+ # met:
8
+ #
9
+ # * Redistributions of source code must retain the above copyright
10
+ # notice, this list of conditions and the following disclaimer.
11
+ # * Redistributions in binary form must reproduce the above
12
+ # copyright notice, this list of conditions and the following disclaimer
13
+ # in the documentation and/or other materials provided with the
14
+ # distribution.
15
+ # * Neither the name of Google Inc. nor the names of its
16
+ # contributors may be used to endorse or promote products derived from
17
+ # this software without specific prior written permission.
18
+ #
19
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
20
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
21
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
22
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
23
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
24
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
25
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
26
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
27
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
28
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
29
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
30
+
31
+ """Contains routines for printing protocol messages in JSON format.
32
+
33
+ Simple usage example:
34
+
35
+ # Create a proto object and serialize it to a json format string.
36
+ message = my_proto_pb2.MyMessage(foo='bar')
37
+ json_string = json_format.MessageToJson(message)
38
+
39
+ # Parse a json format string to proto object.
40
+ message = json_format.Parse(json_string, my_proto_pb2.MyMessage())
41
+ """
42
+
43
+ __author__ = 'jieluo@google.com (Jie Luo)'
44
+
45
+
46
+ import base64
47
+ from collections import OrderedDict
48
+ import json
49
+ import math
50
+ from operator import methodcaller
51
+ import re
52
+ import sys
53
+
54
+ from google.protobuf.internal import type_checkers
55
+ from google.protobuf import descriptor
56
+ from google.protobuf import symbol_database
57
+
58
+
59
+ _TIMESTAMPFOMAT = '%Y-%m-%dT%H:%M:%S'
60
+ _INT_TYPES = frozenset([descriptor.FieldDescriptor.CPPTYPE_INT32,
61
+ descriptor.FieldDescriptor.CPPTYPE_UINT32,
62
+ descriptor.FieldDescriptor.CPPTYPE_INT64,
63
+ descriptor.FieldDescriptor.CPPTYPE_UINT64])
64
+ _INT64_TYPES = frozenset([descriptor.FieldDescriptor.CPPTYPE_INT64,
65
+ descriptor.FieldDescriptor.CPPTYPE_UINT64])
66
+ _FLOAT_TYPES = frozenset([descriptor.FieldDescriptor.CPPTYPE_FLOAT,
67
+ descriptor.FieldDescriptor.CPPTYPE_DOUBLE])
68
+ _INFINITY = 'Infinity'
69
+ _NEG_INFINITY = '-Infinity'
70
+ _NAN = 'NaN'
71
+
72
+ _UNPAIRED_SURROGATE_PATTERN = re.compile(
73
+ u'[\ud800-\udbff](?![\udc00-\udfff])|(?<![\ud800-\udbff])[\udc00-\udfff]')
74
+
75
+ _VALID_EXTENSION_NAME = re.compile(r'\[[a-zA-Z0-9\._]*\]$')
76
+
77
+
78
+ class Error(Exception):
79
+ """Top-level module error for json_format."""
80
+
81
+
82
+ class SerializeToJsonError(Error):
83
+ """Thrown if serialization to JSON fails."""
84
+
85
+
86
+ class ParseError(Error):
87
+ """Thrown in case of parsing error."""
88
+
89
+
90
+ def MessageToJson(
91
+ message,
92
+ including_default_value_fields=False,
93
+ preserving_proto_field_name=False,
94
+ indent=2,
95
+ sort_keys=False,
96
+ use_integers_for_enums=False,
97
+ descriptor_pool=None,
98
+ float_precision=None,
99
+ ensure_ascii=True):
100
+ """Converts protobuf message to JSON format.
101
+
102
+ Args:
103
+ message: The protocol buffers message instance to serialize.
104
+ including_default_value_fields: If True, singular primitive fields,
105
+ repeated fields, and map fields will always be serialized. If
106
+ False, only serialize non-empty fields. Singular message fields
107
+ and oneof fields are not affected by this option.
108
+ preserving_proto_field_name: If True, use the original proto field
109
+ names as defined in the .proto file. If False, convert the field
110
+ names to lowerCamelCase.
111
+ indent: The JSON object will be pretty-printed with this indent level.
112
+ An indent level of 0 or negative will only insert newlines.
113
+ sort_keys: If True, then the output will be sorted by field names.
114
+ use_integers_for_enums: If true, print integers instead of enum names.
115
+ descriptor_pool: A Descriptor Pool for resolving types. If None use the
116
+ default.
117
+ float_precision: If set, use this to specify float field valid digits.
118
+ ensure_ascii: If True, strings with non-ASCII characters are escaped.
119
+ If False, Unicode strings are returned unchanged.
120
+
121
+ Returns:
122
+ A string containing the JSON formatted protocol buffer message.
123
+ """
124
+ printer = _Printer(
125
+ including_default_value_fields,
126
+ preserving_proto_field_name,
127
+ use_integers_for_enums,
128
+ descriptor_pool,
129
+ float_precision=float_precision)
130
+ return printer.ToJsonString(message, indent, sort_keys, ensure_ascii)
131
+
132
+
133
+ def MessageToDict(
+     message,
+     including_default_value_fields=False,
+     preserving_proto_field_name=False,
+     use_integers_for_enums=False,
+     descriptor_pool=None,
+     float_precision=None):
+   """Converts protobuf message to a dictionary.
+
+   When the dictionary is encoded to JSON, it conforms to proto3 JSON spec.
+
+   Args:
+     message: The protocol buffers message instance to serialize.
+     including_default_value_fields: If True, singular primitive fields,
+       repeated fields, and map fields will always be serialized. If
+       False, only serialize non-empty fields. Singular message fields
+       and oneof fields are not affected by this option.
+     preserving_proto_field_name: If True, use the original proto field
+       names as defined in the .proto file. If False, convert the field
+       names to lowerCamelCase.
+     use_integers_for_enums: If true, print integers instead of enum names.
+     descriptor_pool: A Descriptor Pool for resolving types. If None use the
+       default.
+     float_precision: If set, use this to specify float field valid digits.
+
+   Returns:
+     A dict representation of the protocol buffer message.
+   """
+   printer = _Printer(
+       including_default_value_fields,
+       preserving_proto_field_name,
+       use_integers_for_enums,
+       descriptor_pool,
+       float_precision=float_precision)
+   # pylint: disable=protected-access
+   return printer._MessageToJsonObject(message)
+
+
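+ # Usage sketch for MessageToDict, as a comment (`MyMessage` is a
+ # hypothetical generated class with a `name` string field):
+ #
+ #   d = MessageToDict(MyMessage(name='hello'))  # e.g. {'name': 'hello'}
+ #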
+ def _IsMapEntry(field):
+   return (field.type == descriptor.FieldDescriptor.TYPE_MESSAGE and
+           field.message_type.has_options and
+           field.message_type.GetOptions().map_entry)
+
+
+ class _Printer(object):
+   """JSON format printer for protocol message."""
+
+   def __init__(
+       self,
+       including_default_value_fields=False,
+       preserving_proto_field_name=False,
+       use_integers_for_enums=False,
+       descriptor_pool=None,
+       float_precision=None):
+     self.including_default_value_fields = including_default_value_fields
+     self.preserving_proto_field_name = preserving_proto_field_name
+     self.use_integers_for_enums = use_integers_for_enums
+     self.descriptor_pool = descriptor_pool
+     if float_precision:
+       self.float_format = '.{}g'.format(float_precision)
+     else:
+       self.float_format = None
+
+   def ToJsonString(self, message, indent, sort_keys, ensure_ascii):
+     js = self._MessageToJsonObject(message)
+     return json.dumps(
+         js, indent=indent, sort_keys=sort_keys, ensure_ascii=ensure_ascii)
+
+   def _MessageToJsonObject(self, message):
+     """Converts message to an object according to Proto3 JSON Specification."""
+     message_descriptor = message.DESCRIPTOR
+     full_name = message_descriptor.full_name
+     if _IsWrapperMessage(message_descriptor):
+       return self._WrapperMessageToJsonObject(message)
+     if full_name in _WKTJSONMETHODS:
+       return methodcaller(_WKTJSONMETHODS[full_name][0], message)(self)
+     js = {}
+     return self._RegularMessageToJsonObject(message, js)
+
+   def _RegularMessageToJsonObject(self, message, js):
+     """Converts normal message according to Proto3 JSON Specification."""
+     fields = message.ListFields()
+
+     try:
+       for field, value in fields:
+         if self.preserving_proto_field_name:
+           name = field.name
+         else:
+           name = field.json_name
+         if _IsMapEntry(field):
+           # Convert a map field.
+           v_field = field.message_type.fields_by_name['value']
+           js_map = {}
+           for key in value:
+             if isinstance(key, bool):
+               if key:
+                 recorded_key = 'true'
+               else:
+                 recorded_key = 'false'
+             else:
+               recorded_key = str(key)
+             js_map[recorded_key] = self._FieldToJsonObject(
+                 v_field, value[key])
+           js[name] = js_map
+         elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
+           # Convert a repeated field.
+           js[name] = [self._FieldToJsonObject(field, k)
+                       for k in value]
+         elif field.is_extension:
+           name = '[%s]' % field.full_name
+           js[name] = self._FieldToJsonObject(field, value)
+         else:
+           js[name] = self._FieldToJsonObject(field, value)
+
+       # Serialize default value if including_default_value_fields is True.
+       if self.including_default_value_fields:
+         message_descriptor = message.DESCRIPTOR
+         for field in message_descriptor.fields:
+           # Singular message fields and oneof fields will not be affected.
+           if ((field.label != descriptor.FieldDescriptor.LABEL_REPEATED and
+                field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE) or
+               field.containing_oneof):
+             continue
+           if self.preserving_proto_field_name:
+             name = field.name
+           else:
+             name = field.json_name
+           if name in js:
+             # Skip the field which has been serialized already.
+             continue
+           if _IsMapEntry(field):
+             js[name] = {}
+           elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
+             js[name] = []
+           else:
+             js[name] = self._FieldToJsonObject(field, field.default_value)
+
+     except ValueError as e:
+       raise SerializeToJsonError(
+           'Failed to serialize {0} field: {1}.'.format(field.name, e))
+
+     return js
+
+   def _FieldToJsonObject(self, field, value):
+     """Converts field value according to Proto3 JSON Specification."""
+     if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+       return self._MessageToJsonObject(value)
+     elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_ENUM:
+       if self.use_integers_for_enums:
+         return value
+       if field.enum_type.full_name == 'google.protobuf.NullValue':
+         return None
+       enum_value = field.enum_type.values_by_number.get(value, None)
+       if enum_value is not None:
+         return enum_value.name
+       else:
+         if field.file.syntax == 'proto3':
+           return value
+         raise SerializeToJsonError('Enum field contains an integer value '
+                                    'which cannot be mapped to an enum value.')
+     elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_STRING:
+       if field.type == descriptor.FieldDescriptor.TYPE_BYTES:
+         # Use base64 Data encoding for bytes
+         return base64.b64encode(value).decode('utf-8')
+       else:
+         return value
+     elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_BOOL:
+       return bool(value)
+     elif field.cpp_type in _INT64_TYPES:
+       return str(value)
+     elif field.cpp_type in _FLOAT_TYPES:
+       if math.isinf(value):
+         if value < 0.0:
+           return _NEG_INFINITY
+         else:
+           return _INFINITY
+       if math.isnan(value):
+         return _NAN
+       if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_FLOAT:
+         if self.float_format:
+           return float(format(value, self.float_format))
+         else:
+           return type_checkers.ToShortestFloat(value)
+
+     return value
+
+   def _AnyMessageToJsonObject(self, message):
+     """Converts Any message according to Proto3 JSON Specification."""
+     if not message.ListFields():
+       return {}
+     # Must print @type first, use OrderedDict instead of {}
+     js = OrderedDict()
+     type_url = message.type_url
+     js['@type'] = type_url
+     sub_message = _CreateMessageFromTypeUrl(type_url, self.descriptor_pool)
+     sub_message.ParseFromString(message.value)
+     message_descriptor = sub_message.DESCRIPTOR
+     full_name = message_descriptor.full_name
+     if _IsWrapperMessage(message_descriptor):
+       js['value'] = self._WrapperMessageToJsonObject(sub_message)
+       return js
+     if full_name in _WKTJSONMETHODS:
+       js['value'] = methodcaller(_WKTJSONMETHODS[full_name][0],
+                                  sub_message)(self)
+       return js
+     return self._RegularMessageToJsonObject(sub_message, js)
+
+   def _GenericMessageToJsonObject(self, message):
+     """Converts message according to Proto3 JSON Specification."""
+     # Duration, Timestamp and FieldMask have ToJsonString method to do the
+     # convert. Users can also call the method directly.
+     return message.ToJsonString()
+
+   def _ValueMessageToJsonObject(self, message):
+     """Converts Value message according to Proto3 JSON Specification."""
+     which = message.WhichOneof('kind')
+     # If the Value message is not set treat as null_value when serialize
+     # to JSON. The parse back result will be different from original message.
+     if which is None or which == 'null_value':
+       return None
+     if which == 'list_value':
+       return self._ListValueMessageToJsonObject(message.list_value)
+     if which == 'struct_value':
+       value = message.struct_value
+     else:
+       value = getattr(message, which)
+     oneof_descriptor = message.DESCRIPTOR.fields_by_name[which]
+     return self._FieldToJsonObject(oneof_descriptor, value)
+
+   def _ListValueMessageToJsonObject(self, message):
+     """Converts ListValue message according to Proto3 JSON Specification."""
+     return [self._ValueMessageToJsonObject(value)
+             for value in message.values]
+
+   def _StructMessageToJsonObject(self, message):
+     """Converts Struct message according to Proto3 JSON Specification."""
+     fields = message.fields
+     ret = {}
+     for key in fields:
+       ret[key] = self._ValueMessageToJsonObject(fields[key])
+     return ret
+
+   def _WrapperMessageToJsonObject(self, message):
+     return self._FieldToJsonObject(
+         message.DESCRIPTOR.fields_by_name['value'], message.value)
+
+
+ def _IsWrapperMessage(message_descriptor):
+   return message_descriptor.file.name == 'google/protobuf/wrappers.proto'
+
+
+ def _DuplicateChecker(js):
+   result = {}
+   for name, value in js:
+     if name in result:
+       raise ParseError('Failed to load JSON: duplicate key {0}.'.format(name))
+     result[name] = value
+   return result
+
+
+ def _CreateMessageFromTypeUrl(type_url, descriptor_pool):
+   """Creates a message from a type URL."""
+   db = symbol_database.Default()
+   pool = db.pool if descriptor_pool is None else descriptor_pool
+   type_name = type_url.split('/')[-1]
+   try:
+     message_descriptor = pool.FindMessageTypeByName(type_name)
+   except KeyError:
+     raise TypeError(
+         'Can not find message descriptor by type_url: {0}'.format(type_url))
+   message_class = db.GetPrototype(message_descriptor)
+   return message_class()
+
+
+ def Parse(text,
+           message,
+           ignore_unknown_fields=False,
+           descriptor_pool=None,
+           max_recursion_depth=100):
+   """Parses a JSON representation of a protocol message into a message.
+
+   Args:
+     text: Message JSON representation.
+     message: A protocol buffer message to merge into.
+     ignore_unknown_fields: If True, do not raise errors for unknown fields.
+     descriptor_pool: A Descriptor Pool for resolving types. If None use the
+       default.
+     max_recursion_depth: max recursion depth of JSON message to be
+       deserialized. JSON messages over this depth will fail to be
+       deserialized. Default value is 100.
+
+   Returns:
+     The same message passed as argument.
+
+   Raises:
+     ParseError: On JSON parsing problems.
+   """
+   if not isinstance(text, str):
+     text = text.decode('utf-8')
+   try:
+     js = json.loads(text, object_pairs_hook=_DuplicateChecker)
+   except ValueError as e:
+     raise ParseError('Failed to load JSON: {0}.'.format(str(e)))
+   return ParseDict(js, message, ignore_unknown_fields, descriptor_pool,
+                    max_recursion_depth)
+
+
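+ # Usage sketch for Parse, as a comment (`MyMessage` is a hypothetical
+ # generated message class):
+ #
+ #   msg = Parse('{"name": "hello"}', MyMessage())
+ #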
+ def ParseDict(js_dict,
+               message,
+               ignore_unknown_fields=False,
+               descriptor_pool=None,
+               max_recursion_depth=100):
+   """Parses a JSON dictionary representation into a message.
+
+   Args:
+     js_dict: Dict representation of a JSON message.
+     message: A protocol buffer message to merge into.
+     ignore_unknown_fields: If True, do not raise errors for unknown fields.
+     descriptor_pool: A Descriptor Pool for resolving types. If None use the
+       default.
+     max_recursion_depth: max recursion depth of JSON message to be
+       deserialized. JSON messages over this depth will fail to be
+       deserialized. Default value is 100.
+
+   Returns:
+     The same message passed as argument.
+   """
+   parser = _Parser(ignore_unknown_fields, descriptor_pool, max_recursion_depth)
+   parser.ConvertMessage(js_dict, message, '')
+   return message
+
+
+ _INT_OR_FLOAT = (int, float)
+
+
+ class _Parser(object):
+   """JSON format parser for protocol message."""
+
+   def __init__(self, ignore_unknown_fields, descriptor_pool,
+                max_recursion_depth):
+     self.ignore_unknown_fields = ignore_unknown_fields
+     self.descriptor_pool = descriptor_pool
+     self.max_recursion_depth = max_recursion_depth
+     self.recursion_depth = 0
+
+   def ConvertMessage(self, value, message, path):
+     """Convert a JSON object into a message.
+
+     Args:
+       value: A JSON object.
+       message: A WKT or regular protocol message to record the data.
+       path: parent path to log parse error info.
+
+     Raises:
+       ParseError: In case of convert problems.
+     """
+     self.recursion_depth += 1
+     if self.recursion_depth > self.max_recursion_depth:
+       raise ParseError('Message too deep. Max recursion depth is {0}'.format(
+           self.max_recursion_depth))
+     message_descriptor = message.DESCRIPTOR
+     full_name = message_descriptor.full_name
+     if not path:
+       path = message_descriptor.name
+     if _IsWrapperMessage(message_descriptor):
+       self._ConvertWrapperMessage(value, message, path)
+     elif full_name in _WKTJSONMETHODS:
+       methodcaller(_WKTJSONMETHODS[full_name][1], value, message, path)(self)
+     else:
+       self._ConvertFieldValuePair(value, message, path)
+     self.recursion_depth -= 1
+
+   def _ConvertFieldValuePair(self, js, message, path):
+     """Convert field value pairs into regular message.
+
+     Args:
+       js: A JSON object to convert the field value pairs.
+       message: A regular protocol message to record the data.
+       path: parent path to log parse error info.
+
+     Raises:
+       ParseError: In case of problems converting.
+     """
+     names = []
+     message_descriptor = message.DESCRIPTOR
+     fields_by_json_name = dict((f.json_name, f)
+                                for f in message_descriptor.fields)
+     for name in js:
+       try:
+         field = fields_by_json_name.get(name, None)
+         if not field:
+           field = message_descriptor.fields_by_name.get(name, None)
+         if not field and _VALID_EXTENSION_NAME.match(name):
+           if not message_descriptor.is_extendable:
+             raise ParseError(
+                 'Message type {0} does not have extensions at {1}'.format(
+                     message_descriptor.full_name, path))
+           identifier = name[1:-1]  # strip [] brackets
+           # pylint: disable=protected-access
+           field = message.Extensions._FindExtensionByName(identifier)
+           # pylint: enable=protected-access
+           if not field:
+             # Try looking for extension by the message type name, dropping the
+             # field name following the final . separator in full_name.
+             identifier = '.'.join(identifier.split('.')[:-1])
+             # pylint: disable=protected-access
+             field = message.Extensions._FindExtensionByName(identifier)
+             # pylint: enable=protected-access
+         if not field:
+           if self.ignore_unknown_fields:
+             continue
+           raise ParseError(
+               ('Message type "{0}" has no field named "{1}" at "{2}".\n'
+                ' Available Fields(except extensions): "{3}"').format(
+                    message_descriptor.full_name, name, path,
+                    [f.json_name for f in message_descriptor.fields]))
+         if name in names:
+           raise ParseError('Message type "{0}" should not have multiple '
+                            '"{1}" fields at "{2}".'.format(
+                                message.DESCRIPTOR.full_name, name, path))
+         names.append(name)
+         value = js[name]
+         # Check no other oneof field is parsed.
+         if field.containing_oneof is not None and value is not None:
+           oneof_name = field.containing_oneof.name
+           if oneof_name in names:
+             raise ParseError('Message type "{0}" should not have multiple '
+                              '"{1}" oneof fields at "{2}".'.format(
+                                  message.DESCRIPTOR.full_name, oneof_name,
+                                  path))
+           names.append(oneof_name)
+
+         if value is None:
+           if (field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE
+               and field.message_type.full_name == 'google.protobuf.Value'):
+             sub_message = getattr(message, field.name)
+             sub_message.null_value = 0
+           elif (field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_ENUM
+                 and field.enum_type.full_name == 'google.protobuf.NullValue'):
+             setattr(message, field.name, 0)
+           else:
+             message.ClearField(field.name)
+           continue
+
+         # Parse field value.
+         if _IsMapEntry(field):
+           message.ClearField(field.name)
+           self._ConvertMapFieldValue(value, message, field,
+                                      '{0}.{1}'.format(path, name))
+         elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:
+           message.ClearField(field.name)
+           if not isinstance(value, list):
+             raise ParseError('repeated field {0} must be in [] which is '
+                              '{1} at {2}'.format(name, value, path))
+           if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+             # Repeated message field.
+             for index, item in enumerate(value):
+               sub_message = getattr(message, field.name).add()
+               # None is a null_value in Value.
+               if (item is None and
+                   sub_message.DESCRIPTOR.full_name != 'google.protobuf.Value'):
+                 raise ParseError('null is not allowed to be used as an element'
+                                  ' in a repeated field at {0}.{1}[{2}]'.format(
+                                      path, name, index))
+               self.ConvertMessage(item, sub_message,
+                                   '{0}.{1}[{2}]'.format(path, name, index))
+           else:
+             # Repeated scalar field.
+             for index, item in enumerate(value):
+               if item is None:
+                 raise ParseError('null is not allowed to be used as an element'
+                                  ' in a repeated field at {0}.{1}[{2}]'.format(
+                                      path, name, index))
+               getattr(message, field.name).append(
+                   _ConvertScalarFieldValue(
+                       item, field, '{0}.{1}[{2}]'.format(path, name, index)))
+         elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+           if field.is_extension:
+             sub_message = message.Extensions[field]
+           else:
+             sub_message = getattr(message, field.name)
+           sub_message.SetInParent()
+           self.ConvertMessage(value, sub_message, '{0}.{1}'.format(path, name))
+         else:
+           if field.is_extension:
+             message.Extensions[field] = _ConvertScalarFieldValue(
+                 value, field, '{0}.{1}'.format(path, name))
+           else:
+             setattr(
+                 message, field.name,
+                 _ConvertScalarFieldValue(value, field,
+                                          '{0}.{1}'.format(path, name)))
+       except ParseError as e:
+         if field and field.containing_oneof is None:
+           raise ParseError('Failed to parse {0} field: {1}.'.format(name, e))
+         else:
+           raise ParseError(str(e))
+       except ValueError as e:
+         raise ParseError('Failed to parse {0} field: {1}.'.format(name, e))
+       except TypeError as e:
+         raise ParseError('Failed to parse {0} field: {1}.'.format(name, e))
+
+   def _ConvertAnyMessage(self, value, message, path):
+     """Convert a JSON representation into Any message."""
+     if isinstance(value, dict) and not value:
+       return
+     try:
+       type_url = value['@type']
+     except KeyError:
+       raise ParseError(
+           '@type is missing when parsing any message at {0}'.format(path))
+
+     try:
+       sub_message = _CreateMessageFromTypeUrl(type_url, self.descriptor_pool)
+     except TypeError as e:
+       raise ParseError('{0} at {1}'.format(e, path))
+     message_descriptor = sub_message.DESCRIPTOR
+     full_name = message_descriptor.full_name
+     if _IsWrapperMessage(message_descriptor):
+       self._ConvertWrapperMessage(value['value'], sub_message,
+                                   '{0}.value'.format(path))
+     elif full_name in _WKTJSONMETHODS:
+       methodcaller(_WKTJSONMETHODS[full_name][1], value['value'], sub_message,
+                    '{0}.value'.format(path))(
+                        self)
+     else:
+       del value['@type']
+       self._ConvertFieldValuePair(value, sub_message, path)
+       value['@type'] = type_url
+     # Sets Any message
+     message.value = sub_message.SerializeToString()
+     message.type_url = type_url
+
+   def _ConvertGenericMessage(self, value, message, path):
+     """Convert a JSON representation into message with FromJsonString."""
+     # Duration, Timestamp, FieldMask have a FromJsonString method to do the
+     # conversion. Users can also call the method directly.
+     try:
+       message.FromJsonString(value)
+     except ValueError as e:
+       raise ParseError('{0} at {1}'.format(e, path))
+
+   def _ConvertValueMessage(self, value, message, path):
+     """Convert a JSON representation into Value message."""
+     if isinstance(value, dict):
+       self._ConvertStructMessage(value, message.struct_value, path)
+     elif isinstance(value, list):
+       self._ConvertListValueMessage(value, message.list_value, path)
+     elif value is None:
+       message.null_value = 0
+     elif isinstance(value, bool):
+       message.bool_value = value
+     elif isinstance(value, str):
+       message.string_value = value
+     elif isinstance(value, _INT_OR_FLOAT):
+       message.number_value = value
+     else:
+       raise ParseError('Value {0} has unexpected type {1} at {2}'.format(
+           value, type(value), path))
+
+   def _ConvertListValueMessage(self, value, message, path):
+     """Convert a JSON representation into ListValue message."""
+     if not isinstance(value, list):
+       raise ParseError('ListValue must be in [] which is {0} at {1}'.format(
+           value, path))
+     message.ClearField('values')
+     for index, item in enumerate(value):
+       self._ConvertValueMessage(item, message.values.add(),
+                                 '{0}[{1}]'.format(path, index))
+
+   def _ConvertStructMessage(self, value, message, path):
+     """Convert a JSON representation into Struct message."""
+     if not isinstance(value, dict):
+       raise ParseError('Struct must be in a dict which is {0} at {1}'.format(
+           value, path))
+     # Clear will mark the struct as modified so it will be created even if
+     # there are no values.
+     message.Clear()
+     for key in value:
+       self._ConvertValueMessage(value[key], message.fields[key],
+                                 '{0}.{1}'.format(path, key))
+     return
+
+   def _ConvertWrapperMessage(self, value, message, path):
+     """Convert a JSON representation into Wrapper message."""
+     field = message.DESCRIPTOR.fields_by_name['value']
+     setattr(
+         message, 'value',
+         _ConvertScalarFieldValue(value, field, path='{0}.value'.format(path)))
+
+   def _ConvertMapFieldValue(self, value, message, field, path):
+     """Convert map field value for a message map field.
+
+     Args:
+       value: A JSON object to convert the map field value.
+       message: A protocol message to record the converted data.
+       field: The descriptor of the map field to be converted.
+       path: parent path to log parse error info.
+
+     Raises:
+       ParseError: In case of convert problems.
+     """
+     if not isinstance(value, dict):
+       raise ParseError(
+           'Map field {0} must be in a dict which is {1} at {2}'.format(
+               field.name, value, path))
+     key_field = field.message_type.fields_by_name['key']
+     value_field = field.message_type.fields_by_name['value']
+     for key in value:
+       key_value = _ConvertScalarFieldValue(key, key_field,
+                                            '{0}.key'.format(path), True)
+       if value_field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
+         self.ConvertMessage(value[key],
+                             getattr(message, field.name)[key_value],
+                             '{0}[{1}]'.format(path, key_value))
+       else:
+         getattr(message, field.name)[key_value] = _ConvertScalarFieldValue(
+             value[key], value_field, path='{0}[{1}]'.format(path, key_value))
+
+
+ def _ConvertScalarFieldValue(value, field, path, require_str=False):
+   """Convert a single scalar field value.
+
+   Args:
+     value: A scalar value to convert the scalar field value.
+     field: The descriptor of the field to convert.
+     path: parent path to log parse error info.
+     require_str: If True, the field value must be a str.
+
+   Returns:
+     The converted scalar field value
+
+   Raises:
+     ParseError: In case of convert problems.
+   """
+   try:
+     if field.cpp_type in _INT_TYPES:
+       return _ConvertInteger(value)
+     elif field.cpp_type in _FLOAT_TYPES:
+       return _ConvertFloat(value, field)
+     elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_BOOL:
+       return _ConvertBool(value, require_str)
+     elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_STRING:
+       if field.type == descriptor.FieldDescriptor.TYPE_BYTES:
+         if isinstance(value, str):
+           encoded = value.encode('utf-8')
+         else:
+           encoded = value
+         # Add extra padding '='
+         padded_value = encoded + b'=' * (4 - len(encoded) % 4)
+         return base64.urlsafe_b64decode(padded_value)
+       else:
+         # Checking for unpaired surrogates appears to be unreliable,
+         # depending on the specific Python version, so we check manually.
+         if _UNPAIRED_SURROGATE_PATTERN.search(value):
+           raise ParseError('Unpaired surrogate')
+         return value
+     elif field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_ENUM:
+       # Convert an enum value.
+       enum_value = field.enum_type.values_by_name.get(value, None)
+       if enum_value is None:
+         try:
+           number = int(value)
+           enum_value = field.enum_type.values_by_number.get(number, None)
+         except ValueError:
+           raise ParseError('Invalid enum value {0} for enum type {1}'.format(
+               value, field.enum_type.full_name))
+         if enum_value is None:
+           if field.file.syntax == 'proto3':
+             # Proto3 accepts unknown enums.
+             return number
+           raise ParseError('Invalid enum value {0} for enum type {1}'.format(
+               value, field.enum_type.full_name))
+       return enum_value.number
+   except ParseError as e:
+     raise ParseError('{0} at {1}'.format(e, path))
+
+
+ def _ConvertInteger(value):
+   """Convert an integer.
+
+   Args:
+     value: A scalar value to convert.
+
+   Returns:
+     The integer value.
+
+   Raises:
+     ParseError: If an integer couldn't be consumed.
+   """
+   if isinstance(value, float) and not value.is_integer():
+     raise ParseError('Couldn\'t parse integer: {0}'.format(value))
+
+   if isinstance(value, str) and value.find(' ') != -1:
+     raise ParseError('Couldn\'t parse integer: "{0}"'.format(value))
+
+   if isinstance(value, bool):
+     raise ParseError('Bool value {0} is not acceptable for '
+                      'integer field'.format(value))
+
+   return int(value)
+
+
+ def _ConvertFloat(value, field):
+   """Convert a floating point number."""
+   if isinstance(value, float):
+     if math.isnan(value):
+       raise ParseError('Couldn\'t parse NaN, use quoted "NaN" instead')
+     if math.isinf(value):
+       if value > 0:
+         raise ParseError('Couldn\'t parse Infinity or value too large, '
+                          'use quoted "Infinity" instead')
+       else:
+         raise ParseError('Couldn\'t parse -Infinity or value too small, '
+                          'use quoted "-Infinity" instead')
+     if field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_FLOAT:
+       # pylint: disable=protected-access
+       if value > type_checkers._FLOAT_MAX:
+         raise ParseError('Float value too large')
+       # pylint: disable=protected-access
+       if value < type_checkers._FLOAT_MIN:
+         raise ParseError('Float value too small')
+   if value == 'nan':
+     raise ParseError('Couldn\'t parse float "nan", use "NaN" instead')
+   try:
+     # Assume Python compatible syntax.
+     return float(value)
+   except ValueError:
+     # Check alternative spellings.
+     if value == _NEG_INFINITY:
+       return float('-inf')
+     elif value == _INFINITY:
+       return float('inf')
+     elif value == _NAN:
+       return float('nan')
+     else:
+       raise ParseError('Couldn\'t parse float: {0}'.format(value))
+
+
+ def _ConvertBool(value, require_str):
+   """Convert a boolean value.
+
+   Args:
+     value: A scalar value to convert.
+     require_str: If True, value must be a str.
+
+   Returns:
+     The bool parsed.
+
+   Raises:
+     ParseError: If a boolean value couldn't be consumed.
+   """
+   if require_str:
+     if value == 'true':
+       return True
+     elif value == 'false':
+       return False
+     else:
+       raise ParseError('Expected "true" or "false", not {0}'.format(value))
+
+   if not isinstance(value, bool):
+     raise ParseError('Expected true or false without quotes')
+   return value
+
+ _WKTJSONMETHODS = {
+     'google.protobuf.Any': ['_AnyMessageToJsonObject',
+                             '_ConvertAnyMessage'],
+     'google.protobuf.Duration': ['_GenericMessageToJsonObject',
+                                  '_ConvertGenericMessage'],
+     'google.protobuf.FieldMask': ['_GenericMessageToJsonObject',
+                                   '_ConvertGenericMessage'],
+     'google.protobuf.ListValue': ['_ListValueMessageToJsonObject',
+                                   '_ConvertListValueMessage'],
+     'google.protobuf.Struct': ['_StructMessageToJsonObject',
+                                '_ConvertStructMessage'],
+     'google.protobuf.Timestamp': ['_GenericMessageToJsonObject',
+                                   '_ConvertGenericMessage'],
+     'google.protobuf.Value': ['_ValueMessageToJsonObject',
+                               '_ConvertValueMessage']
+ }
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/message.py ADDED
@@ -0,0 +1,424 @@
+ # Protocol Buffers - Google's data interchange format
+ # Copyright 2008 Google Inc. All rights reserved.
+ # https://developers.google.com/protocol-buffers/
+ #
+ # Redistribution and use in source and binary forms, with or without
+ # modification, are permitted provided that the following conditions are
+ # met:
+ #
+ #     * Redistributions of source code must retain the above copyright
+ # notice, this list of conditions and the following disclaimer.
+ #     * Redistributions in binary form must reproduce the above
+ # copyright notice, this list of conditions and the following disclaimer
+ # in the documentation and/or other materials provided with the
+ # distribution.
+ #     * Neither the name of Google Inc. nor the names of its
+ # contributors may be used to endorse or promote products derived from
+ # this software without specific prior written permission.
+ #
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+ # TODO(robinson): We should just make these methods all "pure-virtual" and move
+ # all implementation out, into reflection.py for now.
+
+
+ """Contains an abstract base class for protocol messages."""
+
+ __author__ = 'robinson@google.com (Will Robinson)'
+
+ class Error(Exception):
+   """Base error type for this module."""
+   pass
+
+
+ class DecodeError(Error):
+   """Exception raised when deserializing messages."""
+   pass
+
+
+ class EncodeError(Error):
+   """Exception raised when serializing messages."""
+   pass
+
+
+ class Message(object):
+
+   """Abstract base class for protocol messages.
+
+   Protocol message classes are almost always generated by the protocol
+   compiler. These generated types subclass Message and implement the methods
+   shown below.
+   """
+
+   # TODO(robinson): Link to an HTML document here.
+
+   # TODO(robinson): Document that instances of this class will also
+   # have an Extensions attribute with __getitem__ and __setitem__.
+   # Again, not sure how to best convey this.
+
+   # TODO(robinson): Document that the class must also have a static
+   # RegisterExtension(extension_field) method.
+   # Not sure how to best express at this point.
+
+   # TODO(robinson): Document these fields and methods.
+
+   __slots__ = []
+
+   #: The :class:`google.protobuf.descriptor.Descriptor` for this message type.
+   DESCRIPTOR = None
+
+   def __deepcopy__(self, memo=None):
+     clone = type(self)()
+     clone.MergeFrom(self)
+     return clone
+
+   def __eq__(self, other_msg):
+     """Recursively compares two messages by value and structure."""
+     raise NotImplementedError
+
+   def __ne__(self, other_msg):
+     # Can't just say self != other_msg, since that would infinitely recurse. :)
+     return not self == other_msg
+
+   def __hash__(self):
+     raise TypeError('unhashable object')
+
+   def __str__(self):
+     """Outputs a human-readable representation of the message."""
+     raise NotImplementedError
+
+   def __unicode__(self):
+     """Outputs a human-readable representation of the message."""
+     raise NotImplementedError
+
+   def MergeFrom(self, other_msg):
+     """Merges the contents of the specified message into current message.
+
+     This method merges the contents of the specified message into the current
+     message. Singular fields that are set in the specified message overwrite
+     the corresponding fields in the current message. Repeated fields are
+     appended. Singular sub-messages and groups are recursively merged.
+
+     Args:
+       other_msg (Message): A message to merge into the current message.
+     """
+     raise NotImplementedError
+
+   def CopyFrom(self, other_msg):
+     """Copies the content of the specified message into the current message.
+
+     The method clears the current message and then merges the specified
+     message using MergeFrom.
+
+     Args:
+       other_msg (Message): A message to copy into the current one.
+     """
+     if self is other_msg:
+       return
+     self.Clear()
+     self.MergeFrom(other_msg)
+
+   def Clear(self):
+     """Clears all data that was set in the message."""
+     raise NotImplementedError
+
+   def SetInParent(self):
+     """Mark this as present in the parent.
+
+     This normally happens automatically when you assign a field of a
+     sub-message, but sometimes you want to make the sub-message
+     present while keeping it empty. If you find yourself using this,
+     you may want to reconsider your design.
+     """
+     raise NotImplementedError
+
+   def IsInitialized(self):
+     """Checks if the message is initialized.
+
+     Returns:
+       bool: The method returns True if the message is initialized (i.e. all of
+       its required fields are set).
+     """
+     raise NotImplementedError
+
+   # TODO(robinson): MergeFromString() should probably return None and be
+   # implemented in terms of a helper that returns the # of bytes read. Our
+   # deserialization routines would use the helper when recursively
+   # deserializing, but the end user would almost always just want the no-return
+   # MergeFromString().
+
+   def MergeFromString(self, serialized):
+     """Merges serialized protocol buffer data into this message.
+
+     When we find a field in `serialized` that is already present
+     in this message:
+
+     - If it's a "repeated" field, we append to the end of our list.
+     - Else, if it's a scalar, we overwrite our field.
+     - Else, (it's a nonrepeated composite), we recursively merge
+       into the existing composite.
+
+     Args:
+       serialized (bytes): Any object that allows us to call
+         ``memoryview(serialized)`` to access a string of bytes using the
+         buffer interface.
+
+     Returns:
+       int: The number of bytes read from `serialized`.
+       For non-group messages, this will always be `len(serialized)`,
+       but for messages which are actually groups, this will
+       generally be less than `len(serialized)`, since we must
+       stop when we reach an ``END_GROUP`` tag. Note that if
+       we *do* stop because of an ``END_GROUP`` tag, the number
+       of bytes returned does not include the bytes
+       for the ``END_GROUP`` tag information.
+
+     Raises:
+       DecodeError: if the input cannot be parsed.
+     """
+     # TODO(robinson): Document handling of unknown fields.
+     # TODO(robinson): When we switch to a helper, this will return None.
+     raise NotImplementedError
+
+   def ParseFromString(self, serialized):
+     """Parse serialized protocol buffer data into this message.
+
+     Like :func:`MergeFromString()`, except we clear the object first.
+
+     Raises:
+       message.DecodeError if the input cannot be parsed.
+     """
+     self.Clear()
+     return self.MergeFromString(serialized)
+
+   def SerializeToString(self, **kwargs):
+     """Serializes the protocol message to a binary string.
+
+     Keyword Args:
+       deterministic (bool): If true, requests deterministic serialization
+         of the protobuf, with predictable ordering of map keys.
+
+     Returns:
+       A binary string representation of the message if all of the required
+       fields in the message are set (i.e. the message is initialized).
+
+     Raises:
+       EncodeError: if the message isn't initialized (see :func:`IsInitialized`).
+     """
+     raise NotImplementedError
+
+   def SerializePartialToString(self, **kwargs):
+     """Serializes the protocol message to a binary string.
+
+     This method is similar to SerializeToString but doesn't check if the
+     message is initialized.
+
+     Keyword Args:
+       deterministic (bool): If true, requests deterministic serialization
+         of the protobuf, with predictable ordering of map keys.
+
+     Returns:
+       bytes: A serialized representation of the partial message.
+     """
+     raise NotImplementedError
+
+   # TODO(robinson): Decide whether we like these better
+   # than auto-generated has_foo() and clear_foo() methods
+   # on the instances themselves. This way is less consistent
+   # with C++, but it makes reflection-type access easier and
+   # reduces the number of magically autogenerated things.
+   #
+   # TODO(robinson): Be sure to document (and test) exactly
+   # which field names are accepted here. Are we case-sensitive?
+   # What do we do with fields that share names with Python keywords
+   # like 'lambda' and 'yield'?
+   #
+   # nnorwitz says:
+   # """
+   # Typically (in python), an underscore is appended to names that are
+   # keywords. So they would become lambda_ or yield_.
+   # """
+   def ListFields(self):
+     """Returns a list of (FieldDescriptor, value) tuples for present fields.
+
+     A message field is non-empty if HasField() would return true. A singular
+     primitive field is non-empty if HasField() would return true in proto2 or it
+     is non zero in proto3. A repeated field is non-empty if it contains at least
+     one element. The fields are ordered by field number.
+
+     Returns:
+       list[tuple(FieldDescriptor, value)]: field descriptors and values
+       for all fields in the message which are not empty. The values vary by
+       field type.
+     """
+     raise NotImplementedError
+
+   def HasField(self, field_name):
+     """Checks if a certain field is set for the message.
+
+     For a oneof group, checks if any field inside is set. Note that if the
+     field_name is not defined in the message descriptor, :exc:`ValueError` will
+     be raised.
+
+     Args:
+       field_name (str): The name of the field to check for presence.
+
+     Returns:
+       bool: Whether a value has been set for the named field.
+
+     Raises:
+       ValueError: if the `field_name` is not a member of this message.
+     """
+     raise NotImplementedError
+
+   def ClearField(self, field_name):
+     """Clears the contents of a given field.
+
+     Inside a oneof group, clears the field set. If the name neither refers to a
+     defined field or oneof group, :exc:`ValueError` is raised.
+
+     Args:
+       field_name (str): The name of the field to check for presence.
+
+     Raises:
+       ValueError: if the `field_name` is not a member of this message.
+     """
+     raise NotImplementedError
+
+   def WhichOneof(self, oneof_group):
+     """Returns the name of the field that is set inside a oneof group.
+
+     If no field is set, returns None.
+
+     Args:
+       oneof_group (str): the name of the oneof group to check.
+
+     Returns:
+       str or None: The name of the group that is set, or None.
+
+     Raises:
+       ValueError: no group with the given name exists
+     """
+     raise NotImplementedError
+
+   def HasExtension(self, extension_handle):
+     """Checks if a certain extension is present for this message.
+
+     Extensions are retrieved using the :attr:`Extensions` mapping (if present).
+
+     Args:
+       extension_handle: The handle for the extension to check.
+
+     Returns:
+       bool: Whether the extension is present for this message.
+
+     Raises:
+       KeyError: if the extension is repeated. Similar to repeated fields,
+         there is no separate notion of presence: a "not present" repeated
+         extension is an empty list.
+     """
+     raise NotImplementedError
+
+   def ClearExtension(self, extension_handle):
+     """Clears the contents of a given extension.
+
+     Args:
+       extension_handle: The handle for the extension to clear.
+     """
+     raise NotImplementedError
+
+   def UnknownFields(self):
+     """Returns the UnknownFieldSet.
+
+     Returns:
+       UnknownFieldSet: The unknown fields stored in this message.
+     """
+     raise NotImplementedError
+
+   def DiscardUnknownFields(self):
+     """Clears all fields in the :class:`UnknownFieldSet`.
+
+     This operation is recursive for nested message.
+     """
+     raise NotImplementedError
+
+   def ByteSize(self):
+     """Returns the serialized size of this message.
+
+     Recursively calls ByteSize() on all contained messages.
+
+     Returns:
+       int: The number of bytes required to serialize this message.
+     """
+     raise NotImplementedError
+
+   @classmethod
+   def FromString(cls, s):
+     raise NotImplementedError
+
+   @staticmethod
+   def RegisterExtension(extension_handle):
+     raise NotImplementedError
+
+   def _SetListener(self, message_listener):
+     """Internal method used by the protocol message implementation.
+     Clients should not call this directly.
+
+     Sets a listener that this message will call on certain state transitions.
+
+     The purpose of this method is to register back-edges from children to
+     parents at runtime, for the purpose of setting "has" bits and
+     byte-size-dirty bits in the parent and ancestor objects whenever a child or
+     descendant object is modified.
+
+     If the client wants to disconnect this Message from the object tree, she
+     explicitly sets callback to None.
+
+     If message_listener is None, unregisters any existing listener. Otherwise,
+     message_listener must implement the MessageListener interface in
+     internal/message_listener.py, and we discard any listener registered
+     via a previous _SetListener() call.
+     """
+     raise NotImplementedError
+
+   def __getstate__(self):
+     """Support the pickle protocol."""
+     return dict(serialized=self.SerializePartialToString())
+
+   def __setstate__(self, state):
+     """Support the pickle protocol."""
+     self.__init__()
+     serialized = state['serialized']
+     # On Python 3, using encoding='latin1' is required for unpickling
+     # protos pickled by Python 2.
+     if not isinstance(serialized, bytes):
+       serialized = serialized.encode('latin1')
+     self.ParseFromString(serialized)
+
+   def __reduce__(self):
+     message_descriptor = self.DESCRIPTOR
+     if message_descriptor.containing_type is None:
+       return type(self), (), self.__getstate__()
+     # the message type must be nested.
+     # Python does not pickle nested classes; use the symbol_database on the
+     # receiving end.
+     container = message_descriptor
+     return (_InternalConstructMessage, (container.full_name,),
+             self.__getstate__())
+
+
+ def _InternalConstructMessage(full_name):
+   """Constructs a nested message."""
+   from google.protobuf import symbol_database  # pylint:disable=g-import-not-at-top
+
+   return symbol_database.Default().GetSymbol(full_name)()
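Editor's note: the `__getstate__`/`__setstate__` pair above pickles a proto as its serialized bytes and re-parses on load. A minimal, self-contained sketch of that pattern on a toy class (assumption: a real proto serializes to wire format; here a UTF-8 string stands in for it):

```python
import pickle


class ToyMessage:
    """Toy class mirroring Message's pickle protocol (hypothetical stand-in,
    not a real protobuf message)."""

    def __init__(self, value=''):
        self.value = value

    def SerializePartialToString(self):
        # Stand-in for wire-format serialization.
        return self.value.encode('utf-8')

    def ParseFromString(self, data):
        self.value = data.decode('utf-8')

    def __getstate__(self):
        # Pickle only the serialized bytes, as message.Message does.
        return dict(serialized=self.SerializePartialToString())

    def __setstate__(self, state):
        self.__init__()
        serialized = state['serialized']
        # Py2-pickled payloads arrive as str; latin1 round-trips raw bytes.
        if not isinstance(serialized, bytes):
            serialized = serialized.encode('latin1')
        self.ParseFromString(serialized)


msg = ToyMessage('hello')
clone = pickle.loads(pickle.dumps(msg))
print(clone.value)  # → hello
```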
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/proto_builder.py ADDED
@@ -0,0 +1,134 @@
+ # Protocol Buffers - Google's data interchange format
+ # Copyright 2008 Google Inc. All rights reserved.
+ # https://developers.google.com/protocol-buffers/
+ #
+ # Redistribution and use in source and binary forms, with or without
+ # modification, are permitted provided that the following conditions are
+ # met:
+ #
+ #     * Redistributions of source code must retain the above copyright
+ # notice, this list of conditions and the following disclaimer.
+ #     * Redistributions in binary form must reproduce the above
+ # copyright notice, this list of conditions and the following disclaimer
+ # in the documentation and/or other materials provided with the
+ # distribution.
+ #     * Neither the name of Google Inc. nor the names of its
+ # contributors may be used to endorse or promote products derived from
+ # this software without specific prior written permission.
+ #
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+ """Dynamic Protobuf class creator."""
+
+ from collections import OrderedDict
+ import hashlib
+ import os
+
+ from google.protobuf import descriptor_pb2
+ from google.protobuf import descriptor
+ from google.protobuf import message_factory
+
+
+ def _GetMessageFromFactory(factory, full_name):
+   """Get a proto class from the MessageFactory by name.
+
+   Args:
+     factory: a MessageFactory instance.
+     full_name: str, the fully qualified name of the proto type.
+   Returns:
+     A class, for the type identified by full_name.
+   Raises:
+     KeyError, if the proto is not found in the factory's descriptor pool.
+   """
+   proto_descriptor = factory.pool.FindMessageTypeByName(full_name)
+   proto_cls = factory.GetPrototype(proto_descriptor)
+   return proto_cls
+
+
+ def MakeSimpleProtoClass(fields, full_name=None, pool=None):
+   """Create a Protobuf class whose fields are basic types.
+
+   Note: this doesn't validate field names!
+
+   Args:
+     fields: dict of {name: field_type} mappings for each field in the proto. If
+         this is an OrderedDict the order will be maintained, otherwise the
+         fields will be sorted by name.
+     full_name: optional str, the fully-qualified name of the proto type.
+     pool: optional DescriptorPool instance.
+   Returns:
+     a class, the new protobuf class with a FileDescriptor.
+   """
+   factory = message_factory.MessageFactory(pool=pool)
+
+   if full_name is not None:
+     try:
+       proto_cls = _GetMessageFromFactory(factory, full_name)
+       return proto_cls
+     except KeyError:
+       # The factory's DescriptorPool doesn't know about this class yet.
+       pass
+
+   # Get a list of (name, field_type) tuples from the fields dict. If fields was
+   # an OrderedDict we keep the order, but otherwise we sort the field to ensure
+   # consistent ordering.
+   field_items = fields.items()
+   if not isinstance(fields, OrderedDict):
+     field_items = sorted(field_items)
+
+   # Use a consistent file name that is unlikely to conflict with any imported
+   # proto files.
+   fields_hash = hashlib.sha1()
+   for f_name, f_type in field_items:
+     fields_hash.update(f_name.encode('utf-8'))
+     fields_hash.update(str(f_type).encode('utf-8'))
+   proto_file_name = fields_hash.hexdigest() + '.proto'
+
+   # If the proto is anonymous, use the same hash to name it.
+   if full_name is None:
+     full_name = ('net.proto2.python.public.proto_builder.AnonymousProto_' +
+                  fields_hash.hexdigest())
+     try:
+       proto_cls = _GetMessageFromFactory(factory, full_name)
+       return proto_cls
+     except KeyError:
+       # The factory's DescriptorPool doesn't know about this class yet.
+       pass
+
+   # This is the first time we see this proto: add a new descriptor to the pool.
+   factory.pool.Add(
+       _MakeFileDescriptorProto(proto_file_name, full_name, field_items))
+   return _GetMessageFromFactory(factory, full_name)
+
+
+ def _MakeFileDescriptorProto(proto_file_name, full_name, field_items):
+   """Populate FileDescriptorProto for MessageFactory's DescriptorPool."""
+   package, name = full_name.rsplit('.', 1)
+   file_proto = descriptor_pb2.FileDescriptorProto()
+   file_proto.name = os.path.join(package.replace('.', '/'), proto_file_name)
+   file_proto.package = package
+   desc_proto = file_proto.message_type.add()
+   desc_proto.name = name
+   for f_number, (f_name, f_type) in enumerate(field_items, 1):
+     field_proto = desc_proto.field.add()
+     field_proto.name = f_name
+     # If the number falls in the reserved range, reassign it to the correct
+     # number after the range.
+     if f_number >= descriptor.FieldDescriptor.FIRST_RESERVED_FIELD_NUMBER:
+       f_number += (
+           descriptor.FieldDescriptor.LAST_RESERVED_FIELD_NUMBER -
+           descriptor.FieldDescriptor.FIRST_RESERVED_FIELD_NUMBER + 1)
+     field_proto.number = f_number
+     field_proto.label = descriptor_pb2.FieldDescriptorProto.LABEL_OPTIONAL
+     field_proto.type = f_type
+   return file_proto
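Editor's note: the reserved-range remapping at the end of `_MakeFileDescriptorProto` above can be sketched without protobuf installed. Assumption: the reserved field numbers are protobuf's standard 19000-19999 range (the values behind `FIRST_RESERVED_FIELD_NUMBER`/`LAST_RESERVED_FIELD_NUMBER`); `remap` is a hypothetical helper name:

```python
# Protobuf reserves field numbers 19000-19999 for its own implementation.
FIRST_RESERVED = 19000
LAST_RESERVED = 19999


def remap(field_number: int) -> int:
    """Shift a candidate field number past the reserved range, mirroring
    the arithmetic in _MakeFileDescriptorProto."""
    if field_number >= FIRST_RESERVED:
        # Skip the whole reserved block (1000 numbers).
        field_number += LAST_RESERVED - FIRST_RESERVED + 1
    return field_number


print(remap(1))      # → 1
print(remap(19000))  # → 20000
```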
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/reflection.py ADDED
@@ -0,0 +1,95 @@
+ # Protocol Buffers - Google's data interchange format
+ # Copyright 2008 Google Inc. All rights reserved.
+ # https://developers.google.com/protocol-buffers/
+ #
+ # Redistribution and use in source and binary forms, with or without
+ # modification, are permitted provided that the following conditions are
+ # met:
+ #
+ #     * Redistributions of source code must retain the above copyright
+ # notice, this list of conditions and the following disclaimer.
+ #     * Redistributions in binary form must reproduce the above
+ # copyright notice, this list of conditions and the following disclaimer
+ # in the documentation and/or other materials provided with the
+ # distribution.
+ #     * Neither the name of Google Inc. nor the names of its
+ # contributors may be used to endorse or promote products derived from
+ # this software without specific prior written permission.
+ #
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+ # This code is meant to work on Python 2.4 and above only.
+
+ """Contains a metaclass and helper functions used to create
+ protocol message classes from Descriptor objects at runtime.
+
+ Recall that a metaclass is the "type" of a class.
+ (A class is to a metaclass what an instance is to a class.)
+
+ In this case, we use the GeneratedProtocolMessageType metaclass
+ to inject all the useful functionality into the classes
+ output by the protocol compiler at compile-time.
+
+ The upshot of all this is that the real implementation
+ details for ALL pure-Python protocol buffers are *here in
+ this file*.
+ """
+
+ __author__ = 'robinson@google.com (Will Robinson)'
+
+
+ from google.protobuf import message_factory
+ from google.protobuf import symbol_database
+
+ # The type of all Message classes.
+ # Part of the public interface, but normally only used by message factories.
+ GeneratedProtocolMessageType = message_factory._GENERATED_PROTOCOL_MESSAGE_TYPE
+
+ MESSAGE_CLASS_CACHE = {}
+
+
+ # Deprecated. Please NEVER use reflection.ParseMessage().
+ def ParseMessage(descriptor, byte_str):
+   """Generate a new Message instance from this Descriptor and a byte string.
+
+   DEPRECATED: ParseMessage is deprecated because it is using MakeClass().
+   Please use MessageFactory.GetPrototype() instead.
+
+   Args:
+     descriptor: Protobuf Descriptor object
+     byte_str: Serialized protocol buffer byte string
+
+   Returns:
+     Newly created protobuf Message object.
+   """
+   result_class = MakeClass(descriptor)
+   new_msg = result_class()
+   new_msg.ParseFromString(byte_str)
+   return new_msg
+
+
+ # Deprecated. Please NEVER use reflection.MakeClass().
+ def MakeClass(descriptor):
+   """Construct a class object for a protobuf described by descriptor.
+
+   DEPRECATED: use MessageFactory.GetPrototype() instead.
+
+   Args:
+     descriptor: A descriptor.Descriptor object describing the protobuf.
+   Returns:
+     The Message class object described by the descriptor.
+   """
+   # Original implementation leads to duplicate message classes, which won't play
+   # well with extensions. Message factory info is also missing.
+   # Redirect to message_factory.
+   return symbol_database.Default().GetPrototype(descriptor)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/source_context_pb2.py ADDED
@@ -0,0 +1,26 @@
+ # -*- coding: utf-8 -*-
+ # Generated by the protocol buffer compiler. DO NOT EDIT!
+ # source: google/protobuf/source_context.proto
+ """Generated protocol buffer code."""
+ from google.protobuf.internal import builder as _builder
+ from google.protobuf import descriptor as _descriptor
+ from google.protobuf import descriptor_pool as _descriptor_pool
+ from google.protobuf import symbol_database as _symbol_database
+ # @@protoc_insertion_point(imports)
+
+ _sym_db = _symbol_database.Default()
+
+
+
+
+ DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n$google/protobuf/source_context.proto\x12\x0fgoogle.protobuf\"\"\n\rSourceContext\x12\x11\n\tfile_name\x18\x01 \x01(\tB\x8a\x01\n\x13\x63om.google.protobufB\x12SourceContextProtoP\x01Z6google.golang.org/protobuf/types/known/sourcecontextpb\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+ _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+ _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.source_context_pb2', globals())
+ if _descriptor._USE_C_DESCRIPTORS == False:
+
+   DESCRIPTOR._options = None
+   DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\022SourceContextProtoP\001Z6google.golang.org/protobuf/types/known/sourcecontextpb\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+   _SOURCECONTEXT._serialized_start=57
+   _SOURCECONTEXT._serialized_end=91
+ # @@protoc_insertion_point(module_scope)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/text_encoding.py ADDED
@@ -0,0 +1,110 @@
1
+ # Protocol Buffers - Google's data interchange format
2
+ # Copyright 2008 Google Inc. All rights reserved.
3
+ # https://developers.google.com/protocol-buffers/
4
+ #
5
+ # Redistribution and use in source and binary forms, with or without
6
+ # modification, are permitted provided that the following conditions are
7
+ # met:
8
+ #
9
+ # * Redistributions of source code must retain the above copyright
10
+ # notice, this list of conditions and the following disclaimer.
11
+ # * Redistributions in binary form must reproduce the above
12
+ # copyright notice, this list of conditions and the following disclaimer
13
+ # in the documentation and/or other materials provided with the
14
+ # distribution.
15
+ # * Neither the name of Google Inc. nor the names of its
16
+ # contributors may be used to endorse or promote products derived from
17
+ # this software without specific prior written permission.
18
+ #
19
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
20
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
21
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
22
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
23
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
24
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
25
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
26
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
27
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
28
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
29
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
30
+
31
+ """Encoding related utilities."""
32
+ import re
33
+
34
+ _cescape_chr_to_symbol_map = {}
35
+ _cescape_chr_to_symbol_map[9] = r'\t' # optional escape
36
+ _cescape_chr_to_symbol_map[10] = r'\n' # optional escape
37
+ _cescape_chr_to_symbol_map[13] = r'\r' # optional escape
38
+ _cescape_chr_to_symbol_map[34] = r'\"' # necessary escape
39
+ _cescape_chr_to_symbol_map[39] = r"\'" # optional escape
40
+ _cescape_chr_to_symbol_map[92] = r'\\' # necessary escape
41
+
42
+ # Lookup table for unicode
43
+ _cescape_unicode_to_str = [chr(i) for i in range(0, 256)]
44
+ for byte, string in _cescape_chr_to_symbol_map.items():
45
+ _cescape_unicode_to_str[byte] = string
46
+
47
+ # Lookup table for non-utf8, with necessary escapes at (o >= 127 or o < 32)
48
+ _cescape_byte_to_str = ([r'\%03o' % i for i in range(0, 32)] +
49
+ [chr(i) for i in range(32, 127)] +
50
+ [r'\%03o' % i for i in range(127, 256)])
51
+ for byte, string in _cescape_chr_to_symbol_map.items():
52
+ _cescape_byte_to_str[byte] = string
53
+ del byte, string
54
+
55
+
56
+ def CEscape(text, as_utf8):
57
+ # type: (...) -> str
58
+ """Escape a bytes string for use in an text protocol buffer.
59
+
60
+ Args:
61
+ text: A byte string to be escaped.
62
+    as_utf8: Specifies if result may contain non-ASCII characters.
+      In Python 3 this allows unescaped non-ASCII Unicode characters.
+      In Python 2 the return value will be valid UTF-8 rather than only ASCII.
+  Returns:
+    Escaped string (str).
+  """
+  # Python's text.encode() 'string_escape' or 'unicode_escape' codecs do not
+  # satisfy our needs; they encode unprintable characters using two-digit hex
+  # escapes whereas our C++ unescaping function allows hex escapes to be any
+  # length. So, "\0011".encode('string_escape') ends up being "\\x011", which
+  # will be decoded in C++ as a single-character string with char code 0x11.
+  text_is_unicode = isinstance(text, str)
+  if as_utf8 and text_is_unicode:
+    # We're already unicode, no processing beyond control char escapes.
+    return text.translate(_cescape_chr_to_symbol_map)
+  ord_ = ord if text_is_unicode else lambda x: x  # bytes iterate as ints.
+  if as_utf8:
+    return ''.join(_cescape_unicode_to_str[ord_(c)] for c in text)
+  return ''.join(_cescape_byte_to_str[ord_(c)] for c in text)
+
+
+_CUNESCAPE_HEX = re.compile(r'(\\+)x([0-9a-fA-F])(?![0-9a-fA-F])')
+
+
+def CUnescape(text):
+  # type: (str) -> bytes
+  """Unescape a text string with C-style escape sequences to UTF-8 bytes.
+
+  Args:
+    text: The data to parse in a str.
+  Returns:
+    A byte string.
+  """
+
+  def ReplaceHex(m):
+    # Only replace the match if the number of leading back slashes is odd,
+    # i.e. the slash itself is not escaped.
+    if len(m.group(1)) & 1:
+      return m.group(1) + 'x0' + m.group(2)
+    return m.group(0)
+
+  # This is required because the 'string_escape' encoding doesn't
+  # allow single-digit hex escapes (like '\xf').
+  result = _CUNESCAPE_HEX.sub(ReplaceHex, text)
+
+  return (result.encode('utf-8')  # Make it bytes to allow decode.
+          .decode('unicode_escape')
+          # Make it bytes again to return the proper type.
+          .encode('raw_unicode_escape'))
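A self-contained sketch of the hex-padding trick the hunk above relies on: Python's `unicode_escape` codec rejects single-digit hex escapes like `\xf`, so they are padded to `\x0f` before decoding, but only when the backslash itself is not escaped (the helper name `c_unescape` is illustrative, not part of the upstream API):

```python
import re

# Single-digit hex escape preceded by one or more backslashes, with no
# second hex digit following.
_HEX = re.compile(r'(\\+)x([0-9a-fA-F])(?![0-9a-fA-F])')

def c_unescape(text):
    def pad(m):
        # An odd number of leading backslashes means the '\x' is a real
        # escape; pad '\xf' to '\x0f'. An even count means the backslash
        # is itself escaped, so leave the match alone.
        if len(m.group(1)) & 1:
            return m.group(1) + 'x0' + m.group(2)
        return m.group(0)

    padded = _HEX.sub(pad, text)
    return (padded.encode('utf-8')            # bytes, so the codec applies
                  .decode('unicode_escape')   # interpret \n, \x0f, \NNN ...
                  .encode('raw_unicode_escape'))  # back to bytes

print(c_unescape(r'\xf\n'))   # b'\x0f\n' -- single-digit escape handled
print(c_unescape(r'\x41BC'))  # b'ABC' -- two-digit escapes pass through
```

Note that `\x41BC` is left untouched by the padding regex because the negative lookahead only fires when exactly one hex digit follows `\x`.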
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/timestamp_pb2.py ADDED
@@ -0,0 +1,26 @@
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler. DO NOT EDIT!
+# source: google/protobuf/timestamp.proto
+"""Generated protocol buffer code."""
+from google.protobuf.internal import builder as _builder
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import symbol_database as _symbol_database
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x1fgoogle/protobuf/timestamp.proto\x12\x0fgoogle.protobuf\"+\n\tTimestamp\x12\x0f\n\x07seconds\x18\x01 \x01(\x03\x12\r\n\x05nanos\x18\x02 \x01(\x05\x42\x85\x01\n\x13\x63om.google.protobufB\x0eTimestampProtoP\x01Z2google.golang.org/protobuf/types/known/timestamppb\xf8\x01\x01\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.timestamp_pb2', globals())
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+  DESCRIPTOR._options = None
+  DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\016TimestampProtoP\001Z2google.golang.org/protobuf/types/known/timestamppb\370\001\001\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+  _TIMESTAMP._serialized_start=52
+  _TIMESTAMP._serialized_end=95
+# @@protoc_insertion_point(module_scope)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/type_pb2.py ADDED
@@ -0,0 +1,42 @@
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler. DO NOT EDIT!
+# source: google/protobuf/type.proto
+"""Generated protocol buffer code."""
+from google.protobuf.internal import builder as _builder
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import symbol_database as _symbol_database
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+from google.protobuf import any_pb2 as google_dot_protobuf_dot_any__pb2
+from google.protobuf import source_context_pb2 as google_dot_protobuf_dot_source__context__pb2
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x1agoogle/protobuf/type.proto\x12\x0fgoogle.protobuf\x1a\x19google/protobuf/any.proto\x1a$google/protobuf/source_context.proto\"\xd7\x01\n\x04Type\x12\x0c\n\x04name\x18\x01 \x01(\t\x12&\n\x06\x66ields\x18\x02 \x03(\x0b\x32\x16.google.protobuf.Field\x12\x0e\n\x06oneofs\x18\x03 \x03(\t\x12(\n\x07options\x18\x04 \x03(\x0b\x32\x17.google.protobuf.Option\x12\x36\n\x0esource_context\x18\x05 \x01(\x0b\x32\x1e.google.protobuf.SourceContext\x12\'\n\x06syntax\x18\x06 \x01(\x0e\x32\x17.google.protobuf.Syntax\"\xd5\x05\n\x05\x46ield\x12)\n\x04kind\x18\x01 \x01(\x0e\x32\x1b.google.protobuf.Field.Kind\x12\x37\n\x0b\x63\x61rdinality\x18\x02 \x01(\x0e\x32\".google.protobuf.Field.Cardinality\x12\x0e\n\x06number\x18\x03 \x01(\x05\x12\x0c\n\x04name\x18\x04 \x01(\t\x12\x10\n\x08type_url\x18\x06 \x01(\t\x12\x13\n\x0boneof_index\x18\x07 \x01(\x05\x12\x0e\n\x06packed\x18\x08 \x01(\x08\x12(\n\x07options\x18\t \x03(\x0b\x32\x17.google.protobuf.Option\x12\x11\n\tjson_name\x18\n \x01(\t\x12\x15\n\rdefault_value\x18\x0b \x01(\t\"\xc8\x02\n\x04Kind\x12\x10\n\x0cTYPE_UNKNOWN\x10\x00\x12\x0f\n\x0bTYPE_DOUBLE\x10\x01\x12\x0e\n\nTYPE_FLOAT\x10\x02\x12\x0e\n\nTYPE_INT64\x10\x03\x12\x0f\n\x0bTYPE_UINT64\x10\x04\x12\x0e\n\nTYPE_INT32\x10\x05\x12\x10\n\x0cTYPE_FIXED64\x10\x06\x12\x10\n\x0cTYPE_FIXED32\x10\x07\x12\r\n\tTYPE_BOOL\x10\x08\x12\x0f\n\x0bTYPE_STRING\x10\t\x12\x0e\n\nTYPE_GROUP\x10\n\x12\x10\n\x0cTYPE_MESSAGE\x10\x0b\x12\x0e\n\nTYPE_BYTES\x10\x0c\x12\x0f\n\x0bTYPE_UINT32\x10\r\x12\r\n\tTYPE_ENUM\x10\x0e\x12\x11\n\rTYPE_SFIXED32\x10\x0f\x12\x11\n\rTYPE_SFIXED64\x10\x10\x12\x0f\n\x0bTYPE_SINT32\x10\x11\x12\x0f\n\x0bTYPE_SINT64\x10\x12\"t\n\x0b\x43\x61rdinality\x12\x17\n\x13\x43\x41RDINALITY_UNKNOWN\x10\x00\x12\x18\n\x14\x43\x41RDINALITY_OPTIONAL\x10\x01\x12\x18\n\x14\x43\x41RDINALITY_REQUIRED\x10\x02\x12\x18\n\x14\x43\x41RDINALITY_REPEATED\x10\x03\"\xce\x01\n\x04\x45num\x12\x0c\n\x04name\x18\x01 \x01(\t\x12-\n\tenumvalue\x18\x02 \x03(\x0b\x32\x1a.google.protobuf.EnumValue\x12(\n\x07options\x18\x03 \x03(\x0b\x32\x17.google.protobuf.Option\x12\x36\n\x0esource_context\x18\x04 \x01(\x0b\x32\x1e.google.protobuf.SourceContext\x12\'\n\x06syntax\x18\x05 \x01(\x0e\x32\x17.google.protobuf.Syntax\"S\n\tEnumValue\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x0e\n\x06number\x18\x02 \x01(\x05\x12(\n\x07options\x18\x03 \x03(\x0b\x32\x17.google.protobuf.Option\";\n\x06Option\x12\x0c\n\x04name\x18\x01 \x01(\t\x12#\n\x05value\x18\x02 \x01(\x0b\x32\x14.google.protobuf.Any*.\n\x06Syntax\x12\x11\n\rSYNTAX_PROTO2\x10\x00\x12\x11\n\rSYNTAX_PROTO3\x10\x01\x42{\n\x13\x63om.google.protobufB\tTypeProtoP\x01Z-google.golang.org/protobuf/types/known/typepb\xf8\x01\x01\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.type_pb2', globals())
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+  DESCRIPTOR._options = None
+  DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\tTypeProtoP\001Z-google.golang.org/protobuf/types/known/typepb\370\001\001\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+  _SYNTAX._serialized_start=1413
+  _SYNTAX._serialized_end=1459
+  _TYPE._serialized_start=113
+  _TYPE._serialized_end=328
+  _FIELD._serialized_start=331
+  _FIELD._serialized_end=1056
+  _FIELD_KIND._serialized_start=610
+  _FIELD_KIND._serialized_end=938
+  _FIELD_CARDINALITY._serialized_start=940
+  _FIELD_CARDINALITY._serialized_end=1056
+  _ENUM._serialized_start=1059
+  _ENUM._serialized_end=1265
+  _ENUMVALUE._serialized_start=1267
+  _ENUMVALUE._serialized_end=1350
+  _OPTION._serialized_start=1352
+  _OPTION._serialized_end=1411
+# @@protoc_insertion_point(module_scope)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/google/protobuf/wrappers_pb2.py ADDED
@@ -0,0 +1,42 @@
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler. DO NOT EDIT!
+# source: google/protobuf/wrappers.proto
+"""Generated protocol buffer code."""
+from google.protobuf.internal import builder as _builder
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import symbol_database as _symbol_database
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x1egoogle/protobuf/wrappers.proto\x12\x0fgoogle.protobuf\"\x1c\n\x0b\x44oubleValue\x12\r\n\x05value\x18\x01 \x01(\x01\"\x1b\n\nFloatValue\x12\r\n\x05value\x18\x01 \x01(\x02\"\x1b\n\nInt64Value\x12\r\n\x05value\x18\x01 \x01(\x03\"\x1c\n\x0bUInt64Value\x12\r\n\x05value\x18\x01 \x01(\x04\"\x1b\n\nInt32Value\x12\r\n\x05value\x18\x01 \x01(\x05\"\x1c\n\x0bUInt32Value\x12\r\n\x05value\x18\x01 \x01(\r\"\x1a\n\tBoolValue\x12\r\n\x05value\x18\x01 \x01(\x08\"\x1c\n\x0bStringValue\x12\r\n\x05value\x18\x01 \x01(\t\"\x1b\n\nBytesValue\x12\r\n\x05value\x18\x01 \x01(\x0c\x42\x83\x01\n\x13\x63om.google.protobufB\rWrappersProtoP\x01Z1google.golang.org/protobuf/types/known/wrapperspb\xf8\x01\x01\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.wrappers_pb2', globals())
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+  DESCRIPTOR._options = None
+  DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\rWrappersProtoP\001Z1google.golang.org/protobuf/types/known/wrapperspb\370\001\001\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+  _DOUBLEVALUE._serialized_start=51
+  _DOUBLEVALUE._serialized_end=79
+  _FLOATVALUE._serialized_start=81
+  _FLOATVALUE._serialized_end=108
+  _INT64VALUE._serialized_start=110
+  _INT64VALUE._serialized_end=137
+  _UINT64VALUE._serialized_start=139
+  _UINT64VALUE._serialized_end=167
+  _INT32VALUE._serialized_start=169
+  _INT32VALUE._serialized_end=196
+  _UINT32VALUE._serialized_start=198
+  _UINT32VALUE._serialized_end=226
+  _BOOLVALUE._serialized_start=228
+  _BOOLVALUE._serialized_end=254
+  _STRINGVALUE._serialized_start=256
+  _STRINGVALUE._serialized_end=284
+  _BYTESVALUE._serialized_start=286
+  _BYTESVALUE._serialized_end=313
+# @@protoc_insertion_point(module_scope)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_utils/__init__.py ADDED
File without changes
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/__pycache__/__init__.cpython-38.pyc ADDED
Binary file (182 Bytes)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/__init__.py ADDED
File without changes
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/__pycache__/__init__.cpython-38.pyc ADDED
Binary file (201 Bytes)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__init__.py ADDED
File without changes
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/get_model_metadata_pb2.cpython-38.pyc ADDED
Binary file (5.32 kB)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/inference_pb2.cpython-38.pyc ADDED
Binary file (5.18 kB)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/model_management_pb2.cpython-38.pyc ADDED
Binary file (2.98 kB)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/model_pb2.cpython-38.pyc ADDED
Binary file (2.93 kB)
my_container_sandbox/workspace/anaconda3/lib/python3.8/site-packages/tensorboard_plugin_wit/_vendor/tensorflow_serving/apis/__pycache__/model_service_pb2.cpython-38.pyc ADDED
Binary file (2.46 kB)